
[Snyk] Security upgrade org.apache.spark:spark-core_2.12 from 3.5.1 to 3.5.7#100

Open
snyk-io[bot] wants to merge 8 commits into master from snyk-fix-dc07eb8195768e8483d560230e9b0f9c

Conversation

@snyk-io snyk-io bot commented Mar 15, 2026


Snyk has created this PR to fix 1 vulnerability in the maven dependencies of this project.

Snyk changed the following file(s):

  • pom.xml

Vulnerabilities that will be fixed with an upgrade:

  • Issue: Deserialization of Untrusted Data (high severity)
    ID: SNYK-JAVA-ORGAPACHESPARK-15623151
    Score: 243
    Upgrade: org.apache.spark:spark-core_2.12 from 3.5.1 to 3.5.7
    Reachability: No Path Found; Exploit maturity: No Known Exploit
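In practice the fix is a one-line version bump in pom.xml. A minimal sketch of what the changed declaration would look like, assuming spark-core is a direct dependency (the coordinates come from this PR; the surrounding pom structure is illustrative):

```xml
<!-- pom.xml: bump spark-core to pick up the deserialization fix -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <!-- was 3.5.1; 3.5.7 resolves SNYK-JAVA-ORGAPACHESPARK-15623151 -->
  <version>3.5.7</version>
</dependency>
```

If the version is instead managed through a property or a parent BOM, the same bump would be applied there rather than inline.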

Breaking Change Risk

Merge Risk: Medium

Notice: This assessment is enhanced by AI.


Important

  • Check the changes in this PR to ensure they won't cause issues with your project.
  • Max score is 1000. Note that the real score may have changed since the PR was raised.
  • This PR was automatically created by Snyk using the credentials of a real user.

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic


Learn how to fix vulnerabilities with free interactive lessons:

🦉 Deserialization of Untrusted Data

snyk-io bot commented Mar 15, 2026

Merge Risk: Medium

This is a patch upgrade for Apache Spark from version 3.5.1 to 3.5.7. While these are maintenance releases focused on security and correctness fixes, there is one notable behavioral change that warrants attention.

Behavioral Change in Spark 3.5.4:

  • Since version 3.5.4, when reading SQL tables, certain exceptions (org.apache.hadoop.security.AccessControlException and org.apache.hadoop.hdfs.BlockMissingException) will now cause the task to fail. Previously, these errors might have been ignored if the configuration spark.sql.files.ignoreCorruptFiles was set to true.

Recommendation:
Verify any data processing jobs that rely on ignoring corrupt files, as they may now fail if they encounter these specific exceptions. Otherwise, the upgrade consists of bug fixes, security patches, and minor dependency updates and should be safe to apply.
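To check whether a job is affected, look for the flag named above in its configuration. A job configured like the following sketch previously skipped unreadable input files, but since Spark 3.5.4 it will fail the task when it hits an AccessControlException or BlockMissingException (this is an illustrative config fragment, not from this repository):

```
# spark-defaults.conf (or passed via --conf at spark-submit time)
# Affected setting: since 3.5.4, AccessControlException and
# BlockMissingException fail the task even when this is true.
spark.sql.files.ignoreCorruptFiles  true
```

Jobs that leave this setting at its default (false) already fail on such errors and see no behavioral change from the upgrade.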

Source: Spark 3.5 Migration Guide

Notice 🤖: This content was augmented using artificial intelligence. AI-generated content may contain errors and should be reviewed for accuracy before use.

snyk-io bot commented Mar 15, 2026

Snyk checks have passed. No issues have been found so far.

  • Open Source Security: 0 critical, 0 high, 0 medium, 0 low (0 issues)
  • Licenses: 0 critical, 0 high, 0 medium, 0 low (0 issues)

💻 Catch issues earlier using the plugins for VS Code, JetBrains IDEs, Visual Studio, and Eclipse.
