
[Snyk] Security upgrade org.apache.spark:spark-core_2.12 from 3.4.3 to 3.5.7 #103

Open
snyk-io[bot] wants to merge 1 commit into master from snyk-fix-839318e58e682572a82498b1754f35b0

Conversation


snyk-io bot commented Mar 15, 2026


Snyk has created this PR to fix 1 vulnerability in the maven dependencies of this project.

Snyk changed the following file(s):

  • pom.xml
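The corresponding pom.xml change is a single version bump. A sketch of what the edited dependency entry typically looks like (the exact layout of this project's pom is an assumption):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <!-- bumped by this PR from 3.4.3 -->
  <version>3.5.7</version>
</dependency>
```

If the version is managed through a property or a parent BOM instead, the bump would land there rather than in the dependency block itself.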

Vulnerabilities that will be fixed with an upgrade:

  • Issue: Deserialization of Untrusted Data (high severity), SNYK-JAVA-ORGAPACHESPARK-15623151
  • Score: 243
  • Upgrade: org.apache.spark:spark-core_2.12 3.4.3 -> 3.5.7
  • Exploit maturity: No Path Found; No Known Exploit

Breaking Change Risk

Merge Risk: High

Notice: This assessment is enhanced by AI.


Important

  • Check the changes in this PR to ensure they won't cause issues with your project.
  • Max score is 1000. Note that the real score may have changed since the PR was raised.
  • This PR was automatically created by Snyk using the credentials of a real user.

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic


Learn how to fix vulnerabilities with free interactive lessons:

🦉 Deserialization of Untrusted Data


snyk-io bot commented Mar 15, 2026

Merge Risk: High

This upgrade from Spark 3.4.3 to 3.5.7, while a minor version change, introduces several breaking API and behavioral changes that require developer action and verification.

API and Behavioral Changes:

  • API Change: The Row.json and Row.prettyJson methods have been moved to ToJsonUtil. Code using these methods must be updated to call them from the new utility class. [7]
  • Behavioral Change: The array_insert SQL function is now 1-based for negative indexes (e.g., an index of -1 now inserts at the end of the array), which may alter existing logic. [7]
  • Behavioral Change: Several JDBC options related to predicate pushdown (pushDownAggregate, pushDownLimit, etc.) are now enabled by default. This could change the behavior and performance of queries against JDBC data sources. [7]
  • Configuration Change: YARN-specific configuration properties spark.yarn.executor.failuresValidityInterval and spark.yarn.max.executor.failures have been deprecated in favor of the more general spark.executor.failuresValidityInterval and spark.executor.maxNumFailures respectively. [4]
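Of the changes above, the array_insert shift is the easiest to misread. A minimal plain-Java emulation of the new negative-index semantics (arrayInsert35 is a hypothetical helper written for illustration, not a Spark API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ArrayInsertDemo {

    // Emulates Spark 3.5's array_insert position handling.
    // Simplifications for illustration: pos == 0 (invalid in Spark,
    // where it raises an error) is ignored, and out-of-range positions
    // are clamped instead of null-padded.
    static <T> List<T> arrayInsert35(List<T> arr, int pos, T value) {
        List<T> out = new ArrayList<>(arr);
        if (pos > 0) {
            // Positive positions are 1-based from the front.
            out.add(Math.min(pos - 1, out.size()), value);
        } else if (pos < 0) {
            // Spark 3.5: -1 inserts at the end, -2 before the last
            // element, and so on. Spark 3.4 treated -1 as "before the
            // last element".
            int idx = out.size() + pos + 1;
            out.add(Math.max(idx, 0), value);
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> base = Arrays.asList("a", "b", "c");
        // Spark 3.4: array_insert(array('a','b','c'), -1, 'x') gave [a, b, x, c]
        // Spark 3.5: the same call now yields [a, b, c, x]
        System.out.println(arrayInsert35(base, -1, "x"));
    }
}
```

Per the migration guide, the pre-3.5 behavior can be restored with the legacy flag spark.sql.legacy.negativeIndexInArrayInsert. For the JDBC change, the new pushdown defaults can likewise be overridden per read, e.g. with .option("pushDownAggregate", "false"), while you validate query results.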

Environment Changes:

  • Upgrading to Spark 3.5 may require a Java runtime update (e.g., to Java 17), which should be verified against your environment. [10]
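Per the Spark 3.5 documentation, supported Java runtimes are 8, 11, and 17. One way to pin the build to a known-good version is the Maven compiler release property (a sketch; adapt to this project's existing build configuration):

```xml
<properties>
  <!-- Spark 3.5 runs on Java 8, 11, or 17; pin the toolchain explicitly -->
  <maven.compiler.release>17</maven.compiler.release>
</properties>
```

Verifying the runtime used on the cluster (not just at compile time) matters equally, since executors run on whatever JVM the cluster provides.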

Recommendation: Developers should review code for usage of Row.json and Row.prettyJson and update it. It is also critical to validate SQL logic that uses the array_insert function or relies on JDBC data sources. Finally, confirm your environment's Java version is compatible and update deprecated YARN configurations.

Source: Spark 3.5 Migration Guide

Notice 🤖: This content was augmented using artificial intelligence. AI-generated content may contain errors and should be reviewed for accuracy before use.


snyk-io bot commented Mar 15, 2026

Snyk checks have passed. No issues have been found so far.

Scan results (all passed):

  • Open Source Security: 0 critical, 0 high, 0 medium, 0 low (0 issues)
  • Licenses: 0 critical, 0 high, 0 medium, 0 low (0 issues)

💻 Catch issues earlier using the plugins for VS Code, JetBrains IDEs, Visual Studio, and Eclipse.
