
[Snyk] Security upgrade org.apache.spark:spark-sql_2.12 from 2.4.4 to 3.0.0#82

Open
snyk-io[bot] wants to merge 1 commit into master from snyk-fix-9f22fc47546ef8b5ee29e971dc51621f

Conversation


@snyk-io snyk-io bot commented Mar 1, 2026


Snyk has created this PR to fix 1 vulnerability in the Maven dependencies of this project.

Snyk changed the following file(s):

  • pom.xml

Vulnerabilities that will be fixed with an upgrade:

Issue: Allocation of Resources Without Limits or Throttling (high severity)
       SNYK-JAVA-COMFASTERXMLJACKSONCORE-15365924
Score: 170
Upgrade: org.apache.spark:spark-sql_2.12: 2.4.4 -> 3.0.0 (Major version upgrade; No Path Found; Proof of Concept)
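
The fix itself is a single version bump of spark-sql_2.12 in pom.xml. A minimal sketch of the affected dependency entry is below; the surrounding layout of the actual pom.xml is an assumption, only the coordinates and versions come from this PR.

```xml
<!-- Dependency entry this PR updates; only the version changes. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.12</artifactId>
  <!-- previously 2.4.4 -->
  <version>3.0.0</version>
</dependency>
```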

Breaking Change Risk

Merge Risk: High

Notice: This assessment is enhanced by AI.


Important

  • Check the changes in this PR to ensure they won't cause issues with your project.
  • Max score is 1000. Note that the real score may have changed since the PR was raised.
  • This PR was automatically created by Snyk using the credentials of a real user.

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic


Learn how to fix vulnerabilities with free interactive lessons:

🦉 Allocation of Resources Without Limits or Throttling


snyk-io bot commented Mar 1, 2026

Merge Risk: High

This is a major version upgrade from Spark 2.4 to 3.0, which introduces significant breaking changes and requires code and environment updates.

Key Breaking Changes:

  • Scala 2.12 Required: Spark 3.0 drops support for Scala 2.11 and now requires Scala 2.12. All applications and their dependencies must be recompiled with Scala 2.12.
  • Date/Time Parsing: The calendar system and date/timestamp parsing rules have changed. This can lead to SparkUpgradeException for previously valid date/time patterns. To maintain the old behavior, you can set spark.sql.legacy.timeParserPolicy to LEGACY (see the configuration sketch after this list).
  • API Deprecations: The RDD-based spark.mllib API is deprecated in favor of the DataFrame-based spark.ml API. Additionally, UserDefinedAggregateFunction (UDAF) is deprecated in favor of the typed Aggregator API (a migration sketch follows below).
  • ANSI SQL Compliance: Spark 3.0 enforces stricter ANSI SQL compliance. For example, operations like division by zero will now throw an exception instead of returning null (the related flag appears in the configuration sketch after this list).
  • Hive Version Upgrade: The built-in Hive execution version has been upgraded from 1.2.1 to 2.3.7, which may impact custom Hive UDFs or integrations.
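
As a rough illustration of the configuration-level switches mentioned above, the sketch below shows where the legacy time-parser policy and the ANSI flag could be set when building a session. The application name and the choice to set these at session-build time are assumptions, not part of this PR; both keys can equally be supplied via spark-defaults.conf or --conf on spark-submit.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: session-level settings relevant to the 2.4 -> 3.0 upgrade.
val spark = SparkSession.builder()
  .appName("spark-3.0-upgrade-check") // illustrative name
  // Keep the Spark 2.4 date/timestamp parser instead of failing with
  // SparkUpgradeException on previously valid patterns.
  .config("spark.sql.legacy.timeParserPolicy", "LEGACY")
  // ANSI SQL behaviour in 3.0 is governed by this flag; enabling it while
  // testing surfaces the stricter-compliance errors described above.
  .config("spark.sql.ansi.enabled", "true")
  .getOrCreate()
```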

Recommendation: This upgrade requires significant developer action. You must recompile your application and all dependencies using Scala 2.12. Thoroughly test your SQL queries, date/timestamp transformations, and any code using deprecated APIs to ensure compatibility with the new behaviors.

Source: Apache Spark 3.0 Migration Guide
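
To make the UDAF deprecation above concrete, here is a minimal sketch of the typed Aggregator API that Spark 3.0 favours over UserDefinedAggregateFunction, registered through functions.udaf (added in 3.0). The aggregator, column, and view names are hypothetical and only for illustration.

```scala
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}
import org.apache.spark.sql.expressions.Aggregator
import org.apache.spark.sql.functions.udaf

// Hypothetical typed replacement for a UserDefinedAggregateFunction:
// averages Double inputs using a (sum, count) buffer.
object MyAverage extends Aggregator[Double, (Double, Long), Double] {
  def zero: (Double, Long) = (0.0, 0L)
  def reduce(buf: (Double, Long), value: Double): (Double, Long) =
    (buf._1 + value, buf._2 + 1)
  def merge(a: (Double, Long), b: (Double, Long)): (Double, Long) =
    (a._1 + b._1, a._2 + b._2)
  def finish(buf: (Double, Long)): Double =
    if (buf._2 == 0L) Double.NaN else buf._1 / buf._2
  def bufferEncoder: Encoder[(Double, Long)] =
    Encoders.tuple(Encoders.scalaDouble, Encoders.scalaLong)
  def outputEncoder: Encoder[Double] = Encoders.scalaDouble
}

val spark = SparkSession.builder().appName("udaf-migration-sketch").getOrCreate()
import spark.implicits._

// Hypothetical data: register the aggregator and call it from SQL.
Seq(10.0, 20.0, 30.0).toDF("price").createOrReplaceTempView("sales")
spark.udf.register("my_average", udaf(MyAverage))
spark.sql("SELECT my_average(price) AS avg_price FROM sales").show()
```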

Notice 🤖: This content was augmented using artificial intelligence. AI-generated content may contain errors and should be reviewed for accuracy before use.


snyk-io bot commented Mar 1, 2026

Snyk checks have failed. 1 issue has been found so far.

Scanner                Critical  High  Medium  Low  Total
Open Source Security   0         1     0       0    1 issue
Licenses               0         0     0       0    0 issues

💻 Catch issues earlier using the plugins for VS Code, JetBrains IDEs, Visual Studio, and Eclipse.

