
[Snyk] Fix for 1 vulnerability #96

Open
snyk-io[bot] wants to merge 1 commit into master from
snyk-fix-16219539493e8ffca4d6dd0d39c7802f

Conversation

snyk-io bot commented Mar 10, 2026


Snyk has created this PR to fix 1 vulnerability in the Maven dependencies of this project.

Snyk changed the following file(s):

  • pom.xml

Vulnerabilities that will be fixed with an upgrade:

  Issue: Insertion of Sensitive Information into Log File (high severity)
  ID: SNYK-JAVA-ORGAPACHEZOOKEEPER-15443353
  Score: 114
  Upgrade:
    • org.apache.hadoop:hadoop-client: 2.10.2 → 3.0.0
    • org.apache.hive:hive-jdbc: 2.3.4 → 4.0.0
    • org.apache.hive:hive-metastore: 2.3.4 → 4.0.0
  Flags: Major version upgrade · No Path Found · No Known Exploit
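The upgrades above amount to version bumps in the build file. A minimal sketch of what the affected `pom.xml` entries would look like after the change, assuming the versions are declared directly on the dependencies (the project's actual layout may use version properties or a `dependencyManagement` section instead):

```xml
<!-- Sketch only: coordinates and target versions are taken from this PR;
     the surrounding pom.xml structure is assumed, not copied from the repo. -->
<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.0.0</version> <!-- was 2.10.2 -->
  </dependency>
  <dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>4.0.0</version> <!-- was 2.3.4 -->
  </dependency>
  <dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-metastore</artifactId>
    <version>4.0.0</version> <!-- was 2.3.4 -->
  </dependency>
</dependencies>
```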

Breaking Change Risk

Merge Risk: High

Notice: This assessment is enhanced by AI.

Vulnerabilities that could not be fixed

  • Upgrade:
    • Could not upgrade org.apache.hudi:hudi-common@1.0.0-SNAPSHOT to org.apache.hudi:hudi-common@1.1.0. Reason: could not apply upgrade; the dependency is managed externally. Location: provenance does not contain location.

Important

  • Check the changes in this PR to ensure they won't cause issues with your project.
  • Max score is 1000. Note that the real score may have changed since the PR was raised.
  • This PR was automatically created by Snyk using the credentials of a real user.

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic


Learn how to fix vulnerabilities with free interactive lessons:

🦉 Insertion of Sensitive Information into Log File

snyk-io bot commented Mar 10, 2026

Merge Risk: High

This set of upgrades includes multiple major version bumps for critical data infrastructure components, introducing significant breaking changes and requiring careful planning and migration efforts.

Top 3 Most Impactful Upgrades

1. org.apache.hive:hive-jdbc 2.3.4 → 4.0.0 (HIGH RISK)
This is a very significant upgrade, jumping two major versions from 2.x to 4.0.0. The Hive 2.x line is now End of Life (EOL).

Breaking Changes & Key Information:

  • Hadoop Dependency: Hive 3.0 and later require at least Hadoop 3.x. This upgrade must be done in conjunction with the hadoop-client upgrade.
  • ACID Table Migration: Before upgrading to Hive 3.0, any ACID tables that have experienced UPDATE, DELETE, or MERGE operations must have a major compaction run on them. Failure to do so can lead to data corruption.
  • Execution Engine: Hive 4.0 deprecates Hive on MapReduce and removes support for Hive on Spark. Workloads should be moved to Apache Tez.
  • API and Feature Changes: The upgrade introduces a vast number of changes across two major versions, including Iceberg integration, improved transaction handling, and Metastore API optimizations.

Recommendation: A thorough migration plan is essential. Verify and compact all ACID tables before the upgrade. All Hive-based applications must be re-tested against the new versions, paying close attention to the execution engine change and any deprecated APIs.
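The pre-upgrade compaction step above can be sketched in HiveQL. The table and partition names here are illustrative only; run this against every ACID table that has seen UPDATE, DELETE, or MERGE operations, per partition for partitioned tables:

```sql
-- Illustrative sketch: force a major compaction before leaving Hive 2.x.
-- 'my_acid_table' and the partition spec are placeholders, not real names.
ALTER TABLE my_acid_table COMPACT 'major';

-- For a partitioned table, compaction is requested per partition:
ALTER TABLE my_acid_table PARTITION (ds = '2026-03-10') COMPACT 'major';

-- Compactions run asynchronously; confirm they have completed before upgrading:
SHOW COMPACTIONS;
```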

2. org.apache.hadoop:hadoop-client 2.10.2 → 3.0.0 (HIGH RISK)
This is a major version upgrade with several foundational changes.

Breaking Changes & Key Information:

  • Java Version: Hadoop 3.0 requires Java 8 as a minimum; Java 7 is no longer supported.
  • Classpath Isolation: Hadoop 3.0 introduces classpath isolation mechanisms, which can prevent dependency conflicts (like Guava) but may also require changes to how applications are packaged and deployed.
  • API & Dependency Changes: Some logging classes have been changed from commons-logging to slf4j. The library has also upgraded its internal Jackson dependency to 2.7, which may conflict with other dependencies in a project.
  • Configuration and Scripts: Shell scripts have been significantly rewritten, and some default ports have changed.

Recommendation: Validate that the runtime environment uses Java 8 or newer. Review application dependencies for potential conflicts with Hadoop's new dependencies. Test all interactions with HDFS and YARN thoroughly.
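If the Jackson bump mentioned above does conflict with another dependency, one common Maven remedy is to exclude the transitive artifact and pin a single version at the project level. A hedged sketch, assuming the conflict surfaces through `jackson-databind` (the actual conflicting artifact in this project may differ, so verify with `mvn dependency:tree` first):

```xml
<!-- Illustrative only: exclude Hadoop's transitive Jackson and manage one
     version project-wide. Confirm the excluded artifact is the real culprit. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>3.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```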

3. org.apache.hive:hive-metastore 2.3.4 → 4.0.0 (HIGH RISK)
This upgrade is directly tied to the hive-jdbc upgrade and carries the same risks and dependencies.

Breaking Changes & Key Information:

  • The breaking changes are the same as for hive-jdbc, including the required Hadoop 3.x dependency and the mandatory ACID table compaction before upgrading past 2.x.
  • Hive Metastore in version 4.0 includes API optimizations, support for Thrift over HTTP, and other significant architectural changes.

Recommendation: The migration for the Hive Metastore must be coordinated with the hive-jdbc and hadoop-client upgrades. All services connecting to the Metastore must be validated.

Other Upgrades

  • org.apache.hudi:hudi-common 1.0.0-SNAPSHOT → 1.1.0 (MEDIUM RISK): While a minor version change, the release notes describe it as a significant release. It deprecates the old payload classes in favor of new merge modes and merger APIs, which will require developer action for code that writes Hudi tables. It also adds support for Spark 4.0 and includes performance enhancements.

Notice 🤖: This content was augmented using artificial intelligence. AI-generated content may contain errors and should be reviewed for accuracy before use.

snyk-io bot commented Mar 10, 2026

Snyk checks have failed. 711 issues have been found so far.

  Scanner: Open Source Security — Critical: 101, High: 595, Medium: 0, Low: 0 (696 issues)
  Scanner: Licenses — Critical: 0, High: 15, Medium: 0, Low: 0 (15 issues)
  Total: 711 issues

💻 Catch issues earlier using the plugins for VS Code, JetBrains IDEs, Visual Studio, and Eclipse.
