
[Snyk] Fix for 1 vulnerability #89

Open
snyk-io[bot] wants to merge 3 commits into master from snyk-fix-10e6b2d416b0511acac096d90d2b93b6

Conversation


snyk-io bot commented Mar 10, 2026


Snyk has created this PR to fix 1 vulnerability in the Maven dependencies of this project.

Snyk changed the following file(s):

  • pom.xml
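For reference, the fix amounts to bumping the `hadoop-client` version coordinate in `pom.xml`. A sketch of the resulting dependency entry, using the versions from the fix table below; the surrounding POM structure (direct dependency vs. `dependencyManagement`) is an assumption, not copied from the repository:

```xml
<!-- Sketch: upgraded coordinate per this PR's fix table.
     Placement within the POM depends on how the project declares it. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>3.0.0</version> <!-- previously 2.10.2 -->
</dependency>
```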

Vulnerabilities that will be fixed with an upgrade:

| Issue | Score | Upgrade |
| --- | --- | --- |
| **High severity** Insertion of Sensitive Information into Log File<br>SNYK-JAVA-ORGAPACHEZOOKEEPER-15443353 | 115 | org.apache.hadoop:hadoop-client: 2.10.2 → 3.0.0 (major version upgrade)<br>No Path Found · No Known Exploit |

Breaking Change Risk

Merge Risk: High

Notice: This assessment is enhanced by AI.

Vulnerabilities that could not be fixed

  • Upgrade:
  • Could not upgrade org.apache.hudi:hudi-common@1.0.0-SNAPSHOT to org.apache.hudi:hudi-common@1.1.0. Reason: could not apply upgrade; dependency is managed externally. Location: provenance does not contain location.

Important

  • Check the changes in this PR to ensure they won't cause issues with your project.
  • Max score is 1000. Note that the real score may have changed since the PR was raised.
  • This PR was automatically created by Snyk using the credentials of a real user.

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic


Learn how to fix vulnerabilities with free interactive lessons:

🦉 Insertion of Sensitive Information into Log File


snyk-io bot commented Mar 10, 2026

Merge Risk: High

This upgrade includes a major version increase for org.apache.hadoop:hadoop-client from 2.x to 3.x and a significant minor update for org.apache.hudi:hudi-common, both introducing high-risk breaking changes that require developer action.

1. org.apache.hadoop:hadoop-client (2.10.2 → 3.0.0)

Risk: HIGH

The upgrade to Hadoop 3.0.0 is a major transition with several breaking changes. While the community aimed to preserve wire compatibility for clients, significant changes to the environment, APIs, and configuration require attention.

Key Breaking Changes:

  • Java Runtime Requirement: Hadoop 3.0.0 requires Java 8, whereas Hadoop 2.x was compatible with Java 7. Applications and environments must be updated to use Java 8.
  • API Changes: The logging facade has been changed. In the hadoop-mapreduce-client-core module, public LOG variables were changed from org.apache.commons.logging.Log to org.slf4j.Logger, which may require import and code updates.
  • Configuration & Port Changes: Default ports for services like the NameNode and ResourceManager have been changed to fall outside the Linux ephemeral port range, which will affect clients with hardcoded port numbers.
  • Classpath Isolation: Hadoop 3.0 enables classpath isolation by default, which can alter how application dependencies are loaded and may require configuration adjustments.
  • Shell Scripts & Environment Variables: Shell scripts have been substantially rewritten, and many environment variables have been renamed (e.g., HADOOP_NAMENODE_OPTS is now HDFS_NAMENODE_OPTS).
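The default-port relocation in particular can be checked for mechanically before upgrading. A minimal sketch of such a pre-upgrade audit; the helper itself is hypothetical, and the port pairs below are taken from the Hadoop 3.0.0 release notes (verify them against your deployment):

```java
import java.util.Map;

public class Hadoop3PortAudit {
    // Old Hadoop 2.x default port -> new Hadoop 3.0.0 default port.
    // Values assumed from the Hadoop 3.0.0 release notes (HDFS-9427).
    static final Map<Integer, Integer> MOVED = Map.of(
            50070, 9870, // NameNode HTTP UI
            50090, 9868, // Secondary NameNode HTTP
            50010, 9866  // DataNode data transfer
    );

    // Reports whether a hardcoded port refers to a relocated default.
    static String check(int port) {
        Integer moved = MOVED.get(port);
        return moved == null
                ? "port " + port + ": no known default-port change"
                : "port " + port + " moved to " + moved + " in Hadoop 3";
    }

    public static void main(String[] args) {
        System.out.println(check(50070));
    }
}
```

Running the same check over every port literal found in client configuration files gives a quick inventory of what must change.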

Recommendation: Developers must ensure their environment uses Java 8. Review code for dependencies on the previous logging implementation and update any custom scripts or configurations that rely on old port numbers or environment variable names.
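For the Java 8 requirement, one way to enforce it at build time is to pin the Maven compiler level. A sketch using standard Maven compiler properties; where these properties live in this project's POM is an assumption:

```xml
<!-- Sketch: pin the build to Java 8, which Hadoop 3.0.0 requires. -->
<properties>
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
</properties>
```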

Source: Apache Hadoop 3.0.0 Release Notes, Hadoop 2 to 3 Upgrade Guide

2. org.apache.hudi:hudi-common (1.0.0-SNAPSHOT → 1.1.0)

Risk: HIGH

Although a minor version update, Hudi 1.1.0 introduces significant new features and a key API deprecation that constitutes a breaking change.

Key Breaking Changes:

  • Record Merging API Deprecation: The release deprecates the use of payload classes for record merging in favor of new, more flexible merge modes and merger APIs. Code using the old payload mechanism must be migrated to the new API.
  • Dependency Reduction: The dependency on HBase libraries has been removed in favor of a native HFile writer. This reduces the overall package size but could affect systems with custom configurations relying on the transitive HBase dependency.
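If the build does depend on HBase classes that previously arrived transitively through hudi-common, one mitigation is to declare the dependency explicitly. A sketch; the artifact choice is an assumption, not taken from this PR, and the version deliberately left blank:

```xml
<!-- Hypothetical: declare HBase directly if your code imports it,
     since hudi-common 1.1.0 no longer brings it in transitively. -->
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version><!-- match the version your cluster runs --></version>
</dependency>
```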

Recommendation: Developers should review code related to record merging and migrate from the deprecated payload classes to the new merger APIs as specified in the release documentation.

Source: Apache Hudi 1.1 Release Notes

Notice 🤖: This content was augmented using artificial intelligence. AI-generated content may contain errors and should be reviewed for accuracy before use.


snyk-io bot commented Mar 10, 2026

⚠️ Snyk checks are incomplete.

| Status | Scanner | Critical | High | Medium | Low | Total (491) |
| --- | --- | --- | --- | --- | --- | --- |
| ⚠️ | Open Source Security | 86 | 405 | 0 | 0 | See details |
| ⚠️ | Licenses | 0 | 0 | 0 | 0 | See details |

💻 Catch issues earlier using the plugins for VS Code, JetBrains IDEs, Visual Studio, and Eclipse.
