
[Snyk] Fix for 1 vulnerability #92

Open

snyk-io[bot] wants to merge 4 commits into master from snyk-fix-155bc4e36ac4888ce457c33f03d904a3

Conversation

snyk-io bot commented Mar 10, 2026


Snyk has created this PR to fix 1 vulnerability in the Maven dependencies of this project.

Snyk changed the following file(s):

  • pom.xml

Vulnerability that will be fixed with an upgrade:

  • Issue: Insertion of Sensitive Information into Log File (high severity, SNYK-JAVA-ORGAPACHEZOOKEEPER-15443353)
  • Score: 115
  • Upgrade: org.apache.hive:hive-metastore 2.3.4 -> 4.0.0 (major version upgrade)
  • No Path Found; No Known Exploit

Breaking Change Risk

Merge Risk: High

Notice: This assessment is enhanced by AI.

Vulnerability that could not be fixed

  • Upgrade:
    • Could not upgrade org.apache.hudi:hudi-client-common@1.0.0-SNAPSHOT to org.apache.hudi:hudi-client-common@1.1.0. Reason: could not apply the upgrade; the dependency is managed externally. Location: provenance does not contain a location.

Important

  • Check the changes in this PR to ensure they won't cause issues with your project.
  • Max score is 1000. Note that the real score may have changed since the PR was raised.
  • This PR was automatically created by Snyk using the credentials of a real user.

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic


Learn how to fix vulnerabilities with free interactive lessons:

🦉 Insertion of Sensitive Information into Log File


snyk-io bot commented Mar 10, 2026

Merge Risk: High

This upgrade includes a major version jump for org.apache.hive:hive-metastore from 2.3.4 to 4.0.0, which introduces significant breaking changes requiring developer action. The upgrade for org.apache.hudi:hudi-client-common is minor but includes important deprecations.

org.apache.hive:hive-metastore (2.3.4 → 4.0.0)

Risk: HIGH

This is a major upgrade from a version that is now End-of-Life (EOL). It requires several manual steps and environment changes to ensure compatibility.

Breaking Changes:

  • Java Runtime Requirement: Hive 4.0.0 requires Java 11 or higher. The installer no longer includes a bundled JRE.
  • Mandatory Metastore Schema Upgrade: You must run the appropriate database schema upgrade scripts for your Metastore database (e.g., MySQL, PostgreSQL, Oracle) before starting the new service. These scripts are located in the scripts/metastore/upgrade directory of the Hive distribution.
  • API Changes: The Thrift client API has changed. Methods such as delete_table_column_statistics now require a new engine parameter, which will break existing integrations.
  • Engine Removal: Support for Hive-on-MapReduce has been removed, and Hive-on-Spark is deprecated.
  • Behavioral Change: The default null sort order in ORDER BY clauses has changed to NULLS LAST, which could affect query results.
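The null-ordering change is easy to preview outside Hive. As a minimal, self-contained sketch (plain Java, no Hive dependency; the mapping to Hive's ORDER BY defaults is based on the note above, where the pre-4.0 default sorted nulls first):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class NullOrderingDemo {
    public static void main(String[] args) {
        // A column containing a NULL value, modeled as a Java list.
        List<Integer> values = new ArrayList<>(List.of(3, 1, 2));
        values.add(null);

        // Pre-4.0 ascending ORDER BY default per the note above: nulls first.
        List<Integer> nullsFirst = new ArrayList<>(values);
        nullsFirst.sort(Comparator.nullsFirst(Comparator.naturalOrder()));
        System.out.println(nullsFirst); // [null, 1, 2, 3]

        // Hive 4.0 default: NULLS LAST.
        List<Integer> nullsLast = new ArrayList<>(values);
        nullsLast.sort(Comparator.nullsLast(Comparator.naturalOrder()));
        System.out.println(nullsLast); // [1, 2, 3, null]
    }
}
```

Queries whose consumers rely on nulls appearing first should add an explicit NULLS FIRST clause after the upgrade rather than depending on the default.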

Recommendation: This upgrade cannot be performed without careful planning and manual intervention. Developers must ensure the environment is running Java 11+ and schedule downtime to perform the metastore schema migration. All applications using the Hive Metastore client directly must be reviewed for API compatibility.
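For reference, the pom.xml change behind this PR presumably amounts to a version bump like the following. The coordinates and versions come from the table above; the surrounding dependency structure is an assumption, since the actual diff is not shown here:

```xml
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-metastore</artifactId>
  <!-- Upgraded from 2.3.4 to remediate SNYK-JAVA-ORGAPACHEZOOKEEPER-15443353 -->
  <version>4.0.0</version>
</dependency>
```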

org.apache.hudi:hudi-client-common (1.0.0-SNAPSHOT → 1.1.0)

Risk: MEDIUM

This minor upgrade introduces a significant change in the API for handling record merges.

Key Changes:

  • Record Merging Deprecation: The legacy HoodieRecordPayload classes for custom merge logic are now deprecated. While they will work in a fallback mode, the new RecordMerger APIs and standard merge modes are the recommended approach.
  • Engine Support: Adds support for Apache Spark 4.0.
  • Performance: Delivers significant write performance improvements (2-3x) for Flink users by eliminating Avro conversions.

Recommendation: While not an immediate breaking change, developers should plan to migrate from custom payload implementations to the new RecordMerger APIs to align with the library's future direction and avoid issues in subsequent upgrades.
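To make the shape of that migration concrete, here is a deliberately simplified, hypothetical sketch of the pattern involved. The interface and names below are illustrative only and are not Hudi's actual RecordMerger API; consult the Hudi 1.1.0 documentation for the real types. The latest-write-wins logic shown is the semantics that legacy payload classes commonly implemented via an ordering field:

```java
import java.util.Optional;

public class MergerSketch {
    // Hypothetical simplified record: key, payload value, ordering field.
    record Rec(String key, String value, long ts) {}

    // Illustrative stand-in for an engine-level merger interface:
    // given an older and a newer version of the same key, pick the result.
    interface SimpleMerger {
        Optional<Rec> merge(Rec older, Rec newer);
    }

    // Latest-write-wins on the ordering field (ts).
    static final SimpleMerger LATEST_WINS =
            (older, newer) -> Optional.of(newer.ts() >= older.ts() ? newer : older);

    public static void main(String[] args) {
        Rec older = new Rec("id-1", "v1", 100L);
        Rec newer = new Rec("id-1", "v2", 200L);
        System.out.println(LATEST_WINS.merge(older, newer).orElseThrow().value());
    }
}
```

The practical difference is where this logic lives: with the deprecated payload classes it is attached to the record, whereas the new merger APIs make it a standalone, engine-aware component configured on the writer.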

Notice 🤖: This content was augmented using artificial intelligence. AI-generated content may contain errors and should be reviewed for accuracy before use.


snyk-io bot commented Mar 10, 2026

Snyk checks have failed. 176 issues have been found so far.

Scanner                 Critical  High  Medium  Low  Total (176)
Open Source Security    6         155   0       0    161 issues
Licenses                0         15    0       0    15 issues

💻 Catch issues earlier using the plugins for VS Code, JetBrains IDEs, Visual Studio, and Eclipse.
