
[Snyk] Fix for 1 vulnerabilities #90

Open
snyk-io[bot] wants to merge 2 commits into master from snyk-fix-2c272f0c654c3b234a60dd4dec56deab

Conversation


snyk-io bot commented Mar 10, 2026


Snyk has created this PR to fix 1 vulnerability in the Maven dependencies of this project.

Snyk changed the following file(s):

  • pom.xml

Vulnerabilities that will be fixed with an upgrade:

Issue: Insertion of Sensitive Information into Log File (high severity)
  SNYK-JAVA-ORGAPACHEZOOKEEPER-15443353
Score: 115
Upgrade:
  • org.apache.hive:hive-jdbc: 2.3.4 -> 4.0.0
  • org.apache.hive:hive-metastore: 2.3.4 -> 4.0.0
  • org.apache.hive:hive-service: 2.3.4 -> 3.1.0
Flags: Major version upgrade · No Path Found · No Known Exploit

Breaking Change Risk

Merge Risk: High

Notice: This assessment is enhanced by AI.

Vulnerabilities that could not be fixed

  • Upgrade:
    • Could not upgrade org.apache.hudi:hudi-common@1.0.0-SNAPSHOT to org.apache.hudi:hudi-common@1.1.0. Reason: could not apply upgrade; the dependency is managed externally. Location: provenance does not contain location.

Important

  • Check the changes in this PR to ensure they won't cause issues with your project.
  • Max score is 1000. Note that the real score may have changed since the PR was raised.
  • This PR was automatically created by Snyk using the credentials of a real user.

Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.

For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic


Learn how to fix vulnerabilities with free interactive lessons:

🦉 Insertion of Sensitive Information into Log File


snyk-io bot commented Mar 10, 2026

Merge Risk: High

This upgrade includes major version changes for Apache Hive from 2.x to 4.x/3.x and a minor upgrade for Apache Hudi. The Hive upgrade is a high-risk, complex migration that requires significant manual intervention and will introduce breaking API changes.

High-Risk Upgrades

1. org.apache.hive:hive-jdbc @ 2.3.4 → 4.0.0
2. org.apache.hive:hive-metastore @ 2.3.4 → 4.0.0
3. org.apache.hive:hive-service @ 2.3.4 → 3.1.0
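In Maven terms, the pom.xml change behind this PR amounts to bumping these three versions. The fragment below is a hedged sketch built only from the coordinates listed above; the real pom may instead declare the versions through a `<properties>` block or a `<dependencyManagement>` section:

```xml
<!-- Hypothetical pom.xml fragment reflecting the upgrades in this PR. -->
<!-- The actual project may centralize these versions elsewhere. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>4.0.0</version> <!-- was 2.3.4 -->
</dependency>
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-metastore</artifactId>
  <version>4.0.0</version> <!-- was 2.3.4 -->
</dependency>
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-service</artifactId>
  <version>3.1.0</version> <!-- was 2.3.4 -->
</dependency>
```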

These packages are part of a major Apache Hive upgrade from version 2.x to 3.x and 4.x. This is not a drop-in replacement and requires a carefully planned migration. Both Hive 2.x and 3.x are now considered End-of-Life (EOL).

Key Breaking Changes & Required Actions:

  • Metastore Schema Upgrade: The Hive Metastore (HMS) requires a manual schema upgrade to be compatible with Hive 3.x and newer versions. This is a critical prerequisite.
  • ACID Table On-Disk Format: The on-disk format for ACID tables changed in Hive 3.0. Any ACID tables that have had UPDATE or DELETE operations must have a major compaction run on them before the upgrade to prevent data corruption.
  • Managed Tables Conversion: In Hive 3, all existing non-ACID managed tables are converted to external tables. This change in behavior can impact table management and data lifecycle scripts.
  • API Breaking Changes: The ThriftHiveMetastore.Client API has breaking changes in version 4.0. For example, the get_table API is deprecated and replaced, and methods like delete_table_column_statistics require a new engine parameter. This will directly impact applications using hive-jdbc and hive-metastore.
  • Execution Engine Deprecation: Support for MapReduce as an execution engine is removed in Hive 3.0 in favor of Apache Tez.
  • Hadoop 3 Requirement: Hive 3.0 and newer require a Hadoop 3.x environment.
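The ACID compaction prerequisite above can be triggered per table with standard Hive DDL. A minimal sketch — the table and partition names here are placeholders, not from this project:

```sql
-- Run a major compaction on every ACID table that has seen UPDATE or
-- DELETE operations, BEFORE upgrading past Hive 3.0.
-- (my_acid_table and the partition spec are placeholders.)
ALTER TABLE my_acid_table COMPACT 'major';

-- Partitioned ACID tables must be compacted per partition:
ALTER TABLE my_acid_table PARTITION (ds = '2026-03-10') COMPACT 'major';

-- Monitor compaction progress:
SHOW COMPACTIONS;
```

Compactions run asynchronously, so verify via SHOW COMPACTIONS that every compaction has finished before starting the metastore schema upgrade.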

Recommendation: This is a major migration effort that requires careful planning, data migration, and code refactoring. It should not be merged without a dedicated migration project to address the prerequisite steps and API changes.

Medium-Risk Upgrades

4. org.apache.hudi:hudi-common @ 1.0.0-SNAPSHOT → 1.1.0

This upgrade moves from a snapshot version to a stable minor release. While it includes significant performance improvements and new features, it also introduces a breaking API change.

Key Breaking Changes & Required Actions:

  • Record Merging API Deprecation: The release deprecates the use of payload classes for record merging in favor of new merge modes and merger APIs. Code using the old payload classes will need to be updated.
  • Performance and Feature Enhancements: The release brings 2-3x write throughput improvements for Flink, 4x faster metadata table reads, and adds support for Spark 4.0.

Recommendation: Review code that implements Hudi record merging logic and update it to use the new, recommended APIs to avoid breakage.
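For writers that configure a payload class today, the migration described above would look roughly like switching to the new merge-mode setting. This is a sketch from memory of the Hudi 1.x configuration surface — verify the exact key names and mode values against the 1.1.0 release notes before relying on them:

```properties
# Deprecated style: a payload class drives record merging.
# hoodie.datasource.write.payload.class=org.apache.hudi.common.model.DefaultHoodieRecordPayload

# Hudi 1.x style: declare a merge mode instead. EVENT_TIME_ORDERING
# approximates DefaultHoodieRecordPayload semantics (latest event wins);
# key names here are assumptions to check against the Hudi 1.x docs.
hoodie.record.merge.mode=EVENT_TIME_ORDERING
hoodie.datasource.write.precombine.field=ts
```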

Notice 🤖: This content was augmented using artificial intelligence. AI-generated content may contain errors and should be reviewed for accuracy before use.


snyk-io bot commented Mar 10, 2026

Snyk checks have failed. 135 issues have been found so far.

Scanner               Critical  High  Medium  Low  Total
Open Source Security  20        115   0       0    135 issues
Licenses              0         0     0       0    0 issues

💻 Catch issues earlier using the plugins for VS Code, JetBrains IDEs, Visual Studio, and Eclipse.
