fix: Enable maven-enforcer-plugin for both spark2 and spark3 profiles #26559
base: master
Conversation
…th spark2 and spark3
aaneja
left a comment
Short first pass
In general, can we try to figure out whether we should downgrade a library or upgrade the module bytecode level instead of adding excludes?
<enforceBytecodeVersion>
    <maxJdkVersion>20</maxJdkVersion>
    <excludes combine.children="append">
        <exclude>org.jgrapht:jgrapht-core</exclude>
The reason we need to exclude jgrapht-core is that we upgraded to version 1.5.2, which requires JDK 11+. However, for presto-main-base we want to keep the JDK level at 8.
@sumi-mathew Would we be okay with downgrading this library to 1.4.0, which was the last JDK 8-compatible version?
Yes, we only upgraded to the latest version as part of adopting newer libraries, so reverting to 1.4.0 should be fine.
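If we go the downgrade route, a minimal sketch of the change might look like the following (where exactly the version is declared in the Presto poms is an assumption; it may be a plain version bump rather than a dependencyManagement entry):

<!-- Sketch only: pin jgrapht-core back to 1.4.0, the last Java 8 compatible release,
     so the enforcer exclude above becomes unnecessary. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.jgrapht</groupId>
            <artifactId>jgrapht-core</artifactId>
            <version>1.4.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>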
<enforceBytecodeVersion>
    <maxJdkVersion>20</maxJdkVersion>
    <excludes combine.children="append">
        <exclude>org.apache.yetus:audience-annotations:jar</exclude>
We need this because this upgrade accidentally moved us to a version that requires JDK 11+.
Since presto-kudu is unlikely to be used with Spark (can you check internally, @wangkewen?), I propose we roll forward with this and set the JDK level for this module to 17+.
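If we roll forward as proposed, a minimal sketch of raising the module's language level could be the following (it uses the standard maven-compiler-plugin maven.compiler.release property; whether Presto's parent pom configures the compiler this way is an assumption):

<!-- Sketch only: compile presto-kudu for Java 17 so its JDK 11+ dependencies no longer
     need to be excluded from the bytecode-version check. -->
<properties>
    <maven.compiler.release>17</maven.compiler.release>
</properties>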
Thanks for your comments, @aaneja. I will update it.
Took an initial look. It seems that for the duration these checks were disabled, multiple changes sneaked into the Presto project that are now being flagged by the enforcer. It might be infeasible for one person to hunt down all of the changes, and this PR will likely need community support to get it across. I agree with @aaneja, though, on trying to fix the dependency spec instead of excluding, at least as much as possible. Let's start with the ones that @aaneja already identified and keep making progress on this, @wangkewen.
In the interest of time, I am fine if we fix the issues we already identified and create issues for the rest. We can fix those before the next release.
Description
This PR fixes issue #26465 by enabling the maven-enforcer-plugin and duplicate-finder-maven-plugin for both the spark2 and spark3 profiles.
Motivation and Context
The checks performed by maven-enforcer-plugin and duplicate-finder-maven-plugin should no longer be skipped, which is required to fix issue #26465.
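As a rough illustration of the intended change, here is a minimal sketch of one of the profiles (the skip property names enforcer.skip and duplicate-finder.skip, and the exact profile layout, are assumptions about how the Presto poms wire these plugins; the spark2 profile would be changed the same way):

<!-- Sketch only: stop skipping the checks when the spark3 profile is active. -->
<profile>
    <id>spark3</id>
    <properties>
        <enforcer.skip>false</enforcer.skip>
        <duplicate-finder.skip>false</duplicate-finder.skip>
    </properties>
</profile>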
Impact
Test Plan
Spark2 profile (default)
Spark3 profile
Contributor checklist
Release Notes
Please follow release notes guidelines and fill in the release notes below.
If release note is NOT required, use: