Patch Chronicle Core, Chronicle Map, Spark Unsafe, and Hadoop Client API to improve compatibility with Java 26+ #481
Merged
Conversation
…s since Java 26-ea+6. Reuses the infrastructure used for patching the Hadoop Client API and moves the patching and remapping from the main 'renaissance' project to the 'apacheSparkBenchmarks' project, which removes the need for doing this twice in the 'renaissance' and 'renaissanceJmh' projects. Also moves most of the patching logic from build.sbt into patcher.scala (previously hadoop.scala) to reduce clutter.
Member (Author):
I want to test this on JDK 26-ea, for which we need updated buildenv images, but it seems to be going in the right direction.
This requires a more recent Java to compile the suite, but avoids the complexity of byte code patching. The original code is kept alongside the patched code for reference.
Force-pushed from 1b2d6af to 0c9742c
Uses v17 images for the style/build/readme jobs, but keeps v15 images (with Fedora 41) for the plugins/run jobs, because running Renaissance on JDK 11 in v16 images (with Fedora 42) kept crashing, and compiling plugins with JDK 11 in v17 failed to locate the directory with C header files. Also updates the ea-jdk workflow to use JDK 26-ea to allow testing the JDK 26 compatibility fixes.
Force-pushed from 2bf809f to e9bb43d
This requires using v17 images, but because of the problems with JDK 11 on Fedora 42 in v16, we keep the v15 image for JDK 11 on Linux.
Member (Author):
I finally managed to massage the GHA configuration to my liking (one would not believe the lengths we have to go to just to share a workflow-level setting between multiple job-level keys), and we seem to be running not only on JDK 25, but also on JDK 26-ea.
farquet approved these changes on Oct 7, 2025
Patches Chronicle Core, Chronicle Map, and Spark Unsafe to avoid reflective lookups for internal classes, such as jdk.internal.ref.Cleaner, removed in Java 26-ea+6, which broke the db-shootout and Spark benchmarks on Java 26+. To move away from patching byte code using ASM, the patching process uses tiny dedicated projects containing specific classes modified to remove the offending code. This requires a more recent Java to compile the suite, but avoids the complexity of byte code patching.
The previous patch to Hadoop Client API (#453) has been converted to use the same approach.
The source code of the original (unpatched) versions of the classes has been included alongside the patched versions to make it easier to see the differences if/when we need to update the patches (even though I would like them to disappear).
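The kind of source-level change these patches make can be sketched as follows. This is a hypothetical illustration, not the actual Renaissance patch (the class and method names are made up): the unpatched library code reflectively looked up the internal jdk.internal.ref.Cleaner class, which no longer exists in Java 26-ea+6, while a patched variant can use the public java.lang.ref.Cleaner API available since Java 9.

```java
import java.lang.ref.Cleaner;

// Hypothetical sketch of the patching idea: replace a reflective lookup of a
// removed internal class with the equivalent public API.
public final class CleanerUtil {

    // Original (unpatched) code did something like:
    //   Class<?> cls = Class.forName("jdk.internal.ref.Cleaner");
    // That class was removed in Java 26-ea+6, so the lookup now throws
    // ClassNotFoundException at runtime.

    // Patched variant: use the public java.lang.ref.Cleaner API (Java 9+).
    private static final Cleaner CLEANER = Cleaner.create();

    private CleanerUtil() {}

    // Registers a release action to run when 'resource' becomes phantom
    // reachable, or when clean() is invoked explicitly.
    public static Cleaner.Cleanable register(Object resource, Runnable releaseAction) {
        return CLEANER.register(resource, releaseAction);
    }

    public static void main(String[] args) {
        Object resource = new Object();
        Cleaner.Cleanable cleanable =
                register(resource, () -> System.out.println("released"));
        cleanable.clean(); // explicit release; runs the action exactly once
    }
}
```

Because the change is made in Java source inside a small dedicated project, the patched class can simply be compiled and remapped over the original, with no ASM byte code rewriting involved.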
BTW, there is some good news: there has been progress on HADOOP-19212, and the Hadoop trunk branch (past 3.4.2) started using a SubjectUtil class instead of Subject, so maybe we can drop one of the patches when hadoop-client-api version 3.4.3 gets released.
Closes #474