Java 17 support #2248

Open · wants to merge 1 commit into base: main
4 changes: 2 additions & 2 deletions .github/workflows/ci.yaml
@@ -17,7 +17,7 @@ jobs:
strategy:
fail-fast: false
matrix:
-java-ver: [11]
+java-ver: [11, 17]
java-dist: ['microsoft', 'temurin']
steps:
- uses: actions/checkout@v4
@@ -39,7 +39,7 @@ jobs:
strategy:
fail-fast: false
matrix:
-java-ver: [11]
+java-ver: [11, 17]
java-dist: ['microsoft', 'temurin']
steps:
- uses: actions/checkout@v4
4 changes: 2 additions & 2 deletions README.md
@@ -60,7 +60,7 @@ Cruise Control for Apache Kafka
* The `kafka_2_0_to_2_3` and `kafka_0_11_and_1_0` branches compile with `Scala 2.11`.
* The branch `migrate_to_kafka_2_4` compiles with `Scala 2.12`.
* The branch `migrate_to_kafka_2_5` compiles with `Scala 2.13`.
-* This project requires Java 11.
+* This project requires Java 11 and is also compatible with Java 17.

#### Known Compatibility Issues ####
* Support for Apache Kafka `2.0`, `2.1`, `2.2`, and `2.3` requires [KAFKA-8875](https://issues.apache.org/jira/browse/KAFKA-8875) hotfix.
@@ -77,7 +77,7 @@ Cruise Control for Apache Kafka
&& git tag -a 0.1.10 -m "Init local version."`
1. This step is required if `CruiseControlMetricsReporter` is used for metrics collection (i.e. the default for Cruise
Control). The metrics reporter periodically samples the Kafka raw metrics on the broker and sends them to a Kafka topic.
-* `./gradlew jar` (Note: This project requires Java 11)
+* `./gradlew jar` (Note: This project requires Java 11 or 17)
* Copy `./cruise-control-metrics-reporter/build/libs/cruise-control-metrics-reporter-A.B.C.jar` (Where `A.B.C` is
the version of the Cruise Control) to your Kafka server dependency jar folder. For Apache Kafka, the folder would
be `core/build/dependant-libs-SCALA_VERSION/` (for a Kafka source checkout) or `libs/` (for a Kafka release download).
22 changes: 18 additions & 4 deletions build.gradle
@@ -8,7 +8,7 @@ plugins {
id "idea"
id "jacoco" // Java Code Coverage plugin
id "com.github.ben-manes.versions" version "0.42.0"
-id "com.github.spotbugs" version "6.0.6" apply false
+id "com.github.spotbugs" version "6.0.25" apply false
id "checkstyle"
id "org.openapi.generator" version "5.3.0"
}
@@ -45,6 +45,18 @@ project.ext {
buildVersionFileName = "cruise-control-version.properties"
commitId = project.hasProperty('commitId') ? commitId : null
scalaBinaryVersion = getScalaBinaryVersion(scalaVersion)
+
+defaultTestJvmArgs = []
+// "JEP 403: Strongly Encapsulate JDK Internals" causes some tests to fail when they try
+// to access internals (often via mocking libraries). We use `--add-opens` as a workaround
Review thread on the `--add-opens` comment:

viktorsomogyi (Contributor): often or always? As I see it, it is only used in the test task definition, so perhaps we could rephrase this.

Contributor Author: I think it's more about the occurrences than about temporal frequency; of course those tests always fail without the `--add-opens`. BTW, I copied it from Kafka :D. In most cases the tests were failing because of EasyMock and PowerMock specifically, but I have no clue why exactly `java.time` had to be opened; Gradle wasn't kind enough to give me more information about the failing test(s) before I added it.

Contributor Author: In Kafka, EasyMock/PowerMock has been replaced with Mockito over time. It's still not finished, but that's the way to get rid of these `--add-opens` later on.

viktorsomogyi (Contributor), Feb 3, 2025: OK, makes sense. I agree that we should rather use Mockito; however, that comes with some test refactoring as well, so let's not do that now.
+if (JavaVersion.current().isCompatibleWith(JavaVersion.VERSION_16))
+defaultTestJvmArgs.addAll(
+"--add-opens=java.base/java.lang.reflect=ALL-UNNAMED",
+"--add-opens=java.base/java.lang=ALL-UNNAMED",
+"--add-opens=java.base/java.time=ALL-UNNAMED",
+"--add-opens=java.base/java.util.concurrent=ALL-UNNAMED",
+"--add-opens=java.base/java.util=ALL-UNNAMED"
+)
}
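The guarded `--add-opens` list above works around "JEP 403: Strongly Encapsulate JDK Internals". A minimal sketch of the failure mode those flags paper over (the probe class below is hypothetical, not part of this PR): deep reflection into a `java.base` package succeeds on JDK 11 (with a warning), but on JDK 16+ it throws `InaccessibleObjectException` unless the package is opened to the caller's module.

```java
import java.lang.reflect.Field;

public class Jep403Probe {
    // Tries to open a private field inside java.base via reflection.
    // On JDK 16+ without --add-opens=java.base/java.lang=ALL-UNNAMED this
    // throws InaccessibleObjectException, which we catch and report.
    static String describeAccess() {
        try {
            Field value = String.class.getDeclaredField("value");
            value.setAccessible(true); // blocked by strong encapsulation on JDK 16+
            return "java.lang reflective access: open";
        } catch (RuntimeException e) {
            // InaccessibleObjectException extends RuntimeException
            return "java.lang reflective access: blocked (" + e.getClass().getSimpleName() + ")";
        } catch (NoSuchFieldException e) {
            return "java.lang reflective access: field not found";
        }
    }

    public static void main(String[] args) {
        System.out.println(describeAccess());
    }
}
```

Running this with and without the `--add-opens` flag shows why the build adds the arguments only to the test JVM: production code does not need them, but EasyMock/PowerMock-based tests do.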

allprojects {
@@ -104,7 +116,7 @@ subprojects {
}

spotbugs {
-toolVersion = '4.7.1'
+toolVersion = '4.8.6'
excludeFilter = file("$rootDir/gradle/findbugs-exclude.xml")
ignoreFailures = false
jvmArgs = [ '-Xms512m' ]
@@ -137,6 +149,8 @@ subprojects {
}

test {
+jvmArgs += defaultTestJvmArgs
+
useJUnit {}
testLogging {
events "passed", "failed", "skipped"
@@ -175,7 +189,7 @@ project(':cruise-control-core') {
implementation "org.apache.logging.log4j:log4j-slf4j-impl:2.17.2"
implementation 'org.apache.commons:commons-math3:3.6.1'
api "org.eclipse.jetty:jetty-servlet:${jettyVersion}"
-implementation 'com.google.code.findbugs:jsr305:3.0.2'
+implementation 'com.github.spotbugs:spotbugs-annotations:4.8.6'

api "io.vertx:vertx-core:${vertxVersion}"
api "io.vertx:vertx-web:${vertxVersion}"
@@ -311,7 +325,7 @@ project(':cruise-control') {
testImplementation project(path: ':cruise-control-core', configuration: 'testOutput')
testImplementation "org.scala-lang:scala-library:$scalaVersion"
testImplementation 'junit:junit:4.13.2'
-testImplementation 'org.easymock:easymock:4.3'
+testImplementation 'org.easymock:easymock:5.5.0'
testImplementation "org.apache.kafka:kafka_$scalaBinaryVersion:$kafkaVersion:test"
testImplementation "org.apache.kafka:kafka-clients:$kafkaVersion:test"
testImplementation 'commons-io:commons-io:2.11.0'
@@ -298,8 +298,10 @@ private void loadCapacities() throws FileNotFoundException {
Set<BrokerCapacity> brokerCapacities = ((BrokerCapacities) gson.fromJson(reader, BrokerCapacities.class)).brokerCapacities;
capacitiesForBrokers = new HashMap<>();
Set<Boolean> numCoresConfigConsistency = new HashSet<>();
-for (BrokerCapacity bc : brokerCapacities) {
-capacitiesForBrokers.put(bc.brokerId, getBrokerCapacityInfo(bc, numCoresConfigConsistency));
+if (brokerCapacities != null) {
+for (BrokerCapacity bc : brokerCapacities) {
+capacitiesForBrokers.put(bc.brokerId, getBrokerCapacityInfo(bc, numCoresConfigConsistency));
+}
}
} finally {
try {
@@ -84,8 +84,10 @@ private Map<String, Set<Integer>> loadBrokerSetData() throws IOException {
final BrokerSets brokerSets = gson.fromJson(reader, BrokerSets.class);
final Set<BrokerSet> brokerSetSet = brokerSets.brokerSets;
final Map<String, Set<Integer>> brokerIdsByBrokerSetId = new HashMap<>();
-for (BrokerSet brokerSet : brokerSetSet) {
-brokerIdsByBrokerSetId.put(brokerSet.brokerSetId, brokerSet.brokerIds);
+if (brokerSetSet != null) {
+for (BrokerSet brokerSet : brokerSetSet) {
+brokerIdsByBrokerSetId.put(brokerSet.brokerSetId, brokerSet.brokerIds);
+}
}
return brokerIdsByBrokerSetId;
}
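Both JSON-loading hunks above add the same guard: `Gson#fromJson` returns null when the input is empty, so iterating the deserialized collection field would throw a `NullPointerException`. A stdlib-only sketch of the guard pattern (class and method names are hypothetical; Gson itself is omitted so the example stays self-contained):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

public class NullGuardSketch {
    // Mirrors the PR's fix: tolerate a null collection (as Gson#fromJson
    // yields null for an empty input stream) instead of throwing an NPE.
    static Map<String, Set<Integer>> index(Set<Map.Entry<String, Set<Integer>>> entries) {
        Map<String, Set<Integer>> byId = new HashMap<>();
        if (entries != null) { // the guard added in this PR
            for (Map.Entry<String, Set<Integer>> e : entries) {
                byId.put(e.getKey(), e.getValue());
            }
        }
        return byId; // empty map when the input was null
    }
}
```

With the guard, a missing or empty config file degrades to an empty mapping rather than a crash during startup.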
@@ -15,6 +15,7 @@
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.attribute.PosixFilePermission;
import java.util.HashMap;
import java.util.HashSet;
@@ -118,7 +119,7 @@ String loadPersistedFailedBrokerList() {
String failedBrokerListString = null;
try {
failedBrokerListString = readFileToString(_failedBrokersFile, StandardCharsets.UTF_8);
-} catch (FileNotFoundException fnfe) {
+} catch (FileNotFoundException | NoSuchFileException e) {
// This means no previous failures have ever been persisted in the file.
failedBrokerListString = "";
} catch (IOException ioe) {
@@ -98,7 +98,7 @@ public static Collection<Object[]> buildParameters() {
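The widened catch above matters because NIO-based file reads (which `commons-io` delegates to) report a missing file as `java.nio.file.NoSuchFileException`, not `java.io.FileNotFoundException`; both extend `IOException`, but neither is a subclass of the other, so a catch written for `FileNotFoundException` alone silently misses the NIO case. A small stdlib sketch of the same pattern (names hypothetical):

```java
import java.io.FileNotFoundException;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;

public class MissingFileProbe {
    // NIO reads surface a missing file as NoSuchFileException, so the
    // multi-catch covers both the legacy and the NIO exception types.
    static String readOrDefault(Path p) {
        try {
            return Files.readString(p, StandardCharsets.UTF_8);
        } catch (FileNotFoundException | NoSuchFileException e) {
            return ""; // treat "never persisted" as an empty list, as the PR does
        } catch (IOException e) {
            throw new IllegalStateException("unexpected I/O failure", e);
        }
    }
}
```

Without the `NoSuchFileException` arm, a first-ever run (no persisted failed-broker file) would escalate to the generic `IOException` path instead of being treated as "no previous failures".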
new IntraBrokerDiskCapacityGoal(true),
cluster,
kafkaCruiseControlConfig,
-replicaLoad != TestConstants.LARGE_BROKER_CAPACITY / 2 * balancingConstraint.capacityThreshold(Resource.DISK)
+Math.abs(replicaLoad - TestConstants.LARGE_BROKER_CAPACITY / 2 * balancingConstraint.capacityThreshold(Resource.DISK)) > .0000001
});
}

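The test fix above replaces an exact `!=` on a derived `double` with a tolerance check, since `capacity / 2 * threshold` accumulates binary rounding error and exact comparison can flip depending on JDK or evaluation order. A sketch of the idiom (the class and epsilon value here are illustrative, chosen to match the PR's `.0000001`):

```java
public class EpsilonCompare {
    static final double EPSILON = 1e-7; // tolerance, as in the PR's .0000001

    // Exact ==/!= on computed doubles is unreliable: 0.1 + 0.2 != 0.3 in binary.
    static boolean nearlyEqual(double a, double b) {
        return Math.abs(a - b) <= EPSILON;
    }

    public static void main(String[] args) {
        System.out.println(0.1 + 0.2 == 0.3);            // false: rounding error
        System.out.println(nearlyEqual(0.1 + 0.2, 0.3)); // true
    }
}
```

The same comparison-with-tolerance shape is what the updated `buildParameters()` expression computes inline.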
4 changes: 4 additions & 0 deletions gradle/findbugs-exclude.xml
@@ -79,6 +79,10 @@
<Bug pattern="SF_SWITCH_FALLTHROUGH" />
</Match>

+<Match>
+<Bug pattern="CT_CONSTRUCTOR_THROW"/>
+</Match>
+
<!-- False positive of DMI_RANDOM_USED_ONLY_ONCE (see https://github.com/spotbugs/spotbugs/issues/1539) -->
<Match>
<Bug pattern="DMI_RANDOM_USED_ONLY_ONCE, ..." />