
Fixing the intermittent Helix tool test failure #1162

Merged: 3 commits into linkedin:master on Apr 30, 2019

Conversation

jsjtzyy (Contributor) commented Apr 29, 2019

The replica size is changed in the testEverything() method. If a new replica of an initial partition is added to a disk on a newly added node, it is not reset when writeBootstrapOrUpgrade() is called, so the second time that method is invoked, cluster-manager validation in the Helix tool sees a different replica size for the same partition and the test fails.
This patch ensures that a new replica of an initial partition can only be added to the original nodes in the HardwareLayout. Those nodes are reset by writeBootstrapOrUpgrade(), which avoids the inconsistent replica size error during cluster-manager validation.
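The placement rule the patch enforces can be sketched as follows. This is a minimal, illustrative example: the class, method, and node names are hypothetical stand-ins, not Ambry's actual API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Sketch of the fix: when a new replica belongs to an initial partition,
// restrict the candidate nodes to the original HardwareLayout nodes, so
// writeBootstrapOrUpgrade() resets them and validation stays consistent.
public class ReplicaPlacementSketch {
  static String pickNodeForNewReplica(boolean isInitialPartition,
                                      List<String> originalNodes,
                                      List<String> newNodes,
                                      Random random) {
    // Replicas of initial partitions must stay on original nodes;
    // replicas of brand-new partitions may also land on new nodes.
    List<String> candidates = new ArrayList<>(originalNodes);
    if (!isInitialPartition) {
      candidates.addAll(newNodes);
    }
    return candidates.get(random.nextInt(candidates.size()));
  }

  public static void main(String[] args) {
    List<String> original = List.of("node0", "node1");
    List<String> added = List.of("node2");
    Random random = new Random(0);
    // An initial partition's new replica always lands on an original node.
    for (int i = 0; i < 100; i++) {
      String node = pickNodeForNewReplica(true, original, added, random);
      if (!original.contains(node)) {
        throw new AssertionError("initial-partition replica on new node: " + node);
      }
    }
    System.out.println("all initial-partition replicas on original nodes");
  }
}
```

Under this rule, only replicas of partitions created during the test can ever occupy the newly added nodes, so repeated writeBootstrapOrUpgrade() calls see the same replica set for each initial partition.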

jsjtzyy added 2 commits April 29, 2019 13:32
@jsjtzyy jsjtzyy self-assigned this Apr 29, 2019
@jsjtzyy jsjtzyy requested a review from pnarayanan April 29, 2019 20:45
codecov-io commented Apr 29, 2019

Codecov Report

Merging #1162 into master will decrease coverage by 0.06%.
The diff coverage is n/a.

@@             Coverage Diff              @@
##             master    #1162      +/-   ##
============================================
- Coverage     69.81%   69.75%   -0.07%     
+ Complexity     5312     5310       -2     
============================================
  Files           420      420              
  Lines         32547    32547              
  Branches       4139     4139              
============================================
- Hits          22724    22703      -21     
- Misses         8700     8711      +11     
- Partials       1123     1133      +10
Impacted Files Coverage Δ Complexity Δ
...java/com.github.ambry.store/CompactionManager.java 87.33% <0%> (-3.34%) 19% <0%> (ø)
...github.ambry.rest/AsyncRequestResponseHandler.java 88.59% <0%> (-2.29%) 23% <0%> (ø)
...b.ambry.network/BlockingChannelConnectionPool.java 70.17% <0%> (-1.76%) 9% <0%> (ø)
...in/java/com.github.ambry.store/BlobStoreStats.java 70.92% <0%> (-1.04%) 103% <0%> (-1%)
...src/main/java/com.github.ambry.commons/BlobId.java 93.18% <0%> (-0.36%) 71% <0%> (-1%)
...java/com.github.ambry.network/SSLTransmission.java 69.42% <0%> (-0.32%) 68% <0%> (-1%)
...ain/java/com.github.ambry.router/PutOperation.java 90.86% <0%> (ø) 107% <0%> (ø) ⬇️
.../java/com.github.ambry.router/DeleteOperation.java 93.61% <0%> (+1.41%) 48% <0%> (+1%) ⬆️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 57fe69b...10f851e.

for (Datacenter dcObj : hardwareLayout.getDatacenters()) {
  dataNodes.addAll(dcObj.getDataNodes());
}
return dataNodes;
A reviewer (Contributor) commented:
Collections.unmodifiable?

@pnarayanan pnarayanan merged commit d458497 into linkedin:master Apr 30, 2019
3 participants