
UI: Fix allocation count in CSI volumes table #9515

Merged — backspace merged 3 commits into master from b-ui/csi-index-allocations on Dec 7, 2020

Conversation

backspace (Contributor)

As detailed in #9495, the collection query GET /v1/volumes?type=csi doesn’t return ReadAllocs and WriteAllocs, so the # Allocs cell was always showing 0 upon first load because it was derived from the lengths of those arrays. This uses the heretofore-ignored CurrentReaders and CurrentWriters values to calculate the total instead.
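
As a rough illustration of the approach (a sketch only, assuming a stock Ember Data model — the class and attribute names here are not the actual Nomad UI code), the count can be derived from the scalar fields the collection endpoint does return:

```js
// Hypothetical model sketch: derive the table's allocation count from the
// CurrentReaders/CurrentWriters scalars that GET /v1/volumes?type=csi returns,
// instead of from the lengths of ReadAllocs/WriteAllocs, which that endpoint omits.
import Model, { attr } from '@ember-data/model';

export default class VolumeModel extends Model {
  @attr('number') currentReaders;
  @attr('number') currentWriters;

  // Fall back to 0 so a volume that has never reported counts still renders.
  get allocationCount() {
    return (this.currentReaders || 0) + (this.currentWriters || 0);
  }
}
```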

The single-resource query GET /v1/volume/csi%2F:id doesn’t return CurrentReaders and CurrentWriters but you can see here that the missing values don’t cause the already-fetched ones to be deleted.

[Screen recording: csi-alloc-count]
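
For reference, this relies on the store merging partial payloads rather than resetting missing attributes. A minimal sketch of that behavior, assuming stock Ember Data `store.push` semantics (the id and attribute names are illustrative):

```js
// First, the collection response populates the scalar counts.
store.push({
  data: {
    id: 'csi/example-volume',
    type: 'volume',
    attributes: { currentReaders: 2, currentWriters: 1 },
  },
});

// Later, the single-volume response omits currentReaders/currentWriters.
// Because store.push merges attribute payloads, the counts pushed above
// are left intact rather than being overwritten with undefined.
store.push({
  data: {
    id: 'csi/example-volume',
    type: 'volume',
    attributes: { plugin: 'example-plugin' },
  },
});
```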

Thanks to @apollo13 for reporting this in #9495 and to @tgross for the API logs and suggestion.

The three commit messages:

1. The API doesn’t return ReadAllocs and WriteAllocs for the index query, as discussed in #9495 (comment), so this removes that derivation. The allocation count in the table will be 0, as in the bug report, which more accurately reflects the API.
2. The API includes CurrentWriters and CurrentReaders in the collection response, so this uses those instead of counting WriteAllocs and ReadAllocs, which are not returned in the collection.
3. The individual volume response doesn’t include CurrentWriters and CurrentReaders, but their absence doesn’t erase the already-fetched values.

vercel bot commented Dec 3, 2020

This pull request is being automatically deployed with Vercel (learn more).
To see the status of your deployments, click below or on the icon next to each commit.

nomad-storybook-and-ui – ./ui

🔍 Inspect: https://vercel.com/hashicorp/nomad-storybook-and-ui/r7rmzrzwz
Preview: Failed

@tgross (Member) left a comment:

I checked this branch out locally to proxy against the real CSI API and it LGTM! Thanks @backspace!


github-actions bot commented Dec 3, 2020

Ember Asset Size action

As of 94247f3

Files that got Bigger 🚨:

| File | raw | gzip |
|------|-----|------|
| nomad-ui.js | +595 B | +107 B |

Files that stayed the same size 🤷:

| File | raw | gzip |
|------|-----|------|
| vendor.js | 0 B | 0 B |
| nomad-ui.css | 0 B | 0 B |
| vendor.css | 0 B | 0 B |


github-actions bot commented Dec 3, 2020

Ember Test Audit comparison

| | master | 94247f3 | change |
|---|---|---|---|
| passes | 1538 | 1537 | -1 |
| failures | 0 | 0 | 0 |
| flaky | 0 | 1 | +1 |
| duration | 7m 46s 439ms | 8m 10s 630ms | +24s 191ms |

@apollo13 (Contributor) commented Dec 4, 2020

Thank you both. I can also confirm that this PR fixes the issue.

@DingoEatingFuzz (Contributor) left a comment:

Thank you for jumping on this! After reading the original issue, this implementation makes sense to me. :shipit:

@backspace backspace merged commit e84a360 into master Dec 7, 2020
@backspace backspace deleted the b-ui/csi-index-allocations branch December 7, 2020 14:51

github-actions bot commented Dec 7, 2022

I'm going to lock this pull request because it has been closed for 120 days ⏳. This helps our maintainers find and focus on the active contributions.
If you have found a problem that seems related to this change, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Dec 7, 2022