
API virtualization/interfaces returns duplicate results when paginating #4926

Closed
volans- opened this issue Jul 29, 2020 · 0 comments


volans- commented Jul 29, 2020

Environment

  • Python version: 3.7.3
  • NetBox version: 2.8.8

Steps to Reproduce

  1. Have some virtual machines with related interfaces; make sure there are more than PAGINATE_COUNT interfaces.
  2. Call the API endpoint /api/virtualization/interfaces/
  3. Get the count value; in my case it was 362.
  4. Make a second call for all the remaining objects: /api/virtualization/interfaces/?limit=362&offset=50 (where offset is the value of PAGINATE_COUNT).
  5. Merge the results together, checking for duplicates (see the sketch below).

Note: this is how pynetbox performs the calls when asking for all objects, and it is how I discovered the bug.
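For reference, here is a rough sketch of the equivalent calls using plain requests instead of pynetbox (NETBOX_URL and NETBOX_TOKEN are just placeholder environment variables for this example):

```python
# Rough equivalent of the pynetbox behaviour, for reproduction purposes only.
import os

import requests

NETBOX_URL = os.environ["NETBOX_URL"].rstrip("/")      # e.g. https://netbox.example.com
HEADERS = {"Authorization": "Token " + os.environ["NETBOX_TOKEN"]}
ENDPOINT = NETBOX_URL + "/api/virtualization/interfaces/"

# First call: default page size (PAGINATE_COUNT, 50 here).
first = requests.get(ENDPOINT, headers=HEADERS).json()
count = first["count"]                                 # 362 in my case
page1 = first["results"]

# Second call: everything else, starting after the first page.
second = requests.get(
    ENDPOINT,
    headers=HEADERS,
    params={"limit": count, "offset": len(page1)},
).json()
page2 = second["results"]

# Merge and check for duplicates by primary key.
ids = [obj["id"] for obj in page1 + page2]
print("fetched:", len(ids), "unique:", len(set(ids)))
```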

Expected Behavior

Get all 362 virtual machine interfaces, with every result unique.

Observed Behavior

I got 50 results from the first page and 312 from the second, which should have made 362 in total, but upon inspection only 332 of them were unique: 30 results in the second page were duplicates already present in the first API call.
It seems the results are returned without any sorting, hence the unpredictable behaviour.
I've also tried running invalidate all to clear the cache.
I've verified that making a single call with /api/virtualization/interfaces/?limit=362 returns all 362 results as expected.
Retrying with a clean cache, I got the exact same 30 duplicates, so the behaviour doesn't seem to be totally random or unpredictable.
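For context on the missing sorting mentioned above: with PostgreSQL, a LIMIT/OFFSET query over a queryset with no ORDER BY has no guaranteed row order, so consecutive pages can overlap or skip rows. Below is a minimal Django sketch of the problem and the kind of fix; the VMInterface model here is a hypothetical stand-in, not NetBox's actual code.

```python
# Illustrative Django sketch only; the model below is a hypothetical stand-in.
from django.db import models


class VMInterface(models.Model):
    name = models.CharField(max_length=64)

    class Meta:
        app_label = "example"
        # Fix: a deterministic ordering on a unique column keeps
        # LIMIT/OFFSET pages from overlapping.
        ordering = ["pk"]


# Unstable pagination: order_by() with no arguments clears any ordering, so
# the database may return rows in a different order for each query and
# consecutive pages can overlap.
unstable_page = VMInterface.objects.order_by()[50:412]   # limit=362, offset=50

# Stable pagination: explicit ordering on the primary key.
stable_page = VMInterface.objects.order_by("pk")[50:412]
```

Note that ordering on a non-unique column alone would not be enough, since rows that tie on that column can still move between pages, which could explain why the exact same 30 duplicates reappear.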

@jeremystretch jeremystretch added status: accepted This issue has been accepted for implementation type: bug A confirmed report of unexpected behavior in the application labels Jul 30, 2020
@jeremystretch jeremystretch self-assigned this Jul 30, 2020
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 5, 2020