When using the Updated IDs API calls (/shows/updates/ids/date, /movies/updates/ids/date), the low maximum limit (100) frequently results in having to make many paginated requests.
When the start_date is relatively recent (within a single day), a limit=100 call to /shows/updates/ids easily requires a dozen API calls, and it doesn't take long for that number to shoot up, possibly into the hundreds. Making a request for all updated IDs in the past month (the longest period allowed) returns 592 pages of results. (On top of that, requesting updated IDs for both shows and movies nearly doubles that number.)
This seems absurd, given that each page is an easily cacheable list of integers under 1 kB in size. The overhead ends up drastically outweighing the amount of actual data returned. The end result is either spamming the API servers with dozens of tiny requests simultaneously or waiting ages for a full set of paginated calls to complete one by one.
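To put rough numbers on the overhead, here's a small sketch of how the request count scales with the page-size limit. The total-ID figure is an upper-bound estimate inferred from the 592-page result mentioned above, and the 10,000 limit is the hypothetical cap proposed below, not anything the API currently supports:

```python
import math

def pages_needed(total_ids: int, limit: int) -> int:
    """Number of paginated requests required to fetch all IDs at a given page size."""
    return math.ceil(total_ids / limit)

# 592 full pages at limit=100 implies at most ~59,200 updated IDs in a month.
total = 592 * 100

print(pages_needed(total, 100))     # current cap: 592 requests
print(pages_needed(total, 10_000))  # hypothetical raised cap: 6 requests
```

In other words, raising the cap two orders of magnitude would collapse a month's worth of pagination from hundreds of requests to a handful.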
The solution seems simple to me: raise the upper limit for these API calls significantly (I don't have the knowledge to be sure, but I'd imagine even 10,000 IDs per page wouldn't be unreasonable). Obviously this would be impractical for other parts of the API, but Updated IDs requests are uniquely small, simple, and cacheable, so this seems like an open-and-shut case.
The limit is in place for the backend service that retrieves the data, not for the actual data output. I'll need to experiment and see if we can safely raise the limit. Previously there was a bug that wasn't enforcing the limit and it was causing performance issues.