Is your feature request related to a problem? Please describe the problem.
Is there already a way to get all items for a CollectionResponse? Currently the API/SDK returns only a page of results, and then we need to fetch more by using odata_next_link.
Is there a way to get all results in a single request?
I have made a helper function which I use. Is this the best way?
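For reference, here is a minimal sketch of what such a helper might look like. The names and shapes are hypothetical: the stub below only mimics a request builder's `get()`/`with_url()` interface and a collection response's `value`/`odata_next_link` attributes for illustration, so the example runs without the real SDK.

```python
import asyncio
from types import SimpleNamespace
from typing import Any, List

async def get_complete_paginated_list(request_builder) -> List[Any]:
    """Collect items from every page by following odata_next_link."""
    items: List[Any] = []
    page = await request_builder.get()
    while page is not None:
        items.extend(page.value or [])
        if page.odata_next_link:
            # Re-target the request at the next page's URL and fetch it.
            page = await request_builder.with_url(page.odata_next_link).get()
        else:
            page = None
    return items

# Stub standing in for a Graph request builder (illustration only):
class _StubBuilder:
    def __init__(self, pages, url=None):
        self._pages, self._url = pages, url
    async def get(self):
        return self._pages[self._url]
    def with_url(self, url):
        return _StubBuilder(self._pages, url)

# Two fake pages linked by odata_next_link:
pages = {
    None: SimpleNamespace(value=[1, 2], odata_next_link="page2"),
    "page2": SimpleNamespace(value=[3], odata_next_link=None),
}
result = asyncio.run(get_complete_paginated_list(_StubBuilder(pages)))
print(result)  # [1, 2, 3]
```

The helper keeps accumulating `page.value` until a page arrives with no next link, so the caller gets one flat list regardless of how many pages the server returned.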
Hello @jussihi, thanks for using the SDK and for raising this.
Imagine a situation where a request yields thousands or even millions of records; fetching all of these in one API call is not feasible. To solve this, Graph returns paginated responses, with subsequent pages obtained using odata_next_link.
You can loop through these pages and collect their contents in memory if that fits your specific need. However, building a feature that fetches all items in one go would not be a good idea in situations where an entity has a very high number of records.
Ah! So such a feature exists :) Could you give an example of how to use the PageIterator? This would be a good addition to the docs; if I had known about its existence, I would have used it, I think.
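While waiting for an official example, the general PageIterator pattern can be illustrated with a minimal stand-in. To be clear, this is not the SDK's actual class or API; it is just a sketch, under assumed names, of how callback-driven iteration over odata_next_link-chained pages works:

```python
import asyncio
from types import SimpleNamespace

class SimplePageIterator:
    """Minimal stand-in for a PageIterator (illustrative only).

    Invokes `callback` once per item, fetching the next page via
    odata_next_link until pages run out or the callback returns False.
    """
    def __init__(self, first_page, fetch_page):
        self._page = first_page
        self._fetch_page = fetch_page  # async: next_link -> page object

    async def iterate(self, callback):
        page = self._page
        while page is not None:
            for item in page.value or []:
                if callback(item) is False:
                    return  # callback may stop iteration early
            link = page.odata_next_link
            page = await self._fetch_page(link) if link else None

# Fake pages standing in for Graph collection responses:
pages = {"p2": SimpleNamespace(value=["c"], odata_next_link=None)}
first = SimpleNamespace(value=["a", "b"], odata_next_link="p2")

async def fetch_page(link):
    return pages[link]

seen = []
asyncio.run(SimplePageIterator(first, fetch_page).iterate(seen.append))
print(seen)  # ['a', 'b', 'c']
```

The advantage over collecting everything into one list is that the callback processes items as pages arrive, and can return False to stop early, so memory use stays bounded even for very large collections.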
The helper function can then be used like this:

mailfolders: List[MailFolder] = await get_complete_paginated_list(
    graph_client.users.by_user_id(user_id).mail_folders
)
Describe the solution you'd like.
Something like what I've made.
Additional context?
No response