forked from tindie/pydiscourse
Users: Handle the case there are more than 100 results #19
The simple workaround/solution is to check whether you got 100 users back and, if so, try another download. I usually do something like this:
When it stops returning results it breaks out (sorry, I can't look up a real example right now).
One thing we could implement here is a resource generator, so that the number of items or pages is irrelevant to the end user/application.
On Mon, Jul 15, 2019, at 8:05 AM, Karl Goetz wrote:

> I usually do something like this:
>
> ```python
> counter = 1
> while counter:
>     # download_things() stands in for whatever call fetches the next batch
>     downloaded_things = download_things()
>     counter = len(downloaded_things)
> ```
>
> When it stops returning results it breaks out.
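The resource-generator idea suggested above could look roughly like the sketch below. This is a minimal sketch, assuming a page size of 100; `fetch_page` is a hypothetical stand-in for whatever call retrieves one page of users (e.g. a wrapper around `client.users`):

```python
def iter_all(fetch_page, page_size=100):
    """Yield items one at a time, fetching page after page, so the
    caller never has to think about pagination."""
    page = 0
    while True:
        batch = fetch_page(page)
        if not batch:                # empty page: nothing left to fetch
            break
        yield from batch
        if len(batch) < page_size:   # short page: this was the last one
            break
        page += 1

# Demo against a fake 250-item "API" (no network involved):
fake_users = [f"user{i}" for i in range(250)]

def fake_fetch(page, size=100):
    return fake_users[page * size:(page + 1) * size]

assert list(iter_all(fake_fetch)) == fake_users
```

With something like this in the library, callers could iterate over all users without ever knowing how many pages Discourse returned.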
My apologies if this is already handled, but I couldn't find how to do it in the code.
When I want my Discourse users, client.users() only gives me the first 100, even though I have more than 100.
I've looked at the docs, and the API doesn't seem to indicate that there are more users, but we can use the "page" param to fetch the next ones.
I guess the solution would be to loop over the pages until a call returns an empty list.
I'll try to do a PR if I manage to find a solution.
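The loop described above might look like the sketch below. It assumes `client.users()` forwards a `page` keyword to the API as a query parameter (which pydiscourse may or may not expose); the `fetch_users` callable is a hypothetical seam so the loop itself can be shown without a live Discourse instance:

```python
def get_all_users(fetch_users):
    """Accumulate every user by requesting successive pages until a
    call comes back with an empty list."""
    all_users = []
    page = 0
    while True:
        batch = fetch_users(page)   # e.g. lambda p: client.users(page=p)
        if not batch:               # empty list: no more pages
            return all_users
        all_users.extend(batch)
        page += 1

# Demo with three fake pages of 100, 100 and 30 users:
def fake_fetch(page):
    pages = [list(range(100)), list(range(100)), list(range(30))]
    return pages[page] if page < len(pages) else []

assert len(get_all_users(fake_fetch)) == 230
```

Against a real client this would be invoked as something like `get_all_users(lambda p: client.users(page=p))` (again, assuming `users()` accepts a `page` argument).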