Caching request responses to prevent hitting rate limits #61
Comments
Caching could be done, but how should we implement it? Would simple caching in the form of a text file be OK, or would we want something like Redis?
I think an HTML/JSON/XML file will be OK. We should implement caching after #68 is implemented, because this will probably take longer to work out.
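A minimal sketch of what the file-based approach suggested above could look like, assuming Python. The filename, function names, and TTL constant are all hypothetical; the 60-second window follows the issue's "within say 1 minute" suggestion.

```python
import json
import os
import time

CACHE_FILE = "starcli_cache.json"  # hypothetical cache location
CACHE_TTL = 60  # seconds; the issue suggests roughly a 1-minute window


def save_cache(key, data, cache_file=CACHE_FILE):
    """Store a response under `key`, along with the time it was cached."""
    cache = {}
    if os.path.exists(cache_file):
        with open(cache_file) as f:
            cache = json.load(f)
    cache[key] = {"time": time.time(), "data": data}
    with open(cache_file, "w") as f:
        json.dump(cache, f)


def load_cache(key, cache_file=CACHE_FILE, ttl=CACHE_TTL):
    """Return the cached data for `key`, or None if missing or expired."""
    if not os.path.exists(cache_file):
        return None
    with open(cache_file) as f:
        cache = json.load(f)
    entry = cache.get(key)
    if entry is None or time.time() - entry["time"] > ttl:
        return None
    return entry["data"]
```

A plain JSON file like this keeps the dependency footprint at zero, which matters for a small CLI; Redis would only pay off if the cache had to be shared across machines or processes.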
Hi there! I would like to contribute for Hacktoberfest 2020. Is this issue still up for grabs?
Yes, it's up for grabs; thanks for your interest.
If it is still up for grabs, I'd like to work on it.
Hey @hedythedev, I have added a pull request for this issue. Please have a look.
Description
The GitHub API has rate limits, and sometimes when using starcli it takes 10+ seconds to see the output. We should cache the responses keyed by the options used (topic, stars, dates, etc.), so that when the user runs the same query again within, say, 1 minute, starcli can return the cached info instead of sending the request again.
Why is it necessary? (or how would it make it better)
This improves execution speed/performance and helps avoid hitting the rate limit.
How do you think it should be implemented? (if possible)
Every time search is called, we cache the response together with the options used, so that when the same options are used again, we output the cached data instead.
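One way to sketch the step above, assuming Python: serialize the options deterministically into a key, then wrap the fetch in a lookup with a timestamp check. The names `cache_key`, `cached_search`, and `fetch` are hypothetical, not starcli's actual API.

```python
import hashlib
import json
import time

CACHE_TTL = 60  # seconds, per the "within say 1 minute" suggestion
_cache = {}  # in-memory store: key -> (timestamp, result)


def cache_key(**options):
    """Build a deterministic key from the search options (topic, stars, dates, ...)."""
    canonical = json.dumps(options, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()


def cached_search(fetch, **options):
    """Return cached results for these options if still fresh, else call `fetch`."""
    key = cache_key(**options)
    entry = _cache.get(key)
    if entry is not None and time.time() - entry[0] < CACHE_TTL:
        return entry[1]  # cache hit: skip the API request entirely
    result = fetch(**options)
    _cache[key] = (time.time(), result)
    return result
```

Sorting the keys in `json.dumps` matters: it makes `topic="python", stars=">100"` and `stars=">100", topic="python"` hash to the same cache entry. Swapping the in-memory dict for the JSON file discussed in the comments would make the cache survive between starcli invocations.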