Chat is very slow #1

Open
tripleee opened this issue May 10, 2018 · 1 comment

Comments

@tripleee (Member)

Frequently, when a post is reported, you would like to see the first results from the analysis very quickly: is the domain name blacklisted? Did the URL tail match any keywords? Only then should the more detailed analysis be reported.

The chat interface throttles sending to one message per second. Some messages could be collected into multi-line chat messages to make reports appear quicker (though multi-line messages cannot use formatting -- no bold, italics, links, etc.).
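A minimal sketch of the batching idea, assuming a hypothetical `max_len` message-size limit (the function name and limit are illustrative, not part of any existing code):

```python
def batch_report(lines, max_len=500):
    """Join short plain-text result lines into as few chat messages as possible.

    Each chat message costs one slot in the one-message-per-second throttle,
    so collecting lines into a single multi-line message makes the report land
    sooner.  Multi-line messages lose markdown formatting, so only plain-text
    lines should be batched this way; max_len is an assumed size limit.
    """
    messages, current = [], ""
    for line in lines:
        candidate = f"{current}\n{line}" if current else line
        if len(candidate) > max_len and current:
            messages.append(current)  # flush a full message, start a new one
            current = line
        else:
            current = candidate
    if current:
        messages.append(current)
    return messages
```

For example, `batch_report(["domain: blacklisted", "tail: matched"])` would send one message instead of two, halving the throttle delay for that report.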

Ultimately, I'm thinking the back-end queries should be done using some sort of async framework, so that multiple queries can be pending at the same time and you don't have to wait for them to execute serially before getting the result from the one you actually care about for a particular post.
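A sketch with native `asyncio`, assuming hypothetical per-check coroutines (the check names and delays are made up for illustration); `asyncio.as_completed` lets the fastest check report first instead of waiting for the serial total:

```python
import asyncio

# Hypothetical checks; in practice each would hit a real backend.
async def check_blacklist(domain):
    await asyncio.sleep(0.01)  # stands in for a slow backend query
    return f"blacklist({domain}): hit"

async def check_keywords(tail):
    await asyncio.sleep(0.03)  # a slower query
    return f"keywords({tail}): no match"

async def report(post):
    """Run all checks concurrently; emit each result as soon as it lands."""
    tasks = [asyncio.ensure_future(check_blacklist(post["domain"])),
             asyncio.ensure_future(check_keywords(post["tail"]))]
    results = []
    for finished in asyncio.as_completed(tasks):
        results.append(await finished)  # fastest check reports first
    return results
```

Running `asyncio.run(report({"domain": "example.com", "tail": "buy-cheap"}))` finishes in roughly the time of the slowest check rather than the sum of all of them, and the blacklist result is available before the keyword scan completes.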

@tripleee (Member, Author)

https://pawelmhm.github.io/asyncio/python/aiohttp/2016/04/22/asyncio-aiohttp.html is inspiring, but perhaps not the primary candidate for the architecture now that Python has native async.
