Measure IO time spend when running the unit tests #2215
Small update: I ran the unit tests a while ago and profiled the sqlitecachedb class of Tribler while the Tribler unit tests ran (Dispersy also runs during these tests, but is not captured). The results indicated that only around 1-2% of the time was spent on IO, which is not significant at all. Dispersy performs a lot more IO and therefore has more impact. Running the Dispersy unit tests may give better insight, but won't simulate real behavior.
Running only the Dispersy unit tests and profiling Dispersy shows similar results. Here is a screenshot filtering on database items only: the .incl column is the percentage of time spent in that function, which amounts to ~4%. So the unit tests do not perform much database activity (as somewhat expected).
RIP Dispersy
To gain insight into the IO time consumed by Tribler, I will use yappi to profile the functions in database managers such as
sqlitecachedb
to measure the time spent on IO when running the unit tests.
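The issue proposes measuring what fraction of unit-test time goes to database IO. A minimal sketch of that idea, using the standard-library cProfile instead of yappi (so it runs without extra dependencies) and a hypothetical `run_db_workload` function standing in for a unit-test run that exercises sqlitecachedb:

```python
import cProfile
import pstats
import sqlite3
import tempfile
import os

def run_db_workload(db_path):
    # Hypothetical stand-in for a test run that hits the SQLite cache database.
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS t (k INTEGER, v TEXT)")
    conn.executemany("INSERT INTO t VALUES (?, ?)",
                     [(i, "x" * 64) for i in range(1000)])
    conn.commit()
    conn.close()

# Profile the workload.
profiler = cProfile.Profile()
profiler.enable()
with tempfile.TemporaryDirectory() as d:
    run_db_workload(os.path.join(d, "cache.db"))
profiler.disable()

# Sum the internal time (tt) of all sqlite3 calls and compare it to the
# total profiled time, roughly what the .incl percentage shows in yappi.
stats = pstats.Stats(profiler)
db_time = sum(tt for key, (cc, nc, tt, ct, callers) in stats.stats.items()
              if "sqlite3" in str(key))
io_fraction = db_time / stats.total_tt if stats.total_tt else 0.0
print("fraction of time in sqlite3 calls: %.1f%%" % (100 * io_fraction))
```

With yappi the same filtering would be done on the collected function stats (e.g. matching function names against the database manager modules); the fraction computed here corresponds to the 1-4% figures reported above.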