
Double all timeouts on flutter_tools integration tests #20872

Merged Aug 22, 2018

Conversation

@DanTup (Contributor) commented Aug 21, 2018

Due to CPU contention, we've seen these run quite slowly on Cirrus (see #19542 (comment)), and there's also a chance our flakes are timeouts caused by tests running slowly rather than hanging.

@tvolkert (Contributor) left a comment:

LGTM

@Hixie (Contributor) commented Aug 21, 2018

@fkorotkov the Windows bot here seemed to time out after all the tests finished, which is a new kind of failure mode...

@fkorotkov (Contributor)

@Hixie, sorry about that. That is very weird. Do you know whether running the tests involves creating daemon processes? I found a few Go issues that show the same behaviour. Investigating...

@DanTup (Contributor, Author) commented Aug 21, 2018

We have had issues in the past where we didn't terminate processes correctly, though normally that happens on failure (and it seems there weren't any failures here). Are you doing something that would wait for background processes, rather than just waiting for the main script to end?

Restarting this one to make it green before merging; I believe the failed run logs can still be accessed if required.

@Hixie (Contributor) commented Aug 21, 2018

Yeah we could definitely have runaway daemon processes. Is it possible to dump all running processes to the logs on termination, or some such?

@fkorotkov (Contributor) commented Aug 21, 2018

@Hixie @DanTup it seems this might be related to golang/go#13155. I've added some extra checks to work around it. Will deploy shortly.

Update: deployed.

@DanTup (Contributor, Author) commented Aug 22, 2018

@fkorotkov I think I just saw the same again here:

https://cirrus-ci.com/task/5300252680650752

If you think it's us leaving processes around, let me know and maybe we can dump the process list at the start/end of the run to debug (it might be a neat feature if Cirrus failed with "script finished but these new processes were left hanging around:.."!).

@fkorotkov (Contributor)

@DanTup the weird thing is that it's only happening on Windows so far 🤔 Will try to get some debug information.

DanTup merged commit 65985db into flutter:master Aug 22, 2018
@fkorotkov (Contributor)

@DanTup @Hixie I've simplified how the Cirrus CI agent runs scripts and captures output, plus added some additional logging. Let's see if the hanging Windows builds happen again.

@DanTup (Contributor, Author) commented Aug 22, 2018

Great; we'll let you know if we see it again 👍

DanTup deleted the increase-integration-test-timeouts branch Aug 22, 2018
github-actions bot locked as resolved and limited conversation to collaborators Aug 10, 2021