[NativeAOT] System.Reflection.Tests timeout/hang #69476

Closed · jkotas opened this issue May 18, 2022 · 10 comments · Fixed by #69571

Comments

jkotas (Member) commented May 18, 2022

----- start Wed May 18 01:17:17 UTC 2022 =============== To repro directly: =====================================================
pushd .
chmod +rwx System.Reflection.Tests ^&^& ./System.Reflection.Tests -notrait category=IgnoreForCI -notrait category=OuterLoop -notrait category=failing 
popd
===========================================================================================================
/root/helix/work/workitem/e /root/helix/work/workitem/e
Running assembly:System.Reflection.Tests, Version=7.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51

...
[EXECUTION TIMED OUT]
Exit Code: -3
Executor timed out after 2700 seconds and was killed
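
For reference, the repro above boils down to the following minimal sketch; the ^&^& in the Helix script is just an escaped &&, and the working directory is assumed to be wherever the published System.Reflection.Tests binary lives:

    # Make the NativeAOT-compiled xunit test binary executable and run it,
    # excluding the same trait categories that the CI run excludes.
    chmod +rwx System.Reflection.Tests
    ./System.Reflection.Tests \
        -notrait category=IgnoreForCI \
        -notrait category=OuterLoop \
        -notrait category=failing
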
dotnet-issue-labeler commented

I couldn't figure out the best area label to add to this issue. If you have write-permissions please help me learn by adding exactly one area label.

@ghost added the untriaged (New issue has not been triaged by the area owner) label May 18, 2022
jkotas (Member, Author) commented May 18, 2022

Hit in #69468

@jkotas added the area-NativeAOT-coreclr and blocking-clean-ci (Blocking PR or rolling runs of 'runtime' or 'runtime-extra-platforms') labels May 18, 2022
jkotas (Member, Author) commented May 18, 2022

@LakshanF Looks like the System.Reflection tests are hitting an intermittent hang in CI. We either need to get to the bottom of it quickly or disable the tests again.

jkotas (Member, Author) commented May 18, 2022

Hit in #69451

jkotas (Member, Author) commented May 18, 2022

Hit in #69469

jkotas added a commit to jkotas/runtime that referenced this issue May 18, 2022
jkotas (Member, Author) commented May 18, 2022

This is failing too often. Submitted #69477

LakshanF (Contributor) commented

Yes, I was concerned about this as well; re-running seems to work, but the timeouts are frequent enough to be a concern. Some observations:

  • The time limit is 45 minutes, but the job takes about 25 minutes when it passes, so the hang is likely caused by something other than insufficient time.
  • Swapping the platform from linux-arm64 to win-x64 worked, but we need multiple runs to get a better read. (Also, @MichalStrehovsky had some concerns about CoreCLR tests not working on linux-arm64, which was the swapped test run; I didn't investigate that much since the one-off run passed.)

jkotas added a commit that referenced this issue May 18, 2022
@jkotas removed the blocking-clean-ci (Blocking PR or rolling runs of 'runtime' or 'runtime-extra-platforms') label May 18, 2022
MichalStrehovsky (Member) commented

> Also, @MichalStrehovsky had some concerns about CoreCLR tests not working on linux-arm64, which was the swapped test run

The edit here (#69284 (comment)) was to the CoreCLR test leg (running tests under src/tests), not the libraries test leg (running xunit tests under src/libraries), so it didn't affect the System.Reflection.Tests runs. My concern was that adding anything ARM64-related in that section won't work. ARM64 testing currently only works for the libs tests, and those legs are defined lower in the runtime.yaml file.

@ghost added the in-pr (There is an active PR which will close this issue when it is merged) label May 19, 2022
@MichalStrehovsky added this to the 7.0.0 milestone May 24, 2022
@ghost removed the untriaged (New issue has not been triaged by the area owner) label May 24, 2022
@ghost added and then removed the in-pr (There is an active PR which will close this issue when it is merged) label May 31, 2022
MichalStrehovsky (Member) commented

This is not fixed, just worked around. The test still hangs on Linux. Reopening.

MichalStrehovsky (Member) commented

I see, so we now have a new bug for the same thing. Closing in favor of #70010...

@ghost locked as resolved and limited conversation to collaborators Jul 1, 2022