
Unusually long runtime for Deploy step with self-hosted runner #1070

Open
cegekaJG opened this issue May 6, 2024 · 5 comments
Labels
Cannot repro Unable to reproduce

Comments

@cegekaJG
Contributor

cegekaJG commented May 6, 2024

I am using a self-hosted Windows runner to deploy an app to an OnPrem environment, but am experiencing very long waits in the Deploy step of the "Deploy To" job. From what I can tell, the job takes 6 minutes to resolve the function DownloadAndImportBcContainerHelper from the "AL-GoHelper.ps1" script, which isn't something I've encountered with my other runners. Since there isn't really any logging to speak of, it's hard to tell what is taking so long, but the runner logs seem to write the same three lines over and over again for the entire duration of the problematic function:

[2024-05-06 12:05:57Z INFO HostContext] Well known directory 'Bin': 'C:\actions-runner\bin'
[2024-05-06 12:05:57Z INFO HostContext] Well known directory 'Root': 'C:\actions-runner'
[2024-05-06 12:05:57Z INFO HostContext] Well known directory 'Work': 'C:\actions-runner\_work'
[2024-05-06 12:06:07Z INFO HostContext] Well known directory 'Bin': 'C:\actions-runner\bin'
[2024-05-06 12:06:07Z INFO HostContext] Well known directory 'Root': 'C:\actions-runner'
[2024-05-06 12:06:07Z INFO HostContext] Well known directory 'Work': 'C:\actions-runner\_work'

The runner is running on a Windows 2019 Server in an admin PS session. The log of the job can be found here:
logs_23480834135.zip

@freddydk
Collaborator

freddydk commented May 7, 2024

Downloading the containerhelper looks fast - the time goes after it says running on 5.1 and analyzing releases.
It might be that your agent isn't running elevated, and that it tries to check whether Docker is installed / operational?

@cegekaJG
Contributor Author

cegekaJG commented May 7, 2024

I am running the runner in an admin session, if that's what you mean. The "Deploy" action is identical to the one in 654dd54, so I don't know what else would be interfering.

@freddydk
Collaborator

freddydk commented May 8, 2024

I am just suggesting things you can investigate.
Based on the log, I can see that resolving the DNS and downloading the containerhelper is fast.
But I cannot see whether the initialization of ContainerHelper is slow, or what comes after.
I have seen earlier that machines without Docker, and/or domain-joined machines without a connection to the domain controller, can be very slow when running docker commands - or maybe antivirus is involved - it is hard for me to troubleshoot remotely.
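One way to test these hypotheses on the runner machine would be to time the suspect operations directly. This is a minimal sketch, not part of AL-Go itself; it only assumes the standard `docker` CLI and the built-in `Measure-Command` and `Resolve-DnsName` cmdlets:

```powershell
# Time a basic docker command. On a machine without Docker (or a domain-joined
# machine that cannot reach its domain controller) this can hang for minutes
# before failing, which would match the symptom described above.
Measure-Command { docker info 2>$null } | Select-Object TotalSeconds

# Time DNS resolution to rule out slow name lookups on the runner.
Measure-Command { Resolve-DnsName github.com } | Select-Object TotalSeconds
```

If either command takes minutes rather than seconds, that points to the environment rather than the AL-Go scripts.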

@freddydk freddydk added the Cannot repro Unable to reproduce label May 8, 2024
@cegekaJG
Contributor Author

Is there some way I can see additional logs to determine what's happening between the two log entries?

@freddydk
Collaborator

You could try to log on to the computer, manually download the BcContainerHelper repo, run BcContainerHelper.ps1, and try to figure out what takes so long.
I do think it is caused by antivirus software scanning your files before running them, or something like that - and if things go faster when importing the second time, this could very well be the problem.

Looking at the code, InitializeModule.ps1 is the first script to run when importing the BcContainerHelper.psm1 module, and this script doesn't really do much before printing the BcContainerHelper version number (which appears 4 minutes after importing).
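The manual test above could be sketched like this. This is an illustrative snippet, not an AL-Go script; the clone path `C:\temp\bccontainerhelper` is an arbitrary choice, and it assumes `git` is available on the machine (BcContainerHelper is hosted in the microsoft/navcontainerhelper repository):

```powershell
# Hypothetical sketch: clone the BcContainerHelper repo and time the module
# import twice. If the second import is much faster, on-access antivirus
# scanning of the script files is a likely culprit.
git clone https://github.com/microsoft/navcontainerhelper C:\temp\bccontainerhelper

# First import - potentially slow if every .ps1 file is scanned on first read
$first = Measure-Command {
    Import-Module C:\temp\bccontainerhelper\BcContainerHelper.psm1 -DisableNameChecking
}
Remove-Module BcContainerHelper

# Second import - files are now "known" to the scanner and cached by the OS
$second = Measure-Command {
    Import-Module C:\temp\bccontainerhelper\BcContainerHelper.psm1 -DisableNameChecking
}

Write-Host "First import:  $($first.TotalSeconds)s"
Write-Host "Second import: $($second.TotalSeconds)s"
```

A large gap between the two timings would support the antivirus theory; similar timings would point elsewhere (e.g. the docker checks mentioned earlier in the thread).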
