
Sporadic registration failure during VM extension installation #2479

Closed
hendrik-schaffer opened this issue Mar 9, 2023 · 1 comment · Fixed by #2880
Labels
bug Something isn't working

Comments

hendrik-schaffer commented Mar 9, 2023

Describe the bug
We are kicking off the runner registration process during the provisioning of a new Windows VM. The VM is based on Windows Datacenter Server 2022 Core Azure Edition. The runner registration process is called via a custom script extension that ultimately runs the following command (variables are passed through the script extension)
.\config.cmd --unattended --replace --ephemeral --url "$runnerConfigUrl" --token "$token" --name "$runnerName" --labels "$labels" --runasservice --windowslogonaccount "$serviceUser" --windowslogonpassword "$serviceUserPassword"

The registration works fine in about 90% of cases, but we regularly see the following error, which causes the registration to fail:

[...]
[2023-03-09 08:03:07Z INFO ProcessInvokerWrapper]   Force kill process on cancellation: 'False'
[2023-03-09 08:03:07Z INFO ProcessInvokerWrapper]   Redirected STDIN: 'False'
[2023-03-09 08:03:07Z INFO ProcessInvokerWrapper]   Persist current code page: 'False'
[2023-03-09 08:03:07Z INFO ProcessInvokerWrapper]   Keep redirected STDIN open: 'False'
[2023-03-09 08:03:07Z INFO ProcessInvokerWrapper]   High priority process: 'False'
[2023-03-09 08:03:07Z INFO ProcessInvokerWrapper] Process started with process id 4868, waiting for process exit.
[2023-03-09 08:03:14Z INFO ProcessInvokerWrapper] STDOUT/STDERR stream read finished.
[2023-03-09 08:03:14Z INFO ProcessInvokerWrapper] STDOUT/STDERR stream read finished.
[2023-03-09 08:03:14Z INFO ProcessInvokerWrapper] Finished process 4868 with exit code 0, and elapsed time 00:00:06.8021601.
[2023-03-09 08:03:14Z INFO Terminal] WRITE LINE: Service actions.runner._services.atcwin1-41ae3a3 successfully installed
[2023-03-09 08:03:15Z ERR  Runner] System.Exception: Failed to Lock Service Database for Write
   at GitHub.Runner.Listener.Configuration.NativeWindowsServiceHelper.InstallService(String serviceName, String serviceDisplayName, String logonAccount, String logonPassword)
   at GitHub.Runner.Listener.Configuration.WindowsServiceControlManager.ConfigureService(RunnerSettings settings, CommandSettings command)
   at GitHub.Runner.Listener.Configuration.ConfigurationManager.ConfigureAsync(CommandSettings command)
   at GitHub.Runner.Listener.Runner.ExecuteCommand(CommandSettings command)
[2023-03-09 08:03:15Z ERR  Terminal] WRITE ERROR: Failed to Lock Service Database for Write
[2023-03-09 08:03:15Z INFO Listener] Runner execution has finished with return code 1

Googling for this error turns up possibly related issues, such as microsoft/azure-pipelines-agent#3560.

Is there anything we can do on our side, or can you somehow harden the registration process to prevent the above error?
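As a possible workaround until the race is fixed in the runner itself: the "Failed to Lock Service Database for Write" error is typically transient (another process holds the Service Control Manager lock while the runner installs its service), so retrying the registration with backoff from the provisioning script may avoid most failures. Below is a minimal sketch of that retry pattern in Python; the command list, attempt counts, and delays are illustrative assumptions, not part of the runner's documented behavior, and on a real VM you would apply the same loop in the custom script extension around `config.cmd`.

```python
import subprocess
import time

def run_with_retry(cmd, attempts=5, base_delay=5.0, run=subprocess.run):
    """Run `cmd`, retrying with exponential backoff on a nonzero exit code.

    `run` is injectable so the loop can be exercised without actually
    spawning a process; by default it is subprocess.run.
    """
    for attempt in range(1, attempts + 1):
        result = run(cmd)
        if result.returncode == 0:
            return result
        if attempt < attempts:
            # Back off: 5s, 10s, 20s, ... gives a competing SCM lock
            # holder time to finish before we re-run config.cmd.
            time.sleep(base_delay * 2 ** (attempt - 1))
    raise RuntimeError(f"command failed after {attempts} attempts: {cmd}")
```

Since `config.cmd --replace` is idempotent with respect to an existing registration, rerunning the whole command on failure should be safe, though you may want to fail fast on errors other than the SCM lock one.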

Expected behavior
The runner registration process does not fail.

Runner Version and Platform

2.296.2

OS of the machine running the runner? OSX/Windows/Linux/...
Windows

@pb185235
I am using actions-runner version 2.316.1 but still get the error "Failed to Lock Service Database for Write" while trying to register an Azure VM as a GitHub runner. I am registering the VM via "az vm run-command" and config.cmd. Please suggest how we can fix this.
