Transfer ownership (转移仓库所有权, "transfer repository ownership") does not work in v1.9.0~v1.9.2 #7947
(Please forgive me if I read 转移仓库所有权 ("transfer repository ownership") incorrectly; I am using Gitea with English language settings.) I just tested in https://try.gitea.io/demoorg1/iss7947 (which is actually a v1.10.0 dev build) and was able to transfer this repo from myself to a different organization. In gitea.log, I think it looks something like this:
I didn't encounter a problem when testing it. Hopefully your log will show some interesting messages.
log on console:
logfile:
The login ID is admin, which has rights to all the repos, so why is access denied?
Maybe you should not store your Gitea data on C:\ on Windows, since it's a system disk.
@lunny I'm experiencing frequent 500 errors after transferring repos to an organization. I'm currently resolving this by
@loup-brun A "500 error" is a generic code meaning "something unexpected happened". It would be very useful if you could paste here a relevant part of your gitea.log from the moment the error happened.
Here is a log right after getting my 500 errors:
Seems like you are running out of memory.
@lafriks Any suggestion on how to resolve this? That's what I thought, so I searched the Gitea documentation for an option to configure a memory limit for my Gitea installation, but I found none. I am experiencing this only on repos recently transferred to an organization (old repos are fine).
@loup-brun Perhaps those repos have other issues besides having been recently transferred. Are they big? Do they have many files? Many commits? You could try running some git commands on them yourself directly from the shell and see what happens. For instance, there is git-fsck, which does some sanity checks. Gitea can run git-fsck itself, but if you doubt Gitea, it would be nice to have a "second opinion" directly from git.
@guillep2k Small repos (max 4 MB), few files, max 10 commits. Only new projects created since I updated Gitea are affected (old repos created before the update are not affected by this bug).
@loup-brun It wouldn't hurt to check them anyway. Could they have become corrupted somehow?
If you are interested, the standard procedure in my company is for the dev to create the repository, have some structural work done, and then transfer it to the company org. We are using 1.9.2 without problems so far.
@loup-brun What's your git version, and how did you install it?
@guillep2k I ran git fsck without any problem (and have this feature turned on in Gitea). @lunny I have rolled back to Gitea
On Windows, it seems that some file is still locked during os.Rename.
The file object\XXXXXXXXXXXXXXXXXXXXXX.pack is still locked.
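A small self-contained sketch of the behavior being described (the paths and file name are made up for the example): on Windows, renaming a directory fails with "Access is denied" while any file inside it is still held open, even by the same process.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := filepath.Join(os.TempDir(), "repo.git")
	if err := os.MkdirAll(dir, 0755); err != nil {
		panic(err)
	}
	f, err := os.Create(filepath.Join(dir, "objects.pack"))
	if err != nil {
		panic(err)
	}

	dst := filepath.Join(os.TempDir(), "repo-moved.git")
	// While f is open, this rename fails on Windows (it succeeds on Linux).
	if err := os.Rename(dir, dst); err != nil {
		fmt.Println("rename with open handle:", err)
	}

	f.Close()
	// Once the handle is closed, the same rename goes through.
	if err := os.Rename(dir, dst); err != nil {
		fmt.Println("rename after Close:", err)
	} else {
		fmt.Println("rename after Close: ok")
	}
}
```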
This looks like a concurrency problem, and IMHO difficult to solve in a bulletproof way. You have more than one process accessing the repo at the time of the migration, so the migration code fails. If you think your users are not accessing it, please check whether any automated tool is either running things on it or has left processes behind locking the file. On Windows I've found that Process Explorer (an official Microsoft utility) has a search-handle function that tells you which processes have a particular file open. It may be of help. Another related and very useful utility (you could check it out) is Process Monitor, although it's a bit more difficult to use: you need to set up the filters to log only the paths you're interested in.
I've tried with Process Monitor: the pack file is locked by gitea.exe. I put a 30-second pause before the rename, closed the locked file handle, and after the pause the repository was transferred correctly.
I think that some git operation was transferred from
Another interesting test is placing the 30s delay in different parts of the code, from the earliest point to the latest (you know that one works), until we find the point that is introducing the lock. BTW, good catch!
If I go to the transfer page, wait until the lock disappears, and click transfer, I get a 500 error and the pack file becomes locked.
Good to know; if you can confirm that no other accesses were being made to that repo, that should tell us that Gitea is locking itself within the transfer procedure. That certainly shortens the search.
Hello, same issue for me. Log extract:
I worked around it by
Have you installed antivirus software and enabled it? If so, could you try disabling it and trying again?
@realslacker Perhaps Tortoise or another local git helper that is integrated with the Windows shell?
Good luck. I have made several attempts just now and I can't put my finger on it. I have tried:
I still cannot reproduce it, while it still goes wrong on the prod instance. I'm not familiar with Golang, but I will perhaps try using Delve to debug it. I debugged it; it happens somewhere in
This sounds incredibly frustrating! @guillep2k You know, it could be TestPullRequests... pr.testPatch doesn't create a temporary repo, just a clean index. If transfer ownership doesn't do:

Lines 609 to 610 in c15d371

then it wouldn't know that the repo is supposed to be locked. (Because of the way transfer ownership is written, we probably need to add all of those locks back in everywhere; a generic sketch of the kind of lock follows below.)
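For readers following along, a generic sketch of a per-repository exclusive lock of the kind being discussed (hypothetical names; Gitea's actual pool lives in its models package and differs in detail):

```go
package pool

import "sync"

// WorkingPool serializes long git operations per repository, so that a
// transfer cannot run while a test-patch (or similar) holds the repo.
type WorkingPool struct {
	mu    sync.Mutex
	locks map[string]*sync.Mutex
}

func New() *WorkingPool {
	return &WorkingPool{locks: make(map[string]*sync.Mutex)}
}

// CheckIn blocks until the caller holds the exclusive lock for repoID.
func (p *WorkingPool) CheckIn(repoID string) {
	p.mu.Lock()
	l, ok := p.locks[repoID]
	if !ok {
		l = new(sync.Mutex)
		p.locks[repoID] = l
	}
	p.mu.Unlock()
	l.Lock()
}

// CheckOut releases the exclusive lock for repoID.
func (p *WorkingPool) CheckOut(repoID string) {
	p.mu.Lock()
	l := p.locks[repoID]
	p.mu.Unlock()
	l.Unlock()
}
```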
Somehow a handle to a pack file is kept open:
I observed it two times: once, one handle was kept open; another time, three handles were kept open. It is not
The handles are opened before SettingsPost, and they are cleaned up soon, but not immediately, after the function returns. After this line, the handle is cleaned up. *Edit: that is not true; it was simply that GC happened to trigger at that moment.*
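A minimal sketch of why the handle disappears only "when GC happens to trigger" (the file name here is made up): Go's os package attaches a finalizer to *os.File, so a handle that is leaked without Close is closed only some time after the GC finds it unreachable.

```go
package main

import (
	"os"
	"runtime"
	"time"
)

func leak() {
	f, err := os.Open("objects.pack") // hypothetical pack file
	if err != nil {
		return
	}
	_ = f // no Close: the descriptor stays open when f goes out of scope
}

func main() {
	leak()
	runtime.GC() // the finalizer on the leaked *os.File is queued...
	time.Sleep(100 * time.Millisecond)
	// ...and runs asynchronously, so the descriptor closes at an
	// unpredictable moment, which matches what was observed above.
}
```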
I got a test case, somehow. I'm not sure why, but it always reproduces. I suspect it has something to do with the git pack files. The handle is opened in the same thread that processes the request, so it is not a parallel operation that causes it. This is a complete test case; in my case I ran it at
Download link: https://1drv.ms/u/s!AuWWgEGGFWmIpOEXxslyffIglnw_6w?e=C9OKTQ
This is the function that leaks handles:
In this function the leaking handle is created.
Aha. Should we be closing the git repo? Line 591 in 1f3ba69
I'm beginning to suspect that #6478 might be the cause. It is the only large change in this code path. RepoRefByType hasn't changed in the last two years except for a name change. In addition, 1.8.3 is the last known version to work, and 1.9.0 is the first version to include #6478, which essentially uses a different library to read the repositories, if I read the pull request correctly.

Edit: @guillep2k By reading the code, I believe we eventually come to go-git's getFromPackfile:

```go
if !s.options.KeepDescriptors && s.options.MaxOpenDescriptors == 0 {
	defer ioutil.CheckClose(p, &err)
}
```

If we then zoom back out to where the repository is opened, this option is given:

```go
storage := filesystem.NewStorageWithOptions(fs, cache.NewObjectLRUDefault(), filesystem.Options{KeepDescriptors: true})
gogitRepo, err := gogit.Open(storage, fs)
```

This means the handles to the pack files are not closed explicitly, which is exactly what I observed; you then rely on the GC to close them. The handle is kept open, an attempt to rename or move the repository folder is made, and then "computer says no".
@zeripath
No, but there is in

See also:

```go
// KeepDescriptors makes the file descriptors to be reused but they will
// need to be manually closed calling Close().
KeepDescriptors bool
```
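Putting the two snippets together, a minimal sketch of opening a repository with KeepDescriptors and releasing the descriptors explicitly instead of waiting for the GC (import paths are go-git v4's as used at the time; treat the exact calls as an assumption):

```go
package repoutil

import (
	"fmt"

	"gopkg.in/src-d/go-billy.v4/osfs"
	gogit "gopkg.in/src-d/go-git.v4"
	"gopkg.in/src-d/go-git.v4/plumbing/cache"
	"gopkg.in/src-d/go-git.v4/storage/filesystem"
)

// ReadHead opens the repository at repoPath with KeepDescriptors enabled
// and makes sure the pack-file descriptors are released before returning.
func ReadHead(repoPath string) error {
	fs := osfs.New(repoPath)
	storage := filesystem.NewStorageWithOptions(fs, cache.NewObjectLRUDefault(),
		filesystem.Options{KeepDescriptors: true})
	// Close releases the descriptors KeepDescriptors kept open; without
	// it they linger until the GC happens to run the finalizers.
	defer storage.Close()

	repo, err := gogit.Open(storage, fs)
	if err != nil {
		return err
	}
	head, err := repo.Head() // reads reuse the cached descriptors
	if err != nil {
		return err
	}
	fmt.Println(head.Hash())
	return nil
}
```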
@guillep2k Just like what @Sebazzz said. It has
@filipnavara Do you remember why we use
OK, we set this as a private field in: Line 111 in a647a54
So we could add a
It is a performance optimization to avoid constantly re-opening the files [for the duration of one page load]. As @zeripath pointed out, there's a
Is that Close method also worth calling at the end of every request, instead of relying on the GC in general (besides the solution of calling it early for these two bugs)?
In investigating #7947 it has become clear that the storage component of go-git repositories needs closing. This PR adds this Close function and adds the Close functions as necessary. In TransferOwnership the ctx.Repo.GitRepo is closed if it is open to help prevent the risk of multiple open files. Fixes #7947
In investigating go-gitea#7947 it has become clear that the storage component of go-git repositories needs closing. This PR adds this Close function and adds the Close functions as necessary. In TransferOwnership the ctx.Repo.GitRepo is closed if it is open to help prevent the risk of multiple open files. Fixes go-gitea#7947
Backport #8901 In investigating #7947 it has become clear that the storage component of go-git repositories needs closing. This PR adds this Close function and adds the Close functions as necessary. In TransferOwnership the ctx.Repo.GitRepo is closed if it is open to help prevent the risk of multiple open files. Fixes #7947
Backport #8901 - Adjusted slightly for 1.9 In investigating #7947 it has become clear that the storage component of go-git repositories needs closing. This PR adds this Close function and adds the Close functions as necessary. In TransferOwnership the ctx.Repo.GitRepo is closed if it is open to help prevent the risk of multiple open files. Fixes #7947
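A self-contained sketch of the pattern these commits describe (Repository here is a stand-in type, not Gitea's actual git.Repository, and the function names are hypothetical): whatever holds an open repository must be closed before the transfer renames the directory, otherwise Windows reports "Access is denied".

```go
package transfer

import "os"

// Repository stands in for a wrapper around go-git's filesystem storage.
type Repository struct {
	closeStorage func() error // releases cached pack-file descriptors
}

// Close releases the descriptors held by the underlying storage.
func (r *Repository) Close() error {
	if r.closeStorage != nil {
		return r.closeStorage()
	}
	return nil
}

// TransferOwnership mirrors the shape of the fix: close any open handle
// to the repository first, then rename its directory on disk.
func TransferOwnership(open *Repository, oldPath, newPath string) error {
	if open != nil {
		if err := open.Close(); err != nil {
			return err
		}
	}
	return os.Rename(oldPath, newPath)
}
```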
I rolled back to v1.8.3, and it works.