Issue (re)numbering #1
Comments
So old issues will be in a different repo. But will we be able to create new issues in the cpython repo? (I hope so, so that at least for new issues things look "normal".)
Ideally it would be better to have all issues -- both old and new -- in the same place. If it's not possible to have them in the python/cpython repo, it might be better to have them all in a separate repo (e.g. python/cpython-issues). I'm investigating with GitHub what will be the consequences of having issues and PRs in two separate repos, and I already thought about a few potential problems/solutions:
It might also be possible to merge the two repos down the line, so I'm also investigating whether this is a realistic possibility and whether there is anything we can do now to make it easier in the future. FTR I looked at the last 50 issues on bpo sorted by activity: more than a third were created over a year ago, and more than a fifth are over 3 years old. If we keep old issues in a separate repo, we will have to go back and forth for a long time.
Isn't it possible to transfer the issues from one GitHub repo to another later?
It should be possible, and in fact we are planning to import to an empty test repo first, and then transfer the issues to the python/cpython repo (it is not possible to import into an existing repo). I still have to do some testing to verify this and make sure we can preserve the issue ID while transferring issues. |
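For reference, a minimal sketch of how such a test could look. Issue transfer is only exposed through the GraphQL `transferIssue` mutation, and GitHub requires both repos to belong to the same owner; the repo names, the issue number, and the `node_id` helper below are hypothetical placeholders, with a token expected in `GITHUB_TOKEN`:

```python
# Sketch: transfer one issue between two throwaway repos and print the
# number it receives in the destination, to check whether IDs survive.
import os
import requests

API = "https://api.github.com/graphql"
HEADERS = {"Authorization": f"bearer {os.environ['GITHUB_TOKEN']}"}

def node_id(owner, repo, issue=None):
    """Return the GraphQL node id of a repo, or of one of its issues."""
    if issue is None:
        query = """query($o: String!, $r: String!) {
                     repository(owner: $o, name: $r) { id } }"""
        variables = {"o": owner, "r": repo}
        path = ("repository", "id")
    else:
        query = """query($o: String!, $r: String!, $n: Int!) {
                     repository(owner: $o, name: $r) {
                       issue(number: $n) { id } } }"""
        variables = {"o": owner, "r": repo, "n": issue}
        path = ("repository", "issue", "id")
    data = requests.post(API, json={"query": query, "variables": variables},
                         headers=HEADERS).json()["data"]
    for key in path:
        data = data[key]
    return data

TRANSFER = """mutation($issue: ID!, $repo: ID!) {
  transferIssue(input: {issueId: $issue, repositoryId: $repo}) {
    issue { number url }
  }
}"""

issue_id = node_id("you", "source-test", issue=42)  # hypothetical issue #42
dest_id = node_id("you", "dest-test")
resp = requests.post(API, json={"query": TRANSFER,
                                "variables": {"issue": issue_id,
                                              "repo": dest_id}},
                     headers=HEADERS).json()
# The number reported here shows what ID the issue gets after transfer.
print(resp["data"]["transferIssue"]["issue"])
```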
FTR I just verified that after an import, creating a new issue starts from `max(issue_ids) + 1`.
Makes sense.
The transfer tool doesn't support preserving issue IDs and will just assign IDs incrementally, starting from the highest PR ID plus one. In theory we could still try to create a fake issue with e.g. ID 99999 so that the first imported issue will have ID 100000 and a fixed offset, but this is very error-prone and probably not worth it. If possible, it would be better to at least preserve the assumption that the ID order matches the creation order.
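Since the transfer tool starts from the highest existing ID plus one, the current ceiling is easy to check. A minimal sketch, assuming a token in `GITHUB_TOKEN`; issues and PRs share one number namespace, and the REST `/issues` endpoint returns both:

```python
# Sketch: find the highest issue/PR number in a repo. Numbers are assigned
# sequentially at creation time, so the most recently created issue or PR
# carries the current maximum.
import os
import requests

def max_number(owner: str, repo: str) -> int:
    resp = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/issues",
        params={"state": "all", "sort": "created",
                "direction": "desc", "per_page": 1},
        headers={"Authorization": f"token {os.environ['GITHUB_TOKEN']}"},
    )
    resp.raise_for_status()
    return resp.json()[0]["number"]

# A transferred issue would get this value + 1.
print(max_number("python", "cpython"))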
The issues will be migrated by creation date, and will take the first available ID. This means that:
There will also be links from the GitHub issues to the bpo issues and vice versa. I also checked whether the issue IDs matched the creation dates and found a few exceptions among the old SF issues, where an issue with a lower ID has a more recent creation date than one with a higher ID:
233790  2001-02-23.18:02:27
404275  2001-02-26.13:10:42
406292  2001-03-06.13:46:02
406295  2001-03-06.13:46:17
406297  2001-03-06.13:46:25
406298  2001-03-06.13:46:15
406301  2001-03-06.13:46:57
406304  2001-03-06.13:48:34
406311  2001-03-06.13:49:20
406318  2001-03-06.13:56:07
406321  2001-03-06.13:56:47
406324  2001-03-06.13:57:10
431772  2001-06-10.07:55:33
432208  2001-06-11.21:24:15
There were also 6 more pairs that were created at the same time and share the same creation date:
515026  2002-02-08.22:22:16
2710    2008-04-28.19:44:50
2758    2008-05-04.17:42:06
5632    2009-03-31.21:01:37
8696    2010-05-12.14:27:09
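A check along those lines takes only a few lines of Python. A minimal sketch, assuming the `(id, created)` pairs have already been pulled out of bpo into a list (the sample data below is taken from the exceptions above):

```python
# Sketch: given (issue_id, created) pairs, report neighbouring IDs whose
# order disagrees with their creation order, plus duplicate timestamps.
from datetime import datetime

def find_inversions(pairs):
    """pairs: list of (issue_id, created), created as 'YYYY-MM-DD.HH:MM:SS'."""
    parsed = sorted(
        (iid, datetime.strptime(ts, "%Y-%m-%d.%H:%M:%S")) for iid, ts in pairs
    )  # sort by issue ID
    inversions, duplicates = [], []
    for (id_a, t_a), (id_b, t_b) in zip(parsed, parsed[1:]):
        if t_b < t_a:
            inversions.append((id_a, id_b))  # higher ID, earlier creation
        elif t_b == t_a:
            duplicates.append((id_a, id_b))  # same creation timestamp
    return inversions, duplicates

sample = [(406295, "2001-03-06.13:46:17"), (406297, "2001-03-06.13:46:25"),
          (406298, "2001-03-06.13:46:15")]
print(find_inversions(sample))  # -> ([(406297, 406298)], [])
```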
That is great.
GitHub uses the same namespace for issues and PRs, and the current PR numbers already overlap with the original bpo numbers.
Current situation:
Questions and issues:
- (`original_number + 100k`), >143k new GH issues/PRs
- (`original_number + 40k`), >82k new GH issues/PRs
- (`original_number`), 43k-56k old SF issues, 57k-80k current PRs (if PRs can be renumbered), >80k new GH issues/PRs

Legend:
- `PR`: current PRs
- `SF`: old SourceForge issues
- `BPO`: current BPO issues
- `NEW...`: new issues
- `_`: unused range

Other considerations:
- New issues/PRs will take the first available ID (`max(issue_ids) + 1`) [confirm with GH].
- If we use `original_number + 40k` or `original_number + 100k`, it will be easier to find the corresponding issue without relying on a mapping (see the sketch below).

Update
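To illustrate that last point: under a fixed-offset scheme the bpo-to-GitHub correspondence is pure arithmetic and needs no lookup table. A minimal sketch, assuming the +100k variant; the offset and the range boundary are the options under discussion above, not settled values:

```python
# Sketch: with a fixed offset, mapping between bpo and GitHub numbers is
# arithmetic -- no stored mapping needed. OFFSET = 100_000 is the "+100k"
# option; the "+40k" option just changes the constant.
OFFSET = 100_000

def bpo_to_gh(bpo_id):
    return bpo_id + OFFSET

def gh_to_bpo(gh_number):
    """Return the bpo id, or None for numbers below the migrated range
    (old PRs and issues created natively on GitHub)."""
    return gh_number - OFFSET if gh_number > OFFSET else None

assert bpo_to_gh(12345) == 112345
assert gh_to_bpo(112345) == 12345
assert gh_to_bpo(4567) is None  # e.g. a pre-migration PR
```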