diff --git a/.github/workflows/lint.yml b/.github/workflows/lint.yml
index 542a3ec..201415f 100644
--- a/.github/workflows/lint.yml
+++ b/.github/workflows/lint.yml
@@ -11,10 +11,10 @@ jobs:
     runs-on: ubuntu-18.04
     steps:
       - uses: actions/checkout@v2
-      - name: Set up Python 3.7
+      - name: Set up Python 3.8
        uses: actions/setup-python@v2
        with:
-          python-version: 3.7
+          python-version: 3.8
       - name: Install pre-commit hook
        run: |
          sudo apt-add-repository ppa:brightbox/ruby-ng -y
diff --git a/README.md b/README.md
index f20c662..40755e6 100644
--- a/README.md
+++ b/README.md
@@ -11,7 +11,7 @@
 
 English | [简体中文](README_CN.md)
 
-XRSfM is an open-source SfM codebase. It is a part of the [OpenXRLab](https://openxrlab.com/) project.
+XRSfM is an open-source SfM codebase. It is a part of the [OpenXRLab](https://openxrlab.org.cn/) project.
 A detailed introduction can be found in [introduction.md](docs/en/introduction.md).
 
 ## Citation
diff --git a/README_CN.md b/README_CN.md
index a3e0f35..fc621df 100644
--- a/README_CN.md
+++ b/README_CN.md
@@ -11,7 +11,7 @@
 
 [English](README.md) | [简体中文]
 
-XRSfM 是一个开源的运动恢复结构代码仓库,它是[OpenXRLab](https://openxrlab.com/)项目的一部分.
+XRSfM 是一个开源的运动恢复结构代码仓库,它是[OpenXRLab](https://openxrlab.org.cn/)项目的一部分.
 关于XRSfM更详细的介绍放在[introduction.md](docs/en/introduction.md).
 
 ## Citation
@@ -42,7 +42,7 @@ XRSfM 是一个开源的运动恢复结构代码仓库,它是[OpenXRLab](https
 
 1.参考[installation.md](docs/zh/installation.md)进行安装编译.
 
-2.下载提供的[测试数据](https://openxrlab-share.oss-cn-hongkong.aliyuncs.com/xrsfm/test_data.zip?versionId=CAEQQBiBgMCi_6mllxgiIGI2ZjM1YjE1NjBmNTRmYjc5NzZlMzZkNWY1ZTk1YWFj) 或者按照相同格式准备你自己的数据.
+2.下载提供的[测试数据](https://openxrlab-share-mainland.oss-cn-hangzhou.aliyuncs.com/xrsfm/test_data.zip?versionId=CAEQQBiBgMCi_6mllxgiIGI2ZjM1YjE1NjBmNTRmYjc5NzZlMzZkNWY1ZTk1YWFj) 或者按照相同格式准备你自己的数据.
 
 3.运行以下脚本进行重建:
 ```
@@ -53,7 +53,7 @@ python3 ./scripts/auto_reconstruction.py --workspace_path ${workspace_path}$
 ## Build ARDemo
 
 除了重建功能, OpenXRLab 项目还提供了定位功能。
-你可以构建自己的端云定位ARDemo,更多的信息请查看[ARDemo](http://doc.openxrlab.org.cn/openxrlab_docment/ARDemo/ARdemo.html#).
+你可以构建自己的端云定位ARDemo,更多的信息请查看[ARDemo](http://doc.openxrlab.org.cn/openxrlab_document/ARDemo/ARdemo.html#).
 
 ## License
 本代码库的许可证是[Apache-2.0](LICENSE)。请注意,本许可证仅适用于我们库中的代码,这些代码的依赖项是独立的,并单独许可。我们十分感谢这些依赖项的贡献者。
diff --git a/docs/en/benchmark.md b/docs/en/benchmark.md
index a6248ef..03d7a88 100644
--- a/docs/en/benchmark.md
+++ b/docs/en/benchmark.md
@@ -11,7 +11,7 @@ The supported datasets include the sequential dataset [KITTI](http://www.cvlibs.
 ### Data preparation
 
 The KITTI dataset can be download in [link](https://s3.eu-central-1.amazonaws.com/avg-kitti/data_odometry_gray.zip).
-As a necessary input, the image retrieval results can be downloaded from [link](https://openxrlab-share.oss-cn-hongkong.aliyuncs.com/xrsfm/KITTI.zip?versionId=CAEQQBiBgMCu.KallxgiIGM4MTk2MmJmNDU1YTQzYjBhYTJjZmIyYzQ3YzM2ODIx).
+As a necessary input, the image retrieval results can be downloaded from [link](https://openxrlab-share-mainland.oss-cn-hangzhou.aliyuncs.com/xrsfm/KITTI.zip?versionId=CAEQQBiBgMCu.KallxgiIGM4MTk2MmJmNDU1YTQzYjBhYTJjZmIyYzQ3YzM2ODIx).
 
 ### Run matching stage
 
@@ -57,7 +57,7 @@ python3 ./scripts/run_kitti_reconstruction.py --data_path /path/dataset/sequence
 ### Data preparation
 
 The 1DSfM dataset can be download from [project web page](https://www.cs.cornell.edu/projects/1dsfm/).
-The image retrieval results and camera intrinsic parameters can be downloaded from the [link](https://openxrlab-share.oss-cn-hongkong.aliyuncs.com/xrsfm/1DSfM.zip?versionId=CAEQQBiBgIDF.KallxgiIDcyNDJmNTM4OWJhNzRlYzdhNDhkZmNjMjQ0YWU0ODA3).
+The image retrieval results and camera intrinsic parameters can be downloaded from the [link](https://openxrlab-share-mainland.oss-cn-hangzhou.aliyuncs.com/xrsfm/1DSfM.zip?versionId=CAEQQBiBgIDF.KallxgiIDcyNDJmNTM4OWJhNzRlYzdhNDhkZmNjMjQ0YWU0ODA3).
 
 ### Run matching
diff --git a/docs/en/tutorial.md b/docs/en/tutorial.md
index 6f2cf06..cda80e5 100644
--- a/docs/en/tutorial.md
+++ b/docs/en/tutorial.md
@@ -75,7 +75,7 @@ The program will extract the apriltag from images to calculate the scale, and en
 
 ### Data capture
 
-We provide [the capture application](http://doc.openxrlab.org.cn/openxrlab_docment/ARDemo/ARdemo.html#data-capturer-on-your-phone) to capture images and acquire an accurate camera intrisic parameters at the same time.
+We provide [the capture application](http://doc.openxrlab.org.cn/openxrlab_document/ARDemo/ARdemo.html#data-capturer-on-your-phone) to capture images and acquire an accurate camera intrisic parameters at the same time.
 Users can also use images from other sources.
 However, since XRSfM does not support camera self-calibration currently, users need to provide camera intrisic parameters, which can be obtained by calibration.
 
@@ -84,7 +84,7 @@ However, since XRSfM does not support camera self-calibration currently, users n
 In addition to the above image data and camera intrisic parameters, it is recommend to prepare the image retrieval results. .
 This image retrieval function is supported in [XRLocalization](https://github.com/openxrlab/xrlocalization/tree/main/docs/en/tutorials/generate_image_pairs.md).
-Users can also directly download [test data](https://openxrlab-share.oss-cn-hongkong.aliyuncs.com/xrsfm/test_data.zip?versionId=CAEQQBiBgMCi_6mllxgiIGI2ZjM1YjE1NjBmNTRmYjc5NzZlMzZkNWY1ZTk1YWFj) to run the program.
+Users can also directly download [test data](https://openxrlab-share-mainland.oss-cn-hangzhou.aliyuncs.com/xrsfm/test_data.zip?versionId=CAEQQBiBgMCi_6mllxgiIGI2ZjM1YjE1NjBmNTRmYjc5NzZlMzZkNWY1ZTk1YWFj) to run the program.
 
 Before running the reconstruction, you should ensure that there are the following data:
 the input images (images_path),
diff --git a/docs/zh/benchmark.md b/docs/zh/benchmark.md
index fac6d47..c6e4997 100644
--- a/docs/zh/benchmark.md
+++ b/docs/zh/benchmark.md
@@ -10,7 +10,7 @@
 ### 数据准备
 
 KITTI数据集可以从[链接](https://s3.eu-central-1.amazonaws.com/avg-kitti/data_odometry_gray.zip)中下载。
-作为必要的输入,可以从[链接](https://openxrlab-share.oss-cn-hongkong.aliyuncs.com/xrsfm/KITTI.zip?versionId=CAEQQBiBgMCu.KallxgiIGM4MTk2MmJmNDU1YTQzYjBhYTJjZmIyYzQ3YzM2ODIx)下载KITTI数据集的图像检索结果。
+作为必要的输入,可以从[链接](https://openxrlab-share-mainland.oss-cn-hangzhou.aliyuncs.com/xrsfm/KITTI.zip?versionId=CAEQQBiBgMCu.KallxgiIGM4MTk2MmJmNDU1YTQzYjBhYTJjZmIyYzQ3YzM2ODIx)下载KITTI数据集的图像检索结果。
 
 ### 匹配阶段
 
@@ -54,7 +54,7 @@ python3 ./scripts/run_kitti_reconstruction.py --data_path /path/dataset/sequence
 ### 数据准备
 
 1DSfM数据集可以在[项目网页](https://www.cs.cornell.edu/projects/1dsfm/)中下载。
-作为必要的输入,可以从[链接](https://openxrlab-share.oss-cn-hongkong.aliyuncs.com/xrsfm/1DSfM.zip?versionId=CAEQQBiBgIDF.KallxgiIDcyNDJmNTM4OWJhNzRlYzdhNDhkZmNjMjQ0YWU0ODA3)下载1DSfM数据集的图像检索结果和相机内参。
+作为必要的输入,可以从[链接](https://openxrlab-share-mainland.oss-cn-hangzhou.aliyuncs.com/xrsfm/1DSfM.zip?versionId=CAEQQBiBgIDF.KallxgiIDcyNDJmNTM4OWJhNzRlYzdhNDhkZmNjMjQ0YWU0ODA3)下载1DSfM数据集的图像检索结果和相机内参。
 
 ### 匹配阶段
diff --git a/docs/zh/tutorial.md b/docs/zh/tutorial.md
index 85d71a7..5b94de9 100644
--- a/docs/zh/tutorial.md
+++ b/docs/zh/tutorial.md
@@ -71,12 +71,12 @@ XRSfM支持顺序匹配("sequential")、基于检索的匹配("retrieval")和基
 ## 运行你自己的数据
 
 ### 数据采集
-我们推荐使用[采集工具](http://doc.openxrlab.org.cn/openxrlab_docment/ARDemo/ARdemo.html#data-capturer-on-your-phone)拍摄图像,它会同时获取一个准确的相机内参。
+我们推荐使用[采集工具](http://doc.openxrlab.org.cn/openxrlab_document/ARDemo/ARdemo.html#data-capturer-on-your-phone)拍摄图像,它会同时获取一个准确的相机内参。
 用户也可以使用其他来源的图像,但鉴于当前版本不支持相机自标定,用户需要给出相机内参,这可以由标定得到。
 
 ### 数据准备
 除了上述的图像数据和相机内参外,还需要准备图像的检索结果,这部分功能目前被封装在XRLocation中,详细参见[XRLocalization](https://github.com/openxrlab/xrlocalization/tree/main/docs/en/tutorials/generate_image_pairs.md)。
-用户也可以直接下载[测试数据](https://openxrlab-share.oss-cn-hongkong.aliyuncs.com/xrsfm/test_data.zip?versionId=CAEQQBiBgMCi_6mllxgiIGI2ZjM1YjE1NjBmNTRmYjc5NzZlMzZkNWY1ZTk1YWFj)来运行程序。
+用户也可以直接下载[测试数据](https://openxrlab-share-mainland.oss-cn-hangzhou.aliyuncs.com/xrsfm/test_data.zip?versionId=CAEQQBiBgMCi_6mllxgiIGI2ZjM1YjE1NjBmNTRmYjc5NzZlMzZkNWY1ZTk1YWFj)来运行程序。
 
 在运行重建前,你应该确保有以下数据:
 存储着图像数据的文件夹(images_path) ,
 检索文件(retrieval_path),
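The patch above is a mechanical migration: every occurrence of `openxrlab.com`, the `openxrlab-share.oss-cn-hongkong` bucket, and the `openxrlab_docment` path misspelling is replaced across docs and workflows. A small sketch of a checker (a hypothetical helper, not part of XRSfM) can confirm that no stale links remain in the tree after applying it:

```python
import re
from pathlib import Path

# Patterns this patch migrates away from; any remaining hit suggests a
# file was missed. (Illustrative list, derived from the diff above.)
STALE_PATTERNS = [
    re.compile(r"openxrlab\.com"),                    # -> openxrlab.org.cn
    re.compile(r"openxrlab-share\.oss-cn-hongkong"),  # -> openxrlab-share-mainland.oss-cn-hangzhou
    re.compile(r"openxrlab_docment"),                 # typo -> openxrlab_document
]


def find_stale_links(root, exts=(".md", ".yml", ".yaml")):
    """Return (path, line_no, line) for every line still matching a stale pattern."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.suffix not in exts or not path.is_file():
            continue
        text = path.read_text(encoding="utf-8", errors="ignore")
        for no, line in enumerate(text.splitlines(), 1):
            if any(p.search(line) for p in STALE_PATTERNS):
                hits.append((str(path), no, line.strip()))
    return hits
```

Running `find_stale_links(".")` at the repository root should return an empty list once the patch is fully applied; any tuple it returns points at a file and line that still carries an old URL.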