[Fix] Fix inferencer getting wrong configs path #996

Merged (14 commits, Mar 14, 2023)
62 changes: 61 additions & 1 deletion .circleci/test.yml
@@ -134,6 +134,56 @@ jobs:
command: |
docker exec mmengine python -m pytest tests/

build_integration_test:
parameters:
torch:
type: string
cuda:
type: string
cudnn:
type: integer
default: 7
machine:
image: ubuntu-2004-cuda-11.4:202110-01
docker_layer_caching: true
resource_class: gpu.nvidia.small
steps:
- checkout
- run:
name: Build Docker image
command: |
docker build .circleci/docker -t mmengine:gpu --build-arg PYTORCH=<< parameters.torch >> --build-arg CUDA=<< parameters.cuda >> --build-arg CUDNN=<< parameters.cudnn >>
docker run --gpus all -t -d -v /home/circleci/project:/mmengine -w /mmengine --name mmengine mmengine:gpu
- run:
name: Build MMEngine from source
command: |
docker exec mmengine pip install -e . -v
- run:
name: Install unit tests dependencies
command: |
docker exec mmengine pip install -r requirements/tests.txt
docker exec mmengine pip install openmim
docker exec mmengine mim install 'mmcv>=2.0.0rc1'
- run:
name: Install downstream repositories
command: |
docker exec mmengine mim install 'mmdet>=3.0.0rc0'
- run:
name: Run integration tests
command: |
docker exec mmengine pytest tests/test_infer/test_infer.py
- run:
name: Install downstream repositories from source
# TODO: Switch to master branch
command: |
docker exec mmengine pip uninstall mmdet -y
docker exec mmengine git clone -b 3.x https://github.com/open-mmlab/mmdetection.git ../mmdetection
docker exec mmengine pip install -e ../mmdetection
- run:
name: Run inferencer tests
command: |
docker exec mmengine pytest tests/test_infer/test_infer.py
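Every step in this job runs its command inside the `mmengine` container started in the "Build Docker image" step, so each invocation must follow the pattern `docker exec <container> <command>`. A minimal sketch of one such step, using only names from the job above:

```yaml
- run:
    name: Run integration tests
    command: |
      # docker exec <container> <command> — the container name (mmengine)
      # must come before the command to be executed inside it
      docker exec mmengine pytest tests/test_infer/test_infer.py
```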

workflows:
pr_stage_lint:
when: << pipeline.parameters.lint_only >>
@@ -173,10 +223,20 @@ workflows:
python: 3.9.0
requires:
- minimum_version_cpu
- hold_integration_test:
type: approval
requires:
- lint
- build_integration_test:
name: integration_test
torch: 1.8.1
cuda: "10.2"
requires:
- hold_integration_test
- hold:
type: approval
requires:
- maximum_version_cpu
- lint
- build_cuda:
name: mainstream_version_gpu
torch: 1.8.1
2 changes: 1 addition & 1 deletion mmengine/infer/infer.py
@@ -419,7 +419,7 @@ def _get_repo_or_mim_dir(scope):
return repo_dir
else:
mim_dir = osp.join(package_path, '.mim')
-        if not osp.exists(osp.join(mim_dir, 'Configs')):
+        if not osp.exists(osp.join(mim_dir, 'configs')):
raise FileNotFoundError(
f'Cannot find the configs directory in {package_path}, '
f'please check the completeness of the {scope}.')
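The one-character fix above matters because the directory that mim bundles into an installed package is lowercase `configs`, and path lookups are case-sensitive on the Linux CI filesystem. A standalone sketch of the same check (the helper name `find_mim_configs` is hypothetical, not part of mmengine's API):

```python
import os.path as osp


def find_mim_configs(package_path, scope):
    """Return the configs directory bundled by mim, raising if absent.

    Mirrors the check in mmengine's ``_get_repo_or_mim_dir``: the directory
    shipped under ``.mim`` is lowercase ``configs``, so the lookup must match
    case exactly (Linux filesystems are case-sensitive, so ``'Configs'``
    silently fails there even if it happens to resolve on macOS or Windows).
    """
    mim_dir = osp.join(package_path, '.mim')
    configs_dir = osp.join(mim_dir, 'configs')
    if not osp.exists(configs_dir):
        raise FileNotFoundError(
            f'Cannot find the configs directory in {package_path}, '
            f'please check the completeness of the {scope}.')
    return configs_dir
```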