
OpenSet Detection Issue after COCO Fine-Tuning #78

Open
mio410 opened this issue Feb 26, 2024 · 5 comments
Labels: bug (Something isn't working), Working on it now!

Comments


mio410 commented Feb 26, 2024

After fine-tuning on the COCO dataset (80 classes), I found when testing the model again that it could only detect the classes present in COCO; it failed to detect any class outside the COCO dataset. It seems the model has become a closed-set detector. Why is this happening, and is there a solution that improves the model's confidence on certain classes without compromising its open-set detection capability?
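For context, open-vocabulary detectors in the YOLO-World family score region features against text embeddings of the class prompts, so fine-tuning on only COCO's 80 prompts can pull the shared embedding space toward those classes and make the model behave closed-set. A minimal NumPy sketch of the scoring idea (the function name, shapes, and `logit_scale` value are illustrative assumptions, not the repository's actual code):

```python
import numpy as np

def open_vocab_scores(region_feats, text_embeds, logit_scale=100.0):
    """Score each candidate region against each class prompt.

    region_feats: (R, D) visual features, one row per region.
    text_embeds:  (C, D) text embeddings, one row per class prompt.
    Returns an (R, C) matrix of per-class probabilities
    (independent sigmoid over scaled cosine similarity).
    """
    r = region_feats / np.linalg.norm(region_feats, axis=1, keepdims=True)
    t = text_embeds / np.linalg.norm(text_embeds, axis=1, keepdims=True)
    logits = logit_scale * (r @ t.T)       # scaled cosine similarity
    return 1.0 / (1.0 + np.exp(-logits))   # sigmoid per class

# Swapping in a new prompt list re-parameterizes the classifier head;
# the detector stays open-set only if the fine-tuned visual features
# still align with embeddings of classes unseen during fine-tuning.
```

Under this view, the symptom in this issue is the visual features overfitting to the 80 COCO prompts rather than anything wrong with the prompt-swapping mechanism itself.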

taofuyu (Contributor) commented Feb 26, 2024

I'm running into the same issue.

@Hudaodao99

> After fine-tuning on the COCO dataset (80 classes), I found when testing the model again that it could only detect the classes present in COCO; it failed to detect any class outside the COCO dataset. It seems the model has become a closed-set detector. Why is this happening, and is there a solution that improves the model's confidence on certain classes without compromising its open-set detection capability?

Did you get the same result as reported on GitHub (YOLO-World-L mAP = 53.3)?

mio410 (Author) commented Feb 27, 2024

> Did you get the same result as reported on GitHub (YOLO-World-L mAP = 53.3)?

[Screenshot 2024-02-27 11:08:55] This is the validation result from my last epoch.

@LiuChuanWei

Have you solved this problem yet?

@wondervictor (Collaborator)

Hi all (@mio410, @LiuChuanWei, @Hudaodao99): happy to report a milestone. I've now tried a new setting with SGD and a shorter mosaic-augmentation schedule, and fine-tuning without mask-refine or copy-paste works.

1. Reduce the mosaic epochs and increase the normal (mosaic-free) epochs:

```python
max_epochs = 40           # maximum training epochs
close_mosaic_epochs = 30  # final epochs trained without mosaic augmentation
```

2. Use the SGD optimizer, and add weight decay for BN and bias:

```python
# `weight_decay` and `train_batch_size_per_gpu` are defined elsewhere in the config.
optim_wrapper = dict(
    optimizer=dict(_delete_=True,
                   type='SGD',
                   lr=1e-3,
                   momentum=0.937,
                   nesterov=True,
                   weight_decay=weight_decay,
                   batch_size_per_gpu=train_batch_size_per_gpu),
    paramwise_cfg=dict(custom_keys={'logit_scale': dict(weight_decay=0.0)}),
    constructor='YOLOWv5OptimizerConstructor')
```
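For reference, in MMYOLO-style configs the mosaic/normal epoch split is usually wired up with a pipeline-switch hook. A sketch under the assumption that this config follows MMYOLO's YOLOv8 convention (`train_pipeline_stage2` is a placeholder name for the mosaic-free pipeline, not verified against this repository):

```python
# Assumed MMYOLO-style fragment: switch to the mosaic-free pipeline for the
# last `close_mosaic_epochs` epochs of training.
max_epochs = 40
close_mosaic_epochs = 30

custom_hooks = [
    dict(type='mmdet.PipelineSwitchHook',
         switch_epoch=max_epochs - close_mosaic_epochs,
         switch_pipeline=train_pipeline_stage2),
]
```

With `max_epochs = 40` and `close_mosaic_epochs = 30`, the switch fires at epoch 10, so only the first 10 epochs use mosaic, matching "reduce mosaic epochs, increase normal epochs" above.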

Under this setting, YOLO-World-Large without mask-refine can achieve 52.8 AP on COCO (better than YOLOv8) and improves on the previous, incorrect baseline (48.6 AP). BTW, fine-tuning with mask-refine now reaches 53.9 AP.

This is a milestone but not the terminus; we are still working toward an even better fine-tuning setting!

Those updates will be pushed in a day.
