
How to measure inference time for cudla standalone mode? #26

Closed
Railcalibur opened this issue Feb 20, 2024 · 2 comments
Labels: question (Further information is requested)

@Railcalibur
The code only measures inference time for hybrid mode.

Can I get the correct inference time for standalone mode by commenting out the conditional statement? If not, how can the time be measured correctly?

https://github.com/NVIDIA-AI-IOT/cuDLA-samples/blob/main/src/yolov5.cpp#L261

[screenshot of the timing code at the linked line]

@lynettez lynettez self-assigned this Mar 25, 2024
@lynettez lynettez added the question Further information is requested label Mar 25, 2024
@lynettez (Collaborator)

We recommend measuring the DLA task execution time with Nsight Systems only.
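For reference, a capture on the Jetson/DRIVE target might look like the sketch below. This command line is an assumption, not taken from the sample: the binary name is hypothetical, and the `--trace` values plus the switch that enables DLA/accelerator rows differ between nsys versions and platforms, so verify the option names with `nsys profile --help` on your device.

```shell
# Capture a timeline of the standalone-mode run; the DLA task rows in the
# resulting report show the on-accelerator execution time directly.
nsys profile --trace=cuda,nvtx --output=cudla_standalone ./cudla_yolov5_app

# Open the .nsys-rep in the Nsight Systems GUI, or summarize it from the CLI:
nsys stats cudla_standalone.nsys-rep
```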

@lynettez (Collaborator) commented Sep 2, 2024

Closing since there has been no activity for several months, thanks!

@lynettez lynettez closed this as completed Sep 2, 2024