This mobile application auto-zooms into detected objects using a locally running detection model. The attached flowchart illustrates the handshake between the mobile device and the detection server. Unlike traditional approaches that send every 10th frame (or frames at any fixed interval) for detection, which leads to delayed and potentially missed detections, this logic keeps performance real-time by sending a new frame only when the server is idle. A 120 fps camera feed produces a new frame roughly every 8 ms, while detection can take 1-2 seconds per frame, so this approach avoids manual frame selection and prevents lag. Managing frame dispatch this way keeps detection and the display of results accurate and timely: there is no waiting through multiple stale frames once the model finishes a detection, and no backlog of unprocessed frames builds up.
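For illustration, here is a minimal Dart sketch of the idle-gated dispatch, assuming the `camera` package's image stream; `runDetection`, `FrameDispatcher`, and the field names are hypothetical placeholders standing in for the project's actual TFLite call and state, not its real API:

```dart
import 'package:camera/camera.dart';

/// Placeholder for the real TFLite inference call; assumed to take 1-2 s.
Future<List<String>> runDetection(CameraImage image) async {
  // ... run the locally hosted detection model here ...
  return <String>[];
}

class FrameDispatcher {
  bool _isDetecting = false; // true while the detector ("server") is busy

  /// Wire this up as the camera's image-stream callback.
  void onNewFrame(CameraImage image) {
    // While a detection is in flight, drop incoming frames instead of
    // queueing them, so no backlog of stale frames builds up.
    if (_isDetecting) return;

    _isDetecting = true;
    runDetection(image).then((results) {
      // Update the UI / auto-zoom from `results` here.
    }).whenComplete(() {
      // Detector is idle again: the very next frame is accepted, so the
      // newest available frame is always the one processed.
      _isDetecting = false;
    });
  }
}

// Usage with the camera package (inside an async setup routine):
//   final controller = CameraController(cameras.first, ResolutionPreset.high);
//   await controller.initialize();
//   controller.startImageStream(FrameDispatcher().onNewFrame);
```

With this gate, frame selection is implicit: whichever frame arrives first after the detector frees up is the one processed, with no interval tuning required.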
Demo: `2024-03-15.19-03-20.mp4`
- Thanks to CCExtractor for the idea
- Flutter-Tflite for object detection on the live camera feed