Bug during the transfer that prevents the transmission of the complete payload #314
Comments
Hi Richard, thanks for the report. Could you rerun your test with this patch applied: 783a296? It will display the number of received bytes in case of incomplete transfers. Make sure the debug output is enabled.
After applying your patch...
The bug is due to the naive computation of the USB transfer parameters done in arvuvstream.c, to be compared with what is done in the National Instruments driver: https://github.com/ni/usb3vision/blob/ac1599d4ef58c61863839504adff37eaf61810a3/u3v_stream.c#L279
Could you try to change the value of ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE and set it to 256*256? (See line 38 of arvuvstream.c in 783a296.)
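For reference, the suggested change amounts to something like the following in arvuvstream.c. This is only a sketch: the original 1024*1024 value and the line number are taken from this discussion, so double-check them against your checkout.

/* arvuvstream.c, around line 38 in 783a296 (sketch, not the exact upstream code) */
/* #define ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE (1024 * 1024)   original 1 MiB value */
#define ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE (256 * 256)      /* lowered to 64 KiB for testing */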
Making said change enables a stream of 1200x1200 @ 25 fps with arv-camera-test. It does not change the result of the gst pipeline, however.
Ok, that is better, even if the incomplete warning is a bit worrying. We have to find out how to determine the optimum maximum transfer value, instead of using a hardcoded #define.
What is the UV stream, by the way?
I'm not sure I understand the question. ArvUvStream derives from ArvStream and implements the USB3Vision streaming protocol.
A better question would have been: how does ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE affect the stream? As the name suggests, it defines the maximum USB3Vision transfer size, simple enough. What puzzled me was how reducing this maximum from 2^20 to 2^16 made arv-camera-test complete its buffers successfully. I can test a range of values tomorrow, but it will be just blind guessing. With my camera, the USB3Vision bandwidth limit can be controlled, but its current value of 200 MB is greater than the 1.44 MB payload of the test stream. So I assume that the bandwidth limit is something other than the stream's maximum transfer size.
This value is used in the computation of the USB3Vision streaming parameters (lines 294 to 314 in 783a296).
It looks like on some systems there is a maximum transfer size that should not be exceeded, but I don't know how to determine this limit.
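To illustrate the kind of computation being discussed, here is a sketch of splitting a payload into maximum-size transfers plus a remainder. The function and variable names are invented for illustration; the real code in arvuvstream.c and in the NI driver linked above is more involved.

#include <stdio.h>
#include <stddef.h>

#define MAXIMUM_TRANSFER_SIZE (1024 * 1024)

static void
compute_transfer_sizes (size_t payload_size,
                        unsigned *n_full_transfers,
                        size_t *full_transfer_size,
                        size_t *last_transfer_size)
{
        /* As many maximum-size transfers as fit, plus one for the remainder. */
        *n_full_transfers = (unsigned) (payload_size / MAXIMUM_TRANSFER_SIZE);
        *full_transfer_size = MAXIMUM_TRANSFER_SIZE;
        *last_transfer_size = payload_size % MAXIMUM_TRANSFER_SIZE;
}

int
main (void)
{
        unsigned n;
        size_t full, last;

        compute_transfer_sizes (2592u * 1944u, &n, &full, &last);   /* GRAY8 payload from the report below */
        printf ("%u x %zu bytes + %zu bytes\n", n, full, last);
        return 0;
}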
I've forwarded this issue to Allied Vision. This is their response: "DeviceLinkThroughputLimit in Byte/s effectively controls the transfer bandwidth." I will perform some testing with various device link throughput limits in an attempt to achieve a successful stream. I'll keep you posted.
Hi again, I've done some testing and found that decreasing #define ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE in arvuvstream.c to 256x256, setting my camera's DeviceLinkThroughputMode to 'On', and setting the DeviceLinkThroughputLimit to less than 185 000 000 bytes allowed for successful streaming at 2592x1944, RGB, @ 4 fps. Increasing the link throughput limit enough to allow 25 fps is too much for my USB port and causes missed frames. This also happens with Allied Vision's simple frame grabber C program, so it is not the fault of Aravis. I have found that if ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE is left at its original value of 1048576, setting the DeviceLinkThroughputLimit alone still results in missed frames. Regarding this, I believe that an aravissrc property that sets the device link throughput limit would be beneficial. Also, as a side note, I know that when my camera is in ExposureAutoMode=Continuous it has a tendency to increase the exposure so much that the frame rate decreases. Could a plugin property be added which allows for control of the exposure auto limits?
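For anyone wanting to reproduce this from code rather than through a GenICam GUI, setting these features through Aravis could look roughly like the following. This is a sketch assuming the Aravis 0.8 camera API (arv_camera_new, arv_camera_set_string and arv_camera_set_integer take a GError** there; older versions differ), and the feature names may vary per device: the comment above mentions DeviceLinkThroughputMode, while SFNC names it DeviceLinkThroughputLimitMode.

#include <arv.h>
#include <stdio.h>

int
main (void)
{
        GError *error = NULL;

        /* Open the first camera found (pass a name to select a specific one). */
        ArvCamera *camera = arv_camera_new (NULL, &error);
        if (camera == NULL) {
                fprintf (stderr, "Failed to open camera: %s\n", error->message);
                g_clear_error (&error);
                return 1;
        }

        /* Feature names per SFNC; check your camera's XML for the exact spelling. */
        arv_camera_set_string (camera, "DeviceLinkThroughputLimitMode", "On", &error);
        if (error == NULL)
                arv_camera_set_integer (camera, "DeviceLinkThroughputLimit", 185000000, &error);
        if (error != NULL) {
                fprintf (stderr, "Failed to set feature: %s\n", error->message);
                g_clear_error (&error);
        }

        g_object_unref (camera);
        return 0;
}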
Thanks for your feedback. Regarding aravissrc, I think we need a new property that would take a string containing a list of feature/value pairs, similar to the parameters of arv-tool.
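A minimal sketch of what parsing such a property string could look like. Everything here is hypothetical, including the string format and the apply_feature() helper; this is not the actual aravissrc implementation.

#include <glib.h>
#include <stdio.h>

/* Hypothetical callback: in a real element this would forward the value to the camera. */
static void
apply_feature (const char *name, const char *value)
{
        printf ("set %s = %s\n", name, value);
}

/* Parse a string such as "DeviceLinkThroughputLimit=185000000,ExposureAuto=Off". */
static void
apply_feature_string (const char *features)
{
        char **pairs = g_strsplit (features, ",", -1);

        for (guint i = 0; pairs[i] != NULL; i++) {
                char **kv = g_strsplit (pairs[i], "=", 2);

                if (kv[0] != NULL && kv[1] != NULL)
                        apply_feature (g_strstrip (kv[0]), g_strstrip (kv[1]));
                g_strfreev (kv);
        }
        g_strfreev (pairs);
}

int
main (void)
{
        apply_feature_string ("DeviceLinkThroughputLimit=185000000, ExposureAuto=Off");
        return 0;
}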
Hi @EmmanuelP, we're experimenting with Aravis and several camera models. For some older Basler cameras we also had to reduce ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE. Thus, it is currently not possible to find one value of ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE that works for all of our cameras. So would it be an option to add a new method that can set this value at runtime? Our test program basically just uses
I can also try to come up with some more debug info if you want - just tell me what you need. |
Hello - I am involved with the University of Aberdeen on a project where we use various monochrome machine vision cameras for the purpose of recording digital holograms of plankton. I originally used Aravis under Linux to implement a holographic camera (weeHoloCam) when it was at release 0.6, with no issues; however, I'm now using it under Windows for holography with a collection of different sensors. I have found that various high-spec Windows laptops have a tendency to drop frames with ARV_BUFFER_STATUS_MISSING_PACKETS and similar issues, while other laptops are fine. This seems to be at least partly computer dependent. I had assumed it was bottlenecks in the USB subsystems, but now I'm not at all sure. Due to the apparent dependency on different computers as well as different cameras, I would like to advocate exposing the ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE parameter via your API so it can be modified at runtime. Given the issues I've been experiencing, I would be inclined to implement an auto-tune of this parameter if I could access it at runtime. For example, we have one laptop that works fine using a Jai GO-5100M with ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE set to 1024x1024, but on a higher-spec laptop I have had to reduce this to 256x256 to get the sensor working reliably. As the system is currently with a customer, I have had to recompile Aravis and have them replace libaravis-0.8-0.dll progressively until they found a version that worked.
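A rough illustration of that auto-tuning idea. Everything here is hypothetical: at the time of this discussion Aravis exposes no setter for the maximum transfer size, so arv_uv_stream_set_maximum_transfer_size() and stream_is_reliable() are invented placeholders for whatever API might be added.

#include <stdio.h>
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical setter standing in for an API that does not exist yet. */
static void
arv_uv_stream_set_maximum_transfer_size (size_t size)
{
        printf ("trying maximum transfer size = %zu bytes\n", size);
}

/* Hypothetical probe: grab a few buffers and report whether they all completed. */
static bool
stream_is_reliable (void)
{
        return false;   /* placeholder; a real probe would inspect the buffer statuses */
}

int
main (void)
{
        /* Start at 1 MiB and halve until streaming is reliable or a floor is reached. */
        size_t size = 1024 * 1024;

        while (size >= 64 * 1024) {
                arv_uv_stream_set_maximum_transfer_size (size);
                if (stream_is_reliable ())
                        break;
                size /= 2;
        }
        return 0;
}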
FYI, to make it work with Basler cameras, I do need to change ARV_UV_STREAM_MAXIMUM_TRANSFER_SIZE.
@dxt, @PolyVinalDistillate, @schrammae, hi. Could you please test #958? This merge request implements a new way to configure the USB transfer size at runtime.
Describe the bug
Bug during video transfer that prevents the transmission of the complete payload. Streaming from an Allied Vision GenICam camera at a resolution greater than approx. 350x350 results in "Incomplete image received, dropping".
To Reproduce
Using an Allied Vision Alvium 1800-500c camera, attempt the following pipeline:
gst-launch-1.0 -e aravissrc camera-name="Allied Vision-xxxxx"
! video/x-raw,format=GRAY8,width=2592,height=1944,framerate=4/1
! videoconvert
! ximagesink
Expected behavior
A successfully displayed video stream at 4 frames per second
Camera description:
Platform description:
NVIDIA Pascal™ Architecture GPU
2 Denver 64-bit CPUs + Quad-Core A57 Complex
8 GB 128-bit LPDDR4 Memory
32 GB eMMC 5.1 Flash Storage
Additional context
When running nearly the same pipeline at a far lower resolution (e.g. 200x200), the stream works fine. As the resolution increases, the displayed video becomes more and more choppy, as if the USB bandwidth were reaching its maximum capacity. I have found this threshold to be around 350x350; anything greater results in the incomplete image errors. The amount of data in such a stream should not come close to saturating USB 3.0.
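A quick back-of-the-envelope check supports that claim, assuming GRAY8 (1 byte per pixel, as requested in the pipeline above) and USB 3.0's nominal 5 Gbit/s, i.e. roughly 500 MB/s before protocol overhead:

#include <stdio.h>

int
main (void)
{
        /* Assumption: GRAY8, 1 byte per pixel. */
        const double bytes_per_frame = 2592.0 * 1944.0;

        printf ("payload per frame: %.1f MB\n", bytes_per_frame / 1e6);    /* ~5.0 MB   */
        printf ("at  4 fps: %.1f MB/s\n", bytes_per_frame * 4.0 / 1e6);    /* ~20 MB/s  */
        printf ("at 25 fps: %.1f MB/s\n", bytes_per_frame * 25.0 / 1e6);   /* ~126 MB/s */
        return 0;
}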
I have confirmed that my hardware is functioning properly by running a test stream with Allied Vision's example software (Vimba API built on top of GenICam API), which successfully captures frames at full resolution of 2592x1944 and framerates upwards of 24 fps.
arv-camera-test-output.txt