Setting config parameters from a yaml file in ROS #2366
Hi @hakunaMatataHub Thanks for your questions. Whilst I do not have information about setting up sequence_id in a yaml file, the links below provide advice about setting it in launch file rosparams. This may provide some useful insights about how to define those instructions in a yaml file instead: ROS1, ROS2.

Please note that the ROS2 advice in the link above was intended for the previous ROS2 wrapper (up to version 3.2.3) rather than the current ros2_beta wrapper (currently version 4.0.4).

You can set the disparity shift by defining its value in a json camera configuration file and then loading that json file during launch, as described at #981. The depth unit scale cannot be changed in the RealSense ROS wrapper, for reasons explained at #277 (comment).
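For reference, loading a json configuration file at launch with the ROS1 wrapper can look something like the command below (the file path is illustrative):

```bash
# Load a json camera configuration, e.g. one exported from the RealSense Viewer;
# the path is just an example.
roslaunch realsense2_camera rs_camera.launch json_file_path:=/home/user/custom_preset.json
```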
@MartyG-RealSense How do I set depth clamp min and depth clamp max in the ROS wrapper? At least I can deduce that the depth clamp min/max values should be within the min and max depth values resulting from the depth-disparity relation corresponding to a window of 126 disparity, right?
The RealSense ROS wrapper does not support setting a minimum depth distance. You can set a maximum distance with the clip_distance parameter. However, if you use rtabmap-ros then you can set a minimum distance with its RangeMin parameter and a maximum with RangeMax. More information about this can be found at introlab/rtabmap_ros#413

Disparity Shift is not the same as a depth clamp, which defines the range of distance values that are allowed to be rendered on the depth image and excludes from the image the depth values that are outside that defined min-max range.

When the disparity shift value is increased, the camera's minimum depth sensing distance decreases, enabling the camera to move closer to the surface of an object before the depth image starts breaking up as the camera moves below the minimum distance. However, the camera's maximum observable depth distance also decreases as disparity shift is increased. A disparity shift of between 50 and 100 provides a good balance between being able to move the camera closer to a surface and not losing most of the longer-range background detail. A disparity shift may only improve the depth image in regard to the area of the depth image nearest to the camera being more detailed.

Unless you are using rtabmap_ros's RangeMin and RangeMax, though, the only clamp option available to you will be to set the maximum distance with clip_distance.
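For instance, clip_distance can be set in the roslaunch instruction like this (the 3 meter value is just an example):

```bash
# Exclude depth values beyond 3 meters from the published depth image.
roslaunch realsense2_camera rs_camera.launch clip_distance:=3.0
```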
@MartyG-RealSense
There isn't a difference. The threshold filter is one of the methods available to implement depth clamping, by controlling which distance values are permitted to be included on the image.
If I point my RealSense towards the ceiling of my room with depth unit = 1000, depth clamp min = 0 and depth clamp max = 65536 in the depth table under advanced controls, the pixels corresponding to the ceiling in the depth image show a depth of ~2.2 m. If I change only the depth unit to say 4000, the depth for the same camera-scene configuration changes to almost half, i.e. ~1 m, and if I decrease it to say 300 it is beyond 10 m for the same scene-camera configuration. Doesn't the depth unit say how many micrometers one unit in the 16-bit representation of the depth image corresponds to? Why does it change the value of the estimated depth?
Real distance in meters can be obtained by multiplying the raw 16-bit depth value by the depth unit scale, which on the 400 Series cameras is usually 0.001 (m) by default (except for the D405 model, where it is 0.01 by default). So if the maximum raw depth value of 65536 is multiplied by 0.001 then that gives a real distance value of 65.536 m. Likewise, a raw value of 2200 at the default scale corresponds to 2.2 m.

Most of the Advanced Mode features are undocumented by Intel and do not have explanations of their functions. This is because Advanced Mode controls interact with one another in complex ways, and so Intel chose to control them automatically with machine learning algorithms (whilst allowing users to perform trial and error experimentation with them if they wish to do so). I would therefore recommend leaving the Advanced Mode's Depth Units setting alone and instead controlling scale with the Depth Units option under Stereo Module > Controls.
@MartyG-RealSense Just that their units are different: the one under advanced controls is in micrometers while the other one is in meters.
I tested that and you are correct: changing the Controls > Depth Units does automatically change the Advanced Mode > Depth Units. This is an example of how the undocumented nature of Advanced Mode functions can lead to surprises even for expert users. :) Two RealSense team members at IntelRealSense/librealsense#5487 explain how the two modes are interconnected and, like I did, they recommend using the Controls version of the setting instead of Advanced Mode.
@MartyG-RealSense One last thing: I am not able to see metadata such as sequence id and exposure in the realsense-viewer.
When auto-exposure is enabled, the exposure value does not update, and so the Actual Exposure metadata value represents the exposure value. HDR metadata values are available for sequence size, sequence ID and sequence name. As the Viewer's metadata overlay was programmed a long time before the HDR mode was introduced, it is likely that the selection of metadata options displayed in the overlay is fixed in the Viewer's code and so does not contain the HDR metadata. That metadata should, however, be retrievable through scripting when creating your own program.
@MartyG-RealSense Does this mean it is not supported in 20.04 LTS?
References to frame metadata in the SDK documentation usually refer to the original metadata types that are listed in the metadata overlay in the Viewer.

Kernel patching support for 20.04 (Focal) and kernel 5.8 was added in SDK 2.45.0, as detailed at IntelRealSense/librealsense#8787, and a patch was added for Focal and kernel 5.11 in SDK 2.50.0, as detailed at IntelRealSense/librealsense#9727. The patching process should add hardware metadata support to a source code build of librealsense. Patching is not necessary when building from Debian packages or from source code with the RSUSB backend installation method.
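On Ubuntu, the kernel patch for a source code build is applied with the patch script in the librealsense repo's scripts folder, along the lines of the commands below (exact usage can vary between SDK versions):

```bash
# Apply the Ubuntu LTS kernel patch from a librealsense source checkout.
cd librealsense
./scripts/patch-realsense-ubuntu-lts.sh
```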
@MartyG-RealSense I didn't get the first part of your comment. Do you mean the highlighted metadata type is not one of the original metadata types?
If you are installing the SDK with Debian packages then you do not need to patch, as the patch is included in the package. What I mean about the Real Exposure metadata type is that, because the exposure value does not update whilst auto-exposure is enabled, you cannot read the exposure unless you disable auto-exposure or obtain the 'Real Exposure' metadata value instead of querying the exposure value directly.
@MartyG-RealSense
I was able to duplicate your metadata readout with a D435i where support for hardware metadata was not enabled. However, your situation is different in that you have Global Time enabled and no red warning message about lack of hardware metadata, suggesting to me that you should have metadata support but something is going wrong.

If you built from Debian packages then the most recent kernel officially supported should be 5.4, whereas a source code build will support more recent kernels used with 20.04 such as 5.8 and 5.11. Newer kernels that are not officially supported can work with the SDK, but there may be unpredictable consequences in regard to stability.
I just checked; my kernel is 5.13.0-44-generic. So do you recommend building it from source for hardware metadata support?
Yes, I would recommend a source code build. If you do not want to change your kernel then you could try building the SDK from source code with the RSUSB backend installation method. This install method is compatible with 20.04. It bypasses the Linux kernel and so is not dependent on Linux versions or kernel versions, and does not require patching. Hardware metadata support is provided automatically in an RSUSB build without having to patch it in.
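If it is useful, an RSUSB build from source typically looks something like this (a sketch of the standard CMake flow; the job count is arbitrary):

```bash
# Build librealsense with the RSUSB backend (bypasses the kernel, no patching needed).
git clone https://github.com/IntelRealSense/librealsense.git
cd librealsense && mkdir build && cd build
cmake .. -DFORCE_RSUSB_BACKEND=ON -DCMAKE_BUILD_TYPE=Release
make -j4 && sudo make install
```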
@MartyG-RealSense
I have the following parameters set in a yaml file which is uploaded to the rosparam server from the rs_camera.launch file.

This is rs_camera_config.yaml:

```yaml
/camera/temporal/filter_smooth_alpha: 0.1
```

This is the rs_camera.launch file:

This is middensitypreset.json, which is included in rs_camera.launch:

{
@MartyG-RealSense But I would also need the hdr_merge filter to be turned on to improve my depth.
My understanding is that in the ROS1 wrapper you can enable hdr_merge in the roslaunch instruction as a filter with filters:=hdr_merge. For example:
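```bash
# Launch the ROS1 wrapper with the hdr_merge post-processing filter enabled.
roslaunch realsense2_camera rs_camera.launch filters:=hdr_merge
```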
If you needed to use more than one filter, such as pointcloud and hdr_merge, then you can separate the filters with a comma, with no space between them:
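```bash
# Multiple filters are comma-separated with no spaces.
roslaunch realsense2_camera rs_camera.launch filters:=pointcloud,hdr_merge
```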
The hdr_merge instruction is listed within the filters section of the RealSense ROS wrapper documentation, confirming that it should be handled as a filter type. Based on the information in #1657 (comment), the rosrun equivalents of the HDR settings may look something like the instructions below if you wish to apply the settings during runtime after launch has completed.
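```bash
# Runtime HDR configuration via dynamic_reconfigure; the exposure and gain
# values are illustrative, and the parameter names are assumed to follow the
# wrapper's stereo_module reconfigure group.
rosrun dynamic_reconfigure dynparam set /camera/stereo_module hdr_enabled true
rosrun dynamic_reconfigure dynparam set /camera/stereo_module exposure/1 8500
rosrun dynamic_reconfigure dynparam set /camera/stereo_module gain/1 16
rosrun dynamic_reconfigure dynparam set /camera/stereo_module exposure/2 1
rosrun dynamic_reconfigure dynparam set /camera/stereo_module gain/2 16
```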
I include the exposure/1 and exposure/2 commands as an example of setting values for 2 different exposures, though you may only need to define exposure/2.
@MartyG-RealSense I am able to turn on the hdr_merge filter from within the rs_camera.launch file along with the spatial, temporal and decimation filters. My issue is that it does not turn on HDR, because I cannot see any flickering when I visualize the infrared images in rqt_image_view.
@MartyG-RealSense Why is this needed in hdr_merge?
The purpose of the rosrun command that I listed was to act as an example of how to define all the instructions associated with HDR in rosrun, not necessarily to use them all in the same instruction. There is little information available about using the HDR commands in the RealSense ROS wrapper, so I have attempted to provide as much as I can of the information that is available in order to help you work out the commands that you need to use to achieve the desired effect. HDR configuration in the ROS wrapper is not a subject that I am familiar with though, so there is a limit to the advice that I can provide on the subject, unfortunately.
@MartyG-RealSense
I have referred your question to the RealSense ROS development team.
@MartyG-RealSense
@MartyG-RealSense
EDIT: The HDR Merge feature works fine in the Intel RealSense Viewer.
I have not received a response yet to my referral of your question to Intel. I will update you when I do. Thanks very much for your patience.

As you have performed a Debian installation and then a source code build on the same computer, this can result in conflicts such as multiple udev rules. You should be able to completely remove the Debian packages related to librealsense using the command below:
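```bash
# Purge all librealsense Debian packages; the exact package glob is an assumption.
sudo apt-get purge '*librealsense*'
```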
@MartyG-RealSense I did a source installation only for the ROS wrapper.
@MartyG-RealSense
I have not received a response yet. I promise that I will update when I do.
Hi @hakunaMatataHub Intel has concluded their discussion of the issue that you reported. They decided that the lack of flickering on the infrared image when visualizing it in rqt_image_view was a valid concern, and an official bug report for the issue was created so that it can be investigated by Intel.
@MartyG-RealSense
No, the reports are internal and not publicly viewable or trackable. If a fix is implemented then it is typically listed in release notes, such as those for librealsense or the ROS wrapper (which has a Fixed Issues list in its release notes).

I added an Enhancement label to this case, indicating that the case should be kept open until the bug report is concluded. When a report is closed, the Enhancement label is removed from the case. So you can tell at a glance whether the bug report is still open by looking at the list of labels in the side panel at the top of the case.
Hi @MartyG-RealSense @hakunaMatataHub, I've added hdr_merge.enable and depth_module.hdr_enabled to the rs_launch.py file (ros2-development branch).
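With that change, enabling HDR from the ROS2 wrapper should look something like the command below (parameter override syntax assumed from the wrapper's usual := conventions):

```bash
ros2 launch realsense2_camera rs_launch.py depth_module.hdr_enabled:=true hdr_merge.enable:=true
```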
Let me know if you need further assistance. |
Excellent, thanks so much @SamerKhshiboun :) @hakunaMatataHub Can you confirm whether the amended rs_launch.py file resolves the issue for you, please? Thanks!
As the hdr_merge filter change has been merged into the ROS wrapper, I will close this case. Thanks!
I have set HDR_Merge to true, set the sequence name to 0, and set the sequence size to 2 using ROS yaml files; a sketch of the relevant entries is below.
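The entries look something like this (the parameter names are my assumption, mirroring the wrapper's stereo_module dynamic reconfigure names):

```yaml
# HDR-related rosparam entries; names assumed from the stereo_module
# dynamic reconfigure group.
/camera/stereo_module/hdr_enabled: true
/camera/stereo_module/sequence_name: 0
/camera/stereo_module/sequence_size: 2
```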
How can I set exposure and gain values for the two frames in the subpreset sequence, i.e. subpreset sequence id 1 and subpreset sequence id 2, using the yaml file?

In the RealSense Viewer I am able to set the exposure and gain values for individual sequence ids using the slider when HDR Enabled is checked. I also note that if HDR Enabled is checked in realsense-viewer it automatically deselects auto exposure, which kind of makes sense. Should I also explicitly turn off auto exposure in the ROS yaml file?

Also, it would be very helpful if you could guide me on where I can set disparity shift, min z, max z values, and depth units in the ROS wrapper. I could not locate them in the rs_camera.launch file nor in the dynamic reconfigure GUI.