Sparse point cloud of the last n range sensor readings in the "room frame".
We can use the relative pose between successive range sensor readings to align them in a common frame, i.e. successive readings of the same wall should end up on a line. This (very) sparse point cloud might then be usable by the next-generation decision maker. The odometry estimate from the optical flow sensor will drift over time, invalidating old mapped features, but we can simply discard those. This gives us a less extensive map, but it may be helpful nevertheless.
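The alignment step above can be sketched as a pure function. This is only an illustrative 2-D sketch (the function name and the `(x, y, yaw)` pose convention are assumptions, not part of the codebase), assuming a single forward-facing range sensor:

```python
import math

def range_to_room_frame(pose, rng):
    """Project a forward-facing range reading into the room frame.

    pose: (x, y, yaw) of the drone in the room frame, yaw in radians
          (here taken from the relative-pose estimate, e.g. odometry).
    rng:  measured distance along the sensor axis, assumed to point
          straight ahead along the drone's body x axis.
    Returns the (x, y) of the hit point in the room frame.
    """
    x, y, yaw = pose
    return (x + rng * math.cos(yaw), y + rng * math.sin(yaw))
```

With this, successive readings of the same wall taken from different poses should indeed land on a line in the room frame, as long as the pose estimate has not drifted too far.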
Tasks

- Read the relative pose from mavros.
- Read the range sensor values.
- Decide on a suitable data structure for the map.
- Implement the mapping.
- Publish a ROS point cloud to a suitable topic (and view the visualisation in rviz).
- Write tests.
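For the data-structure task, one natural candidate is a fixed-size ring buffer: since drifted features are simply discarded, keeping only the newest n points does exactly that for free. A minimal sketch (class and method names are hypothetical, not from the repo):

```python
from collections import deque

class LocalMap:
    """Sparse local map: keeps only the newest n points, so stale
    measurements invalidated by odometry drift fall out automatically."""

    def __init__(self, n=200):
        # deque with maxlen discards the oldest entry on overflow
        self._points = deque(maxlen=n)

    def add(self, point):
        """Append an (x, y) point in the room frame."""
        self._points.append(point)

    def points(self):
        """Snapshot of the current map, oldest first."""
        return list(self._points)
```

A `deque(maxlen=n)` gives O(1) insertion and automatic eviction; if the map later needs spatial queries, this could be swapped for a grid or k-d tree without changing the node interface.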
Criteria
A node that publishes a local map around the drone.
- `nodes/map_node.py`, which contains the ROS interface of the node and calls the logic in `src`.
- `src/local_map/map_logic.py`, which contains the logic called from both the tests and the node.
- `test/map_logic_test.py`, which contains the unit tests for the logic in `src`.
I added some TODOs, both in the interface part (e.g. finding the correct topic for our case, and converting from the data type the logic outputs to the relevant ROS message) and in the logic part. To be fair, none of the logic is implemented yet, and most of the task is to implement it.
I encourage you to separate the logic as much as is reasonable into pure functions, to make testing easier.
To test this, you can write unit tests for each function you make. Once we are ready to fly, we can also test it by running it together with the rest of the system.
I have now made the node interface in 3f4b8da. I briefly tested that publishing point clouds to rviz works. I have not tested the range sensors. I believe I have found the correct mavros topic for the position, but if not, we can change it later.