Replace '.points.' with just '.' #4217
Conversation
Depends on #4220
Ignore the changes affecting
Marking this as needs more work, because of conflicts. The current state is all reviewed from my side.
Can't reproduce test failure in marching cubes.
The tests are complaining on all 3 OSes.
That was my bad. Please check now.
Not even in docker? We can't merge this until we understand what caused it. If we manage to isolate a part of the code which has no influence, we can merge it in advance, but otherwise we're merging something which turns our tests red from now on, and that doesn't make much sense.
@@ -246,7 +246,7 @@ pcl::MarchingCubes<PointNT>::performReconstruction (pcl::PointCloud<PointNT> &po
   voxelizeData ();

   // preallocate memory assuming a hull. suppose 6 point per voxel
-  double size_reserve = std::min((double) intermediate_cloud.points.max_size (),
+  double size_reserve = std::min((double) intermediate_cloud.max_size (),
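For context, the whole PR hinges on pcl::PointCloud forwarding std::vector-style calls (size (), max_size (), resize (), ...) to its internal points vector, which is what the prerequisite #4220 provides. A minimal sketch of that equivalence, assuming those accessors are available (this is illustrative code, not part of the diff):

```cpp
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <cassert>

int main ()
{
  pcl::PointCloud<pcl::PointXYZ> cloud;
  cloud.resize (10);                                        // same effect as cloud.points.resize (10)
  assert (cloud.size () == cloud.points.size ());           // accessor forwards to the points vector
  assert (cloud.max_size () == cloud.points.max_size ());   // the call used in the diff above
  return 0;
}
```

With those forwarding accessors in place, dropping the `.points.` part is intended to be a pure spelling change with no behavioral difference.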
This is the line causing the error. I've seen a similar fix (by @mvieth I think) regarding std::min or std::max where using them introduced an error.
The testing was done here:
https://kunaltyagi.visualstudio.com/pcl/_build?definitionId=3&_a=summary&repositoryFilter=2&branchFilter=99%2C99%2C99%2C99%2C99%2C99
Not sure what fix you are referring to, but something like this seems better: const std::size_t size_reserve = std::min(intermediate_cloud.max_size (), 2 * 6 * (std::size_t) (res_y_*res_z_ + res_x_*res_z_ + res_x_*res_y_));
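To make the concern concrete, here is a small standalone sketch (not PCL code, and not a confirmed diagnosis of the CI failure) of why a (double) cast inside std::min can introduce an error, and why keeping both operands as std::size_t, as suggested above, avoids it:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <iostream>

int main ()
{
  const std::size_t max_sz = SIZE_MAX;  // hypothetical stand-in for intermediate_cloud.max_size ()

  // Two different sizes can collapse to the same value once cast to double
  // (a double only has 53 mantissa bits), which is the kind of silent error a
  // (double) cast inside std::min can introduce:
  std::cout << std::boolalpha
            << (static_cast<double> (max_sz) == static_cast<double> (max_sz - 1)) << '\n';  // true on 64-bit

  // Keeping both operands as std::size_t involves no conversion at all
  // (the resolution values below are made up for the example):
  const std::size_t wanted = std::size_t{2} * 6 * (100 * 100 + 100 * 100 + 100 * 100);
  const std::size_t size_reserve = std::min (max_sz, wanted);
  std::cout << size_reserve << '\n';
  return 0;
}
```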
Isolate this change to another PR so this one can be merged.
…point_field_types

* origin/new_point_field_types: (211 commits)
  [CI] Make windows build on c:\ drive to fix out-of-disk-space errors (PointCloudLibrary#4382)
  Replace '.points.' with just '.' (PointCloudLibrary#4217)
  [Surface/Poisson4] Always update counter and prevent overflow access in poisson4 octree (PointCloudLibrary#4316)
  [surface/3rdParty] Add stdlib header for malloc in poisson (bugfix for gcc-5) (PointCloudLibrary#4376)
  [Misc] Deprecate unused ease-of-internal-use headers (PointCloudLibrary#4367)
  Use nth_element in median filter nth_element is faster than partial_sort, and should be used for median finding
  enable CUDA on 16.04 CI
  Optimize includes As recommended by include-what-you-use
  Update gpu_install.rst
  Fix getColor() Variable needs to be expanded.
  Optimize includes As recommended by include-what-you-use
  [gpu] Add square distances to ApproxNearestSearch (PointCloudLibrary#4340)
  [common] Allow conversion of PointCloud with more than 32-bit size rows/columns (PointCloudLibrary#4343)
  Change the macro to detect and remove deprecations before release
  update docstring
  clang format approx_nsearch.cu
  another indentation fix
  fix indentation and getBitsNum function
  Allow file_io to read large point clouds depending on PCL config (PointCloudLibrary#4331)
  ...
No description provided.