
Fix calculation of attribute_data_offset #1094

Merged: 4 commits merged into main from fix-ghost_attribute_data_offset on Jul 1, 2024

Conversation

@lukasdreyer (Collaborator) commented on Jun 20, 2024

Describe your changes here:

This PR fixes the multiple-attributes test when running on at least 8 processes.
The update of the attribute data offset previously did not include the number of ghost attributes; it now does.
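For illustration, below is a minimal sketch of the kind of offset update involved. All names are hypothetical and chosen for readability; they do not mirror t8code's actual internals. The only point it makes is that the ghost attribute count has to enter the offset.

```c
#include <stddef.h>

/* Hypothetical sketch, not t8code's actual code: the attribute data offset
 * has to advance past the attribute-info records of local trees AND ghosts.
 * Omitting the ghost count (the bug fixed in this PR) gives a wrong offset
 * whenever ghosts carry attributes, which is what the multiple-attributes
 * test on at least 8 processes exposed. */
static size_t
attribute_data_offset (size_t num_tree_attributes, size_t num_ghost_attributes,
                       size_t attribute_info_size)
{
  /* The raw attribute data starts after all attribute-info records,
   * for trees and ghosts alike. */
  return (num_tree_attributes + num_ghost_attributes) * attribute_info_size;
}
```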

All these boxes must be checked by the reviewers before merging the pull request:

As a reviewer, please read through all the code lines and make sure that the code is fully understood, bug-free, well documented, and well structured.

General

  • The reviewer executed the new code features at least once and checked the results manually

  • The code follows the t8code coding guidelines

  • New source/header files are properly added to the Makefiles

  • The code is well documented

  • All function declarations, structs/classes and their members have a proper doxygen documentation

  • All new algorithms and data structures are sufficiently optimal in terms of memory and runtime (If this should be merged, but there is still potential for optimization, create a new issue)

Tests

  • The code is covered in an existing or new test case using Google Test

GitHub action

  • The code compiles without warnings in debug and release mode, with and without MPI (this should be executed automatically in a GitHub action)

  • All tests pass (in various configurations; this should be executed automatically in a GitHub action)

    If the pull request introduces code that is not covered by the GitHub action (for example, coupling with a new library):

    • Should this use case be added to the GitHub action?
    • If not, does the specific use case compile and do all tests pass? (check manually)

Scripts and Wiki

  • If a new directory with source files is added, it must be covered by script/find_all_source_files.scp so that the indentation of these files is checked.
  • If this PR introduces a new feature, it must be covered in an example/tutorial and a Wiki article.

Licence

  • The author added a BSD statement to doc/ (or already has one)

@lukasdreyer lukasdreyer marked this pull request as ready for review June 20, 2024 10:04
@jmark jmark self-assigned this Jun 21, 2024
@jmark jmark assigned lukasdreyer and unassigned jmark Jun 28, 2024
@jmark jmark added the "draft" label ("Enhance the visibility that this is a draft.") Jun 28, 2024
@jmark jmark marked this pull request as draft June 28, 2024 13:25
@lukasdreyer (Collaborator, Author) commented:

@jmark: Regarding our discussion about whether ghost_tree_attributes can be handled in the same way as tree_attributes:
I do not think they can. When the cmesh is created from the stash, the local order of the ghosts is not given by the global order of tree ids but by the order in which they are encountered through face joins. In contrast to the tree attributes, the ghosts are therefore not visited in the same order in which their data is stored in the char* buffer, so the "optimization" done in t8_cmesh_trees_get_attribute, where the offset of the next attribute info is set in advance, cannot be applied to the ghost attributes.
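To illustrate the argument, here is a hedged sketch of the two access patterns; the struct and function names are hypothetical and do not reflect t8code's API, only the ordering argument from the comment above.

```c
#include <stddef.h>

/* Hypothetical attribute-info record: where an attribute's data lives in the
 * shared char* buffer and how large it is. */
typedef struct
{
  size_t offset; /* byte offset of the attribute data in the buffer */
  size_t size;   /* size of the attribute data in bytes */
} attr_info_t;

/* Tree attributes are retrieved in the same order in which their data is
 * stored, so while serving entry i the offset of entry i + 1 (which starts
 * right after it) can already be filled in on the fly. */
static const char *
get_tree_attribute (const char *buffer, attr_info_t *info, size_t i, size_t count)
{
  if (i + 1 < count) {
    info[i + 1].offset = info[i].offset + info[i].size;
  }
  return buffer + info[i].offset;
}

/* Ghost attributes are retrieved in the order induced by the face joins,
 * which in general is a permutation of the storage order.  The entry
 * retrieved next is then not info[i + 1], so the on-the-fly prefetch above
 * is not valid; the offsets have to be computed up front in storage order. */
static void
fill_ghost_offsets (attr_info_t *info, size_t count)
{
  size_t offset = 0;
  for (size_t i = 0; i < count; ++i) {
    info[i].offset = offset;
    offset += info[i].size;
  }
}
```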

@lukasdreyer lukasdreyer assigned jmark and unassigned lukasdreyer Jun 28, 2024
@lukasdreyer lukasdreyer requested a review from jmark June 28, 2024 13:29
@lukasdreyer lukasdreyer marked this pull request as ready for review June 28, 2024 13:29
@jmark jmark merged commit 758cb99 into main Jul 1, 2024
10 checks passed
@jmark jmark deleted the fix-ghost_attribute_data_offset branch July 1, 2024 15:24