
[Core][MPI] Minor fix in STL IO in MPI write #11656

Merged: 2 commits merged into master from core/stl-io-mpi-fix on Oct 5, 2023

Conversation

@loumalouomega (Member)

📝 Description

This PR modifies `stl_io.cpp` to fix the MPI communication during the STL write. Specifically, it improves how the number of entities and geometries is handled.

Here's a breakdown of the changes made in the PR:

  1. A new variable `number_of_entities` has been introduced to store the count of entities.
  2. This count is computed locally and then summed across all processes with `rDataCommunicator.SumAll(number_of_entities)`.
  3. The same change has been made for the count of geometries, via the new variable `number_of_geometries`.
  4. The condition that checks whether there are any entities or geometries now uses these globally summed counts, so every rank takes the same branch in the MPI communication.

Until now, because of the previous check, the communication could get stuck whenever a partition was empty. A sketch of the pattern is shown below.
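For illustration, here is a minimal sketch of that pattern. It is not the actual code from `stl_io.cpp`: the `NumberOfElements` and `SumAll` calls follow the Kratos `ModelPart`/`DataCommunicator` API referenced above, while `WriteEntities`, the header paths, and the exact integer type passed to `SumAll` are assumptions made for the example.

```cpp
// Sketch only: sum the local counts before the collective check so that every
// MPI rank takes the same branch, even when its own partition is empty.
#include "includes/model_part.h"          // Kratos ModelPart (header path assumed)
#include "includes/data_communicator.h"   // Kratos DataCommunicator (header path assumed)

namespace Kratos {

// Placeholder declaration standing in for the actual write routine.
void WriteEntities(const ModelPart& rModelPart, const DataCommunicator& rDataCommunicator);

void WriteEntitiesIfAny(
    const ModelPart& rModelPart,
    const DataCommunicator& rDataCommunicator)
{
    // Local count on this rank; it may be zero for an empty partition.
    unsigned int number_of_entities = rModelPart.NumberOfElements();

    // Collective sum across all ranks: every rank executes this call.
    number_of_entities = rDataCommunicator.SumAll(number_of_entities);

    // Branch on the *global* count: either all ranks enter the write
    // (which performs further collective communication) or none do.
    // Branching on the local count instead can leave the non-empty ranks
    // waiting on an empty one, i.e. the hang this PR fixes.
    if (number_of_entities > 0) {
        WriteEntities(rModelPart, rDataCommunicator);
    }
}

} // namespace Kratos
```

The key point is that the check itself becomes a collective decision, which is what items 2 and 4 above describe.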

🆕 Changelog

@loumalouomega added the Kratos Core, Parallel-MPI, FastPR, Bugfix and IO labels on Oct 4, 2023
@loumalouomega loumalouomega requested a review from ddiezrod October 4, 2023 10:16
@loumalouomega loumalouomega requested a review from a team as a code owner October 4, 2023 10:16
@philbucher (Member) previously approved these changes on Oct 4, 2023 and left a comment:


classic 😄

@matekelemen (Contributor)

Good catch! We're in desperate need of a better way of finding these bugs other than stumbling into them by chance at runtime.

Does anyone have experience with other MPI software, and how they avoid these problems?

@loumalouomega (Member, Author)

> Good catch! We're in desperate need of a better way of finding these bugs other than stumbling into them by chance at runtime.
>
> Does anyone have experience with other MPI software, and how they avoid these problems?

@matekelemen reapproval?

@philbucher (Member)

> Good catch! We're in desperate need of a better way of finding these bugs other than stumbling into them by chance at runtime.
>
> Does anyone have experience with other MPI software, and how they avoid these problems?

One simple way would be to create a test ModelPart that has an empty partition and use that in the tests. It is always the same problem.
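For what it's worth, a rough sketch of such a test could look like the following. This is an illustration only: it assumes the core Kratos `Model`/`ModelPart` API and `ParallelEnvironment::GetDefaultDataCommunicator()`, and the commented-out `StlIO` call at the end is a placeholder rather than the exact writer interface.

```cpp
// Sketch of a regression check in which all ranks except rank 0 own an empty partition.
#include "containers/model.h"              // Kratos::Model (header path assumed)
#include "includes/parallel_environment.h" // ParallelEnvironment (header path assumed)

namespace Kratos {

void RunStlWriteWithEmptyPartitionCheck()
{
    const DataCommunicator& r_comm = ParallelEnvironment::GetDefaultDataCommunicator();

    Model model;
    ModelPart& r_model_part = model.CreateModelPart("EmptyPartitionTest");

    // Only rank 0 gets a single triangle; every other rank keeps an empty partition.
    if (r_comm.Rank() == 0) {
        auto p_properties = r_model_part.CreateNewProperties(0);
        r_model_part.CreateNewNode(1, 0.0, 0.0, 0.0);
        r_model_part.CreateNewNode(2, 1.0, 0.0, 0.0);
        r_model_part.CreateNewNode(3, 0.0, 1.0, 0.0);
        r_model_part.CreateNewElement("Element2D3N", 1, {1, 2, 3}, p_properties);
    }

    // All ranks must reach the write together; before this fix the empty
    // ranks could hang in the collective communication inside the writer.
    // (Writer call shown schematically; constructor arguments are assumed.)
    // StlIO stl_io("empty_partition_test.stl");
    // stl_io.WriteModelPart(r_model_part);
}

} // namespace Kratos
```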

@loumalouomega merged commit 0fb47b9 into master on Oct 5, 2023
@loumalouomega deleted the core/stl-io-mpi-fix branch on Oct 5, 2023 at 08:44