
[Core][MPI] Adding MinLocAll and MaxLocAll to DataCommunicator #11712

Merged: 7 commits into master on Nov 8, 2023

Conversation

loumalouomega (Member)

📝 Description

Adding MinLocAll and MaxLocAll to DataCommunicator. These functions compute not only the maximum/minimum, but also the partition where that value is local.
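For reference, the underlying MPI pattern for this kind of reduction pairs each local value with its rank and reduces with MPI_MINLOC / MPI_MAXLOC. A minimal stand-alone sketch of that pattern (plain MPI, illustrative only, not the Kratos implementation):

```cpp
// Illustrative only: how a min-with-location reduction works in plain MPI.
#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv)
{
    MPI_Init(&argc, &argv);

    int rank = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    // MPI_MINLOC operates on (value, index) pairs; MPI_DOUBLE_INT matches this struct layout.
    struct { double value; int rank; } local{1.0 + rank, rank}, global{};

    MPI_Allreduce(&local, &global, 1, MPI_DOUBLE_INT, MPI_MINLOC, MPI_COMM_WORLD);

    std::printf("min value %f is local to rank %d\n", global.value, global.rank);

    MPI_Finalize();
    return 0;
}
```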

I also changed the suite used by the serial DataCommunicator tests, as it was wrong.

🆕 Changelog

@loumalouomega loumalouomega added Enhancement Kratos Core Testing Parallel-MPI Distributed memory parallelism for HPC / clusters labels Oct 23, 2023
@loumalouomega loumalouomega requested a review from jcotela October 23, 2023 12:41
@loumalouomega loumalouomega requested review from a team as code owners October 23, 2023 12:41
jcotela (Member) left a comment

I am not convinced by the change of suite for the data_communicator tests, but fair enough if people think it makes sense.

I would suggest putting all the logic for MPI datatype deduction in one place (mpi_message).

Other than that, looks good to me.
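As an illustration of the mpi_message suggestion, a minimal sketch of keeping the type-to-MPI-datatype mapping in a single trait (illustrative only, not the actual Kratos mpi_message code):

```cpp
// Illustrative only: a single trait that maps C++ types to MPI datatypes,
// so wrappers such as MinLocAll/MaxLocAll can deduce them in one place.
#include <mpi.h>
#include <utility>

template <class T> struct MPIDatatype;  // left undefined for unsupported types

template <> struct MPIDatatype<int>    { static MPI_Datatype Get() { return MPI_INT; } };
template <> struct MPIDatatype<double> { static MPI_Datatype Get() { return MPI_DOUBLE; } };

// (value, rank) pairs used by MPI_MINLOC / MPI_MAXLOC reductions.
// Note: a dedicated struct {double value; int rank;} guarantees the layout
// MPI_DOUBLE_INT expects; std::pair is used here only to keep the sketch short.
template <> struct MPIDatatype<std::pair<double, int>> { static MPI_Datatype Get() { return MPI_DOUBLE_INT; } };
template <> struct MPIDatatype<std::pair<int, int>>    { static MPI_Datatype Get() { return MPI_2INT; } };
```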

Review thread on kratos/mpi/sources/mpi_data_communicator.cpp (outdated, resolved)
jcotela (Member) commented Oct 24, 2023

I changed it a bit so that MinLoc and MaxLoc are not defined for array/vector types, as I think they would not make sense with the current implementation (min/max is taken component by component, so the result can be a composite of values from different partitions). If this is not desirable, my preference would be to revert cde37b7 and bb95bac and keep @loumalouomega's original implementation.
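A small numerical illustration of that composite effect (the values are made up for the example):

```cpp
// Illustrative only: component-wise min over two "ranks" worth of data.
#include <algorithm>
#include <array>
#include <cstddef>

int main()
{
    const std::array<double, 2> on_rank_0{1.0, 5.0};
    const std::array<double, 2> on_rank_1{3.0, 2.0};

    std::array<double, 2> component_wise_min{};
    for (std::size_t i = 0; i < component_wise_min.size(); ++i) {
        component_wise_min[i] = std::min(on_rank_0[i], on_rank_1[i]);
    }

    // component_wise_min == {1.0, 2.0}: the first entry is local to rank 0,
    // the second to rank 1, so no single owning rank can be reported.
    return 0;
}
```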

RiccardoRossi (Member)

Could you document what the function returns, possibly in its signature?

Is it (value, rank) or something different?

loumalouomega (Member, Author)

> Could you document what the function returns, possibly in its signature?
>
> Is it (value, rank) or something different?

It is a std::pair of the value and an int (the rank).
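A hedged usage sketch based on that answer (the header path and exact overloads are assumptions, not the PR's code):

```cpp
// Assumed usage: each call returns std::pair<value, rank where that value is local>.
#include <iostream>
#include <utility>
#include "includes/data_communicator.h"  // assumed Kratos header location

void ReportExtrema(const Kratos::DataCommunicator& rComm)
{
    const double local_value = 1.0 + rComm.Rank();

    const std::pair<double, int> min_loc = rComm.MinLocAll(local_value);
    const std::pair<double, int> max_loc = rComm.MaxLocAll(local_value);

    if (rComm.Rank() == 0) {
        std::cout << "min " << min_loc.first << " is local to rank " << min_loc.second << ", "
                  << "max " << max_loc.first << " is local to rank " << max_loc.second << std::endl;
    }
}
```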

loumalouomega (Member, Author)

> Could you document what the function returns, possibly in its signature?
>
> Is it (value, rank) or something different?

Documentation added

@loumalouomega loumalouomega requested a review from jcotela November 2, 2023 10:01
loumalouomega (Member, Author)

Feel free to approve @jcotela

loumalouomega (Member, Author)

> Feel free to approve @jcotela

?

philbucher (Member) left a comment

Please add a Python test as well, then I approve.

loumalouomega (Member, Author)

> Please add a Python test as well, then I approve.

Done

@loumalouomega loumalouomega merged commit f27e57b into master Nov 8, 2023
@loumalouomega loumalouomega deleted the core/mpi/adding-min-loc-all-and-max-loc-all branch November 8, 2023 12:09