
[Core] update analysis stage #2135

Merged
merged 30 commits into master from core/update-analysis-stage
Jun 13, 2018

Conversation

philbucher
Member

This PR significantly improves the capabilities of the base class of the AnalysisStage. Common functionality from the already existing AnalysisStages has been moved to the base class to avoid code duplication and make usage easier.

  • Solver construction is handled properly now

  • Now the processes are constructed and handled completely through the base class. In case the initialization order matters (like it does in Fluid), the order can be passed as a list

  • All processes involved in the analysis are now grouped in the ProjectParameters json:

...
"processes" : {
    "initial_processes" : [
        { process_specific_params },
        { process_specific_params }
    ],
    "boundary_processes" : [
        { process_specific_params },
        { process_specific_params }
    ]
}
...
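As an illustration, here is a minimal, Kratos-free sketch of how a base class could turn such a grouped block into a flat, ordered list of process settings. The function name `create_processes` and the parameter layouts are hypothetical, not the actual Kratos API:

```python
# Hypothetical sketch: build an ordered list of process settings from the
# grouped "processes" block. Groups named in initialization_order come
# first; any remaining groups are appended in dict order.

def create_processes(project_params, initialization_order):
    groups = project_params.get("processes", {})
    ordered_names = list(initialization_order)
    ordered_names += [name for name in groups if name not in ordered_names]
    settings = []
    for name in ordered_names:
        settings.extend(groups.get(name, []))
    return settings

params = {
    "processes": {
        "initial_processes": [{"python_module": "assign_initial"}],
        "boundary_processes": [{"python_module": "apply_bc"}],
    }
}
# Fluid-like case: boundary processes must be constructed first.
print(create_processes(params, ["boundary_processes", "initial_processes"]))
```

Because everything lives under one key, the base class can loop over all groups uniformly while still honoring an application-specific order.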

Along with this come two more goodies:
We can finally do #869!
It will also be possible to automate the computation of the ComputingModelPart (quite error-prone at the moment)

  • NO existing code breaks!!! I implemented methods in the AnalysisStage of Fluid and Structure that make sure the old format still works

  • I adapted one test in StructuralMechanics as a proof of concept, the remaining ones I will update in another PR

  • @jcotela the test in Fluid also works. We should talk next week about these modifications

  • The derived AnalysisStages become much cleaner!

I know this is a big step, so please take your time to review.

Requires #2068 to work

@ipouplana this is the PR I was referring to. Please also have a look, I think the PoroAnalysisStage will become much simpler after this

@loumalouomega I also had a look at the one in Contact, it should not be much effort to adapt it

@philbucher philbucher self-assigned this May 20, 2018
@philbucher philbucher requested a review from a team May 20, 2018 13:26
Member

@RiccardoRossi RiccardoRossi left a comment


Hi @philbucher, this looks pretty fine to me.

What I am not clear about is the distinction between "new" and "old" processes. Could you explain didactically what is happening?

if len(list_of_processes) == 0:
    KratosMultiphysics.Logger.PrintInfo("FluidDynamicsAnalysis", "Using the old way to create the processes, this will be removed!")
    from process_factory import KratosProcessFactory
    factory = KratosProcessFactory(self.model)
Member


Could you explain in short what is happening here?

The KratosProcessFactory is still the same, right? Or did you change it?

Member Author


I did not change the ProcessFactory.
If this branch is entered, it means that there are no processes inside a "processes" dict in the ProjectParameters, i.e. the current interface is being used.

Then I construct the processes the same way they were constructed previously.

@philbucher
Member Author

@RiccardoRossi I changed the naming, "old processes" was not really a good name.

I meant the old interface, i.e. when a block of processes is present outside of the "processes" dict.
This is the current interface, e.g. "initial_conditions" or "boundary_conditions" present at the lowest level of the ProjectParameters.

In the future they would be grouped under "processes", see the example in the first post
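To make the distinction concrete, a tiny Kratos-free sketch (the parameter layouts and the helper `uses_new_interface` are simplified illustrations, not actual Kratos code):

```python
# Old interface: process lists sit at the lowest level of ProjectParameters.
old_params = {
    "boundary_conditions_process_list": [{"python_module": "apply_bc"}],
}

# New interface: all process lists are grouped under a single "processes" key.
new_params = {
    "processes": {
        "boundary_processes": [{"python_module": "apply_bc"}],
    }
}

def uses_new_interface(params):
    # The base class can key its fallback purely on the presence of
    # the grouped "processes" block.
    return "processes" in params

print(uses_new_interface(old_params), uses_new_interface(new_params))
```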

@RiccardoRossi
Member

Ok, all of this makes sense to me.

Can I ask you to document this in the wiki prior to merging the PR?

@philbucher
Member Author

Sure, will let you know when I am done with it

@philbucher
Member Author

ping @KratosMultiphysics/technical-committee I added the wiki for the AnalysisStage and the PythonSolver

Please have a look

@maceligueta
Member

Nice work @philbucher!
In my experience, printing results from a coupled problem can be a complicated task as well. Merging the results into a single file, for example for GiD, requires knowing which variables are going to be printed for just one application or for many. That's also a kind of coupling that I don't really know where it would go...

@philbucher
Member Author

@maceligueta maybe I misunderstand what you mean, but I am not intending to have all the results in one file.
How the output is printed is decided by which output_processes you add.

@maceligueta
Member

Of course printing the results of each application in separate files is an option. But I prefer to print together all the meshes from the different coupled applications. It's very convenient, as you can embed your code in GiD (for example) and switch to post-process and take a look at the results, while the computation is still running. No need to do merges or open several post-processors.
So, my previous comment was in this direction: if someone wants to follow this approach, they have to disable the regular output of the original applications (delete the processes somehow, or override some methods with 'pass') and write a new process or method for printing the coupled output. This coupled output process might depend on the specific applications that you are coupling, though I guess it could be coded in a very general way.

@philbucher
Member Author

@maceligueta I think I understand now:
You have a utility or process with which you can print the results of a coupled simulation in one file to do what you describe, right?

If you want to use this but not follow the usual interface of the processes, then in your derived AnalysisStage you would override e.g. the OutputSolutionStep method to call your special utility.

Regarding disabling the regular output of the original applications:
This is very simple, you only have to remove the corresponding process from the json-file, no need to do any modifications in python
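To illustrate the overriding idea, a Kratos-free sketch of a derived stage replacing the regular output with a coupled-output utility (the class `CoupledGiDOutput` and the return strings are purely illustrative):

```python
# Simplified stand-in for the real AnalysisStage base class.
class AnalysisStage:
    def OutputSolutionStep(self):
        return "per-application output"

# Hypothetical utility that writes all coupled meshes into one file.
class CoupledGiDOutput:
    def PrintResults(self):
        return "all coupled meshes in one file"

class CoupledAnalysisStage(AnalysisStage):
    def __init__(self):
        self.coupled_output = CoupledGiDOutput()

    def OutputSolutionStep(self):
        # Skip the regular per-application output entirely and
        # delegate to the coupled-output utility instead.
        return self.coupled_output.PrintResults()

print(CoupledAnalysisStage().OutputSolutionStep())
```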

Does this answer your question?

@maceligueta
Member

maceligueta commented May 24, 2018

Thanks for the clarification @philbucher. Actually it was just a comment about the wiki: there it says that physics and couplings are bound to the solver; however, couplings not related to physics can fall outside the solver. They will probably go into the processes or methods of the Analysis. Sorry if I was not very clear. So, even if the explanation in the wiki is simple, the actual coupled codes usually mess things up.

@philbucher
Member Author

@loumalouomega now I am done with the changes.

Note that the Fluid does not work with this yet, because the Fluid solvers do not yet derive from the base solver and also don't take the model as input.

I spoke with @jcotela and he is going to make the changes very soon.

Contact is working

@loumalouomega
Member

loumalouomega commented May 28, 2018 via email

@philbucher
Member Author

Restart saving is handled through a process, added in #2283

@philbucher
Member Author

@jcotela can you please have a close look at the updates to the FluidAnalysis in case I missed something?

@philbucher
Member Author

Ping, can we merge this now?

It's blocking further developments

@RiccardoRossi
Member

This looks fine to me, but having seen the troubles in the last weeks, I would like @rubenzorrilla to give his approval

@philbucher
Member Author

@RiccardoRossi makes sense to me
@rubenzorrilla the changes to the solvers are minimal in this PR; nevertheless I will look again into the FSI app to double-check

@philbucher
Member Author

I took a look at the FSI, but I cannot get the example from GiD to work.
Neither with master nor with this branch.

@rubenzorrilla is it up to date?

The tests in FSI work except the one that always fails

@philbucher
Member Author

ping @rubenzorrilla


def _CreateProcesses(self, parameter_name, initialization_order):
"""Create a list of processes
Format:
Member


This seems too complex to me. Is it required for any technical reason?

Member Author


The idea behind grouping the processes is that one can loop over them.
This way we can solve #869 and also automate the computation of the ComputingModelPart (which is quite error-prone).

Without the extra level it would not be possible to specify the order of execution (as is required e.g. in Fluid).

Users can still add their custom processes, but it should be done under "processes" or "output_processes" respectively.

Member


Go ahead then!

@philbucher
Member Author

Thanks

@philbucher philbucher merged commit 94aae63 into master Jun 13, 2018
@philbucher philbucher deleted the core/update-analysis-stage branch June 13, 2018 09:19