pymor.models.mpi

Module Contents

Classes

- MPIModel: Wrapper class mixin for MPI distributed Models.
- MPIVisualizer: Base class for immutable objects in pyMOR.

Functions

- mpi_wrap_model: Wrap MPI distributed local Models to a global Model on rank 0.

Attributes
- class pymor.models.mpi.MPIModel(obj_id, *args, **kwargs)
Wrapper class mixin for MPI distributed Models.

Given a single-rank implementation of a Model, this wrapper class uses the event loop from pymor.tools.mpi to allow an MPI distributed usage of the Model. The underlying implementation needs to be MPI aware. In particular, the model's solve method has to perform an MPI parallel solve of the model.

Note that this class is not intended to be instantiated directly. Instead, you should use mpi_wrap_model.
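The intended execution model is that every MPI rank runs the same script: rank 0 acts as the controlling rank, while the remaining ranks execute the pymor.tools.mpi event loop and carry out the work dispatched to them. The following is only a minimal sketch of this pattern; how the script is launched under mpirun and how the event loop is terminated depends on your setup (see pymor.tools.mpi for details).

```python
# Minimal sketch of the execution model assumed by MPIModel: rank 0 runs the
# main program, all other ranks serve requests via the pymor.tools.mpi event
# loop. Launching and shutting down the event loop is setup-dependent
# (see pymor.tools.mpi); this is only an illustration.
from pymor.tools import mpi


def main():
    # Rank-0-only code: build the wrapping Model via mpi_wrap_model
    # (see the example further below) and use it like any other pyMOR Model.
    ...


if __name__ == '__main__':
    if mpi.rank0:
        main()
    else:
        mpi.event_loop()  # ranks > 0 wait for commands dispatched from rank 0
```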
- class pymor.models.mpi.MPIVisualizer(m_obj_id)

Bases: pymor.core.base.ImmutableObject

Base class for immutable objects in pyMOR.

Instances of ImmutableObject are immutable in the sense that after execution of __init__, any modification of a non-private attribute will raise an exception.

Warning

For instances of ImmutableObject, the result of member function calls should be completely determined by the function's arguments together with the object's __init__ arguments and the current state of pyMOR's global defaults.

While, in principle, you are allowed to modify private members after instance initialization, this should never affect the outcome of future method calls. In particular, if you update any internal state after initialization, you have to ensure that this state is not affected by possible changes of the global defaults.
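As a brief illustration of this contract, consider the following sketch with a made-up ImmutableObject subclass (MyImmutable is not part of pyMOR):

```python
# Illustration of the immutability contract described above.
# MyImmutable is a hypothetical example class, not part of pyMOR.
from pymor.core.base import ImmutableObject


class MyImmutable(ImmutableObject):
    def __init__(self, value):
        self.value = value    # setting attributes during __init__ is allowed
        self._cache = None    # private attributes may also be changed later


obj = MyImmutable(42)
obj._cache = 'derived data'   # allowed: private attribute, must not affect results
try:
    obj.value = 43            # modifying a non-private attribute raises an exception
except Exception as exc:
    print(f'modification rejected: {exc}')
```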
- pymor.models.mpi.mpi_wrap_model(local_models, mpi_spaces=('STATE',), use_with=True, with_apply2=False, pickle_local_spaces=True, space_type=MPIVectorSpace, base_type=None)
Wrap MPI distributed local Models to a global Model on rank 0.

Given MPI distributed local Models referred to by the ObjectId local_models, return a new Model which manages these distributed models from rank 0. This is done by first wrapping all Operators of the Model using mpi_wrap_operator.

Alternatively, local_models can be a callable (with no arguments) which is then called on each rank to instantiate the local Models.

When use_with is False, an MPIModel is instantiated with the wrapped operators. A call to solve will then use an MPI parallel call to the solve methods of the wrapped local Models to obtain the solution. This is usually what you want when the actual solve is performed by an implementation in the external solver.

When use_with is True, with_ is called on the local Model on rank 0 to obtain a new Model with the wrapped MPI Operators. This is mainly useful when the local models are generic Models as in pymor.models.basic and solve is implemented directly in pyMOR via operations on the contained Operators.

Parameters
- local_models
  ObjectId of the local Models on each rank or a callable generating the Models.
- mpi_spaces
  List of types or ids of VectorSpaces which are MPI distributed and need to be wrapped.
- use_with
  See above.
- with_apply2
  See MPIOperator.
- pickle_local_spaces
  See MPIOperator.
- space_type
  See MPIOperator.
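As an illustration, the following sketch wraps a locally discretized model for MPI distributed use. Here build_local_model stands for any user-provided function that constructs the MPI-aware single-rank Model on each rank; both this name and the chosen keyword arguments are only an example, not part of pyMOR.

```python
# Hedged usage sketch for mpi_wrap_model. `build_local_model` is a placeholder
# for your own, MPI-aware discretization routine; it is not part of pyMOR.
from pymor.models.mpi import mpi_wrap_model
from pymor.tools import mpi


def build_local_model():
    """Construct and return the single-rank (but MPI-aware) Model on this rank."""
    raise NotImplementedError  # e.g. a FEniCS- or deal.II-based discretizer


if mpi.parallel:
    # Passing a callable lets each rank build its local Model; the Model
    # returned on rank 0 dispatches to the local Models via the event loop.
    fom = mpi_wrap_model(build_local_model, use_with=True, pickle_local_spaces=False)
else:
    fom = build_local_model()

U = fom.solve()  # for a parametric model, pass a parameter value as usual
```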