- class pymor.models.mpi.MPIModel(obj_id, *args, **kwargs)[source]
Wrapper class mixin for MPI distributed Models.
Given a single-rank implementation of a Model, this wrapper class uses the event loop from pymor.tools.mpi to allow MPI distributed usage of the Model. The underlying implementation needs to be MPI aware. In particular, the model's solve method has to perform an MPI parallel solve of the model.
Note that this class is not intended to be instantiated directly. Instead, you should use mpi_wrap_model.
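For orientation, here is a minimal sketch of the intended usage pattern, assuming the event-loop helpers rank0, event_loop and quit from pymor.tools.mpi; the factory build_local_model is a hypothetical user-supplied function, not part of pyMOR:

```python
# Launch with, e.g.: mpirun -n 4 python script.py
from pymor.models.mpi import mpi_wrap_model
from pymor.tools import mpi


def build_local_model():
    ...  # hypothetical: instantiates the MPI-aware single-rank Model on each rank


if __name__ == '__main__':
    if mpi.rank0:
        # rank 0 drives the computation through the wrapped global Model
        m = mpi_wrap_model(build_local_model, use_with=False)
        U = m.solve()     # dispatched as an MPI parallel solve on all ranks
        mpi.quit()        # release the other ranks from the event loop
    else:
        mpi.event_loop()  # non-root ranks wait for commands from rank 0
```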
- class pymor.models.mpi.MPIVisualizer(m_obj_id, remove_model=False)[source]
Base class for immutable objects in pyMOR.
ImmutableObjects are immutable in the sense that after execution of __init__, any modification of a non-private attribute will raise an exception.
For instances of ImmutableObject, the result of member function calls should be completely determined by the function's arguments together with the object's __init__ arguments and the current state of pyMOR's global defaults.
While, in principle, you are allowed to modify private members after instance initialization, this should never affect the outcome of future method calls. In particular, if you update any internal state after initialization, you have to ensure that this state is not affected by possible changes of the global defaults.
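A minimal sketch of this contract, assuming ImmutableObject lives in pymor.core.base as in recent pyMOR versions (the class MyData is purely illustrative, and the exact exception type may vary):

```python
from pymor.core.base import ImmutableObject


class MyData(ImmutableObject):
    def __init__(self, value):
        self.value = value  # attribute assignment inside __init__ is allowed


d = MyData(42)
try:
    d.value = 43            # modifying a non-private attribute now raises
except Exception as e:
    print(f'rejected: {type(e).__name__}')

d._cache = {}               # private members remain writable, but they must
                            # not influence the results of future method calls
```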
- pymor.models.mpi.mpi_wrap_model(local_models, mpi_spaces=('STATE',), use_with=True, with_apply2=False, pickle_local_spaces=True, space_type=MPIVectorSpace, base_type=None)[source]
Wrap MPI distributed local Models to a global Model on rank 0.
Given MPI distributed local Models referred to by the ObjectId local_models, return a new Model which manages these distributed models from rank 0. This is done by first wrapping all Operators of the local Models. Alternatively, local_models can be a callable (with no arguments) which is then called on each rank to instantiate the local Models.
When use_with is False, an MPIModel is instantiated with the wrapped operators. A call to solve will then use an MPI parallel call to the solve methods of the wrapped local Models to obtain the solution. This is usually what you want when the actual solve is performed by an implementation in the external solver.
When use_with is True, with_ is called on the local Model on rank 0 to obtain a new Model with the wrapped MPI Operators. This is mainly useful when the local models are generic Models whose solve is implemented directly in pyMOR via operations on the contained Operators.
Parameters:
local_models: ObjectId of the local Models on each rank, or a callable generating the Models.
mpi_spaces: List of types or ids of VectorSpaces which are MPI distributed and need to be wrapped.
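To illustrate the two modes, a hedged sketch follows; the factory make_local_model is a hypothetical stand-in for user code, and the calls are assumed to happen on rank 0 while the other ranks run the pymor.tools.mpi event loop (see the first sketch above):

```python
from pymor.models.mpi import mpi_wrap_model


def make_local_model():
    ...  # hypothetical: instantiates the local Model on the calling rank


# use_with=False: wrap as an MPIModel; solve() performs an MPI parallel
# call to the local solve methods (external-solver scenario)
m_external = mpi_wrap_model(make_local_model, mpi_spaces=('STATE',),
                            use_with=False)

# use_with=True: call with_ on the rank-0 Model, replacing its Operators
# by MPI wrapped ones (generic pyMOR Model scenario)
m_generic = mpi_wrap_model(make_local_model, mpi_spaces=('STATE',),
                           use_with=True)

U = m_external.solve()  # solution lives in the wrapped MPI distributed space
```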