pymor.models.mpi
Module Contents
- class pymor.models.mpi.MPIModel(obj_id, *args)
Wrapper class mixin for MPI distributed Models.

Given a single-rank implementation of a Model, this wrapper class uses the event loop from pymor.tools.mpi to allow MPI distributed usage of the Model. The underlying implementation needs to be MPI aware. In particular, the model's compute method has to perform MPI parallel computation of the desired quantities.

Note that this class is not intended to be instantiated directly. Instead, you should use mpi_wrap_model.
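What "MPI aware" means here can be illustrated with a plain mpi4py sketch (the function name is hypothetical and not part of pyMOR): each rank computes only its local share of a quantity, and global values are obtained by collective reductions, which is the kind of operation the wrapped model's compute method is expected to perform internally.

    # Hypothetical illustration of an MPI aware computation (not pyMOR API):
    # each rank owns part of the data, global results come from reductions.
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD

    def compute_output(local_solution_part):
        """Return a global output value from rank-local data."""
        local_value = np.sum(local_solution_part)       # this rank's contribution
        return comm.allreduce(local_value, op=MPI.SUM)  # same result on all ranks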
- class pymor.models.mpi.MPIVisualizer(m_obj_id, remove_model=False)
Bases: pymor.core.base.ImmutableObject

Base class for immutable objects in pyMOR.

Instances of ImmutableObject are immutable in the sense that after execution of __init__, any modification of a non-private attribute will raise an exception.

Warning
For instances of ImmutableObject, the result of member function calls should be completely determined by the function's arguments together with the object's __init__ arguments and the current state of pyMOR's global defaults.

While, in principle, you are allowed to modify private members after instance initialization, this should never affect the outcome of future method calls. In particular, if you update any internal state after initialization, you have to ensure that this state is not affected by possible changes of the global defaults.
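The immutability contract can be demonstrated with a minimal sketch (MyData is a hypothetical class, assuming a pyMOR installation):

    from pymor.core.base import ImmutableObject

    class MyData(ImmutableObject):
        def __init__(self, value):
            self.value = value   # attribute assignment is allowed inside __init__
            self._cache = {}     # private state may still be managed later

    d = MyData(42)
    d._cache['key'] = 'ok'       # private state; must not influence method results
    try:
        d.value = 43             # modifying a non-private attribute after __init__ ...
    except Exception as e:       # ... raises an exception
        print(type(e).__name__)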
- pymor.models.mpi.mpi_wrap_model(local_models, mpi_spaces=None, use_with=True, with_apply2=False, pickle_local_spaces=True, space_type=MPIVectorSpace, base_type=None)
Wrap MPI distributed local Models into a global Model on rank 0.

Given MPI distributed local Models referred to by the ObjectId local_models, return a new Model which manages these distributed models from rank 0. This is done by first wrapping all Operators of the Model using mpi_wrap_operator.

Alternatively, local_models can be a callable (with no arguments) which is then called on each rank to instantiate the local Models.

When use_with is False, an MPIModel is instantiated with the wrapped operators. A call to solve will then use an MPI parallel call to the solve methods of the wrapped local Models to obtain the solution. This is usually what you want when the actual solve is performed by an implementation in the external solver.

When use_with is True, with_ is called on the local Model on rank 0 to obtain a new Model with the wrapped MPI Operators. This is mainly useful when the local models are generic Models as in pymor.models.basic and solve is implemented directly in pyMOR via operations on the contained Operators. (A usage sketch follows the parameter list below.)
Parameters

- local_models
  ObjectId of the local Models on each rank, or a callable generating the Models.
- mpi_spaces
  List of types of VectorSpaces which are MPI distributed and need to be wrapped. If None, the type of the model's solution_space is used.
- use_with
  See above.
- with_apply2
  See MPIOperator.
- pickle_local_spaces
  See MPIOperator.
- space_type
  See MPIOperator.
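A minimal usage sketch, assuming the script is launched under MPI so that all ranks except rank 0 execute the pymor.tools.mpi event loop (see the pymor.tools.mpi documentation for how to launch such a setup); build_local_model is a hypothetical factory for the rank-local Model:

    from pymor.models.mpi import mpi_wrap_model

    def build_local_model():
        ...  # assemble and return the Model owned by this MPI rank

    # The callable is invoked on every rank; the resulting local Models are
    # wrapped into a single global Model that is managed from rank 0.
    m = mpi_wrap_model(build_local_model, use_with=True)

    # Subsequent calls such as solve are dispatched MPI parallel to all ranks.
    U = m.solve()

With use_with=False, the same call would instead return an MPIModel whose solve forwards to the local solve implementations, which is typically preferable when the actual solve is performed by an external solver.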