pymor.models.mpi

Module Contents

class pymor.models.mpi.MPIModel(obj_id, *args)[source]

Wrapper class mixin for MPI distributed Models.

Given a single-rank implementation of a Model, this wrapper class uses the event loop from pymor.tools.mpi to allow MPI-distributed usage of the Model. The underlying implementation needs to be MPI-aware. In particular, the model's compute method has to perform the MPI-parallel computation of the desired quantities.

Note that this class is not intended to be instantiated directly. Instead, you should use mpi_wrap_model.
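
A minimal sketch of the intended usage. Here build_local_model is a hypothetical user function that assembles the MPI-aware, rank-local Model, and all ranks except 0 are assumed to be running the pymor.tools.mpi event loop:

    from pymor.models.mpi import mpi_wrap_model

    def build_local_model():
        """Hypothetical: assemble and return the MPI-aware, rank-local Model."""
        raise NotImplementedError

    # executed on rank 0; build_local_model is called once on every rank,
    # and use_with=False makes the wrapper return an MPIModel
    m = mpi_wrap_model(build_local_model, use_with=False)
    # U = m.solve(mu)   # solve is dispatched to the local Models on all ranks
    # m.visualize(U)    # see visualize below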

Methods

visualize(U, **kwargs)[source]
class pymor.models.mpi.MPIVisualizer(m_obj_id, remove_model=False)[source]

Bases: pymor.core.base.ImmutableObject

Base class for immutable objects in pyMOR.

Instances of ImmutableObject are immutable in the sense that after execution of __init__, any modification of a non-private attribute will raise an exception.

Warning

For instances of ImmutableObject, the result of member function calls should be completely determined by the function’s arguments together with the object’s __init__ arguments and the current state of pyMOR’s global defaults.

While, in principle, you are allowed to modify private members after instance initialization, this should never affect the outcome of future method calls. In particular, if you update any internal state after initialization, you have to ensure that this state is not affected by possible changes of the global defaults.

Methods

visualize(U, **kwargs)[source]
pymor.models.mpi.mpi_wrap_model(local_models, mpi_spaces=('STATE',), use_with=True, with_apply2=False, pickle_local_spaces=True, space_type=MPIVectorSpace, base_type=None)[source]

Wrap MPI-distributed local Models into a global Model on rank 0.

Given MPI-distributed local Models referred to by the ObjectId local_models, return a new Model which manages these distributed Models from rank 0. This is done by first wrapping all Operators of the Model using mpi_wrap_operator.

Alternatively, local_models can be a callable (with no arguments) which is then called on each rank to instantiate the local Models.

When use_with is False, an MPIModel is instantiated with the wrapped Operators. A call to solve will then use an MPI-parallel call to the solve methods of the wrapped local Models to obtain the solution. This is usually what you want when the actual solve is performed by the external solver's own implementation.

When use_with is True, with_ is called on the local Model on rank 0 to obtain a new Model with the wrapped MPI Operators. This is mainly useful when the local Models are generic Models as in pymor.models.basic and solve is implemented directly in pyMOR via operations on the contained Operators.
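
A hedged sketch of both modes. build_local_model is a hypothetical builder executed once on every rank; it is passed via the zero-argument callable form of local_models described above, and mpi.parallel indicates whether more than one rank is available:

    from pymor.models.mpi import mpi_wrap_model
    from pymor.tools import mpi

    def build_local_model(num_intervals):
        """Hypothetical: assemble and return the MPI-aware, rank-local Model."""
        raise NotImplementedError

    if mpi.parallel:
        # external solver performs the actual solve -> obtain an MPIModel
        fom = mpi_wrap_model(lambda: build_local_model(64), use_with=False)
        # generic Model from pymor.models.basic -> let pyMOR solve via the
        # wrapped Operators by rebuilding the rank-0 Model with with_:
        # fom = mpi_wrap_model(lambda: build_local_model(64), use_with=True)
    else:
        fom = build_local_model(64)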

Parameters

local_models

ObjectId of the local Models on each rank or a callable generating the Models.

mpi_spaces

List of types or ids of VectorSpaces which are MPI-distributed and need to be wrapped.

use_with

See above.

with_apply2

See MPIOperator.

pickle_local_spaces

See MPIOperator.

space_type

See MPIOperator.
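
For the ObjectId form of local_models, the local Models can be created on all ranks beforehand via pymor.tools.mpi. A sketch of this variant under the same assumptions as above (build_local_model is hypothetical, the event loop runs on all ranks except 0):

    from pymor.models.mpi import mpi_wrap_model
    from pymor.tools import mpi

    def build_local_model():
        """Hypothetical: assemble and return the MPI-aware, rank-local Model."""
        raise NotImplementedError

    # create the local Models on every rank; the returned ObjectId refers to them
    local_models_id = mpi.call(mpi.function_call_manage, build_local_model)

    # wrap the distributed Models into a single Model usable from rank 0
    fom = mpi_wrap_model(local_models_id)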