pymor.reductors.data_driven

Module Contents

class pymor.reductors.data_driven.DataDrivenPODReductor(training_parameters, training_snapshots, regressor=None, T=None, time_vectorized=False, output_functional=None, input_scaler=None, output_scaler=None, product=None, pod_params=None)[source]

Bases: DataDrivenReductor

Reductor building a reduced basis and relying on a machine learning surrogate.

In addition to the functionality of DataDrivenReductor, this reductor uses snapshot data to construct a reduced basis via POD and projects the snapshots onto this basis to generate training data for the machine learning surrogate. The approach is described in [HU18]. See DataDrivenReductor for more details.

Parameters:
  • training_parameters – Parameter values to use for the training of the regressor.

  • training_snapshots – VectorArray to use for the training of the regressor. Contains the solutions or outputs associated with the parameters in training_parameters. In the case of a time-dependent problem, the snapshots are assumed to be equidistant in time.

  • regressor – See DataDrivenReductor.

  • T – See DataDrivenReductor.

  • time_vectorized – See DataDrivenReductor.

  • output_functional – Operator mapping a given solution to the model output. In many applications, this will be a Functional, i.e. an Operator mapping to scalars. This is not required, however. The output functional will be projected automatically onto the reduced space.

  • input_scaler – See DataDrivenReductor.

  • output_scaler – See DataDrivenReductor.

  • product – Inner product Operator defined on the discrete space on which the problem is posed. Used for the reduced basis computation via POD and for the orthogonal projection onto the reduced basis.

  • pod_params – Dict of additional parameters for the POD method.
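The data generation performed by this reductor can be sketched in plain NumPy: snapshots form the columns of a matrix, a POD basis is computed (here via the SVD, as a stand-in for pymor.algorithms.pod), and projecting the snapshots onto the basis yields the reduced coefficients on which the regressor is trained. The toy snapshot data and variable names below are illustrative, not pyMOR's internal API:

```python
import numpy as np

# hypothetical training parameters and snapshot matrix S (dim x n_snapshots)
rng = np.random.default_rng(0)
mus = rng.uniform(0.1, 1.0, size=(20, 1))
S = np.array([np.sin(mu * np.linspace(0, 1, 50)) for mu in mus[:, 0]]).T

# POD basis from the SVD of the snapshot matrix (stand-in for pyMOR's pod)
U, sv, _ = np.linalg.svd(S, full_matrices=False)
V = U[:, :3]                 # orthonormal reduced basis with 3 modes

coeffs = V.T @ S             # reduced coefficients: the regressor's training targets
# a regressor would now learn the map mus -> coeffs.T;
# reconstruct() then maps reduced coefficients back via V
reconstruction = V @ coeffs
```

Since the snapshots here are smooth in the parameter, three POD modes already reconstruct them to small error; in practice the basis size is controlled via pod_params (e.g. an error tolerance).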

Methods

extend_training_data

Add sequences of parameters and corresponding snapshots to the training data.

reconstruct

Reconstruct high-dimensional vector from reduced vector u.

reduce

Reduce by training a machine learning surrogate.

extend_training_data(parameters, snapshots)[source]

Add sequences of parameters and corresponding snapshots to the training data.

reconstruct(u)[source]

Reconstruct high-dimensional vector from reduced vector u.

reduce(**kwargs)[source]

Reduce by training a machine learning surrogate.

If the regressor supports incremental extension via an extend method and has already been fitted, only the new training data (added via extend_training_data) is passed to extend. Otherwise, the regressor is fully retrained on all training data.

Incremental extension requires that all scalers are either pre-fitted (via the input_scaler_fitted / output_scaler_fitted flags) or not used, since unfitted scalers need to be refitted on the full dataset.
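The fit/extend protocol described above can be illustrated with a minimal stand-in regressor. The nearest-neighbor class below is hypothetical (it is not pyMOR's VKOGARegressor); it only demonstrates the interface the reduce method relies on, where extend receives just the newly added training pairs:

```python
import numpy as np

class NearestNeighborRegressor:
    """Hypothetical minimal regressor with the fit/predict/extend interface."""

    def fit(self, X, Y):
        # full (re)training on all data passed in
        self.X, self.Y = np.asarray(X, float), np.asarray(Y, float)

    def extend(self, X_new, Y_new):
        # incremental update: only the new training data is passed in
        self.X = np.vstack([self.X, np.asarray(X_new, float)])
        self.Y = np.vstack([self.Y, np.asarray(Y_new, float)])

    def predict(self, X):
        # return the target of the nearest stored parameter
        idx = np.argmin(np.abs(self.X[:, None, 0] - np.asarray(X, float)[None, :, 0]), axis=0)
        return self.Y[idx]

reg = NearestNeighborRegressor()
reg.fit([[0.0], [1.0]], [[0.0], [10.0]])   # initial training
reg.extend([[0.5]], [[5.0]])               # later: only the new pair is added
```

A regressor without an extend method would instead be refitted on the concatenation of old and new data, which is the fallback path of reduce.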

Parameters:

kwargs – Additional arguments that will be passed to the fit method of the regressor.

Returns:

The data-driven reduced model.

class pymor.reductors.data_driven.DataDrivenReductor(training_parameters, training_snapshots, regressor=VKOGARegressor, regressor_parameters=None, target_quantity='solution', T=None, time_vectorized=False, output_functional=None, input_scaler=None, output_scaler=None, input_scaler_fitted=False, output_scaler_fitted=False)[source]

Bases: pymor.core.base.BasicObject

Reductor relying on a machine learning surrogate.

The reductor works for stationary as well as for instationary problems and returns a suitable model for the respective case.

Depending on the argument target_quantity, this reductor either approximates the solution or the output as a parametric quantity by training a machine learning surrogate.

In case of an approximation of the solution, the reductor trains a machine learning regressor that approximates the mapping from parameter space to a NumPy array. Typically, the array contains coefficients of the solution with respect to a reduced basis.

For target_quantity='output', the machine learning regressor directly approximates the output depending on the parameter. The outputs for the training parameters are either computed using the full-order model or can be provided as the training_snapshots argument.
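In the target_quantity='output' setting, the surrogate learns the parameter-to-output map directly, with no reduced basis involved. The quadratic least-squares fit below is an illustrative stand-in for the trained regressor, with a made-up output function in place of a full-order model:

```python
import numpy as np

# hypothetical training data: parameters and full-order-model outputs
mus = np.linspace(0.0, 1.0, 11)
outputs = mus**2 + 1.0          # stand-in for fom.output(mu)

# fit a quadratic surrogate output(mu) ~ a*mu**2 + b*mu + c
A = np.vander(mus, 3)           # columns: mu**2, mu, 1
coef, *_ = np.linalg.lstsq(A, outputs, rcond=None)

def surrogate(mu):
    # evaluate the learned parameter-to-output map
    return np.polyval(coef, mu)
```

The training data is itself quadratic here, so the fit is exact; a real regressor would only approximate the output map from the sampled parameters.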

Parameters:
  • training_parameters – Parameter values to use for the training of the regressor.

  • training_snapshots – Iterable containing the training snapshots for the regressor. Contains the solutions (reduced coefficients w.r.t. the reduced basis, or outputs) associated with the parameters in training_parameters. In the case of a time-dependent problem, the snapshots are assumed to be equidistant in time.

  • regressor – Regressor with fit and predict methods similar to scikit-learn regressors that is trained in the reduce method. Defaults to VKOGARegressor. Alternatively, one can pass a class, which will be instantiated using the parameters in regressor_parameters.

  • regressor_parameters – Dictionary with parameters for regressor instantiation. This will be used only when a class instead of a regressor object is passed as regressor.

  • target_quantity – Either 'solution' or 'output', determines which quantity to learn.

  • T – In the instationary case, determines the final time until which to solve.

  • time_vectorized – In the instationary case, determines whether to predict the whole time trajectory at once (time-vectorized version; the regressor output is typically very high-dimensional in this case) or to approximate the result for a single point in time (time then serves as an additional input to the regressor).

  • output_functional – Operator mapping a given solution to the model output. In many applications, this will be a Functional, i.e. an Operator mapping to scalars. This is not required, however.

  • input_scaler – If not None, a scaler object with fit, transform and inverse_transform methods similar to the scikit-learn interface can be used to scale the parameters before passing them to the regressor.

  • output_scaler – If not None, a scaler object with fit, transform and inverse_transform methods similar to the scikit-learn interface can be used to scale the outputs (reduced coefficients or output quantities) before passing them to the regressor.

  • input_scaler_fitted – If True, the input_scaler is assumed to be already fitted and will not be refitted during reduce. This enables the incremental extend path for regressors that support it. Useful when the scaler has been pre-fitted based on domain knowledge (e.g., the parameter space bounds).

  • output_scaler_fitted – If True, the output_scaler is assumed to be already fitted and will not be refitted during reduce.
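The scaler interface (fit/transform/inverse_transform) and the pre-fitting enabled by input_scaler_fitted=True can be sketched as follows. The hand-rolled class below is a minimal stand-in for e.g. sklearn.preprocessing.MinMaxScaler, and the parameter bounds are assumed for illustration:

```python
import numpy as np

class MinMaxScaler:
    """Minimal scaler with the fit/transform/inverse_transform interface."""

    def fit(self, X):
        X = np.asarray(X, float)
        self.lo, self.hi = X.min(axis=0), X.max(axis=0)
        return self

    def transform(self, X):
        # map each component to [0, 1]
        return (np.asarray(X, float) - self.lo) / (self.hi - self.lo)

    def inverse_transform(self, X):
        return np.asarray(X, float) * (self.hi - self.lo) + self.lo

# pre-fit on known parameter-space bounds rather than on training data,
# so the scaler never needs refitting when training data is extended
# (this is what input_scaler_fitted=True asserts to the reductor)
scaler = MinMaxScaler().fit([[0.1], [10.0]])
scaled = scaler.transform([[0.1], [5.05], [10.0]])
```

A scaler fitted only to the current training data would change when new data arrives, which is why unfitted scalers force a full retrain instead of the incremental extend path.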

Methods

extend_training_data

Add sequences of parameters and corresponding snapshots to the training data.

reduce

Reduce by training a machine learning surrogate.

extend_training_data(parameters, snapshots)[source]

Add sequences of parameters and corresponding snapshots to the training data.

reduce(**kwargs)[source]

Reduce by training a machine learning surrogate.

If the regressor supports incremental extension via an extend method and has already been fitted, only the new training data (added via extend_training_data) is passed to extend. Otherwise, the regressor is fully retrained on all training data.

Incremental extension requires that all scalers are either pre-fitted (via the input_scaler_fitted / output_scaler_fitted flags) or not used, since unfitted scalers need to be refitted on the full dataset.

Parameters:

kwargs – Additional arguments that will be passed to the fit method of the regressor.

Returns:

The data-driven reduced model.