jesterTOV.inference.flows.config.FlowTrainingConfig#
- class FlowTrainingConfig(**data)[source]#
Bases: BaseModel
Configuration for training normalizing flows on posterior samples.
- Variables:
posterior_file (str) – Path to .npz file with posterior samples
output_dir (str) – Directory to save model weights, kwargs, and plots
parameter_names (list[str]) – List of parameter names to extract from the posterior file. Examples: GW parameters ["mass_1_source", "mass_2_source", "lambda_1", "lambda_2"]; NICER ["mass", "radius"]
num_epochs (int) – Number of training epochs (default: 600)
learning_rate (float) – Learning rate for training (default: 1e-3)
max_patience (int) – Early stopping patience (default: 50)
nn_depth (int) – Depth of neural network blocks (default: 5)
nn_block_dim (int) – Dimension of neural network blocks (default: 8)
flow_layers (int) – Number of flow layers (default: 1)
invert (bool) – Whether to invert the flow (default: True)
cond_dim (int | None) – Conditional dimension for conditional flows (default: None)
max_samples (int) – Maximum number of samples to use for training (default: 50,000)
seed (int) – Random seed for reproducibility (default: 0)
plot_corner (bool) – Generate corner plot comparison (default: True)
plot_losses (bool) – Plot training and validation losses (default: True)
flow_type (Literal["block_neural_autoregressive_flow", "masked_autoregressive_flow", "coupling_flow"]) – Type of normalizing flow to use (default: masked_autoregressive_flow)
nn_width (int) – Width of neural network hidden layers (default: 50)
standardize (bool) – Whether to standardize input data (default: True, changed from False)
standardization_method (Literal["zscore", "minmax"]) – Method for standardizing input data (default: zscore); only used if standardize=True. "zscore" standardizes to mean=0, std=1 (recommended for most cases); "minmax" rescales to the [0, 1] range (legacy, kept for backward compatibility).
transformer (Literal["affine", "rational_quadratic_spline"]) – Transformer type for masked_autoregressive_flow and coupling_flow (default: rational_quadratic_spline, changed from affine)
transformer_knots (int) – Number of knots for RationalQuadraticSpline transformer (default: 10, changed from 8)
transformer_interval (float) – Interval for RationalQuadraticSpline transformer (default: 5.0, changed from 4.0)
val_prop (float) – Proportion of data to use for validation (default: 0.2)
batch_size (int) – Batch size for training (default: 128)
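The two standardization_method options above can be sketched in plain Python. The function names here are illustrative, not the library's internals, and the actual implementation likely applies the transform per parameter column:

```python
import statistics

def zscore(xs):
    # "zscore": shift and scale so the result has mean 0 and std 1
    mu = statistics.fmean(xs)
    sigma = statistics.pstdev(xs)
    return [(x - mu) / sigma for x in xs]

def minmax(xs):
    # "minmax": rescale linearly onto the [0, 1] interval
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]
```

Either transform is invertible, so samples drawn from the trained flow can be mapped back to physical units; zscore is usually preferred because it keeps all parameters on comparable scales without pinning the extremes of the training set to fixed values.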
- __init__(**data)#
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
Methods
- __init__(**data) – Create a new model by parsing and validating input data from keyword arguments.
- construct([_fields_set])
- copy(*[, include, exclude, update, deep]) – Returns a copy of the model.
- dict(*[, include, exclude, by_alias, ...])
- from_orm(obj)
- from_yaml(filepath) – Load configuration from a YAML file.
- json(*[, include, exclude, by_alias, ...])
- model_construct([_fields_set]) – Creates a new instance of the Model class with validated data.
- model_copy(*[, update, deep])
- model_dump(*[, mode, include, exclude, ...])
- model_dump_json(*[, indent, ensure_ascii, ...])
- model_json_schema([by_alias, ref_template, ...]) – Generates a JSON schema for a model class.
- model_parametrized_name(params) – Compute the class name for parametrizations of generic classes.
- model_post_init(context, /) – Override this method to perform additional initialization after __init__ and model_construct.
- model_rebuild(*[, force, raise_errors, ...]) – Try to rebuild the pydantic-core schema for the model.
- model_validate(obj, *[, strict, extra, ...]) – Validate a pydantic model instance.
- model_validate_json(json_data, *[, strict, ...])
- model_validate_strings(obj, *[, strict, ...]) – Validate the given object with string data against the Pydantic model.
- parse_file(path, *[, content_type, ...])
- parse_obj(obj)
- parse_raw(b, *[, content_type, encoding, ...])
- schema([by_alias, ref_template])
- schema_json(*[, by_alias, ref_template])
- update_forward_refs(**localns)
- validate(value)
- Field validators: parameter_names must be a non-empty list; float values must be positive; integer values must be positive; the validation proportion must lie in (0, 1).
Attributes
- model_computed_fields
- model_config – Configuration for the model, should be a dictionary conforming to ConfigDict.
- model_extra – Get extra fields set during validation.
- model_fields
- model_fields_set – Returns the set of fields that have been explicitly set on this model instance.
- flow_type: Literal['block_neural_autoregressive_flow', 'masked_autoregressive_flow', 'coupling_flow']#
- classmethod from_yaml(filepath)[source]#
Load configuration from a YAML file.
- Parameters:
filepath – Path to the YAML configuration file
- Return type:
FlowTrainingConfig
- Returns:
FlowTrainingConfig instance with loaded configuration
Example
>>> config = FlowTrainingConfig.from_yaml("config.yaml")
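For reference, a minimal config.yaml consistent with the fields documented above might look like the following sketch (file paths and parameter names are illustrative; omitted fields fall back to their defaults):

```yaml
posterior_file: posterior_samples.npz
output_dir: ./flow_output
parameter_names:
  - mass
  - radius
num_epochs: 600
learning_rate: 1.0e-3
flow_type: masked_autoregressive_flow
standardize: true
standardization_method: zscore
```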