etna.loggers.ClearMLLogger#
- class ClearMLLogger(project_name: str | None = None, task_name: str | None = None, task_name_prefix: str = '', task_type: str = 'training', tags: Sequence[str] | None = None, output_uri: str | bool | None = None, auto_connect_frameworks: bool | Mapping[str, bool | str | list] = False, auto_resource_monitoring: bool | Mapping[str, Any] = True, auto_connect_streams: bool | Mapping[str, bool] = True, plot: bool = True, table: bool = True, config: Dict[str, Any] | None = None, save_dir: str | None = None)[source]#
Bases: BaseLogger
ClearML logger.
Note
This logger requires the clearml extension to be installed. Read more about this at the installation page.
Warning
Aggregated metrics charts may be logged incorrectly. For more details see the issue.
Create instance of ClearMLLogger.
- Parameters:
project_name (str | None) – The name of the project in which the experiment will be created.
task_name (str | None) – The name of Task (experiment).
task_name_prefix (str) – Prefix for the Task name field.
task_type (str) – The task type.
tags (Sequence[str] | None) – Add a list of tags (str) to the created Task.
output_uri (str | bool | None) – The default location for output models and other artifacts.
auto_connect_frameworks (bool | Mapping[str, bool | str | list]) – Automatically connect frameworks.
auto_resource_monitoring (bool | Mapping[str, Any]) – Automatically create machine resource monitoring plots.
auto_connect_streams (bool | Mapping[str, bool]) – Control the automatic logging of stdout and stderr.
plot (bool) – Indicator for making and sending plots.
table (bool) – Indicator for making and sending tables.
config (Dict[str, Any] | None) – A dictionary-like object for saving inputs to your job, like hyperparameters for a model or settings for a data preprocessing job.
save_dir (str | None) – Path to the directory for saving intermediate data. Used only when logging DL models. Defaults to ./tb_save.
Notes
For more details see the documentation.
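A minimal usage sketch, assuming the clearml extension is installed and ClearML credentials are already configured (e.g. via clearml-init). The logger is created and registered in etna's global tslogger, after which pipelines and backtests report to ClearML automatically; project and task names below are placeholders.

from etna.loggers import ClearMLLogger, tslogger

# Placeholder project/task names; any run settings can go into ``config``.
clearml_logger = ClearMLLogger(
    project_name="my_project",
    task_name="naive_baseline",
    tags=["backtest", "baseline"],
    config={"horizon": 7},
)

# Register the logger globally; keep the returned id to detach it later.
logger_id = tslogger.add(clearml_logger)

# ... fit pipelines, run backtests, etc. ...

tslogger.remove(logger_id)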
Methods
finish_experiment(*args, **kwargs) – Finish Task. Reinit Task.
log(msg, **kwargs) – Log any event.
log_backtest_metrics(ts, metrics_df, ...) – Write metrics to logger.
log_backtest_run(metrics, forecast, test) – Log backtest metrics from one fold to logger.
set_params(**params) – Return new object instance with modified parameters.
start_experiment([job_type, group]) – Start Task.
to_dict() – Collect all information about etna object in dict.
Attributes
This class stores its __init__ parameters as attributes. It also provides an attribute with the corresponding Pytorch lightning loggers.
- log(msg: str | Dict[str, Any], **kwargs)[source]#
Log any event.
This class logs the string representation of a message.
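In practice log() is usually called through the global tslogger rather than on the logger instance directly. A short sketch, assuming a ClearMLLogger has already been added to tslogger as shown above:

from etna.loggers import tslogger

tslogger.log("starting custom preprocessing")             # plain string message
tslogger.log({"step": "preprocessing", "n_segments": 2})  # a dict is logged via its string representation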
- log_backtest_metrics(ts: TSDataset, metrics_df: DataFrame, forecast_df: DataFrame, fold_info_df: DataFrame)[source]#
Write metrics to logger.
- log_backtest_run(metrics: DataFrame, forecast: DataFrame, test: DataFrame)[source]#
Log backtest metrics from one fold to logger.
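Both backtest hooks are called internally by Pipeline.backtest when a ClearMLLogger is registered in tslogger, so no direct calls are needed. A sketch under that assumption, using synthetic data:

from etna.datasets import TSDataset, generate_ar_df
from etna.metrics import MAE, SMAPE
from etna.models import NaiveModel
from etna.pipeline import Pipeline

# Synthetic data with two segments.
df = generate_ar_df(periods=100, start_time="2021-01-01", n_segments=2)
ts = TSDataset(df=TSDataset.to_dataset(df), freq="D")

pipeline = Pipeline(model=NaiveModel(lag=1), horizon=7)

# During backtest, log_backtest_run is invoked for every fold and
# log_backtest_metrics for the aggregated results.
backtest_result = pipeline.backtest(ts=ts, metrics=[MAE(), SMAPE()], n_folds=3)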
- set_params(**params: dict) → Self [source]#
Return new object instance with modified parameters.
The method also allows changing parameters of nested objects within the current object. For example, it is possible to change parameters of a model in a Pipeline.
Nested parameters are expected to be in a <component_1>.<...>.<parameter> form, where components are separated by a dot.
- Parameters:
**params (dict) – Estimator parameters
- Returns:
New instance with changed parameters
- Return type:
Self
Examples
>>> from etna.pipeline import Pipeline
>>> from etna.models import NaiveModel
>>> from etna.transforms import AddConstTransform
>>> model = NaiveModel(lag=1)
>>> transforms = [AddConstTransform(in_column="target", value=1)]
>>> pipeline = Pipeline(model, transforms=transforms, horizon=3)
>>> pipeline.set_params(**{"model.lag": 3, "transforms.0.value": 2})
Pipeline(model = NaiveModel(lag = 3, ), transforms = [AddConstTransform(in_column = 'target', value = 2, inplace = True, out_column = None, )], horizon = 3, )
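The same mechanism applies to the logger itself. A small sketch (parameter values here are placeholders) showing that a new instance is returned while the original is left unchanged:

from etna.loggers import ClearMLLogger

logger = ClearMLLogger(project_name="my_project", task_name="run_1")
# Returns a new ClearMLLogger; the original `logger` keeps task_name="run_1".
new_logger = logger.set_params(task_name="run_2")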