PerformanceReport

class PerformanceReport[source]

Deprecated. Summarize the given scores on a dataset and model.

__init__(alternative_scorers: Optional[Dict[str, Callable]] = None, reduce: Union[Callable, str] = 'mean', **kwargs)[source]
__new__(*args, **kwargs)
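
A minimal usage sketch follows. The import paths, the scorer format, and the Dataset construction are assumptions to verify against your installed deepchecks version; since the check is deprecated, newer releases may expose a replacement instead:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import f1_score, make_scorer
    from sklearn.model_selection import train_test_split

    from deepchecks.tabular import Dataset                    # assumed import path
    from deepchecks.tabular.checks import PerformanceReport   # assumed import path

    # Build a small train/test split and fit any sklearn-compatible model.
    frame = load_iris(as_frame=True).frame
    train_df, test_df = train_test_split(frame, random_state=0)
    train_ds = Dataset(train_df, label='target')
    test_ds = Dataset(test_df, label='target')
    model = RandomForestClassifier(random_state=0).fit(
        train_df.drop(columns='target'), train_df['target']
    )

    # alternative_scorers maps display names to sklearn-style scorer callables;
    # reduce controls how per-class scores are aggregated (default 'mean').
    check = PerformanceReport(
        alternative_scorers={'f1_macro': make_scorer(f1_score, average='macro')}
    )
    result = check.run(train_ds, test_ds, model)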

Methods

PerformanceReport.add_condition(name, ...)

Add new condition function to the check.

PerformanceReport.add_condition_class_performance_imbalance_ratio_less_than([...])

Add condition - the relative ratio difference between the highest-scoring and lowest-scoring classes is less than the given threshold.

PerformanceReport.add_condition_test_performance_greater_than(...)

Add condition - metric scores are greater than the threshold.

PerformanceReport.add_condition_train_test_relative_degradation_less_than([...])

Add condition - test performance is not degraded by more than the given percentage relative to train performance (see the conditions sketch after this table).

PerformanceReport.clean_conditions()

Remove all conditions from this check instance.

PerformanceReport.conditions_decision(result)

Run conditions on given result.

PerformanceReport.config()

Return check configuration (conditions' configuration not yet supported).

PerformanceReport.from_config(conf)

Return check object from a CheckConfig object.

PerformanceReport.metadata([with_doc_link])

Return check metadata.

PerformanceReport.name()

Name of class in split camel case.

PerformanceReport.params([show_defaults])

Return parameters to show when printing the check.

PerformanceReport.reduce_output(check_result)

Return the values of the metrics for the test dataset in {metric: value} format.

PerformanceReport.remove_condition(index)

Remove given condition by index.

PerformanceReport.run(train_dataset, ...[, ...])

Run check.

PerformanceReport.run_logic(context)

Run the check logic.
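
As referenced in the table above, conditions can be chained onto the check before it runs. A sketch, assuming the add_condition_* methods return the check instance for chaining (threshold values here are illustrative, and train_ds/test_ds/model are from the earlier sketch):

    # Attach pass/fail conditions before running the check.
    check = (
        PerformanceReport()
        .add_condition_test_performance_greater_than(0.7)
        .add_condition_train_test_relative_degradation_less_than(0.1)
    )
    result = check.run(train_ds, test_ds, model)

    # conditions_decision evaluates the attached conditions on the result.
    for decision in check.conditions_decision(result):
        print(decision)

    # reduce_output flattens the result into {metric: value} for the test set.
    print(check.reduce_output(result))

    # config/from_config round-trip the check's configuration
    # (per the table, conditions' configuration is not yet supported).
    restored = PerformanceReport.from_config(check.config())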