MultiModelPerformanceReport#

class MultiModelPerformanceReport[source]#

Summarize performance scores for multiple models on test datasets.

Parameters
alternative_scorers : Dict[str, Callable], default: None

An optional dictionary mapping scorer names to scorer functions. If none is given, the default scorers are used (see the sketch after the constructor signatures below).

__init__(alternative_scorers: Optional[Dict[str, Callable]] = None, **kwargs)[source]#
__new__(*args, **kwargs)#
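
A minimal construction sketch, assuming the check is importable from deepchecks.tabular.checks and that scikit-learn scorers built with make_scorer are accepted as the callables (the scorer name 'F1 Macro' is illustrative):

    from sklearn.metrics import f1_score, make_scorer

    from deepchecks.tabular.checks import MultiModelPerformanceReport

    # Override the default scorers with a custom macro-averaged F1 scorer.
    # The key is the display name used in the report; the value is the scorer callable.
    check = MultiModelPerformanceReport(
        alternative_scorers={'F1 Macro': make_scorer(f1_score, average='macro')}
    )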

Methods

MultiModelPerformanceReport.add_condition(...)

Add new condition function to the check.

MultiModelPerformanceReport.clean_conditions()

Remove all conditions from this check instance.

MultiModelPerformanceReport.conditions_decision(result)

Run conditions on given result.

MultiModelPerformanceReport.config()

Return check configuration (conditions' configuration not yet supported).

MultiModelPerformanceReport.from_config(conf)

Return check object from a CheckConfig object.

MultiModelPerformanceReport.metadata([...])

Return check metadata.

MultiModelPerformanceReport.name()

Name of class in split camel case.

MultiModelPerformanceReport.params([...])

Return parameters to show when printing the check.

MultiModelPerformanceReport.remove_condition(index)

Remove given condition by index.

MultiModelPerformanceReport.run(...)

Initialize context and pass to check logic.

MultiModelPerformanceReport.run_logic(...)

Run check logic.

Examples#
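
A minimal usage sketch, assuming the tabular Dataset API and that run accepts a train dataset, a test dataset, and a list of fitted models; the data and models below are illustrative:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    from deepchecks.tabular import Dataset
    from deepchecks.tabular.checks import MultiModelPerformanceReport

    # Prepare train/test datasets (illustrative data).
    df = load_iris(as_frame=True).frame
    train_df, test_df = train_test_split(df, test_size=0.3, random_state=0)
    train_ds = Dataset(train_df, label='target', cat_features=[])
    test_ds = Dataset(test_df, label='target', cat_features=[])

    # Fit the models to compare.
    X, y = train_df.drop(columns='target'), train_df['target']
    models = [
        RandomForestClassifier(random_state=0).fit(X, y),
        DecisionTreeClassifier(random_state=0).fit(X, y),
    ]

    # Summarize performance scores for all models on the test dataset.
    result = MultiModelPerformanceReport().run(train_ds, test_ds, models)
    result.show()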