SingleDatasetPerformance
- class SingleDatasetPerformance
Summarize given model performance on a dataset based on selected scorers.
- Parameters
- scorers: Union[List[str], Dict[str, Union[str, Callable]]], default: None
List of scorers to use. If None, default scorers are used. Scorers can be supplied either as a list of scorer names or as a dictionary mapping names to scorer strings or callables.
- n_samples: int, default: 10_000
Maximum number of samples to use for this check.
- __init__(scorers: Optional[Union[List[str], Dict[str, Union[str, Callable]]]] = None, n_samples: int = 10000, **kwargs)
- __new__(*args, **kwargs)
Methods
- add_condition: Add new condition function to the check.
- add_condition_greater_than: Add condition - the selected metrics scores are greater than the threshold.
- clean_conditions: Remove all conditions from this check instance.
- conditions_decision: Run conditions on given result.
- config: Return check configuration.
- from_config: Return check object from a CheckConfig object.
- from_json: Deserialize check instance from JSON string.
- greater_is_better: Return True if the check reduce_output is better when it is greater.
- metadata: Return check metadata.
- name: Name of class in split camel case.
- params: Return parameters to show when printing the check.
- reduce_output: Return the values of the metrics for the dataset provided in a {metric: value} format.
- remove_condition: Remove given condition by index.
- run: Run check.
- run_logic: Run check.
- to_json: Serialize check instance to JSON string.