SingleDatasetPerformance
- class SingleDatasetPerformance(scorers: Optional[Union[Dict[str, Union[Metric, Callable, str]], List[Any]]] = None, **kwargs)
 Calculate performance metrics of a given model on a given dataset.
- Parameters
- scorers: Union[Dict[str, Union[Metric, Callable, str]], List[Any]], default: None
 An optional dictionary mapping scorer names to scorer functions (or a list of scorers). If none is given, the default scorers are used.
- __init__(scorers: Optional[Union[Dict[str, Union[Metric, Callable, str]], List[Any]]] = None, **kwargs)
 
- __new__(*args, **kwargs)
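
As a quick orientation for the scorers parameter, the sketch below is a minimal, non-authoritative example: it builds the check once with the default scorers and once with a custom dictionary. The scorer strings ("accuracy", "precision_per_class") and the import path are assumed to be the usual deepchecks vision ones; the exact names accepted depend on the task type and installed version, so treat them as placeholders.

```python
from deepchecks.vision.checks import SingleDatasetPerformance

# With no arguments, the default scorers for the task type are used.
check_default = SingleDatasetPerformance()

# A dictionary maps a display name to a scorer string, callable, or ignite Metric.
# The scorer strings below are illustrative placeholders; check which names your
# deepchecks version and task type accept.
check_custom = SingleDatasetPerformance(
    scorers={
        "accuracy": "accuracy",
        "precision": "precision_per_class",
    }
)
```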
 
Methods

| Method | Description |
| --- | --- |
| add_condition | Add a new condition function to the check. |
| add_condition_greater_than | Add a condition requiring the result to be greater than the threshold. |
| add_condition_less_than | Add a condition requiring the result to be less than the threshold. |
| clean_conditions | Remove all conditions from this check instance. |
| compute | Compute the metric result using the ignite metric's compute method and reduce it to a scalar. |
| conditions_decision | Run conditions on the given result. |
| config | Return the check configuration. |
| from_config | Return a check object from a CheckConfig object. |
| from_json | Deserialize a check instance from a JSON string. |
| initialize_run | Initialize the metric for the check and validate that the task type is relevant. |
| metadata | Return check metadata. |
| name | Name of the class in split camel case. |
| params | Return the parameters to show when printing the check. |
| reduce_output | Return the values of the metrics for the provided dataset in a {metric: value} format. |
| remove_condition | Remove a given condition by index. |
| run | Run the check. |
| to_json | Serialize the check instance to a JSON string. |
| update | Update the metrics by passing the batch to the ignite metric's update method. |
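
To show how the pieces above fit together, here is a hedged end-to-end sketch: it adds a threshold condition, runs the check on a single dataset, and inspects the result. `train_ds` is a placeholder for a prepared deepchecks VisionData object, and in some deepchecks versions run() also takes the model as an explicit argument (e.g. check.run(train_ds, model)).

```python
from deepchecks.vision.checks import SingleDatasetPerformance

# `train_ds` is a placeholder for a VisionData object prepared elsewhere.
check = SingleDatasetPerformance(scorers={"accuracy": "accuracy"})

# The condition fails if any computed metric value is not greater than 0.8.
check.add_condition_greater_than(0.8)

# Depending on the installed version, the model may need to be passed explicitly,
# e.g. check.run(train_ds, model).
result = check.run(train_ds)

print(result.value)  # metric values for the dataset, typically per metric (and per class)
result.show()        # render the full check display, including condition outcomes
```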