model_evaluation

This module contains checks for model evaluation.

Classes

BoostingOverfit

Check for overfit caused by using too many iterations in a gradient boosted model.
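A hedged end-to-end sketch of how a train/test check like this is typically run with the deepchecks tabular API. The toy dataset, the scikit-learn booster, and the 'target' label column are illustrative choices, not part of this module; later sketches reuse the variables defined here.

```python
import pandas as pd
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

from deepchecks.tabular import Dataset
from deepchecks.tabular.checks import BoostingOverfit

# Toy data and a deliberately over-iterated booster, purely for illustration.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

# Wrap the raw frames in deepchecks Dataset objects ('target' is the label column name).
train_ds = Dataset(pd.concat([X_train, y_train], axis=1), label='target')
test_ds = Dataset(pd.concat([X_test, y_test], axis=1), label='target')

result = BoostingOverfit().run(train_ds, test_ds, model)
result.show()  # interactive report; result.value exposes the underlying numbers
```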

CalibrationScore

Calculate the calibration curve with the Brier score for each class.

ConfusionMatrixReport

Calculate the confusion matrix of the model on the given dataset.
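ConfusionMatrixReport is a single-dataset check, so it is run on one Dataset plus the model rather than on a train/test pair. A minimal sketch, reusing the placeholders from the BoostingOverfit example above:

```python
from deepchecks.tabular.checks import ConfusionMatrixReport

# test_ds and model are the Dataset and fitted classifier from the first sketch.
result = ConfusionMatrixReport().run(test_ds, model)
result.show()
matrix = result.value  # the underlying matrix, for programmatic use
```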

ModelInferenceTime

Measure model average inference time (in seconds) per sample.

ModelInfo

Summarize given model parameters.
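ModelInfo needs only the model itself, no dataset. A minimal sketch, assuming a model-only run signature:

```python
from deepchecks.tabular.checks import ModelInfo

# model is the fitted estimator from the first sketch; no Dataset is required.
ModelInfo().run(model).show()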

MultiModelPerformanceReport

Summarize performance scores for multiple models on test datasets.
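MultiModelPerformanceReport compares several fitted models at once. Passing a list of models to run(), as below, follows the usual deepchecks pattern, but treat the exact signature as an assumption; the extra models are trained here only for the comparison.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

from deepchecks.tabular.checks import MultiModelPerformanceReport

# train_ds, test_ds, X_train, y_train, and model come from the first sketch above.
rf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
dt = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

MultiModelPerformanceReport().run(train_ds, test_ds, [model, rf, dt]).show()
```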

TrainTestPerformance

Summarize given model performance on the train and test datasets based on selected scorers.
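A hedged sketch of configuring the scorers for this check; the scorers parameter name and the sklearn-style scorer strings are assumptions based on typical deepchecks usage.

```python
from deepchecks.tabular.checks import TrainTestPerformance

# 'scorers' and the scorer names are assumed to follow sklearn-style conventions.
# train_ds, test_ds, and model come from the first sketch above.
check = TrainTestPerformance(scorers=['accuracy', 'f1_macro'])
check.run(train_ds, test_ds, model).show()
```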

RegressionErrorDistribution

Check for systematic error and abnormal shape in the regression error distribution.

RegressionSystematicError

Check the regression systematic error.

RocReport

Calculate the ROC curve for each class.

SegmentPerformance

Display the performance score in a heatmap, segmented by the top 2 (or user-specified) features.

SimpleModelComparison

Compare the given model's score to the score of a simple baseline model (chosen according to the given model type).
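SimpleModelComparison runs like any other train/test check; the baseline ("simple") model is built internally by the check. Constructor options for choosing the baseline type vary between deepchecks versions, so this sketch uses the default:

```python
from deepchecks.tabular.checks import SimpleModelComparison

# The baseline model is constructed internally by the check.
# train_ds, test_ds, and model come from the first sketch above.
SimpleModelComparison().run(train_ds, test_ds, model).show()
```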

TrainTestPredictionDrift

The TrainTestPredictionDrift check is deprecated and will be removed in version 0.14; use PredictionDrift instead.

PredictionDrift

Calculate prediction drift between train dataset and test dataset, using statistical measures.
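PredictionDrift compares the distribution of model predictions on the train and test datasets. The condition-method name below is an assumption modeled on deepchecks' usual add_condition_* naming, not a confirmed signature:

```python
from deepchecks.tabular.checks import PredictionDrift

# The condition-method name is an assumption; deepchecks drift checks typically
# expose an add_condition_drift_score_less_than-style helper that returns the check.
check = PredictionDrift().add_condition_drift_score_less_than(0.1)
result = check.run(train_ds, test_ds, model)  # train_ds, test_ds, model as above
result.show()  # the report marks whether the drift-score condition passed
```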

WeakSegmentsPerformance

Search for segments with low performance scores.

UnusedFeatures

Detect features that are nearly unused by the model.

SingleDatasetPerformance

Summarize given model performance on a single dataset based on selected scorers.
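SingleDatasetPerformance is the per-dataset counterpart of TrainTestPerformance. A hedged sketch; as above, the scorers parameter name is an assumption:

```python
from deepchecks.tabular.checks import SingleDatasetPerformance

# Runs on a single Dataset; 'scorers' is assumed to accept sklearn-style names.
# test_ds and model come from the first sketch above.
result = SingleDatasetPerformance(scorers=['accuracy']).run(test_ds, model)
print(result.value)  # per-scorer results, also viewable via result.show()
```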

PerformanceBias

Check for performance differences between subgroups of a feature, optionally accounting for a control variable.
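PerformanceBias needs to know which feature defines the subgroups. In this sketch the protected_feature and control_feature parameter names are assumptions, and 'gender' / 'age_group' are hypothetical column names used only for illustration:

```python
from deepchecks.tabular.checks import PerformanceBias

# 'protected_feature' / 'control_feature' and the column names are assumptions
# for illustration only; test_ds and model come from the first sketch above.
check = PerformanceBias(protected_feature='gender', control_feature='age_group')
check.run(test_ds, model).show()
```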