CalibrationScore#

class CalibrationScore[source]#

Calculate the calibration curve and Brier score for each class.
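The Brier score the check reports per class is the mean squared difference between the predicted probability and the actual binary outcome (lower is better). A minimal, pure-Python sketch of that metric — an illustration only, not the check's internal implementation:

```python
def brier_score(y_true, y_prob):
    """Brier score for one class.

    y_true: 0/1 outcomes for membership in the class.
    y_prob: predicted probability of the class.
    A perfectly calibrated, fully confident model scores 0.0.
    """
    return sum((p - y) ** 2 for y, p in zip(y_true, y_prob)) / len(y_true)

y_true = [1, 0, 1, 1, 0]
y_prob = [0.9, 0.1, 0.8, 0.3, 0.2]
print(round(brier_score(y_true, y_prob), 3))  # → 0.118
```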

Parameters
n_samples : int , default: 1_000_000

Number of samples to use for this check.

random_state : int , default: 42

Random seed for all check internals.

__init__(n_samples: int = 1000000, random_state: int = 42, **kwargs)[source]#
__new__(*args, **kwargs)#

Methods

CalibrationScore.add_condition(name, ...)

Add new condition function to the check.

CalibrationScore.clean_conditions()

Remove all conditions from this check instance.

CalibrationScore.conditions_decision(result)

Run conditions on given result.

CalibrationScore.config([include_version, ...])

Return check configuration (conditions' configuration not yet supported).

CalibrationScore.from_config(conf[, ...])

Return check object from a CheckConfig object.

CalibrationScore.from_json(conf[, ...])

Deserialize check instance from JSON string.

CalibrationScore.metadata([with_doc_link])

Return check metadata.

CalibrationScore.name()

Name of class in split camel case.

CalibrationScore.params([show_defaults])

Return parameters to show when printing the check.

CalibrationScore.remove_condition(index)

Remove given condition by index.

CalibrationScore.run(dataset[, model, ...])

Run check.

CalibrationScore.run_logic(context, dataset_kind)

Run check.

CalibrationScore.to_json([indent, ...])

Serialize check instance to JSON string.

Examples#