ModelInferenceTime#

class ModelInferenceTime[source]#

Measure the model's average inference time (in seconds) per sample.

Parameters
n_samples : int, default: 1_000

Number of samples to use for this check.

random_state : int, default: 42

Random seed for all check internals.

__init__(n_samples: int = 1000, random_state: int = 42, **kwargs)[source]#
__new__(*args, **kwargs)#
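
A minimal construction sketch. The import path below is an assumption (it follows the tabular checks module layout) and may differ between deepchecks versions:

>>> from deepchecks.tabular.checks import ModelInferenceTime  # assumed import path
>>> # Fewer samples give a faster but noisier timing estimate;
>>> # the seed keeps the sampling reproducible.
>>> check = ModelInferenceTime(n_samples=500, random_state=42)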

Methods

ModelInferenceTime.add_condition(name, ...)

Add new condition function to the check.

ModelInferenceTime.add_condition_inference_time_less_than([value])

Add condition - the average model inference time (in seconds) per sample is less than the given threshold.

ModelInferenceTime.clean_conditions()

Remove all conditions from this check instance.

ModelInferenceTime.conditions_decision(result)

Run conditions on given result.

ModelInferenceTime.config([include_version, ...])

Return check configuration (conditions' configuration not yet supported).

ModelInferenceTime.from_config(conf[, ...])

Return check object from a CheckConfig object.

ModelInferenceTime.from_json(conf[, ...])

Deserialize check instance from JSON string.

ModelInferenceTime.metadata([with_doc_link])

Return check metadata.

ModelInferenceTime.name()

Return the name of the class in split camel case.

ModelInferenceTime.params([show_defaults])

Return parameters to show when printing the check.

ModelInferenceTime.remove_condition(index)

Remove given condition by index.

ModelInferenceTime.run(dataset[, model, ...])

Run check.

ModelInferenceTime.run_logic(context, ...)

Run the check logic within the given context.

ModelInferenceTime.to_json([indent, ...])

Serialize check instance to JSON string.

Examples#
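
A hedged end-to-end sketch. It assumes a deepchecks tabular Dataset named test_dataset and a fitted model named model (both hypothetical placeholders), and the same assumed import path as above; exact import paths and result rendering may differ between deepchecks versions:

>>> from deepchecks.tabular.checks import ModelInferenceTime  # assumed import path
>>> # test_dataset and model are placeholders for your own Dataset and fitted model.
>>> check = ModelInferenceTime(n_samples=1_000, random_state=42)
>>> check = check.add_condition_inference_time_less_than(value=0.001)  # fail at 1 ms or more per sample
>>> result = check.run(test_dataset, model)
>>> result.value   # average inference time in seconds per sample
>>> result.show()  # render the check display and condition outcome

Because to_json and from_json are listed on the check, a configured instance can also be round-tripped through JSON; a small sketch under the same assumptions:

>>> as_json = check.to_json()
>>> restored = ModelInferenceTime.from_json(as_json)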