MeanAveragePrecisionReport#

class MeanAveragePrecisionReport[source]#

Summarize mean average precision metrics on a dataset and model per IoU and bounding box area.

Parameters
area_range: tuple, default: (32**2, 96**2)

Area boundaries, in squared pixels, that slice bounding boxes into small/medium/large buckets.

n_samples: Optional[int], default: 10000

Number of samples to use for the check. If None, all samples will be used.

__init__(area_range: Tuple = (1024, 9216), n_samples: Optional[int] = 10000, **kwargs)[source]#
__new__(*args, **kwargs)#
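
A minimal construction sketch using the parameters above. The area boundaries simply spell out the defaults (the COCO small/medium/large convention); the `n_samples` value is an arbitrary illustration of capping the evaluated sample count.

```python
from deepchecks.vision.checks import MeanAveragePrecisionReport

# Defaults written out explicitly: boxes with area below 32**2 are "small",
# between 32**2 and 96**2 "medium", and above 96**2 "large".
check = MeanAveragePrecisionReport(
    area_range=(32**2, 96**2),
    n_samples=5_000,  # cap the number of evaluated samples; None uses every sample
)
```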

Methods

MeanAveragePrecisionReport.add_condition(...)

Add new condition function to the check.

MeanAveragePrecisionReport.add_condition_average_mean_average_precision_greater_than([...])

Add condition - average mAP for IoU values between 0.5 and 0.9 across all areas is greater than the given score.

MeanAveragePrecisionReport.add_condition_mean_average_precision_greater_than(...)

Add condition - mAP scores for the different area thresholds are greater than the given score.

MeanAveragePrecisionReport.clean_conditions()

Remove all conditions from this check instance.

MeanAveragePrecisionReport.compute(context, ...)

Compute the metric result using the ignite metrics compute method and create display.

MeanAveragePrecisionReport.conditions_decision(result)

Run conditions on given result.

MeanAveragePrecisionReport.config([...])

Return check configuration (conditions' configuration not yet supported).

MeanAveragePrecisionReport.from_config(conf)

Return check object from a CheckConfig object.

MeanAveragePrecisionReport.from_json(conf[, ...])

Deserialize check instance from JSON string.

MeanAveragePrecisionReport.initialize_run(context)

Initialize run by asserting task type and initializing metric.

MeanAveragePrecisionReport.metadata([...])

Return check metadata.

MeanAveragePrecisionReport.name()

Name of class in split camel case.

MeanAveragePrecisionReport.params([...])

Return parameters to show when printing the check.

MeanAveragePrecisionReport.remove_condition(index)

Remove given condition by index.

MeanAveragePrecisionReport.run(dataset[, ...])

Run check.

MeanAveragePrecisionReport.to_json([indent, ...])

Serialize check instance to JSON string.

MeanAveragePrecisionReport.update(context, ...)

Update the metrics by passing the batch to the ignite metric's update method.

Examples#
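
A minimal end-to-end sketch. The COCO sample helpers (`coco.load_dataset`, `coco.load_model`) and the `run(dataset, model)` call shape are assumptions matching older deepchecks.vision releases; substitute your own `VisionData` object and detection model, and adjust to the API of the deepchecks version you are running.

```python
from deepchecks.vision.checks import MeanAveragePrecisionReport
# Assumed sample-data helpers, used here only for illustration; replace with
# your own VisionData object and detection model if these loaders are unavailable.
from deepchecks.vision.datasets.detection import coco

test_ds = coco.load_dataset(train=False, object_type='VisionData')
model = coco.load_model(pretrained=True)

# Add a condition so the report can pass/fail automatically:
# average mAP must exceed 0.3.
check = MeanAveragePrecisionReport(n_samples=2_000) \
    .add_condition_average_mean_average_precision_greater_than(0.3)

result = check.run(test_ds, model)
result.show()  # render the per-IoU / per-area-size mAP report

# Inspect the condition outcome programmatically.
for condition_result in check.conditions_decision(result):
    print(condition_result)

# Checks serialize to JSON and can be reconstructed later.
as_json = check.to_json()
restored = MeanAveragePrecisionReport.from_json(as_json)
```

In a notebook the returned check result also renders inline, and `conditions_decision` reports whether each added mAP threshold was met.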