Mean Average Precision Report#

This notebook provides an overview of using and understanding the Mean Average Precision Report check.

Structure:

- What Is the Purpose of the Check?
- Generate Data and Model
- Run the check
- Observe the check's output
- Define a condition

What Is the Purpose of the Check?#

The Mean Average Precision Report evaluates the mAP metric on the given model and data, plots the AP values on a graph, and returns the mAP values per bounding box size category (small, medium, large). This check works only on the Object Detection task.
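The size buckets follow the COCO convention, in which a box's pixel area determines its category. As a minimal illustration (the 32^2 and 96^2 cutoffs are the standard COCO thresholds; the helper itself is hypothetical, not part of the deepchecks API):

# Illustrative helper (not part of deepchecks): bucket a bounding box
# by area using the standard COCO size thresholds.
def coco_area_category(width: float, height: float) -> str:
    area = width * height
    if area < 32 ** 2:
        return 'small'
    if area < 96 ** 2:
        return 'medium'
    return 'large'

print(coco_area_category(20, 20))    # small
print(coco_area_category(50, 50))    # medium
print(coco_area_category(120, 120))  # large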

Generate Data and Model#

We use a sample dataset of 128 images from the COCO dataset and a pretrained YOLOv5 model.

from deepchecks.vision.checks import MeanAveragePrecisionReport
from deepchecks.vision.datasets.detection import coco

yolo = coco.load_model(pretrained=True)
test_ds = coco.load_dataset(train=False, object_type='VisionData')
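The object_type argument controls what the sample loader returns; per the deepchecks sample-data helpers it can also hand back the underlying torch DataLoader, which is convenient for inspecting raw batches (treat the option name as an assumption if your version differs):

# Assumption: object_type='DataLoader' returns the raw torch DataLoader
# instead of the wrapped VisionData object.
raw_loader = coco.load_dataset(train=False, object_type='DataLoader')
batch = next(iter(raw_loader))  # peek at one raw batch of images and labels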

Run the check#

check = MeanAveragePrecisionReport()
result = check.run(test_ds, yolo)
result
[progress bars and the "Mean Average Precision Report" result display]


If you have a GPU, you can speed up this check by passing it to .run() via the device argument (device=<your GPU>).
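For example (a sketch assuming a CUDA-capable machine; the device string is anything torch accepts):

#  result = check.run(test_ds, yolo, device='cuda:0')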

To display the results in an IDE like PyCharm, you can use the following code:

#  result.show_in_window()

The result will be displayed in a new window.

Observe the check’s output#

The result value is a dataframe holding the Mean Average Precision score for different bounding box area sizes. We report the mAP at IoU thresholds of 0.5 and 0.75, as well as the average of the mAP values over IoU thresholds between 0.5 and 0.95 (in steps of 0.05).

result.value
                             mAP@[.50::.95] (avg.%)  mAP@.50 (%)  mAP@.75 (%)
Area size
All                                        0.409436     0.566673     0.425339
Small (area < 32^2)                        0.212816     0.342429     0.212868
Medium (32^2 < area < 96^2)                0.383089     0.600228     0.349863
Large (96^2 < area)                        0.541146     0.674493     0.585378
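Since the value is a regular pandas DataFrame, individual scores can be pulled out for further processing; for example (row and column labels as rendered above):

# Pull one score out of the report dataframe.
small_map = result.value.loc['Small (area < 32^2)', 'mAP@[.50::.95] (avg.%)']
print(f'Average mAP for small boxes: {small_map:.3f}')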


Define a condition#

We can define a condition that checks whether the model's mean average precision score exceeds a given threshold for all bounding box sizes.

check = MeanAveragePrecisionReport().add_condition_average_mean_average_precision_greater_than(0.4)
result = check.run(test_ds, yolo)
result.show(show_additional_outputs=False)
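To see why the condition result looks the way it does, the same comparison can be reproduced by hand against the report dataframe (a sketch of the logic, not the deepchecks condition implementation):

threshold = 0.4
avg_col = 'mAP@[.50::.95] (avg.%)'
# Compare each area size's averaged mAP against the threshold.
failing = result.value[result.value[avg_col] <= threshold]
print('Condition passed' if failing.empty else f'Failing sizes:\n{failing[avg_col]}')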


Total running time of the script: (0 minutes 31.867 seconds)
