# Mean Average Precision Report

This notebook provides an overview of using and understanding the Mean Average Precision Report check.

Structure:

- What is the purpose of the check?
- Generate data and model
- Run the check
- Define a condition

## What Is the Purpose of the Check?

The Mean Average Precision Report evaluates the mAP metric on the given model and data, plots the AP values, and returns the mAP per bounding box size category (small, medium, large). This check works only on object detection tasks.
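As a reminder of what the metric measures: AP summarizes a detector's precision-recall curve at a given IoU threshold, and mAP averages AP over classes. The sketch below is a simplified illustration of the idea (not deepchecks' internal implementation, which follows the full COCO evaluation protocol): interpolated precision is sampled at 101 evenly spaced recall levels and averaged.

```python
import numpy as np

def average_precision(recalls, precisions):
    """Simplified COCO-style AP for one class at one IoU threshold.
    Illustrative sketch only, not deepchecks' internal code."""
    recalls = np.asarray(recalls, dtype=float)
    precisions = np.asarray(precisions, dtype=float)
    # Interpolation step: make precision monotonically non-increasing
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    # Sample interpolated precision at 101 recall levels (0.00, 0.01, ..., 1.00)
    recall_levels = np.linspace(0.0, 1.0, 101)
    sampled = np.array([
        precisions[recalls >= r].max() if (recalls >= r).any() else 0.0
        for r in recall_levels
    ])
    return sampled.mean()

# Toy precision-recall curve with three operating points
ap = average_precision([0.1, 0.4, 0.8], [1.0, 0.9, 0.6])
```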

## Generate Data and Model

We generate a sample dataset of 128 images from the COCO dataset and use a pretrained YOLOv5 model.

```python
from deepchecks.vision.checks.performance import MeanAveragePrecisionReport
from deepchecks.vision.datasets.detection import coco

# Load the pretrained YOLOv5 model and the 128-image COCO sample
yolo = coco.load_model(pretrained=True)
test_ds = coco.load_dataset(train=False, object_type='VisionData')
```

## Run the Check

```python
check = MeanAveragePrecisionReport()
result = check.run(test_ds, yolo)
result
```




#### Mean Average Precision Report

Summarize mean average precision metrics on a dataset and model per IoU and bounding box area.

| Area size | mAP@[.50::.95] (avg.%) | mAP@.50 (%) | mAP@.75 (%) |
|---|---|---|---|
| All | 0.41 | 0.57 | 0.43 |
| Small (area < 32^2) | 0.21 | 0.34 | 0.21 |
| Medium (32^2 < area < 96^2) | 0.38 | 0.60 | 0.35 |
| Large (area > 96^2) | 0.54 | 0.67 | 0.59 |

### Observe the Check's Output

The result value is a dataframe containing the Mean Average Precision scores for the different bounding box area sizes. The mAP is reported at IoU thresholds of 0.5 and 0.75, and as the average of the mAP values at IoU thresholds from 0.5 to 0.95 in steps of 0.05.

```python
result.value
```

| Area size | mAP@[.50::.95] (avg.%) | mAP@.50 (%) | mAP@.75 (%) |
|---|---|---|---|
| All | 0.409436 | 0.566673 | 0.425339 |
| Small (area < 32^2) | 0.212816 | 0.342429 | 0.212868 |
| Medium (32^2 < area < 96^2) | 0.383089 | 0.600228 | 0.349863 |
| Large (area > 96^2) | 0.541146 | 0.674493 | 0.585378 |
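Since the result value is a plain pandas DataFrame, individual scores can be pulled out programmatically. The sketch below rebuilds the table above by hand for illustration (the numbers are copied from the report; in practice you would use `result.value` directly, and the variable names here are our own):

```python
import numpy as np
import pandas as pd

# Hand-built copy of the report's table; in practice this is `result.value`
map_df = pd.DataFrame(
    {
        "mAP@[.50::.95] (avg.%)": [0.409436, 0.212816, 0.383089, 0.541146],
        "mAP@.50 (%)": [0.566673, 0.342429, 0.600228, 0.674493],
        "mAP@.75 (%)": [0.425339, 0.212868, 0.349863, 0.585378],
    },
    index=pd.Index(
        ["All", "Small (area < 32^2)", "Medium (32^2 < area < 96^2)",
         "Large (area > 96^2)"],
        name="Area size",
    ),
)

# The "(avg.%)" column averages over the ten COCO IoU thresholds
iou_thresholds = np.linspace(0.5, 0.95, 10)  # 0.50, 0.55, ..., 0.95

# Read off a single cell, e.g. the overall averaged mAP
overall_map = map_df.loc["All", "mAP@[.50::.95] (avg.%)"]
```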

## Define a Condition

We can define a condition that checks whether our model’s mean average precision score is not less than a given threshold for all bounding box sizes.

```python
check = MeanAveragePrecisionReport().add_condition_average_mean_average_precision_not_less_than(0.4)
result = check.run(test_ds, yolo)
```
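Conceptually, a condition just maps the check's result value to a pass/fail verdict. The snippet below sketches the logic described above, with the report's numbers hard-coded for illustration (a simplified sketch, not deepchecks' source code):

```python
# Simplified sketch of the condition's logic: it fails if any area-size
# category's averaged mAP falls below the threshold.
threshold = 0.4
map_per_area = {  # values taken from the report above
    "All": 0.409436,
    "Small (area < 32^2)": 0.212816,
    "Medium (32^2 < area < 96^2)": 0.383089,
    "Large (area > 96^2)": 0.541146,
}
failed = [area for area, score in map_per_area.items() if score < threshold]
passed = not failed
```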



Total running time of the script: 0 minutes 35.150 seconds
