Mean Average Recall Report#

This notebook provides an overview of using and understanding the Mean Average Recall Report check.

Structure:

What is the purpose of the check?
Imports
Generate Data and Model
Run the check
Observe the check's output
Define a condition

What is the purpose of the check?#

The Mean Average Recall Report evaluates the mAR metric on the given model and data, and returns the mAR values per bounding box size category (small, medium, large). This check only works on the Object Detection task.
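
To make the size categories concrete, the boundaries used in the result table further down are the standard COCO area thresholds (32^2 and 96^2 pixels). The helper below is only a simplified illustration of that bucketing, not the deepchecks implementation.

def area_size_category(width, height):
    # Bucket a bounding box by area using the COCO thresholds (illustration only).
    area = width * height
    if area < 32 ** 2:
        return 'Small'
    if area < 96 ** 2:
        return 'Medium'
    return 'Large'

# A 20x20 box is 'Small', a 50x50 box is 'Medium', and a 120x100 box is 'Large'.
print([area_size_category(w, h) for w, h in [(20, 20), (50, 50), (120, 100)]])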

Imports#

import numpy as np

from deepchecks.vision.checks import MeanAverageRecallReport
from deepchecks.vision.datasets.detection import coco

Generate Data and Model#

We use a sample dataset of 128 images from the COCO dataset, together with a pretrained YOLOv5 model.

For the label formatter, our dataset already returns labels in the accepted format, so the formatting function is simply the identity, lambda x: x.
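
If your dataset returned labels in a different layout, the formatting function would have to do real work. The hypothetical helper below converts rows of (x_min, y_min, x_max, y_max, class_id) into a (class_id, x_min, y_min, width, height) layout; it is only a sketch and not part of the deepchecks API, so check the deepchecks documentation for the exact accepted format.

import torch

def xyxy_to_class_xywh(labels):
    # Hypothetical converter: (x_min, y_min, x_max, y_max, class_id) rows
    # -> (class_id, x_min, y_min, width, height) rows.
    labels = torch.as_tensor(labels, dtype=torch.float32)
    x_min, y_min, x_max, y_max, class_id = labels.unbind(dim=1)
    return torch.stack([class_id, x_min, y_min, x_max - x_min, y_max - y_min], dim=1)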

yolo = coco.load_model(pretrained=True)

test_ds = coco.load_dataset(train=False, object_type='VisionData')

Run the check#

check = MeanAverageRecallReport()
result = check.run(test_ds, yolo)
result


If you have a GPU, you can speed up this check by passing it to .run() via the device argument, i.e. device=<your GPU>.
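For example, on a machine with a CUDA-capable GPU the call might look like this (commented out here, since it assumes GPU hardware is available):

#  import torch
#  result = check.run(test_ds, yolo, device=torch.device('cuda:0'))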

To display the results in an IDE like PyCharm, you can use the following code:

#  result.show_in_window()

The result will be displayed in a new window.

Observe the check’s output#

The result value is a dataframe listing the average recall score for each area range and for each maximum number of detections per image (AR@1, AR@10, AR@100).

result.value
Area size                      AR@1 (%)   AR@10 (%)   AR@100 (%)
All                            0.330552   0.423444    0.429179
Small (area < 32^2)            0.104955   0.220594    0.220594
Medium (32^2 < area < 96^2)    0.325099   0.417392    0.423844
Large (area > 96^2)            0.481611   0.544408    0.549963
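
Since the value is a plain pandas DataFrame, individual scores can be read with standard pandas indexing, assuming the row and column labels shown above:

# Average recall over all area sizes, allowing up to 100 detections per image
result.value.loc['All', 'AR@100 (%)']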


Define a condition#

We can define a condition that checks whether our model's average recall score is greater than a given threshold.

check = MeanAverageRecallReport().add_condition_test_average_recall_greater_than(0.4)
result = check.run(test_ds, yolo)
result.show(show_additional_outputs=False)
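
The condition is evaluated against the scores in the result dataframe. A rough manual equivalent, written against result.value rather than the deepchecks condition API (the built-in condition's exact logic may differ), would be:

threshold = 0.4
# True only if every reported average recall score exceeds the threshold.
passed = bool((result.value > threshold).all().all())
print(f'All average recall scores above {threshold}: {passed}')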


Total running time of the script: ( 0 minutes 31.368 seconds)
