.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "user-guide/vision/auto_quickstarts/plot_classification_tutorial.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_user-guide_vision_auto_quickstarts_plot_classification_tutorial.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_user-guide_vision_auto_quickstarts_plot_classification_tutorial.py:

.. _vision_classification_tutorial:

==============================================
Classification Model Validation Tutorial
==============================================

In this tutorial, you will learn how to validate your **classification model** using deepchecks test suites.
You can read more about the different checks and suites for computer vision use cases in the examples section
of the documentation.

A classification model classifies an image into one of a set of classes. Although there are multi-label use
cases, in which the model assigns multiple classes to an image, most use cases require the model to assign a
single class to each image. Currently, deepchecks supports only single-label classification (either binary
or multi-class).

.. code-block:: ipython3

    # Before we start, if you don't have the deepchecks vision package installed yet, run:
    import sys
    !{sys.executable} -m pip install "deepchecks[vision]" --quiet --upgrade # --user

    # or install using pip from your python environment

.. GENERATED FROM PYTHON SOURCE LINES 27-29

Defining the data and model
===========================

.. GENERATED FROM PYTHON SOURCE LINES 29-49

.. code-block:: default

    # Importing the required packages

    import os
    import urllib.request
    import zipfile

    import albumentations as A
    import matplotlib.pyplot as plt
    import numpy as np
    import PIL.Image
    import torch
    import torchvision
    from albumentations.pytorch import ToTensorV2
    from torch import nn
    from torchvision.datasets import ImageFolder

    from deepchecks.vision.classification_data import ClassificationData

.. GENERATED FROM PYTHON SOURCE LINES 50-53

Downloading the dataset
~~~~~~~~~~~~~~~~~~~~~~~
The data is available from the torch library. We will download and extract it to the current directory.

.. GENERATED FROM PYTHON SOURCE LINES 53-59

.. code-block:: default

    url = 'https://download.pytorch.org/tutorial/hymenoptera_data.zip'
    urllib.request.urlretrieve(url, 'hymenoptera_data.zip')

    with zipfile.ZipFile('hymenoptera_data.zip', 'r') as zip_ref:
        zip_ref.extractall('.')

.. GENERATED FROM PYTHON SOURCE LINES 60-68

Load Data
~~~~~~~~~

We will use the torchvision and torch.utils.data packages for loading the data.
The model we are building will learn to classify **ants** and **bees**.
We have about 120 training images each for ants and bees.
There are 75 validation images for each class.
This dataset is a very small subset of ImageNet.

.. GENERATED FROM PYTHON SOURCE LINES 68-142

.. code-block:: default

    class AntsBeesDataset(ImageFolder):
        """ImageFolder dataset that yields cv2-style (numpy) images, compatible with albumentations."""

        def __getitem__(self, index: int):
            """Override __getitem__ to be compatible with albumentations.

            Args:
                index (int): Index

            Returns:
                tuple: (sample, target) where target is class_index of the target class.
            """
            path, target = self.samples[index]
            sample = self.loader(path)
            sample = self.get_cv2_image(sample)
            if self.transforms is not None:
                transformed = self.transforms(image=sample, target=target)
                sample, target = transformed["image"], transformed["target"]
            else:
                if self.transform is not None:
                    sample = self.transform(image=sample)['image']
                if self.target_transform is not None:
                    target = self.target_transform(target)
            return sample, target

        def get_cv2_image(self, image):
            """Convert a loaded image to a numpy (cv2-style) array."""
            if isinstance(image, PIL.Image.Image):
                return np.array(image).astype('uint8')
            elif isinstance(image, np.ndarray):
                return image
            else:
                raise RuntimeError("Only PIL.Image and CV2 loaders currently supported!")

    data_dir = 'hymenoptera_data'

    # Resize, center-crop and normalize the images (the same transform is used for both splits)
    data_transforms = A.Compose([
        A.Resize(height=256, width=256),
        A.CenterCrop(height=224, width=224),
        A.Normalize(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)),
        ToTensorV2(),
    ])

    train_dataset = AntsBeesDataset(root=os.path.join(data_dir, 'train'))
    train_dataset.transforms = data_transforms

    val_dataset = AntsBeesDataset(root=os.path.join(data_dir, 'val'))
    val_dataset.transforms = data_transforms

    dataloaders = {
        'train': torch.utils.data.DataLoader(train_dataset, batch_size=4, shuffle=True),
        'val': torch.utils.data.DataLoader(val_dataset, batch_size=4, shuffle=True)
    }

    class_names = ['ants', 'bees']

    device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

.. GENERATED FROM PYTHON SOURCE LINES 143-146

Visualize a Few Images
~~~~~~~~~~~~~~~~~~~~~~
Let's visualize a few training images so as to understand the data augmentation.

.. GENERATED FROM PYTHON SOURCE LINES 146-168

.. code-block:: default

    def imshow(inp, title=None):
        """Imshow for Tensor."""
        inp = inp.numpy().transpose((1, 2, 0))
        mean = np.array([0.485, 0.456, 0.406])
        std = np.array([0.229, 0.224, 0.225])
        # Undo the normalization for display
        inp = std * inp + mean
        inp = np.clip(inp, 0, 1)
        plt.imshow(inp)
        if title is not None:
            plt.title(title)
        plt.pause(0.001)  # pause a bit so that plots are updated


    # Get a batch of training data
    inputs, classes = next(iter(dataloaders['train']))

    # Make a grid from batch
    out = torchvision.utils.make_grid(inputs)

    imshow(out, title=[class_names[x] for x in classes])

.. image-sg:: /user-guide/vision/auto_quickstarts/images/sphx_glr_plot_classification_tutorial_001.png
   :alt: ['ants', 'ants', 'bees', 'ants']
   :srcset: /user-guide/vision/auto_quickstarts/images/sphx_glr_plot_classification_tutorial_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 169-176

.. image:: /_static/images/tutorials/ants-bees.png
   :width: 400
   :alt: Ants and Bees

Downloading a pre-trained model
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Now we will download a pre-trained model from torchvision that was trained on the ImageNet dataset.

.. GENERATED FROM PYTHON SOURCE LINES 176-184

.. code-block:: default

    model = torchvision.models.resnet18(pretrained=True)
    num_ftrs = model.fc.in_features
    # We have only 2 classes
    model.fc = nn.Linear(num_ftrs, 2)
    model = model.to(device)
    _ = model.eval()

.. rst-class:: sphx-glr-script-out
 .. code-block:: none

    Downloading: "https://download.pytorch.org/models/resnet18-f37072fd.pth" to /home/runner/.cache/torch/hub/checkpoints/resnet18-f37072fd.pth

Before wrapping the data for deepchecks, let's look at what the dataloader yields: the first element of a
batch is a tensor of 4 images, and the second is a tensor of the 4 matching labels:

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    First element is: <class 'torch.Tensor'> with len of 4
    Example output of an image shape from the dataloader torch.Size([3, 224, 224])
    Image values tensor([[[-0.79930, -0.90205, -0.95342,  ...,  2.18041,  2.23178,  2.21466],
             [-0.71367, -0.79930, -0.83355,  ...,  2.04341,  1.54679,  1.64954],
             [-0.54243, -0.62805, -0.69655,  ...,  2.21466,  1.94066,  1.61529],
             ...,
             [-0.81642, -1.10754, -1.38154,  ..., -0.06293, -0.13143, -0.26843],
             [-0.90205, -1.10754, -1.24454,  ..., -0.38830, -0.47393, -0.59380],
             [-0.90205, -1.05617, -1.07329,  ..., -0.74792, -0.71367, -0.78217]],

            [[ 0.18768,  0.04762, -0.09244,  ...,  2.41106,  2.41106,  2.41106],
             [ 0.34524,  0.20518,  0.08263,  ...,  2.34104,  1.76331,  1.72829],
             [ 0.55532,  0.41527,  0.29272,  ...,  2.39356,  2.18347,  1.50070],
             ...,
             [-0.12745, -0.44258, -0.70518,  ...,  0.52031,  0.45028,  0.36275],
             [-0.17997, -0.37255, -0.49510,  ...,  0.22269,  0.10014,  0.01261],
             [-0.10994, -0.26751, -0.28501,  ...,  0.03011, -0.09244, -0.21499]],

            [[-1.14214, -1.19442, -1.21185,  ...,  2.57028,  2.60514,  2.34370],
             [-1.05499, -1.14214, -1.15956,  ...,  2.58771,  2.08227,  1.52453],
             [-0.93298, -1.03756, -1.07242,  ...,  2.46571,  2.34370,  1.97769],
             ...,
             [-1.29900, -1.54301, -1.75216,  ..., -0.60183, -0.60183, -0.56697],
             [-1.36871, -1.54301, -1.68244,  ..., -0.81098, -0.89813, -0.86327],
             [-1.43843, -1.61272, -1.68244,  ..., -1.05499, -1.14214, -1.10728]]])
    --------------------------------------------------------------------------------
    Second element is: <class 'torch.Tensor'> with len of 4
    Example output of a label shape from the dataloader torch.Size([])
    Image values tensor(1)

.. GENERATED FROM PYTHON SOURCE LINES 207-213

Implementing the ClassificationData class
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The first step is to implement a class that enables deepchecks to interact with your model and data.
The appropriate class to implement should be selected according to your model's task type. In this tutorial,
we will implement the classification task type by implementing a class that inherits from the
:class:`deepchecks.vision.classification_data.ClassificationData` class.

.. GENERATED FROM PYTHON SOURCE LINES 213-252

.. code-block:: default

    # The goal of this class is to make sure the outputs of the model and of the dataloader are in the correct format.
    # To learn more about the expected formats please visit the API reference for the
    # :class:`deepchecks.vision.classification_data.ClassificationData` class.

    class AntsBeesData(ClassificationData):

        def batch_to_images(self, batch):
            """
            Convert a batch of data to images in the expected format. The expected format is an iterable of cv2 images,
            where each image is a numpy array of shape (height, width, channels). The numbers in the array should be in
            the range [0, 255].
            """
            inp = batch[0].detach().numpy().transpose((0, 2, 3, 1))
            mean = [0.485, 0.456, 0.406]
            std = [0.229, 0.224, 0.225]
            # Undo the normalization and rescale to [0, 255]
            inp = std * inp + mean
            inp = np.clip(inp, 0, 1)
            return inp * 255

        def batch_to_labels(self, batch):
            """
            Convert a batch of data to labels in the expected format. The expected format is a tensor of shape (N,),
            where N is the number of samples. Each element is an integer representing the class index.
            """
            return batch[1]

        def infer_on_batch(self, batch, model, device):
            """
            Return the predictions for a batch of data. The expected format is a tensor of shape (N, n_classes),
            where N is the number of samples and each row holds the predicted probability of each class.
            """
            logits = model.to(device)(batch[0].to(device))
            return nn.Softmax(dim=1)(logits)
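
The formatters can also be exercised directly, which is a quick way to catch shape or scale mistakes before
running the built-in validation. A minimal sketch, assuming the class above (``label_map`` is optional at
construction time):

.. code-block:: default

    _data = AntsBeesData(data_loader=dataloaders['train'])
    _batch = next(iter(dataloaders['train']))
    _images = _data.batch_to_images(_batch)
    # Expect channels-last images with values in [0, 255]
    print(_images[0].shape)              # (224, 224, 3)
    print(_images.min(), _images.max())  # should stay within [0, 255]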

.. GENERATED FROM PYTHON SOURCE LINES 253-254

After defining the task class, we can validate it by running the following code:

.. GENERATED FROM PYTHON SOURCE LINES 254-265

.. code-block:: default

    LABEL_MAP = {
        0: 'ants',
        1: 'bees'
    }
    training_data = AntsBeesData(data_loader=dataloaders["train"], label_map=LABEL_MAP)
    val_data = AntsBeesData(data_loader=dataloaders["val"], label_map=LABEL_MAP)
    training_data.validate_format(model)
    val_data.validate_format(model)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Deepchecks will try to validate the extractors given...
    Structure validation
    --------------------
    Label formatter: Pass!
    Prediction formatter: Pass!
    Image formatter: Pass!
    Content validation
    ------------------
    For validating the content within the structure you have to manually observe the classes, image, label and prediction.
    Examples of classes observed in the batch's labels: [[1], [0], [1], [1]]
    Visual images & label & prediction: should open in a new window
    *******************************************************************************
    This machine does not support GUI
    The formatted image was saved in:
    /home/runner/work/deepchecks/deepchecks/docs/source/user-guide/vision/quickstarts/deepchecks_formatted_image (2).jpg
    Visual example of an image. Label class 1 Prediction class 0
    validate_extractors can be set to skip the image saving or change the save path
    *******************************************************************************
    Deepchecks will try to validate the extractors given...
    Structure validation
    --------------------
    Label formatter: Pass!
    Prediction formatter: Pass!
    Image formatter: Pass!
    Content validation
    ------------------
    For validating the content within the structure you have to manually observe the classes, image, label and prediction.
    Examples of classes observed in the batch's labels: [[1], [1], [1], [1]]
    Visual images & label & prediction: should open in a new window
    *******************************************************************************
    This machine does not support GUI
    The formatted image was saved in:
    /home/runner/work/deepchecks/deepchecks/docs/source/user-guide/vision/quickstarts/deepchecks_formatted_image (3).jpg
    Visual example of an image. Label class 1 Prediction class 0
    validate_extractors can be set to skip the image saving or change the save path
    *******************************************************************************

.. GENERATED FROM PYTHON SOURCE LINES 266-272

Observe the output above: the structure validation passed for both datasets, and the saved example images let
us manually confirm that the image, label and prediction extractors behave as expected.

Running Deepchecks' full suite on our data and model!
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Now that we have defined the task class, we can validate the model with the full suite of deepchecks.
This takes only a few lines of code:

.. GENERATED FROM PYTHON SOURCE LINES 272-278

.. code-block:: default

    from deepchecks.vision.suites import full_suite

    suite = full_suite()
    result = suite.run(training_data, val_data, model, device=device)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Validating Input: |#####| 1/1 [Time: 00:00]
    Ingesting Batches - Train Dataset: |#############################################################| 61/61 [Time: 00:08]
    Computing Single Dataset Checks - Train Dataset: |#######| 7/7 [Time: 00:00, Check=Property Label Correlation]
    Ingesting Batches - Test Dataset: |#######################################| 39/39 [Time: 00:05]
    Computing Single Dataset Checks - Test Dataset: |#######| 7/7 [Time: 00:00, Check=Property Label Correlation]
    Default parameter min_samples_leaf will change in version 2.6.See https://github.com/scikit-learn-contrib/category_encoders/issues/327
    Default parameter smoothing will change in version 2.6.See https://github.com/scikit-learn-contrib/category_encoders/issues/327
    Computing Checks: |##########| 10/10 [Time: 00:01, Check=Property Label Correlation Change]
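
The full suite bundles many checks that run together; when iterating on a specific issue it is often faster
to run a single check on its own. A minimal sketch, assuming the built-in ``ClassPerformance`` check exposed
under ``deepchecks.vision.checks`` (other checks follow the same pattern):

.. code-block:: default

    from deepchecks.vision.checks import ClassPerformance

    # Run one train/test check instead of the whole suite
    check_result = ClassPerformance().run(training_data, val_data, model, device=device)
    check_result.save_as_html('class_performance.html')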

.. GENERATED FROM PYTHON SOURCE LINES 279-282

Observing the results
~~~~~~~~~~~~~~~~~~~~~
The results can be saved as an HTML file with the following code:

.. GENERATED FROM PYTHON SOURCE LINES 282-285

.. code-block:: default

    result.save_as_html('output.html')

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    'output (2).html'

.. GENERATED FROM PYTHON SOURCE LINES 286-287

Or, if working inside a notebook, the output can be displayed directly by printing the result object:

.. GENERATED FROM PYTHON SOURCE LINES 287-289

.. code-block:: default

    result
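
Beyond the rendered report, the suite result can be inspected programmatically, for example to list which
checks ran. A small sketch, assuming the individual check results are kept in the suite result's ``results``
attribute (as in the deepchecks API used here):

.. code-block:: default

    # Print the header of every check result gathered by the suite
    # (get_header() is assumed to return the check's display name)
    for check_result in result.results:
        print(check_result.get_header())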
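Finally, if the full suite is more than you need, you can compose a custom suite from selected checks. A
sketch, assuming the ``Suite`` container from ``deepchecks.vision`` and check names matching the report
above:

.. code-block:: default

    from deepchecks.vision import Suite
    from deepchecks.vision.checks import ClassPerformance, TrainTestLabelDrift

    # A narrower suite with just two checks
    custom_suite = Suite('Ants & Bees Checks',
                         ClassPerformance(),
                         TrainTestLabelDrift())
    custom_result = custom_suite.run(training_data, val_data, model, device=device)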


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 21.210 seconds)

.. _sphx_glr_download_user-guide_vision_auto_quickstarts_plot_classification_tutorial.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_classification_tutorial.py <plot_classification_tutorial.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_classification_tutorial.ipynb <plot_classification_tutorial.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_