Version: v0.3.0

Tutorial 1A: Annotation task: Text classification

Overview

This tutorial walks through specifying an annotation task rubric and annotating a text classification dataset. To illustrate the process, we annotate a dataset containing user reviews (in text format) and ratings (from 0 to 5) of Amazon products. The tutorial also briefly covers how to download the fully annotated dataset in a format supported by H2O Hydrogen Torch.

Step 1: Explore dataset

We are going to use the preloaded amazon-reviews-demo dataset for this tutorial. The dataset contains 180 text samples, each a review of an Amazon product. Let's quickly explore the dataset.

  1. On the H2O Label Genie navigation menu, click Datasets.
  2. In the datasets table, click amazon-reviews-demo.
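Conceptually, the dataset is a table of review texts and ratings. As an illustration only (the actual file layout of amazon-reviews-demo is not shown in this tutorial; the rows and the rating column name below are our own stand-ins, though the text column is called comment per Step 2), here is how you might peek at such a dataset outside Label Genie:

```python
import csv
import io

# Hypothetical excerpt of a reviews dataset. The "comment" column matches the
# column selected in Step 2; "rating" and the sample rows are assumptions.
raw = """comment,rating
Arrived quickly and works great,5
Stopped working after two days,1
Decent value for the price,4
"""

# Parse the CSV text and print a quick summary of the samples.
rows = list(csv.DictReader(io.StringIO(raw)))
print(f"{len(rows)} samples")
for r in rows:
    print(f"rating={r['rating']}: {r['comment']}")
```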

Step 2: Create an annotation task

Now that we have explored the dataset, let's create an annotation task to label it. An annotation task refers to the process of labeling data; in this tutorial, it is a text classification annotation task, which assigns one or more categorical target labels to an input text. Let's create the annotation task.

  1. Click New annotation task.
  2. In the Task name box, enter tutorial-1a.
  3. In the Task description box, enter Annotate a dataset containing reviews from Amazon products.
  4. In the Select task list, select Classification.
  5. In the Select text column box, select comment.
  6. Click Create task.

Step 3: Specify annotation task rubric

For the purposes of this tutorial, we use a review's comment to determine whether the customer was happy or unhappy with the product (purchase).

  1. In the New class name box, enter Happy.
  2. Click Add.
  3. Click Add class.
  4. In the New class name box, enter Unhappy.
  5. Click Add.
  6. Click Continue to annotate.
Note

H2O Label Genie supports multi-label text classification annotation tasks.
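Because this dataset also includes numeric ratings, you could later sanity-check your annotations against them with a simple heuristic that maps a rating to one of the two rubric classes. The mapping and the threshold below are our own illustration, not a Label Genie feature:

```python
def rating_to_class(rating: int, threshold: int = 3) -> str:
    """Map a 0-5 star rating to a rubric class.

    Illustrative heuristic only: the threshold is an assumption,
    not something defined by the tutorial or by Label Genie.
    """
    return "Happy" if rating >= threshold else "Unhappy"

print(rating_to_class(5))  # Happy
print(rating_to_class(1))  # Unhappy
```

Comparing such a heuristic against your manual labels can reveal reviews where the text and the star rating disagree.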

Step 4: Annotate dataset

Now that we have specified the annotation task rubric, let's annotate the dataset.

In the Annotate tab, you can individually annotate each review (text) in the dataset. Let's annotate the first review.

  1. A zero-shot learning model is enabled by default for text classification annotation tasks. The model accelerates the annotation (labeling) process by providing the probability that a text (in this case, a review) belongs to each of the labels defined in the Rubric tab.

    You can immediately start annotating in the Annotate tab or wait until the zero-shot model is ready to provide annotation suggestions. H2O Label Genie notifies you to Refresh the instance when zero-shot predictions (suggestions) are available.


    For example, after refreshing the instance in this tutorial, the model provided probabilities for each label.

    (Screenshot: an annotated review with the Unhappy class selected.)

    Note
    • To learn about the utilized model for a text classification annotation task, see Zero-shot learning models: Text classification.
    • During the annotation process of a text classification dataset, you can download the generated zero-shot predictions (probabilities) in the Export tab. To download all generated zero-shot predictions, follow these steps:
      caution
      • If the Enable zero-shot predictions setting is turned Off, the zero-shot learning model is not available during the annotation process, and no zero-shot predictions (probabilities) are generated. To turn On the Enable zero-shot predictions setting, see Enable zero-shot predictions.
      • The time it takes H2O Label Genie to generate zero-shot predictions depends on the computational resources of the instance.
      1. Click the Export tab.
      2. In the Export zero-shot predictions list, select Download ZIP.
  2. Click Save and next.

    Note
    • Save and next saves the annotated review (sample).
    • To skip a review and annotate it later, click Skip.
      • Skipped reviews (samples) reappear after all non-skipped reviews are annotated.
    • To download all samples annotated so far, follow these steps:
      1. Click the Export tab.
      2. In the Export approved samples list, select Download ZIP.
        Note

        H2O Label Genie downloads a zip file containing the annotated dataset in a format that is supported in H2O Hydrogen Torch. To learn more, see Downloaded dataset formats: Text classification.
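Conceptually, a zero-shot suggestion is just a probability per rubric label, summing to 1. Label Genie's actual model is not exposed in this tutorial; the toy scorer below only mimics the shape of those suggestions (keyword counts passed through a softmax) and is not the real zero-shot model:

```python
import math

def softmax(scores):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy keyword evidence per rubric label -- purely illustrative; Label Genie's
# zero-shot model is a learned classifier, not a keyword matcher.
KEYWORDS = {
    "Happy": {"great", "love", "works"},
    "Unhappy": {"broken", "refund", "stopped"},
}

def suggest(review: str) -> dict:
    """Return a label -> probability mapping for one review."""
    words = set(review.lower().split())
    labels = list(KEYWORDS)
    scores = [len(words & KEYWORDS[label]) for label in labels]
    return dict(zip(labels, softmax(scores)))

probs = suggest("Stopped working, asking for a refund")
print(probs)  # higher probability on Unhappy
```

In the real UI, these per-label probabilities appear next to each review, and you confirm or override the suggested class.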

Download annotated dataset

After annotating all the reviews, you can download the dataset in a format that H2O Hydrogen Torch supports. Let's download the annotated dataset.

  1. In the Annotate tab, click Export approved samples. (Screenshot: notification that the annotation task is complete and approved samples are ready to export.)
  2. In the Export approved samples list, select Download ZIP.
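The downloaded ZIP contains the annotated dataset; its exact layout is described in Downloaded dataset formats: Text classification. The sketch below shows only the general pattern of reading a CSV out of a ZIP archive, with the file name and column names as placeholders rather than the documented format:

```python
import csv
import io
import zipfile

# Build a stand-in ZIP in memory so the example is self-contained.
# "annotations.csv" and its columns are placeholders, not the real layout.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "annotations.csv",
        "comment,label\nArrived quickly,Happy\nBroke on day two,Unhappy\n",
    )

# Read the CSV back out of the archive, as you might with the real export.
buf.seek(0)
with zipfile.ZipFile(buf) as zf:
    with zf.open("annotations.csv") as f:
        rows = list(csv.DictReader(io.TextIOWrapper(f, encoding="utf-8")))

for row in rows:
    print(row["label"], "<-", row["comment"])
```

For the real export, consult the linked format reference for the actual file and column names before writing any parsing code.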

Summary

In this tutorial, we learned the process of annotating and specifying an annotation task rubric for a text classification task. We also learned how to download a fully annotated dataset supported in H2O Hydrogen Torch.

Next

To learn the process of specifying an annotation task rubric and annotating datasets for other annotation tasks in computer vision (CV), natural language processing (NLP), and audio, see Tutorials.

