Feature Spotlight: Meet Your Quality Thresholds with Review Steps

Have you heard of the pointing and calling method? It’s a Japanese safety technique reported to reduce errors by as much as 83%. It’s most prominently used by Japanese railway employees, and it’s exactly what the name implies: one employee points at an item or process of interest (e.g., the speedometer) and calls out the reading (the train’s speed, in this example) to a teammate. The teammate acknowledges the call and confirms whether anything is wrong. If all is good, they move on to the next safety check until they’re done. It’s an incredibly simple but effective way to make sure a train gets safely from point A to point B.

You may be wondering: “What does this have to do with data annotation?”

The same principle that makes pointing and calling so powerful applies when you want to get the maximum possible quality for your annotated data. Specifically, requiring an additional level of quality checks by two or more employees can help ensure a high standard of quality for your data annotations.

Our self-service data platform Sama Go includes a point-and-call-like review process that can serve as a first-pass quality assurance (QA) step baked right into your annotation workflow.

We call it Review Steps, and here’s how it can help you catch and resolve errors in your data annotation workflow so you can meet quality requirements more quickly.

How it works: Review Steps in the Sama Go platform

  1. To get started, the project manager who creates a project can require all (or a configurable sample of) annotations to be reviewed.
  2. The criteria for review can be fine-tuned in the “Settings” tab.
  3. The UI then shows the progress of all tasks:
    • Tasks that haven’t been touched yet are labeled as “New”;
    • Tasks that have been annotated but are not yet reviewed are listed as “In progress”; and
    • Tasks that have been done and reviewed are listed as “Completed”.
  4. The annotator then clicks “Start annotating” to annotate the “New” tasks.
  5. Once the tasks are annotated, the configured review step requires a reviewer to select an unreviewed task, open it, and click “Start reviewing.”
    Ideally, the reviewer should be someone other than the person who originally annotated the data – Sama Go gives you the flexibility to assign whoever you think is best suited to review the task.
  6. The reviewer goes through the annotation and makes modifications as instructed in the “Review” instructions box.
  7. When done, the reviewer submits the work.
  8. And that’s it! Export the annotations using the “Export” button or by using our APIs.
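
If you’d rather pull completed work programmatically than click “Export,” the sketch below shows roughly what that might look like in Python. It is a minimal example under stated assumptions: the endpoint path, parameter names, and response shape are placeholders for illustration, not the exact Sama API contract, so check the Sama API documentation for the specifics that apply to your project.

```python
import requests

# Illustrative values only -- substitute your own project ID and API key
# from the Sama Go dashboard. The endpoint path and parameter names below
# are assumptions for the sake of this sketch; consult the Sama API docs
# for the exact export endpoint available to your account.
API_BASE = "https://api.sama.com/v2"
PROJECT_ID = "your-project-id"
API_KEY = "your-api-key"


def fetch_reviewed_tasks(page: int = 1, page_size: int = 100) -> list:
    """Fetch one page of tasks that have been annotated and reviewed."""
    response = requests.get(
        f"{API_BASE}/projects/{PROJECT_ID}/tasks/delivered.json",
        params={"api_key": API_KEY, "page": page, "page_size": page_size},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response shape: a JSON object with a "tasks" list.
    return response.json().get("tasks", [])


if __name__ == "__main__":
    tasks = fetch_reviewed_tasks()
    print(f"Fetched {len(tasks)} reviewed tasks")
```

A pattern like this lets you fold reviewed annotations straight into a training pipeline instead of downloading exports by hand.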

As you can see, this is all pretty straightforward, but it’s effective.

And importantly, this whole process — project creation, data annotation, annotation review — can be in your hands with Sama Go, the same platform that our dedicated annotators use to label and process data for Fortune 500 organizations’ computer vision models across a host of industries.

The Review Steps feature outlined in this post is a software implementation of Sama’s own in-house QA procedures, which help us guarantee a minimum of 95% data quality for our managed services clients. You and your organization can now use that same platform for your own data projects today!

Get started with your first Review Steps workflow in Sama Go today
