Rigorous Quality Assurance For
Data Annotation You Can Rely On

Mindy Support understands that there is a direct correlation between data quality and the performance of your product, which is why we have put all of our experience into creating a QA process for data annotation that will give you peace of mind.

GET A QUOTE

How We Ensure Excellent Quality Annotation

Based on almost a decade of successful project delivery, we have created a step-by-step roadmap of the most important steps we follow to ensure the highest quality of annotation work:

  • The Pre-launch Phase
  • Launching the Project
  • Ensuring the Quality of the Work Done
  • Improving Quality (If Needed)
  • Additional Ways We Ensure the Highest Quality

The Pre-launch Phase

We assign a dedicated project manager and quality assurance manager to each project. They are responsible for studying all of the instructions and information provided by the client and getting answers to any questions that arise throughout the project.

Launching the Project

Once the pre-launch processes have been completed, our Quality Manager, either alone or together with the project manager, provides all the necessary information and conducts training for the internal/external team. After the training, we test the team's knowledge of the project and make sure the instructions are understood correctly. The team starts working only after all of these steps have been completed.

Ensuring the Quality of the Work Done

Our QA teams check the annotation quality every day, reviewing anywhere from 10% to 100% of the data annotation work, depending on the quality requirements and the QA phase. The quality team supports the data annotators throughout the project, answering questions and providing all the necessary information to every member of the project.

Improving Quality (If Needed)

A semi-automated system helps us categorize our team into bottom, middle, and top performers (a brief sketch follows at the end of this section). Based on these results, the Quality Manager sets a different action plan for each group. The plans usually include common actions such as visual monitoring, online monitoring, coaching, and mistake analysis.

Additional Ways We Ensure the Highest Quality

We conduct calibration calls with all stakeholders of the quality control process, where we discuss and analyze even the smallest details of the project's tasks. We ensure transparency by sending weekly reports with quality indicators for the project or task. The report describes the challenges that cause quality problems and ways to fix them.
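
To make the performer grouping from the "Improving Quality" step concrete, here is a minimal, hypothetical Python sketch; the thresholds and names are illustrative assumptions, not our actual system:

    # Illustrative sketch: bucket annotators into bottom/middle/top
    # performers by quality score. Thresholds are assumptions.
    def group_performers(scores: dict) -> dict:
        groups = {}
        for annotator, quality in scores.items():
            if quality < 0.90:
                groups[annotator] = "bottom"
            elif quality < 0.97:
                groups[annotator] = "middle"
            else:
                groups[annotator] = "top"
        return groups

    # e.g. {'anna': 'top', 'ben': 'middle', 'carl': 'bottom'}
    print(group_performers({"anna": 0.99, "ben": 0.95, "carl": 0.85}))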


Mindy Support’s Annotation Quality Services

Quality Validation

  • Quality checks of annotations/pre-labels against the required metrics (geometry accuracy, label correctness, etc.);
  • Improvement of project metrics (annotation validation, adjustment, and creation) to meet the required accuracy level;
  • Quality control of the output data.

When do you need it?

  • For cross-checks between different vendors;
  • To reach a quality level above 95%;
  • To correct substandard work by other vendors;
  • To work with pre-annotations and make manual corrections.

Quality Assurance (QA)

  • Quality checks of annotations/pre-labels against the required metrics (geometry accuracy, label correctness, etc.);
  • Statistics collection and reporting on the achieved quality (accuracy, precision, recall, F1 score, etc.; a short illustrative sketch follows this list);
  • Rework management: providing feedback and requests to the team to correct mistakes;
  • Support with corrective measures and adoption of the project's guidelines.
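
To illustrate the statistics we report: precision, recall, and F1 score can be derived from simple QA review counts. Below is a minimal, hypothetical Python sketch; the counts and names are illustrative assumptions, not output from our tooling:

    # Illustrative sketch: precision, recall, and F1 from QA review counts.
    def precision_recall_f1(true_pos: int, false_pos: int, false_neg: int):
        precision = true_pos / (true_pos + false_pos)  # correct / all produced
        recall = true_pos / (true_pos + false_neg)     # correct / all expected
        f1 = 2 * precision * recall / (precision + recall)
        return precision, recall, f1

    # e.g. 180 correct boxes, 12 spurious, 8 missed:
    p, r, f = precision_recall_f1(180, 12, 8)
    print(f"precision={p:.1%}, recall={r:.1%}, F1={f:.1%}")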

When do you need it?

  • To make sure that the targeted quality level has been reached;
  • For calibration between annotation teams on the understanding of the guidelines;
  • When you need quality statistics and reports on annotations.


Tell us more about your business needs in data annotation

Contact us

Annotation Quality Calculation Methods

Annotation quality refers to the share of material that is annotated correctly. Tracking it is required to ensure high-quality processing on a project. Quality is always presented as a percentage. We use several types of quality calculations, which you can learn more about before the start of each project:

Image/frame quality calculation.

Used when one annotation is required per image/frame (usually on Image Tagging projects) or at the client's request. To do this, you determine the total number of images/frames (z) and the number of images/frames with mistakes (z1); it does not matter whether an image contains one mistake or several. The formula: x = (z - z1) / z.
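
For example (illustrative figures only): if a batch contains 200 frames and 14 of them contain at least one mistake, then x = (200 - 14) / 200 = 0.93, i.e. 93% quality.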

Calculation of quality by annotations.

The most common method. To do this, you determine the total number of annotations (y) and the number of annotations with mistakes (y1). The formula: x = (y - y1) / y.
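
For example (illustrative figures only): with 5,000 annotations, 150 of which contain mistakes, then x = (5,000 - 150) / 5,000 = 0.97, i.e. 97% quality.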

Determining the quality with the scoring system (5-point, 10-point, etc.).

It is used rarely, for projects where counting the number of annotations and mistakes would take too much time. To do this, you determine the number of individually evaluated units (a), the maximum possible mark (m), and the sum of the marks of all evaluated units (s). The formula: x = s / (m * a).
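
The three methods can also be expressed in code. Below is a minimal, hypothetical Python sketch of the formulas above; the function names and figures are ours, for illustration only:

    # Illustrative sketch of the three quality formulas described above.
    def frame_quality(z: int, z1: int) -> float:
        """Image/frame-level quality: x = (z - z1) / z."""
        return (z - z1) / z

    def annotation_quality(y: int, y1: int) -> float:
        """Annotation-level quality: x = (y - y1) / y."""
        return (y - y1) / y

    def score_quality(a: int, m: int, s: int) -> float:
        """Scoring-system quality: x = s / (m * a)."""
        return s / (m * a)

    # Illustrative figures only:
    print(f"{frame_quality(200, 14):.0%}")         # 93%
    print(f"{annotation_quality(5000, 150):.0%}")  # 97%
    print(f"{score_quality(40, 10, 388):.0%}")     # 97%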

Use Cases

For this project we needed to verify how well a data annotation vendor had annotated traffic lights in images. The project proved to be very challenging because of the sheer number of mistakes that were found. In fact, 3-4 times more mistakes were found during the validation process than after our internal annotation phase, and the team had to stay very focused on detecting and correcting mistakes to meet expectations.

  • Bounding Boxing and Tagging with attributes
  • Increased quality from <80% to >95%
  • Identified 3-4 times more mistakes than the vendor

We needed to check the quality of the work done by another vendor. They had to draw boxes around all detected pieces of text in images, apply labels to them, and then transcribe the detected text. The images contained scenes from stores, journals, books, streets, etc. All text in each image had to be detected. We worked in the client's tool, so we provided only QA services, without any extra pre-/post-processing of metadata. We fully met the client's standards, and our team effectively became part of their QA team.

  • OCR European Languages, Text
  • Bounding Boxing, Tagging with attributes and Transcription
  • Met 100% of the standards of the client

We needed to check annotations (tags of audio records and some segments of audio records) in place of the client's internal teams. We also needed to accumulate statistics on mistakes and their trends and provide feedback on quality and mistakes to the team. Our Quality Assistant had to check 10% of each annotator's work and send all mistakes back for correction. They also had to make sure that all team members understood the guidelines in the same way. The client was happy with our speed and quality; after calibration, the team maintained a quality level of 99%.

  • Audio Annotation
  • Maintained a 99% quality level after calibration

The client asked us to check and correct pre-annotations. They created a workflow with a two-step approval process, and the final QA step had to meet a >99% quality level. Therefore, we needed to make sure on our end that the annotation quality met the end customer's requirements in all categories (geometry, correctness of detected objects, correctness of attributes). The client was impressed by how quickly we scaled up and increased throughput on the project while maintaining very good quality.

  • Video Annotation of Street Traffic
  • Audio Tagging
  • >99% quality level

The client requested our assistance in checking annotations after their other vendors and detecting mistakes. They sent tasks with mistakes back to the other vendors for correction, and we needed to analyze all of this information and come up with an improvement strategy. Our QA team had to check the accuracy of geometry, labels, and attributes, detect mistakes, and send tasks with mistakes back to the other vendors for correction. Our team was responsible for ensuring that the final quality on the project met the client's requirements. Our QA team could mark mistakes in the tool so that the other vendors could see and correct them efficiently.

  • 3D point cloud, Lidar
  • 3D Bounding Boxing and Tagging with attributes
  • The final quality met 100% of the end customer requirements.

The highest quality management and security standards

Let’s Expand with Mindy!


    We have a minimum threshold for starting any new project, which is 735 productive man-hours a month (equivalent to 5 graphic annotators working on the task monthly).