Watch the video, or follow the step-by-step guide below to learn how to create Manual QA projects.
In the Annotate section of the navigation bar, select ‘Projects’. Select the Manual QA tab to start creating a Manual QA Project.
If you are part of an Organization, a Project tags drop-down is visible. Project tags are useful for categorizing your projects. Select as many tags as are relevant to your project.
Enter a meaningful title and description. A clear title and description help keep your Projects organized.
Attach one or more Datasets to the Project. Click the Attach dataset button and select the Datasets you want to add to the Project. You have the option to create a new Dataset.
Permission | Admin | Team Manager | Reviewer | Annotator | Annotator & Reviewer |
---|---|---|---|---|---|
Attach / Detach datasets | ✅ | ❌ | ❌ | ❌ | ❌ |
Attach / Switch ontology | ✅ | ❌ | ❌ | ❌ | ❌ |
Delete | ✅ | ❌ | ❌ | ❌ | ❌ |
Invite team members | ✅ | ✅ | ❌ | ❌ | ❌ |
Manage team permissions | ✅ | ✅ | ❌ | ❌ | ❌ |
Manage admins | ✅ | ❌ | ❌ | ❌ | ❌ |
Annotate & review tasks in task management system | ✅ | ✅ | Review only | Annotate only | ✅ |
Confirm annotations outside of the task management system | ✅ | ✅ | ❌ | ❌ | ❌ |
Control assignments and status in task management system | ✅ | ✅ | ❌ | ❌ | ❌ |
Manual quality assurance for annotation projects means that annotation tasks have to be reviewed before they can be marked as Complete.
You can set the following parameters for manual quality control in the Settings tab in your annotation project shown in the screenshot below:
A. Sampling rate
B. Multi review assignment
C. Default rejection reasons
D. Reviewer mapping
E. Expert review
Project administrators can dynamically change the sampling rate applied to submitted annotation tasks. The sampling rate determines the proportion of the submitted labels that a reviewer should review. This can be modified with the slider.
Sampling rates can also be configured per annotation class and per annotator (for example, class Y reviewed at a rate of 50%, class Z at 80%, annotator A at 70%, and annotator B at 95%) by clicking the Configure button. This feature is only available to paying users.
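To make the behavior concrete, here is a minimal sketch (illustrative Python, not product code) of how per-class and per-annotator sampling rates could decide whether a submitted label enters the review queue. All names and rates mirror the example above, and the highest-applicable-rate rule is an assumption:

```python
import random

# Illustrative sampling rates mirroring the example above (all values
# are assumptions, not product defaults).
class_rates = {"class_y": 0.5, "class_z": 0.8}
annotator_rates = {"annotator_a": 0.7, "annotator_b": 0.95}
default_rate = 0.25  # the project-wide slider value (assumed)

def should_review(label_class: str, annotator: str) -> bool:
    """Decide whether a submitted label is sampled for manual review.

    Takes the highest applicable rate; the max() tie-break between
    class and annotator rates is an illustrative assumption.
    """
    rate = max(
        class_rates.get(label_class, 0.0),
        annotator_rates.get(annotator, 0.0),
        default_rate,
    )
    return random.random() < rate

# Roughly 80% of class_z labels by annotator_a end up in the review queue.
hits = sum(should_review("class_z", "annotator_a") for _ in range(10_000))
print(f"{hits / 10_000:.0%} sampled for review")
```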
Annotation tasks with many labels across one data asset might get partitioned into review tasks that are distributed to different reviewers. Enabling multi review assignment means that all review tasks generated through the submission of one annotation task are assigned to the same reviewer.
The Default rejection reasons setting allows an admin to create default responses that a reviewer can select when rejecting annotation tasks. Pressing the + New button and entering a response saves it for future reviews. Setting default rejection reasons can help you identify and systematize errors in your labels.
You can configure rules that automatically assign specific reviewers to classes and annotators (e.g. label X with class Y should always be reviewed by reviewer Z). The setting can be configured by toggling the ‘Reviewer mapping enabled’ option.
Clicking the Configure button opens up a window where you can assign reviewers to specific annotators or classes. Assigning a reviewer to classes (objects or classifications) can be done under the Class mapping tab, and assigning a reviewer to annotators under the Annotator mapping tab. Any number of reviewers can be assigned to annotators and classes. One of them will be selected at a time for each task submitted.
Many industries and domains require years of training or experience to accurately recognize and classify examples — and an expert’s time can often be expensive or hard to schedule. In other cases, there may be additional requirements on your data quality assurance processes depending on the regulatory environment.
To help customers speed up their data annotation processes in these complex environments, Encord provides an expert review feature which empowers expert reviewers you designate to have an additional layer of oversight in the review process.
Expert reviews differ from normal reviews in the following ways:
An expert review configuration looks like the following:
Set up an expert review configuration by specifying the following parameters:
After X reviews: choose X such that after X cycles of submission and rejection by a normal reviewer, all rejected reviews are forwarded to expert review. This value is sometimes called the review count threshold. Because the review count threshold in the sample configuration above is 2, any review rejected a second time is sent to expert review.
Expert reviewers: choose the pool of possible expert reviewers. There is no requirement to designate a user as an expert reviewer other than that they have at least Reviewer permissions in the project. Users can be made expert reviewers regardless of their placement within normal annotator or class reviewer mappings.
Expert review stages: stages, or iterations, determine what proportion of review results is forwarded to expert review after each normal review. In the sample configuration above, 10% of all first reviews are sent to expert review, and 50% of approved second reviews are sent to expert review (see the sketch after this list).
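To make the forwarding logic concrete, here is a minimal sketch (illustrative Python, not product code) of one plausible reading of the sample configuration. The threshold and stage rates come from the description above; how approvals and rejections interact with the stage percentages is an assumption:

```python
import random

# Values taken from the sample configuration above; how approvals and
# rejections interact with stage percentages is an assumed reading.
REVIEW_COUNT_THRESHOLD = 2          # "After X reviews"
STAGE_FORWARD_RATES = [0.10, 0.50]  # expert-review share per review stage

def forward_to_expert(review_round: int, approved: bool) -> bool:
    """Return True if this review result should be sent to expert review."""
    # Rejections at or beyond the threshold round are always forwarded.
    if not approved and review_round >= REVIEW_COUNT_THRESHOLD:
        return True
    # Otherwise, sample at the configured rate for this stage.
    if review_round <= len(STAGE_FORWARD_RATES):
        return random.random() < STAGE_FORWARD_RATES[review_round - 1]
    return False

print(forward_to_expert(review_round=2, approved=False))  # always True
```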
The above configuration can be visualized as follows:
Video Tutorial - Monitoring annotation progress
Selecting a project from the list of annotation projects takes you to its ‘Project dashboard’.
This is where you monitor and manage your project. For example, you can view your project’s summary statistics, manage labeling tasks, view your team’s productivity, train models and invite collaborators.
The dashboard is split into the following:
Access to each tab is associated with the various project roles as follows:
Tab | Annotator | Reviewer | Annotator + Reviewer | Team Manager | Admin |
---|---|---|---|---|---|
Summary | ✅ | ✅ | ✅ | ✅ | ✅ |
Explore | ❌ | ❌ | ❌ | ✅ | ✅ |
Labels | ✅ | ✅ | ✅ | ✅ | ✅ |
Performance | ❌ | ❌ | ❌ | ✅ | ✅ |
Models | ❌ | ❌ | ❌ | ✅ | ✅ |
Export | ❌ | ❌ | ❌ | ❌ | ✅ |
Settings | ❌ | ❌ | ❌ | ✅ | ✅ |
Clicking an annotation project takes you to its Summary dashboard. This dashboard has two components and gives you a rich, high-level visual display of your project’s progress.
Project task status overview
Displays the number of annotation tasks that are in each state: Annotate, Review or Completed.
Instance label task status
Displays the number of labels / instances that have been created, and their assigned status.
For a more comprehensive summary of how a task moves from annotation through instance review and full completion, reference the Status section below.
The Explore page provides interfaces to help you understand how your project’s annotations are distributed among the data assets at both an instance and a label level. It allows deeper exploration through attributes on objects, as well as frame-level classifications.
This section provides the total count of all instances across the datasets in your project.
This is a summary of how your labels are distributed across the project. The pie chart shows a breakdown of how many labels there are for a given class.
The Labels page is your gateway to annotating, reviewing, and auditing the labels made against all the datasets in your project. Access to each pane will depend on the user’s project role. We briefly summarize the purpose of each tab, and the roles which can access each below.
Role | Activity | Queue | Data | Instances |
---|---|---|---|---|
Annotator | ✅ | ✅ | ❌ | ❌ |
Reviewer | ✅ | ✅ | ❌ | ❌ |
Annotator + Reviewer | ✅ | ✅ | ❌ | ❌ |
Team Manager | ✅ | ✅ | ✅ | ✅ |
Admin | ✅ | ✅ | ✅ | ✅ |
The labels dashboard features the following tabs:
The Activity page allows you to quickly monitor annotation and review activity in your project by showing tasks and providing a summary interface. The status of reviewed labels inside each task can also be seen. Tasks are displayed in most recently edited order from top to bottom.
The file column shows the name of the data asset. Files uploaded via the GUI keep the name they were uploaded with. For files added from your cloud storage, this is normally the path under the bucket they are stored in.
Use the search interface to quickly filter and display only those tasks with file names matching your desired text. Even partial matches will be shown. For example: searching “fly” will return file names containing “flyover” and “flyaround.”
The Reopen button allows Administrators and Team Managers to send tasks which are currently Completed or In review back to annotation. Select your target tasks using the checkboxes in the File column to select individual assets, or select the checkbox in the column header to select all tasks, and press the Reopen button to move all selected tasks back to the annotation stage. Tasks reopened in this way will have the status Returned in the Queue tab. No labels are lost by reopening a task. The ‘Reopen’ action is only applied to tasks which are both visible (i.e. not filtered out by the file search) and selected.
This column shows the status of this task within the task management system. The Activity pane only shows assets which have had some action done on them, and therefore only reflects tasks with the following statuses:
For a comprehensive summary of the possible task states, see the status section of the Data tab, below.
The ‘Reviews’ column shows a count of how many instances have been reviewed in a given data asset. Click the number to open a panel which shows the last review action taken on each instance, as well as who originally created the annotation and when. Note that unless the review was done by an ‘Expert Reviewer’, all reviewed annotations must be either ‘Approved’ or ‘Deleted’ before a task can be ‘Completed.’ Read more about the Expert Review feature here.
The Queue tab is where annotators and reviewers look to find their next task. The Start labeling and Start reviewing buttons visible throughout the project open the label editor with the next task in the queue according to the relevant task type.
The Queue tab can be used to assess the number of tasks assigned to you as an annotator or reviewer and therefore estimate your likely workload. Administrators and Team Managers can also use it to quickly verify the current assignments per team member, and change assignments as necessary.
The file column shows the name of the data asset. Files uploaded via the GUI keep the name they were uploaded with. For files added from your cloud storage, this is normally the path under the bucket they are stored in.
Use the search interface to quickly filter and display only those tasks with file names matching your desired text. Even partial matches will be shown. For example: searching “fly” will return file names containing “flyover” and “flyaround.”
The ‘Assign’ button allows Administrators and Team Managers to allocate unassigned tasks to specific collaborators for annotation or review. Select your target tasks using the checkboxes in the File column to select individual assets, or select the checkbox in the column header to select all tasks, then press the ‘Assign’ button to open the task assignment popup.
Confirm the selected tasks are as intended, then select the target collaborator from the drop-down and press Assign. Tasks that have already been assigned to another collaborator, as indicated by the email in the ‘Reserved by’ column, cannot be reassigned until they have first been released.
The Queue tab only shows tasks which have remaining annotation or review work to be done within the task management system. Therefore, the stage of the task within the TMS is understood by reading the Status and Task columns together.
The two types of tasks are ‘Annotate’ and ‘Review’ which can be in any of the following states:
There are two relevant actions that can be done on each task from the ‘Queue’ pane. Press ‘Initiate’ to open the label editor and proceed with annotation or review, depending on the task type.
Additionally, Administrators and Team Managers can click the three vertical dots to open the expanded menu, to access the ‘Release task’ function. Tasks must be explicitly released before they can be reassigned.
The Data page gives a complete overview of all the data asset tasks in the project, regardless of their progress through the task management system. Therefore, this is the first place Administrators and Team Managers should check if they want to confirm the status of a given task.
The file column shows the name of the data asset. Files uploaded via the GUI keep the name they were uploaded with. For files added from your cloud storage, this is normally the path under the bucket they are stored in.
Use the search interface to quickly filter and display only those tasks with file names matching your desired text. Even partial matches will be shown. For example: searching “fly” will return file names containing “flyover” and “flyaround.”
The Data tab provides the most comprehensive overview of all the tasks associated with each data asset in a given project.
Clicking View drops you into the label editor to do a live audit of the annotations in this data asset. The Data tab is only visible to Administrators and Team Managers, and so grants the power to view any data asset. However, take care to ensure annotations are not simultaneously being edited from the ‘Queue’ pane by an annotator or reviewer. Encord advises against taking edit actions from the Data tab unless you have confirmed that no one else is concurrently editing the asset.
Other possible actions include ‘API Details’, which shows a popup with sample code you can use to get started with our SDK for this particular data asset (known as a label row in the SDK). Click ‘Activity log’ to see a popup with a graphical summary of add / edit / delete actions on this data asset, indexed by annotator or ontology class. Click ‘Display logs’ in the lower right to show all actions in reverse chronological order.
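As a rough illustration of what that sample code typically looks like, here is a minimal sketch using the Encord Python SDK. Method names follow recent SDK versions but may differ in yours, and the key path and project hash are placeholders; treat the ‘API Details’ popup as the authoritative snippet for your asset.

```python
from pathlib import Path
from encord import EncordUserClient

# Authenticate with an SSH private key (the key path is a placeholder).
key = Path("~/.ssh/encord_key").expanduser().read_text()
user_client = EncordUserClient.create_with_ssh_private_key(key)

# Open the project and fetch its label rows (one per data asset).
project = user_client.get_project("<project_hash>")
label_row = project.list_label_rows_v2()[0]
label_row.initialise_labels()  # downloads the labels for this asset

# Each annotated object instance carries its identifier (objectHash).
for obj in label_row.get_object_instances():
    print(obj.object_hash)
```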
The Instances tab allows Administrators and Team Managers to search within the data to directly find specific instances. Recall that an annotation instance corresponds to a unique instantiation of a specific ontology class in a data asset.
For example, if you have the ‘Person’ class in your ontology, the first instance of a ‘Person’ in a given data asset will be indicated in the interface as ‘Person (0)’, the second as ‘Person (1)’ and so on. Instances, therefore, can exist in multiple frames of a data asset, and indicate the same object. Use the Instances tab to search for specific instances of objects or classifications using their Identifier.
Instance identifiers are unique at the project scope, and can be found in any of the following ways:

In your exported labels, as the objectHash or classificationHash, as appropriate.

Via the SDK, as the objectHash or classificationHash.

Once you have an identifier of interest, use the ‘Search instance’ interface to filter the instances by identifier and quickly find the one you’re interested in. This can be particularly handy when you want to visually confirm an annotation you may not have seen before, but for which you have the identifier.
After locating your instance of interest, click View in the ‘Actions’ column to jump straight to where the instance is first annotated in the data asset.
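If you are working from an exported label file rather than the UI, the same identifiers can be located programmatically. Here is a minimal sketch, assuming a JSON export in which instances carry objectHash or classificationHash keys; the file name and exact schema are assumptions, so adapt the walk to your export:

```python
import json

TARGET_HASH = "aBcDeF12"  # placeholder identifier copied from the UI

def find_hash(node, path="$"):
    """Recursively report where TARGET_HASH occurs in the export."""
    if isinstance(node, dict):
        if TARGET_HASH in (node.get("objectHash"), node.get("classificationHash")):
            print(f"found {TARGET_HASH} at {path}")
        for key, value in node.items():
            find_hash(value, f"{path}.{key}")
    elif isinstance(node, list):
        for i, item in enumerate(node):
            find_hash(item, f"{path}[{i}]")

with open("labels_export.json") as fh:  # file name is a placeholder
    find_hash(json.load(fh))
```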
The Settings tab allows you to make modifications to your project using the following tabs:
To copy a project, click the Copy project button in the Options section of the project’s Settings. This opens the copy project window. From the Copy Project window, you can pick the various parts of your project you want to copy over into your new project.
Choose the parts of your project you want to copy.
You can copy any combination of the following assets:
The new annotation project will use the same ontology as the original. This can be changed in the project settings if required.
If you do not want to copy labels, press Copy project. This creates a copy of your Project, which you can then access in the Projects tab.
If you choose to copy over labels, you will be asked to select the data assets whose labels you would like copied over. To begin the process, press Next: configure labels and continue to step 2 below.
Select the data units with the labels that you want to copy into your new project.
Click Next to continue.
Select the statuses of the files you want copied over into your new project.
This means that all tasks will be Annotate tasks, and their status will be Queued.
All tasks will have to be re-assigned after being copied.
Click the Copy project button to complete the process.
Video Tutorial - Uploading annotator instructions
Click the Add instructions button to upload instructions for your annotators in PDF format.
To ensure the best possible results, provide as much detail as possible about what you would like annotated and how precise bounding boxes should be drawn. For example, instead of saying ‘person’, consider defining what should constitute a person for your annotators - only a full person? A torso? Or should any part of a person in a frame be labeled as a ‘person’?
You can add tags to a Project if you are part of an Organization.
Project tags allow you to:
Flexibly categorize and group your Projects.
Filter your Projects.
You can add tags to your Projects in:
When creating a Project.
In the Settings page of a Project. This process is described below.
To add tags to your Projects in the Settings page, navigate to the Options tab and click the Project tags drop-down. Here you will see the available tags in your Organization. Click on a tag to add it to a Project. You can remove a tag from your Project by clicking the same tag again, or clicking the x button next to its name.
You can filter your Projects based on the tags they contain. To do so, click the Projects tab in the navigation bar, click the Filter by tags drop-down, and select one or more Project tags. Only Projects with the selected tags will be displayed.
You can view or switch the Ontology attached to your Project.
Click the Switch ontology button to switch the ontology linked to your Project.
The resulting pop-up allows you to choose an existing Ontology from a list, or create a new ontology for this project.
Click the View ontology button to view the details of the Ontology that is attached to the current Project.
The Datasets section allows you to attach or detach any number of Datasets to your Project. You must create a new Dataset in the Datasets section for it to become available in a project’s settings.
The ‘Quality’ section allows you to configure the way that manual quality assurance is implemented for a given project.
The Sampling rate slider determines the percentage of labels that will be manually reviewed. Clicking Configure sampling rate allows you to set the sampling rate for each label type, or annotator separately.
The Multi review assignment enabled toggle will assign all labels created for a given task to the same reviewer.
Default rejection reasons allows you to add commonly used reasons for rejecting a label, to make them available to your reviewers and save time when reviewing tasks.
Toggle Reviewer mapping and click Configure reviewer mapping to assign classes, or labels made by specific annotators, to a particular reviewer.
Toggle Expert reviewer rule to enable expert review.
To manage project collaborators, select the ‘Team’ pane in your project Settings.
Here you can invite collaborators to the project, and configure their roles.
To invite collaborators from within your organization to the project:
Select a user role for the collaborator you want to add by selecting an option from the list.
Type the email address of the user you’d like to add and select the user from the list.
Click the Add button to add the user with the specified role.
Collaborators can be added to a project as a group - which can save time as well as ensure that no individual is forgotten.
In the ‘Groups’ section of the page, click on Manage to make the ‘Manage Groups’ pop-up appear.
Click the ‘Select group’ drop-down and pick a group you would like to add as collaborators. After selecting a group, click the ‘Select Role’ drop-down to assign a role to the group of collaborators. Click Add to add the group.
The group you just added will appear under the ‘Added groups’ heading. Repeat the process if you’d like to add more groups with different roles to the project.
Project admins can modify the different roles of collaborators, using the drop-down on the right.
You can assign the following roles to collaborators:
Please confirm or cancel your selection when making a collaborator a project Admin.
You can delete your project by going to the Danger zone tab at the bottom of the menu, and clicking the red Delete project button, shown below.
The Task management system (TMS) is built to optimize labeling and quality control for all annotation and review tasks, allowing thousands of annotators, reviewers, team managers, and administrators to work concurrently on the same Manual QA project.
The task manager is enabled by default but can be switched on and off under the Options tab in project settings.
Annotation and review tasks are distributed automatically using the first in, first out (FIFO) method: tasks that have been in the queue the longest are served first.
Team managers and administrators can also assign tasks explicitly to individual annotators and reviewers. Once an annotation or review task is distributed to an annotator or reviewer, it is reserved by that individual, prohibiting other team members from accessing that task. Both annotation and review tasks are accessible in the Queue pane of the Labels tab.
Annotation tasks are generated and added to the label queue when a dataset or a set of datasets is attached to a project and when a new data asset is added to attached datasets. Review tasks are generated and added to the review queue once an annotator submits an annotation task. Conversely, detaching a dataset will remove any associated annotation and review tasks, so you should exercise caution if you proceed.
By default, each data asset will be labeled once, and each label submitted for review will be reviewed once. You can create additional review tasks by clicking the + Add reviews button and following the steps in the window. You can reopen submitted annotation tasks if you wish to send the data asset back into the queue for further labeling by selecting the relevant assets and clicking the Reopen button.
Annotation and review tasks are distributed automatically using the first in, first out method (illustrated below): tasks that have been in the queue the longest are served first. Once an annotator or reviewer clicks the Start labeling or Start reviewing button, the next available free task in the queue is reserved by that individual, prohibiting other team members from accessing the task. Once the task is fetched, the annotator or reviewer is taken to the label editor to complete the task.
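The reservation behavior can also be pictured with a small sketch (illustrative Python, not product code): tasks leave the front of the queue, and a fetched task stays reserved by the collaborator until it is submitted, released, or skipped. Where a released task re-enters the queue is an assumption here.

```python
from collections import deque

# Illustrative model of FIFO distribution with reservation.
queue = deque(["task_1", "task_2", "task_3"])  # oldest task first
reservations: dict[str, str] = {}              # task -> collaborator email

def start_labeling(user: str) -> str | None:
    """Reserve the oldest unreserved task for this collaborator."""
    if not queue:
        return None
    task = queue.popleft()
    reservations[task] = user  # held until submit, release, or skip
    return task

def release(task: str) -> None:
    """Return a reserved task to the queue (requeue position is assumed)."""
    reservations.pop(task, None)
    queue.appendleft(task)

print(start_labeling("alice@example.com"))  # -> task_1, now reserved
```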
Project administrators and team managers can override the automated distribution of tasks by explicitly assigning tasks to individuals in the Queue pane of the Labels tab. Assignments can be done on a task-by-task basis or in bulk by selecting the relevant tasks and clicking the Assign button.
Tasks can be released by clicking the three vertical dots next to the task and clicking the Release task button. Reserved tasks do not expire: a task remains assigned to an individual until it is submitted, released, or skipped.
An annotation task is completed once all outstanding labels subject to review have been reviewed. Completed annotation tasks and annotation tasks currently in the review stage are visible in the Activity pane of the Labels tab.
The Task Status indicates the status of a given task. A task’s status evolves from Queued for annotation to In review and finally Complete. If labels are rejected, or the task is otherwise judged in need of further annotation work, the status is marked as Returned. The most comprehensive view of task statuses is available to project Administrators and Team Managers in the Data pane of a Project’s Labels dashboard.
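The lifecycle can be pictured as a small state machine. Here is a minimal sketch based on the statuses and actions described on this page; the event names themselves are assumptions, not product terminology:

```python
# Illustrative sketch of the task status lifecycle described above.
TRANSITIONS = {
    ("Queued", "submit"): "In review",
    ("In review", "approve_all"): "Complete",
    ("In review", "reject"): "Returned",
    ("Returned", "resubmit"): "In review",
    ("Complete", "reopen"): "Returned",
}

def next_status(status: str, event: str) -> str:
    """Advance a task's status; unknown events leave it unchanged."""
    return TRANSITIONS.get((status, event), status)

status = "Queued"
for event in ("submit", "reject", "resubmit", "approve_all"):
    status = next_status(status, event)
    print(f"{event} -> {status}")
```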
Tasks are labeled in the Label Editor. Click Submit to submit your labels for review.
Annotators can skip tasks by clicking the Skip button. If a task is skipped, the next available task is automatically displayed and assigned.
Review tasks are completed in the Label Editor.
Review mode components:
All labels for review for a particular data asset assigned to the reviewer are automatically loaded into the ‘Pending reviews’ pane. Completed reviews are displayed in the ‘Completed reviews’ pane. You can click on specific objects to highlight them. Labels can be selected and then approved or rejected for a given instance or in bulk using the Reject and Approve buttons or the matching hotkeys, b for reject and n for approve.
You can enter the ‘Single label review’ mode by toggling the switch at the top. The single label review mode automatically highlights and hides other objects, allowing you to review and approve or reject a single label at a time and quickly browse through individual labels using the Up and Down keys on your keyboard.
A convenient feature allows reviewers to edit labels and make small adjustments without returning the entire set of labels to the annotator. Press the Edit labels button and make any necessary changes before switching back to review mode. Currently, only a subset of label edit operations is supported:
In addition to being able to review all labels for a given instance, you can review labels grouped by frame as well. For review workflows that focus on progressing through video by frame rather than by instance, use the Approve all in frame and Reject all in frame buttons. Of course, you should be sure you want to apply that judgement to all labels in a given frame before using this feature!
If a reviewer rejects a label during the review stage, it will be marked as Returned in the Queue pane of the Labels tab. By default, rejected annotation tasks are returned and assigned to the queue of the person who submitted the task.
Returned tasks are resolved in a purpose-built user interface: address each rejected label, then click the Mark as resolved button to mark it as resolved.
Annotation tasks cannot be resubmitted until all issues have been marked as resolved. Once a task is resubmitted, the labels marked as resolved are sent back for an additional review. There is no limit on how many times a label can be rejected and sent back for correction.
If a reviewer determines that a label is missing entirely, they can use the report missing labels feature to indicate labels are missing in a given frame or image. Missing label reports will be sent back to the annotator via the same queue as rejected labels.