The Analytics tab of your Project shows event-based analytics for your Project’s tasks, labels, and users. The Analytics tab has the following views:
The analytics available on the Analytics dashboard vary based on user roles:
Actions performed in the Label Editor or using the SDK are only tracked on the Analytics dashboard if the tasks are actioned (submitted, rejected, or approved).
Project Analytics have the following filters:
Filter | Description |
---|---|
Collaborators | Filter by specific Project collaborators. |
Datasets | Filter tasks based on the dataset they belong to. |
File name includes | Filter by file name. Regex is supported, allowing filtering by prefix or infix in the file title (see the example after this table). |
Event source | Filter by the source of the event. Either SDK or UI. |
Workflow stage | Filter tasks based on their current stage in the Workflow. |
Class | Filter by Ontology class. |
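For example, the File name includes filter distinguishes prefix matches from infix matches. The minimal sketch below illustrates that difference using Python-style regex; the dashboard's exact regex dialect and the file names shown are assumptions for illustration only.

```python
import re

# Hypothetical file titles used only to illustrate prefix vs. infix matching.
file_names = ["front_cam_0001.jpg", "rear_cam_0001.jpg", "front_cam_0002.jpg"]

prefix_pattern = re.compile(r"^front_cam")  # prefix: title starts with "front_cam"
infix_pattern = re.compile(r"_0001")        # infix: title contains "_0001" anywhere

print([f for f in file_names if prefix_pattern.search(f)])
# ['front_cam_0001.jpg', 'front_cam_0002.jpg']
print([f for f in file_names if infix_pattern.search(f)])
# ['front_cam_0001.jpg', 'rear_cam_0001.jpg']
```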
The Tasks tab of the Analytics dashboard provides a detailed overview of task throughput, stage efficiency, and task timers. It can help answer questions such as:
How productive was each collaborator in terms of labeling and reviewing tasks over the last week?
Which Dataset has the most labels added, edited, or deleted in a given Workflow stage?
The following charts are visible in the Tasks view:
Chart | Description |
---|---|
Task actions | Displays the number of annotation tasks submitted or skipped, and review tasks approved or rejected, over the selected date range. |
Time spent | Shows the total time spent actively working in the Label Editor, providing insights into productivity. |
Task performance | Shows the actions taken for all tasks in the Project. |
The Task performance table includes the following columns:
Label analytics are not supported for Text and Audio modalities.
The Labels tab in the Analytics dashboard provides a detailed overview of your team’s labeling, reviewing, and task productivity, including time spent. It can help answer questions such as:
How many labels were submitted, approved, or rejected by the team over a given period?
What actions have been taken for specific objects and classifications in the Ontology?
How does the team’s productivity in labeling compare across different objects or classifications? Do certain objects take more time to label than others?
Chart | Description |
---|---|
Label actions | Displays the number of labels submitted, approved, or rejected. |
Objects and classifications actions | Shows the actions taken for all objects and classifications in the Ontology. |
The Objects and classifications actions table includes the following columns:
The Collaborators tab in the Analytics dashboard provides a detailed view of time spent annotating and reviewing by Project collaborators. It can help answer questions such as:
Chart | Description |
---|---|
Time spent | Displays the distribution of time spent per collaborator per day. |
Annotators | Table view of all relevant task/label actions and timers for each collaborator in the Annotation stages. |
Reviewers | Table view of all relevant task/label actions and timers for each collaborator in the Review stages. |
Both tables and all CSV exports are filter-sensitive; they only display information within the selected filter conditions.
The columns in the Annotators and Reviewers tables vary depending on whether the Instance labels or Frame labels view is selected.
Instance Labels
The Annotators table includes the following columns:
The Reviewers table includes the following columns:
Frame Labels
The Annotators table includes the following columns:
The Reviewers table includes the following columns:
You can choose which columns should be displayed and included in the CSV export using the Columns selector.
The Issues tab in the Analytics dashboard provides a detailed view of all issues created in the Project.
Chart | Description |
---|---|
Issue actions | Shows the number of issues created, resolved, and opened. |
Issue tag occurrence | Shows the count of all issue tags used within the chosen timeframe. |
Issue tag occurrence by class | Shows the count of issue tags used within the chosen timeframe, broken down by Ontology class. |
The Issue actions chart shows the following:
Example - How to Understand Issue Analytics:
If your Issue actions chart shows:
This means that:
How do I track annotator performance on pre-labeling/model labels?
Annotator performance on pre-labeled or model-generated labels can be tracked using the Edited Labels and Deleted Labels metrics in the Annotators table. These include:
Vertex/coordinate changes are not tracked as edits.
Can I export analytics for payroll or performance tracking?
Yes, CSV exports from the Analytics tab provide data that can be used for payroll or performance tracking.
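For instance, you could post-process an exported CSV with pandas to build a payroll-style summary. This is a minimal sketch; the file name and column headers used below (Collaborator, Time spent (seconds), Submitted tasks) are assumptions for illustration and should be adjusted to match the headers in your actual export.

```python
import pandas as pd

# Load a CSV exported from the Analytics tab.
# Column names are assumptions; check your export's header row.
df = pd.read_csv("annotators_export.csv")

# Aggregate time and submitted tasks per collaborator.
summary = (
    df.groupby("Collaborator")[["Time spent (seconds)", "Submitted tasks"]]
    .sum()
    .sort_values("Time spent (seconds)", ascending=False)
)

# Convert seconds to hours for reporting.
summary["Hours"] = summary["Time spent (seconds)"] / 3600
print(summary)
```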
What is counted as a created label?
A created label is any new label added by an annotator or reviewer. This includes:
What is counted as an edited label?
An edited label is a modification made to an existing label. This includes:
Can I customize what data is included in the CSV exports?
No, CSV exports are pre-defined and contain all available metrics. However, you can filter the data before exporting to limit what is included.
How are timers tracked?
Timers track active time spent in the Label Editor:
The Avg time per task metric is calculated by dividing the total annotation/review time by the number of submitted tasks.
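A quick worked example with assumed numbers:

```python
# Assumed numbers for illustration: 12.5 hours of total annotation time
# across 150 submitted tasks.
total_annotation_seconds = 12.5 * 60 * 60   # 45,000 seconds
submitted_tasks = 150

avg_time_per_task = total_annotation_seconds / submitted_tasks
print(f"Avg time per task: {avg_time_per_task:.0f} seconds")  # 300 seconds (5 minutes)
```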
Are label events counted if a task is skipped?
No. If a task is skipped, any actions performed before skipping (for example, creating/editing labels) are not recorded in the Analytics dashboard.
How are actions outside of the task queue counted?
Actions performed outside of the task queue, such as label modifications made using the SDK, are only tracked if the task is subsequently submitted, rejected, or approved.
How does the SDK filter work?
The Event Source filter allows you to view actions performed using the SDK or UI.
What happens if someone is inactive or takes a break?
How long does it take for actions to appear on the Analytics dashboard?