Manage Annotation Projects
- In the Encord platform, select Projects under Annotate.
- Select the Project you want to manage.
The dashboard is split into the following tabs:
- Project Overview: High-level view of labeling and productivity statistics.
- Explore: Explore the distribution of instances and labels across data assets in the project.
- Queue: Shows all tasks in the Project by Workflow stage.
- Workflow: Graphically displays the path tasks follow through the Project Workflow.
- Labels & Export: For managing all the Project’s labels, including exporting labels.
- Analytics: Detailed Project analytics.
- Settings: Manage your Project Settings, including copying Projects, managing Project tags, customizing editor layouts, and deleting Projects.
Tab Visibility by Role
Tab | Annotator | Reviewer | Annotator + Reviewer | Team Manager | Admin |
---|---|---|---|---|---|
Summary | ❌ | ❌ | ❌ | ✅ | ✅ |
Explore | ❌ | ❌ | ❌ | ✅ | ✅ |
Queue | Annotate tab only | Review tab only | Annotate + Review tabs only | ✅ | ✅ |
Workflow | ❌ | ❌ | ❌ | View only | ✅ |
Labels & Export | ❌ | ❌ | ❌ | View labels only | ✅ |
Performance | Personal performance | Personal performance | Personal performance | Team performance | Team performance |
Models | ❌ | ❌ | ❌ | View only | ✅ |
Settings | ❌ | ❌ | ❌ | Add users (except Admins) | ✅ All settings and user management |
Update Project Status
Admins and Team Managers can change the status of a Project to any of the following:
- Not started
- In progress
- Paused
- Completed
- Cancelled
- Archived
When the Project status is changed to Paused, Completed, or Cancelled, Annotators, Reviewers, and Annotator + Reviewers are prevented from opening tasks, and the Initiate button is greyed out. Admins and Team Managers can annotate and review tasks as usual, regardless of the Project status.
Tasks can still be opened through back-door access. For example, an Admin can share the URL of a specific task with an Annotator, Reviewer, or Annotator + Reviewer.
Project Overview
The Project Overview dashboard has the following components that provide an overview of the Project.
Component | Description |
---|---|
Instance labels created | Displays the total number of labels (object labels and classifications) in the Project that have been submitted for review. Labels on Text and HTML files are not counted. |
Time spent | Displays the total amount of time users have spent in the Label Editor performing tasks in the Project. For example, time spent annotating, editing, and reviewing tasks, and creating and resolving issues. |
Open issues | Displays the total number of open issues in the Project. |
Project task status | Displays the number of tasks in each stage of your Project. The number of stages and their names reflect the choices made when the Project Workflow was created. |
Explore tab
The Explore tab helps you understand how project annotations are distributed among data assets, at both an instance and label level. It allows a deeper exploration through attributes on objects, as well as frame-level classifications.
- Instance statistics: Class distribution across data assets in the given project.
- Label statistics: Label distributions within data assets, objects and classifications.
Instance statistics
This section provides the total count of all instances across the datasets in your project.
- Project total: Shows total instances (both objects and classifications) across the project by default. To get instance statistics for individual data files, click the drop-down to select a data file.
- Select class: Shows the total instances for a particular class. This is a summary of how a given class is distributed across your project’s data assets. The pie chart segments show a breakdown of how that class is split across the data assets.
- Display timestamps: Flip the toggle to switch between frame numbers and timestamps for the labels.
Label statistics
This is a summary of how your labels are distributed across the project. The pie chart shows a breakdown of how many labels there are for a given class.
- Project total: Shows the total number of labels across different datasets in the project. To get label stats for individual data files, click the drop-down to select a data file.
- Objects: Click on the pie chart segment of a class to see the total number of labels and its attributes (sometimes called nested attributes) if available for that class.
- Classifications: Shows the global classification at project or individual video level. For example, location, time of day, etc.
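The pie chart breakdown described above is a per-class count of labels. A minimal sketch in Python of that tally, using an illustrative label list (the field names here are hypothetical, not the export schema):

```python
from collections import Counter

# Illustrative only: a flat list of labels, each tagged with its class name.
labels = [
    {"class": "Person", "frame": 0},
    {"class": "Person", "frame": 3},
    {"class": "Car", "frame": 1},
]

# The class distribution summarized by the Label statistics pie chart.
distribution = Counter(label["class"] for label in labels)
```

Here `distribution` maps each class name to its label count, which is exactly what each pie chart segment represents.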
Task Queue & Workflow
Use the Queue tab to assign, prioritize, and manage tasks, as well as to start labeling and reviewing for all users associated with a Project.
The Queue tab’s layout adapts based on user permissions.
Queue (Admin, Team Manager)
- A - Use the search bar to filter the list of data units being displayed, or to look for a particular data unit.
- B - Select a task to assign it to a user, release the task, or adjust its priority number.
- C - Filter the list of data units being displayed by Dataset, User, Data Type, or Status.
- D - Sort the task queue.
- E - Use the Start labeling (annotation task) and Start reviewing (review task) buttons to begin labeling or reviewing a task.
- F - The list of all Workflow stages shows how many data units each stage currently contains. Each stage can be selected.
- G - Shows the task’s priority number. Tasks are listed in descending order of priority by default.
- H - Shows the task’s Status.
- I - Shows the email address of the user the task is assigned to.
- J - Clicking the Initiate button initiates a task. If an annotation stage is selected, an annotation task is initiated. If a review stage is selected, a review task is initiated.
The Queue tab lists tasks in the same order they appear in the Label Editor.
Filters
- Dataset - Filtering by Dataset only displays data units belonging to the selected Dataset(s).
- Assignees - Displays tasks assigned to a particular user.
- Last Actioned By - Displays tasks last actioned by the selected user.
- Data Type - Displays data units of a specific type.
- Issue Status - Displays tasks with Resolved or Unresolved issues.
Sort by
You can sort the task queue by clicking Sort by next to the filter button. Select whether you want to sort the task queue by:
- Task priority, or alphabetically by the name of the data unit.
- In ascending or descending order.
The default sorting is in descending order of task priority.
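The resulting orderings can be sketched in Python; the task records here are illustrative, not an Encord data structure:

```python
# Illustrative task records: a priority (0-100) and a data unit name.
tasks = [
    {"name": "scan_b.dcm", "priority": 50},
    {"name": "scan_a.dcm", "priority": 90},
    {"name": "scan_c.dcm", "priority": 10},
]

# Default queue order: descending task priority.
by_priority = sorted(tasks, key=lambda t: t["priority"], reverse=True)

# Alternative ordering: alphabetical by data unit name, ascending.
by_name = sorted(tasks, key=lambda t: t["name"])
```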
Task Status
The task Status indicates which actions have previously been taken on a task.
- New - The task has not been worked on since being added to the Project.
- Reopened - The task was rejected during the Review stage, and has been returned for re-labeling.
- Skipped - The task was skipped by one or more annotators.
Move Tasks
You can move selected tasks from their current stage directly to another stage of the Workflow.
- Moving tasks skips all intermediate Workflow stages, placing them immediately into the selected stage.
- All labels (object and classification) are automatically approved when tasks move to a COMPLETE stage.
- Webhooks do not fire when tasks are moved through the Workflow in this way.
- Tasks cannot be moved into Consensus stages.
- Tasks in the Consensus Annotate block cannot be moved to Consensus Review blocks.
- Tasks cannot be moved out of the COMPLETE or ARCHIVE stages.
- The following analytics do not apply to moved tasks: Labels created, Approved labels, and Rejected labels.
To move a task:
1. Select the task(s) you want to move.
2. Click the Move dropdown.
3. Select the stage you want to move the selected task(s) to.
Tasks move to the specified stage.
Assigning and Releasing Tasks
Tasks can be assigned to specific users by selecting them from the list and clicking the Assign button. Once a task is assigned, only the assigned user can open it. Alternatively, click the small arrow button in the Assigned to column to assign an individual data unit.
Releasing a task is the opposite of assigning one: it removes any user the task was assigned to. To release any number of tasks, select them from the list and click the Release button located next to the Assign button.
Task priority
All annotation and review tasks can be assigned a priority level to manage workflow efficiency. Each task has a priority value ranging from 0 to 100, with a default of 50. A value of 100 indicates a high-priority task requiring immediate attention, while a value of 0 signifies a low-priority task. Annotation and review tasks are displayed in the Label Editor in descending order of priority.
To update task priority:
1. Click the number representing the task’s current priority. This opens the priority setting interface for that task. Alternatively, select the task and click the Adjust priority button.
2. Adjust the task’s priority by using the slider for quick selection, or by entering a specific number between 0 and 100 in the input field.
3. Click the Update button to save the new priority. The task’s position in the task queue updates accordingly.
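The valid priority range can be enforced with a small helper. This is a sketch of the 0 to 100 rule described above, not an Encord SDK call:

```python
def set_priority(task: dict, priority: int) -> None:
    # Priorities must stay within 0-100; 50 is the default, 100 is most urgent.
    if not 0 <= priority <= 100:
        raise ValueError("priority must be between 0 and 100")
    task["priority"] = priority

task = {"name": "scan_a.dcm", "priority": 50}  # default priority
set_priority(task, 95)  # flag the task as high priority
```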
Queue (Annotators)
Annotators are presented with the following Queue tab, from which they can manage their annotations.
- A - The list of annotation stages shows how many data units each stage currently contains. If more than one stage is listed, clicking a stage lets you view the tasks it contains.
- B - Shows the task’s priority number. Tasks are listed in descending order of priority by default.
- C - The list of tasks / data units in your queue. Unassigned tasks are also visible and they can be initiated by all Annotators.
- D - The task status.
- E - The user a task is assigned to. A blank field indicates an unassigned task.
- F - Click the Initiate button next to a task to start annotating.
- G - The Start labeling button opens the Label Editor, starting with the highest priority task.
The Queue tab lists tasks in the same order they appear in the Label Editor.
Queue (Reviewers)
- A - The list of review stages shows how many data units each stage currently contains. If more than one stage is listed, clicking a stage lets you view the tasks it contains.
- B - Shows the task’s priority number. Tasks are listed in descending order of priority by default.
- C - The list of tasks / data units in your queue. Unassigned tasks are also visible and they can be initiated by all Reviewers.
- D - The task status.
- E - The user a task is assigned to. A blank field indicates an unassigned task.
- F - Click the Initiate button next to a task to start reviewing it.
- G - The Start reviewing button opens the Label Editor, starting with the highest priority task.
The Queue tab lists tasks in the same order they appear in the Label Editor.
Workflow tab
The Workflow tab lets you view and edit the Project’s Workflow. To begin editing the Workflow, click the Edit button on the canvas.
Labels & Export
The Labels tab is your gateway to auditing and exporting labels created in your project.
Role | Activity | Queue | Data | Instances |
---|---|---|---|---|
Annotator | ❌ | ✅ | ❌ | ❌ |
Reviewer | ❌ | ✅ | ❌ | ❌ |
Annotator + Reviewer | ❌ | ✅ | ❌ | ❌ |
Team Manager | ✅ | ✅ | ✅ | ✅ |
Admin | ✅ | ✅ | ✅ | ✅ |
The labels dashboard features the following tabs:
- Data: A complete overview of all tasks in the project, with the option to export your labels on a per-task basis.
- Label Instances: The Label Instances tab lets you use the unique instance identifier to search the project for a specific instance, and jump directly into the editor to confirm the status of an annotation visually.
Data tab
The Data tab provides a complete overview of all tasks in the project, and lets you see which workflow stage each task is in.
Export labels
Select the data units in the Data tab that you want to export labels for, and click the Export and save button. See our documentation on exporting labels for more information.
Save Label Version
Select the data units you want to save a label version for, and click the Save new version button. The new version is listed in the Saved versions tab.
Label versioning allows you to keep track of your label history over time by providing a snapshot of labels at a point in time. Label versions can be exported and analyzed to track annotation performance over time.
Label Instances
The Label Instances tab allows Administrators and Team Managers to search for specific instances within the data. An annotation instance refers to a unique occurrence of an ontology class in a data asset (e.g., ‘Person (0)’ for the first instance of a ‘Person’). Instances span multiple frames of a data asset, representing the same object. Use this tab to locate specific objects or classifications by their Identifier.
Instance identifiers are unique within a project and can be found in several ways:
- In the Label Editor: Click on an instance, then select Copy identifier from the instance action menu.
- In Exported Labels: Look for `objectHash` or `classificationHash` in the exported data.
- Using the SDK: Specify your own `objectHash` or `classificationHash` during label uploads.
Once you have an identifier, use the Search instance interface to filter and locate the specific instance. This is especially useful for visually confirming annotations linked to an identifier.
After finding your instance, click View in the Actions column to jump directly to the first annotation of that instance in the dataset.
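Searching exported labels for an identifier can also be done offline. A minimal sketch, assuming each exported row exposes an `objectHash` field as described above (the surrounding structure is illustrative, not the exact export schema):

```python
import json

# Illustrative rows only; real exports nest labels in richer structures.
exported = json.loads("""
[
  {"objectHash": "a1b2c3d4", "name": "Person", "frame": 12},
  {"objectHash": "e5f6a7b8", "name": "Car", "frame": 3}
]
""")

def find_instance(rows, identifier):
    # Collect every exported row belonging to the given instance identifier.
    return [row for row in rows if row.get("objectHash") == identifier]

matches = find_instance(exported, "a1b2c3d4")
```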
Saved Versions
The Saved versions tab displays information for versions of your labels. The Actions column lets you:
- Export label versions by clicking the Download version icon in the Actions column. The format of exported labels has the same structure as outlined in the export documentation.
- Delete label versions by clicking the Delete version icon in the Actions column.
Performance (Legacy)
This dashboard shows legacy analytics. For detailed analytics relating to Workflow Projects, see the Analytics section.
Performance - Summary
The Summary tab of the performance dashboard provides an overview of your team’s manual labeling and productivity.
Task actions over time
View the number of tasks in a project that have been approved, rejected, and submitted for review over a given period of time.
- The height of a bar represents the total number of tasks.
- The height of each color within a bar represents the number of approved, rejected, and submitted tasks.
- A: Set the time period you would like to see displayed by selecting a range of dates.
- B: The Hide days without any actions toggle removes all days on which no actions were taken from the view.
- C: Download a CSV file of the data.
- D: Display the data as a bar chart, or a table. While the chart provides a clear visual representation, the table provides exact figures for a more detailed picture of your team’s performance.
Instance Label actions over time
View the number of instance label actions in a project that have been approved, rejected, and submitted for review over a given period of time.
- A: Set the time period you would like to see displayed by selecting a range of dates.
- B: Download a CSV file of the data.
- C: Display the data as a bar chart, or a table. While the chart provides a clear visual representation, the table provides exact figures for a more detailed picture of your team’s performance.
Within your specified time period, you can choose which dates to display by using the slider located beneath the graph.
Team collaborators
The ‘Team collaborators’ section shows how long each project collaborator spent working on a given file.
A. ‘Data file’ displays session time collaborators spent working on individual files. ‘Project’ displays session time collaborators have spent working on the project.
B. Table entries can be filtered according to dates by clicking the range of dates, and selecting the start and end date of the period you would like to see table entries displayed for.
C. Table entries can be downloaded in CSV format by clicking the Download CSV button.
D. When many entries are present, they are split across multiple pages. The number of entries per page can be adjusted.
Performance - Details
The Details tab of the performance dashboard gives a more detailed view of your team’s labeling and productivity. The following details are displayed for Manual QA Projects.
Submissions chart
The submissions chart displays the number of submitted labels or instances over the specified time period. The chart can be filtered to show submissions for specific annotators or classes.
If you filter on both Annotators and Classes, the resulting chart shows the submission statistics for the selected annotators and the selected classes.
Reviews chart
The reviews chart displays the cumulative number of accepted and rejected labels or instances over the specified time period.
Annotators table
The Annotators table displays all the relevant statistics for all annotators in a Project. It can be filtered on classes to show annotator statistics only for the selected classes.
- User: The annotator’s email.
- Rejection rate: Percentage of their labels or instances that have been rejected in the review process.
- Submitted labels / instances: Number of labels or instances that the annotator has submitted for review.
- Repeated submissions are not counted.
- Accepted labels / instances: Number of labels or instances that the annotator created that passed the review process.
- Rejected labels / instances: Number of labels or instances that the annotator created that were rejected during the review process. Note that this can be higher than the number of submitted labels / instances, since a label or instance can be rejected multiple times during the review process while the submission is only logged once.
- Total session time: Time spent labeling.
Reviewers table
- User: The reviewer’s email.
- Rejection rate: Percentage of labels or instances that they rejected in the review process.
- Accepted labels / instances: Number of labels or instances that the reviewer accepted.
- Rejected labels / instances: Number of labels or instances that the reviewer rejected.
- Total session time: Time spent reviewing.
Objects and classifications table
Each row in the objects and classifications table can be expanded to show statistics on attributes.
- Class: The class name.
- Rejection rate: Percentage of labels or instances rejected in the review process.
- Reviewed labels / instances: Number of labels or instances of the class that have gone through the review process.
- Accepted labels / instances: Number of labels or instances of the class that have passed the review process.
- Rejected labels / instances: Number of labels or instances of the class that failed the review process.
- Avg. time to annotate: Average time spent annotating this class.
Analytics
The Analytics tab of your Project shows event-based analytics for your Project’s tasks, labels, and users. The Analytics tab has the following views:
- Tasks: View analytics on specific tasks in your Project.
- Labels: View analytics of labels in your Project.
- Collaborators: View collaborator performance in your Project.
- Issues: View analytics relating to issues in your Project.
The analytics available on the Analytics dashboard vary based on user roles:
- Admins and Team Managers have access to the Tasks, Labels, and Collaborators views, offering a comprehensive overview of team performance.
- Annotators, Reviewers, and Annotator + Reviewer roles can only view the Task Actions, Time Spent, and Label Actions tables, limited to their individual contributions.
Actions performed in the Label Editor or using the SDK are only tracked on the Analytics dashboard if the tasks are actioned (submitted, rejected, or approved).
Project Analytics have the following filters:
Filter | Description |
---|---|
Collaborators | Filter by specific Project collaborators. |
Datasets | Filter tasks based on the dataset they belong to. |
File name includes | Filter by file name. Regex is supported, allowing filtering by prefix or infix in the file title. |
Event source | Filter by the source of the event. Either SDK or UI. |
Workflow stage | Filter tasks based on their current stage in the Workflow. |
Class | Filter by Ontology class. |
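Because the File name includes filter supports regex, a prefix or infix match behaves like Python's `re.search`. A small illustration with hypothetical file names:

```python
import re

file_names = ["scan_001.dcm", "scan_002.dcm", "photo_01.jpg"]

# Prefix filter: file names starting with "scan_".
prefix = re.compile(r"^scan_")
prefix_matches = [n for n in file_names if prefix.search(n)]

# Infix filter: file names containing "_01" anywhere in the title.
infix = re.compile(r"_01")
infix_matches = [n for n in file_names if infix.search(n)]
```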
Tasks
The Tasks tab of the Analytics dashboard provides a detailed overview of task throughput, stage efficiency, and task timers. It can help answer questions such as:
- How productive was each collaborator in terms of labeling and reviewing tasks over the last week?
  - By filtering the data by Collaborators and Date time range, you can see how many tasks each team member worked on and how much time they spent on labeling and reviewing.
- Which Dataset has the most labels added, edited, or deleted in a given Workflow stage?
  - You can filter by Datasets and Workflow stage to see which Dataset is being worked on the most and how many labels are being modified at each stage of the process.
The following charts are visible in the Tasks view:
Chart | Description |
---|---|
Task actions | Displays the number of annotation tasks submitted, skipped, review tasks approved, and review tasks rejected over a selected date range. |
Time spent | Shows the total time spent actively working in the Label Editor, providing insights into productivity. |
Task performance | Shows the actions taken for all tasks in the Project. |
The Task performance table includes the following columns:
- File name: The name of the file associated with the task.
- Data type: The type of file used in the task (example: image, video, DICOM).
- Dataset: The Dataset from which the task originates.
- Total time: The total time spent on the task.
- Total created labels: The number of labels created in the task.
- Total edited labels: The number of labels that were modified in the task.
- Total deleted labels: The number of labels that were removed in the task.
- Time spent: The amount of time spent in a specific Workflow stage.
- Collaborators: The users who contributed to a specific Workflow stage.
- Created labels: The number of labels created in a specific Workflow stage.
- Edited labels: The number of labels modified in a specific Workflow stage. In review stages, this refers to Edit Review.
- Deleted labels: The number of labels removed in a specific Workflow stage.
Labels
Label analytics are not supported for Text and Audio modalities.
The Labels tab in the Analytics dashboard provides a detailed overview of your team’s labeling, reviewing, and task productivity, including time spent. It can help answer questions such as:
- How many labels were submitted, approved, or rejected by the team over a given period?
  - Use the Label actions chart and apply the Date time range filter to view the total number of labels submitted, approved, or rejected within the selected time frame.
- What actions have been taken for specific objects and classifications in the Ontology?
  - Refer to the Objects and classifications actions chart and expand classifications within the Ontology column to see detailed statistics for each classification answer.
- How does the team’s productivity in labeling compare across different objects or classifications? Do certain objects take more time to label than others?
  - Analyze the Created, Approved, and Rejected columns in the Objects and classifications actions table to identify objects or classifications that might require additional review or clarification using their Rejection rate.
  - Compare the average time spent per object or classification by using time-tracking metrics alongside these productivity statistics.
Chart | Description |
---|---|
Label actions | Displays the number of labels submitted, approved, or rejected. |
Objects and classifications actions | Shows the actions taken for all objects and classifications in the Ontology. |
The Objects and classifications actions table includes the following columns:
- Ontology: Represents the Ontology class, encompassing both objects and classifications. For classifications, you can expand to view statistics for each classification answer.
- Created: Displays the total number of instances created for this Ontology class. Each instance is counted only once, ensuring that resubmissions of the same label are not double-counted.
- Approved: Displays the total number of instances of this Ontology class that have been approved. Approvals are counted multiple times if a label is approved in multiple review stages or if the task is reopened and reviewed again. Use stage-specific filters to see approvals per review stage.
- Rejected: Displays the number of instances of this Ontology class that have been rejected. Rejections are double-counted if a label is rejected in multiple review stages, rejected again within the same stage, or if the task is reopened and rejected again. Use stage-specific filters to see rejections per review stage.
- Rejection Rate: Calculates the rejection rate percentage of the given Ontology class by dividing the number of rejected labels by the total number of reviewed labels.
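The Rejection Rate column described above reduces to a single division. A minimal sketch of that calculation; the zero-reviews behavior is an assumption for illustration:

```python
def rejection_rate(rejected: int, reviewed: int) -> float:
    # Rejection rate = rejected labels / total reviewed labels, as a percentage.
    if reviewed == 0:
        return 0.0  # assumption: an unreviewed class shows a 0% rate
    return 100.0 * rejected / reviewed

rate = rejection_rate(rejected=5, reviewed=25)  # 20.0
```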
Collaborators
The Collaborators tab in the Analytics dashboard provides a detailed view of the time Project collaborators spend annotating and reviewing. It can help answer questions such as:
- How much time did each collaborator spend on annotation / review tasks?
  - Use the Time spent chart to see the time distribution for each collaborator across annotation and review tasks. The Annotators and Reviewers tables provide total and average times for each collaborator.
- Which collaborator spent the most time annotating or reviewing tasks in the Project?
  - Analyze the Time spent chart to identify the collaborator with the highest time allocation.
Chart | Description |
---|---|
Time spent | Displays the distribution of time spent per collaborator per day |
Annotators | Table view of all relevant task/label actions and timers for each collaborator in the Annotation stages |
Reviewers | Table view of all relevant task/label actions and timers for each collaborator in the Review stages |
- Labels refer to objects and classifications.
- Approve actions are double counted if there are multiple review stages.
- Review actions are double counted if multiple review stages are present or tasks get rejected again in the same review stage.
Both tables and all CSV exports are filter-sensitive; they only display information within the selected filter conditions.
The Annotators and Reviewers table columns vary depending on whether the Instance labels or Frame labels view is selected. They include the following columns.
Instance Labels
The Annotators table includes the following columns:
- Submitted tasks: Total tasks submitted by the annotator.
- Skipped tasks: Total tasks skipped by the annotator.
- Approved tasks: Tasks submitted by the annotator that were approved in subsequent review stages.
- Rejected tasks: Tasks submitted by the annotator that were rejected during review.
- Task rejection rate: Percentage of the annotator’s submitted tasks that were rejected. If multiple review stages are present, use workflow filters to view stage-specific rejections.
- Created labels: Total new labels submitted by the annotator. Includes any pre-labels imported by admins using the SDK.
- Edited labels: Total existing labels edited by the annotator. This includes pre-labels from an Agent stage, or labels from a previous Annotate stage. Vertex / coordinate changes are not tracked.
- Deleted labels: Total existing labels deleted by the annotator. This includes pre-labels from an Agent stage, or labels from a previous Annotate stage.
- Approved labels: Labels submitted by the annotator that were approved during review.
- Rejected labels: Labels submitted by the annotator that were rejected during review.
- Label rejection rate: Percentage of the annotator’s submitted labels that were rejected during review.
- Total annotation time: Total active time spent annotating in the Label Editor, rounded to the nearest second.
- Avg time per task: Average time spent on each submitted annotation task. Calculated using the total active time spent in the Annotate stage divided by the number of submitted tasks.
- Issue actions against user: The number of Issue actions taken against the user.
The Reviewers table includes the following columns:
- Approved tasks: Number of tasks approved by the reviewer.
- Rejected tasks: Number of tasks rejected by the reviewer.
- Task rejection rate: Percentage of reviewed tasks that were rejected by the reviewer.
- Created labels: Total labels created by the reviewer using Edit Review.
- Edited labels: Total labels edited by the reviewer using Edit Review.
- Deleted labels: Total labels deleted by the reviewer using Edit Review.
- Approved labels: Number of labels approved by the reviewer.
- Rejected labels: Number of labels rejected by the reviewer.
- Total review time: Total active time spent reviewing in the Label Editor, rounded to the nearest second.
- Avg time per label: Average time spent on each reviewed label. Calculated using the total active time spent in the Review stage divided by the number of reviewed labels.
- Avg time per task: Average time spent on each actioned review task. Calculated using the total active time spent in the Review stage divided by the number of actioned reviews.
- Issue actions by user: The number of Issue actions taken by the user.
Frame Labels
The Annotators table includes the following columns:
- Submitted tasks: Total tasks submitted by the annotator.
- Skipped tasks: Total tasks skipped by the annotator.
- Approved tasks: Tasks submitted by the annotator that were approved in subsequent review stages.
- Rejected tasks: Tasks submitted by the annotator that were rejected during review.
- Task rejection rate: Percentage of the annotator’s submitted tasks that were rejected. If multiple review stages are present, use workflow filters to view stage-specific rejections.
- Created frame labels: Total new labels submitted by the annotator. Includes any pre-labels imported by admins using the SDK.
- Edited frame labels: Total existing labels edited by the annotator. This includes pre-labels from an Agent stage, or labels from a previous Annotate stage. Vertex / coordinate changes are not tracked.
- Deleted frame labels: Total existing labels deleted by the annotator. This includes pre-labels from an Agent stage, or labels from a previous Annotate stage.
- Approved frame labels: Labels submitted by the annotator that were approved during review.
- Rejected frame labels: Labels submitted by the annotator that were rejected during review.
- Frame label rejection rate: Percentage of the annotator’s submitted labels that were rejected during review.
- Total annotation time: Total active time spent annotating in the Label Editor, rounded to the nearest second.
- Avg time per task: Average time spent on each submitted annotation task. Calculated using the total active time spent in the Annotate stage divided by the number of submitted tasks.
The Reviewers table includes the following columns:
- Approved tasks: Number of tasks approved by the reviewer.
- Rejected tasks: Number of tasks rejected by the reviewer.
- Task rejection rate: Percentage of reviewed tasks that were rejected by the reviewer.
- Created frame labels: Total labels created by the reviewer using Edit Review.
- Edited frame labels: Total labels edited by the reviewer using Edit Review.
- Deleted frame labels: Total labels deleted by the reviewer using Edit Review.
- Approved frame labels: Number of labels approved by the reviewer.
- Rejected frame labels: Number of labels rejected by the reviewer.
- Total review time: Total active time spent reviewing in the Label Editor, rounded to the nearest second.
- Avg time per frame label: Average time spent on each reviewed label. Calculated using the total active time spent in the Review stage divided by the number of reviewed frame labels.
- Avg time per task: Average time spent on each actioned review task. Calculated using the total active time spent in the Review stage divided by the number of actioned reviews.
You can choose which columns should be displayed and included in the CSV export using the Columns selector.
Issues
The Issues tab in the Analytics dashboard provides a detailed view of all issues created in the Project.
Chart | Description |
---|---|
Issue actions | Shows the number of issues created, resolved, and opened. |
Issue tag occurrence | Shows the count of all issue tags used within the chosen timeframe. |
Issue tag occurrence by class | Shows the count of issue tags broken down by Ontology class. |
The Issue actions chart shows the following:
- Total issues created: The total number of issues created.
- Total issues resolved: The number of issues marked as resolved.
- Total issues reopened: The number of issues that were reopened.
Example - How to Understand Issue Analytics:
If your Issue actions chart shows:
- Total issues created = 19
- Total issues resolved = 20
- Total issues reopened = 2
This means that:
- 19 issues were created.
- All 19 issues were resolved.
- 2 resolved issues were reopened. (This includes manually reopening an issue and re-rejections of labels)
- 1 of the reopened issues was resolved.
- There is one remaining open issue.
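The arithmetic in the example above can be checked directly: each resolve closes one issue, and each reopen (including a re-rejection of a label) opens one again, so open issues = created - resolved + reopened.

```python
# Counters as shown in the Issue actions chart example above.
created, resolved, reopened = 19, 20, 2

# Each resolve closes one issue; each reopen opens one again.
open_issues = created - resolved + reopened  # 19 - 20 + 2 = 1
```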
Join Projects in your Org
Organization Admins can search for and join any Projects that exist within the Organization. This includes Projects that contain no collaborators.
- Navigate to Projects under the Annotate heading in the Encord platform.
- Select the All Encord projects tab.
- Find the Project you want to join.
- Click Join project to join the Project.
When an Organization Admin joins a Project, they are automatically assigned the Admin user role for that Project.