
Enable Reassignment of Completed Annotation Tasks for Review and Correction #7102

Open
Vsafi opened this issue Feb 18, 2025 · 3 comments
@Vsafi

Vsafi commented Feb 18, 2025

Is your feature request related to a problem? Please describe.
The annotation team used Label Studio to annotate the data. However, when another team reviewed the annotations, they found errors. Now, I need the annotators to correct these mistakes. The problem is that I can't put the completed tasks back into the task queue for reannotation.

Describe the solution you'd like
I need a way to reassign completed annotation tasks in Label Studio so that annotators can review and correct errors. Ideally, I want a solution that allows me to:

Identify Incorrect Annotations – Easily filter or tag tasks that need correction based on feedback from the review team.
Reassign Tasks – Move these tasks back into the queue so annotators can work on them again.
Track Progress – Keep a clear record of corrected tasks and ensure all necessary fixes are completed.

Describe alternatives you've considered
Right now things are messy. The review team checks tasks one by one, and task IDs are assigned randomly, so annotators also end up seeing tasks that are already done.

Additional context
Add any other context or screenshots about the feature request here.

@heidi-humansignal
Collaborator

Hello,

Thank you for contacting Label Studio.

I understand that after review, errors are found and you'd like an efficient way to bring those tasks back into the queue so that annotators can correct them while preserving the original annotation history. Currently, in Label Studio Enterprise there is a setting under the Review options (for example, "Requeue rejected tasks back to annotators"). When you mark an annotation as rejected during review, this option will reassign the task automatically rather than marking it as completed. Adjusting your review workflow so that tasks flagged for correction are rejected (instead of immediately accepted) will keep them available in the queue for reannotation.

More details can be found in our release notes here: Label Studio Enterprise 2.15.0 Release Notes

Also, if you require a more automated solution, you can use the Label Studio SDK to programmatically identify tasks that need rework and reassign them. For instance, if you add metadata or tags to tasks that need correction, you can filter those tasks and then reassign them. An example snippet might look like this:

```python
from label_studio_sdk import Client

# Initialize the Label Studio client
ls = Client(url='https://your-label-studio-url.com', api_key='your-api-key')
project = ls.get_project(YOUR_PROJECT_ID)

# Retrieve tasks (adjust this call to match how you determine a task needs correction)
tasks = project.get_tasks()

# Example filter: identify tasks that have a flag (e.g., "needs_correction") in their metadata
tasks_to_reassign = [t for t in tasks if t.get('meta', {}).get('needs_correction')]

for task in tasks_to_reassign:
    # Here you could implement logic to reassign the task,
    # e.g., update the assignment via the SDK or API
    print(f"Reassigning task {task['id']} for corrections")
```

Replace the placeholders with your URL, API key, project ID, and task criteria. This approach allows you to keep a clear record and even further automate tracking for corrections.
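To make the filtering step concrete without a running Label Studio instance, here is a small self-contained sketch of the same logic. The `needs_correction` flag, the `tasks_needing_correction` helper, and the sample task payloads are illustrative assumptions for this example, not part of the SDK; adapt them to whatever metadata your review team actually writes.

```python
def tasks_needing_correction(tasks):
    """Return the subset of task dicts whose metadata flags them for rework."""
    # Tasks may lack a "meta" key entirely, so default to an empty dict.
    return [t for t in tasks if t.get("meta", {}).get("needs_correction")]


# Sample payloads shaped like Label Studio task dicts (illustrative only).
sample_tasks = [
    {"id": 101, "meta": {"needs_correction": True}},
    {"id": 102, "meta": {}},
    {"id": 103},  # no "meta" key at all
]

for task in tasks_needing_correction(sample_tasks):
    print(f"Reassigning task {task['id']} for corrections")
```

Keeping the filter in a small pure function like this also makes it easy to unit-test your correction criteria before pointing the script at a live project.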

I hope this was helpful! Please feel free to reach out if you need further support.


@Vsafi
Author

Vsafi commented Feb 18, 2025

Understood. Thank you for the quick response.

@heidi-humansignal
Collaborator

You are welcome!

Please let us know if you have any questions! We are happy to assist you.

