This guide provides a step-by-step process for using the Amazon MTurk platform to carry out human annotation tasks. From setting up your requester account and creating annotation guidelines to managing data uploads and ensuring quality control through qualification tests, this document will help streamline your workflow and optimize your interaction with workers on the platform.
Guide
1. Sign Up and Request Limit Increase
- Sign up as a Requester on Amazon MTurk.
- 🕒 Request a limit increase for expenses early, as this process can take some time.
2. Understanding MTurk Modes
- Requester Mode: For creating and managing HITs (Human Intelligence Tasks).
- Worker Mode: Used by annotators to complete tasks.
Sandbox Setup (for testing):
- 🧑‍💻 Requester Sandbox: Publish HITs without payment, useful for testing.
- 👨‍🏭 Worker Sandbox: View tasks from the worker's perspective to preview how they will appear and be completed.
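The sandbox and production environments are selected purely by API endpoint. A minimal sketch (stdlib only; the endpoint URLs are from the MTurk developer documentation, and the `mturk_endpoint` helper name is our own) of switching between the two:

```python
# Endpoint URLs from the MTurk developer documentation.
MTURK_LIVE = "https://mturk-requester.us-east-1.amazonaws.com"
MTURK_SANDBOX = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"

def mturk_endpoint(use_sandbox: bool) -> str:
    """Return the MTurk API endpoint for the chosen mode."""
    return MTURK_SANDBOX if use_sandbox else MTURK_LIVE

# With boto3 (not imported here), the client would typically be created as:
#   client = boto3.client("mturk", region_name="us-east-1",
#                         endpoint_url=mturk_endpoint(use_sandbox=True))
print(mturk_endpoint(use_sandbox=True))
```

HITs published against the sandbox endpoint cost nothing and are visible at workersandbox.mturk.com, so you can test the full publish-and-annotate loop before spending budget.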
3. Writing Annotation Guidelines
Create clear guidelines using Google Slides.
Example: NL-EYE: Full Visual Common Sense - Guideline: https://docs.google.com/presentation/d/1k-tu239Ihg21o8W2rdgFN4uLbP5VpBZi3wHeqMZ1Nzk/edit?usp=sharing
- 📄 Write general rules and instructions.
- 📝 Provide 5 examples from your dataset with solutions.
- 👁 Gradually reveal the solution and the corresponding guidelines.
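Once the guideline slides are ready, the HIT itself can link to them from the annotation form. A minimal sketch, assuming an HTMLQuestion-based task (the XML schema namespace is MTurk's documented one; the `html_question` helper and the form body are illustrative, not a fixed API):

```python
# Namespace from MTurk's HTMLQuestion XML schema.
HTML_QUESTION_XSD = ("http://mechanicalturk.amazonaws.com/"
                     "AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd")

def html_question(body_html: str, frame_height: int = 600) -> str:
    """Wrap an HTML annotation form in the HTMLQuestion XML that
    MTurk's CreateHIT operation accepts as its Question parameter."""
    return (
        f'<HTMLQuestion xmlns="{HTML_QUESTION_XSD}">'
        f"<HTMLContent><![CDATA[{body_html}]]></HTMLContent>"
        f"<FrameHeight>{frame_height}</FrameHeight>"
        "</HTMLQuestion>"
    )

# Hypothetical form linking workers to the guideline deck above.
form = """<!DOCTYPE html>
<html><body>
  <p>Before annotating, read the
     <a href="https://docs.google.com/presentation/d/1k-tu239Ihg21o8W2rdgFN4uLbP5VpBZi3wHeqMZ1Nzk/edit?usp=sharing">
     annotation guideline</a>.</p>
  <!-- annotation inputs go here -->
</body></html>"""

question_xml = html_question(form)
```

The resulting `question_xml` string would then be passed as the `Question` argument of boto3's `create_hit` call, alongside the reward, title, and assignment settings.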