This guide provides a step-by-step process for using the Amazon Mechanical Turk (MTurk) platform to carry out human annotation tasks. From setting up your requester account and creating annotation guidelines to managing data uploads and ensuring quality control through qualification tests, this document will help streamline your workflow and your interaction with workers on the platform.

Guide


1. Sign Up and Request Limit Increase


2. Understanding MTurk Modes

Sandbox Setup (for testing):
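
MTurk exposes two modes through two different API endpoints: the sandbox (free, for testing HITs before publishing) and production (real workers, real payments). As a minimal sketch of how the two modes differ in practice — assuming you use the AWS SDK for Python (boto3), and noting that the helper function `mturk_endpoint` here is illustrative, not part of any SDK:

```python
# Official MTurk requester API endpoints; the only difference between
# sandbox and production mode is which endpoint your client talks to.
SANDBOX_ENDPOINT = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"
PRODUCTION_ENDPOINT = "https://mturk-requester.us-east-1.amazonaws.com"


def mturk_endpoint(use_sandbox: bool) -> str:
    """Return the requester endpoint URL for the chosen mode (helper for illustration)."""
    return SANDBOX_ENDPOINT if use_sandbox else PRODUCTION_ENDPOINT


# With boto3 installed and AWS credentials configured, you would connect like:
#   import boto3
#   client = boto3.client(
#       "mturk",
#       region_name="us-east-1",
#       endpoint_url=mturk_endpoint(use_sandbox=True),  # flip to False to go live
#   )
#   print(client.get_account_balance())  # sandbox always reports $10,000.00

print(mturk_endpoint(use_sandbox=True))
```

Developing against the sandbox first lets you preview HITs from the worker side (via the worker sandbox site) without spending money; switching to production is then a one-line change of the endpoint URL.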


3. Writing Annotation Guidelines

Create clear guidelines using Google Slides

Example guideline (NL-EYE: Full Visual Common Sense): https://docs.google.com/presentation/d/1k-tu239Ihg21o8W2rdgFN4uLbP5VpBZi3wHeqMZ1Nzk/edit?usp=sharing