I’m Hyunchul Lim, a PhD student at Cornell, conducting research on American Sign Language (ASL) using wearable devices.
We are looking for participants for the following study:
Study Title: Smart AR Glasses for ASL Translation with Facial Expressions
Purpose:
This study aims to collect ASL signs along with facial expressions—including grammatical markers, mouth morphemes, and emotional expressions—to develop AI-powered AR glasses for expressive ASL translation. Your participation will help improve our sign language recognition algorithms and ensure the system meets the needs of real users.
Study Details:
Duration: Up to 2 hours (depending on signing speed)
Compensation: $60 (a gift card of your choice via Tango; see the reward catalog: https://shorturl.at/NYanW)
Task: Sign a series of ASL sentences with varied facial expressions for data collection.
Please see the full details here: https://shorturl.at/lIYMI