Teams from Gallaudet and the Rochester Institute of Technology/National Technical Institute for the Deaf (RIT/NTID) were recently awarded a collaborative EArly-concept Grants for Exploratory Research (EAGER) grant from the National Science Foundation (NSF) to conduct research around prosody in American Sign Language (ASL).
Non-manual markers, pauses, pacing, and other elements of ASL are collectively known as prosody. Prosody is responsible for, among other things, marking prominence, regulating turn-taking, and managing disfluency (breaks or disruptions that occur in the flow of language production). As the glue that makes ASL cohesive and coherent, prosody merits further study, yet a gap remains, partly because capturing and annotating prosodic elements is a time-consuming process requiring substantial expertise.

The NSF’s EAGER award is an exciting funding mechanism; NSF typically describes these grants as “high risk-high payoff,” since projects may involve novel approaches, expertise, and perspectives that involve untested or exploratory work. Projects last for up to two years and are awarded up to $300,000.
Both Gallaudet and RIT/NTID are home to large, diverse deaf ASL communities, and this project brings together a team of scientists with complementary strengths in sign language research, psycholinguistics, computational linguistics, and accessibility to address this critical resource gap.
The Gallaudet team consists of three faculty members: Principal Investigator Raja Kushalnagar, who is also a Professor and Co-Director of the Accessible Human-Centered Computing (AHCC) program; Patrick Boudreault, Associate Professor in the ASL program; and James Waller, Assistant Professor in the Psychology and AHCC programs. They will hire and mentor graduate and undergraduate research assistants, who will contribute to the project. Gallaudet’s researchers will collaborate with RIT faculty Cecilia Alm and Allison Fitch, who are affiliated with RIT’s Ph.D. program in Cognitive Science.
The team will capture understudied ASL attributes and variation, representative of the diversity of ASL. They will also prepare and empirically assess annotation conventions, leveraging theory-driven concepts from signed language. Kushalnagar says they will be developing annotation practices that are “culturally appropriate and scientifically robust.”
Once released, the team expects the annotated corpus to serve as a valuable resource for the continued study of ASL. For example, it will enable language scientists to investigate research questions about how ASL signers convey prosody visually to express grammatical or emotional functions, which matters both for understanding ASL interaction and for building ASL communication technologies. Accessible teaching modules and assignments with instructor guides will also be created for linguistics and accessibility-related courses. The project will also train next-generation researchers in ASL resource creation and sign language research.
This project, “Visual Prosody Annotation in a Sign Language Corpus,” is funded by NSF EAGER Award #2429900.