Presentation

65. Investigating the Latency of an AI Driven Facial Expression Therapy System as a Function of Illumination and Eyeglasses
Description: To summarize, this research presents a novel multimodal (visual/audio) interactive interface that teaches subjects to express and recognize emotions. It progresses from a base level, where users identify facial expressions, to levels requiring context-appropriate emotional expression. The interface aspires to empower and support autistic individuals on their emotional expression and recognition journey, offering them an accessible, enjoyable, and context-aware tool to enhance their skills and overall wellbeing.
Corresponding Author/Contributor
Event Type
Poster
Time: Thursday, September 12th, 5:30pm - 6:30pm MST
Location: McArthur Ballroom
Tracks
Aging
Augmented Cognition
Children's Issues
Communications
Cybersecurity
Education
Environmental Design
General Sessions
Human AI Robot Teaming (AI)
Macroergonomics
Occupational Ergonomics
Student Forum
Surface Transportation
Sustainability
System Development