New York, Feb 19: Researchers have developed a smartphone app that could help people speak the language of the eyes, literally. The app, built by researchers working with Microsoft, interprets eye gestures in real time, decodes them into predicted utterances, and facilitates communication.

Called GazeSpeak, the app is aimed at people with amyotrophic lateral sclerosis (ALS), a condition in which individuals gradually lose their strength and the ability to speak, eat or move. Working as part of the Enable team at Microsoft Research, the scientists developed GazeSpeak to help people with ALS who can move their eyes but cannot speak. ALS also causes other motor impairments that affect voluntary muscle movement.

According to the researchers, current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust in sunlight, and require frequent re-calibration and substantial, relatively immobile setups. Eye-gaze transfer (e-tran) boards, a low-tech alternative, are challenging to master and offer slow communication rates. (ians)
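To illustrate the idea of decoding eye gestures into predicted words, here is a minimal sketch in the spirit of group-based gaze keyboards, where each gaze direction selects a group of letters and a dictionary lookup proposes matching words. The four-group letter layout, the vocabulary, and all function names below are illustrative assumptions, not GazeSpeak's actual design, which the article does not detail.

```python
# Hypothetical sketch of group-based gesture decoding.
# Assumption: four gaze directions (0=up, 1=down, 2=left, 3=right),
# each selecting one fixed group of letters.
from collections import defaultdict

GROUPS = ["abcdef", "ghijkl", "mnopqr", "stuvwxyz"]

# Reverse map: letter -> index of the group containing it.
LETTER_TO_GROUP = {ch: i for i, group in enumerate(GROUPS) for ch in group}

def word_signature(word):
    """Map a word to the sequence of group indices that would spell it."""
    return tuple(LETTER_TO_GROUP[ch] for ch in word.lower())

def build_index(vocabulary):
    """Index a vocabulary by gesture signature for fast candidate lookup."""
    index = defaultdict(list)
    for word in vocabulary:
        index[word_signature(word)].append(word)
    return index

def decode(gestures, index):
    """Return candidate words whose signature matches the gesture sequence."""
    return index.get(tuple(gestures), [])

vocab = ["hello", "help", "water", "yes", "no"]
index = build_index(vocab)
# "no" spells out as group 2 ('mnopqr'), then group 2 again.
print(decode([2, 2], index))  # → ['no']
```

A real system would rank candidates with a language model and update predictions after every gesture rather than waiting for a full word; this sketch only shows the basic signature-matching step.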