Carnegie Mellon University and Meta Platforms Inc. announced a collaborative project today to make computer-based tasks accessible to individuals with different motor abilities.
Research conducted by Douglas Weber, a professor in the Department of Mechanical Engineering and the Neuroscience Institute, has shown that individuals with complete hand paralysis retain the ability to activate muscles in their forearms, even when those muscles are too weak to produce movement. Weber's team also found that individuals with spinal cord injuries still produce distinct muscle activity patterns when asked to attempt to move a specific finger.
The researchers will use surface electromyography (sEMG) prototypes developed by Meta, which detect and record the electrical activity produced by skeletal muscles.
"The research evaluated bypassing physical motion and relying instead on muscle signals," Weber said in a prepared statement. "If successful, this approach could make computers and other digital devices more accessible for people with physical disabilities."
The two organizations will test how individuals with paralysis can use this technology to interact with computers, both for everyday tasks and in digital and mixed reality video game environments.
"In the digital world, people with full or limited physical ability can be empowered to act virtually, using signals from their motor system," Dailyn Despradel, a CMU Ph.D. candidate, said. "In the case of mixed reality technology, we are creating simulated environments where users interact with objects and other users, regardless of motor abilities."
CMU has a history of research in disability and assistive technology, while Meta's Pittsburgh office focuses heavily on developing virtual reality and augmented reality technologies. CEO Mark Zuckerberg teased similar wearable technology on a podcast in May, saying that Meta had developed devices that could read signals from the nervous system.
"I'm not talking about something that jacks into your brain," Zuckerberg said in the interview. "I'm talking about something that you wear on your wrist that can basically read neural signals that your brain sends through your nerves to your hand to basically move in different subtle ways that are maybe not perceptible to people around you, but we're basically able to read those signals and be able to use that to control your [augmented reality] glasses or other computing devices."