Polsley Builds Mind-Reading Technology
Published on March 20th, 2014 by Michelle Ward
Before this semester, Seth Polsley served as lead researcher, sole study participant, and lone financier on a yearlong multidisciplinary project. He struggled to do it all on a student’s budget.
That’s why the computer engineering senior let out a sigh of relief when he received a $1,000 award from the KU Center for Undergraduate Research. Polsley was one of only 50 students from across campus to have his project selected for funding.
“I have a lot more freedom in my research now,” Polsley said. “Before I had any dedicated funding, I was trying to do everything as inexpensively as possible, which was very restrictive. This process can be costly, so I think it’s great that KU offers these funding opportunities to students.”
For his honors research project, Polsley is developing a software-based system to record and interpret the nerve activity that precedes movement. His goal is to help improve brain- or muscle-activated technologies, such as exoskeletons and prosthetic limbs, for people with severe disabilities. The technology could also transform users’ interactions with computers, video games, and other devices.
The merging of mind and machine, known as the brain–computer interface, is driving the next generation of motion control technologies.
“I’ve always been interested in human-computer interfaces,” Polsley said. “The keyboard and mouse have been our primary interface for decades, and only recently have reliable voice-based interfaces become popular. I think using the body’s natural signals as an interface has the potential to make some very advanced technologies available and improve a lot of people’s lives.”
Polsley has built a scaled-down electromyogram (EMG) to capture the electrical activity of his muscles as he claps, waves, or performs other actions. Before any movement, his brain must send instructions in the form of electrical signals. The EMG circuit detects and amplifies that activity from his nervous system.
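Polsley has not published his processing code, but the software side of an EMG pipeline is often illustrated as rectifying the raw signal and smoothing it into an "envelope" that reveals when a muscle is active. The sketch below, using only simulated data and hypothetical function names, shows that idea; real EMG amplification happens in hardware before any software sees the signal.

```python
import random

def emg_envelope(samples, window=50):
    """Full-wave rectify a raw EMG trace and smooth it with a moving
    average. This is a common first software step before detecting
    muscle activation; it is illustrative, not Polsley's actual code."""
    rectified = [abs(s) for s in samples]
    envelope = []
    for i in range(len(rectified)):
        start = max(0, i - window + 1)
        chunk = rectified[start:i + 1]
        envelope.append(sum(chunk) / len(chunk))
    return envelope

def detect_activation(envelope, threshold):
    """Return the sample indices where the envelope crosses the threshold."""
    return [i for i, v in enumerate(envelope) if v >= threshold]

# Simulate a quiet baseline (low-amplitude noise) followed by a burst
# of muscle activity (higher-amplitude noise), 1000 samples each.
random.seed(0)
quiet = [random.gauss(0, 0.05) for _ in range(1000)]
active = [random.gauss(0, 0.5) for _ in range(1000)]
env = emg_envelope(quiet + active)
onsets = detect_activation(env, threshold=0.25)
print(onsets[0])  # first sample flagged as muscle activity
```

In this toy run the detector flags activity shortly after sample 1000, where the simulated burst begins; the time between a real nerve signal and that detection is exactly the latency Polsley hopes to exploit.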
A challenge has been making the system sensitive enough to collect the necessary data, Polsley says. He is on his third prototype. His previous attempts, whose components he bought with his own money, taught him that obstacles come in many forms. “It’s all been a learning process. I thought the hardest part of the work would be determining how to interpret the data, not acquiring the data.”
Polsley is testing whether the sensor system can work as an effective prediction tool. Currently, joysticks, sensors, and other external mechanisms are needed to initiate action, which delays movement. By responding to the original nerve signal, an EMG circuit can save critical time between thought and action, allowing robots to produce more fluid, natural movement.
Current robotic suits measure a person’s upper body movement and shifts in gravity through motion sensor networks. The state-of-the-art suit is considered to be the ReWalk exoskeleton system, developed by ARGO Medical Technologies. The 50-pound suit includes motorized joints, rechargeable batteries, motion sensors, and a computer-based control system.
Polsley wants to streamline the design by replacing motion sensors with mind-reading technology. A single EMG system would greatly reduce the $50,000 price tag as well. The sensors take advantage of the fact that years after a stroke or spinal cord injury, the brain still sends out instructions. These signals could form the basis of future robotic suits and prosthetics.
“Probably like most engineers, I’m always looking to the future of our field, and I think we’re reaching a point in computing where a number of new and amazing technologies are becoming possible. Though my current research isn’t necessarily new, I would like it to serve as a demonstration of just how much we can do with the technology we have today.”
Toward that goal, Polsley has been developing a computer program to process, filter, and interpret the signals, combining the older technology of the EMG with newer detection and interpretation tools. At its core is a neural network, software that, much like the human brain, learns to recognize patterns and relationships. By analyzing large amounts of data and learning the connections among them, the software can classify specific movements. The project aims to reliably predict intended movements, a first step toward more intuitive, flexible technology.
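The simplest neural network, a single-layer perceptron, gives a feel for how such a classifier learns from examples. The sketch below trains one to separate two hypothetical gestures from made-up signal features (the feature values, gesture names, and function names are all assumptions for illustration, not Polsley's method).

```python
import random

def train_perceptron(data, labels, epochs=20, lr=0.1):
    """Train a single-layer perceptron to separate two movement classes.
    Each example is a feature vector; the label is 1 or 0."""
    w = [0.0] * len(data[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # update weights only when the guess is wrong
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy features: (mean envelope amplitude, burst duration) for two
# invented gesture classes, e.g. "clap" (short, strong) vs "wave"
# (long, mild). Real features would come from recorded EMG data.
random.seed(1)
clap = [(random.uniform(0.6, 1.0), random.uniform(0.1, 0.3)) for _ in range(20)]
wave = [(random.uniform(0.1, 0.4), random.uniform(0.6, 1.0)) for _ in range(20)]
X = clap + wave
y = [1] * 20 + [0] * 20
w, b = train_perceptron(X, y)
print(classify(w, b, (0.9, 0.2)), classify(w, b, (0.2, 0.8)))
```

A practical system would use a multi-layer network and far richer features, but the principle is the same: show the software many labeled examples of each movement, and it learns a rule that predicts which movement a new signal represents.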
He will present his muscle sensing system at the Undergraduate Research Symposium on April 26. He plans to continue research on neural networks and artificial intelligence in graduate school.