How to train your brain-controlled drone

Wayne Gillam

With help from the Center for Sensorimotor Neural Engineering and its Industry Affiliate, Advanced Brain Monitoring, a team of University of Washington Electrical Engineering students designed a drone that is guided solely by brain signals.

I suppose because it was black and hovered in the air, the brain-controlled drone designed and demonstrated earlier this summer by a team of University of Washington Electrical Engineering (UWEE) students reminded me vaguely of Toothless, from the popular DreamWorks movie, “How to Train Your Dragon.” In the movie, which I saw a couple of days before the drone demonstration, Toothless required a prosthetic tail-fin operated by his friend, Hiccup, in order to fly properly. This UWEE student-modified drone admittedly didn’t look quite like Toothless, but just like that famous dragon, it required a little guidance to fly in the right direction.

In the case of this drone, that guidance was provided solely by brain signals, collected by an EEG (electroencephalogram) cap and processed by recording equipment provided to the students by Advanced Brain Monitoring (ABM), a Center for Sensorimotor Neural Engineering (CSNE) industry affiliate. Technically speaking, the drone was controlled using steady-state visually evoked potentials, but that statement deserves some unpacking to make it easier to understand.

The drone is guided solely by the user (wearing an EEG cap), who chooses to look at one of five different lights on a computer monitor. Each light flashes at a different frequency and has a unique, pre-assigned numerical value that corresponds to navigational guidance for the drone, telling it to fly left, right, forward, up or down. The brain naturally mirrors (or evokes a response at) the frequency of whichever light the user happens to be looking at, so when the brain mirrors the frequency of a particular flashing light, the EEG cap picks up this signal frequency and relays the corresponding, pre-assigned numerical value through processing software to the drone, telling it which way to fly. View the UWEE 448-449 System Controls and Robotics Capstone Project poster for a more complete explanation of how this works.
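The core of that pipeline (detecting which flicker frequency dominates the EEG and mapping it to a flight command) can be sketched in a few lines of Python. This is a minimal illustration, not the team's actual implementation: the stimulus frequencies, the command mapping and the simple FFT peak-picking classifier are all assumptions for demonstration purposes; real SSVEP systems typically use more robust methods and the students relied on ABM's proprietary processing software.

```python
import numpy as np

# Hypothetical stimulus frequencies (Hz) mapped to drone commands.
# The actual frequencies used by the UWEE team are not given in the article.
COMMANDS = {7.0: "left", 9.0: "right", 11.0: "forward", 13.0: "up", 15.0: "down"}

def classify_ssvep(signal, fs, freqs=tuple(COMMANDS)):
    """Return the candidate stimulus frequency with the most spectral power."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2          # power spectrum
    bins = np.fft.rfftfreq(len(signal), d=1.0 / fs)      # bin frequencies (Hz)
    # Power at the FFT bin nearest each candidate stimulus frequency
    power = {f: spectrum[np.argmin(np.abs(bins - f))] for f in freqs}
    return max(power, key=power.get)

if __name__ == "__main__":
    fs = 256                        # Hz, a common EEG sampling rate
    t = np.arange(0, 4, 1.0 / fs)   # a 4-second analysis window
    # Simulated EEG: a strong 11 Hz component (as if the user were gazing
    # at the "forward" light) buried in broadband noise.
    rng = np.random.default_rng(0)
    eeg = np.sin(2 * np.pi * 11.0 * t) + 0.8 * rng.standard_normal(t.size)
    print(COMMANDS[classify_ssvep(eeg, fs)])  # prints "forward"
```

In a live system this classification would run continuously on a sliding window of EEG samples, which is exactly the streaming requirement Magbagbeola describes below.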

As straightforward as it may sound, putting together a project like this required a lot of creativity, research and hard work. Designing an innovative engineering device (in this case, modifying a 3DR Solo remote-controlled drone to receive navigational guidance from brain signals rather than from a joystick) was the chief aim of the students' UWEE Spring Quarter 448-449 System Controls and Robotics Capstone Project, and it wasn't easy to do. According to Reni Magbagbeola and Sam Kinn, CSNE students, recent University of Washington (UW) graduates and part of the four-person UWEE student team who built the brain-controlled drone, support from the CSNE and ABM was critical to their success.

"When we went looking for EEG equipment, our first stop was here at the CSNE. I asked Raj [Rao, director of the CSNE] if he knew anywhere we could find some [EEG] equipment, and he lent us the actual equipment we used," Magbagbeola said. "I was initially having trouble trying to retrieve and modify the data, while the equipment was recording. Normally you have to record first, then stop to analyze the data, but we needed it to be a continuous signal. It was difficult to figure out. I asked people here at the CSNE how to do that, and they pointed me in the right direction."

ABM played a crucial role in the project's success by supplying the EEG equipment Dr. Raj Rao lent to Magbagbeola, Kinn and their project team. ABM also provided the proprietary software that processed brain signals collected by the EEG cap, and offered guidance and advice to the students.

"We needed software for multiple devices, and [ABM] offered that software to us for free, which was very generous of them, because it was not cheap," Kinn said.

Although their brain-controlled drone certainly was innovative enough to earn a good grade, Magbagbeola, Kinn and the other students on their project team actually had goals outside the classroom in mind when designing the drone. Neurological disorders, such as those resulting from spinal cord injury and stroke, can leave people without the use of their hands, arms and legs. Developing brain-computer interface devices, such as the brain-controlled drone, may help those with severe injuries retain independence.

"We considered it, for example, for someone who's paraplegic, who wanted to get something across the room. [The brain-controlled drone] is something they could just send across the room, and use to retrieve an object," Magbagbeola said.

Designing brain-controlled devices for use in the real world might still be a ways off, but working on UWEE projects like this, coupled with their academic background developed at the CSNE, has prepared the students to address challenges inherent to developing brain-controlled devices.

"In my future, I'd like to work on devices that are related or dependent on neural control. This project and the class I took at the CSNE [in neural engineering], together gave me a foundation of knowledge for how to build these sorts of devices. It gives me the confidence that I can build them, and it teaches me where I need to look [and], what I need to learn in order to build the kind of behavior that we want in the device," Kinn said. "That's what I've gotten out of [the CSNE neural engineering class] the most, that high-level idea of what it takes to build some of these things, things we can do to make them better and where I could explore further in terms of research."

For Magbagbeola, her work on the brain-controlled drone project connected well to her studies at the CSNE and her research work in Dr. Chet Moritz's lab, where she is working to determine how proprioception, an awareness of your body in space, is encoded in the brain. Magbagbeola has a strong interest in sensorimotor applications of prosthetics, so she is applying a biomechanical model to her research work, which is aimed at helping people with prosthetic arms receive sensory information through their engineered limb. Magbagbeola was also a member of a student team, VertiGone, that won the 2015 Tech Sandbox Competition, in which her team designed a neural-engineered system that uses auditory cues to detect when a person experiencing vestibular disequilibrium begins to feel dizzy and then helps to reorient them.

"Even with the Tech Sandbox course, it was really about using the senses to try and help these prosthetics, these external devices. [The Tech Sandbox course] was just taking auditory cues and helping your brain associate those with [the ability to stand upright and not fall from dizziness]. So, with a project like [the brain-controlled drone], which is using visual cues to help someone reach for something, purely based on what they are seeing or thinking, that nicely ties into what I'm interested in," Magbagbeola said. "It adds another layer to my understanding of how we can use sensorimotor applications to actually help people."

Both Magbagbeola and Kinn plan to continue their neural engineering studies post-graduation. Magbagbeola is returning in the fall to her hometown of Cambridge, England, and she recently was accepted into the Master's program for Robotics and Computation at University College London. Kinn was awarded the Washington Research Foundation Innovation Fellowship in Neural Engineering, through which he is continuing his post-graduate research work. After the fellowship is complete, he plans to take a year off from studies and then later pursue his Ph.D. in engineering with an emphasis in communications, control and signal processing, and of course, neural engineering. Both students said that the UW and the CSNE effectively prepared them for their future studies and careers. In particular, they agreed that a key element their education has given them was the confidence to believe they could successfully design brain-computer interfaces that could help people.

As Kinn stated, "Having these resources here and being able to take classes at the CSNE helped make us think that it was actually possible we could do this [design a brain-controlled drone]. It sounds sort of fantastic, if you've never heard of this stuff before, and it sounds a little hard to believe that a bunch of undergrads could actually do something like this [modify the drone]. Having worked and talked with people in the CSNE who have done research in this area already made it a lot more feasible in our minds, so we had confidence we could actually get it done."

The 2016 UWEE Spring Quarter 448-449 System Controls and Robotics Capstone Project team participants include Sam Kinn, Michael Schober, Reni Magbagbeola and Jon Lundlee. For more information about this project and/or the CSNE, contact CSNE member and UWEE professor Dr. Howard Chizeck.