Meet the brains behind McGill’s mind-controlled wheelchair

McGill students build a brain-powered wheelchair in just 30 days
From left to right: Jenisha Patel, Claudia Leung, Anna Brandenberger, Simon Tartakovsky, Danielle Nadin, Marley Xiong and Raffi Hotter at the Milo demonstration at Google offices last week

Marley Xiong and Jenisha Patel wanted to do a project together – any project.

“Well, as long as it involved brain computer interfaces,” said Xiong.

Naturally.

So they did, with the help of more than 30 friends and associates from McGill NeuroTech, students drawn from the Engineering, Arts and Science faculties.

Their brainchild, a smart wheelchair dubbed Milo, was born a mere 30 days later – Milo for mind-controlled locomotive. The project was their submission to the NeuroTechX student competition launched in February.

“One day it was just Jenisha and I throwing ideas around,” said Xiong, a 21-year-old student in Biology and Computer Science. “One week later, on Feb. 24, we got a team together. In the space of seven days, we became this group of 35 people, and we spent time thinking about what to build.

“On March 1, we got the [hardware, a stripped-down] wheelchair, and on March 31, we filmed the demo” with the finished product, a wheelchair controlled by electroencephalography – electrical brain activity – designed to provide more autonomy for people with severe disabilities, like paralysis.

“On the 31st, we woke up at 7 am hoping to God it would work,” said Xiong. “And it did. The last thing I expected was for the wheelchair to work.”

Not because she is a pessimist, but because of the logistical feat of putting together a complex futuristic machine like that from scratch in a month.

This is not the first time such a device has been built but it is an impressive achievement given the time-frame and the group’s collective experience.

Brain power

The chair’s movements are controlled by information it receives from the brain, said Raffi Hotter, a 19-year-old Math and Computer Science student. Four electrodes are connected to the back of the person’s head and brain waves – signals – send commands to the wheelchair to turn left or right. Sensors also detect people and obstacles and make the chair react accordingly by stopping, slowing or turning.
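The stop-slow-turn behaviour described above amounts to mapping sensor readings to actions. A minimal sketch of that idea, with illustrative distance thresholds that are assumptions rather than Milo's actual values:

```python
def react_to_obstacle(distance_m: float) -> str:
    """Map the nearest obstacle's distance (metres) to a wheelchair action.

    The 0.5 m and 1.5 m thresholds are illustrative assumptions,
    not the Milo team's real parameters.
    """
    if distance_m < 0.5:
        return "stop"      # obstacle too close: halt immediately
    elif distance_m < 1.5:
        return "slow"      # obstacle approaching: reduce speed
    else:
        return "continue"  # path clear: keep going

print(react_to_obstacle(0.3))   # stop
print(react_to_obstacle(1.0))   # slow
print(react_to_obstacle(3.0))   # continue
```

In a real controller this check would run continuously against live sensor data, overriding the brain-signal commands when safety requires it.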

Hotter had the initial task of designing the software, while Xiong developed the algorithms for detecting thought patterns.

When the person in the wheelchair clenches their jaw, the machine moves forward or stops.
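Jaw clenches produce a muscle-activity spike far larger than ordinary brain waves, so they can be detected with a simple amplitude check. A sketch of one plausible approach (the threshold and signal scale are assumptions, not the team's values):

```python
import numpy as np

def detect_jaw_clench(window: np.ndarray, threshold_uv: float = 100.0) -> bool:
    """Flag a jaw clench when the peak-to-peak amplitude of a short
    signal window (in microvolts) exceeds a threshold.

    The 100 uV threshold is an illustrative assumption.
    """
    return float(window.max() - window.min()) > threshold_uv

rng = np.random.default_rng(0)
rest = rng.normal(0, 10, 250)  # quiet, EEG-like background noise
clench = rest.copy()
clench[100:150] += 200.0       # simulated muscle-artifact burst

print(detect_jaw_clench(rest))    # False
print(detect_jaw_clench(clench))  # True
```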

“But also if the person is imagining moving left – just by imagining, we’re able to get a signal which makes the wheelchair turn left.”

“We got them imagining moving their right hand or left hand – what someone thinking left or right looks like in terms of the brain signal – and created a whole software suite from that,” said Hotter.

“We fed that info into an algorithm which was constructed so that we can tell it ‘Look, this data corresponds to the person thinking left or right.’ The algorithm is then able to come up with some representation of what turning left means in the brain.”
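The workflow Hotter describes is supervised classification: trials labelled "imagining left" or "imagining right" are reduced to features and fed to a learning algorithm. A hedged sketch of that pipeline on synthetic data (the electrode count matches the article; the feature choice, classifier, and simulated hemispheric power difference are all assumptions, not the team's actual method):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_trials, n_channels, n_samples = 200, 4, 250  # 4 electrodes, 1 s at 250 Hz

# Synthetic trials: motor imagery typically shifts signal power between
# hemispheres, so we simulate the two classes with asymmetric variance.
labels = rng.integers(0, 2, n_trials)          # 0 = "left", 1 = "right"
trials = rng.normal(0, 1, (n_trials, n_channels, n_samples))
trials[labels == 0, 0] *= 2.0                  # "left": more power on channel 0
trials[labels == 1, 1] *= 2.0                  # "right": more power on channel 1

# Feature per channel: log signal power (variance of the window).
features = np.log(trials.var(axis=2))

# Train on the first 150 trials, evaluate on the held-out 50.
clf = LogisticRegression().fit(features[:150], labels[:150])
accuracy = clf.score(features[150:], labels[150:])
print(f"held-out accuracy: {accuracy:.2f}")
```

The classifier ends up with "some representation of what turning left means in the brain", in Hotter's phrase: a decision boundary over the power features that separates the two imagined movements.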

In short, it translates a thought pattern into a physical action.

“That’s pretty cool,” said Xiong. “We love watching it work.”

“It’s almost science fiction,” Hotter said.

Xiong said that “you can’t simply imagine an apple or coffee and emit a specific wave. But with motor imagery, it’s different. It’s a very robust signal.”

Translating thoughts into physical commands

Typically, a project of this magnitude would take several months, said Hotter, who added features such as sending a text message with a precise location to the wheelchair user’s caregiver in case of high stress.

“So many things could have gone wrong,” said Xiong. “First, the wheelchair has to do whatever the software tells it to do; the software has to apply the data in a regulated way, so it has to be good; the electrodes have to be attached correctly; and finally, given that all this works, the brain has to work.”

In addition to overseeing the whole project as team leader, Xiong led the machine learning team, which was in charge of translating thought patterns to a physical command.

“It was genuinely an average of 50 hours a week,” said Xiong. “To be honest, I didn’t go to classes because of the project.”

The team presented the wheelchair at Google last week. And McGill professors have expressed interest in a collaboration to do further empirical research in connection with the wheelchair.

“It was just a great culture within the team,” said Xiong. “This was a project we had complete ownership of. If you don’t learn, there’s no project. You’re forced into this new learning environment. That’s what made it so much fun.”

Watch the video below to learn more about Milo