3 minute read
EEG and Student Innovation
A variety of neural imaging techniques are used to identify and diagnose neurological disorders such as epilepsy, sleep disorders, and strokes. These devices vary in cost, resolution, and accessibility. Magnetic resonance imaging (MRI) machines sit at one end of the range: the most effective, but also the most expensive and the most difficult to maintain. Electroencephalography (EEG) machines sit at the other end as the most affordable and portable, but usually at a lower resolution. Typical EEG machines often cannot produce highly detailed brain scans because the number of sensing electrodes attached to a patient's scalp, usually 10 to 40, is too low. For this reason, EEG is often passed over in favor of more invasive, risky, and expensive methods.

But Pulkit Grover, an assistant professor of electrical and computer engineering (ECE) at CMU, still has faith in EEG. His group's research predicts that current theories severely underestimate EEG's capacity for spatial resolution. They postulate that high-density EEG, with 64 to 256 electrodes, captures the signals passing through the human skull in far greater detail, yielding clearer images. Even more effective is ultra-high-density (UHD) EEG, with up to 1,000 electrodes. Of course, more electrodes bring new problems, such as figuring out how to attach all of them and how to get the most out of each one.

To address these problems, Grover's lab includes a group of instrumentation engineers led by post-doctoral researcher Ashwati Krishnan and doctoral candidate Ritesh Kumar, in collaboration with Shawn Kelly, a senior systems scientist in the Engineering Research Accelerator. Rounding out the team are students who are providing crucial insight into how to make EEG as accessible as possible for a variety of people.
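A back-of-the-envelope calculation shows why electrode count matters so much: spatial resolution is bounded by how finely the electrodes sample the scalp. The sketch below is purely illustrative and assumes a rough adult scalp area of about 700 cm²; the numbers are not from Grover's published analysis.

```python
import math

SCALP_AREA_CM2 = 700.0  # rough adult scalp area; an assumption for illustration

def electrode_spacing_cm(n_electrodes: int) -> float:
    """Approximate center-to-center spacing if n electrodes tile the scalp
    uniformly: each electrode 'owns' an area of A/n, so spacing ~ sqrt(A/n)."""
    return math.sqrt(SCALP_AREA_CM2 / n_electrodes)

# Standard, high-density, and ultra-high-density electrode counts
for n in (32, 256, 1000):
    print(f"{n:>5} electrodes -> ~{electrode_spacing_cm(n):.1f} cm spacing")
# 32 -> ~4.7 cm, 256 -> ~1.7 cm, 1000 -> ~0.8 cm
```

Under these assumptions, going from a standard montage to UHD EEG shrinks the sampling grid from a few centimeters to under a centimeter, which is the intuition behind the group's claim that denser arrays can recover much finer spatial detail.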
EEG FOR VIRTUAL REALITY
Shi Johnson-Bey is a master’s student in biomedical engineering whose research focuses on EEG sensing as an input method for virtual reality (VR) systems. This means that a user can interact with a virtual world without using hand-held controllers, an ideal situation for individuals who have limited mobility but intact mental capabilities, such as patients with amyotrophic lateral sclerosis (ALS). Johnson-Bey was a Journeyman Fellow for the U.S. Army Research Laboratory, both in the summer of 2017 and again for his final year of study. During his summer fellowship, Johnson-Bey learned about various brain-computer interfaces and created a P300 speller application that allowed users to type text using brain activity. He continued this research to develop a system in which users can interact with objects in VR settings through audio cues.

Within a virtual system, a user would see an environment or layout, such as a menu of options. In addition to wearing a VR headset, which immerses the user through sight and sound, the user would also be hooked up to EEG electrodes that connect to the VR system. As sounds play or icons flash next to menu items, the electrodes pick up a distinctive electrical signal from the brain whenever the user is paying close attention to an item, the reaction they have when they want to select it. Reading that signal, the VR system can then select that item, essentially reading the user’s mind, as sketched below.

This technology could open up the world of video games to those who previously did not have access, provide simulations for medical and emotional treatment, and lead to a better understanding of how neurological patients with low communication skills function. Johnson-Bey also gave the example of using this kind of technology as something of a universal remote.
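The selection step described above is, at its core, the classic P300 "oddball" idea: average the EEG epochs that follow each item's cue, and the attended item shows a characteristic positive deflection a few hundred milliseconds after its cue. The Python sketch below illustrates that logic only; the sampling rate, window, function names, and toy data are all assumptions, not details of Johnson-Bey's actual system.

```python
# Minimal sketch of P300-style menu selection from pre-epoched EEG.
# Hypothetical parameters throughout; not Johnson-Bey's implementation.
import numpy as np

FS = 250                     # assumed sampling rate (Hz)
P300_WINDOW = (0.25, 0.50)   # seconds after the cue where a P300 typically peaks

def p300_score(epochs: np.ndarray) -> float:
    """Average the epochs time-locked to one menu item's cue and return the
    mean amplitude in the P300 window. epochs: (n_trials, n_samples)."""
    evoked = epochs.mean(axis=0)  # averaging across trials suppresses noise
    start, stop = (int(t * FS) for t in P300_WINDOW)
    return float(evoked[start:stop].mean())

def select_item(epochs_per_item: dict[str, np.ndarray]) -> str:
    """Pick the menu item whose cue evoked the strongest P300-like response."""
    return max(epochs_per_item, key=lambda item: p300_score(epochs_per_item[item]))

# Toy usage: synthetic data where "lights" carries an injected P300-like bump,
# standing in for the item the user is attending to.
rng = np.random.default_rng(0)
n_trials, n_samples = 20, FS  # twenty 1-second epochs per item
items = {name: rng.normal(0.0, 1.0, (n_trials, n_samples))
         for name in ("tv", "lights", "thermostat")}
bump = np.exp(-0.5 * ((np.arange(n_samples) / FS - 0.35) / 0.05) ** 2)
items["lights"] += 2.0 * bump  # the attended item evokes a larger deflection

print(select_item(items))  # -> "lights"
```

The key design point is that no single trial is trusted: a P300 is small relative to background EEG, so the system repeats each cue and averages, letting the noise cancel while the attention-driven response stands out.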
“The brain is basically just a system that gives off electrical signals,” Shi Johnson-Bey explains. “Sending those signals to a device could allow someone to turn on the television or change the lights just by thinking.”