Machines That Read Your Brain Waves
Sometimes a technology that's been simmering in the laboratory or the clinic for decades makes the leap to mainstream consumption almost overnight.
Take the cavity magnetron. The precursor to this curious form of vacuum tube was invented at General Electric around 1920. It wasn't until 1940 that British scientists found a magnetron design that could pump out microwave energy at unprecedented power. That discovery fueled a crash program at the Massachusetts Institute of Technology to build airborne radar units, an advance that helped the Allies turn back Nazi Germany in Europe. The conflict had barely ended when a Raytheon engineer noticed that microwaves could also melt chocolate. The “Radarange” debuted in 1947, and today there's a magnetron in virtually every kitchen.
The next old-but-new technology to pervade our lives may be so-called neural interfaces. Thanks to noninvasive tools that have been around for decades, such as electroencephalography (EEG) and functional magnetic resonance imaging (fMRI), physicians and neuroscientists can measure changes in your brain without drilling a hole in your skull. And now some of the problems that made these tools finicky, expensive and hard to interpret are being ironed out, meaning that neural interfaces are suddenly showing up at Amazon and Target. That presents a challenge, because measuring brain activity isn't like making microwave popcorn: there are enormous privacy and ethical issues at stake.
The story of Toronto-based InteraXon, a brain-machine interface start-up founded in 2007, shows how fast things are changing. Getting reliable brain-wave measurements via EEG used to mean pasting dozens of electrodes to a subject's scalp. But InteraXon built a wearable EEG device with just a few electrodes that rest against the forehead and behind the ears, along with software to classify the brain waves they measure. Low-frequency “alpha” waves indicate a relaxed state; higher-frequency “beta” or “gamma” waves indicate a busy or concentrating mind.
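InteraXon hasn't published the details of its classifier, but the underlying idea, comparing signal power across the standard EEG frequency bands, can be sketched in a few lines of Python. The band boundaries, sampling rate and simple alpha-versus-beta/gamma rule below are illustrative assumptions, not the Muse algorithm.

```python
import numpy as np

# Conventional EEG band boundaries in hertz (approximate; not InteraXon's exact values)
BANDS = {"alpha": (8, 12), "beta": (13, 30), "gamma": (30, 50)}

def band_power(signal, fs, lo, hi):
    """Average power between lo and hi Hz, from a plain FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return power[mask].mean()

def label_state(signal, fs=256):
    """Toy rule: 'relaxed' if alpha power outweighs beta plus gamma, else 'busy'."""
    alpha = band_power(signal, fs, *BANDS["alpha"])
    busy = band_power(signal, fs, *BANDS["beta"]) + band_power(signal, fs, *BANDS["gamma"])
    return "relaxed" if alpha > busy else "busy"

# One second of synthetic "EEG" dominated by a 10-Hz alpha rhythm
fs = 256
t = np.arange(fs) / fs
fake_eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.randn(fs)
print(label_state(fake_eeg, fs))  # expected: relaxed
```

Real headsets have to do considerably more than this, filtering out eye blinks and muscle noise and calibrating to each wearer, but some comparison of band power is typically at the heart of the measurement.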
The company's first applications were on the whimsical side. Visitors to the Ontario pavilion at the 2010 Winter Olympics in Vancouver could don a headband and use their thoughts to control the lights shining on Niagara Falls and other distant Ontario landmarks. Later the company built thought-controlled slot cars and Star Wars games. “After all this thought controlling, we hit upon this very important recognition,” InteraXon co-founder Ariel Garten told me. “Although you could control technology with your brain, the way that you did it was not very effective. Frankly, you could just turn the thing with your hand much more readily.”
But in 2014 the company released its Muse headband, now in its second iteration: it pairs with a smartphone app to help users practice mindfulness meditation. When the software detects brain waves indicating a wandering mind, wearers hear feedback in the form of crashing waves or thunder. These sounds cue them to return their attention to their breath. “It's like doing a rep at the gym,” Garten says. “That's you saying, ‘Okay, I have this muscle called my attention, and I'm going to strengthen it.’”
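The feedback loop Garten describes is straightforward to picture in code: sample a short window of EEG, score how settled the mind looks, and map that score to the loudness of the weather sounds. The Python sketch below is purely illustrative; read_window, calmness_score and set_weather_volume are hypothetical placeholders, not functions from any Muse SDK.

```python
import time

def neurofeedback_loop(read_window, calmness_score, set_weather_volume, seconds=180):
    """Closed-loop sketch: stormier audio when attention wanders, quieter when it settles.

    read_window()         -> the latest few seconds of EEG samples (hypothetical)
    calmness_score(w)     -> 0.0 (wandering) to 1.0 (settled), e.g. from band powers
    set_weather_volume(v) -> loudness of the wind/waves/thunder playback (hypothetical)
    """
    end = time.time() + seconds
    while time.time() < end:
        window = read_window()
        calm = calmness_score(window)
        set_weather_volume(1.0 - calm)  # louder storm the further the mind has drifted
        time.sleep(1.0)                 # update roughly once per second

# Dummy stand-ins so the sketch runs end to end for a few seconds
neurofeedback_loop(
    read_window=lambda: [0.0] * 256,
    calmness_score=lambda w: 0.5,
    set_weather_volume=lambda v: print(f"weather volume: {v:.2f}"),
    seconds=3,
)
```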
Still, it's one thing to use EEG data to diagnose sleep disorders or epilepsy; it's quite another to start monitoring the brain states of millions of healthy consumers. So Garten also founded the Center for Responsible Brainwave Technologies, which aims to prevent privacy breaches, excessive scientific claims or other missteps that could derail the nascent neural-interfaces industry. “The goal is to create a set of standards to ensure that everybody's data is kept safe at all times and that the technology is used appropriately,” Garten says.
Mary Lou Jepsen is on board with that. She's a Silicon Valley hardware engineer who recently founded Openwater, a start-up building a ski-cap-shaped device that will use skull-penetrating infrared light to measure blood flow—a sign of which brain areas are working hardest. Jepsen conceived the technology as a low-cost substitute for fMRI for diagnosing brain injuries or neurodegenerative diseases. But one day, she says, it might also be used to read thoughts.
That could be a boon for people with disabilities, but it is also a privacy nightmare in the making. “I think the mind-reading scenarios are farther out, but the reason I'm talking about them early is because they do have profound ethical and legal implications,” Jepsen says. “The only way we're going to release something is if we have ways to define what it means to be responsible.”
As with so many other technologies, neural interfaces seem destined to reach consumers before they're fully cooked. For now they'll be best served with a healthy side of caution.