Brain technology is moving forward, but are ethical standards up to speed?

When you’re experimenting with human brains, or even monkey brains, you’re experimenting with the organ that keeps a creature alive. It’s our brains, with all their synapses and neurons, that make us who we are.

So how do you experiment with brain technology ethically?

Last week, Elon Musk gave a presentation showcasing the work of his company, Neuralink.

Neuralink is a neurotechnology company. In other words, it builds implantable brain-computer interfaces, or BCIs.

In the presentation, Musk showed a monkey moving a cursor back and forth to hit a ball bouncing across the screen.

There's concern about how those monkeys are being treated in these medical trials. Reuters reported Tuesday that Neuralink is under investigation for potential federal animal-welfare violations.

Musk claims that the implant could allow people with paralysis to use a computer outside of a lab setting. Other Neuralink employees also talked about using implants to restore vision to the blind and assist patients with spinal cord injuries.

Neuralink is also developing a brain implant for healthy people, to allow them to communicate wirelessly.

During the presentation last week, Musk said the company is only six months from being ready for human trials.

But one thing Musk didn’t mention is the ethical standards those human trials will be held to.

"Without ethics guidelines, the ethics is going to be hit and miss and we're going to be reacting to problems rather than preventing them," said Dr. Nancy Jecker, a professor at the University of Washington School of Medicine, Department of Bioethics and Humanities. "And in the process, we might end up irreversibly damaging people in ways we could have avoided."

"The technology is sort of an iteration on things that people have been doing for a little while," noted Dr. Andrew Ko, a neurosurgeon and professor at the University of Washington School of Medicine. He’s also the director of epilepsy surgery and functional and restorative neurosurgery there.

It's not the technology itself that raises concerns, Ko said, but the long-term implications of this technology.

The potential non-medical uses for BCIs vary widely. As Musk mentioned in his presentation, they could be used for wireless communication, or for entertainment, such as virtual reality games.

"Other examples include measuring alertness and safety critical tasks like air traffic control," Jecker said. "And the military has been a leading investor in brain-computer interface research since the early 1970s. So, looking at ways of enhancing soldiers' warfighting capabilities."

Neuralink is a for-profit company, and the military is another major driver of the field. Neither is held to the ethical guidelines that bind a medical institution. Companies that aren't federally funded, such as Neuralink, also aren't under the same ethical constraints as federally funded research.

"They're not required, for example, to have an institutional review board, look at the ethical issues of the protocol for the research or raise other questions," Jecker said. "And, you know, if you have an invasive technology, you do need FDA approval. But they're talking also about non-invasive forms of brain-computer interface, and there you don't necessarily need FDA approval. So where are we going to set some boundaries?"

There are a few possible approaches to setting ethical guidelines, Jecker said. One is the utilitarian approach, which says to maximize well-being for everyone affected by an action.

"So, for example, enhancing soldiers might create the greatest good by improving the nation's warfighting abilities, keeping soldiers safer, maintaining military readiness," Jecker said.

Another approach is rights-based: people have certain rights when it comes to their own brains, and those rights must be protected even when doing so doesn't create the greatest good.

Dr. Jecker and Dr. Ko suggest a third approach that supplements the rights-based one; they call it the capability view.

"Respecting human dignity requires safeguarding a much broader range of capabilities than neuro-rights. We need to think about safeguarding things like our human ability to be emotionally and physically healthy, to have bodily integrity, or use our body to carry out our desires and goals," Jecker said. "BCI could interfere with those broad range of human capabilities. And we think that they should be protected, at least a certain threshold level, as part of respecting human dignity."

Jecker isn't sure whether ethical guidelines will be standard practice by the time this technology is broadly available. But she's hopeful.

"I think we're starting these conversations, and we need to have more of them," she explained. "And we need to engage the wider society in these conversations, because these will affect all of us."
