
The moral dilemma of Microsoft's HoloLens contract with the military

caption: Col. William Bentley, Office of Naval Research deputy for the expeditionary maneuver warfare and combating terrorism department, joins the Chief of Naval Research, Rear Adm. David Hahn, for a demonstration of the "Tactical Decision Kit" that includes a Microsoft HoloLens headset.

A group of Microsoft employees signed an open letter in February asking the company to cancel a contract with the U.S. Army.

The technology in question is a controversial augmented reality headset called HoloLens.

KUOW's Bill Radke spoke with Monte Michaelis, a former creative director for HoloLens, and Wendell Wallach, a scholar at the Yale Interdisciplinary Center for Bioethics. They discussed how the HoloLens could be used in war and the moral dilemma for the company and its employees.


Interview highlights have been lightly edited for clarity

What is HoloLens and why would the U.S. Army want to use it?

Monte Michaelis: The best way to understand the HoloLens is to picture a headset you wear that allows you to see 3D objects, like what you see on your computer, except those objects exist within your physical space.

caption: A Microsoft HoloLens headset.

The problems the HoloLens solves include being able to be co-present with people who are far away from you, or being able to visualize and understand a physical location that is otherwise difficult to visit.

I've never worked on any military applications, but I can see how military operations would be made more efficient if they could visualize a place that's difficult to visit or that they don't understand very well.

What's an example of war becoming more efficient, or more upsetting, for you personally?

Michaelis: I can imagine using this platform to visualize troop placements or the motion across terrain of enemy combatants in a way that wouldn’t force me to physically be in that space while also keeping my hands free to do other things.

Picture a scenario where an individual soldier is wearing this device: their hands are occupied by lots of different tools and weaponry, and they'd be able to get real-time updates on the battlefield using computer technology.

I think the issue in the minds of most designers is that they are attracted to this platform because of its possibilities for connection, knowledge sharing, and understanding. The overt language of increasing lethality on the battlefield is not what most designers got into technology for.

"I don't know that the platform itself is a magic wand for warfare, but for the highest level of problem solving, it would be an effective tool for visualizing things that are hard to get to."
Monte Michaelis

Why do you think Microsoft's contract with the U.S. military raises ethical concerns for people who work on HoloLens?

Wendell Wallach: The employees raised their concern that this was turning warfare into a game.

That was crossing a line for some — moving toward the dehumanization of humans in warfare. That may not be a line in everyone's mind, but perhaps their protest is at least underscoring that there's more at stake here than people may realize.

For example, if the military knows where a combatant is located, they may be able to direct a soldier to just fire at that spot even though the soldier cannot see the person they are firing at.

You can even imagine that the soldier they're firing at could be represented differently. I know this is more science fiction than anything else, but it illustrates this concern around dehumanization: the person they're firing at could be a cartoon figure or a monster, or anything to take the sting out of the fact that you're actually killing a human being.

What do you think Microsoft's responsibility is when contracting with the military?

Wallach: Satya Nadella and Brad Smith did speak to this in a meeting with employees and also in a blog post back in October, and they made the point that they, as a company, need to support the defense regime of the United States.

Also if they withdrew from it, they would be undermining their ability to even participate in the discussions over the appropriate and ethical use of these new tools.

This is not an easy issue, but it's one that every tech company is now confronting, and every one of the major tech companies has had protests around various technologies that they have developed. Some tech employees don't want to support building tools that could be used immorally or as weaponry.

Do you feel Microsoft needs to support the defense regime of the United States?

Michaelis: Every experience designer brings their personal history into answering that question.

My dad was lifelong military, and I believe in the importance of providing security, and that there are morally justifiable causes for conflict. But there's optimism inherent in technology. If Microsoft removed itself totally from this contract, it wouldn't be able to influence even the morally justifiable, ethical sides of the argument.

Wallach: If Microsoft loses this contract, then somebody else would step in. So it's not that this technology is so advanced that others aren't going to move forward on it, or that people other than Microsoft can't do it.

On that level, this is going to move forward whether it's a good or bad business decision.

But I think more importantly, technology is allowing us to create new tools for warfare and surveillance that potentially take humanity down a road we really don't want to go down. I happen to think the bigger concerns are lethal autonomous weapons and augmented reality, but I think it's important to bring to the public's attention the vast array of new technologies that are dual purpose...

They'll have both entertaining and useful purposes that aid our jobs, but they'll also help provide for our security in ways that could be destructive.

Produced for the web by Brie Ripley
