Image: Mari Helin on Unsplash

Tech Has an Obligation to the Truth

Shannon Coulter
8 min read · Nov 27, 2019


Last summer, I found myself on the phone with a man whose son had been murdered in the 2012 Sandy Hook Elementary School shooting.

Six-year-old Noah Pozner was the youngest victim at Sandy Hook that day. In one memorial, he’s described as an energetic and “endlessly inquisitive” little boy. In the conversation I had with his father, Leonard Pozner explained to me that since Noah’s death, he’d spent a good deal of time finding Sandy Hook-related hoax sites and trying to get them taken down.

These websites, mostly blogs, are created by people who fabricate content aimed at persuading readers the shooting never happened. Often, they feature manipulated images of Sandy Hook victims to this end.

Mister Pozner shared one such blog with me. One of the posts featured a photograph of a city street at night. The author claimed it was in Karachi, Pakistan. A photograph of Noah had been altered (not very convincingly) to appear as if it had been mounted to a wall lining the lane. Images of other Sandy Hook victims appeared alongside it.


These children, the blog’s author asserted, were once again being portrayed as the victims of a brand-new incident of mass violence, this time in Pakistan. This, he declared, was proof that the 27 people killed at Sandy Hook Elementary School were, in fact, paid crisis actors, all alive and well.

Mister Pozner and I had been put in touch by a mutual friend. Because I communicate with tech companies regularly and there’s a strong dimension of corporate social responsibility to my work, the friend thought I might be able to help him resolve a snag he’d run into. Several times, Mister Pozner had asked WordPress, the company that hosted this particular Sandy Hook hoaxer blog, to remove the manipulated image of his slain six-year-old son. He received only automated replies. The image remained.

To boot, the boilerplate responses Mister Pozner received from WordPress’ parent company, Automattic, said it could take legal action against him if it turned out he wasn’t the copyright holder of the images he was flagging. The company’s exact words were, “you may be liable for damages if you knowingly materially misrepresent your copyrights — and we may seek to collect those damages.”

Think about that for a moment. A major tech company, as its default behavior, was telling the father of a mass shooting victim it could take legal action against him simply for asking the company to act with a minimum of decency.


After hearing his story, I emailed several executives at WordPress and Automattic, including Automattic CEO Matt Mullenweg. I asked if the company planned to address the issue of WordPress-hosted websites maliciously using images of minors. I received no response.

On our call, I’d asked Mister Pozner if he was willing to share with me one of the auto-responder emails he’d received from the company. He was. I used it to tip off a reporter at The New York Times. She got back to me quickly and ended up speaking to Mister Pozner. A story was published shortly thereafter. A few days later, Automattic updated its policies to disallow malicious use of images of minors. The company booted several Sandy Hook conspiracy sites from its platform, including the one Mister Pozner initially flagged.

Now and then, I still marvel that as late as last summer, a 14-year-old tech company with a $3 billion valuation and one of the Internet’s most popular hosting platforms hadn’t yet bothered to create a policy preventing malicious use of images of minors. I think about how, if Leonard Pozner hadn’t taken action, it might still not have such a policy.

Mister Pozner’s experiences speak volumes about the state of ethics in the U.S. tech industry: there aren’t any.


As a cohort, social media CEOs tend to be vocal proponents of free speech. In October, Mark Zuckerberg gave an entire speech at Georgetown University about the importance of free expression. He said, “I believe in giving people a voice because, at the end of the day, I believe in people.”

I suspect the broad, vocal support for free speech we’ve seen to date from tech CEOs stems less from their love of people and more from a desire to keep user bases and profit margins as large as possible. After all, tackling disinformation in a serious, structural way would cost them money.

I also think the tech industry is hanging on for dear life to an image of itself as inherently good. When tech CEOs portray themselves as the benevolent champions of free speech or the creators of neutral tools, they’re clinging to a bygone age of tech utopianism.

It’s time for that to stop. In an era of growing white supremacy, gun violence, organized trolling, monetization of deceptive content, and state-sponsored election hacking, the tech industry’s zeal for free expression isn’t just naive. It’s immoral.


It’s telling, for instance, that despite all we now know about the state-sponsored interference in the 2016 U.S. presidential election, neither Google nor Facebook will make a full-throated commitment to refrain from disseminating false information through political ads in 2020. Twitter, by contrast, decided to ban political ads altogether, citing the “significant risks” that come with the power of Internet advertising.

Jack Dorsey deserves praise for that decision, but we shouldn’t forget how new Twitter’s concern for the truth is. Historically, the company has dragged its feet so much on this issue that even after Apple and Facebook showed Sandy Hook conspiracist Alex Jones the door, Dorsey declared he would not do the same. It took immense public pressure and over 80,000 people blocking Twitter’s major advertisers to get Alex Jones banned from the platform.

When you enforce an ethical code only in response to immense public pressure, it means you don’t really have one. Who knows? Maybe Dorsey’s ban on political ads and his evolving views on free speech represent a new direction. When I spoke to Leonard Pozner last year, I was glad to hear him say Google and Facebook had been “moving in the right direction” on removing Sandy Hook conspiracy content.

I still see these developments, though, as just one small step in a larger journey upon which most tech CEOs have yet even to embark. That journey is defined by a proactive, rather than reactive, approach to dealing with unethical users and content.


Tech CEOs like to talk about how they couldn’t possibly take on the responsibility of being arbiters of truth. How would they know? Most of them haven’t yet begun to try.

This year Facebook hired an ex-CIA officer to be its global elections integrity operations lead, got some nice headlines out of it, then unceremoniously changed her title to “manager” on her second day of work. That isn’t a commitment to integrity. That’s a cynical public relations stunt.

I think most people just want some basic safeguards in place. They want mechanisms to prevent bad actors from building big platforms. They want those with big platforms to be held to a higher standard of truth-telling and conduct. In cases where someone with a big audience is chronically breaking the rules, I think most people want companies to be proactive in getting that person or organization off the platform.

Don’t tell us what brilliant innovators you are for a quarter century and then declare basic ethics to be beyond the tech industry’s reach. No stock price is more important than the survival of democracy. Spend some money. Experiment. Tell us how it’s going. Care beyond the optics. Mean it.


Regulation could solve some of these problems, but I think we also need a big cultural change in the tech industry itself. It’s incredibly worrisome that Zuckerberg is out there talking about his passion for free expression at this late date. He should be acknowledging the ways in which unchecked free expression has damaged democracy and the lives of people like Mister Pozner.

He should be talking about how true freedom of expression is impossible when paid propaganda drowns out a plurality of voices and intimidates many into silence. He should be talking about how he, as the leader of the world’s largest media platform, has an obligation to the truth.

We need tech leaders who feel a passionate commitment to the truth. Who understand that a basic commitment to truth doesn’t require choosing political sides. Who don’t cling to past incarnations of themselves as makers of neutral tools or champions of free speech, but see themselves for what they now are: the primary source of news for most people these days. We need tech leaders who understand what news is, how it gets made, and why it’s important.


Can you imagine ABC News accidentally broadcasting a mass shooting live for 29 minutes? Can you imagine it airing a story claiming the Sandy Hook shooting never happened?

It isn’t the algorithm that needs adjusting this time. It’s the moral compass of human beings so insulated by privilege from the negative effects of their indifference that they can barely tell right from wrong.

Picture a world in which, fifteen years ago, tech leaders had made a formal commitment to the truth and to community standards. Imagine tech companies competing on how committed to the truth they could be.

We might be looking at a very different world. One without an epidemic of child pornography. One without rising rates of preventable disease. One in which white nationalist groups hadn’t just been declared a terrorist threat. One in which Leonard Pozner wouldn’t have had to move a half dozen times in as many years due to the posting of his personal information online.

Last year, Google removed its “don’t be evil” clause from its code of conduct. When that motto was first put into use, I remember thinking it seemed like such a fresh, iconoclastic way for a company to think. I didn’t notice how it was an ethos primarily grounded in inaction, in what you weren’t going to do. Now that the tech industry has grown so powerful, what it needs is an ethical code grounded in what it will do. Maybe the only way for the truly powerful to avoid being evil is to fight it.


Shannon Coulter

President of Grab Your Wallet Alliance, helping people flex their economic power for a more equitable, democratic society.