Google last week did something that is really hard to find objectionable: It said it had deleted "tens of thousands" of nude pictures stolen from celebrities. But as with anything involving a company as influential as Google, this move sets a precedent, and it's a dangerous one.
This is classic slippery-slope territory. Before I detail the many reasons this decision could bring about terrible consequences, I should note that Google might have chosen to take this path for a deeply cynical and Machiavellian reason: It creates a much steeper barrier to entry for any startup that is even thinking about challenging Google's search empire. Google is in a dominant position that allows it to dedicate a large staff to the cost-containment task of deleting things, but startups need everyone committed to revenue-generating activities.
Now let's consider just how slippery that slope is -- that is, how much does Google's move endanger our privacy and right to know?
First we need to look at what brought about Google's decision. It all began on Oct. 1, when a Los Angeles attorney named Martin Singer sent Google executives a letter saying that he represented more than a dozen unspecified "female celebrities, actresses, models and athletes" whose nude or semi-nude photos had found their way from their iCloud accounts onto various public Google pages. He demanded that the images be removed, dropping lots of not-nice things about Google in the process, saying for example that it was "making millions and profiting from the victimization of women." (Irony note: If Singer successfully sues and makes a handsome fee, wouldn't he also be profiting from the same victimization?)
Singer earns his fee with some lawyerly twisting of the facts. He notes that other ISPs that he has written to removed the images "within an hour or two," even though "the vast majority of those sites and ISPs/hosts, all of which are much smaller than Google," have "far fewer staff and resources."
Those three quoted snippets are facts, but by stitching them together, Singer gives them a meaning that leaves truth behind. The truth is that those smaller sites have dramatically fewer such requests to sift through. At a company Google's size, which receives far more requests every day, it's unlikely that anyone in a position to act had even seen those messages within an hour or two. Newsflash: Tiny companies can move a lot faster than a Fortune 50 company such as Google (annual revenue last year: $61 billion).
And while Singer would like Google to grant his request immediately, the rest of us are glad that companies take time to review and investigate such complaints. Companies need to perform due diligence before acquiescing to requests to delete things. After all, choose anything at all that's on the Internet and you can surely find someone somewhere who will object to it. Take it all down, and there's nothing left.
This is where I find Google's response frustrating. The search giant said that its turnaround for these requests, in fact, "is generally hours, not weeks. Of course people continue to post these images on the web, so -- like other online services -- we rely on people notifying us to help us take them down, whether by flagging content, or filing DMCA (Digital Millennium Copyright Act) requests. We're removing these photos for community guidelines and policy violations (eg nudity and privacy violation) on YouTube, Blogger and Google+. For search we have historically taken a different approach as we reflect what's online -- but we remove these images when we receive valid copyright (DMCA) notices."
Do you feel the slope sliding away beneath your feet? Google's statement says nothing about investigations. Instead, it touts its quick response to a complaint it received. It creates the impression that letters of complaint -- not meaningful probes -- cause images to vanish.
Oh, and I do expect requests to come rolling in. Some of them will be pretty easy calls. Nude photos whose subjects object? That's easy; take them down. Child porn? Of course it will be rooted out. OK, but what about extreme violence? That sounds like an easy call, too. But if the image is from a police dashboard cam, does the depicted violence also carry implications about civil liberties and police brutality? Can a case be made for removing videos of beheadings by ISIS while still allowing footage of other political violence, like the shooting of President Kennedy?
The important question in all of this is: Do we want lawyers at Google answering these questions for us?
I don't. And it's not just images. Copyright-protected and trademark-protected documents could easily become candidates for suppression. There are the disclosures found in WikiLeaks documents, and there are all the news reports that quote from those documents. Trade secrets might seem like a sure bet for suppression, but what happens when a clear public interest is at stake? Remember when GM took engineering shortcuts that resulted in deaths?
Hate speech seems like something that shouldn't cause trouble. But who gets to determine what constitutes hate speech? Where do you draw the line between hate speech and the articulation of a political philosophy? And if you start censoring political speech, then you are encroaching on the very ground that the First Amendment was meant to protect.
Let all of these things slip by, and soon you're well down that slippery slope. Now you have to consider whether embarrassing social-media posts should be deleted by Google, just because a good lawyer will argue that such details could impact future earnings. The same goes for DWI arrest details and registered sex offender lists. After that it will be negative product reviews and pejorative comments from employees on Glassdoor.com.
To be clear: Google is right to take down legally unacceptable images. But when it does, it needs to make clear that its decision rests on an extensive due diligence effort. Google's customers need to know that their access to valuable information is not being hampered, and the disgruntled of the world need to know that knee-jerk takedown requests won't work.
Evan Schuman has covered IT issues for a lot longer than he'll ever admit. The founding editor of retail technology site StorefrontBacktalk, he's been a columnist for CBSNews.com, RetailWeek and eWeek. Evan can be reached at email@example.com and he can be followed at twitter.com/eschuman. Look for his column every other Tuesday.