By Scott M. Fulton, III, Betanews
What does it mean to have a "right to privacy"? We have a right to vote, and too few of us use it. I once heard a human right explained to me this way: it's like a vegetable garden. You have to nurture it, take care of it, and harvest it. Otherwise you have a plot of dirt.
The Internet is not like a vegetable garden. Perhaps that test is appropriate, then, for lawmakers worldwide considering whether the "right to Internet access" follows from the right to free speech -- there are places in the world where this is actively being considered. If a person is denied access to the Internet, the argument goes, her free speech rights are being violated, or at least abridged.
By that same logic, the extent to which one makes use of the Internet must therefore abridge that person's own right to privacy. At least, by that same logic.
"I think judgment matters..."
Quite a bit has been said about Google CEO Eric Schmidt's comment, in a recent interview with CNBC's Maria Bartiromo for a documentary, that if an Internet user truly expects privacy, then he should consider whether there's something about himself that he doesn't want the world to know.
Perhaps just as important as Schmidt's response was Bartiromo's question, which often gets cut out of the excerpt: "People are treating Google like their most trusted friend. Should they be?"
"Well, I think judgment matters," Schmidt responded. "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place. But if you really need that kind of privacy, the reality is that search engines, including Google, do retain this information for some time. And it's important, for example, that we are all subject in the United States to the Patriot Act. It is possible that that information could be made available to the authorities."
Now, if I were a psychologist, or even if I played one on TV, I could write a field manual on this single paragraph. It's amazing what associations can tell you about the way one's mind works; and Google is, after all, in the business of making associations -- literally, creating contexts and selling ad space on them. Schmidt's second and third sentences imply that his company retains information on what people shouldn't be doing. And Schmidt's third and fourth sentences imply that Google is bound by law to open up that treasure trove to law enforcement.
The question was, should people trust you? And the answer was, hey lady, I could turn you over to the cops at any time.
Somewhere within the PR department of Google, a hundred heads simultaneously sank.
The sad truth is, we really don't know the extent to which Google or Bing or Facebook or Twitter retains information about its users for any length of time. But if there truly were some type of conspiracy to construct a data mine that plows into the individual privacy rights of billions, it would require a collective resource called intelligence, the abundance of which is contra-indicated by available evidence. It would require the type of concerted effort, thoughtful engineering, and unadulterated ingenuity with which the Internet itself has never been blessed.
The Internet, for lack of a more poetic description, happened. Most of its technology just fell into place, like the result of a classic game of "52-Pickup." The opportunities were there for every Web connection to be encrypted, for every transaction to be certified, for each individual to specify the extent to which she wants her personal information to be utilized -- or not -- and for servers to comply. The specifications were written, and the proposals were made. Google "HTTP-Next Generation" (HTTP-NG) sometime and see what you find.
So why didn't it get done?
The typical response is, because somebody -- often somebody specific -- failed to care enough to see the project through. When big problems culminate in colossal failures, step one in the process is typically to assign blame. Earlier this week, after the embarrassing revelation that sensitive information in TSA documents had not been properly redacted, the Dept. of Homeland Security immediately responded by saying it is tightening its noose around who's to blame. And that should make us all feel better. But in the back of our minds, we know that noose needs to be made wider, not narrower.
The most prominent failure of policy, whether it be in a government or a computer network, is its failure to exist. When there are no rules to be followed, what ends up happening is typically some well-meaning person's best attempt at a solution. That's probably the case with respect to the TSA worker who could not delete the text in the PDF document because his old version of Acrobat didn't let him, so he just blacked it out with a rectangle.
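A rough sketch of why the rectangle trick fails: in a PDF, text and graphics are separate drawing operations in the same content stream, so painting a black box over text merely hides it on screen while the characters remain intact in the file. The snippet below fakes a tiny, uncompressed content stream (the "sensitive" number is invented for illustration) and shows the text surviving a trivial byte search:

```python
# Sketch of a PDF content stream: text is drawn first, then a filled
# black rectangle is painted on top of it. The "sensitive" text here
# is made up for illustration.
content = b"""
BT /F1 12 Tf 72 700 Td (SSN: 123-45-6789) Tj ET
0 0 0 rg 70 690 140 20 re f
"""
# A PDF viewer renders only a black bar, but the "redacted" text is
# still sitting in the file, recoverable by searching the raw bytes:
found = b"123-45-6789" in content
print(found)  # the text survives the "redaction"
```

Proper redaction tools remove the text operator from the stream itself; drawing over it changes only what the viewer paints last.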
It's probably a very similar case with respect to the architecture of the Internet as a whole. The multitude of privacy violations that take place every day to thousands of individuals whose names aren't "Transportation Security Administration" are not the fault of one person upstream someplace who failed to click the right button. We -- as in the oversized first word in the US Constitution, "We" -- have failed to work this problem out. Instead, we're trusting someone else to do the job for us.
"People are treating Google like their most trusted friend," asked Maria Bartiromo so poignantly. "Should they be?"
"Get over it."
The Internet was not originally designed to be a communications system. Back in the late 1960s, DARPA's engineers were not looking to replace the telephone with something digital. What they designed was a network of dynamic, switchable routes that connect terminals in one place to databases in another. The Internet was, and is, a networked database.
Like any database, what it gives us is a function of what we give it. Data in, data out. And yes, I'm sounding like Scott McNealy, when he told a JavaOne conference in 1999, "You have no privacy. Get over it." But if McNealy was speaking on behalf of his database -- which, indeed, he was -- then it seems sensible to say that a database isn't going to give you something that you don't give it. Something intangible such as, say, privacy.
To make the database give someone else less than what you give it, there need to be policies. There needs to be a system in place that enables individuals to specify, first of all, the information that belongs to them. That system needs to enable individuals to specify who can use that information, and who cannot.
But that system will not be given to us. It will not be bestowed upon us, like a human right.
"Give us our privacy!"
Two years ago, after Facebook established a service called Beacon that shared users' behavioral data with partners, a multitude of Facebook users started an uproar, in their own way. One of the countless groups that sprang up in Beacon's wake was called, "Petition to tell Facebook, give us our privacy!" Its founder wrote, "The second goal of this group is to tell Facebook that we do not want them watching what we do online. What are we all, terrorist suspects; and what is Facebook? The CIA? We deserve to have our privacy. It's a right given to us in the US Constitution."
Actually...no, it's not. The Fourth Amendment grants citizens the right to be protected from unreasonable search and seizure; and the Ninth Amendment states there are other rights too numerous to mention in the Constitution. Judges have interpreted the Fourth and Ninth Amendments as coalescing into something they call the expectation of privacy. That is to say, there are situations where a reasonable citizen should expect to be reasonably private. But as judges interpret it, that expectation rarely arises, or is at least diminished, in public places.
If we think of the Internet as a database, then there's the technical feasibility of policies put in place to restrict information access. But the more we characterize the Internet as a public place -- as a community of 350 million users, like Facebook -- the less likely it will be perceived as a source from which all privacy flows.
Facebook's privacy policy changes were rolled out yesterday. And if users follow Facebook's gentle and friendly instructions as to how to go about implementing those changes -- as the ACLU and the EFF correctly pointed out this week -- those users will merrily flip their privacy controls off. Because "Off" -- or as Facebook calls it, "Everyone" -- is now not only the default setting, but the "Recommended" one.
The ACLU's campaign to compel services including Facebook to improve and observe their own privacy policies, is called Demand Your dotRights - Privacy 2.0. Here's how the ACLU explains the portion of the campaign that deals with social networking: "Social networking sites are an amazing way to connect with friends and family. But the information that these sites collect about you -- not just what's on your profile but also the records of everything you do on the site -- says a lot about your interests, habits, beliefs, and concerns. And outdated privacy laws, written before the Internet even existed, mean that even when you think your profile is 'private,' it isn't private to the government. Don't pay for social networking by giving up control of your personal information. Demand a privacy upgrade. Demand your dotRights."
The ACLU's concern is very reasonable: It's not that Facebook or Google could potentially share your photo or address or phone number with someone on your "Do Not Share" list. It's that aggregators and data miners (the real subject of Facebook's latest policy changes, made more prominent by their absence of reference) are actually encouraged, as part of the business model, to make use of that data collectively, to determine how we live, how we eat, and maybe when it is we use the bathroom.
The most poignantly ironic element of the ACLU's paragraph above is the phrase, "Don't pay for social networking." Because we already don't.
"Privacy is a basic human need"
In response to Schmidt's comment to CNBC, security technologist Bruce Schneier cited an essay he wrote in 2006. In it, he wrote, "Privacy protects us from abuses by those in power, even if we're doing nothing wrong at the time of surveillance. We do nothing wrong when we make love or go to the bathroom. We are not deliberately hiding anything when we seek out private places for reflection or conversation. We keep private journals, sing in the privacy of the shower, and write letters to secret lovers and then burn them. Privacy is a basic human need."
Quite arguable, especially if we phrase it like Schneier -- as a need rather than a right. But in that light, Twitter also tapped into another primal human need: to announce the events of our lives. As Twitter described itself from the very beginning, it's "a global community of friends and strangers answering one simple question: What are you doing?"
And we know that the guy in the Verizon ad who's sitting on his back patio and Twittering, "I'm...sitting...on...the patio..." is by no means unique.
The right to privacy is in contest with the "basic human need," to borrow Schneier's phrase, to broadcast one's events to the world. And as long as the latter wins out, the ability and the inclination will be there for companies to make use of the data we're spilling out in buckets, in aggregate form. For as long as we tell the world what we're watching on TV or eating for lunch or where we're seated, but we refrain from telling when we're in the bathroom or making love, simple deduction will lead some algorithm somewhere to the inevitable conclusions inferred from the information we refrain from providing. It's what legal scholars would call the "fruit of the poisonous tree."
Privacy is not something that Facebook or Twitter or the ACLU will give us. It is something we have to nurture, cultivate, and build for ourselves. We as Internet users seriously need to revisit the idea of globally relevant identification -- digital certificates that identify us to Google and Facebook and Betanews and our mother-in-law's blog. Such tools would cost us a bit of our cherished right to anonymity (something else not guaranteed by the Constitution), but they would enable us to start guaranteeing our own privacy, building policies for databases that state up front what data belongs to us and what data does not, and denying any service we designate the ability to maintain copies of our data.
If we really want privacy, just like every other right, we're the ones who need to build it and to use it. Otherwise -- especially out here on the Internet -- it will die.
The opinions expressed here are those of Scott M. Fulton, III, who is solely responsible for his content.
Copyright Betanews, Inc. 2009