Cory Doctorow has a new essay in Technology Review entitled “The Curious Case of Internet Privacy”. He begins by outlining the idea of “the trade,” an idea he rightly suggests has risen to the level of myth.
“The trade” is simply that you are permitted to use a system like Facebook for free, and in return you give them permission to sell information about what you say and do on the service. This trade has been criticized on a number of grounds. The user often does not understand what she is giving up, either because it isn’t clear what damage that loss of privacy might bring in the future, or because the deal is cleverly concealed in 30 pages of legalese that constitute the End-User License Agreement. Others suggest that privacy itself is a human right, and no more subject to barter than your liver.
But Doctorow doubles down on the myth of the trade, suggesting merely that it is a bad deal, a deal with the devil. You are trading your immortal privacy for present-day reward. I don’t disagree with the details of his argument, but in this case I don’t know that the devil really is in the details. Maybe it’s not a deal with the devil, but a deal with a tengu.
A tengu, for those who are not familiar, is a long-nosed beastie from Japanese mythology, often tied to esoteric Buddhism and specifically to the yamabushi. (Those of you who have visited me in the office have probably seen one or two tengu masks, left over from when I lived near the Daiyuzan Saijyouji temple.) The deal with the tengu is told a bit differently from one telling to the next: in one version, the human claims to be terrified of gold or mochi, and the tengu produces these in abundance to scare him off; in another, a tengu gets nailed with a splinter while a woodcutter is doing his work, and complains about the human tendency not to think about the consequences of their actions. In other words, there is a deal, but maybe the end user is making out like a bandit.
Right now, it’s not clear what value Facebook, to take our earlier example, is extracting from this personal data. Clearly it is part of some grail of behavioral marketing. Yes, they present ads based on browsing behavior now, and yes, I suspect those targeted ads are more effective (they’ve worked on me at least once), but I’m not sure that the marginal price Facebook can command for this data adds up to all that much, except in the aggregate. Indeed, for many users of the service, the bet against the future value of privacy is a perfectly reasonable one to make.
I’ll put off for now an argument that comes dangerously close to “Zuck is right,” and suggests that our idea of “privacy” is pretty unstable, and that we are seeing a technologically mediated change in what “privacy” means not unlike the change we saw at the beginning of the last century. In other words “it’s complicated.”
Doctorow seems to suggest that all we are getting from this deal is a trickle of random emotional rewards in the form of responses from our social network. Is this the same guy who invented Whuffie‽ Those connections are not mere cheap treats, but incredibly valuable connections. They are not provided by Facebook (or Twitter or Google, etc.), but they are brokered by them. Facebook is the eBay of social interaction, and so they take a small slice out of each deal. Can Facebook be disintermediated? Of course! But for now they are the disintermediator, making automatic the kinds of introductions and social maintenance that in earlier times were handled by a person.
In other words, if there is an exchange–and again, I’m not sure this idea of a trade adequately represents the complexity of the relationship–it isn’t at all clear that it is zero-sum, or that the user loses as much as she gains.
This does not at all obviate some of the solutions Doctorow suggests. Strategically lying to systems is, I think, an excellent way of mediating the ability of systems to tie together personal data in ways you would prefer did not happen. But I suspect that people will continue to cede personal data not just because the EULA is obscure, or because they poorly estimate the future cost of sharing, but because they find it to be a good deal. Providing them the tools to be able to make these decisions well is good practice, because arming citizens with both information and easy ways of making choices is essentially a Good Thing™. But I would be surprised if it led to less sharing. I expect just the opposite.