I’m going to be presenting at The Network: Towards Unity for Health conference in Fortaleza, Brazil later this year, so my reading has largely been focused on what I’m thinking of talking about. I haven’t formalised the structure of the presentation yet but will probably publish it here as I figure out what I want to do.
What is public? (Anil Dash)
Public is not simply defined. Public is not just what can be viewed by others, but a fragile set of social conventions about what behaviors are acceptable and appropriate. There are people determined to profit from expanding and redefining what’s public, working to treat nearly everything we say or do as a public work they can exploit. They may succeed before we even put up a fight.
What if the public speech on Facebook and Twitter is more akin to a conversation happening between two people at a restaurant? Or two people speaking quietly at home, albeit near a window that happens to be open to the street? And if more than a billion people are active on various social networking applications each week, are we saying that there are now a billion public figures? When did we agree to let media redefine everyone who uses social networks as fair game, with no recourse and no framework for consent?
The business models of some of the most powerful forces in society are increasingly dependent on our complicity in making our conversations, our creations, and our communities public whenever they can exploit them. Given that reality, understanding exactly what “public” means is the only way to protect the public’s interest.
What is privacy? (danah boyd): Think of this piece as an extension of the piece above, where boyd unpacks the notion of privacy in the context of “public” that Anil Dash wrote about.
The very practice of privacy is all about control in a world in which we fully know that we never have control. Our friends might betray us, our spaces might be surveilled, our expectations might be shattered. But this is why achieving privacy is desirable. People want to be *in* public, but that doesn’t necessarily mean that they want to *be* public. There’s a huge difference between the two. As a result of the destabilization of social spaces, what’s shocking is how frequently teens have shifted from trying to restrict access to content to trying to restrict access to meaning. They get, at a gut level, that they can’t have control over who sees what’s said, but they hope to instead have control over how that information is interpreted. And thus, we see our collective imagination of what’s private colliding smack into the notion of public. They are less of a continuum and more of an entwined hairball, reshaping and influencing each other in significant ways.
When powerful actors, be they companies or governmental agencies, use the excuse of something being “public” to defend their right to look, they systematically assert control over people in a way that fundamentally disenfranchises them. This is the very essence of power and the core of why concepts like “surveillance” matter. Surveillance isn’t simply the all-being all-looking eye. It’s a mechanism by which systems of power assert their power. And it is why people grow angry and distrustful. Why they throw fits over being experimented on. Why they cry privacy foul even when the content being discussed is, for all intents and purposes, public.
Are Google making money from your exercise data?: Exercise activity as digital labour? (Chris Till)
In this article I made a suggestion of what I believe to be a previously untheorised consequence of the large scale tracking of exercise activity by self-tracking devices such as Fitbit and Nike+ and related apps on smart phones.
My suggestion was that this kind of tracking is potentially transforming exercise activity into labour. By synthesising existing analyses of self-tracking and quantified-self activities with theories of digital labour, I proposed that converting the physical movement of bodies during exercise into standardised measures, which can be analysed, compared and accumulated on a large scale, makes that activity amenable to the extraction of value.
Another study, conducted by web analytics and privacy group Evidon and commissioned by the Financial Times, found that the twenty most popular health and fitness apps shared data with nearly seventy companies, some of which were advertising firms (see graphic below). Although the headline rhetoric often presents a concern for the privacy of users, an analysis of the privacy policies of many of the most popular health and fitness tracking apps and devices found that most allow “non-personally identifiable information” to be shared, and many were ambiguous on whether they permit sharing of user data.
Wearer be warned: Your fitness data may be sold or used against you (Deborah Lupton)
When self-tracking was an activity limited to jotting notes down in a paper journal or diary, this information could easily be kept private. No-one else could know the finer details of one’s sleeping or bowel habits, sex life, diet, heart rate, body weight or efforts to give up smoking.
However, when people use digital devices that connect to cloud computing storage facilities or developers’ data archives, the user no longer owns or controls their own data. This personal and often very private information becomes part of vast digital data collections that are increasingly used by actors and agents in many different social domains.
Personal health and medical data is now used for much more than just gathering information on oneself for one’s own private reasons. This information is a commodity that can be used for commercial, managerial and governmental purposes and on-sold to third parties.
Every little byte counts (Evgeny Morozov)
When Big Data allows us to automate decision-making, or at least contextualize every decision with a trove of data about its likely consequences, we need to grapple with the question of just how much we want to leave to chance and to those simple, low-tech, unautomated options of democratic contestation and deliberation.
As we gain the capacity to predict and even pre-empt crises, we risk eliminating the very kinds of experimental behaviors that have been conducive to social innovation. Occasionally, someone needs to break the law, engage in an act of civil disobedience or simply refuse to do something the rest of us find useful. The temptation of Big Data lies precisely in allowing us to identify and make such loopholes unavailable to deviants, who might actually be dissidents in disguise.