The internet is designed for corporations, not people
- Written by Gordon Hull, Associate Professor of Philosophy, Director of Center for Professional and Applied Ethics, University of North Carolina – Charlotte
Urban spaces are often designed[1] to be subtly hostile to certain uses. Think about, for example, the seat partitions on bus terminal benches that make it harder for the homeless to sleep there or the decorative leaves on railings in front of office buildings and on university campuses that serve to make skateboarding dangerous.
Scholars call this “hostile urban architecture.”[2]
When news broke a few weeks ago that Facebook had shared millions of users’ private information[3] with Cambridge Analytica, which then used it for political purposes, I saw the parallels.
As a scholar[4] of the social and political implications of technology, I would argue the internet is designed to be hostile to the people who use it. I call it a “hostile information architecture.”
The depth of the privacy problem
Let’s start with Facebook and privacy. Sites like Facebook supposedly protect user privacy[5] with a practice called “notice and consent.” This practice is the business model of the internet. Sites fund their “free” services by collecting information[6] about users and selling that information[7] to others.
Of course, these sites present privacy policies to users to notify them how their information will be used. They ask users to “click here to accept” them. The problem is that these policies are nearly impossible to understand[8]. As a result, no one knows what they have consented to.
But the problem runs deeper than that. Legal scholar Katherine Strandburg[10] has pointed out[11] that the entire metaphor of a market where consumers trade privacy for services is deeply flawed. It is advertisers, not users, who are Facebook’s real customers. Users have no idea what they are “paying” and have no possible way of knowing the value of their information. Users are also unable to protect themselves, as opting out of sites like Facebook and Google isn’t viable for most.
As I have argued in an academic journal[12], the main thing notice and consent does is subtly communicate to users the idea that their privacy is a commodity that they trade for services. It certainly does not protect their privacy. It also hurts innocent people.
It’s not just that most of those whose data made it to Cambridge Analytica did not consent to that transfer; Facebook also holds vast troves of data even on people who refuse to use[13] its services.
Not unrelated, news broke recently that thousands of Google Play apps – probably illegally – track children[14]. We can expect stories like this to surface again and again. The truth is there is too much money in personal information.
Facebook’s hostile information architecture
Facebook’s privacy problem is both a symptom of its hostile information architecture and an excellent example of it.
Several years ago, two of my colleagues, Celine Latulipe[15] and Heather Lipford[16], and I published an article[17] in which we argued that many of Facebook’s privacy issues were problems of design.
Our argument was that these design elements violated ordinary people’s expectations of how information about them would travel. For example, Facebook allowed apps to collect information on users’ friends (this is why the Cambridge Analytica problem impacted so many people). But no one who signed up for, say, tennis lessons would think that the tennis club should have access to personal information about their friends.
The details have changed since then, but they aren’t better. Facebook still makes it very hard for you to control how much data it gets about you. Everything about the Facebook experience is very carefully curated. Users who don’t like it have little choice, as the site has a virtual monopoly on social networking.
The internet’s hostile architecture
Lawrence Lessig[18], one of the leading legal scholars of the internet, wrote a pioneering book[19] that discussed the similarities between architecture in physical space and things like interfaces online. Both can regulate what you do in a place, as anyone who has tried to access content behind a “paywall” immediately understands.
In the present context, the idea that the internet is at least somewhat of a public space where one can meet friends, listen to music, go shopping, and get news is a complete myth.
Unless you make money by trafficking in user data, internet architecture is hostile from top to bottom. That the business model of companies like Facebook is based on targeted advertising is only part of the story. Here are some other examples of how the internet is designed by and for companies, not the public.
Consider first that the internet in the U.S. isn’t actually, in any legal sense, a public space. The hardware is all owned by telecom companies, and they have successfully lobbied[20] 20 state legislatures to ban efforts by cities to build out public broadband.
The Federal Communications Commission has recently moved to undo Obama-era net neutrality[21] rules. The rollback, which treats the internet as a vehicle for delivering paid content[22], would allow ISPs like the telecom companies to deliver their own content, or paid content, faster than (or instead of) everyone else’s. So advertising could come faster, and your blog about free speech could take a very long time to load.
Copyright law gives sites like YouTube very strong legal incentives to unilaterally and automatically, without user consent, take down[23] material that someone says is infringing, and very few incentives to restore it, even if it is legitimate. These takedown provisions apply even to content that would be protected free speech in other contexts; both the Barack Obama and John McCain presidential campaigns had material removed from their YouTube channels in the weeks before the 2008 election.
Federal requirements that public libraries receiving federal funding install content-filtering software regulate[24] the only internet access available to many poor people. These privately produced programs are designed to block access to pornography, but they tend to sweep up other material as well, particularly material about LGBTQ+ issues. Worse, the companies that make these programs are under no obligation to disclose how or what their software blocks.
In short, the internet has enough seat dividers and decorative leaves to be a hostile architecture. This time, though, it’s a hostile information architecture.
A broader conversation
So let’s have a conversation about Facebook. But let’s make that part of a bigger conversation about information architecture, and about how much of it should be ceded to corporate interests.
As the celebrated urban theorist and activist Jane Jacobs[25] famously wrote[26], the best public spaces involve lots of side streets and unplanned interactions. Our current information architecture, like our heavily surveilled urban architecture, is going in the opposite direction.
References
- ^ designed (www.theatlantic.com)
- ^ “hostile urban architecture.” (www.theatlantic.com)
- ^ Facebook shared millions of users’ private information (www.nytimes.com)
- ^ scholar (scholar.google.com)
- ^ supposedly protect user privacy (theconversation.com)
- ^ collecting information (www.cnn.com)
- ^ selling that information (www.nytimes.com)
- ^ nearly impossible to understand (theconversation.com)
- ^ David M G/Shutterstock.com (www.shutterstock.com)
- ^ Katherine Strandburg (its.law.nyu.edu)
- ^ pointed out (chicagounbound.uchicago.edu)
- ^ argued in an academic journal (ssrn.com)
- ^ refuse to use (www.aclu.org)
- ^ track children (blogs.edweek.org)
- ^ Celine Latulipe (www.celinelatulipe.com)
- ^ Heather Lipford (webpages.uncc.edu)
- ^ an article (ssrn.com)
- ^ Lawrence Lessig (www.lessig.org)
- ^ wrote a pioneering book (codev2.cc)
- ^ successfully lobbied (arstechnica.com)
- ^ net neutrality (theconversation.com)
- ^ vehicle for delivering paid content (ssrn.com)
- ^ unilaterally and automatically, without user consent, take down (ssrn.com)
- ^ regulate (ssrn.com)
- ^ Jane Jacobs (www.pps.org)
- ^ famously wrote (books.google.com)
Read more http://theconversation.com/the-internet-is-designed-for-corporations-not-people-95030