
  • Written by Scott Shackelford, Associate Professor of Business Law and Ethics; Director, Ostrom Workshop Program on Cybersecurity and Internet Governance; Cybersecurity Program Chair, IU-Bloomington, Indiana University

If Boeing is allowed to certify that a crash-prone aircraft is safe[1], and Facebook can violate users’ privacy expectations[2], should companies and industries ever be allowed to police themselves[3]? The debate is heating up[4], particularly in the U.S. tech sector, amid growing calls to regulate – or even break up – the likes of Google, Apple and Amazon[5].

It turns out to be possible, at least sometimes, for companies and industries to govern themselves while still protecting the public interest. Groundbreaking work by Nobel Prize-winning political economist Elinor Ostrom[6] and her husband Vincent found a solution to a classic economic quandary in which people – and businesses – self-interestedly enrich themselves as quickly as possible from certain resources[7], including personal data[8], thinking little about the secondary costs they might be inflicting on others.

Elinor Ostrom in 2009, when she won the Nobel Prize in Economics. Holger Motzkau/Wikimedia Commons, CC BY-SA[9][10]

As the director of the Ostrom Workshop Program on Cybersecurity and Internet Governance[11], I have been involved in numerous projects studying how to solve these sorts of problems when they arise, both online and offline. Most recently, my work[12] has looked at how to manage the massively interconnected world of sensors, computers and smart devices – what I and others[13] call the “internet of everything[14].”

I’ve found that there are ways companies can become leaders[15] by experimenting with business opportunities[16] and collaborating with peers, while still working with regulators to protect the public, both in the air and in cyberspace.

Tragedy revisited

In a classic economic problem, called “the tragedy of the commons[17],” a parcel of grassland is made available for a community to graze its livestock. Everyone tries to get the most benefit from it – and as a result, the land is overgrazed. What started as a resource for everyone becomes of little use to anyone.
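The incentive at the heart of the tragedy is easy to make concrete. The short Python sketch below is purely illustrative – the herd size, regrowth rate and carrying capacity are invented numbers for this example, not figures from the Ostroms’ research – but it shows how individually rational grazing, where each herder’s gain is private while the cost is shared by all, strips the pasture bare within a few seasons.

    # Illustrative sketch only: all parameters are invented,
    # not drawn from the Ostroms' research.
    def simulate(herders=10, years=8, regrowth=0.5, capacity=100.0):
        grass = capacity    # the shared pasture starts at full capacity
        animals = herders   # one animal per herder to begin with
        for year in range(1, years + 1):
            grass -= min(grass, animals)                        # each animal eats 1 unit
            grass += regrowth * grass * (1 - grass / capacity)  # logistic regrowth
            print(f"year {year}: animals grazed={animals:3d}, grass left={grass:5.1f}")
            # Every herder adds an animal for next year: the gain is theirs
            # alone, while the cost is spread across the whole community.
            animals += herders

    simulate()

Run it, and the pasture is bare by year five – even though, under these same numbers, the original herd of 10 could have grazed the land indefinitely.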

For many years, economists thought there were only two possible solutions. One was for the government to step in and limit how many people could graze their animals. The other was to split the land up among private owners who had exclusive use of it, and could sustainably manage it for their individual benefit.

The Ostroms, however, found a third way. In some cases, they revealed, self-organization can work well[18], especially when the various people and groups involved can communicate[19] effectively. They called it “polycentric governance,” because it allows regulation to come from more than just one central authority. Their work can help determine if and when companies can effectively regulate themselves – or whether it’s best for the government to step in.

A polycentric primer

The concept can seem complicated, but in practice it is increasingly popular, in federal programs and even as a goal for governing the internet[20].

Scholars such as Elinor Ostrom produced a broad swath of research over decades, looking at public schools and police department performance[21] in Midwestern U.S. cities, coastal overfishing, forest management in nations like Nepal, and even traffic jams[22] in New York City. They identified commonalities among all these studies[23], including[24] whether the group’s members can help set the rules by which their shared resources are governed, how much control they have over who gets to share it, how disputes are resolved, and how everyone’s use is monitored.

Nobel Prize winner Elinor Ostrom explains her work in a 2010 lecture.

All of these factors can help predict whether individuals or groups will successfully self-regulate, whether the challenge they’re facing is climate change[25], cybersecurity[26] or anything else. Trust is key[27], as Ostrom herself said, and an excellent way to build trust is to let smaller groups make their own decisions[28].

Polycentric governance’s embrace of self-regulation involves relying on human ingenuity[29] and collaboration skills to solve difficult problems – while focusing on practical measures to address specific challenges.

Self-regulation does have its limits, though – as has become clear in the revelations about how the Federal Aviation Administration allowed Boeing[30] to certify the safety[31] of its own software[32]. Facebook has also been heavily criticized for failing to block an anonymous horde[33] of users across the globe[34] from manipulating people’s[35] political views[36].

Polycentric regulation is a departure from the idea of “keep it simple, stupid[37]” – rather, it is a call for engagement by numerous groups to grapple with the complexities of the real world.

Both Facebook and Boeing now need to convince themselves, their employees, investors, policymakers, users and customers that they can be trusted. Ostrom’s ideas suggest they could begin to do this by engaging with peers and industry groups to set rules and ensure they are enforced.

Governing the ‘internet of everything’

Another industry in serious need of better regulation[38] is the smart-device business, with tens of billions of connected devices around the world built with little to no concern[39] for user security or privacy.

Customers often buy the cheapest smart-home camera or digital sensor, without looking at competitors’[40] security and privacy protections. The results are predictable – hackers have hijacked thousands of internet-connected devices and used them to attack the physical network of the internet[41], take control of industrial[42] equipment, and spy on private citizens through their smartphones and baby monitors[43].

Who else might be watching this view, over the internet? Saklakova/Shutterstock.com[44]

Some governments are starting to get involved. The state of California and the European Union are exploring laws that promote “reasonable[45]” security requirements, at least as a baseline. The EU is encouraging companies to band together to establish industry-wide codes of conduct[46].

Getting governance right

Effective self-governance may seem impossible in the “internet of everything,” given the scale and variety of the groups and industries involved, but polycentric governance provides a useful lens through which to view these problems. Ostrom argued that this approach may be the most flexible and adaptable way[47] to manage rapidly changing industries. It may also help avoid a patchwork of conflicting government regulations that risk stifling innovation in the name of protecting consumers, ultimately helping neither cause.

But success is not certain. It requires active engagement by all parties, who must share a sense of responsibility to customers and mutual trust in one another. That’s not easy to build in any community, let alone the dynamic tech industry[48].

Government involvement can help build bridges and solidify trust across the private sector, as happened with cybersecurity efforts from the National Institute of Standards and Technology[49]. Some states, like Ohio[50], are even rewarding firms for using appropriate self-regulation in their cybersecurity decision-making.

Polycentric governance can be flexible, adapting to new technologies more appropriately – and often more quickly – than pure governmental regulation. It can also be more efficient and cost-effective, though it’s not a cure for all regulatory ills. And it’s important to note that regulation can spur innovation as well as protect consumers, especially when the rules are simple[51] and outcome-focused.

Consider the North American Electric Reliability Council, originally created by a group of companies that came together voluntarily to protect against blackouts. NERC standards, however, were eventually made legally enforceable in the aftermath of the Northeast blackout of 2003[52]. They are an example of an organic code of conduct that was voluntarily adopted and subsequently reinforced by government, consistent with Professor Ostrom’s ideas. Ideally, it should not take such a crisis to spur this process forward.

Ultimately, what’s needed – and what Professor Ostrom and her colleagues and successors have called for – is more experimentation and less theorizing. As the 10th anniversary of Ostrom’s Nobel Prize approaches, I believe it is time to put her insights to work, offering industries the opportunity to self-regulate where appropriate, while leaving the door open to government action, including antitrust enforcement, to protect the public and promote cyber peace[53].


References

  1. ^ certify that a crash-prone aircraft is safe (www.washingtonpost.com)
  2. ^ violate users’ privacy expectations (www.nytimes.com)
  3. ^ allowed to police themselves (thehill.com)
  4. ^ heating up (www.reuters.com)
  5. ^ Google, Apple and Amazon (www.marketwatch.com)
  6. ^ Nobel Prize-winning political economist Elinor Ostrom (www.aei.org)
  7. ^ certain resources (doi.org)
  8. ^ personal data (bierdoctor.com)
  9. ^ Holger Motzkau/Wikimedia Commons (commons.wikimedia.org)
  10. ^ CC BY-SA (creativecommons.org)
  11. ^ Ostrom Workshop Program on Cybersecurity and Internet Governance (ostromworkshop.indiana.edu)
  12. ^ work (illinoislawreview.org)
  13. ^ and others (www.cisco.com)
  14. ^ internet of everything (dx.doi.org)
  15. ^ companies can become leaders (doi.org)
  16. ^ experimenting with business opportunities (ssrn.com)
  17. ^ the tragedy of the commons (en.wikipedia.org)
  18. ^ self-organization can work well (www.aei.org)
  19. ^ communicate (www.iucn.org)
  20. ^ governing the internet (www.washingtonpost.com)
  21. ^ public schools and police department performance (books.google.hr)
  22. ^ traffic jams (ir.lawnet.fordham.edu)
  23. ^ commonalities among all these studies (dx.doi.org)
  24. ^ including (www.nobelprize.org)
  25. ^ climate change (ssrn.com)
  26. ^ cybersecurity (digitalcommons.wcl.american.edu)
  27. ^ Trust is key (escotet.org)
  28. ^ smaller groups make their own decisions (dx.doi.org)
  29. ^ human ingenuity (www.ubs.com)
  30. ^ the Federal Aviation Administration allowed Boeing (www.businessinsider.com)
  31. ^ certify the safety (www.businessinsider.com)
  32. ^ of its own software (arstechnica.com)
  33. ^ anonymous horde (www.cbsnews.com)
  34. ^ users across the globe (www.wired.com)
  35. ^ manipulating people (theconversation.com)
  36. ^ political views (www.nytimes.com)
  37. ^ keep it simple, stupid (www.dallasnews.com)
  38. ^ serious need of better regulations (www.forbes.com)
  39. ^ concern (www.csmonitor.com)
  40. ^ without looking at competitors’ (www.schneier.com)
  41. ^ physical network of the internet (www.forbes.com)
  42. ^ industrial (www.bbc.com)
  43. ^ baby monitors (www.marketwatch.com)
  44. ^ Saklakova/Shutterstock.com (www.shutterstock.com)
  45. ^ reasonable (www.natlawreview.com)
  46. ^ industry-wide codes of conduct (iapp.org)
  47. ^ the most flexible and adaptable way (dx.doi.org)
  48. ^ dynamic tech industry (www.digitalistmag.com)
  49. ^ National Institute for Standards and Technology (papers.ssrn.com)
  50. ^ Ohio (www.techrepublic.com)
  51. ^ when the rules are simple (www.mckinsey.com)
  52. ^ Northeast blackout of 2003 (www.eenews.net)
  53. ^ cyber peace (ndias.nd.edu)


Read more http://theconversation.com/companies-self-regulation-doesnt-have-to-be-bad-for-the-public-117565

Metropolitan republishes selected articles from The Conversation USA with permission
