If the open Internet is an essential precondition for democracy, should governments or corporations be allowed to restrict it? This is the question at the heart of my book ‘The Closing of the Net’, which explores the backdrop to today’s political controversies around issues such as fake news, terrorist content online, and the misuse of data – controversies that have prompted calls for ‘responsibility’ on the part of online companies. The book argues that any regulation of these companies must enshrine public interest criteria that balance the competing rights at stake.
‘The Closing of the Net’ explores the back-story to today’s debates – the debates that have blown up around, for example, Cambridge Analytica and the allegedly illegitimate use of people’s data to skew both the election of President Trump and the Brexit referendum. In particular, I relate the back-story to the new European law on data protection, the General Data Protection Regulation (GDPR), which businesses are now being asked to implement. I also discuss the political impact of the Snowden revelations on mass surveillance. Both of these events raised red flags about the misuse of personal data collected by online companies.
Why protect online content and people’s rights to communicate online?
We live in an age when ordinary people live out their lives online. Messaging and social media platforms have replaced the phone as the primary means of communication. ‘Hockey mums’ message their children’s match scores to family members in real time. They jostle with their daughters for access to the smartphone. I myself recently saw a family funeral co-ordinated over social media messaging. And let’s not forget how social media has re-united old friends and family in ways that were simply impossible just two decades ago.
Freedom of expression, without interference from the State, is the very foundation of our modern democracy. Social media has become the communications vehicle of choice for charity campaigners and NGOs, whose function in a democracy is to raise opposition voices. Even academic exchanges now take place as much on Twitter as in conferences. Indeed, Twitter is the platform of choice for interaction between politicians and their constituents.
Online is increasingly the preferred route for commerce. Freedom to innovate without permission from the network or platform has been the enabler for e-commerce, online banking and entertainment systems, without which millions of people today could not manage.
When online media has penetrated so deeply into the lives of individuals, it becomes imperative to safeguard free expression online. ‘The Closing of the Net’ argues that States have a legal duty to safeguard it. Yes, there are problems – there is what one might loosely term ‘bad’ content. This could be illegal content, propaganda, or socially unacceptable content, and the list of content types perceived to fall into the ‘bad’ category is growing. Illegal content includes certain types of pornography, hate speech, and incitement to terrorist acts, as defined by law. The other two categories are problematic precisely because they are not defined by law and are subject to individual interpretation. Moreover, much online content is harmless and legal, and it informs, educates and entertains.
How to deal with ‘bad’ content?
The question is: how do we protect the open and free Internet while dealing with the ‘bad’ content? My book, ‘The Closing of the Net’, deals with the conflicts that arose over copyright, which set the precedent for today’s political arguments over terrorist content, hate speech, and other matters such as cyber-bullying and fake news. These are the arguments that now arise over the big technology platforms, such as Facebook and Google, and how they should deal with content. The most salient example today is the scandal around Cambridge Analytica and the alleged misuse of personal data to create fake news stories in an attempt to skew the outcomes of democratic elections.
The State’s duty is to balance the competing interests in any conflict that arises. The government’s answer to date has been ‘self-regulation’, where platforms are asked to deal with the ‘bad’ content themselves, under their own terms and conditions. A key distinction should be borne in mind: their Ts and Cs are not the law. Social media companies operate against privately determined criteria, which do not necessarily match what the law says. Note, too, that this is content uploaded by their users, not their own content; they are therefore arguably regulating their users rather than themselves, contrary to what the term ‘self-regulation’ would suggest.
Self-regulation typically means there is no transparency about the criteria for restricting content, the actions taken, or their scope, and no redress for users. This is not law but corporate rules governing a public space, and it is highly problematic. I argue in ‘The Closing of the Net’ that such measures serve to entrench the monopoly of the platforms – which is why they agree to implement them.
The threat to democracy
In my book I argue that the threat to democracy lies in the narrowing of the prism through which we access knowledge, culture and information, and through which we exercise our right to free speech, coupled with monopolistic control of access. I propose the notion of the ‘fingertap of desire’. We tap the screen and get knowledge, and it all comes via a tiny 6 x 5 inch format, smaller than the traditional postcard. The big technology companies, together with the network providers, control access to that tiny screen, and they have the ability to allow or deny. I argue that this is the greatest source of their power. They can choose what we see or don’t see, and it is all done through automated processes powered by algorithms. I suggest in the book that if we are not careful, we will have the greatest censorship machine ever invented controlling access to our tiny screens. It is important that whoever controls it can be held publicly accountable for any restriction or manipulation of what people can see or do.
I explore in ‘The Closing of the Net’ how the companies that have that control have become political actors. In particular, I take the example of copyright enforcement and how entertainment and music companies have acted politically to protect their interests online.
My approach is based on Lawrence Lessig’s ‘code is law’ theory. Control of the code means control of the system and its rules. Code also interacts with law, and can be regulated by law; likewise, law can be influenced by code and can respond to changes in it. Two further forces interact with code and law: markets and social norms. These four forces interact with one another to create the outcome we experience, and this four-way model helps us understand the different levers that exist to exercise control over the system and thereby protect democracy.
How do the Internet companies hold power?
In ‘The Closing of the Net’ I make the case that Internet platforms hold structural power. They can determine the way things are done – the way information, culture and knowledge are accessed, and the way democratic speech is exercised. They have the power to grant or deny access, and they can deny it at the point of publication or upload – something the networks cannot do, since networks can only render content invisible. The power to deny is the stronger of the two. This is how the platforms control the fingertap of desire: they can open and shut the gates to the tiny screen. As such, they force political choices, such as how to handle the ‘bad’ content, and they alone control the structures for deleting content or taking other policing actions. Access to user data for profiling and other analytics is also a form of structural power. The platforms are now being asked to use this structural power to restrict content – to take it down, to stop the wrong people from advertising, and so on. We have to ensure that this power is itself placed under publicly accountable controls.
Could we have predicted the misuse scandals around fake news and influencing election outcomes?
Could we have foreseen the scandals over fake news or online vote rigging? I would argue ‘yes’. We did not know what form they would take, but we had several warnings and alarm bells about the potential harms within the technology if it were allowed to be misused. I discuss these in my book.
If you want more – please buy ‘The Closing of the Net’ – available here.