How do we collect income tax from an automated workforce? What happens to our urban retail centres if we continue our shift to online shopping? Who will monitor what children will watch in schools while learning on augmented or virtual reality headsets?
The challenge of regulating technology will demand increasing focus from governments and multilateral organisations in the coming years, and will be the subject of large-scale lobbying and intense debate about the role of the state and how far it can or should regulate.
In fact, some of the big players in the sector are as large financially as countries. In 2019 Amazon had total revenue of about $280 billion, comparable to the gross domestic product (GDP) of South Africa, while Apple’s $260 billion was the equivalent of the GDP of Finland and Alphabet (Google’s parent company) had revenue of $162 billion, greater than the GDP of Hungary.
Until recently, we were used to a democratic state that functioned in a vertical fashion. Government made decisions and was held to account by parliament and at elections. The decisions were often informed and interpreted by academics through evidence-based research, then relayed through traditional media who were used to fact-checking and generally trying to provide balance.
The development of the internet disrupted all of that. This was the new “democratic” space. All views could be expressed. Control was in the hands of the proletariat – well, in theory at least.
In reality, new technology in the Western world has resulted in power shifting into the hands of a small unelected group – almost all of them white American men, generally with a libertarian bent and accountable to nobody but shareholders. Increasingly, the Zuckerbergs, Bezoses and Musks look to express their views on how they would shape the world.
Given the power and influence that these men wield as a result of the incredible companies that they have created, their opinions and those of their companies will be useful in helping state actors determine how we need to regulate technology and online activity in the best interests of all our citizens and of the planet.
A co-operative approach will serve tech companies well. The state should look to encourage research and innovation as well as exploration of how technology can help solve national and global challenges. Regulation should not inhibit imagination.
The state also has a responsibility, however, to protect citizens and resources. Increasingly, in the years ahead, the state will need to engage with and regulate tech companies’ use of the data that they collect about us – and with machine learning, biometrics and the use of cyber surveillance, that becomes more complex. It will also need to constantly guard against the abuse of dominant market positions by the tech giants, and it will need to ensure that algorithmic decision-making is carried out in a fair and transparent manner.
The state will also increasingly be challenged in the debate around freedom of expression and the need to be responsible online. How do we deal with hate speech, online abuse and bullying? What about regulating fake news and disinformation?
There will be increasing challenges around cybersecurity and around how the state itself uses the data it gathers. Meanwhile, we cannot continue to kick the issue of digital taxation down the road – if we are going to have public services, we need to find a way to fund them in a digital world.
In late 2018 Mark Zuckerberg issued the blueprint for Facebook’s content governance and enforcement. While it opened with a nod to cyber-libertarian principles, he made clear that Facebook’s “services must respect local content laws” and that he wanted government and industry to work together to determine “where the lines should be drawn between free expression and safety”.
He mused on how these “limits” to expression might be embedded – what content can be “distributed” or “blocked”, and who decides, who enforces and who holds them to account.
This indeed is the challenge for the state. In deciding to ban Donald Trump from their platforms, Facebook and Twitter accepted that they are publishers. Facebook had already acknowledged its responsibility when it banned posts about Holocaust denial.
But should those companies self-regulate or should the state set down the standards? Should the state require of tech companies the same standards that they would require of traditional media publishers?
Legislators need to begin considering all of these issues seriously, and we need to inform ourselves about how technology is shaping everything we do. This decade will determine who makes those decisions.
Senator Malcolm Byrne is a member of the Oireachtas Media Committee, which is currently considering the Online Safety Bill