In 2008, a formal proposal was presented to the Stratigraphy Commission of the Geological Society of London, asking to make the Anthropocene, the age of man, a formal geological epoch. However, with the exponential growth of computing power and the increases in algorithmic intelligence, it seems far more likely that we are entering an age where not mankind, but its cybernetic and algorithmic progeny, takes center stage.
The economic idea of the rational man has long been identified as a fiction; as Herbert Alexander Simon wrote in Models of Man, our rationality is bounded. Yet by learning to understand these boundaries and our environment ever better, we are overcoming these limitations through the use of technology. Over the last several decades, our progress in technology, understanding and analysis has been overwhelming: it has liberated women and children and has defeated or significantly diminished severe illnesses. Bill Gates predicts that by the mid-2030s there will be no countries left in the world that are considered poor by today’s standards. And while there is much uncertainty, the trend visible in the data is on his side.
God’s in his heaven, all’s right with the world?
While the positive effects of this development are apparent, it is nevertheless important to consider the consequences of fast-paced progress in depth. While it may have been possible to deliberate about the consequences of the introduction of the railway for several years, politics and public debate can hardly keep up with the advances in technology today, let alone tomorrow. The progress in the algorithmic analysis of environmental and behavioural data by governments, corporations and individuals increasingly presents society at large with a fait accompli, and we must equip our societal systems to mitigate this fact.
We certainly benefit, individually and collectively, from the nudges and hints made possible by the extensive analysis of big data. Still, this development must also give us pause, for it may lead us down a path where, ultimately, the primacy of individual freedom is yielded to cybernetic algorithms and, eventually, forms of artificial intelligence: If everyone tracks their caloric intake and keeps fit, can we really still justify eating something we would like to eat but that is not compatible with our caloric budget? If our smartphones scold us for not having exercised for two days and use gamification to bribe our minds into feeling great when we eventually do, does this not compromise our freedom?
Is it Big Data, or Artificial Intelligence we should be worried about?
When algorithms and artificial intelligences increasingly tell us which action to take on a personal, national and global level, without anyone really understanding their reasoning in all its detail, does this not clash with our ideals of a liberal democratic society in which everyone has the freedom to pursue happiness by whichever means suit them?
For the longest time, people have feared the specter of Orwellian surveillance, but what we increasingly face is the decidedly Huxleyan idea of a society of leisure, one that preconditions each and every one of us with active, context-sensitive sanctions, and that, at this pivotal point in history, may change humanity forever. As the noted physicist Stephen Hawking put it in a recent article: with the rise of artificial intelligence, which will base its decision making chiefly on the advances we are making in big data analysis, humanity is facing “potentially the best or worst thing to happen to [it] in history”. And thus far, we are not according it the thoughtfulness it requires.
Ubiquitous digital technology and the capacity to store, analyse and algorithmically act upon practically infinite amounts of data help us solve problems, but they may also, eventually, relegate us to a state Kant would have recognized: the selbstverschuldete Unmündigkeit from which sapere aude emancipated us. We lived in ignorance before we learned how to understand the world; now technologies of our own devising may soon understand it better than we ever could. The consequences are profound and difficult to predict.
Consider that when the Spinning Jenny was invented, it was demonized by the Luddites in England, who saw it as a tool that would take power from them and enslave them. In France, only a short time later, the advances Bouchon, Vaucanson, Jacquard and others made in developing programmable looms were greeted with enthusiasm. The workers saw them as a means for their empowerment, and by cooperating with the inventors they also gained the power to shape the path of progress. The Luddites failed to achieve anything meaningful. When we look back on the invention of the printing press, we see it as a revolutionary device for spreading the promethean gift of knowledge, but it initially spread Prometheus’ more literal gift of fire by inciting religious wars.
This is not to be taken as alarmism, merely a reminder to carefully study history and heed its lessons: while we now consider them almost universally beneficial, previous substantial advances in technology were regularly accompanied by strife and unrest, and it is only through inclusive discourse and democratic compromise that society can influence which path is taken. Today, more than ever, we are in a position to responsibly chaperone change, thereby preventing or mitigating potential negative effects while enhancing positive developments. If we wish to build a better tomorrow in which democracy and the ideal of a liberal society still have the force that they have today, we will need to bring this discussion from the ivory towers into society at large and recognize that many of our current notions of privacy and what it means to be human will have to be questioned. Otherwise, by increasing our dependence on big data, artificial intelligence and technology that most of us no longer understand, we may, without knowing it and feeling ever more empowered, be nudged onto a Road to Serfdom more beguiling than any Friedrich August von Hayek dreamed of when he wrote his seminal work of the same name.