By Dionysios Demetis - Lecturer in Management Systems, University of Hull
I can still recall my surprise when a book by evolutionary biologist Peter Lawrence entitled “The making of a fly” came to be priced on Amazon at $23,698,655.93 (plus $3.99 shipping). While my colleagues around the world must have become rather depressed that an academic book could achieve such a feat, the steep price was actually the result of algorithms feeding off each other and spiraling out of control. It turns out, it wasn’t just sales staff being creative: algorithms were calling the shots.
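The feedback loop is simple enough to sketch in a few lines. The toy reconstruction below is not the sellers' actual systems; the multipliers, starting prices and number of cycles are illustrative assumptions. One bot slightly undercuts its rival while the other marks the rival's price up, and because the combined multiplier is greater than one, the prices ratchet upwards every cycle:

```python
# A toy reconstruction of two repricing bots feeding off each other.
# The multipliers and starting prices are illustrative assumptions,
# not the sellers' actual rules.
price_a, price_b = 35.00, 40.00    # assumed starting prices in dollars

for day in range(1, 26):
    price_a = 0.9983 * price_b     # seller A undercuts seller B slightly
    price_b = 1.2706 * price_a     # seller B marks A's price up by ~27%
    print(f"day {day:2d}: A = ${price_a:,.2f}   B = ${price_b:,.2f}")
```

Left to run for enough cycles, a loop like this turns an ordinary textbook listing into a multi-million-dollar one without anyone intending it.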
This eye-catching example was spotted and corrected. But what if such algorithmic interference happens all the time, including in ways we don’t even notice? If our reality is becoming increasingly constructed by algorithms, where does this leave us humans?
Inspired by such examples, my colleague Prof Allen Lee and I recently set out to explore the deeper effects of algorithmic technology in a paper in the Journal of the Association for Information Systems. Our exploration led us to the conclusion that, over time, the roles of information technology and humans have been reversed. In the past, we humans used technology as a tool. Now, technology has advanced to the point where it is using and even controlling us.
We humans are not merely cut off from the decisions that machines are making for us but are deeply affected by them in unpredictable ways. Instead of being central to the system of decisions that affects us, we are cast out into its environment. We have progressively restricted our own decision-making capacity and allowed algorithms to take over. We have become artificial humans, or human artefacts, that are created, shaped and used by the technology.
Examples abound. In law, legal analysts are gradually being replaced by artificial intelligence, meaning the successful defence or prosecution of a case can rely partly on algorithms. Software has even been allowed to predict future criminals, ultimately controlling human freedom by shaping how parole is denied or granted to prisoners. In this way, the minds of judges are being shaped by decision-making mechanisms they cannot fully understand, because the process is so complex and draws on so much data.
In the job market, excessive reliance on technology has led some of the world’s biggest companies to filter CVs through software, meaning human recruiters will never even glance at some potential candidates’ details. Not only does this put people’s livelihoods at the mercy of machines, it can also build in hiring biases that the company had no desire to implement, as happened with Amazon.
In news, what’s known as automated sentiment analysis gauges positive and negative opinions about companies from a range of web sources. These scores are then fed into trading algorithms that make automated financial decisions, without a human ever having to read the news.
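As a rough illustration of that pipeline (the word lists, threshold and function names below are my own assumptions, not any real trading system), even a crude sentiment scorer can drive a buy/hold/sell signal with no human in the loop:

```python
# Toy sentiment-to-trade pipeline: headlines are scored against simple
# word lists and the aggregate score decides the trade. Real systems use
# far richer models; everything here is an illustrative assumption.
POSITIVE = {"beats", "record", "growth", "upgrade"}
NEGATIVE = {"misses", "lawsuit", "recall", "downgrade"}

def sentiment(headline: str) -> int:
    words = set(headline.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def trade_signal(headlines: list[str], threshold: int = 1) -> str:
    score = sum(sentiment(h) for h in headlines)
    if score >= threshold:
        return "BUY"
    if score <= -threshold:
        return "SELL"
    return "HOLD"

print(trade_signal([
    "Acme beats earnings forecast, posts record growth",
    "Analysts issue upgrade after strong quarter",
]))  # -> BUY
```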
Unintended consequences
In fact, algorithms operating without human intervention now play a significant role in financial markets. For example, 85% of all trading in the foreign exchange markets is conducted by algorithms alone. The growing algorithmic arms race to develop ever more complex systems to compete in these markets means huge sums of money are being allocated according to the decisions of machines.
On a small scale, the people and companies that create these algorithms are able to affect what they do and how they do it. But because much of artificial intelligence involves programming software to figure out how to complete a task by itself, we often don’t know exactly what is behind the decision-making. As with all technology, this can lead to unintended consequences that may go far beyond anything the designers ever envisaged.
Take the 2010 “Flash Crash” of the Dow Jones Industrial Average Index. The action of algorithms helped create the single biggest decline in the index’s history, wiping nearly 9% off its value in minutes (although it regained most of this by the end of the day). A five-month investigation could only suggest what sparked the downturn (and various other theories have been proposed).
But the algorithms that amplified the initial problems didn’t make a mistake. There wasn’t a bug in the programming. The behaviour emerged from the interaction of millions of algorithmic decisions playing off each other in unpredictable ways, following their own logic in a way that created a downward spiral for the market.
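A toy model gives a flavour of how individually sensible rules can interact into a spiral. Everything below, from the stop-loss levels to the price impact of each sale, is an illustrative assumption rather than a reconstruction of the 2010 event:

```python
# Each automated trader follows a simple, individually sensible stop-loss
# rule, but every triggered sale pushes the price lower and trips the next
# trader's threshold, producing a cascade none of them intended.
price = 100.0
impact_per_sale = 0.02                 # assume each sale knocks 2% off the price
traders = [{"stop": 100.0 - i, "sold": False} for i in range(1, 40)]

price *= 0.99                          # a small initial shock
cascading = True
while cascading:
    cascading = False
    for t in traders:
        if not t["sold"] and price <= t["stop"]:
            t["sold"] = True
            price *= (1 - impact_per_sale)   # the sale itself moves the market
            cascading = True

print(f"final price: {price:.2f}, sellers triggered: {sum(t['sold'] for t in traders)}")
```

No single rule here is faulty; the downward spiral emerges purely from their interaction.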
The conditions that made this possible occurred because, over the years, the people running the trading system had come to see human decisions as an obstacle to market efficiency. Back in 1987, when the US stock market fell by 22.61%, some Wall Street brokers simply stopped picking up their phones to avoid receiving their customers’ orders to sell stocks. This started a process that, as author Michael Lewis put it in his book Flash Boys, “has ended with computers entirely replacing the people”.
The financial world has invested millions in superfast cables and microwave communications to shave just milliseconds off the time it takes algorithms to transmit their instructions. When speed matters this much, a human being who needs a massive 215 milliseconds just to click a button is almost completely redundant. Our only remaining purpose is to reconfigure the algorithms each time the system of technological decisions fails.
As new boundaries are carved between humans and technology, we need to think carefully about where our extreme reliance on software is taking us. As human decisions are substituted by algorithmic ones, and we become tools whose lives are shaped by machines and their unintended consequences, we are setting ourselves up for technological domination. We need to decide, while we still can, what this means for us both as individuals and as a society.
Source: The Conversation