OPINION: We are walking into a digital dystopia

The COVID-19 pandemic is accelerating our reliance on data-harvesting technologies

Representational image | Shutterstock

Comedian Bill Maher, on his TV show Real Time, memorably said that tech tycoons “need to stop pretending that they’re friendly nerd gods building a better world. They have to admit they’re just tobacco farmers in T-shirts selling an addictive product to children.”

The COVID-19 pandemic, which left billions confined to their homes, gave tech tycoons unprecedented access to our daily lives. We spend countless lockdown hours glued to our mobile phones, tablets and computers; our kids attend classes online while we attend meetings on Zoom. Apps facilitate it all, and as we get used to this ‘new normal’, the ‘appmosphere’ becomes a new world order.

We are led to believe that this is a one-way street, where we stand only to gain and have nothing to lose from the technology invasion. Nothing could be further from the truth.

Data-driven colonisation is pervasive, and it poses multiple dangers to privacy and individual freedom. Even a seemingly innocent technology like the personalisation of search results can hand technological forces a powerful tool to mould us in ways we would never voluntarily consent to.

Data-driven colonisation

Kiran S, IPS, Superintendent of Police with the CBI

It is critical that technology be committed to the privacy of its users, even as it tailors their experiences individually. Should I choose to experience any digital platform, I should always have the exclusive right to the knowledge the platform gathers from my data, as well as the exclusive right to decide how that knowledge is utilised. Today, this knowledge of your experiences, psychology, attitudes and behaviour is subject to the unilateral claims of business ventures that thrive on the knowledge economy.

Shoshana Zuboff, in her seminal book The Age of Surveillance Capitalism, explains that our personal data, encompassing our experiences, behaviours and attitudes, serves as raw material for a handful of corporations, which unfairly decide the end-use of our experiences. She rightly claims that we are the raw material, being monetised under the garb of being given a tailored digital experience.

Amazon knows my buying preferences, what I read, what movies and TV shows I watch; Apple and Samsung know my travel details; Facebook knows my past, present—and possibly my future—and knows the same for my friends. In the name of curating a better experience, these technological behemoths may know more about me than I do about myself. For tech companies, I am a person of interest (PoI).

It is a new-age colony, the difference being that you do not even know you are being colonised, and that your very life is becoming a slave to technology.

The unlikely evils of personalisation

Deepak P, Assistant Professor in Computer Science at Queen's University, Belfast

A few months ago, we were at a reunion of old friends, talking about how our lives had changed over the decades. It was not long before one person said, “I have given up reading newspapers and magazines. Google News and such recommendation algorithms have greatly simplified having to choose the news I want to read. There is simply no going back.”

As one might guess, this prompted several discussions about the embarrassing mistakes that recommendation algorithms can make. “Last year I applied for a job in Kolkata simply to test my value in the market, and ever since, these algorithms have been treating me as a Kolkata resident—what dumbos!” said one friend.

While such algorithmic mistakes can make for nice reunion conversations, they can largely be fixed by providing feedback to the recommendation engine: some results let you flag them and pick a “not interested in stories like this” option, prompting the algorithm to avoid showing you such stories again.

While ostensibly designed to personalise content, these custom preferences can have a subtle, unapparent effect on us that is serious and harmful to our existence as multifaceted human beings: they push us deeply and firmly into one corner of the news spectrum.

Let us consider some examples. If I set the news location preference to my hometown, the algorithm will decide not to show me any news about a faraway city, however interesting it may be. If I tell the algorithm just once that I do not fancy football, it will never risk showing me football news again. That is not exactly what I would like. By saying I am not quite a football fan, I mean only that much; I do not mean that I hate football. If Lionel Messi is visiting to play in a friendly match, I may still like to know about it. But the algorithm does not get such nuance. For it, it is simply not worth the risk to show me football news.
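For the technically inclined, here is a toy sketch of how such a blunt preference filter might behave. It is purely illustrative: the class, its names and its logic are our own invention, not the code of any real news engine, but it captures how a single “not interested” flag can quietly become a permanent topic ban.

    # A purely hypothetical news filter; not any real engine's code.
    from collections import defaultdict

    class NaiveNewsFilter:
        def __init__(self):
            self.banned_topics = set()      # one "not interested" flag bans a topic forever
            self.clicks = defaultdict(int)  # positive signals per topic

        def flag_not_interested(self, topic):
            # "I am not quite a fan" is collapsed into a hard exclusion.
            self.banned_topics.add(topic)

        def record_click(self, topic):
            self.clicks[topic] += 1

        def recommend(self, stories):
            # stories: a list of (topic, headline) pairs.
            # Rank by past clicks, silently dropping banned topics entirely,
            # even a once-in-a-lifetime story.
            visible = [(t, h) for t, h in stories if t not in self.banned_topics]
            visible.sort(key=lambda pair: self.clicks[pair[0]], reverse=True)
            return [h for _, h in visible]

    feed = NaiveNewsFilter()
    feed.flag_not_interested("football")
    print(feed.recommend([("football", "Messi to play friendly in your city"),
                          ("politics", "Budget session begins today")]))
    # Prints only the politics headline; the Messi story never surfaces.

The point is not the code but the design choice buried in it: a graded human preference is stored as a binary ban, and the nuance is lost.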

After all, the algorithm’s ‘unique selling point’ is efficiency relative to newspapers. It tries to automate one particular manual task that I undertake while reading a newspaper: sifting through headlines and identifying which articles to read. But consuming news is not like having an ice cream or a samosa; the act influences us in myriad ways. Familiarity often breeds attraction. Consequently, algorithms change us over time; it is not a one-way street. This happens across all topics. A slightly left-leaning person is pushed firmly to the left in their political interests, whereas a slight right preference evolves towards the far-right. The algorithm likes this, since it makes its own job simpler; it is far easier to predict what a far-right reader would like than what a centrist would fancy.

These algorithms shape us without our realising it. This is often called the ‘echo chamber’ effect: we soon reach a stage where we hear only voices similar to our own.
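Again for the technically inclined, a toy simulation can make this feedback loop concrete. Every number below is invented for illustration; no real recommender works exactly like this. The assumed rule is simple: the feed over-serves whichever side the reader already leans towards, and each served story nudges the reader’s own leaning a little further the same way.

    # A toy simulation of the echo chamber feedback loop; all numbers are invented.
    import random

    random.seed(7)
    leaning = -0.1  # a reader starting slightly left of centre, on a -1 (left) to +1 (right) axis

    for _ in range(365):  # one recommended story a day, for a year
        # The feed serves a left-leaning story with a probability that grows
        # as the reader drifts left (and vice versa for the right).
        p_left = min(1.0, max(0.0, 0.5 - 1.5 * leaning))
        story = -1.0 if random.random() < p_left else +1.0
        # Familiarity breeds attraction: exposure pulls the reader's own
        # leaning a small step towards the story just read.
        leaning += 0.02 * (story - leaning)

    print(f"leaning after a year: {leaning:+.2f}")
    # With these invented numbers, the mild -0.1 almost always ends pinned
    # near -1.0: a slight preference amplified into an extreme.

The loop is the whole point: the reader’s behaviour trains the feed, and the feed in turn retrains the reader, which is why a mild preference rarely stays mild.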

You may think that one could simply revert to reading newspapers to get a plurality of perspectives. But it is not so easy to tune out. Your digital footprint is everywhere. Netflix and YouTube know your movie preferences and nudge you towards them. Your online grocer likely uses a recommendation algorithm to show you offers based on your previous shopping. Nor is your cuisine spared from being treated as predictable: isn’t it more ‘efficient’ to simply repeat your last Zomato order, via the conveniently placed button, than to search for dishes all over again?

Our digital footprints, which have undoubtedly become far more vivid and deep during the COVID-19 lockdown, are scattered across many digital service providers. These providers constantly use such “resources” to nudge us in directions they find easier to operate in. We are increasingly becoming digital clay that a number of highly dexterous algorithmic hands are moulding, each in accordance with its own incentives.

One of the final topics of the reunion was the looming AI singularity; we discussed several ways we could prevent humanoid robots from controlling humans. Throughout this discussion too, ironically, many of us were pulling digital devices out of our pockets, providing periodic fodder for AI-driven algorithms to control us even better.

Is the digital dystopia inevitable?

Is there no way out of this trap? Of course there is, but it is not easy. It involves making hard choices about where we draw the lines, and the first step is increased awareness of what digital technology does to us. A heightened awareness would help us understand that blind faith in technology will eventually undermine us. As the digital colonisers, in their quest for expansion, try to extend the breadth and depth of their control over our grey matter each day, let us make sure that we remain firmly in control of our friendships, desires, values and, thus, our lives.

The opinions expressed in this article are those of the authors and do not purport to reflect the opinions or views of THE WEEK.

Kiran S. is an IPS officer presently working as Superintendent of Police in the CBI. 

Deepak P. is an Assistant Professor in Computer Science at Queen's University Belfast. His interests are in AI and data ethics.