By surrendering information willingly to corporations and governments, we promote the totalitarian state that Aristotle warned against.
Many centuries ago, the Greek philosopher Plato came up with a theory called the ‘Myth of the Metals’. According to it, god created men unequally, mixing different metals—gold, silver, iron or bronze—into their souls, establishing a natural hierarchy. A man with gold in his soul was destined to become a philosopher king, while one with bronze or iron was relegated to the role of a common worker. This is why Plato advocated a totalitarian state in which philosopher kings, being naturally superior, would have absolute authority over citizens. Aristotle, Plato’s student, disagreed. He argued that such a state would impede individual thought and creativity. “It is the mark of an educated mind to be able to entertain a thought without accepting it,” he is said to have remarked.
By upholding the right to privacy as a fundamental right, the Supreme Court has reawakened the spirit of democracy. Because, at its heart, the right to privacy is about choice. By giving an individual the right to decide her own food preferences, the right to take her own life, the right to her sexual orientation or the right to abort, you empower her with the freedom of choice. In other words, you infuse democracy with dynamism.
But, that choice comes with responsibility. By surrendering information willingly to corporations and governments, we inadvertently promote the very totalitarian state that Plato advocated and Aristotle warned against.
Syed Areesh Ahmad, who teaches political philosophy at Ramjas College, University of Delhi, talks of how modern states are moving towards the kind of “surveillance society” that the 20th century French philosopher Michel Foucault warned against. “We are a society that is increasingly getting automated,” he said. “But, while the government has become more efficient, the scope of individual rights has started shrinking.”
Amar Lalwani, who heads research and development at Funtoot, a company that uses big data and machine learning to personalise education for every child, said, “Large amounts of data are now available to companies and governments. Interpreting that data using different algorithms is relatively easy. So, suppose you watch porn online and are later accused in a sexual harassment case; that browsing history can be used against you. Or, if schools have a database categorising students based on their personality, such information can later be used by employers to deny jobs.”
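Lalwani’s warning is easy to make concrete. The sketch below is purely illustrative (the student records, the category labels and the screening rule are all invented), but it shows how trivially a label stored in one database can become a filter in another context:

```python
# Purely illustrative: all records and the screening rule are invented.
# A school's database tags each student with a "personality category".
students = [
    {"name": "Asha", "category": "introvert"},
    {"name": "Bilal", "category": "extrovert"},
    {"name": "Chen", "category": "introvert"},
]

# Years later, a hypothetical employer reuses that stored label as a
# blanket hiring filter: no interview, no context, just the old tag.
shortlist = [s["name"] for s in students if s["category"] != "introvert"]
print(shortlist)
```

The point is not the sophistication of the code, which is a single line of filtering, but that once the label exists, nothing technical stands in the way of reusing it.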
Both Ahmad and Lalwani emphasise the importance of individuals being aware of how their data can be misused. “There is a moral obligation on the part of individuals to protect the right to privacy, both their own and that of others,” said Ahmad. “They need to be vigilant all the time.”
This is not some cataclysmic, fictional future that experts fear might one day come into existence. It is already here. Take the case of fitness-tracking apps. Recently, Fitbit announced that its Fitbit Charge 2 device is now part of UnitedHealthcare Motion’s wellness programme, which is powered by Qualcomm Life, a company that processes data for medical insurance companies. Users of the Fitbit Charge 2 taking part in the programme can earn up to $1,500 in Health Savings Account or Health Reimbursement Account credits.
But, little do users know that tracking people’s medical data through these fitness devices could arm insurance companies with the power to deny coverage. The companies can build algorithms that link the information provided by Fitbit with insurance claims. In effect, we accept payment for volunteering information that can be used against us.
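Little “algorithm” is actually required for such a link. The toy sketch below is entirely hypothetical (the step counts, the claims and the 5,000-step threshold are invented), but the join itself takes only a few lines:

```python
# Purely illustrative: all data and the risk rule below are invented.
# Average daily step counts, as reported by a wearable, keyed by user.
fitness_logs = {"u1": 9500, "u2": 2100, "u3": 7200}

# Claims filed with a hypothetical insurer.
claims = [
    {"user_id": "u1", "condition": "fracture"},
    {"user_id": "u2", "condition": "heart disease"},
]

# A naive underwriting rule: flag any claimant whose tracker shows
# low activity, giving the insurer a pretext to contest coverage.
flagged = [c for c in claims if fitness_logs.get(c["user_id"], 0) < 5000]
print(flagged)
```

Real insurer systems would be far more elaborate, but the asymmetry is the same: the user volunteers the left-hand side of the join, and the company already holds the right.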
“Such trade-offs could lead to dystopian situations in the future,” said Pranesh Prakash, policy director at the Centre for Internet and Society, a non-profit organisation. “We need to analyse what the consequences of such choices could be in the long term.” He says one solution could be to move towards a more decentralised web. “We should be able to opt for decentralised services which allow everyone to own their own data,” he said. “We should have norms to ensure that your data on someone else’s server is still your data.”
Facebook is making billions using your data. But, the bigger consequence might be psychological rather than economic. Three years ago, Facebook conducted an experiment involving 6,89,000 users, in which users’ News Feeds were manipulated to determine whether they could be made happier or sadder, a process the company called ‘emotional contagion’. The implications of the experiment are chilling: it suggests that our moods can be engineered, reducing us to mere automatons.
“Thanks to smartphones or Google Glass, we can now be pinged whenever we are about to do something stupid, unhealthy or unsound,” said Evgeny Morozov, the author of The Net Delusion: The Dark Side of Internet Freedom, in MIT Technology Review. “We wouldn’t necessarily need to know why the action would be wrong. Citizens take on the role of information machines that feed the techno-bureaucratic complex with our data. And, why wouldn’t we, if we are promised slimmer waistlines, cleaner air, or longer (and safer) lives in return?”
Earlier this year, it was discovered that the data firm Slice Intelligence was using Unroll.me, an email decluttering service, to scan people’s inboxes for receipts from services like Lyft, a ride-sharing app, and selling the data to Lyft’s rival, Uber. The revelation sparked a huge controversy over breach of privacy. But it was we who gave the company access to our emails.
The truth is that most of us will sign up for free services without knowing what we’re trading. By doing so, we undermine a key component of Aristotle’s democracy: the power to entertain a thought without accepting it. The Supreme Court, through its judgment, has taken the first step towards protecting our privacy. In many ways, the ball is in our court now.