Much of the outrage over the recent Cambridge Analytica incident has centred on the fact that academic Aleksandr Kogan could, with a simple Facebook survey app, leverage 270,000 survey respondents to gain access to the data of 50 million users directly from the Facebook platform. While most of those respondents were dimly aware that they must have consented to sharing personal information, many of them were surprised (outraged, even) that, in addition to information about them, the app was able to harvest information about their friends as well.
Nothing about this incident really surprises me. If anything, I am surprised that everyone around me is so surprised.
The Facebook terms of service make it clear that when we use a Facebook app, we give the app developers access to a wide variety of information about us, which they may use to personalize our user experience. This explicitly includes access to our list of friends. Anyone who has used an app on Facebook has done so on these terms, and app developers who collect information about our friends are doing so legitimately, based on the consent we provided. Everyone is upset that Facebook is hiding behind its terms of service. But to me what is more significant is how few of us actually bother to read the terms of service, or to fully understand the repercussions of what we are signing up to.
Today, consent is a formality—a step we blindly take when we sign up to new services. We agree to terms and conditions as acts of faith, trusting that the service providers we’ve signed up with will not let us down. Even if we want to be mindful about what we are committing to, we feel a sense of futility in even attempting to understand these terms as we have no scope to negotiate any provisions we might find unacceptable.
Social media networks have become our principal point of interaction with friends and family. As a result, our need to participate on these platforms often overrides our instinct to refrain from agreeing to clauses we disagree with.
To be clear, rarely have I found the terms of service of social media platforms to be deceptive. Most alleged violations tend to take place within the realm of what is legally permissible. They usually involve a breach of the spirit rather than the letter of the law.
In this most recent incident, Kogan’s app was entitled to access information about the friends of its users. Kogan then passed that information on to Cambridge Analytica, which was against Facebook’s terms. There was nothing illegal about the initial collection, however; what was wrong was that the data wasn’t used as it was intended to be.
This is why consent cannot be the primary safeguard that protects our personal privacy. When it is made available to them, service providers will hide behind it, using our acceptance of their terms as absolution of their responsibility to protect us from harms we might suffer as a consequence of using their service. If, instead, service providers can be held accountable for their actions despite the consent we have provided them, we might have a useful framework to safeguard our privacy.
Last year, I spelled out the contours of such a framework, calling for an accountability regime that completely does away with consent, in Beyond Consent: A New Paradigm For Data Protection. At the time, many argued against such a drastic measure on the ground that consent allowed us to exercise autonomy over what can be done with our data. In the light of recent incidents, I wonder how many still think consent is useful.
As much as autonomy is necessary, consent is irrevocably broken, and in today’s world of take-it-or-leave-it contracts, its continued use is meaningless. If we can no longer negotiate our contracts or truly understand their implications on our personal privacy, any faith we have in our consent as the instrument of our autonomy is misplaced.
That said, autonomy is important, and in the paper I suggested a granular opt-out mechanism by which we can preserve it. I also suggested a framework of learned intermediaries, coupled with regular audits, to ensure that data controllers are held accountable even if we don't fully understand their algorithms or the way in which their services work. What we can no longer afford to do is allow our consent to terms and conditions, whose implications we barely understand, to absolve data controllers of their obligation to be responsible for the privacy implications of their actions.
In my submissions to the Justice Srikrishna Committee on Data Protection, I have suggested that India should take the bold step of doing away with consent entirely. As much as it might have been useful decades ago, consent no longer fulfils the purpose we think it serves. Given the trajectory of our growth, India is likely to become a significant player in the data economy over the coming decade. We need a privacy law that is in tune with the realities of the future to which we are hurtling. If the evidence before us is any indication, consent is useless. We should not feel constrained to use consent as the foundation of our law merely because the rest of the world does so. India has a unique opportunity to create a truly forward-thinking privacy law completely unencumbered by the baggage of consent that the rest of the world is struggling to shrug off.
I hope we have the courage to find our own way.
Rahul Matthan is a partner at Trilegal. Ex Machina is a column on technology, law and everything in between. His Twitter handle is @matthan.