A woman uses her smartphone to log her personal data on a period tracking app.
Atlantico: The changes to abortion law signaled by the Supreme Court in the United States are pushing clinics to look at digital privacy. Some clinic workers say they are adopting encrypted messaging apps and Zoom meetings to leave fewer data trails in case Roe v. Wade is overturned. How does this reflect the reality of the situation?
Fabrice Epelboin: It reflects a stone-cold reality: the moment abortion becomes illegal, one can imagine legal proceedings requiring the recovery of data to catch those who break the law. That is the law. It is also an extraordinary opportunity to realize that all the platforms to which we hand over our digital data, without really thinking about the consequences, are paid for with our freedom. Digital activists have been saying this for ten or fifteen years, but until now it remained rather abstract. Now almost everyone sees it, but it is far too late. And it is not just this one way of identifying a woman who seeks or has had an abortion, far from it: women leave digital traces in abundance that make it possible to characterize this. A few years ago, a family received promotions for diapers without understanding why, before realizing that their 16-year-old daughter was pregnant, something the marketing system had identified from the digital traces she left. The AI had compared her profile to hundreds of thousands of profiles of pregnant women and extrapolated from that. So it is very good that clinics take these measures to minimize their patients' digital traces, but it will not be enough, especially since we recently learned that many hospitals in the USA share their data with Facebook. There will be a thousand other ways to hunt these women down. This holds for abortion as for every other subject. If I have liver cancer and I consult sites that use Google Analytics, Google will eventually deduce that I have liver cancer. The next step: nothing prevents Google from passing this information to insurance companies, which will then refuse to provide me with health insurance. And the worst part is that this does not necessarily violate my personal data. Since the GDPR, data brokers have found ways to circumvent the regulation.
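The inference Epelboin describes, comparing one person's traces to many labelled profiles and extrapolating, can be sketched in a few lines. This is a minimal, invented toy (the site categories, labels, and 1-nearest-neighbour rule are all illustrative assumptions, not any platform's actual method):

```python
# Toy illustration of trace-based inference: a platform that sees which sites
# a browser visits (via embedded analytics scripts) can compare that visit
# pattern to labelled profiles and extrapolate a sensitive attribute.
# All profiles, categories, and labels here are invented for the example.

def jaccard(a: set, b: set) -> float:
    """Similarity between two sets of visited-site categories."""
    return len(a & b) / len(a | b) if a | b else 0.0

def infer_attribute(profile: set, labelled: list) -> str:
    """Return the label of the most similar known profile (1-nearest-neighbour)."""
    best_label, best_score = None, -1.0
    for sites, label in labelled:
        score = jaccard(profile, sites)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

labelled_profiles = [
    ({"oncology-forum", "nutrition", "hospital-portal"}, "liver-condition"),
    ({"sports", "news", "streaming"}, "none"),
    ({"hepatitis-info", "pharmacy", "insurance-quotes"}, "liver-condition"),
    ({"gaming", "music", "news"}, "none"),
]

visitor = {"oncology-forum", "nutrition", "news"}
print(infer_attribute(visitor, labelled_profiles))  # -> liver-condition
```

Real systems use far richer features and models, but the principle is the same: the sensitive attribute is never collected directly, only deduced from the overlap with profiles where it is known.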
Is it possible to change course and get out of this situation?
Today, it is no longer really possible to go back, but it is possible to look things in the face and perhaps reach a collective awareness. The right to abortion wavering in the USA seems to me a fantastic opportunity for that awareness. Women's rights are a sensitive subject, and this will force us to face the reality of the world we have been living in for more than fifteen years. It will no doubt translate into a setback for the rights of American women; after all, this threat comes from the USA, and it is logical that American citizens be its first victims. But it is also one of the last opportunities offered to us to become aware of the world we have entered, a world we have been living in for more than a decade. These companies are more powerful than most states. We simply have to be aware of it; it is too late to fight against it. At best, we can perhaps sketch out solutions, no doubt by strengthening the fight against corruption.
To what extent are these techniques already used in the United States and France?
We know absolutely nothing about it. What is certain is that most of the GAFAMs are moving toward insurance. The Health Data Hub, the warehouse where all our health data is stored, has thus been entrusted to Microsoft. Microsoft has invested colossal sums to buy a company specializing in predictive health based on health data, and it is interested in insurance. Tomorrow, inevitably, supplementary health coverage will be determined by our digital traces. The current welfare-state system will regress, unfortunately inevitably, in favor of private insurance that not everyone will be able to obtain, not necessarily for lack of means, but because of biological inequality in the face of health problems, which an AI will be able to identify so as to insure only profitable customers. I have been using this example for a long time, without much effect. But abortion, this time, is very real.
Is the use of personal data for legal purposes already in place?
When Mila was harassed, her harassers' data was recovered. This still works quite badly in France, because the GAFAMs cooperate very little with the justice system, which, for its part, has remained in the 20th century and does not know how to interact with them. But prosecuting someone on the basis of the disclosure of their personal data is very common.
Is it possible to escape this system?
Individually, there are two ways to do so, totally opposed. The first is to stop using any digital tool; that said, faced with an absence of data, insurance companies will probably not take the risk of insuring you. The second is to understand in detail what we do and how we disseminate our data. Unfortunately, that is within the reach of only a small elite. That elite can lie, cheat, segment its profile, and deceive the AIs. When you really understand how you disseminate your personal data, you have this power, but it is genuinely complex.
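The idea of "segmenting one's profile" can be made concrete with a small sketch: spread activity across separate identities so that no single observed profile accumulates enough sensitive traces to be characterized. The categories, the threshold, and the notion of a broker "characterizing" a profile are all invented assumptions for illustration:

```python
# Toy sketch of profile segmentation: routing different activities through
# separate identities (browsers, accounts) so that no single profile seen by
# a data broker contains enough traces to characterize a sensitive attribute.
# Categories and the threshold are invented for the example.

SENSITIVE = {"oncology-forum", "hepatitis-info", "hospital-portal"}

def is_characterizable(profile: set, threshold: int = 2) -> bool:
    """A broker 'characterizes' a profile once it holds enough sensitive traces."""
    return len(profile & SENSITIVE) >= threshold

visits = {"oncology-forum", "hepatitis-info", "news", "sports", "streaming"}

# One identity for everything: the whole history lands in a single profile.
print(is_characterizable(visits))  # -> True

# Segmented: each sensitive visit goes through a dedicated identity.
segments = [
    {"oncology-forum", "news"},    # identity A
    {"hepatitis-info", "sports"},  # identity B
    {"streaming"},                 # identity C
]
print(any(is_characterizable(s) for s in segments))  # -> False
```

The same total activity takes place in both cases; only the linkage between traces changes, which is precisely why the technique demands a detailed understanding of how one's data is disseminated and joined.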
Collectively, it's screwed up: governments have in any case abdicated in the face of the GAFAMs. Digital sovereignty is no longer in the hands of states; apart from China and Russia, the rest of the world is, from a digital point of view, an American colony, so to speak. Our major hosting providers, from Orange to Atos via Thalès, became GAFAM franchisees during Macron's first five-year term. And it is irremediable.
On the business side, the problem is that the services provided by the GAFAMs are practical, efficient, professional, and often free. In the short term, that is a very good calculation, perfectly compatible with the urgent need to publish an annual report that boosts a share price. In the long term, it amounts to locking oneself into technologies that will end up sucking up any future profits through license fees that will inexorably explode. The alternative is to invest human, technical, and financial resources to guarantee one's independence, but very few companies do this, and it is simply not possible for companies listed on the stock exchange.
What made us let ourselves be locked into this situation?
Digital illiteracy. Business leaders, political leaders, and most journalists understand absolutely nothing about digital issues. It is very easy to lie, to manipulate, to defame whistleblowers, and thus to impose the GAFAMs' solutions while making those who warn of the dangers look like conspiracy theorists.
To what extent have the populations accepted the situation?
I don’t think they accepted the situation. They don’t understand it. We live in a world where the rule of law has declined enormously. Many laws are broken. But if tomorrow a government decides to restore the rule of law based on personal data, it will be carnage. We would be able, if we really wanted to – which is not the case – to apprehend most tax evaders thanks to their personal data.
Is a public revolt a possibility if social control becomes more significant?
I don't think so: other, more compelling reasons to revolt will come long before that. This system of social control is much more insidious than others, such as the one found in China, for example. And to a certain extent, we have already accepted a certain form of social control put in place by social networks. We vaguely know what is going on there. Ever since the Cambridge Analytica affair, most people have realized that a system so effective at selling us products could just as well sell us ideas; and the nudge, which was at the heart of the manipulation orchestrated by Cambridge Analytica, is practiced to excess by the French government, without shocking anyone.
China is always cited as the country where social control is the strongest. Can we imagine a Western state acting like Beijing?
All CAF recipients are already subject to a form of social credit. We can also point to the credit score in the United States. An embryo of social credit already exists in France, the United States, and Japan, all three democracies. China should not be seen as an exception but as a vanguard, one which at least has the merit of being relatively honest and clear about its intentions regarding the use of technology for social control; the West is not left out. The INDECT project, which dates back ten years, had control and social stability as its objective, and again, that did not shock many people. From a digital point of view, everyone, dictatorships and democracies alike, is heading in the same direction. China owns it openly where France does it discreetly, that's all. At worst, we could say that we are behind.