SOCIAL NETWORKS: With the “Digital Services Act”, new European legislation will impose a “duty to cooperate” on all platforms to remove hateful content
- The Digital Services Act, to be presented by the European Commission on December 15, will for the first time impose “obligations of means” on social networks for content moderation.
- The European Commission will also ask digital platforms such as Facebook, Twitter or Google for transparency about how their algorithms recommend content.
- “In a way, this text comes at the right time,” said Christine Balagué, professor at the Institut Mines-Télécom, alluding to recent events such as the coronavirus crisis and the American elections.
“Online platforms have taken a central place in our lives, in our economy and in our democracy.” This observation by Thierry Breton, the European Commissioner for the Internal Market, sums up the spirit of the draft legislation that the European Commission will present this Tuesday to better regulate the digital world. This eagerly awaited package, made up of two legal components called the Digital Services Act (DSA) and the Digital Markets Act (DMA), intends to impose stricter rules on digital giants such as Google or Facebook, in order to eliminate lawless spaces on the Internet and curb abuses of dominant positions. “It will be a real European digital constitution” that will structure the sector “for the decades to come”, says the office of the State Secretariat for European Affairs.
One of the main components of the Digital Services Act concerns the regulation of platforms and social networks in order to fight online hate and disinformation. The murder in France of the teacher Samuel Paty, who had been targeted on social networks, underlined the dangers of digital lawlessness. The new European legislation will now impose “on all digital services the duty to cooperate” to remove “dangerous” content (hate speech, terrorism, child pornography, etc.). “In a way, this text comes at the right time,” said Christine Balagué, professor at the Institut Mines-Télécom, alluding to the events of recent months (the Covid-19 crisis, the US elections), which showed the deleterious effects social networks can have.
It’s time to put #digital services at the service of Europeans 🇪🇺
🔜 15th December, with 2 major legal acts, which go hand in hand:
✔️ #DigitalServicesAct
✔️ #DigitalMarketsAct
👇 Op-ed co-signed with Margrethe @vestager https://t.co/AEIyS8Oc2G pic.twitter.com/NsBZCCm8BI
— Thierry Breton (@ThierryBreton) December 6, 2020
The European Commission is not expected to touch the hosting status of these platforms, but it will regulate their moderation practices, both in terms of the removal of illegal content (whose definition will remain specific to each Member State, given cultural and historical differences) and in terms of the techniques used to highlight content, known as virality management, which notably covers the recommendation algorithms that surface content on the news feeds of the various social networks.
“Obligation of means” in content moderation
For the first time, the Digital Services Act will impose “obligations of means” on platforms in content moderation. “Today, social networks do not act quickly enough and absolve themselves of their responsibility in this area,” says the office of Cédric O. The Secretary of State for Digital is thus pushing for regulation similar to what already exists in the banking sector for fraudulent transfers. “Banks are not responsible for every fraudulent transfer made by their customers. On the other hand, they are obliged, under penalty of significant fines, to put in place suitable means to detect and report these transfers.”
The Digital Services Act would apply the same principle. If the means put in place by digital platforms to fight hateful or violent content prove insufficient or ineffective, heavy sanctions could be considered: fines or even the blocking of access to some or all of their services. “The objective is to monitor how quickly this content is handled and how reliably it is prioritized [platforms will have to deal with the most serious content first], but also to check that there is no over-removal, in order to preserve freedom of expression,” says the office of the State Secretariat for Digital.
This DSA measure is expected to be added by amendment to the bill against radical Islamism and “separatism”, which was presented to the Council of Ministers on December 9. An obligation of “means” already featured in the Avia bill against online hate, which was largely struck down by the Constitutional Council. “The government’s intention is to be able, via the bill ‘reinforcing republican principles’, to translate it somewhat by anticipation, given the urgency of this subject,” Cédric O declared in an interview given in mid-November.
“Transparency” of the algorithms used to recommend content
The European Commission will also ask digital platforms such as Facebook, Twitter or Google for transparency about how their algorithms recommend content. “We cannot allow decisions that affect the future of our democracy to be taken in the secrecy of a few corporate boards,” explains European Commission Vice-President Margrethe Vestager. These platforms will have to “tell users how their recommendation systems decide what content to show”, which will allow them “to judge” whether to “trust the worldview they give”.
They will also have to “provide regular reports on the content moderation tools they use” and give “better information on the advertisements we see”, so that we have “a better idea of who is trying to influence us”. “Today, we do not know how many French-language moderators work at Twitter. And we are unable to verify the numbers given to us by other social networks. This is not normal,” explained Cédric O in November. After the murder of the history teacher Samuel Paty, the French Secretary of State for Digital had clearly denounced, in an op-ed, the “opacity” of social networks’ algorithms.
The digital giants’ algorithms are regularly singled out. Some have already been manipulated, as was the case with Facebook during the 2016 presidential election in the United States and during the Brexit referendum in the United Kingdom. “Social networks say they are politically neutral, but are they really? No one can currently observe their behaviour,” Benoît Loutrel, who led a study mission of senior French officials at Facebook, explained recently at a conference. “Social networks have a major impact on people’s opinions,” says Christine Balagué. “They therefore owe explanations about the way in which they disseminate information.”