Google, Meta, TikTok, and other tech giants have reportedly agreed to revisions of the EU’s anti-disinformation code, which will require the companies to share more information with the European Commission.
A confidential report obtained by the Financial Times includes the details of an updated code of practice that some of the world’s largest tech companies have signed onto.
The ‘Code Of Practice On Disinformation’ is a voluntary agreement to abide by a set of self-regulatory standards to fight disinformation. It was established in 2018 and includes Google, Meta, TikTok, Twitter, Microsoft, and Mozilla.
The code has been amended several times since its introduction, though this latest revision ups the ante for signatories in terms of how much information they’re being asked to disclose.
Here’s some information about the EU’s code of practice, and what’s being added in this set of changes.
EU Code Of Practice On Disinformation
In a collective effort to fight the spread of false information online, the EU established a code of conduct that a number of tech companies signed onto.
Companies that enter into the agreement are asked to share details with the European Commission about the actions they take against disinformation on their respective platforms.
Their commitments involve practices like ensuring political ads are transparent, removing fake accounts, and demonetizing accounts that spread false information.
Each company has to report on measures taken to comply with their commitments under the Code of Practice.
A ‘strengthened’ version of the code has been in the works since last year, and it looks like it’s going to be unveiled soon. Here’s what’s getting added to it.
A ‘Strengthened’ Code Of Practice Releasing Soon
Based on what the Financial Times has reported, the updated code includes three major changes.
The first is that each company will have to provide country-by-country data on its efforts to fight disinformation. Currently, companies provide either global or Europe-wide data.
After some initial resistance, the companies are conceding to demands from regulators for more country-specific data. The reasoning behind sharing more granular data is to assist the Commission with targeting people spreading disinformation within individual countries.
The second significant change is a requirement for companies to disclose how they’re removing or limiting harmful content in advertising and promoted content.
And third, companies that enter into the code of practice will have to develop tools and partnerships with fact-checkers, and include “indicators of trustworthiness” on independently verified information about important issues.
In addition to the new terms, the updated version is said to have 30 signatories that include the aforementioned tech companies and civil society groups. The new version is reportedly being released on Thursday.
Since all parties enter into the agreement voluntarily, the code of practice isn’t enforced with harsh fines or penalties. However, that may change in the future.
The European Commission aims to evolve the code into a co-regulatory instrument under the Digital Services Act, which would allow the code to be enforced through legislation.
If the Code Of Practice On Disinformation is enforced through the Digital Services Act, companies that breach the rules could face fines of up to 6% of global turnover.
Featured Image: VanderWolf Images/Shutterstock