Under the EU Digital Services Act, very large online platforms have an obligation to identify and assess the systemic risks associated with their services.
The European Commission today (2 October) requested that YouTube, Snapchat and TikTok share more information about their content recommendation algorithms and the role these systems play in amplifying risks to the platforms’ users.
The platforms must submit the requested information by 15 November.
Under the EU Digital Services Act (DSA), companies designated as “very large online platforms”, such as YouTube, TikTok, Facebook and Snapchat, are obliged to identify, analyse and assess the systemic risks associated with their services and to report them to the European Commission for oversight.
Platforms also have an obligation to take steps to mitigate these risks.
The Commission today asked YouTube and Snapchat to provide further information on the parameters their algorithms and systems use to recommend content to users, as well as on the role these systems play in amplifying risks related to users’ mental health, the protection of minors, electoral processes and civic discourse.
The Commission also requested information on how these platforms are mitigating the potential influence of their recommendation systems on the spread of illegal content, such as hate speech and the promotion of illegal drugs.
Similarly, the European Commission has asked TikTok to provide information on the measures it has taken to prevent manipulation of its service by malicious actors and on how it is mitigating risks that could be amplified by its recommendation system.
Based on the platforms’ responses, which are to be submitted within two months, the European Commission may formally open non-compliance proceedings and investigate a platform, or impose fines of up to 1pc of the company’s gross annual revenue.
YouTube has a history of hosting extremist and harmful content and has attracted criticism as a result. Problems that were once prevalent appear to have been curbed since stricter regulations were introduced. However, a study last year suggested that while YouTube may have addressed the algorithm-driven “rabbit hole” effect, it has not managed to eradicate extremist content and misinformation from its platform.
Earlier this year, the European Commission opened formal proceedings against TikTok under the DSA, assessing whether the platform had breached rules on the protection of minors, advertising transparency and the management of risks from harmful content arising from its addictive design and recommendation system.