The EU has requested information from Snapchat, TikTok and YouTube about how their algorithms work. The purpose of this RFI is to reduce the risks of using AI for content recommendations. All three platforms must respond with the necessary data by November 15th, and the next course of action will be decided based on their responses.
The European Union said in a press release that it has sent requests for information (RFIs) to Snapchat, TikTok and YouTube, asking for more detail on how their algorithms work when recommending content to users.
All three platforms must respond with the necessary information by November 15th.
All three platforms are asked to share the parameters their systems use to recommend content, as well as the steps they are taking to address systemic risks related to mental health, elections, and civic discourse. Additionally, TikTok has been asked what measures it takes to stop bad actors from working out how its algorithm functions and manipulating it to their advantage.
Depending on the response, the EU will decide whether further measures are necessary.
These requests are based on the Digital Services Act (DSA). If any of the three companies are found to be in breach of the DSA's rules, they could face fines of up to 6% of their annual global turnover.
Snapchat and YouTube have not yet commented on the matter, but TikTok spokesperson Paolo Ganino issued a statement confirming that the company had received the notification, is currently reviewing it, and will cooperate with the EU Commission throughout the process.
Managing the risks of using AI for content recommendations
The DSA has been in force for quite some time, and all three companies have operated in the EU for many years. So why is the EU suddenly demanding information about their algorithms? The answer is AI.
The DSA classifies these three companies as Very Large Online Platforms (VLOPs). A VLOP is a platform that attracts at least 45 million monthly users in the EU, and it is subject to stricter rules.
As AI is now being used to recommend content, these companies have a responsibility to investigate and mitigate any negative impact AI may have on mental health and civil discourse.
There are plenty of complaints and concerns that social media platforms are too addictive. In fact, the family of a New York teenager recently filed a lawsuit claiming Instagram is too addictive.
The problem is exacerbated by algorithms that constantly recommend relevant posts. The EU therefore wants to analyze the impact of AI on content recommendations in order to curb addiction.
Additionally, they want to ensure that AI-powered algorithms don’t encourage the spread of harmful content.
"The issue also concerns measures taken by platforms to reduce the potential impact of recommendation systems on the spread of illegal content, such as the promotion of illegal drugs or hate speech" — EU
It’s not the first time
This is not the first time the EU has requested such information from these three companies. All three (and many other VLOPs) have previously received RFIs from the EU following the introduction of the DSA.
Some VLOPs have also received requests for information about what steps they are taking to protect children on their platforms.
Among the three, TikTok has had the most interactions with the EU. It is also the only one of them to have undergone a formal DSA investigation, over TikTok Lite's addictive rewards program.
Several other issues are also under discussion, including child safety and the company's anti-addiction efforts.