By Alex Omenye
The European Commission has officially initiated proceedings to assess whether TikTok violated the Digital Services Act in multiple areas, including the protection of minors.
“The compliance with the DSA obligations related to the assessment and mitigation of systemic risks, in terms of actual or foreseeable negative effects stemming from the design of TikTok’s system, including algorithmic systems, that may stimulate behavioural addictions and/or create so-called ‘rabbit hole effects’,” the European Commission said in a statement.
The Commission added: “Such assessment is required to counter potential risks for the exercise of the fundamental right to the person’s physical and mental well-being, the respect of the rights of the child as well as its impact on radicalization processes. Furthermore, the mitigation measures in place in this respect, notably age verification tools used by TikTok to prevent access by minors to inappropriate content, may not be reasonable, proportionate and effective.”
ByteDance-owned TikTok is under investigation over concerns related to advertising transparency, data access for researchers, and the management of addictive design and harmful content.
The Commission’s inquiry is based on a preliminary investigation, including the analysis of TikTok’s risk assessment report and responses to formal Requests for Information.
The DSA, in effect since this month, allows penalties for confirmed breaches of up to 6% of a company’s global annual turnover. The proceedings will specifically scrutinize TikTok’s compliance with DSA obligations related to assessing and mitigating systemic risks, ensuring privacy and security for minors, maintaining a reliable repository of advertisements, and increasing platform transparency.
TikTok, designated a Very Large Online Platform under the DSA, has declared 135.9 million monthly active users in the EU and is required to comply with the act’s corresponding obligations.