Telegram CEO Pavel Durov announced on Friday that the messaging app would enhance its content moderation and remove features that have been exploited for illegal activities.
This move comes as Durov faces a formal investigation in France over the use of Telegram for crimes such as fraud, money laundering, and sharing child exploitation material.
In a message to his 12.2 million followers on Telegram, Durov emphasized that while 99.999% of users have no involvement in criminal activity, the small fraction that does tarnishes the platform’s reputation. “This year, we are committed to turning moderation on Telegram from an area of criticism into one of praise,” he stated.
Durov did not provide detailed plans, but he noted that Telegram had already disabled media uploads to a standalone blogging tool that had been misused, and had removed the “People Nearby” feature after it was abused by bots and scammers. In its place, the platform will promote legitimate, verified businesses in users’ vicinity.
These changes mark Durov’s first public response since his arrest in France, where he was questioned for four days before being released on bail. The case has sparked global debate about the limits of free speech online and whether platform owners can be held accountable for user crimes.
Telegram has also quietly updated its FAQ page, removing language that said it does not process reports of illegal content in private chats. Durov, who separately announced that Telegram had reached 10 million paid subscribers, defended the platform’s moderation efforts, calling claims that Telegram is an “anarchic paradise” unfounded.