In line with Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (“Digital Services Act” or “DSA”), Telegram Messenger Inc. (“Telegram”) publishes this transparency report covering the Report Period from 17 February 2024 to 17 February 2025. This report is prepared according to the transparency obligations under Articles 15 and 24 of the DSA. It outlines, in particular, how we moderate potentially illegal content and other forms of abuse, and how we handle complaints and official requests.
Telegram has appointed European Digital Services Representative (EDSR), registered at Avenue Huart Hamoir 71, 1030 Brussels, Belgium, as its representative for matters related to the EU Digital Services Act. EDSR may be contacted by competent authorities by email at dsa.telegram@edsr.eu, by phone at +32 2 216 19 71, or by post.
Telegram’s services primarily center on private messaging, and only some optional and ancillary features of Telegram may qualify as “online platforms” under the DSA. Where relevant in this report, we have calculated the statistics for these optional features using the highest reasonable threshold estimate.
As of February 2025, the optional features of Telegram that may qualify as “online platforms” had significantly fewer than 45 million average monthly active recipients in the EU over the preceding 6 months — which is below the threshold required for designation as a “very large online platform.”
Telegram processes official requests from EU Member States concerning content restriction and metadata disclosure. Typically, these requests are issued by law enforcement agencies or judicial authorities operating under European law, or national law in compliance with European law. When such requests are legally valid and authentic, our moderation and legal professionals take appropriate actions, including reviewing and restricting the referred content or providing the phone number or IP address of criminal suspects, as outlined in Section 8.3 of the Telegram Privacy Policy.
During the Report Period, Telegram received 5 046 requests from EU Member States authorities to provide metadata and 553 requests to action certain content.
Within the Report Period, the median response time for content takedown requests was 1.61 days. For information requests, the median response time was 7 days, with the most urgent requests processed within 30 minutes to 48 hours. Telegram treats all valid requests as received at the time of their transmission, provided that they comply with DSA requirements and are submitted by EU law enforcement or judicial officials via DSA-mandated channels.
Number of requests from EU Member States to act against illegal content or to provide basic subscriber information, by requesting Member State
*(In the original report, each Member State is identified by a flag image; the images did not survive text extraction, so placeholders are shown.)*

Member State | Takedown requests | Information requests |
---|---|---|
*(flag)* | 5 | 48
*(flag)* | 22 | 335
*(flag)* | 0 | 31
*(flag)* | 0 | 2
*(flag)* | 0 | 5
*(flag)* | 0 | 41
*(flag)* | 1 | 78
*(flag)* | 1 | 25
*(flag)* | 6 | 60
*(flag)* | 10 | 1 005
*(flag)* | 288 | 1 778
*(flag)* | 1 | 55
*(flag)* | 0 | 19
*(flag)* | 0 | 12
*(flag)* | 2 | 334
*(flag)* | 0 | 27
*(flag)* | 0 | 15
*(flag)* | 14 | 0
*(flag)* | 0 | 27
*(flag)* | 154 | 122
*(flag)* | 0 | 332
*(flag)* | 1 | 63
*(flag)* | 0 | 23
*(flag)* | 0 | 4
*(flag)* | 0 | 1
*(flag)* | 47 | 576
*(flag)* | 1 | 28
Total | 553 | 5 046
Number of requests from EU Member States to act against illegal content or to provide basic subscriber information, by type of alleged illegal activity
Illegal activity | Takedown requests | Information requests |
---|---|---|
Terrorism | 528 | 525 |
Child sexual abuse material | 3 | 556 |
Illegal pornographic content | 1 | 58 |
Calls for violence, violent crimes | 1 | 297 |
Controlled substances | 0 | 621 |
Illegal sale of weapons | 0 | 52 |
Fake money, fake documents | 2 | 171 |
Fraud, scam, extortion | 10 | 1 841 |
Hacking and cybercrime | 5 | 353 |
Missing persons | 1 | 75 |
Other | 2 | 497 |
Total | 553 | 5 046 |
Telegram is a technology company committed to keeping its more than 1 billion users worldwide safe while protecting their privacy and freedom of speech.
While staying true to its core value of user privacy, Telegram actively engages in policy efforts and implements technical and organizational measures to combat abusive content. Telegram has robust policies and procedures in place to routinely detect and remove potentially illicit content and to ban or restrict accounts (users and bots) and communities (groups and channels) that promote or encourage illegal activities.
Telegram commits significant resources to ensure that content available on its service is reviewed 24/7. This includes comprehensive systems and algorithms to prevent automated abuse, proactive review of public content powered by hash-matching databases, advanced AI/ML solutions, and highly qualified moderators, as well as the processing of user reports through AI-enhanced interfaces and tools. Additionally, Telegram thoroughly reviews email reports submitted by users, authorities, and trusted organizations.
Telegram does not algorithmically amplify content — users only encounter content they explicitly choose to engage with, both in its core service of private messaging and through other optional ancillary features that may involve public content. This principle significantly differentiates Telegram from other platforms, substantially reducing the likelihood of exposure to harmful content.
Firstly, Telegram employs an extensive multi-layered rate-limiting system to deter abuse at scale by limiting all rapid or repeated actions. This system functions as an effective preventative measure and blocks automated abuse attempts before they can impact the platform in any capacity.
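A rate-limiting system of the kind described above is commonly built around a token bucket, in which each account holds a budget of actions that refills gradually over time, so sustained bursts are throttled. The sketch below is purely illustrative; the class name, limits, and refill rate are assumptions, as Telegram does not disclose its actual mechanisms or thresholds in this report.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: each account has a budget of
    actions (tokens) that refills over time; bursts beyond the budget
    are rejected. Capacity and refill rate here are illustrative only."""

    def __init__(self, capacity: float, refill_per_sec: float):
        self.capacity = capacity
        self.refill_per_sec = refill_per_sec
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

# A burst of 25 rapid actions against a 20-action budget:
bucket = TokenBucket(capacity=20, refill_per_sec=0.5)
results = [bucket.allow() for _ in range(25)]
print(results.count(True))  # roughly the 20 budgeted actions; the rest throttled
```

The design deliberately never blocks legitimate, human-paced activity: tokens regenerate continuously, so only rapid or repeated actions exhaust the budget.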
Separately, sophisticated algorithms developed by world-class engineers proactively detect patterns associated with bulk messaging, scams and fraud. When such activity is detected, these anti-spam systems are able to automatically remove all relevant messages and restrict the accounts of violators.
Extensive hash-matching databases are employed for content review across groups and channels. These databases contain hashes supplied by external expert organizations or derived from content previously removed by Telegram moderators. If a user attempts to reupload illegal material that corresponds to one of these hashes, the content is removed automatically, preventing it from appearing on the platform. The hash-matching systems operate with 100% accuracy.
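To illustrate why exact hash matching produces no false positives, the sketch below checks an upload against a set of SHA-256 digests of previously removed files: a byte-identical re-upload always matches, and any other file essentially never does. The blocklist contents and function names are hypothetical, and real systems of this kind (including Telegram's, whose internals are not disclosed here) may use different or additional hashing schemes.

```python
import hashlib

# Hypothetical blocklist: digests of files previously removed by moderators
# or supplied by external expert organizations.
BLOCKED_HASHES = {
    hashlib.sha256(b"previously removed illegal file").hexdigest(),
}

def is_blocked(file_bytes: bytes) -> bool:
    """Exact-match lookup: hash the upload and check it against the
    blocklist. A byte-identical re-upload is always caught."""
    return hashlib.sha256(file_bytes).hexdigest() in BLOCKED_HASHES

assert is_blocked(b"previously removed illegal file")   # re-upload caught
assert not is_blocked(b"some unrelated upload")          # clean file passes
```

Because a cryptographic hash is a deterministic function of the file's bytes, an exact-match pipeline can remove known material automatically without human review of each re-upload.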
Furthermore, Telegram leverages specialized AI- and ML-based systems with custom data pipelines that use keywords, phrases, imagery, and logos to identify new or previously unknown potentially illicit content and automatically flag or remove content identical or similar to previously blocked items. These automated systems operate at an accuracy rate of no less than 99.1%, with an error rate below 0.9%.
Similarly, dedicated AI-based and ML-based systems assign confidence scores to user reports to assess urgency, limit the visibility of certain content before additional evaluation, and efficiently route cases for further review. Human moderators review objectionable content in all cases where the appropriate response needs to be validated manually.
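Confidence-based triage of the kind described above can be sketched as a simple routing function: reports in high-severity categories, or those a classifier scores highly, go to a priority queue, while low-confidence reports are reviewed last. The categories, thresholds, and queue names below are hypothetical assumptions for illustration, not values disclosed by Telegram.

```python
from dataclasses import dataclass

@dataclass
class UserReport:
    category: str
    model_confidence: float  # hypothetical classifier score in [0, 1]

# Hypothetical urgency rules; real categories and thresholds are not disclosed.
URGENT_CATEGORIES = {"child sexual abuse material", "terrorism"}

def route(report: UserReport) -> str:
    """Assign a review queue based on category severity and model confidence."""
    if report.category in URGENT_CATEGORIES or report.model_confidence >= 0.95:
        return "priority-queue"      # visibility may be limited pending review
    if report.model_confidence >= 0.5:
        return "standard-queue"      # regular human review
    return "low-priority-queue"     # likely ill-founded, reviewed last

print(route(UserReport("terrorism", 0.40)))  # urgent category overrides score
print(route(UserReport("spam", 0.70)))       # mid-confidence, standard review
```

Routing by confidence lets scarce human review time concentrate on the reports most likely to require urgent action, while every report still reaches a queue.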
Telegram's moderation team processes reports from users and trusted organizations regarding content found anywhere on its platform.
Telegram publishes detailed moderation statistics on a live dashboard that tracks its global moderation efforts.
Telegram maintains a dedicated team of professional moderators around the world in order to both proactively remove potentially harmful content and address user reports.
All Telegram moderators are highly trained and undergo regular quality-assurance checks. Moderators are carefully selected through rigorous logic and language tests, evaluating intelligence, decision-making, integrity, and accuracy. They are provided with interfaces that maximize their efficiency and sets of guidelines specifically optimized for their particular tasks. All moderators have access to feedback tools, overseen by a dedicated team that reviews input, provides clarifications, and updates guidelines as needed.
To reduce exposure to sensitive content, moderators are allowed flexible schedules, with mandatory breaks built into shift-based work. We conduct regular briefing sessions and performance evaluations, including monthly management assessments and daily peer review. A percentage of daily reports is randomly reassigned to several moderators for independent review, helping to ensure that decisions are made consistently.
By type of illegal content or ToS violation | Count |
---|---|
Child sexual abuse material | 32 298 |
Terrorism | 22 085 |
Illegal pornographic content | 63 893 |
Calls for violence, violent crimes | 62 631 |
Controlled substances | 215 400 |
Illegal sale of weapons | 18 351 |
Fake money, fake documents | 87 842 |
Fraud, scam, extortion | 425 026 |
Hacking and cybercrime | 186 122 |
Personal data offenses | 79 061 |
Spam | 2 608 969 |
Other | 68 257 |
Total | 3 869 935 |
By detection method | Count |
---|---|
Hash-based detection and removal | 148 847 |
AI-based detection | 318 894 |
Algorithmic and signals detection | 2 743 315 |
Proactively by human moderators | 202 557 |
Reports | 456 322 |
Total | 3 869 935 |
By type of restriction | Count |
---|---|
Accounts and communities restricted | 2 215 984 |
Accounts and communities removed | 1 653 951 |
In 2 606 cases, the objectionable content was found not to be against Telegram’s Terms of Service but was actioned based on applicable national law. All other cases were handled based on Telegram’s Terms of Service.
Any EU user, non-registered viewer, or organization can also submit notices of illegal content via this page.
In order to report illegal content under the notice and action procedure laid out in Article 16 of the DSA, a person is required to provide their contact details, real name, and a clear explanation of why they believe the content in question is illegal.
Number of notices submitted in accordance with Article 16 of the DSA via notice and action mechanisms categorized by the type of alleged illegal content. This table represents the types of content as alleged by the reporting user, which may not reflect actual violations.
Type of alleged illegal content | Number of notices submitted by persons and organizations from the EU |
---|---|
Illegal goods | 246 597 |
Scam, fraud, spam | 44 088 |
Personal data offenses | 21 405 |
Illegal adult content | 12 302 |
Child sexual abuse material | 5 378 |
Violence | 6 224 |
Terrorism | 2 824 |
Other | 16 977 |
Total | 355 795 |
During the Report Period, Telegram processed 43 219 notices using automated means.
The median action time — calculated as the time between the moment a notice is received and the first action taken on that notice — was 30 seconds for manifestly illegal content addressed through automated systems, and 20 hours for notices reviewed by human moderators. Moderators may carry out extended review or take further action on notices, with subsequent notifications provided to complainants after such additional review.
All notices are reviewed both according to Telegram’s Terms of Service and according to applicable laws. For 200 416 notices reporting 60 749 unique items of content, Telegram confirmed a violation of its Terms of Service and restricted or removed the offending content. In 49 further cases, the content was found not to be against Telegram’s Terms of Service and was actioned based on applicable national law.
Telegram prioritizes notices submitted by DSA-approved trusted flaggers, as well as by other trusted organizations that are not (or not yet) approved as trusted flaggers under the DSA.
Content referrals from trusted flaggers and trusted organizations are reviewed manually by human moderators. In all cases, content was actioned based on Telegram’s Terms of Service.
Notices from trusted organizations | Count |
---|---|
Trusted flaggers in the EU | 146 |
Trusted EU organizations that are not DSA-approved trusted flaggers | 602 |
Total | 748 |
Moderators' decisions can be reversed, for example when the affected user claims that they were restricted by mistake. Within the Report Period, Telegram received 163 038 such complaints.
Telegram received, at the highest reasonable threshold estimate, 43 219 complaints related to optional elements of its service that may qualify as “online platforms”. The median decision time on these complaints was 9 hours.
Within the Report Period, no disputes were submitted to out-of-court dispute settlement bodies certified under the DSA.
Number of complaints by type and number of successfully appealed decisions (related to potential “online platforms”)
Complaint type | Number of EU complaints | Decisions reversed |
---|---|---|
Account or community removal | 97 366 | 13 139 |
Account or community restriction | 59 644 | 8 411 |
Reporter appeal | 6 028 | 374 |
Total | 163 038 | 21 924 |
During the Report Period, Telegram suspended, at the highest reasonable threshold estimate, 949 557 accounts that provided manifestly illegal content using optional parts of the Telegram Service that may constitute “online platforms”.
A share of repeated notices and complaints received during the Report Period were considered ill-founded and subsequently discarded. Generally, in the interest of comprehensive content review, Telegram does not restrict submitters of such notices. Nevertheless, Telegram has restricted 842 accounts that intentionally and systematically attempted to abuse or overload its notice and action mechanisms over time.
Telegram employs strict safeguards to proactively eliminate content that violates local laws and its Terms of Service, preventing it from appearing on the platform or reaching a significant audience. In the event that such content is encountered by a user, trusted flagger or other organization, Telegram moderators and moderation systems quickly respond to reports and official requests.
Telegram not only meets the content moderation requirements of the EU and the Digital Services Act — it exceeds them. The Telegram Team employs world-class engineers who maintain and improve its cutting-edge moderation systems, and it regularly hires new moderators.
As outlined in the DSA, Telegram will continue to release transparency reports on an annual basis. The next report will be published in 2026.