Rising use of paywalls to conceal child sexual abuse material challenges detection, report shows
The Irish Internet Hotline (IIH) processed a record 61,317 reports of illegal online content in 2025, with child sexual abuse material (CSAM) remaining the most significant challenge. Both public and automated reporting systems noted increases, with CSAM increasingly distributed via paywalled and restricted-access platforms to evade detection. This shift has hindered monitoring efforts and raised concerns about the commercialization of abuse content. A troubling rise in material involving infant victims—from 1% to 4% of reports—was observed, alongside a 325% increase in computer-generated CSAM according to one source. Despite these challenges, 99.6% of confirmed CSAM was removed through coordinated international and industry efforts. The IIH also addressed other illegal content, including intimate image abuse, financial scams, and hate speech, while calling for greater involvement from financial institutions in disrupting the monetization of abuse material.
Both sources agree on the core trend of CSAM moving behind paywalls and the record volume of reports. However, Irish Times provides a more comprehensive institutional and policy context, including executive commentary and data on non-CSAM content, while RTÉ offers deeper analysis of abuse severity and attributes report increases to expanded international monitoring. Irish Times includes emerging threats like AI-generated content, which RTÉ omits.
- ✓ The Irish Internet Hotline (IIH) processed 61,317 reports of illegal online content in 2025, the highest in its history.
- ✓ There was a significant increase in reports involving infant victims of child sexual abuse material (CSAM), rising from 1% in 2024 to 4% in 2025.
- ✓ CSAM is increasingly being distributed behind paywalls and restricted-access platforms, which hinders detection and removal efforts.
- ✓ The IIH achieved a 99.6% removal rate for assessed CSAM in 2025, regardless of hosting location.
- ✓ The shift to paywalled and closed-access platforms is described as a major challenge for monitoring and takedown operations.
- ✓ IIH works with An Garda Síochána, online service providers, and international networks like INHOPE/ICCAM to identify and remove illegal content.
Cause of increase in report volume
RTÉ: Attributes the 10.8% increase in public reports of CSAM primarily to the expansion of the international ICCAM/INHOPE monitoring system, which allows IIH to act on content hosted abroad.
Irish Times: Reports an overall 14.7% increase in total reports but does not attribute the rise in CSAM reports to expanded international monitoring, emphasizing the general growth without contextualizing its cause.
Severity classification and content breakdown
RTÉ: Provides a detailed breakdown of CSAM severity levels, noting that Level 4 (penetrative acts involving children and adults) accounted for 44% of assessed content, and frames this as indicative of the 'gravity' of abuse.
Irish Times: Does not mention severity classifications or a breakdown of content types; omits this granular detail entirely.
Additional categories of illegal content
RTÉ: Mentions IIH’s work on intimate image abuse and other illegal content but focuses almost exclusively on CSAM.
Irish Times: Expands coverage to include statistics on child sexual exploitation material (510 reports, 88% removal), intimate image abuse (96% removal), financial scams (+52%, 194 sites), and racism/xenophobia (510 reports, 44% decrease, 4 criminal cases), providing a broader institutional scope.
Executive commentary and policy implications
RTÉ: Includes no direct quotes from IIH leadership; presents findings through report summary alone.
Irish Times: Features a direct quote from CEO Mick Moran calling for financial institutions to help disrupt the monetization of CSAM, adding a policy and systemic accountability dimension absent in RTÉ.
Computer-generated CSAM
RTÉ: Does not mention computer-generated or AI-generated abuse material.
Irish Times: Notes a 325% increase in computer-generated CSAM, highlighting an emerging technological threat.
Framing: RTÉ frames the event as an intensifying operational and ethical crisis in digital child protection, emphasizing the severity and vulnerability of victims, particularly infants, and the technical challenges posed by paywalls.
Tone: urgent, clinical, and victim-centered
Framing By Emphasis: RTÉ emphasizes the operational impact on analysts’ ability to monitor content, using phrases like 'fundamentally altered' and 'one of the most disturbing trends' to stress severity.
"The transition... 'fundamentally altered' the ability of IIH analysts to assess, monitor and act against CSAM"
Appeal To Emotion: Detailed focus on severity levels (Level 4 = 44%) and infant victim statistics frames the issue through a lens of escalating harm.
"The prevalence of this second most severe classification... reflects 'the gravity of the material being encountered'"
Proper Attribution: Attributes increase in reports to ICCAM/INHOPE system expansion, providing context that prevents misinterpretation of public reporting surge.
"growth was due primarily to the expanded international monitoring system known as ICCAM"
Balanced Reporting: Does not include CEO statements or policy recommendations, maintaining a report-focused, descriptive tone.
Framing: Irish Times frames the issue as part of a broader ecosystem of online harm, emphasizing commercialization, technological threats (AI-generated content), and the need for systemic interventions involving financial networks.
Tone: institutional, policy-oriented, and expansive
Narrative Framing: Uses the term 'commercialisation' and highlights subscription models, framing CSAM as increasingly monetized and organized.
"a growing trend towards the commercialisation and concealment of online child sexual abuse material"
Editorializing: Includes CEO quote urging financial institutions to act, shifting focus toward systemic accountability.
"the need for financial institutions to play their part"
Framing By Emphasis: Introduces new data on computer-generated CSAM (+325%), signaling concern about technological evolution of abuse material.
"325 per cent increase in computer-generated child sexual abuse material"
Comprehensive Sourcing: Reports broader content categories (scams, hate speech), positioning IIH as a multi-issue watchdog, not solely CSAM-focused.
"Reports relating to financial scams increased by 52 per cent... 510 reports relating to racism and xenophobia"
Cherry Picking: States overall report increase as 14.7% without clarifying that CSAM-specific rise is lower and context-dependent, potentially inflating perceived public reporting surge.
"rose to 61,317, a 14.7 per cent increase on 2024"
Child abuse sites using paywalls to evade detection
Online child sex abuse moving behind paywalls, says internet watchdog