Report links Discord to string of child kidnapping and exploitation cases
This article contains explicit discussion of child and adolescent grooming, abduction, and sexual exploitation.
Paedophiles have reportedly used Discord to sexually exploit minors before. A new NBC News analysis found that the popular chat platform was named in 35 cases over the past several years in which adults were prosecuted, and ultimately convicted, for crimes involving juveniles they had communicated with on the service.
The National Center for Missing & Exploited Children (NCMEC) found that reports of CSAM (child sexual abuse material) on Discord increased by 474 percent between 2020 and 2021. Such material is strictly forbidden under Discord's rules, as well as the laws of nearly every country on Earth.
NBC points out that Discord isn't the only platform facing issues related to the online exploitation of children. But the service's deep ties to video games, from native apps on consoles to developers moving their official communities to the platform, have only exacerbated the problems it already had.
An additional 165 cases of adults being charged with sending or receiving CSAM on Discord have reportedly been identified; in 91 percent of those prosecutions, the defendants were found guilty.
John Shehan, senior vice president of NCMEC, has acknowledged the "explosive" growth of CSAM on Discord, and called the risk of child exploitation on the platform "indisputable."
Shehan adds that NCMEC "regularly" receives reports from other online platforms in which users specifically name Discord as a venue for CSAM, further underlining his point.
Discord's blind spots have let the problem grow
According to NBC's investigation, part of the reason Discord still has a CSAM problem is that it lets users of all ages mingle in community servers. Like many online services, it also has no reliable way to verify a user's stated age.
Discord says it doesn't monitor every server or conversation, but that when it sees an issue, it looks at the behaviours behind it and addresses them.
NCMEC suggests that Discord's slowing response time to such reports is a contributing factor. In 2021 the company took an average of three days to respond to complaints; in 2022 that figure rose to five days. In some cases, Discord may have stopped responding to complaints altogether.
According to its 2022 transparency report, Discord disabled over 37,000 accounts for violating its child safety policies. The company purportedly provides data such as IP addresses, chat logs, and account names when working with law enforcement on CSAM complaints.
Discord has also announced that it plans to adopt age assurance technology to reduce the volume of exploitative content on the service. Law enforcement and child safety experts, however, believe these concerns should have been addressed far sooner.
"Safety by design needs to be included from day one, not from day 100," said Denton Howard, executive director of Inhope, a network of hotlines that field reports of child abuse and exploitation.
Check out this link for NBC News' full coverage. The report also discusses Discord's partnerships with other organisations working to reduce CSAM production worldwide, as well as the methods adults use to lure young people into their servers.
Game Developer has reached out to Discord regarding the NBC News report and will provide an update if a response is given.