Amazon discovered a 'high volume' of CSAM in its AI training data but isn't saying where it came from

Engadget | 30.01.2026 05:47
The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The "vast majority" of that content was reported by Amazon, which found the material in its training data, according to an investigation by Bloomberg. However, Amazon said only that it obtained the inappropriate content from external sources used to train its AI services, and claimed it could not provide any further details about where the CSAM came from.