CSAM Detection and Reporting

Anthropic strictly prohibits Child Sexual Abuse Material (CSAM) on our services. We are committed to combating CSAM distribution across our products and will report flagged media and related information to the National Center for Missing and Exploited Children (NCMEC).

As just one example of how we are combating CSAM distribution: on our first-party services, we use a hash-matching tool to detect and report known CSAM included in a user or organization's inputs. This tool provides access to NCMEC's database of known CSAM hash values. When an image is sent as part of an input to our services, we calculate a perceptual hash of the image and automatically compare it against that database. In the case of a match, we will notify NCMEC and provide information about the input and the related account.
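To illustrate the general pattern described above: the sketch below shows how perceptual hashing and hash-matching work in principle. It is an assumption-laden illustration only. Anthropic's actual tooling and NCMEC's hash formats are not public, and the function names, the use of a simple "average hash," and the distance threshold here are all hypothetical stand-ins for what a production system would use.

```python
# Minimal sketch of perceptual hashing + hash matching.
# NOT Anthropic's implementation: the hash algorithm (average hash),
# function names, and threshold are illustrative assumptions.

def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale pixel grid.

    `pixels` is a list of 8 rows of 8 brightness values (0-255).
    Each bit is 1 if the pixel is brighter than the grid's mean, else 0.
    Unlike a cryptographic hash, small visual changes to the image
    change only a few bits, so near-duplicates stay close in hash space.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_hash(image_hash, known_hashes, max_distance=0):
    """Return True if `image_hash` is within `max_distance` bits of any
    hash in `known_hashes`. Real systems tune this threshold carefully
    to balance false positives against missed matches."""
    return any(hamming_distance(image_hash, k) <= max_distance
               for k in known_hashes)
```

For example, hashing an image, checking it against a set of known hash values, and acting on a match would look like `matches_known_hash(average_hash(pixels), known_hashes)`; a production pipeline would then trigger the notification step on a `True` result.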

As part of Anthropic’s safety process, we will also send a notice to the user or organization any time we report CSAM to NCMEC. If you receive a notification from us about CSAM detection and believe we’ve made an error, please email
