ActivityPub/Primer/Detecting and Reporting CSAM


Note: The following discussion is not legal advice.

ActivityPub processors should consider the issue of child sexual abuse material (CSAM) in their implementations.

CSAM is a complicated topic; this page discusses only some features that ActivityPub servers and clients can use to limit its distribution.

  • Screen images for CSAM. The PhotoDNA cloud service provides both screening and reporting functionality.
  • Report confirmed CSAM images. PhotoDNA also includes reporting functionality. Moderators may need a separate, manual mechanism for reporting images.
  • Check locally submitted images for CSAM. When images are uploaded to the server via the ActivityPub API or another API, the server should screen the content for CSAM before accepting it. This keeps the distribution of matching material to a bare minimum (see the screening sketch after this list).
  • Re-check existing images. Newly-created CSAM may not appear in remote API service databases for some period of time. It may make sense to re-check uploaded files multiple times after creation.
  • Report deletions to recipient servers. If an image that has been distributed to recipient servers is later deleted by moderators, don't delete it silently. Send a Delete activity to recipient servers so they know to delete cached copies from their own storage (see the example after this list).
  • Check images received remotely through ActivityPub for CSAM. When images are received by the server via the ActivityPub protocol from other servers, the recipient server should screen the content for CSAM.
  • Consider moderator safety in administrative interfaces. When content is reported to moderators, consider the trauma caused by viewing CSAM. Blurring, pixellating, rendering in black-and-white, or otherwise limiting the traumatic impact of suspect images can give moderators the option to review details or to report and remove the content (see the blurred-preview sketch after this list).
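
A minimal sketch of the screening step, assuming an image-hash matching service reachable over HTTP. The endpoint URL, the X-Api-Key header, the IsMatch response field, and the screen_upload/handle_media_upload function names are all placeholders for illustration, not the real PhotoDNA Cloud Service interface; consult the provider's documentation for the actual API. The same check can be re-run later on stored files, per the re-check item, since new material may take time to appear in hash databases.

  import requests

  # Placeholder values; the real screening service's endpoint, headers,
  # and response schema will differ -- see the provider's documentation.
  MATCH_ENDPOINT = "https://example.org/csam-match"   # hypothetical
  API_KEY = "replace-with-provider-issued-key"        # hypothetical


  def screen_upload(image_bytes: bytes) -> bool:
      """Return True if the image is safe to accept, False if it matches a known hash list."""
      response = requests.post(
          MATCH_ENDPOINT,
          headers={"X-Api-Key": API_KEY,                      # hypothetical header name
                   "Content-Type": "application/octet-stream"},
          data=image_bytes,
          timeout=30,
      )
      response.raise_for_status()
      result = response.json()          # hypothetical shape: {"IsMatch": true/false, ...}
      return not result.get("IsMatch", False)


  def handle_media_upload(image_bytes: bytes) -> None:
      if not screen_upload(image_bytes):
          # Reject the upload, queue it for moderator review and reporting,
          # and never federate the media to other servers.
          raise PermissionError("Upload rejected: matched known CSAM hash list")
      # ... otherwise store the media and continue normal processing ...

The same screening function can be applied to media received from remote servers over the ActivityPub protocol before it is cached or displayed.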
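
For the "report deletions" item, a sketch of sending a Delete activity to servers that previously received the image. All URLs are invented for illustration, and the HTTP Signature that most ActivityPub servers require on inbox POSTs is omitted here.

  import json
  import requests


  def send_delete(object_id: str, actor_id: str, recipient_inboxes: list[str]) -> None:
      """Tell recipient servers to remove their cached copies of a deleted object."""
      activity = {
          "@context": "https://www.w3.org/ns/activitystreams",
          "id": f"{object_id}#delete",
          "type": "Delete",
          "actor": actor_id,
          # The deleted object is represented by a Tombstone, per ActivityPub.
          "object": {"id": object_id, "type": "Tombstone"},
      }
      body = json.dumps(activity)
      for inbox in recipient_inboxes:
          # Production servers must also sign this request (HTTP Signatures).
          requests.post(
              inbox,
              data=body,
              headers={"Content-Type": "application/activity+json"},
              timeout=30,
          )

  # Example with invented URLs:
  # send_delete("https://example.social/notes/123",
  #             "https://example.social/users/moderation",
  #             ["https://other.example/inbox"])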
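
For the moderator-safety item, a small sketch using Pillow (an assumption; any image library with greyscale and blur filters would do) to generate a heavily blurred, greyscale, downscaled preview for the moderation queue, so the original is only shown on explicit request.

  from PIL import Image, ImageFilter


  def make_safe_preview(original_path: str, preview_path: str) -> None:
      """Write a blurred, greyscale, downscaled preview for the moderation queue."""
      with Image.open(original_path) as img:
          preview = img.convert("L")                                     # greyscale
          preview.thumbnail((256, 256))                                  # shrink in place
          preview = preview.filter(ImageFilter.GaussianBlur(radius=12))  # heavy blur
          preview.save(preview_path)

  # The admin interface can show the preview by default and only load the
  # original image when a moderator explicitly chooses to review the details.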


Reading and Resource List

General Information

European Union

Reporting

Reporting requirements vary by country and are based on where your servers are hosted, not where you are based as a service provider. For example, if you run a Canadian website whose servers are in the US, then US laws apply for reporting. You likely also want to report the content locally.

IFTAS has information available on GitHub about CSAM and CSE.

USA

Reporting is legally required under US federal law; reports are submitted via the NCMEC CyberTipline API.

Canada

European Union

No reporting API is available, but the Digital Services Act (DSA) has some reporting requirements.

Germany

All possession of CSAM, even for the purpose of reporting it to law enforcement, is strictly illegal and will result in jail time. The recommendation from Freiwillige Selbstkontrolle Multimedia-Diensteanbieter e.V. (FSM) has been to just delete the media and not report it to the police: reporting requires leaving the material up, and possessing CSAM on your servers could then expose you to jail time.

Additionally, advice from FSM has been "if law enforcement want to investigate, they will notify you".

Other Jurisdictions

Tools and APIs