WhatsApp has a zero-tolerance policy around child sexual abuse. We ban users from WhatsApp if we become aware they are sharing content that exploits or endangers children. We also report tips to the National Center for Missing and Exploited Children (NCMEC), which plays an important role in helping victims of abuse. WhatsApp employs a dedicated team that includes experts in law enforcement, online safety policy, and investigations to oversee these efforts.
To protect the privacy and security of our users, WhatsApp provides end-to-end encryption by default, which means only the sender and recipient can see the content of messages. To help prevent the sharing of child exploitative imagery (CEI), WhatsApp relies on all available unencrypted information, including user reports, to detect and prevent this kind of abuse. Over the last three months, WhatsApp has banned approximately 250,000 accounts each month suspected of sharing CEI.
For example, we use photo-matching technology called PhotoDNA to proactively scan profile photos for known CEI. Should our systems detect such an image, we ban the user and the associated accounts within a group. While we may not be able to distinguish between a user sending or receiving CEI images, our zero-tolerance policy means we ban all group members from using WhatsApp.
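At its core, photo matching of this kind compares a fingerprint of an image against a list of fingerprints of known abusive images. PhotoDNA itself is a proprietary perceptual hash, so the sketch below is only illustrative: it uses SHA-256 as a stand-in fingerprint, and the blocklist, function name, and sample bytes are all hypothetical, not part of WhatsApp's actual systems.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known images.
# Real systems use a perceptual hash (e.g., PhotoDNA), which tolerates
# resizing and re-encoding; SHA-256 here only illustrates the lookup flow.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def is_known_match(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# A match would trigger enforcement (account ban, report to NCMEC).
print(is_known_match(b"example-known-image-bytes"))  # True
print(is_known_match(b"some-other-image"))           # False
```

Because only a fingerprint is compared, this approach works on the limited unencrypted surfaces available (such as profile photos) without requiring access to message content.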
We are working with leading technology companies to improve our ability to detect media with child nudity and previously unknown child exploitative content. In turn, our work will help expand the reach of photo-matching technology to help prevent these images from being used elsewhere. In addition, we rely on advanced machine learning technology to evaluate group information to identify and ban members of groups suspected of sharing CEI.
WhatsApp encourages users to report problematic content to us. Whenever a user receives a message from someone outside of their address book, we display a message asking if they want to “block” or “report” the contact. Users can also report an individual account or group at any time. When a user sends a report, our machine learning systems review it for CEI content and take action. Learn more about how to stay safe on WhatsApp in our Help Center.
WhatsApp was built for private messaging, mostly among contacts who know one another. We do not provide an ability to find people or groups outside of your contacts, as social media services do. We work with app store providers to prevent the proliferation of apps that contain CEI content or that attempt to connect people interested in sharing CEI content via group invite links.
WhatsApp appreciates the work that law enforcement agencies do to keep people safe around the world. We regularly engage with law enforcement agencies to ensure they know how to contact us and understand how to make requests of WhatsApp. Our Information for Law Enforcement Authorities includes an online system for law enforcement to securely submit these legal requests.
When WhatsApp becomes aware of CEI on the platform, we ban the accounts involved. We also remove the images and report them along with associated account details to NCMEC in compliance with U.S. law. When NCMEC refers matters to law enforcement for investigation, WhatsApp is prepared to respond to valid law enforcement requests accordingly. WhatsApp has received feedback from law enforcement that our efforts have assisted in rescuing victims of child abuse.
February 6, 2019