Two Hat releases artificial intelligence model to fight child sexual abuse
Canadian AI technology company Two Hat has released CEASE.ai, an image recognition technology for social platforms and law enforcement that detects images containing child sexual abuse. By making the technology available to both the public and private sectors, Two Hat says it aims to address the problem not only at the investigative stage but also at its core, by preventing images from being posted online in the first place.
“This issue affects everyone, from the child who is a victim, to the agents who investigate these horrific cases, to the social media platforms where the images are posted,” says Two Hat CEO and founder Chris Priebe. “With one hat in social networks and the other in law enforcement, we are uniquely positioned to solve this problem. With CEASE.ai, we’ve leveraged our relationship with law enforcement to help platforms protect their most vulnerable users.”
Built in collaboration with Canadian law enforcement, and supported by the Government of Canada’s Build in Canada Innovation Program and by Mitacs in partnership with top Canadian universities, CEASE.ai is an artificial intelligence model that uses an ensemble of models to improve precision. With its recent acquisition of image moderation company ImageVision, the company says it has boosted its existing technology to achieve even greater accuracy and efficiency.
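The article does not disclose Two Hat's architecture, but the general idea of an ensemble is simple: several independent classifiers each score an image, and their scores are combined so that no single model's error dominates. A minimal sketch, with purely illustrative scores and an assumed decision threshold:

```python
# Hypothetical sketch of ensemble classification. The scores and the
# 0.8 threshold are illustrative assumptions, not Two Hat's actual
# models or configuration.

def ensemble_score(scores: list[float]) -> float:
    """Average the per-model probabilities that an image is abusive."""
    return sum(scores) / len(scores)

def flag_for_review(scores: list[float], threshold: float = 0.8) -> bool:
    """Flag the image when the ensemble's mean score crosses the threshold."""
    return ensemble_score(scores) >= threshold

print(flag_for_review([0.95, 0.88, 0.91]))  # True: models agree it is abusive
print(flag_for_review([0.10, 0.30, 0.05]))  # False: models agree it is benign
```

Averaging is only one combination rule; production ensembles often weight models by validation accuracy or use majority voting instead.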
Unlike similar technology that only identifies known images via “hash lists,” CEASE.ai detects new child sexual abuse material (CSAM). Developed for law enforcement, CEASE.ai aims to lighten investigators’ workloads and reduce trauma by prioritizing images that require immediate review, ultimately rescuing innocent victims faster, according to the company. Social platforms can now use CEASE.ai to detect and remove child abuse images as they are uploaded, preventing them from being shared.
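The hash-list approach mentioned above works by checking each upload's digest against a database of previously catalogued material. A minimal sketch of that lookup, using a cryptographic hash only to keep the example self-contained (real systems such as PhotoDNA use perceptual hashes that tolerate re-encoding and resizing):

```python
import hashlib

# Hypothetical sketch of hash-list matching. Function names and the
# sample byte strings are illustrative, not any vendor's actual API.

def file_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_image(data: bytes, known_hashes: set[str]) -> bool:
    """True if the upload matches an entry on the hash list."""
    return file_digest(data) in known_hashes

known = {file_digest(b"previously catalogued image bytes")}
print(is_known_image(b"previously catalogued image bytes", known))  # True
print(is_known_image(b"never-seen-before image bytes", known))      # False
```

The second lookup illustrates the gap the article describes: a hash list only flags exact matches of catalogued files, so newly created material passes through unless a classifier like CEASE.ai evaluates the image content itself.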
Predators are increasingly using social platforms to solicit and share images. According to a 2018 NetClean report, “Grooming and extortion are now coming from social media apps, unlike a few years ago where most of it occurred by someone that had access to the child.”