AI ethics

Comment: Microsoft has created a tool to find pedophiles in online chats

On the basis of words and patterns of speech, the system assigns a rating for the likelihood that one of the participants is trying to groom the other. Companies implementing the technique can set a score (for example, 8 out of 10) above which any flagged conversations are sent to a human moderator to review.

Jee, C. (2020). Microsoft has created a tool to find pedophiles in online chats. MIT Technology Review.
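The flagging mechanism the article describes is essentially a score-and-threshold filter. Here is a minimal sketch of that flow; the scoring model itself is Microsoft's proprietary system, so the function name `flag_for_review` and the example conversation scores below are purely hypothetical:

```python
def flag_for_review(score: float, threshold: float = 8.0) -> bool:
    """Route a conversation to a human moderator when its risk score
    meets or exceeds the platform-configured threshold (e.g. 8 out of 10,
    as in the article)."""
    return score >= threshold

# Hypothetical scored conversations: (conversation_id, risk_score).
# In the real system the score would come from Microsoft's closed model.
scored = [("conv-1", 3.2), ("conv-2", 8.7), ("conv-3", 9.5)]
to_moderator = [cid for cid, s in scored if flag_for_review(s)]
# to_moderator == ["conv-2", "conv-3"]
```

Note that the choice of threshold is exactly where the false-positive trade-off lives: lowering it catches more predators but implicates more innocent people.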

This sounds amazing. As long as it works with 100% accuracy. Because if it’s not 100% accurate, it’s going to be unimaginably horrible for any innocent people who are implicated. Even if they’re ultimately found to be innocent, some accusations are so toxic that simply being associated with them can make the rest of your life a nightmare. And, what’s worse…

Microsoft hasn’t explained the precise words or patterns the tool hunts for—doing so could potentially cause predators to adjust their behavior to try to mask their activities.

I understand the reasoning behind the decision to keep the code closed, but that means that no one can evaluate the algorithm making the decision. And given the way that gamers (the technology is currently aimed at gaming platforms) talk to each other (I’m not saying it’s OK), this is going to throw up a ton of false positives, because language and people are nuanced and because context matters.

Which is why, for decisions that have high-risk, high-impact outcomes for any of the stakeholders, we’re going to have to insist on AI-based systems that are as close to perfect as is reasonable, in addition to having human moderators informed when potential perpetrators are flagged. I don’t care if my search results are sometimes terrible, or even if my car takes me to the wrong place every now and again. I could even live with the occasional prescription error (provided it doesn’t kill me). But I really don’t want to be wrongfully accused of being a child predator.

Note: See here for a really good article on the challenges and potential solutions to the problem, which go beyond the relatively simple suggestion of keyword and pattern matching.

By Michael Rowe

I'm a lecturer in the Department of Physiotherapy at the University of the Western Cape in Cape Town, South Africa. I'm interested in technology, education and healthcare and look for places where these things meet.