Anna Lauren Hoffmann, an assistant professor at the University of Washington Information School, appeared on KUOW radio on Thursday, Nov. 9, to discuss Facebook’s new initiative to prevent “revenge porn” and other explicit images from appearing on the site without consent.
In testing its new tools, Facebook is asking users in Australia to take preventive measures by sending such images and videos to the company, which will then use artificial intelligence and photo-matching technologies to prevent people from posting the same content publicly.
Hoffmann, who specializes in data ethics, noted that Facebook is having employees check the intimate photos and videos to ensure that they would violate the company’s terms of service if they were to be posted. As she told KUOW host Bill Radke, Facebook doesn’t want to give people a tool to ban photos simply because they dislike or disagree with them.
“This is a good reminder of the ways in which the human and the technological are intimately intertwined,” Hoffmann said.
Radke asked whether it makes sense for people to send photos and videos to Facebook when they absolutely don’t want those same images appearing there. Hoffmann said that from an engineering perspective, the company’s solution makes sense, but it might be lacking on a human level.
“What you have here is a site that’s saying ‘just trust us. Just go ahead and share with us this photo that has already victimized you,’” she said.
Hoffmann said such a policy raises a bigger issue over how much power companies such as Facebook are allowed to wield.
“I think we need to take a step back and think about the broader frame of the kinds of data and the kinds of information about us that are circulating through these platforms and the kinds of power that these platforms have,” she said.
You can listen to a recording of the show; Hoffmann’s segment begins around the 38-minute mark.