Facebook asks for nude photos from Australian users to combat 'revenge porn'

Muriel Hammond
November 10, 2017

Would you trust Facebook with your nude photos?

People concerned their nude pictures may be uploaded by an ex are being asked to send the images to Facebook so they can be blocked if they are posted in the future.

In April, Facebook detailed plans to fight revenge porn, including an artificial intelligence tool capable of matching photos to prevent them from appearing on platforms like Messenger or Instagram.

Facebook is telling users to send it naked pictures they may have of themselves, for their own protection.

Facebook says it will keep the blurred image for some time to ensure the technology is working correctly before deleting it. Critics have questioned how Facebook will protect those uploaded images from hackers.

According to The Guardian, anyone who has shared intimate, nude or sexual images with a partner and is anxious that the partner might distribute them without consent can use Messenger to send the images to be "hashed" - which means that Facebook "converts the image into a unique digital fingerprint that can be used to identify and block any attempts to re-upload that same image".
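Facebook has not published the details of its matching system, but the hash-and-blocklist idea can be sketched in a few lines. The sketch below uses a plain SHA-256 cryptographic hash as the "digital fingerprint"; this is an illustrative assumption only - a real system would use a perceptual hash that survives resizing and re-encoding, which an exact cryptographic hash does not.

```python
import hashlib

def hash_image_bytes(data: bytes) -> str:
    """Return a hex digest serving as the image's 'digital fingerprint'.

    Assumption for illustration: a cryptographic hash. Production systems
    would use a perceptual hash robust to edits and re-compression.
    """
    return hashlib.sha256(data).hexdigest()

# Blocklist of fingerprints for images that must not be re-uploaded.
blocked_hashes: set[str] = set()

def register_blocked(data: bytes) -> None:
    """Add a reported image's fingerprint to the blocklist."""
    blocked_hashes.add(hash_image_bytes(data))

def is_blocked(data: bytes) -> bool:
    """Check an uploaded image against the blocklist before it is posted."""
    return hash_image_bytes(data) in blocked_hashes

# Hypothetical example bytes, standing in for a reported image.
reported = b"reported-image-bytes"
register_blocked(reported)
print(is_blocked(reported))      # exact re-upload is caught
print(is_blocked(b"edited"))     # an altered file slips past an exact hash
```

Note the limitation the comments flag: because an exact hash changes completely when even one byte of the file changes, matching edited copies requires perceptual hashing, which is presumably why Facebook's announcement describes photo-matching technology rather than simple file hashing.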

Turns out, before the image can be "hashed", an actual human at Facebook has to look at it to make sure it "fits the definition of revenge porn".

The "preemptive revenge porn defense" will be tested in Australia and three other countries for now. To execute the idea, Facebook is partnering with the Office of the eSafety Commissioner, an Australian government agency focused on preventing digital abuse.

Facebook says the partnership will allow cases like these to be handled faster, and the photo-matching technology that founder and CEO Mark Zuckerberg is putting into place will be able to find any reproductions of the offending photos and delete them all.

The pilot program is also available in the USA, the United Kingdom and Canada, according to CNBC.
