New York, Apr 6 (IANS): Facebook has rolled out tools to help people stop the non-consensual sharing of their intimate images, or 'revenge porn', on its platforms, including Messenger and the photo-sharing service Instagram.
In a press statement issued late on Wednesday, Facebook said the tools were an example of technology's potential to help keep people safe.
Facebook said that 93 per cent of US victims of non-consensual intimate images report significant emotional distress, and 82 per cent report significant impairment in social, occupational or other important areas of their lives.
If a user notices an intimate image on Facebook that seems to have been shared without permission, he/she can report it by using the "Report" link that appears next to the post.
A Facebook team will then review the image and remove it if it violates Community Standards.
According to the social networking site, in most cases it will also disable the account that shared the intimate image without permission.
It does not stop there. Facebook uses photo-matching technologies to help curb further attempts to share the image on Facebook, Messenger and Instagram.
"If someone tries to share the image after it's been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it," Facebook said.
About four per cent of US Internet users -- 10.4 million people -- have been victims of revenge porn or threatened with the posting of explicit images, according to a 2016 study by the US-based Data & Society Research Institute.
According to a report in the Washington Post, Facebook's policies on 'revenge porn' came into sharp focus after members of the US Marine Corps were found to be sharing nude pictures of female Marines without their permission in a private Facebook group.