Bumble Open-Sources Its AI Tool For Catching Unwanted Nudity
As part of the fight against “cyberflashing,” dating app Bumble is open-sourcing its artificial intelligence tool for detecting unwanted obscene images. The tool, called Private Detector, blurs nude images sent through the Bumble app, giving the recipient the choice of whether to open them.
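The receive-side flow described above can be sketched in a few lines: a classifier scores an incoming image, and anything above a confidence threshold is delivered blurred until the recipient explicitly opts in. Everything here is illustrative; the stub classifier, the `screen_image`/`reveal` names, and the threshold value are assumptions, not Bumble's actual API or model.

```python
# Hypothetical sketch of a blur-gating flow like Private Detector's.
# The classifier is stubbed; a real system would run an image model
# (Bumble's open-sourced tool uses a trained neural network).

from dataclasses import dataclass

LEWD_THRESHOLD = 0.9  # assumed confidence cutoff, not Bumble's real value


def classify_lewd(image_bytes: bytes) -> float:
    """Stub classifier: returns a fake lewdness probability in [0, 1]."""
    # For illustration only: pretend content tagged "nsfw" scores high.
    return 0.97 if b"nsfw" in image_bytes else 0.02


@dataclass
class IncomingImage:
    data: bytes
    blurred: bool = False  # True means the client shows a blurred preview


def screen_image(image_bytes: bytes) -> IncomingImage:
    """Gate an incoming image before it is shown to the recipient."""
    score = classify_lewd(image_bytes)
    return IncomingImage(data=image_bytes, blurred=score >= LEWD_THRESHOLD)


def reveal(image: IncomingImage) -> bytes:
    """Return the original image; called only when the recipient opts in."""
    return image.data


if __name__ == "__main__":
    safe = screen_image(b"vacation photo")
    flagged = screen_image(b"nsfw content")
    print(safe.blurred, flagged.blurred)
```

The key design point is that the image is never discarded: it arrives blurred, and viewing the original remains a deliberate action by the recipient rather than the default.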
Unwanted sexual harassment is a common reality for many women, both online and offline. One study found that 57 percent of women felt harassed on the dating apps they used. More recently, research in the United Kingdom found that 76 percent of girls between the ages of 12 and 18 had received unwanted nude images. The problem extends beyond dating apps, even as individual apps develop their own solutions.