Dating apps have proven to be a mixed bag for women, with sexual harassment being commonplace on all platforms. That could be changing, as Bumble has announced an image-detecting AI that will protect women from unwanted pictures of male genitalia.
Fans of the app are pretty happy and are hoping it leads to other apps adopting a similar AI. Hopefully that's the case, since apps like Tinder let men message first, which can mean dick pics first.
No more dick pics on Bumble?
According to CNET, Bumble has been using an AI called Private Detector, which detects pictures of men's privates. Rather than block flagged images outright, it gives the recipient the choice of whether or not to view them.
Interestingly enough, the dating app has now open-sourced Private Detector, so any developer can use it. Obviously, this is a great move for women everywhere in combating sexual harassment.
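To make the consent-based approach concrete, here is a minimal sketch of the flow described above. Everything here is hypothetical: the function names, the threshold value, and the stand-in probability score are illustrative only, with the real Private Detector's trained image classifier replaced by a simple score check.

```python
# Hypothetical sketch of a consent-first screening flow like the one
# the article describes: flagged images are blurred and the recipient
# is asked, rather than the image being blocked outright.

BLUR_THRESHOLD = 0.9  # hypothetical confidence cutoff, not Bumble's actual value

def screen_image(lewd_probability: float, threshold: float = BLUR_THRESHOLD) -> str:
    """Decide what to do with an incoming image.

    `lewd_probability` stands in for the output of a trained image
    classifier. Above the threshold, the image is blurred and the
    recipient chooses whether to reveal it; otherwise it is shown.
    """
    if lewd_probability >= threshold:
        return "blur_and_ask"  # recipient decides whether to view
    return "show"              # ordinary photo passes through

print(screen_image(0.97))  # → blur_and_ask
print(screen_image(0.12))  # → show
```

The key design point the quote below emphasizes is that the decision stays with the recipient: the classifier only gates the image behind a prompt, it never deletes anything on the user's behalf.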
"Safety is at the heart of everything we do and we want to use our product and technology to help make the internet a safer place for women," Rachel Haas, Bumble's vice president of member safety, said. "Open-sourcing this feature is about remaining firm in our conviction that everyone deserves healthy and equitable relationships, respectful interactions, and kind connections online."
Fighting dick pics since day one
Bumble has always advertised itself as a female-friendly app and has been combating dick pics for a long time. The company helped get anti-cyberflashing laws on the books in Texas and Virginia, which is a very positive step.
"Bumble was one of the first apps to address cyberflashing by giving the power to our community to consensually decide if they would like to see certain photos and creating a safety standard if not. We've been working to address cyberflashing and to help create more online accountability for years but this issue is bigger than just one company," said Bumble public policy head Payton Iheme.