Bumble open sourced its AI that detects unsolicited nudes


As part of its larger commitment to combating cyberflashing, dating app Bumble is open sourcing its proprietary AI tool that detects unsolicited lewd images. Debuting in 2019, Private Detector (let's take a moment to let that name sink in) blurs nudes sent through the Bumble app, giving the user on the receiving end the choice of whether to open the image.

"Even though the number of users sending lewd images on our apps is luckily a negligible minority – just 0.1% – our scale allows us to collect a best-in-the-industry dataset of both lewd and non-lewd images, tailored to achieve the best possible performance on the task," the company wrote in a press release.

Now available on GitHub, a refined version of the AI is open for commercial use, distribution, and modification. While building a model that detects nude images isn't exactly cutting-edge technology, it's something that smaller companies might not have the time to develop themselves. So other dating apps (or any product where people might send each other dick pics, AKA the entire internet?) could feasibly integrate the technology into their own products, helping shield users from unsolicited lewd content.
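For illustration, here's a minimal sketch of how a product might wire a classifier like this into a blur-before-display flow. Everything here is an assumption made for the example: the model path, input resolution, decision threshold, and single-probability output format are hypothetical, and the actual open-sourced repository documents its own checkpoint format and preprocessing.

```python
import numpy as np
import tensorflow as tf
from PIL import Image, ImageFilter

# Illustrative placeholders, not the actual release's API:
MODEL_PATH = "private_detector_model"   # hypothetical path to a Keras checkpoint
INPUT_SIZE = (480, 480)                 # hypothetical input resolution
THRESHOLD = 0.5                         # hypothetical decision threshold

def is_lewd(model: tf.keras.Model, image: Image.Image) -> bool:
    """Classify a photo; True means the lewd-probability clears the threshold."""
    resized = image.convert("RGB").resize(INPUT_SIZE)
    batch = np.expand_dims(np.asarray(resized, dtype=np.float32) / 255.0, axis=0)
    prob = float(model.predict(batch, verbose=0)[0][0])  # assumed single-probability output
    return prob >= THRESHOLD

def preview_image(model: tf.keras.Model, path: str) -> Image.Image:
    """Blur an incoming photo before display if the classifier flags it."""
    image = Image.open(path)
    if is_lewd(model, image):
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image

model = tf.keras.models.load_model(MODEL_PATH)
preview_image(model, "incoming_photo.jpg").save("preview.jpg")
```

In a flow like this, the recipient would see the blurred preview first, and tapping to reveal would simply render the unmodified original, matching the opt-in behavior the article describes.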

Since the launch of Private Detector, Bumble has also worked with US lawmakers to establish legal consequences for sending unsolicited nudes.

"There's a need to address this issue beyond Bumble's product ecosystem and engage in a larger conversation about how to tackle the issue of unsolicited lewd photos – also known as cyberflashing – to make the internet a safer and kinder place for everyone," Bumble added.

When Bumble first rolled out this AI, the company claimed it was 98% accurate.
