Some may not know Bumble, but think of it as Tinder’s competition. It is a dating app concerned about the unsolicited nudity and inappropriate content its users may receive, so it launched an interesting artificial intelligence (AI) tool.
This online dating app works in a very similar way to the most popular platform in this segment of the industry. Profiles of potential matches are displayed to users, who can swipe left to reject a candidate or swipe right to indicate interest.
If you use Bumble and someone catches your eye, you might even meet your better half. But there is also the less pleasant scenario in which a match tries to win you over with inappropriate content, such as nudes you never asked for.
Fortunately, with the well-being of its users in mind, Bumble launched a tool called Private Detector. According to a report published on the Infobae website, it works with artificial intelligence.
What is Private Detector?
Private Detector uses an AI model trained to detect nude images and other inappropriate content shared within Bumble chats.
This feature is built into the app and notifies a user whenever someone sends them an inappropriate photo via direct message without their consent.
The developers of Private Detector used machine learning so that the app can analyze a photograph and determine whether it contains imagery that may be harmful to the person who receives it.
To achieve a high level of precision, the AI tool was also exposed to non-sexual images labeled as “borderline content”, meaning body parts such as arms or legs, so that these are allowed through and not flagged as sensitive or inappropriate.
Additionally, images containing weapons are also tagged as sensitive content and can be reported through the reporting tools integrated within the app.
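The decision flow described above, where a classifier flags nudity and weapons but lets “borderline” body parts through, can be sketched in a few lines of Python. This is purely illustrative: Bumble has not published this exact logic here, and all names (`Detection`, `moderate`, the label set, and the confidence threshold) are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical label set based on the article: nudity and weapons are
# treated as sensitive, while "borderline" body parts (arms, legs) pass.
SENSITIVE_LABELS = {"nudity", "weapon"}

@dataclass
class Detection:
    label: str    # class predicted by the image classifier
    score: float  # model confidence in the range [0, 1]

def moderate(detection: Detection, threshold: float = 0.9) -> str:
    """Hypothetical decision rule for an incoming photo: blur and
    notify the recipient only when the model is confident the image
    belongs to a sensitive category; otherwise deliver it normally."""
    if detection.label in SENSITIVE_LABELS and detection.score >= threshold:
        return "blur_and_notify"
    return "deliver"

print(moderate(Detection("nudity", 0.97)))      # confident nude photo: flagged
print(moderate(Detection("borderline", 0.95)))  # arms/legs: allowed through
print(moderate(Detection("weapon", 0.50)))      # below threshold: delivered
```

The confidence threshold matters here: it trades false positives (harmless photos blurred) against false negatives (explicit photos delivered), which is why the article stresses that the model was trained on borderline images in the first place.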