It is estimated that Google handles more than 5.6 billion searches worldwide every day. It is the most widely used search engine and, as such, bears a responsibility to its users.
For this reason, Google has announced that it will start using Artificial Intelligence to help those who may be searching for content related to suicide, abuse or mistreatment, at the moment they need it most.
How Google’s AI will help those at risk
Google will incorporate a machine learning system to detect searches related to suicide attempts, abuse or sexual assault. It will do so thanks to MUM, its new search algorithm presented in autumn 2021, whose AI understands queries in greater context and offers more natural, human-like answers.
Through Google searches, the artificial intelligence will be able to detect signs that someone is in danger. In this case, MUM will be able to identify which searches may be related to a difficult personal moment, so that the search engine can offer the user quality information related to health and safety.
As Anne Merrit, director of health products at Google, explains: “for a human being it is obvious that ‘why did he attack me when I told him I don’t love him’ is a search related to gender violence. But these types of questions are not so obvious for an Artificial Intelligence system”.
And that is precisely where MUM improves things: as we have already explained at Xataka, it enables more accurate, more humanized search results that can help those in danger.
Photos | Firmbee.com and Nathana Reboucas on Unsplash