
Google denounces a father after scanning his photos and classifying a photo of his sick son as child abuse

By Alex · August 22, 2022 · 3 min read

Google's AI system, which reviews the images users send through the Android messaging application with the aim of detecting and reporting media containing child sexual abuse material (CSAM), has blocked the account of a father after he took a photo of his son's groin infection to send to his doctor, as the parent himself reported to The New York Times.

The father, named Mark, sent the image at the request of a nurse in February 2021, a time when some health centers in the United States still held remote consultations even though pandemic restrictions had eased. After assessing the image and holding a video call with the father, the nurse was able to prescribe antibiotics to treat the infection, which was located in the child's genital area.

Google, however, sent Mark a notification two days after the photo was taken, warning that it had detected "harmful content" which, moreover, was "a serious violation of Google's policies and could be illegal". The father's accounts were blocked immediately after the notice, and Google automatically forwarded a report to the National Center for Missing & Exploited Children (NCMEC) for further investigation.

Mark told The New York Times that Google's blocking of his account made him lose access to every service and platform linked to it, including his email, contacts, and photos, and even his phone number, since he used Google Fi, the company's mobile virtual network operator (MVNO).


Google’s system for detecting child sexual abuse images fails to convince users and experts

Photo by Tim Gouw on Unsplash

Mark was also investigated by the San Francisco Police in December 2021. The case, however, was closed after the police, having examined the evidence presented, concluded that no crime had been committed.

Mark's case, however, is further proof of the drawbacks of the child abuse detection systems that companies like Google and Apple use or plan to use. These systems rely on artificial intelligence to scan the images sent via message and look for matches against the hash database of the National Center for Missing & Exploited Children; a simplified sketch of that matching step appears below. If a match is found, a human reviewer categorizes the images as CSAM, locks the account, and opens an investigation. Many citizens and experts, however, consider this method an attack on users' privacy.
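To make the hash-matching step concrete, here is a minimal, hypothetical sketch in Python. Real deployments use proprietary perceptual hashes, such as Microsoft's PhotoDNA, which still match resized or re-encoded copies of a known image; this sketch substitutes a plain SHA-256 digest purely for illustration, and the `KNOWN_HASHES` set and `scan_outgoing_image` function are invented names, not Google's actual API.

```python
import hashlib

# Hypothetical stand-in for the NCMEC hash database. Production systems match
# perceptual hashes (e.g. PhotoDNA) rather than exact cryptographic digests,
# so that modified copies of a known image still match.
KNOWN_HASHES: set[str] = {
    # entries would be populated from the NCMEC database
}

def image_digest(image_bytes: bytes) -> str:
    """Exact-match digest of the raw image bytes (a deliberate simplification)."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_outgoing_image(image_bytes: bytes) -> bool:
    """Return True if the image matched the database and was escalated."""
    if image_digest(image_bytes) in KNOWN_HASHES:
        # Per the process described in the article, a match goes to a human
        # reviewer before the account is locked and NCMEC is notified.
        print("Match found: escalating to human review.")
        return True
    return False

# A brand-new photo has no database entry, so an exact-hash scheme never flags it.
assert scan_outgoing_image(b"fake image bytes") is False
```

A cryptographic digest keeps the sketch self-contained, but it would miss even trivially altered copies; that fragility is why real systems use perceptual hashing, and why flagging a brand-new photo like Mark's implies classification that goes beyond pure database matching.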

Google, however, has assured The Verge that the team of experts in charge of reviewing the images also consults with pediatricians to "identify instances where users may be seeking medical advice". Mark's image, showing his son's infection, was nevertheless reported directly to NCMEC.

