Google AI flagged parents’ accounts for potential abuse over nude photos of their sick kids

Sharanya Sinha
August 22, 2022
Updated 2022/08/22 at 9:07 PM

A concerned father says that after he used his Android smartphone to take photos of an infection on his toddler’s groin, Google flagged the images as child sexual abuse material (CSAM), according to a report from The New York Times. The company closed his accounts and filed a report with the National Center for Missing and Exploited Children (NCMEC), which spurred a police investigation. The episode highlights how difficult it is to tell the difference between potential abuse and an innocent photo once it becomes part of a user’s digital library, whether on a personal device or in cloud storage.

Concerns about blurring the line of what should be considered private were raised last year, when Apple announced its child protection program. As part of that plan, Apple would scan images on Apple devices before they were uploaded to iCloud and match them against NCMEC’s database of known CSAM. If enough matches were found, a human moderator would review the content and lock the user’s account if it contained CSAM.
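At its core, the matching step Apple described compares fingerprints of a user’s photos against a list of fingerprints of already-known abuse imagery, escalating to a human reviewer only after some number of hits. The Python sketch below is a purely illustrative, hypothetical version of that flow: the directory name, the threshold, the placeholder hash value, and the use of exact SHA-256 digests are assumptions made to keep it self-contained, whereas real systems such as PhotoDNA or Apple’s NeuralHash rely on proprietary perceptual hashes that survive resizing and re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests of known flagged images (placeholder value only).
# Real systems use perceptual hashes, not exact cryptographic digests.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

# Assumed cut-off: escalate to human review only after several matches.
MATCH_THRESHOLD = 3


def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(photo_dir: Path) -> None:
    """Count matches against the known-hash set and escalate past a threshold."""
    if not photo_dir.is_dir():
        print("No photo library found; nothing to scan")
        return
    matches = [p for p in photo_dir.glob("*.jpg") if file_digest(p) in KNOWN_HASHES]
    if len(matches) >= MATCH_THRESHOLD:
        print(f"{len(matches)} matches: queue account for human review")
    else:
        print("No action taken")


if __name__ == "__main__":
    scan_library(Path("photos"))  # hypothetical library location
```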

Critics worried that such scanning could lead to accounts being flagged over content that is not actually illegal. The Electronic Frontier Foundation (EFF), a nonprofit digital rights group, slammed Apple’s plan, saying it could “open a backdoor to your private life” and that it represented “a decrease in privacy for all iCloud Photos users, not an improvement.”

Apple eventually put the stored-image-scanning part of the plan on hold, but with the launch of iOS 15.2 it went ahead with an optional feature for child accounts included in a family sharing plan. If parents opt in, the Messages app on a child’s account “analyzes image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages.” If it detects nudity, it blurs the image, displays a warning to the child, and offers resources meant to help them stay safe online.
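Functionally, the Messages feature described above amounts to an on-device classifier whose score gates the blur, warn, and offer-help behavior. The sketch below is a hypothetical illustration only: the threshold value and the stubbed classify_nudity function are assumptions, since Apple’s actual model and cut-offs are not public.

```python
from dataclasses import dataclass

# Assumed cut-off; the real model and threshold are not public.
NUDITY_THRESHOLD = 0.9


@dataclass
class ScanResult:
    blur_image: bool
    show_warning: bool
    offer_resources: bool


def classify_nudity(image_bytes: bytes) -> float:
    """Placeholder for an on-device ML model returning a nudity probability.

    Apple's actual model runs locally inside Messages; this stub just keeps
    the sketch runnable.
    """
    return 0.0


def handle_incoming_attachment(image_bytes: bytes) -> ScanResult:
    """Mirror the described flow: if nudity is detected, blur, warn, and offer help."""
    flagged = classify_nudity(image_bytes) >= NUDITY_THRESHOLD
    return ScanResult(blur_image=flagged, show_warning=flagged, offer_resources=flagged)


if __name__ == "__main__":
    print(handle_incoming_attachment(b""))  # trivial demo with empty image bytes
```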

The main incident The New York Times covered took place in February 2021, when some doctors’ offices were still closed because of the COVID-19 pandemic. According to the Times, Mark (whose last name has not been released) noticed swelling in his son’s genital area and, at a nurse’s request, sent photos of the issue ahead of a video consultation. The doctor prescribed antibiotics to treat the infection.

“Child sexual abuse material (CSAM) is abhorrent, and we are committed to preventing the spread of it on our platforms,” Google spokeswoman Krista Muldoon told The Verge in an emailed statement. “We follow US law in defining CSAM and use a combination of hash-matching technology and artificial intelligence to identify it and remove it from our platforms. Additionally, our team of child safety experts reviews flagged content for accuracy and consults with pediatricians to help ensure we can identify cases where users may be seeking medical advice.”

While protecting children from abuse is undoubtedly important, critics argue that the practice of scanning users’ photos unduly intrudes on their privacy. Jon Callas, a director of technology projects at the EFF, called Google’s practices “intrusive” in a statement to The New York Times. “This is precisely the nightmare that we are all concerned about,” Callas told the NYT. “They’re going to scan my family album, and then I’m going to get into trouble.”

For more such updates on the latest news, keep reading techinnews.com
