WhatsApp head criticizes Apple’s tool for scanning private iPhone photos to curb child abuse


New Delhi: Will Cathcart, the head of WhatsApp, has slammed Apple’s plan to introduce photo-scanning measures to identify child abuse images in the iOS photo library, saying that Apple’s software can scan all the private photos on a phone, which he called a clear violation of privacy.

Cathcart emphasized that WhatsApp will not allow such Apple tools to run on its platform. He said that Apple has long needed to do more to combat child sexual abuse material (CSAM), “but the approach they are taking brings something very worrying into the world”.

“I read the information Apple released yesterday, and I am very worried. I think this is the wrong approach and a setback for the privacy of people all over the world. People have asked us if we will adopt this system on WhatsApp. The answer is no,” he said in a Twitter post on Friday.

“Instead of focusing on making it easy for people to report content that is shared with them, Apple has built software that can scan all the private photos on your phone, even photos you haven’t shared with anyone. That’s not privacy.”

On Thursday, Apple confirmed plans to deploy new technologies in iOS, macOS, watchOS and iMessage to detect images of potential child abuse, and clarified key details of the ongoing project.

According to a report by The Verge, the new versions of iOS and iPadOS launching this fall on devices in the United States will have “new cryptography applications to help limit the online spread of CSAM, while also designing for user privacy.”

However, Cathcart said that this is a surveillance system, built and operated by Apple, that could easily be used to scan private content for anything the company or a government decides to control.

He added: “Countries where iPhones are sold have different definitions of what is acceptable.”

Apple said that as the program expands, other child safety organizations may be added as hash sources.
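For context, a “hash source” here is an organization that supplies fingerprints (hashes) of already-known abuse images; detection then amounts to checking whether a photo’s fingerprint appears in that list. The sketch below is a deliberately simplified, hypothetical Python illustration of hash-list matching, not Apple’s actual system, which uses a perceptual “NeuralHash” and a privacy-preserving matching protocol rather than a plain hash lookup; every name and value in it is invented.

```python
import hashlib

# Hypothetical illustration of hash-list matching; NOT Apple's actual
# NeuralHash / private-set-intersection design. A real deployment would use
# a perceptual hash so resized or re-encoded copies of an image still match.

# Fingerprints of known abuse images, as supplied by "hash sources"
# (child-safety organizations). The value below is a made-up placeholder.
KNOWN_IMAGE_HASHES = {
    "0f1e2d3c4b5a69788796a5b4c3d2e1f00f1e2d3c4b5a69788796a5b4c3d2e1f0",
}

def photo_fingerprint(photo_bytes: bytes) -> str:
    """Fingerprint a photo; SHA-256 stands in for a perceptual hash here."""
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_known_list(photo_bytes: bytes) -> bool:
    """Return True if the photo's fingerprint appears in the supplied list."""
    return photo_fingerprint(photo_bytes) in KNOWN_IMAGE_HASHES

print(matches_known_list(b"an ordinary holiday photo"))  # False
```

Cathcart’s questions that follow are precisely about who controls that list and what else could be added to it.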

“Will this system be used in China? What content do they think is illegal there, and how do we know? How will they manage requests from governments around the world to add other types of content to the list for scanning?” Cathcart asked.

“What happens when a spyware company finds a way to exploit this software? Recent reports have shown the cost of vulnerabilities in iOS software. What happens if someone figures out how to exploit this new system?” he lamented.

According to 9to5Mac, an internal memo from Sebastien Marineau-Mes, Apple’s vice president of software, acknowledged that the new child protection measures have made some people “worried about the implications,” but said the company will “maintain Apple’s deep commitment to user privacy.”


