Apple to scan U.S. phones for images of child abuse

Apple is planning to scan U.S. iPhones for images of child abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.

Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. The tool Apple calls "neuralMatch" will detect known images of child sexual abuse without decrypting people's messages. If it finds a match, the image will be reviewed by a human who can notify law enforcement if necessary.

But researchers say the tool could be put to other purposes such as government surveillance of dissidents or protesters.

Matthew Green, a security professor at Johns Hopkins University who earlier posted his concerns on Twitter, warned that Apple's move will "break the dam – governments will demand it from everyone."

Tech companies including Microsoft, Google and Facebook have for years been sharing "hash lists" of known images of child sexual abuse. Apple has also been scanning files stored in its iCloud service, which unlike its messages is not end-to-end encrypted, for such images.

The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data.

Apple was one of the first major companies to embrace "end-to-end" encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressed for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

"Apple's expanded protection for children is a game changer," John Clark, president and CEO of the National Center for Missing & Exploited Children, said in a statement. "With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material."

Julia Cordua, the CEO of Thorn, said that Apple's technology balances "the need for privacy with digital safety for children." Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.

Barbara Ortutay And Frank Bajak, The Associated Press
