Is Apple Going To Scan iCloud Photos for Child Abuse Images?

Apple Starts Scanning iCloud Photos for Child Abuse Images

As technology companies face increased federal pressure to crack down on child sexual exploitation and abuse on their platforms, Apple will launch a new system in the United States that helps identify images of child sexual abuse among the photos on a user’s iPhone before they are uploaded to iCloud.

In Detail

On Thursday, August 5, 2021, Apple said it would begin rolling out a new system that automatically compares images on iPhones, and those uploaded to iCloud accounts, against a database of known child sexual abuse photos and informs the authorities as required.

According to the company’s announcement, the new software converts photos on a device into an unreadable set of hashes, that is, complex strings of numbers stored on the device itself. The software then compares these hashes against a database of known-image hashes supplied by the National Center for Missing and Exploited Children.
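
To make that matching step concrete, here is a minimal sketch of the convert-then-compare workflow. It is not Apple’s NeuralHash: NeuralHash is a perceptual hash, where visually similar images yield similar hashes, while this sketch uses an ordinary cryptographic hash purely for illustration, and the sample database entry is hypothetical.

```python
# Simplified sketch of hash-then-match. NOT Apple's NeuralHash:
# NeuralHash is a perceptual hash (similar images -> similar hashes);
# SHA-256 is used here only to illustrate the workflow.
import hashlib

# Hypothetical stand-in for the NCMEC-supplied database of known hashes.
# (This entry is the SHA-256 of empty input, chosen so the demo matches.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(image_bytes: bytes) -> str:
    """Convert an image into a fixed-length, unreadable hash string."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """Check whether the image's hash appears in the known-hash set."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_known_database(b""))       # True  (hash is in the set)
print(matches_known_database(b"photo"))  # False (hash is not)
```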

According to Apple, detecting child sexual abuse material (CSAM) is one of several new features intended to better protect the children who use its services from online harm, including filters that block potentially sexually explicit photos sent or received through a child’s iMessage account. Additional safeguards will also intervene when a person tries to search for CSAM-related terms using Siri or Search.
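
Apple has not published the internals of that iMessage filter, so the sketch below only shows the general shape such an on-device check could take; the classifier and threshold are hypothetical placeholders, not Apple’s implementation.

```python
# Hedged sketch of an on-device explicit-image gate for a child account.
# "explicit_score" is a hypothetical stand-in for a trained classifier;
# Apple has not published its model, so this only shows the control flow.

def explicit_score(image_bytes: bytes) -> float:
    """Placeholder for an on-device ML classifier returning 0.0-1.0.
    A real implementation would run a trained model on the image."""
    return 0.0  # fixed value so the sketch stays runnable

def handle_incoming_photo(image_bytes: bytes, is_child_account: bool) -> str:
    # Illustrative threshold; the real system's behavior is not public.
    if is_child_account and explicit_score(image_bytes) > 0.8:
        return "blurred"  # photo hidden behind a warning
    return "shown"

print(handle_incoming_photo(b"...", is_child_account=True))  # "shown"
```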

Apple follows in the footsteps of other major technology companies such as Google and Facebook in adopting this approach. However, it is also attempting to strike a balance between safety and privacy, the latter of which Apple has highlighted as a key selling point for its devices.

Apple’s latest CSAM detection technology, NeuralHash, operates on a user’s device and can detect whether a user uploads known child abuse images to iCloud without decrypting the photos; nothing is examined until a match threshold is exceeded, after which a review process confirms what the flagged content actually is.
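
The threshold step can be illustrated with a short sketch. In Apple’s actual design, matches are protected with a cryptographic threshold scheme so nothing can be read below the threshold; the plain counter below shows only the decision logic, and the threshold value is illustrative rather than Apple’s confirmed figure.

```python
# Simplified illustration of the match threshold described above.
# Apple's real design keeps matched photos cryptographically unreadable
# below the threshold; this plain counter only shows the decision logic.
MATCH_THRESHOLD = 30  # illustrative value, not Apple's confirmed figure

def should_escalate(match_count: int) -> bool:
    """Escalate to human review only once enough uploads have matched,
    which keeps isolated false positives from ever being examined."""
    return match_count >= MATCH_THRESHOLD

account_matches = 3
if should_escalate(account_matches):
    print("Threshold exceeded: review flagged content manually")
else:
    print("Below threshold: matches remain unreadable, no action")
```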

Back Story

Last year, the US Department of Justice issued a series of “voluntary guidelines” urging technology and social media companies to do more to fight child sexual exploitation and abuse. It called on businesses to put in place comprehensive systems for detecting illegal content, responding to it, and reporting it to the authorities. Microsoft, for its part, developed PhotoDNA to better identify images of child sexual abuse on the internet, and both Facebook and Google have procedures in place to review and flag potentially illegal content. Facebook has also said it is developing new measures to reduce the distribution of child sexual abuse images on its platform.

Statement 

Matthew Green, a security researcher at Johns Hopkins University, told the Associated Press that Apple’s willingness to build a system that scans iPhone users’ phones for prohibited content could break the dam and lead to the US government demanding the same of everyone. He also expressed concern that other governments might compel Apple to scan for other kinds of data.

Shocking Numbers 

20 million: according to the National Center for Missing and Exploited Children, that is the number of child sexual abuse images Facebook reported to law enforcement in 2020. The figure incorporates data from both the Instagram and Facebook apps and is up from 16 million in 2019, a 25 percent increase.

Disclaimer

This news/content is taken from third-party websites and is provided here for information purposes only; Live Enhanced makes no claims or guarantees about the accuracy of the data.
