
Apple is officially taking on child predators with new safety features for iPhone and iPad.

One scans for child sexual abuse material (CSAM), which sounds like a good thing. But it has several privacy experts concerned.

So, how does it work? The feature, available on iOS 15 and iPadOS 15 later this year, uses a new proprietary technology called NeuralHash to detect known CSAM images.

Before the image is stored in iCloud Photos, it goes through a matching process on the device against specific CSAM hashes.
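NeuralHash itself is proprietary and its details aren't public, but the matching step described above — hash the photo on-device, then check the result against a database of known hashes — can be sketched generically. This is an illustrative Python sketch only, not Apple's implementation: the `perceptual_hash` stand-in (a real perceptual hash maps visually *similar* images to the same digest, unlike a cryptographic hash) and the sample hash database are hypothetical.

```python
import hashlib

# Hypothetical stand-in for Apple's proprietary NeuralHash.
# A real perceptual hash is robust to resizing/re-encoding;
# here we simply hash the raw bytes for illustration.
def perceptual_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-CSAM hashes (supplied by NCMEC
# and other child-safety groups in the real system).
KNOWN_HASHES = {
    perceptual_hash(b"known-image-1"),
    perceptual_hash(b"known-image-2"),
}

def matches_known_csam(image_bytes: bytes) -> bool:
    """On-device check performed before the photo is uploaded."""
    return perceptual_hash(image_bytes) in KNOWN_HASHES

print(matches_known_csam(b"known-image-1"))   # True: in the database
print(matches_known_csam(b"holiday-photo"))   # False: not in the database
```

The key design point is that only hashes are compared, never the photos themselves, and the comparison happens on the device rather than on Apple's servers.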

It then uses technology called "threshold secret sharing," which doesn't allow Apple to interpret a photo unless the related account has crossed a threshold of CSAM content.
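Apple's exact "threshold secret sharing" construction isn't public, but the underlying idea — no single match reveals anything, and only a threshold number of matches together unlock the data — is the classic Shamir secret-sharing scheme. The following is a minimal sketch of that general technique, not Apple's code; the parameters are arbitrary.

```python
import random

PRIME = 2**61 - 1  # field modulus (a Mersenne prime)

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them
    recover it, while fewer reveal nothing about the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 reconstructs the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(secret=42, threshold=3, count=5)
print(recover(shares[:3]))  # any 3 shares recover 42
```

In Apple's described system, each on-device match contributes, in effect, one share; the server can only reconstruct (and thus decrypt the flagged content) once an account accumulates more matches than the threshold.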

Apple can then report any CSAM content it finds to the National Center for Missing and Exploited Children (NCMEC).


It's worth noting that there is room for false positives. Matthew Green, cybersecurity expert and associate professor at Johns Hopkins University, took to Twitter to voice his concerns.

“To say that we are disappointed by Apple’s plans is an understatement,” said the Electronic Frontier Foundation, arguing that “even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”

We've reached out to Apple for comment and will update this story when we hear back.

Apple says its threshold provides "an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."

Once an account crosses that threshold, Apple manually reviews the flagged content. If the review confirms a match, Apple disables the user's account and sends a report to NCMEC. Users who believe their account was flagged by mistake must file an appeal to get it back.

While it's tough to criticize a company for wanting to crack down on child sexual abuse material, the fact that Apple has the ability to scan someone's photos in general is concerning. It's even worse to think that an actual human being might look through private images only to realize an account was mistakenly flagged.


It's also ironic that Apple, the company that brags about its privacy initiatives, specifically its privacy Nutrition Labels and App Tracking Transparency, has taken this step.

Apple assures users that its CSAM detection "is designed with user privacy in mind," which is why it matches images on-device before they're sent to iCloud Photos. But Apple said the same thing about AirTags, and, well, those turned out to be a privacy nightmare.

Topics: Cybersecurity, iPhone, Privacy
