Joanna Stern Interviews Craig Federighi Regarding New iCloud Security Features

At least two exclusive tidbits in this interview (News+ link):

The changes represent a new potential setback for law-enforcement
officials. Last year, Apple proposed software for the iPhone that
would identify child sexual-abuse material on the iPhone. Apple
now says it has stopped development of the system, following
criticism from privacy and security researchers who worried that
the software could be misused by governments or hackers to gain
access to sensitive information on the phone.

Mr. Federighi said Apple’s focus related to protecting children
has been on areas like communication and giving parents tools to
protect children in iMessage. “Child sexual abuse can be headed
off before it occurs,” he said. “That’s where we’re putting our
energy going forward.”

Through its parental-controls software, Apple can notify
parents who opt in if nude photos are sent or received on a
child’s device.

So the controversial CSAM fingerprint-hashing project for iCloud Photos has been shelved. A lot of us saw that project as a precursor to offering end-to-end encryption for iCloud Photos. It is very good news that Apple forged ahead with E2E encryption for Photos without it.

The new encryption system, which will be tested by early users
starting Wednesday, will roll out as an option in the U.S. by
year’s end, and then worldwide including China in 2023, Mr.
Federighi said.

In the video — which is also available on YouTube — Federighi is slightly circumspect about China, saying only that Apple expects it to roll out to all customers around the world next year, but quips that he hasn’t personally heard from the Chinese government about it.

Read the original post at Daring Fireball.