Thursday, December 16, 2021

NEWS TECHNOLOGY

Apple quietly removes all references to its controversial CSAM scanning plan from its website.

Apple raised many eyebrows earlier this year when it announced a plan to combat child sexual abuse with a multi-pronged approach of several new technologies that would be implemented in iOS 15. The most controversial was a program that would scan users’ iCloud Photos libraries for CSAM (Child Sexual Abuse Material). With the rollout of iOS 15.2 this week, Apple did implement one of these anticipated features, the ability to detect nude photos in Messages for child accounts, but the aforementioned scanning technology was notably absent. As of today, it appears all references to the image-scanning part of Apple’s plan have been removed from its website, leading people to wonder whether Apple has scuttled the technology for good due to the fierce backlash.

Previously, Apple had announced it was simply delaying the launch of the technology due to the criticism, stating it needed time to listen to feedback and revise its implementation, according to MacRumors. Back in September, it released the following statement: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

However, rather than restating that those plans are still in place, MacRumors reports the company has simply erased every sign of them from its Child Safety website. As you can see by visiting the link, it now only discusses the just-launched nudity-detection feature for Messages, which appeared in iOS 15.2 and is not enabled by default. As we noted in our coverage, “…once implemented on a device with a family-sharing account, it will scan for nudity within images sent and received by the Messages app. If nudity is detected, Messages blurs the image and displays an on-screen warning, which explains the dangers of sharing explicit photos and asks whether the viewer would like to proceed.” So far, it seems this technology has been received without much hullabaloo, but the week isn’t over yet.

Apple rolled out nudity detection for children using its Messages app this week, seemingly to little backlash.
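To make the Messages flow described above concrete, here is a minimal Swift sketch of the decision logic: the feature is opt-in, applies only to child accounts, and blurs the image and shows a warning when an on-device classifier flags it. The NudityClassifier type, the 0.8 threshold, and the warning text are hypothetical stand-ins; Apple has not published its model or API.

```swift
import Foundation

// A minimal sketch of the on-device flow described above, not Apple's actual
// implementation. NudityClassifier, its threshold, and the warning text are
// hypothetical stand-ins for whatever model and UI Messages actually uses.

struct NudityClassifier {
    /// Returns a confidence score in 0...1 that the image contains nudity.
    /// (Hypothetical: a real implementation would run an on-device ML model.)
    func score(for imageData: Data) -> Double {
        return 0.0 // placeholder
    }
}

enum IncomingImageAction {
    case showNormally
    case blurAndWarn(message: String)
}

/// Decide how Messages should present an incoming image for a child account.
func handleIncomingImage(_ imageData: Data,
                         isChildAccount: Bool,
                         featureEnabled: Bool,
                         classifier: NudityClassifier = NudityClassifier()) -> IncomingImageAction {
    // The feature is opt-in and applies only to child accounts in Family Sharing.
    guard featureEnabled, isChildAccount else { return .showNormally }

    // The 0.8 threshold is an assumption for illustration only.
    if classifier.score(for: imageData) >= 0.8 {
        return .blurAndWarn(message:
            "This photo may be sensitive. Sharing explicit photos can be harmful. View anyway?")
    }
    return .showNormally
}
```

Note that nothing leaves the device in this flow; the decision to blur and warn is made locally, which is part of why this feature has drawn far less criticism than the iCloud scanning plan.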

Interestingly, critics of Apple’s iCloud scanning technology put forth what essentially boils down to a “slippery slope” argument, saying if Apple can design an algorithm that scans for X, what’s stopping it from scanning for Y and Z in the future? As the Electronic Frontier Foundation put it, “All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan the accounts of anyone, not just children. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.” There were also concerns that governments would co-opt Apple’s technology for surveillance of their citizens, a scenario the company vociferously promised it would never allow.
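As a purely illustrative sketch of the EFF’s concern (and not Apple’s actual NeuralHash system), the matching machinery in this kind of design is content-agnostic: what gets flagged is determined entirely by the hash database it is handed. Every name and value below, including the toy FNV-style hash, is hypothetical.

```swift
import Foundation

// A toy illustration of the EFF's point, not Apple's NeuralHash system: the
// matching code is content-agnostic, so what gets flagged is determined
// entirely by the hash database it is handed. All names and values here are
// hypothetical.

struct PerceptualHashMatcher {
    /// Hashes of known prohibited images, supplied by some external authority.
    let blockedHashes: Set<UInt64>

    /// A stand-in for a real perceptual hash. This is just FNV-1a over the raw
    /// bytes; a production system would use something robust to resizing and
    /// re-encoding.
    func imageHash(of imageData: Data) -> UInt64 {
        imageData.reduce(0xcbf29ce484222325 as UInt64) { hash, byte in
            (hash ^ UInt64(byte)) &* 0x100000001b3
        }
    }

    /// Returns true if the image matches the supplied database.
    func matches(_ imageData: Data) -> Bool {
        blockedHashes.contains(imageHash(of: imageData))
    }
}

// The "configuration" is just data: swap in a different set of hashes and the
// identical code flags entirely different content.
let matcher = PerceptualHashMatcher(blockedHashes: [0xDEADBEEF])
print(matcher.matches(Data([0x01, 0x02, 0x03]))) // false for this toy input
```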

Finally, despite Apple removing mentions of CSAM from its Child Safety portal, we were still able to dig up the original text Apple released when it announced its new initiative, so perhaps Apple just forgot about that PDF. What is noteworthy, however, is that the newly updated Child Safety pages only mention the nudity detection for images in Messages, and not CSAM. Despite the removal of references to the controversial technology from its website, an Apple spokesman told The Verge the company’s plans haven’t changed and that the feature is still just delayed.

from ExtremeTech https://ift.tt/3q3ODZP
