The tech giant is defending its new features, aimed at preventing the spread of child sexual abuse material, despite mounting pressure from privacy advocates.
Apple plans to scan iCloud photos for child sexual abuse images, and says its “method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind”.
The company has also announced a parental control option, which warns children and their parents when they are about to view or send sexually explicit photos in the Messages app.
But privacy groups claim the new features will “create new risks for children”.
Concerns have also been raised that the scanning software “could be used to censor speech and threaten the privacy and security of people around the world”.
A coalition of more than 90 rights groups has now written to Apple CEO Tim Cook, outlining their concerns, and urging the tech titan to abandon its plans to introduce the new features.
The signatories include civil rights, human rights and digital rights groups.
The coalition of rights groups has raised concerns that the scan and alert feature in Messages “could result in alerts that threaten the safety and wellbeing of some young people”.
The groups say LGBTQ+ youths with unsympathetic parents are particularly at risk.
They also claim that once the “CSAM hash scanning for photos is built into Apple products, the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable”.
Apple defends its child safety features
Apple has sought to allay concerns, pushing back against claims that the technology will be used for other purposes.
The trillion-dollar company insists it won’t give in to pressure from any government to use the technology for other surveillance purposes.
Apple says it “will refuse any such demands”.
“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” Apple said in a recent FAQ.
Why outdated patching methods leave companies vulnerable and how AI can fix it
Automation is crucial in reducing cybersecurity vulnerabilities, says Vivek Bhandari, VP of Product Marketing at Tanium. Poor patching methods create a backlog of security issues, or “vulnerability debt,” which can leave organisations exposed.
Bhandari urges companies to modernise their processes and use AI and automation to quickly identify and fix vulnerabilities. This proactive approach can significantly reduce risk and keep systems secure.
A recent survey reveals contrasting views on healthcare technology between U.S. consumers and healthcare professionals.
While consumers are optimistic that technology will improve care, professionals remain cautious, citing concerns about workflow disruption and reduced patient interaction, according to Randy Boldyga, Founder & CEO of RXNT.
Boldyga emphasised the need for better communication to bridge the gap, with patients requiring more education on tech benefits and professionals seeking streamlined tools.
RXNT is focused on creating solutions that enhance both provider workflows and the patient experience in this evolving healthcare landscape.