Apple scans photos to check for child abuse

Apple scans photos to check for child sexual abuse images, an executive has said, as tech companies come under pressure to do more to tackle the crime. Jane Horvath, Apple’s chief privacy officer, said at a tech conference that the company uses screening technology to look for the illegal images. The company says it disables accounts when it finds evidence of child exploitation material, although it does not specify how the material is detected.

Apple has often clashed with law enforcement and other authorities, refusing to break into criminals’ phones and applying encryption to its messaging app in the name of protecting its users’ privacy.

Speaking at the Consumer Electronics Show in Las Vegas, Ms Horvath said removing encryption was “not the way we’re solving these issues” but added: “We have started, we are, utilising some technologies to help screen for child sexual abuse material.”

An Apple spokesman pointed to a disclaimer on the company’s website, saying: “Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space.

“As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation.”
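Apple has not disclosed which matching technique it uses. As an illustration only, signature-based screening generally means fingerprinting each uploaded file and comparing it against a database of fingerprints of known illegal images supplied by a clearinghouse. The minimal sketch below uses exact SHA-256 hashes for simplicity; production systems typically use perceptual hashes (such as Microsoft’s PhotoDNA) that survive resizing and re-encoding. Every name and value here is hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical set of signatures of known illegal images, as would be
# supplied by a clearinghouse (illustrative placeholder value only).
KNOWN_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def signature(path: Path) -> str:
    """Compute an exact SHA-256 fingerprint of the file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_uploads(upload_dir: Path) -> list[Path]:
    """Return uploaded files whose fingerprints match the known set."""
    return [p for p in upload_dir.iterdir()
            if p.is_file() and signature(p) in KNOWN_SIGNATURES]
```

Because an exact hash changes if even one byte of the file changes, this sketch would only catch unmodified copies; perceptual hashing trades that brittleness for tolerance to routine edits, which is why it is the standard approach for this kind of screening.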
