Apple Child Abuse Protection

  • #1

    Apple to add child sexual abuse protections to mobile devices

    In an effort to protect children from predators, Apple will update iPhones and iPads to detect and report images of child sexual abuse. The company vowed yesterday to implement new detection software that automatically flags photos depicting the abuse and exploitation of children as soon as they are uploaded to iCloud, Apple’s proprietary cloud-based storage service. Once the update to the iPhone and iPad operating systems launches, flagged photos will be reported to the National Center for Missing and Exploited Children. In the statement announcing the plan to monitor photos on its devices, Apple acknowledged that communication tools such as its devices are used by predators to recruit and exploit children.

    Apple aims to limit the spread of child sexual abuse material with this new automated technology, which it says will launch as part of a suite of new tools for all of its mobile devices. Siri, the personal assistant included with Apple devices, will be programmed to step in when users search for terms related to child sexual abuse. The Messages app will also use artificial intelligence to recognize dangerous messages and explicit or nude photos, warn the child involved, and notify their parents.

    No specific release date was announced. The use of AI to monitor users’ actions and data is often controversial, raising the question of where to draw the line between public safety and users’ personal privacy.

  • #2
    Apple delays privacy-invading child abuse detection software

    After intense backlash over its seemingly well-intentioned plan to fight child pornography and child sexual abuse, Apple has backed down and delayed the release of a software update containing protection tools that many believe would invade privacy and open the door to further infringement in the future. The Silicon Valley tech giant had planned to ship software on all of its iPhones and iPads that would protect children by detecting uploads to iCloud, its cloud storage service, that matched a database of known child sexual abuse images, and by warning children and parents when it detected dangerous messages related to child trafficking or exploitation.

    But this week Apple acknowledged concerns that building a backdoor to detect these images and messages creates a slippery slope: the company could later search not just for illegal content but for anything it finds disagreeable, or governments and other authorities could gain access to a device’s data. Apple has now decided to delay the rollout of the update, announcing on Friday that it is responding to criticism from advocacy groups, customers, researchers, and others.

    “We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
    The plan was to identify abusive photos by comparing them, using artificial intelligence, against a database of known abuse images provided and curated by safety organizations. The software would flag matching images when they were uploaded to Apple’s cloud storage service and alert the proper authorities. Apple maintains that the system would use cryptographic technology to compare an image’s pixel data without revealing its contents, so no human eye would ever see the image. But digital rights organizations pushed back quickly, calling the plan an invasion of privacy that creates a much larger risk.
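    The matching scheme described above — fingerprinting an uploaded image and checking it against a database of known hashes without a person viewing the content — can be sketched in highly simplified form. The sketch below is illustrative only: it substitutes a plain SHA-256 digest for Apple's perceptual hashing and cryptographic matching protocol, and every name in it is hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images, as it
# might be supplied by a safety organization. Apple's real system uses
# perceptual hashes, not SHA-256; SHA-256 is used here only to
# illustrate the match-against-a-database idea.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes-1").hexdigest(),
    hashlib.sha256(b"known-image-bytes-2").hexdigest(),
}

def flag_on_upload(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint matches the database.

    Only the digest is compared; the image content itself is never
    inspected at this stage.
    """
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in KNOWN_HASHES

print(flag_on_upload(b"known-image-bytes-1"))   # True: matches the database
print(flag_on_upload(b"holiday-photo-bytes"))   # False: unknown image
```

    Note one key simplification: an exact digest like SHA-256 changes completely if an image is resized or recompressed, whereas a perceptual hash is designed to survive such edits — which is precisely why the real system relies on the latter.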

    Digital privacy experts warn that modifying Apple’s operating system to run this software would create a backdoor that governments or nefarious groups might exploit against the wishes of the device’s owner.