What Apple's New Security Initiative Might Mean for Pro Photographers

 

A few days ago, Apple released a statement regarding its new children's security initiative that would, ultimately, scan images generated or stored on Apple products.

It was a significant announcement that sent ripples through the photography world. "We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)," the company states.

Basically, Apple's algorithms will inspect iPhone images, searching for potential child abuse imagery and nude photos. While no one would doubt the ultimate benefit of protecting children with the latest technology, Apple's decision could affect how photographers around the world store their images.

Here is what the company has said its new technology for "expanded protections for children" would do:

  1. New communication tools will allow for a more informed parental role in helping children navigate online communication.

  2. iOS and iPadOS will use new cryptography applications to help limit the spread of CSAM online while designing for user privacy. This new CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos. (Yes, the company will access your images on its cloud.)

  3. New updates to Siri and Search will provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

All three new or expanded elements of Apple's technology could be valuable in addressing and stopping the exploitation of children, a pernicious problem. Apple also assures its customers that it will keep private communications unreadable.

However - and this is a BIG however - the new initiative also raises serious privacy concerns, including the constant scanning of our iPhone and iCloud images and the intrusion of Big Tech and, eventually, government.

The new technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. For that to happen, Apple will scan images. So how advanced is the technology? Will it flag a random baby-in-the-tub picture and potentially lead to serious problems for a well-meaning parent?

Then, of course, there is the issue of professional fine art nude photography. How will it be treated? Will it be flagged? Suppose you're a professional photographer who shoots nudes or boudoir images. Will you be affected if you send or store anything on an Apple product? The new security initiative, while highly commendable, raises a lot of questions for many professional photographers.

You can expect the new initiative to take effect with the company's release of updated operating systems: iOS 15 for the iPhone and updates for the iPad, Apple Watch, and Mac computers. You always have the option to delay upgrading your operating system. Still, if you've worked with Apple technology for a while, you'll know that you will eventually have to upgrade.

Let's take a closer look at what the new security initiative could mean for professional photographers.

iMessages

iMessage is affected by the new changes. Texts sent from an iPhone, iPad, Apple Watch, or Mac computer on a family iCloud account will generate a warning to children and their parents if sexually explicit photos are sent or received.

  • Clearly, parents want to know and should be informed if their children are receiving (or even sending) sexually explicit images. So, this is a big plus.

  • The downside is that Apple will scan your images and flag questionable ones to send to the authorities. There's room for error here. How will the algorithm distinguish between an exploitative image and a fine art image taken by a pro? For that matter, how will it know an image isn't one shared between consenting adults?

  • Photographers who snap any fine art nude images on their phone might want to rethink shooting or storing them on any Apple product.

iCloud Monitoring

While we can certainly understand why Apple would want to access images stored on its cloud, this too could be problematic.

  • Apple says it will use a database of child abuse "image hashes" rather than scanning our actual images (see the short sketch after this list for how hash matching works in principle).

  • Apple further states that there is "less than a one in one trillion chance per year of incorrectly flagging" an account.

  • We remain concerned. Artificial intelligence for facial recognition is still relatively new, with countless incorrect identifications. If you have ever tried to find an image on an Apple product using facial recognition, you'll understand what we mean.

  • If you store your images on iCloud, there is the possibility that they will be erroneously flagged, your account disabled, and information sent to the authorities.

  • You might find it easier, at least for now, to store your images on a different cloud service, such as Google Drive or SmugMug, or on a hard drive.
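
To make the "image hashes" idea a bit more concrete, here is a minimal sketch of hash-based matching: compute a compact fingerprint of a photo and compare it against a database of fingerprints of known images. To be clear, this is not Apple's implementation - Apple describes an on-device perceptual hash called NeuralHash combined with cryptographic matching - so the open-source imagehash library, the example hash, the file name, and the threshold below are all stand-ins for illustration only.

```python
# Minimal sketch of matching a photo against a database of known hashes.
# NOT Apple's NeuralHash pipeline: the open-source `imagehash` library
# (pip install pillow imagehash) stands in for a perceptual hash, and a
# plain Python set stands in for the database of known-image hashes.
from PIL import Image
import imagehash

# Hypothetical database of 64-bit perceptual hashes of known images.
KNOWN_HASHES = {imagehash.hex_to_hash("e0f0e8e8d8c8c0c0")}

# Maximum number of differing bits for two hashes to count as a "match".
# A looser threshold catches re-saved or lightly edited copies of a known
# image, but also raises the odds of flagging an unrelated photo.
MATCH_THRESHOLD = 5

def is_flagged(path: str) -> bool:
    """Return True if the photo's perceptual hash is close to a known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

print(is_flagged("studio_session_042.jpg"))  # hypothetical file name
```

The threshold is where the false-positive worry lives: perceptual hashes are built to tolerate resizing and re-compression, which is exactly why an unrelated but visually similar photo can occasionally land within matching distance of a known hash.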

We applaud Apple's initiative to better secure our children's experiences online. Child exploitation is a real and grave issue across the globe. Our hope is that we can all somehow find a balance between protection and privacy.

Until the algorithms are advanced enough to avoid incorrectly flagging images, you might want to back up your pictures elsewhere. And if you don't want your privacy intruded upon by Apple's technology scanning your pictures, you'll want to send and store them some other way.

 