The Latest

THE LATEST THINKING
The opinions of THE LATEST’s guest contributors are their own.

In the Digital Age, 'Good' is an Ongoing Debate

Jessica Dine

Posted on September 15, 2021 21:35


The debate over Apple's child abuse scanning system speaks to the broader issue at play: how to balance privacy and harm reduction in a technologically driven world.

Last month, Apple announced plans to implement a new scanning system, meant to detect child abuse images on iPhones, through an iOS update this fall. The process would work by matching the digital fingerprints of photos against those of known child abuse images drawn from official databases, and it was built with checks against the possibility of false positives. Nor was it an entirely novel idea: companies like Google and Amazon have long scanned content uploaded to their platforms for prohibited material.
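In code terms, the fingerprint-matching step the announcement described can be sketched roughly like this. This is a toy illustration only: Apple's actual proposal used a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, plus cryptographic techniques to keep the database and the match results private, whereas this sketch uses a plain cryptographic hash, and the database entry shown is purely illustrative.

```python
import hashlib

# Hypothetical database of fingerprints of known prohibited images.
# (This entry is just the SHA-256 of the bytes b"test", for illustration.)
KNOWN_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint of an image (here, a SHA-256 hex digest)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known(image_bytes: bytes) -> bool:
    """Check whether an image's fingerprint appears in the known database."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

The design point the critics seized on is not this matching logic itself, which cloud services have run for years, but where it executes: on the user's own device rather than on a server.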


And yet the announcement generated backlash because of where the scanning would take place: on users' devices rather than in the cloud. Privacy groups, security experts, and consumers alike denounced the intrusion into user privacy as a gateway to government surveillance through our personal devices. They worried that the system could be expanded to other content, repurposed by authoritarian governments to target material they deem illegal, or simply misused. On the other side of the argument was the obvious need for a crackdown on child exploitation, which child protection groups had been urging on Apple for years.


The debate this announcement generated speaks to the broader issue at play: how much power, if any, should communications platforms have to moderate content? As platforms increasingly shape our social interactions and norms, what responsibility do they have to harness that influence for social good? If we're entrusting them with this much power, what do we want that to look like, and where do we draw the line?


Recently, Apple announced plans to delay the rollout, citing a need to improve the detection process before implementation. Whether the system will ever ship is now uncertain.


The initial announcement, the ensuing debate, and the tentative withdrawal mirror the uncertainty surrounding technological progress more broadly. The societal effects of technologies like social media and artificial intelligence are only slowly becoming understood. Lawmakers have just begun to address the regulatory deficit surrounding tech giants and still can't get a handle on how to proceed. Meanwhile, the technology itself has forged ahead, rewiring our society while we argue over how best to play catch-up.


That didn't happen here: this time, the pushback arrived before the technology did. In the very age when technology is reshaping the ways we interact and reaching into previously untouched spheres of our lives, one avenue for taking back consumer control starts with using the platforms for the benefits they provide: communication and shared knowledge.


We haven't yet answered any of the questions posed above, and both tech companies and consumers are still trying to balance competing aims: consumer privacy, harm reduction, and the amount of power unelected decision-makers should hold. The question has only been delayed. What is clear is that, in the process of answering it, more transparency and more consumer input can only be good things, and one way to achieve them is through the very communication these platforms provide.


