The Assault on Trust

by Brad Avery

In Daniel Goldhaber’s Cam, which debuted on Netflix last month, camgirl Alice Ackerman (The Handmaid’s Tale’s Madeline Brewer) becomes the target of a malicious A.I. that hacks her online profiles, steals her identity, and produces a perfect replicant of her body and voice to entertain her online fans. A brilliantly crafted piece of entertainment driven by a paranoid, awards-worthy performance from Brewer, Cam packs the double punch of being a feminist critique of patriarchy and a digital canary in the coal mine.

Alice is a sex worker, performing stripteases online as “Lola” for a live-streaming website where patrons leave tips and comments as they watch her bedroom broadcasts. Hoping to climb the site’s model rankings and break into the Top 50, Alice travels to a remote studio where she breaks the boundaries she has set for herself and performs with other girls, ultimately winning her coveted position as one of the site’s most-watched streamers. But the next morning she finds herself locked out of her account while “Lola” continues to perform: a perfect doppelganger, in a set identical to her own, keeps hosting live shows whose sexual performances go far past Alice’s self-imposed limits.

Cam has been compared to Black Mirror and hailed for its accurate depiction of sex work. Written by former camgirl Isa Mazzei, the movie takes a staunchly sex-positive approach to Alice’s work: the horror is not in the sexuality but in the robbery of a woman’s autonomy over her own body. But as much as Cam is a feminist exploration of sex work, one which other critics have already discussed in depth, the movie is also a disturbing vision of our near future as artificial intelligence becomes more complex and more prevalent in our lives.

The film’s central horror comes not from cinematic scares but from postmodern dread. After the simulated videos force Alice to come clean to her family about her work as a camgirl, her mother refuses to accept her pleas that the woman on screen is not her (“I know, it’s a character,” her mother says). The website’s tech support, her camgirl friends, and her clients are all unable to make sense of the fractured reality they’re faced with. In essence, the algorithm has subsumed Alice not only online but in real life, as its actions become Alice’s actions.

While Cam is classified as sci-fi, the simulated threat is hardly a far-future fantasy. In 2016, during a sweep of copyright enforcement activity, Warner Bros. discovered that somebody had uploaded copies of Ridley Scott’s classic film Blade Runner to Vimeo. The studio quickly filed a DMCA takedown notice to remove the content, stripping the illegal copies of its intellectual property from the free internet.

But Warner had a problem. Those streaming copies of Blade Runner were just that: copies. They were not direct rips of the film from a home video source but the work of researcher Terence Broad, who had taken a video file of the original film and run it through an artificial neural network, where autoencoder software “watched” the movie and reproduced as close a replica of the original as it could. The final product was visually muddy and lacked a soundtrack; it was not a proper clone of the movie. Rather, it was as if an artist had painted a new abstract version of every frame, only in this case the artist was the software itself. These versions of Blade Runner, a movie about replicants almost indistinguishable from actual humans, were in fact new works to which Warner held no copyright claim. Broad had created simulacra of a film about simulacra: copies bereft of originals.
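Broad’s technique can be loosely illustrated in miniature. The sketch below is not his actual code or model, just a toy stand-in: a linear autoencoder (whose optimal compression is mathematically equivalent to PCA) squeezes a stack of tiny random “frames” through a narrow bottleneck and decodes them back into lossy copies that resemble, but never equal, the originals.

```python
import numpy as np

# Hypothetical stand-in for video frames: 100 flattened 8x8 grayscale images.
rng = np.random.default_rng(0)
frames = rng.random((100, 64))

# A linear autoencoder's best solution is PCA: keep only k latent
# dimensions (the "bottleneck"), then project back to pixel space.
k = 16
mean = frames.mean(axis=0)
centered = frames - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
encoder = vt[:k].T            # 64 pixels -> 16 latent values
decoder = vt[:k]              # 16 latent values -> 64 pixels

latent = centered @ encoder              # "watching": compress every frame
reconstructed = latent @ decoder + mean  # "repainting": lossy copies

# The reconstruction error is nonzero: these are new images derived
# from the originals, not duplicates of them.
err = np.mean((frames - reconstructed) ** 2)
```

The information thrown away at the bottleneck is exactly what made Broad’s output muddy and abstract rather than a clone of the source footage.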

Now take the case of deepfakes. Deepfakes are digital manipulations of existing video footage in which, through A.I. and machine learning, users can create near-seamless edits that make people appear to say anything the user wants. The technology first emerged online in its current form in 2016, and the most-cited examples to date have been falsifications of politicians and actors. Earlier this year, Get Out director Jordan Peele released a PSA about the phenomenon in which he used footage of Barack Obama to turn the former president into his puppet, having him say, among other things, “President Trump is a total and complete dipshit.”
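The trick behind most early deepfake tools can be sketched structurally. In the commonly described architecture, one shared encoder learns a common “face space” while a separate decoder is trained per person; at swap time, person A’s face is encoded and then decoded with person B’s decoder. The weights below are random placeholders, not a trained model, so this only demonstrates the plumbing, not a working fake.

```python
import numpy as np

rng = np.random.default_rng(1)

DIM, LATENT = 64 * 64, 128   # flattened 64x64 face crop, bottleneck size

# Shared encoder: maps any face into a common latent "face space".
encoder = rng.normal(size=(DIM, LATENT)) / np.sqrt(DIM)

# One decoder per identity: each learns to render faces of one person.
decoder_a = rng.normal(size=(LATENT, DIM)) / np.sqrt(LATENT)
decoder_b = rng.normal(size=(LATENT, DIM)) / np.sqrt(LATENT)

face_a = rng.random(DIM)     # a frame of person A's face

# The swap: encode A's expression and pose, decode as person B.
latent = face_a @ encoder
swapped = latent @ decoder_b
```

Because the encoder is shared, the latent code carries expression and pose but not identity; the choice of decoder is what stamps a face onto them, which is why a hobbyist with enough photos of a target can bolt a new identity onto anyone’s performance.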

The technology has now been used to target YouTubers and online personalities, and the rise of deepfakes has sent the journalism community into a panic as the threat of perfectly manipulated video footage makes its way into the public’s news feeds. NiemanLab has published multiple posts educating journalists on how to detect fake images and raising the alarm that verifiable truth itself may be in danger.

“Fake news was one thing, but manipulatable audio, image, and video — as metonymized by deepfakes — would assault the trust we instinctively place in our senses,” wrote Betaworks Ventures analyst Jared Newman. “More engaging and more believed than text, any and all photos or videos could become as doubted as a photoshopped magazine cover. Descartes’ wax would not just melt near the fire, it would disintegrate.”

Like almost every other new technology, deepfakes were immediately used for porn. Around 2017 the technology became more accessible to the public, and so, as if anyone expected anything else to happen, internet users rapidly began putting the faces of Hollywood actresses onto the bodies of women in porn videos. While more crudely constructed than the professional deepfakes of politicians, the practice is no less disturbing. A subreddit, r/deepfakes, emerged where users could post their homemade manipulations and request fakes of specific actresses. In a swift moral judgment, Reddit, Tumblr, and Pornhub banned deepfakes, but we all know that once something is online it is there forever. If public figures like Emma Watson and Jennifer Lawrence can have their likenesses stolen and placed in compromising acts, what is to stop the same technology from being used against anyone, whether for blackmail, revenge, or wanton humiliation?

Even in the mainstream entertainment industry, performers have seemingly lost the rights to their likenesses as animation technology reaches a photorealistic event horizon. Over the past decade we’ve seen Jeff Bridges restored to his younger self in TRON: Legacy and Kurt Russell looking like it’s 1986 in a flashback sequence in Guardians of the Galaxy Vol. 2. But those very-much-alive actors could consent to their digital manipulation. Rogue One: A Star Wars Story saw fit to exhume Peter Cushing via a golem of 1s and 0s for a posthumous cameo as Grand Moff Tarkin (Cushing died in 1994, but it’s okay because his estate signed off on it). And that’s only scratching the surface of the hologram craze, in which dead performers from Tupac Shakur to Amy Winehouse are being reanimated for concert tours, seemingly with the full approval of their estates. We’ve come a long way from capitalizing on dead celebrities by pasting photos of Bruce Lee’s face over another actor in Game of Death, but the practice already sits poorly with the public. Earlier this year the NFL famously cancelled plans to have a hologram of Prince appear during the Super Bowl halftime show after fans mounted an uproar campaign arguing that such a stunt would go against the late musician’s religious beliefs; the show eventually settled for a simple projection on a curtain.

“But perfection is always punished: the punishment for perfection is reproduction,” wrote philosopher Jean Baudrillard in The Perfect Crime. Facing this future of replication, Baudrillard is quickly proving to be among our most prophetic theorists. In his most famous work, 1981’s Simulacra and Simulation, he argues that we have replaced reality with a series of symbols and signifiers and that we instead live in the Hyperreal: a world where everything is itself a copy.

In Cam, it is Alice’s drive to reach the site’s Top 50 list that makes her a target for simulation, just as it is our most beloved artists and highest global leaders who have become the first test cases for digitization. The algorithm in Cam that steals Alice’s likeness is far more advanced than the deepfake technology we have today, but it’s not hard to imagine us reaching the point where A.I. can generate perfect replicas of reality, both sight and sound. Goldhaber and Mazzei’s horror-thriller, with the cold glow of its computer screens and the uncanny valley of its villainous digital duplicate, taps deeper into the zeitgeist than many would care to admit.

In one key scene in the third act, Alice meets with one of her male clients in the hopes that he can help her get her account back, or at least offer an explanation for what is going on. Against Alice’s preference, they stay together platonically in a seedy motel. As she works to make sense of the situation she discovers the man in the bathroom masturbating to her deepfake’s live show on his laptop. Even with the real Alice in the next room, he has chosen to embrace virtual reality.

“Now the image can no longer imagine the real, because it is the real,” Baudrillard wrote. “It can no longer dream it, since it is its virtual reality. It is as though things had swallowed their own mirrors and had become transparent to themselves, entirely present to themselves in ruthless transcription, full in the light and in real time. Instead of being absent from themselves in illusion, they are forced to register on thousands of screens, of whose horizons not only the real has disappeared but the image too. The reality has been driven out of reality. Only technology perhaps still binds together the scattered fragments of the real.”

Like Broad’s Blade Runner replicants, Cam shows us a world where images are divorced from human creation, designed by algorithm and dreamed in the neural networks of computers. What both works reveal goes far beyond soft propaganda terms like “Fake News” and “Post-Truth”: they open a Pandora’s box of digital evolution far outpacing humanity’s ability to adapt. Reality risks being lost amid a net of computer-generated visualizations, like tears in rain.


About Author

Brad Avery is a Boston-based writer and journalist whose film analysis has appeared in Vanyaland and Brattle Theatre Film Notes. You can follow him on Letterboxd.
