Cyberpunk? In My Neighborhood?

It's More Likely Than You Think

Illustration by Will Lahey

I screamed — genuinely — when I saw that Anonymous had returned.

It was early June. I’d been alone in my Brooklyn apartment for close to three months in the midst of the pandemic; my grip on reality was, admittedly, already loose. Then, in the earliest days of protests following the murder of George Floyd, as the images coming out of Minneapolis felt like something out of a movie, the reappearance of the internet’s most iconic hacktivist group felt like another dramatic advance in the plot.

I hit play on the newly released video, their first in years. A distorted voice flooded my apartment; from behind a Guy Fawkes mask, the vigilante delivered a haunting warning against insidious government activity and promised to take action on behalf of the people. (Moments later, they allegedly delivered by shutting down the Minneapolis Police Department website and hacking police radios in Chicago to play NWA’s Fuck tha Police.) Over 240,000 tweets about the enigmatic group went out within the hour, and users flocked to their accounts by the millions to celebrate their comeback and seek safety under their digital wings.

I laughed: 2020 had, against all odds, outdone itself again.

The next morning, I logged onto Reddit. The first post on my feed was a news story detailing the reemergence of Anonymous; the second, a question from a user interested in becoming a hacktivist themselves. On my walk to the store later that day, I passed pro-resistance graffiti and a guy sailing by on a hoverboard beneath an elevated subway track. Online, I watched friends don expression-concealing N95 face masks, take up cropped and shaved heads for lack of better haircutting skills, and all around embrace a year that looked and felt almost apocalyptic.

In a fruitless search for precedent in unprecedented times, journalists, trend forecasters, and wannabe soothsayers have tried to predict what will come of the coronavirus pandemic: mass exodus from major cities, the collapse of global economies, a rise in remote work, the obsolescence of all pants but sweatpants. The same goes for the consequences of the United States’ upcoming presidential election, poised to be a catastrophe regardless of the outcome: violence in the streets; the potential collapse of the country as we know it; the rise of an insidious, Orwellian government.

But what I see here has been building since long before 2020. At the risk of exposing my tin foil hat, I feel it’s safe to say that humanity has entered its cyberpunk phase.

Lying at the intersection of science fiction and film noir, the cyberpunk genre forces us to confront our capitalist culture and obsession with technological advancement — and the consequences of each.

Much like ourselves, everyday citizens in a cyberpunk cityscape live in a world consumed by tech, for better or for worse; mega-corporations, ruled over by Bezos-like villains, sit alongside corrupt governments to craft the law of the land. Cities caught somewhere between collapse and rapid advancement set the scene, as neon-lit streets and dark alleyways teeming with secrets abound. In this genre of speculative fiction, stories often begin with an elusive figure (perhaps a hacker, not unlike our friends at Anonymous) tasked with using tech to expose wrongdoings and bring justice down upon the cruel, reigning elites.

Unlike the teen dystopian fiction and fantasy markets, remarkably, the world of cyberpunk hasn’t been oversaturated. Inspired by 60s-era New Wave science fiction, authors Philip K. Dick and William Gibson (the minds behind Do Androids Dream of Electric Sheep? and Neuromancer, respectively) are often considered the earliest pioneers of cyberpunk, along with the team behind the 1977 comic series Judge Dredd. Japanese film in the 80s also stakes its claim as a trailblazer: Out east, the trend began with underground films and rose quickly through the ranks of pop culture, so that the film adaptations of both Katsuhiro Otomo’s manga Akira and Masamune Shirow’s Ghost in the Shell debuted to international acclaim. Later, Cowboy Bebop, known for its film noir atmosphere and jazz soundtrack, became a cult classic in the 90s and remains one of the most celebrated anime series of all time.

Besides The Matrix franchise — a landmark for cyberpunk cinema and for film writ large — the early 2000s saw relatively few cyberpunk titles. But in recent years, a handful of releases have made it to the mainstream and cemented the genre as a resurging trend: Blade Runner 2049; a star-studded (though controversial) reboot of Ghost in the Shell; an upcoming reboot of Akira; a Netflix adaptation of Altered Carbon; a film adaptation of Alita: Battle Angel. Perhaps most notably, the much-anticipated (but frustratingly elusive) video game Cyberpunk 2077, adapted from the tabletop role-playing game by designer Mike Pondsmith, has — despite a series of release date setbacks — kept eager fans on their toes in a year that seems to prognosticate a tech-centric future.

And what is fiction if not a dramatized reflection of, and response to, the present? This sudden shift in theme is telling: Behind the recent uptick in cyberpunk media lies a concurrent interest in the technology and social conditions that make the genre both unique and eerily prescient.

The beauty of cyberpunk, as opposed to other flavors of speculative fiction, is that it is often set in a not-too-distant future. Rather than relying on difficult-to-achieve concepts like intergalactic space exploration or light speed travel, it pulls inspiration from contemporary technologies and considers the next immediate phases of that tech, and how we might see our world shift as a result. Even now, we’re living in the year in which the original Blade Runner film, released in 1982, was set; similarly, titles like Blade Runner 2049 and Cyberpunk 2077 seem like a lifetime from now, but are set only 29 and 57 years away — years that, with any luck, you and I will live to see.

Our transition into a similar way of life will be more seamless than we may imagine; in some ways, it’s already begun. Some things, like flying cars, are still years away from public use (though you’d be hard pressed to convince me that I’ll ever be responsible enough to operate machinery at 10,000 feet), but one could make the argument that self-driving cars are nearly as impressive.

But other inventions that once seemed contained within the realm of fiction — artificial intelligence, personal robots, mechanical body modifications, drones, holographic celebrities — have already been realized and incorporated into our everyday lives. Others are close behind: Neuralink — a small brain implant developed by Elon Musk’s company of the same name, which inserts “neural threads” into brain tissue to help treat degenerative brain diseases — has announced plans to enter human trials sometime in 2020. It seems transhumanism, human enhancement at the hands of tech, is on the foreseeable, if not imminent, horizon. Even the mobile, touchscreen phones we take for granted are a feat of technological excellence; in mere seconds, we can know most anything about everything.

But as Black Mirror (not necessarily cyberpunk in aesthetics, but in concept) suggests with foreboding realism, humanity’s obsession with advanced technology can — and will — go too far.

Years ago, I stumbled out of the theater after a screening of Alex Garland’s Ex Machina, in which a female AI seduces a visitor into believing she is both conscious and benevolent. At the end of the film (spoiler alert), she murders her maker, escapes the isolated facility where she lives, and joins humankind out in the real world — an anonymous android among us. I left the movies that night glancing over my shoulder, toying with the idea that there might be AI in disguise nearby, wondering whether they were friendly, or whether they felt at all.

I’ve always thought humans are incredible creatures for our innovative nature, and for our intellectual insatiability. But what concerns me most about our quest for knowledge is our utter inability to learn from our own mistakes. In movies like Ex Machina, I, Robot, and I Am Mother, we express an understanding that it is possible to take technology too far. And yet, no matter how often we watch our own downfall, we can’t seem to recognize ourselves on the big screen.

As our technological capabilities continue to expand, many will remain distracted by the neon novelty of a new age, filled with tools to make our daily lives more convenient and state-of-the-art features to wow. But those who see beyond the bells and whistles to tech’s true scope of influence will seize the opportunity to use it to their advantage — and some already have.

Our dystopian state of affairs, on a political level, seems to promise the proliferation of cyberpunk. The genre frequently features treacherous government agencies working to manipulate the public, while a subversive, exaggerated antihero with technological might works to thwart them. We, in the present day, are no strangers to this narrative arc.

For years, government watchdogs and journalists have continually warned us of the dangers of facial recognition technology. As I write, people are calling for the pardon of whistleblower Edward Snowden, who revealed to the public classified information regarding the development of mass surveillance programs by the NSA (officially deemed illegal by a United States federal appeals court in August 2020); police forces allegedly use social media to identify, track, and arrest protestors and suspected government dissidents; Russia’s intervention in American politics via the internet is no secret, and has shaped, and will continue to shape, the history of the country, as will the influence of internet giants like Facebook, Twitter, and TikTok. Then, of course, there’s the infamous Anonymous in their Guy Fawkes get-up.

But the use of technology for political gain isn’t limited to the powers that be. The everyday individual’s ability to intervene through accessible tech is what defines both our era and the cyberpunk genre.

Hacktivism — in which amorphous groups of hackers use technology to knock down websites, obtain and release confidential information, or otherwise cause disruption in the name of social and political justice — is on the rise, pledging to battle the malevolent, tech-savvy governments and megacorporations that threaten public privacy and autonomy.

The story of 23-year-old British hacker Marcus Hutchins paints an excellent (though extraordinary) picture of the future of hacktivism. It’s difficult to sum up what is a decidedly complex and fantastical tale, woven with drugs, doxing, cybercrime, mansion parties, and enough cryptocurrency to make a man richer than God.

It begins in Devon, England, with a teenage introvert, a web forum, and a Windows 95 computer. A talented, self-taught programmer, Hutchins was only sixteen when he first started forging online connections with shady internet figures — including one named Vinny, who contracted Hutchins to build a tool that could essentially commit bank fraud.

Hutchins obliged, putting himself at Vinny’s mercy and launching a chain of events that would begin with a naïve mistake — and end with the FBI at his door. After acquiring Hutchins’s name and address for a birthday surprise, Vinny began demanding more and more from him, threatening to hand his personal information to the authorities in order to keep him working.

As the guilt consumed him, Hutchins turned to using his skills for good, becoming a favorite in online cybersecurity circles and eventually helping defeat WannaCry — the malware behind the world’s worst-ever cyberattack, which seized entire systems, ran dangerous code, destroyed data, and demanded a ransom in Bitcoin, all at warp speed, infecting everything from hospitals to banks. Still, the consequences of his cybercrimes watched him from the shadows, all the way up until his arrest by the feds one early morning in Las Vegas.

The 2015 series Mr. Robot, which follows 20-something hacker Elliot Alderson, doesn’t feel far from Hutchins’s story (which reads like fiction almost as easily as anything else listed here). Each has its unassuming antihero, lilts back and forth between the inextricably linked realms of cybersecurity and cybercrime, and carries high-stakes consequences. The fictional Elliot and IRL Marcus could easily exist in the same world — and could in fact exist in the same world as you and me. Even if you don’t expect to find yourself in the midst of any international cybercrime syndicates, it isn’t difficult to find the cyberpunk in your own life. Here, in New York, I look outside my window and see settings not dissimilar to the archetypal cyberpunk city: sleek metropolitan centers and neglected outskirts; a proletariat battling a callous upper class; a planet reliant on tech to survive.

Even as the accessibility of certain technologies increases, it isn’t difficult to imagine this phenomenon expanding as the working class struggles to keep up — especially given the widening class divides in places like the United States. The ongoing degradation of the environment, too, will certainly drive humanity to live closer together in massive urban centers (according to the European Commission, 85% of people already do) and bring us to abandon even some of our most famous metropolises as parts of the Earth are rendered uninhabitable; images of a red-hot, deserted Las Vegas in Blade Runner 2049 felt a little too close for comfort.

Over the last several decades, we’ve begun to integrate technology more and more into our quotidian diet through the advent of cell phones, Bluetooth, and digital assistants like Siri and Alexa. We’ve found ways to make our lives more convenient; this year, we’ve proven (though not by choice nor without sacrifice) that it is possible to operate a society almost entirely digitally. Now, our next feat of advancement — to enhance the capabilities of human beings themselves and push our relationship with technology to its limits — may be just a few years away (and the fact that biotech is already part of the tech lexicon suggests we’re at the beginning stages of that kind of advancement anyhow).

Taking into consideration the exponential rate at which humans develop technology — which, according to the Law of Accelerating Returns, will only continue to speed up — it’s likely that the futuristic objects in the mirror are closer than they appear. Just 30 years separate the first clunky personal computer from the omniscient iPhone — mere decades, and yet enough to change the face of the Earth forever — and that alone is telling of what’s to come, and soon.

I would be lying if I claimed to be above it all myself (and so, I think, would most of us). I’m curious about biotech, implants, and artificial intelligence; I see the allure in asking if we can, and not if we should. And when technology links together such vast swaths of society, participating in the future will require giving in to the trials, temptations, and turmoil that such advancements have the power to invite in.

But as in cyberpunk, as in real life, to mistrust and to partake are not mutually exclusive. (After all, even though we know the harm it can do, most of us are still on social media.) And as we ease into more of a tech-centric reality, we can seek guidance in the lesson that cyberpunk fiction on contemporary shelves teaches us: that, even in a world embellished with tech, our rawest motivations, most basic needs, and darkest desires will always be brutally human.


Tiana Attride

is a writer, editor, and lvl 86 doomscroller living in Brooklyn, New York. She loves to talk about travel, media, culture, race, and why Cowboy Bebop is the greatest television show of all time.
