We may come to remember this decade as the one when human beings finally realised we are up against something. We’re just not quite sure what it is.
More of us have come to understand that our digital technologies are not always bringing out our best natures. We woke up to the fact that our digital platforms are being coded by people who don’t have our best interests at heart. This is the decade when, finally, the “tech backlash” began.
But it’s a little late.
Shoshana Zuboff recently published her comprehensive The Age of Surveillance Capitalism to deserved acclaim, but the book is really about decisions Google made twenty years ago to harvest our data and sell it to advertisers. The Center for Humane Technology has called attention to the way that the manipulative techniques of behavioural design have been embedded in our apps — bringing us all up to speed on the science of captology and addiction, circa 1999.
These are necessary critiques, but they’re too focused on the good old days, when the business plans of a few bad actors and the designs of some manipulative technologies could be identified as the “cause” of our collective woes.
That’s really only half, or less than half, of the story. It’s blaming the developers, the CEOs, the shareholders, or even individual apps, programs and platforms for our predicament, when most of these players have either long since left the building or are themselves oblivious to their impact on our collective wellbeing. Just because the public is finally ready to hear about these tech industry shenanigans doesn’t mean they are still relevant. We can’t even blame capitalism anymore. The quest for exponential returns may have fuelled the development of extractive and addictive technologies, but the cultural phenomena they gave birth to now have a life of their own.

Different worlds
What this decade’s critiques miss is that over the past 10 years, our tech has grown from some devices and platforms we use to an entire environment in which we function. We don’t “go online” by turning on a computer and dialling up through a modem; we live online 24/7, creating data as we move through our lives, accessible to everyone and everything. Our smartphones are not devices that sit in our pockets; they create new worlds with new rules about our availability, intimacies, appearance and privacy. Apple, Twitter and Google are not just technology services we use, but staples in our retirement portfolios, on whose continued success our financial futures depend.
At this point, the digital environment is not so much the result of a series of choices made by technology developers as it is the underlying cause of those choices. What happened to us in the 2010s wasn’t just that we were being surveilled, but that all that data was being used to customise everything we saw and did online. We were being shaped into who the data said we were. The net you see and the one I see are different. Your Google search results are different from mine, your news feeds are different and your picture of the world is different.
As the decade began and social media took over society, many people tried to call attention to digital technology’s broader environmental effects. In Program or Be Programmed, I argued that we have to understand the platforms on which we’re working and living, or we’re more likely to be used by technology than to be the users controlling it. But those of us arguing for new media literacies may have been making our case a bit too literally.
 ‘The digital environment is not so much the result of a series of choices made by technology developers as it is the underlying cause of those choices.’
The people and organisations responding to our plea launched the “learn to code” movement. Schools initiated Stem curriculums, and kids learned to code in order to prepare themselves for jobs in the digital economy. It was as if the answer to a world where the most powerful entities speak in code was to learn code ourselves, and then look for employment servicing the machines. If you can’t beat them, join them.
But that wasn’t the point. Or shouldn’t have been. What we really needed this decade was to learn code as a liberal art — not so much as software engineers, but as human beings living in a new sort of environment. It’s an environment that remembers and records everything we have done online, every data point we leave in our wake, and adapts itself to our individual predilections in order to generate whatever responses or behaviours the platforms want from us. The digital media environment uses what it knows about each of our pasts to direct each one of our futures.
We can no longer come to agreement on what we’re seeing, because we’re looking at different pictures of the world. It’s not just that we have different perspectives on the same events and stories; we’re being shown fundamentally different realities, by algorithms looking to trigger our engagement by any means necessary. The more conflicting the ideas and imagery to which we are exposed, the more likely we are to fight over whose is real and whose is fake. We are living in increasingly different and irreconcilable worlds. We have no chance of making sense together. The only thing we have in common is our mutual disorientation and alienation.
We’ve spent the last 10 years as participants in a feedback loop between surveillance technology, predictive algorithms, behavioural manipulation and human activity. And it has spun out of anyone’s control.

 ‘Russian bots, meme campaigns and Cambridge Analytica’
This is a tough landscape for anyone to navigate coherently. We may be benefiting from the internet’s ability to help us find others with whom we share rare diseases, hobbies, or beliefs, but this sorting and grouping is abstract and happens over great distances. We are not connecting with people in the real world, but being gathered by our eyeballs in disembodied virtual spaces, without the benefit of any of our painstakingly evolved social mechanisms for moderation, rapport, or empathy.
The digital media environment is a space that is configuring itself in real time based on how the algorithms think we will react. They are sorting us into caricatured, machine-language oversimplifications of ourselves. This is why we saw so much extremism emerge over the past decade. We are increasingly encouraged to identify ourselves by our algorithmically determined ideological profiles alone, and to accept a platform’s arbitrary, profit-driven segmentation as a reflection of our deepest, tribal affiliations.
Since 2016, we have summoned demons to embody and represent these artificially generated worldviews — Russian bots and meme campaigns and Cambridge Analytica. But though these may have amplified and accelerated the effect of the digital environment, that environment would have generated standing waves of cultural angst in primary colours no matter what.
Then, all it takes is an ideologue or ideology to jump in and claim that standing wave as their own. Trump is not the originator of his demagoguery so much as the vessel. Ideologically speaking, he’s less a tweeter than a re-tweeter. Likewise, Brexit is not a policy design for an independent England so much as a projection of one group’s collective angst. And these are not even the most monstrous of the phantoms we are generating.
 ‘We’ve surrendered to digital platforms that look at human individuality as noise to be corrected, rather than signal to be cherished.’
Incapable of recreating a consensus reality together through digital media, we are trying to conjure a television-style hallucination. Television was a global medium, broadcasting universally shared realities to a world of spectators. The Olympics, moon landings and the felling of the Berlin Wall were all globally broadcast, collective spectacles. We all occupied the same dream space, which is why globalism characterised that age.
But now we are resurrecting obsolete visions of nationalism, false memories of a glorious past, and the anything-goes values of reality TV. We are promoting a spectator democracy on digital platforms, and, in the process, we are giving life to paranoid nightmares of doom and gloom, invasion and catastrophe, replacement and extinction. And artificial intelligence hasn’t even arrived yet.

Reconnecting to reality
There is a way out, but it will mean abandoning our fear and contempt for those we have become convinced are our enemies. No one is in charge of this, and no amount of social science or monetary policy can correct for what is ultimately a spiritual deficit. We have surrendered to digital platforms that look at human individuality and variance as “noise” to be corrected, rather than signal to be cherished. Our leading technologists increasingly see human beings as a problem, and technology as the solution – and they use our behaviour on their platforms as evidence of our essentially flawed nature.
But the digital media environment could be helping us reconnect to local reality and terra firma. This is one of its potential breaks from media environments of the past. In the digital environment, we have the opportunity to remember who we really are and how to take responsibility for our world. Here, we are not just passive consumers; we are active citizens and more. That’s the real power of a distributed network: it is not centrally controlled, but locally generated.
The digital environment is also built, quite literally, on memory. Everything a computer does happens in one form of memory or another – just moving information from one part of that memory to another. The digital media environment functions like a big blockchain, recording and storing everything we say or do for later retrieval. It could be helping us retrieve real facts, track real metrics and recall something about the essence of who we were and how we related before we were untethered from ourselves and alienated from one another.
The next decade will determine whether we human beings have what it takes to rise to the occasion of our own imposed obsolescence. We must stop looking to our screens and their memes for a sense of connection to something greater than ourselves. We must stop building digital technologies that optimise us for atomisation and impulsiveness, and create ones aimed at promoting sense-making and recall instead. We must seize the more truly digital, distributed opportunity to remember the values that we share, and reacquaint ourselves with the local worlds in which we actually live. For there, unlike on the partitioned servers of cyberspace, we have a whole lot more in common with one another than we may suspect. — The Guardian