Writing in The Observer, the journalist Rafael Behr recognises that the book is not dystopian or anti-technology, but an argument against an idea of technology which “has emerged as the intellectual orthodoxy of the digital age.” It doesn’t argue that the world is going to hell in a handcart; rather, it warns against the dangers of spending too much time inside this electronic information loop. It’s a smart review, and deeply sensitive to what the book sets out to do. Read it below, or here:
The good and the bad of digital dependency: Rafael Behr on a cautionary history of cyber society.
The Observer
22 February 2009
We spend a lot of time giving and receiving feedback: filling in customer satisfaction forms; awarding marks out of 10; appraising and being appraised. It feels like a pretty natural process, but we use a word from electrical engineering to describe it. We are feeding data back into a system so it can correct itself and work more efficiently. That isn’t the only linguistic overlap between circuitry and sociability. No one wants to be “out of the loop” and it is good to be “switched on” or, better still, “plugged in”.
These, as James Harkin shows in Cyburbia, are no ordinary metaphors. They express an idea that has become subtly but deeply embedded in our minds – that human social activity resembles an electronic network. If that doesn’t sound very radical, it is because the internet has thoroughly infiltrated our lives and so much of our social interaction is now mediated through machines. But not so long ago, the idea of equating organic social systems to technical networks – cybernetics – was the province of only a handful of scientists, hippies and futurologists based around San Francisco Bay. Harkin charts the history of this maverick field, how it was born in an obscure military experiment during the Second World War, was nurtured in the quasi-communist ideology of 1960s West Coast counterculture and then emerged as an intellectual orthodoxy for the digital age.
Much of our world has now become a cybernetic fantasy. Our status is defined by the volume of digital traffic that flows through us; we have links instead of contact; our knowledge of facts and figures is outsourced to Google. We are morbidly afraid of disconnection. It is, Harkin argues persuasively, both a wonderful and a sinister new stage in the evolution of human society. He compares it to a mass migration, as when rural populations flocked to cities during the industrial revolution or, more pertinently, when the post-industrial middle classes fled the city for the suburbs.
It is a neat analogy. People immerse themselves in life online in search of new identities, freedom and anonymity. But the communities they form often end up demographically homogeneous, hostile to newcomers, culturally sterile and home to all manner of discreetly conducted perversion: welcome to cyburbia.
There is nothing unusual about cyber-scepticism. But rarely is it expressed by someone with Harkin’s genuine enthusiasm for the technology. Writers who are steeped in new media tend to evangelise for it and those who reject the evangelical vision tend to be motivated more by fear than insight. Harkin admires the digital revolution, but is not in thrall to it. He also describes the technology fairly lucidly for the uninitiated.
But Cyburbia is more than an account of how old-fashioned, analogue social dysfunction ends up being replicated online. Harkin believes something more profound is happening, perhaps even at the level of cognitive changes in our brains. He cites research showing a marked effect on the prefrontal cortex (the seat of working memory) as a result of constant switching between different data streams – check email; send text; surf web; change TV channel; chat on Instant Messenger; check Facebook; check email again.
Crudely speaking, the kids who are growing up surrounded by this technology will have better hand-eye co-ordination than their parents, but shorter attention spans. They will be better at holding many things in their heads at once, but worse at remembering them afterwards.
The way we handle information and pass it on is changing. The shift, over a generation or two, could ultimately be as profound as the ancient transition in civilisation from oral to written culture. Forget all those grandiose claims that the web was the most important innovation since the printing press. It may, in fact, turn out to be the biggest thing since the alphabet.
One of Harkin’s most penetrating critiques is an account of how the US army’s reliance on computer technology hampered its counter-insurgency tactics in Iraq. GIs were all wired up to each other, constantly feeding information back and forth across the battlefield. But they ended up paralysed by data overload. The network functioned brilliantly, but purely for its own sake.
That is the danger Harkin sees in our ultra-networked society. Of the millions of communications that bind us together, few convey messages of any importance. Real human interaction risks being lost in a fog of self-sustaining, vacuous digital chatter. You don’t have to wander very far into cyburbia to find evidence of communication chasing its own tail: misinformation attracting comment and hysterical rebuttal in an infinite polemical regress. It is the cultural equivalent of that unbearable, high-pitched whine you hear when a microphone picks up the signal from a loudspeaker, which is, after all, also a kind of feedback.