Does Facebook rot our brains, or just tabloid headlines? Earlier this week, the director of the Royal Institution, Susan Greenfield, was quoted as saying that social networking sites and other communication gizmos might “infantilise our brains.” Maybe, but only if you’re an infant anyway. I interviewed Susan Greenfield for Cyburbia, and she pointed out, quite reasonably, that if you leave a young child in front of a video game or other electronic communication device, continually pressing buttons and responding to feedback, then the chances are that it will end up fidgety and diagnosed with ADD. That, however, is less a matter of neuroscience than of simple common sense. While our understanding of the brain and its malleability is improving all the time, there is a good deal that we simply don’t know about how our brains are being “rewired.” All we have is a range of probabilities and guesswork, and it would be wrong to jump to the worst-case scenario.
So what is the evidence? We are born with around a hundred billion neurons to play with, and those neurons are constantly forming connections with one another in response to new experiences – from the earliest years of our childhood, everything we think or do changes the relationships between them and helps make us the kind of people we become. The part of our brain whose job it is to deal with signals and stimuli is known as the prefrontal cortex, and it is situated at the front of our skulls. While it is difficult to isolate functions completely to different parts of the brain – like an orchestra or a football team, it is best thought of as a group of players acting in indivisible concert – the prefrontal cortex is known to play a highly significant role in organising our short-term decision-making. Neuroscientists have known about this function for decades, and they call it executive control. Think of it as the brain’s secretary or chairperson; its job is to plan ahead and make decisions, to take account of all the immediate messages and stimuli which come its way, rank them in order of importance and make sure we focus only on those which are most pressing. The ability to respond to incoming stimuli is crucial for our development. When someone is subjected to a frontal lobotomy – those unfortunate patients in the film One Flew Over the Cuckoo’s Nest, for instance – their prefrontal cortex is damaged, which is why they often appear dull and unresponsive. On the other hand, rapid reaction to any and every stimulus can be self-defeating. When we walk down the street and hear a loud noise, for example, our natural and immediate instinct is to look around. If we reacted to every single aural and visual stimulus which crossed our path, however, we would become incapable of progressing with any sense of purpose. The job of our executive controller, then, is to filter incoming messages for rank and import, and to exercise a little discipline and restraint. Executive control develops relatively late in childhood – which is why small children are not very good at it – and plays a vital role in readying kids for school and later life.
The time that we spend in Cyburbia teaches us to respond rapidly to messages and stimuli as soon as they come our way. But what if there is a price to be paid in return? What if wading through a constant stream of messages from our electronic ties ends up placing an insuperable burden on our executive controller, so much so that it compromises our ability to pay attention to the task at hand? There’s no doubt that trying to react to many different streams of information at the same time tends to slow us down and increases our susceptibility to making mistakes. A very literal warning of the errors which can result came early in 2008, when the fashionable East London district of Brick Lane announced that it was henceforth padding its lampposts as a preventive measure against the growing number of “talk and text” injuries which were maiming thousands of the young hipsters who amble along its streets. More surprising, perhaps, is that it is not necessarily the technology-savvy young who are better at avoiding these creeping errors. An ongoing research project into this area at Greenfield’s Institute for the Future of the Mind at Oxford University, for example, asked two different groups of subjects – one between the ages of 18 and 21, the other between the ages of 35 and 39 – to perform a simple intelligence test of translating images into numbers. The test lasted only 90 seconds, but when both groups were suddenly interrupted by a phone call, a text message or an instant message, the results surprised the researchers. The younger group, as expected, performed better than their elders when there were no interruptions. When both groups were interrupted, however, the older subjects matched the youngsters for both the speed and the accuracy with which they completed the test. The findings turn on its head the stereotype that younger people, who have spent a lifetime immersed in Cyburbia, are necessarily better at doing and communicating lots of different things at once. What they suggest, by contrast, is that the ability to keep lots of things on the boil with any confidence comes with maturity, and that young people are still learning those skills. Though it is difficult to say for certain, it may be that the ability of very young people to prioritise their tasks – their internal secretary – ends up hampered by the constant interruptions which flow from their dependence on communications gadgetry.
Before we convict our computers and mobiles of turning everyone into fidgety stoners with the attention span of baby goldfish, however, it helps to get a little perspective. For many of us, work doesn’t exercise our minds a great deal; unless your job is writing orchestral symphonies or performing complex medical operations, after all, it is highly unlikely that you need to give it your undivided attention all of the time. Often it is only because many of us are under-stimulated in the first place that we offer ourselves up to a never-ending communications loop of email chatter, texts and instantaneous updates from social networking sites. Latent within all our moral panics about the net is a much more interesting debate waiting to get out, and one which can’t really be settled by neuroscience at all. It is quite possibly the case that our adult brains have been subtly rewired by sending out messages into the electronic ether and responding to an incessant stream of feedback – on computer games, on the internet, or from our different electronic connections on Facebook. But why should rewiring be such a bad thing? One of the arguments in my book is that growing up on this continuous electronic information loop of instruction and feedback has made us more responsive to electronic information, and more keen to zigzag and adjust our way through it. In various ways, some storytellers in film and theatre are trying to respond to this new sensibility in audiences by thinking up new, sophisticated kinds of stories which offer us more freedom of movement and let us follow our nose. It is only because the medium of the net is at such an infantile stage of its development that we spend our time Twittering, writing Facebook updates and staring out onto the electronic ether. Despite what Greenfield says, it is the medium which is infantile, and not us. What we urgently need are people capable of engaging our fragmented attention in different ways and doing interesting things with it. Until then we’ll remain endlessly distracted, staring aimlessly out of the window into Cyburbia.