"Is Technology Eating Our Minds?"
"Is Technology Eating Our Minds?"
by David Binning
British neuroscientist Susan Greenfield - a regular visitor to Australia and former South Australian Thinker in Residence - has written several books on the subject of technology's effects on the brain. She speaks widely, arguing that future generations are at risk of everything from desocialisation and autism to damaged cognitive functions such as the ability to think deeply and even to read. "Everyone knows that the human brain is sensitive to the environment," she tells Weekend Health. "Therefore, if the environment is now unprecedented and different, how can the brain stay the same?" But what's the hard evidence that today's swarms of computers and gadgets, incessantly bleeping and buzzing with communication and information, are causing what Greenfield calls "mind change"? While it makes intuitive sense to agree with her assumption that the pervasiveness of digital technology in modern society must be having some sort of effect on us, there is in fact little clear evidence to prove it.
Or so says Rodney Croft, professor in the University of Wollongong's department of psychology. He's one of Australia's leading researchers studying how radio frequency exposure from mobile phones and other sources affects the brain. He points out that, as reported last May in The Australian, the largest study into the effects of mobile phones on the brain failed to show any direct link between mobile phone usage and brain tumours. The Interphone study was conducted over 12 years in 13 countries, including Australia, and involved more than 20,000 people. Which isn't to say there are absolutely no effects. Critics of the Interphone report, such as Australian neurosurgeons Charlie Teo and Vini Khurana, claim the conclusion that there's little cause for concern is misleading. "Despite the study's methodological limitations that biased it towards finding nothing, the heaviest [mobile] users were found to be at significantly high risk of glioma," they say. Glioma is a common brain cancer. While the study found no increased risk of cancer overall, those in the top 10 per cent of phone use were up to 40 per cent more likely to develop glioma.
Further, studies have found that when people are exposed to mobile phones in their sleep, there's a discernible increase in alpha wave activity during the first non-REM period, when the brain is known to engage in so-called memory consolidation. However, the levels of alpha wave activity detected are roughly 1 per cent of those that normally occur when we simply open our eyes, assuredly a safe activity. "I think that in 20 years or so people will have got used to the idea that we have been using radio frequency technologies for a long time without there being any increase in cancer or other health problems," Croft says. "It's very hard to know of course, but that's certainly my feeling."
Ian Hickie, professor and executive director of the Brain and Mind Institute at the University of Sydney, is a fierce critic of Greenfield. He claims that, with the evidence available, it's equally possible to argue that technology is making us happier and smarter, not miserable and mindless. "It's possible that we may start to see technology encouraging synaptogenesis [growth of new connections]," he says, suggesting that society's intense engagement with digital technologies may be creating new neural pathways our ancestors lacked. Then there's work reported this year by researchers at the Florida Alzheimer's Disease Research Centre. They found that long-term exposure to the electromagnetic waves associated with mobile phone use protected mice from Alzheimer's disease, in some cases even reversing its symptoms.
Those findings haven't been replicated but, regardless, it's society's increased reliance on technology such as internet search engines, video games and, especially, social networking sites such as Facebook and Twitter that worries Greenfield. She says they all contribute to shorter attention spans and encourage instant gratification, while also conditioning the brain to be overly responsive to lights and buzzing noises: "Our brains are being infantilised." Social networking is Greenfield's pet hate. She claims it encourages unnatural forms of social interaction - especially among young people - that ignore important modes of communication such as eye contact and basic body language.
Hickie disagrees, suggesting social networking can be linked to positive social and neurological effects. For instance, he says it forges stronger bonds between friends and family than any technology before it, and claims these effects result in positive stimulation of the brain. "To say that social networking isn't pro-social or that it doesn't serve to reinforce empathy is nonsense." Moreover, Hickie says, it has led to a sharp decline in the time young people spend playing video games, the bête noire of most child development professionals. He goes further: "On this basis I assume Susan Greenfield is against using the telephone."
But while Greenfield's assumptions may not stem from rigorous scientific inquiry, it is telling that Google chief executive Eric Schmidt has raised concerns about the effect technology, particularly the internet, is having. Last year he made headlines when he said: "I worry that the level of interrupt, the sort of overwhelming rapidity of information and especially of stressful information, is in fact affecting cognition. It is . . . affecting deeper thinking. I still believe that sitting down and reading a book is the best way to really learn something and I worry that we're losing that."
Jayashri Kulkarni, professor and director of the Monash Alfred Psychiatry Research Centre in Melbourne, agrees. "I do believe as a society we have become more impatient because we have the capacity to receive information more quickly. We're working in a very much more attention deficit-type mode because we need to assimilate information more quickly."
While the ability to process information quickly is often associated with high IQ, it's this act of visually processing information - say, by scanning dynamic web pages or the drop-down windows of a typical computer interface - that Greenfield and others fear may be impairing our ability to comprehend it on a deeper level. At the core of this issue lies an interesting question. Is the brain like a computer, allocating a set of finite resources? Or is it, as Hickie suggests, capable of expanding itself to cope with higher input? "Is our [neural] circuitry finite?" Kulkarni asks. "Nobody really knows."
Unfortunately, the best technology available for analysing brain activity isn't sophisticated enough to answer such questions, although considerable money and effort are being spent trying to improve it. Important advances are being made, for instance, in the fields of magnetic resonance imaging and transcranial magnetic stimulation. Meanwhile, Hickie claims a key area of research that isn't getting the attention it deserves is how the brain reacts to being in a constant state of arousal. "People in a constant state of anticipation tend to sleep poorly," he says. And it's well documented that poor sleep can lead to health problems, including depression.
Gaming is one area attracting plenty of research attention, most notably the effects of prolonged exposure to hyper-violent, hyper-real games such as Grand Theft Auto. These games are clearly designed to trigger an extreme fight-or-flight reaction in players, something Hickie says may have implications for the natural functioning of the sympathetic nervous system, which regulates that response.
No doubt Greenfield agrees. She sees no end to the damage technology may pose to today's and tomorrow's brains. The big question, though, is what proportion of her worrisome predictions is real and what is simply a figment of her imagination. Clearly, there isn't enough evidence to know. But right now Greenfield isn't in the laboratory gathering the data to bolster or demolish her claims. Instead, she's putting the finishing touches to her debut novel, one that weaves themes of technology and cyber obsession into a plot involving a 22nd-century neuroscientist and the three women in his life. One supposes the traditional love letter has well and truly become an anachronism in the world they inhabit.