The Doric Column
January 11, 1999
In a few years scientists will know where all our genes are located on the map of life.
The Human Genome Project has already plumbed vast regions of our chromosomes, the vertical files in our cells that house the DNA plat maps. The sequencing job will be done by 2003. When completed, it will be regarded as a unique feat in the history of science and human cartography.
But it's kid stuff compared to the next big mapping project--the human brain and its billions of neurons and zillions of connections.
Brain evolution has proceeded along its merry way in the cranium of Homo sapiens since we migrated out of Africa and began dominating the earth a quarter million years ago.
Now there's another factor to consider in our evolutionary experience. The average brain, at least in the West, is heavily trafficked by neurotransmitters on missions far different from those of our ancestors'. They are set loose by moving images on an illuminated electronic screen.
Today most American children spend more time watching television than doing anything else. Their neurotransmitters, the chemical substances that act as messengers in networks of neurons, are delivering the goods in a new, hyperactive environment.
TV neurotransmitter hyperactivity began in earnest just as James Watson and Francis Crick figured out how DNA makes copies of itself in their laboratory in Cambridge, England. Fifty years ago. On a timeline that compresses the evolution of our genetic family into a single day, those 50 years work out to a matter of seconds.
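The compression is easy to check, though the exact figure depends on how far back one dates the family. A quick back-of-the-envelope sketch (the quarter-million-year span is cited earlier in the column; the 150,000-year span is an alternative assumption, not the author's):

```python
# Compress the history of our genetic family into a single 24-hour day
# and ask how many seconds the 50 years since Watson and Crick occupy.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def seconds_on_the_day(years: float, span_years: float) -> float:
    """Map `years` onto a day that represents `span_years` of history."""
    return years / span_years * SECONDS_PER_DAY

# Two plausible baselines for the span of our genetic family:
print(round(seconds_on_the_day(50, 250_000), 1))  # 17.3 seconds
print(round(seconds_on_the_day(50, 150_000), 1))  # 28.8 seconds
```

Either way, half a century of television barely registers on the evolutionary clock.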
We know that TV has radically transformed our culture and politics. Peter Sellers dramatized it 20 years ago as a TV-addicted manchild in the film Being There. We know that the purveyors of electronic visual media are rubbing their hands with glee as new products pour into the marketplace. Millions of salesfolk are trying to persuade us that we need them. And we've bought in big time.
Just look around.
Where TV pretty much left off, at the school doorstep, computers are now showing up by the truckload. Mr. Chips has emerged from newsgroup therapy a new man, surrounded by silicon.
Last week The Times of London ran a series entitled "The Future of Learning."
What does all this have to do with brain evolution, you ask? The answer is that no one knows. You can't prove the tidal wave of electronic images has any evolutionary consequence at all. Not right now. And we are just beginning to understand how important early experience is in whipping the brain into shape.
But I suspect that those who believe the ubiquitous electronic screen has no bearing on the evolving brain are decidedly in the minority. Either way, some people will want to know whether any "rewiring" is really going on; others, whether the brain's amazing "plasticity" can be harnessed to serve the market. All of us should want to know whether we are molding, literally, the brains of our youth.
Well, the scientific and technological groundwork is being laid right now to find out.
"Neuroscience," wrote author Tom Wolfe in Forbes magazine a couple years ago, "is on the threshold of a unified theory that will have an impact as powerful as that of Darwinism a hundred years ago."
Wolfe is wowed by the combination of powerful imaging and tracking technologies that now allow scientists to watch the brain as it functions: not only to identify centers of sensation "lighting up" in response to stimuli, but to track a thought as it proceeds along neural pathways and traverses the brainscape on its way to the great cerebral memory bank, where it queues up for short- or long-term storage.
That's right. Believe it or not, before long our most transcendent thoughts and deepest aspirations will be outfitted with the equivalent of radio collars.
Speaking of radio collars, there's some reverse brain engineering going on, too. In a couple of months, researchers at the University of Colorado at Boulder, along with officials from a local company named Genobyte, will fire up Robokoneko, a robot cat housing the world's most advanced artificial brain in its robocat cranium. Some 40 million transistorized neurons will take up the call, simulating, perhaps more effectively, the performance of the axons and dendrites collaborating to write this sentence.
The investigators concede that they are trying to simulate biological evolution with their new chip, not merely programming neural networks to "learn" to perform certain tasks. The optimum design required "many generations of random mutations and breeding." [italics added]
Conjugating chips. It was bound to happen sooner or later.
The black-and-white Philco was well-received when it arrived at our home in 1955.
The first program I remember watching was, naturally, a western, featuring the dignified and stately Hopalong Cassidy setting things right out amongst the tumbleweed and sagebrush. My neurotransmitters went into overdrive.
Now, please, fast forward to the present.
I'm something of a milquetoast when it comes to indulging my children, ages 12½ and 8. But despite withering pressure and mild threats, I've never delivered the goods to the boy, the older of the two. By the goods I mean a Nintendo, a Sega Genesis, or a Sony PlayStation. I have delivered a Game Boy, two Game Boys actually. But for the bigger games he has to rely on our visits to the homes of his younger cousins.
Except during the Christmas season. During Christmas, on a "good will toward men" basis, I yield to his raw hunter-gatherer drives and rent a Nintendo from Blockbuster Video. He is free to play at will during the holidays.
I was okay with this arrangement until I happened upon a scene displayed on our new 27" Philips TV. I hadn't paid any attention at all to his selection of games to play on the rental, trusting his judgment.
Never trust the judgment of a boy under the influence of hunter-gatherer drives in the contemporary electronic era.
The scene I stumbled upon, I learned, was the consequence of losing a battle. We all lose in life, and we pick up the pieces and move on. But the consequences are more severe in this case. The loser is impaled on metal spikes at the bottom of a pit. Bright crimson pools from his ebbing existence spread over the screen. The accompanying audio confirms that, if you have to go, this is not the way.
"What in the dickens is that?" I barked in horror.
Robbie didn't answer. He couldn't answer. All his conscious energies were being devoted to defeating the dark forces in the video game Mortal Kombat IV. He was dealing with characters like Johnny Cage, Sub-Zero, Scorpion, Sonya, Jax, and Tanya.
These creations are a long way from the likes of Hopalong Cassidy, and their appearance on the electronic screen cannot be described as evidence of evolutionary progress. In any case, big-screen video games in which violence is the main draw are not viewed passively like TV programs. They are actively engaged in. Weighty moral decisions, once left in the hands of the trusty Cassidy, are made on the fly by the player at the controls. Complex feedback loops go into high gear. Oceans of neurotransmitters flood the brain's synapses. Synaptic junctions snap, crackle, and pop. The world around the player vanishes. I vanished.
I tried again, and again, moving closer. "Robbie!" The third time I finally broke through. "What?" he shot back as he hammered the controls, his face transfixed by the action on the screen.
You can still pry kids away from TV using the power of suggestion, rewards or mild threats, but to pry them away from video games, especially the kill-for-fun simulations, you need a crowbar. With the virtual reality arcades coming on, heavy equipment will be required.
It was my own fault. Robbie had pressed me to take him to the movie version of Mortal Kombat when it came out a couple of years ago. I said no. I said no to renting the videotape, too. Then I let down my guard.
There on the couch right next to him was a stack of recent newspapers. One of them had a story about violent video games and the rating system developed by David Walsh of the Minneapolis-based National Institute on Media and the Family, a national resource center for research, education and information about the impact of the media on children and families. Later I looked up Mortal Kombat on his "Report Card." It flunked on practically all counts.
There was an "M" printed on the back of the game's box, but with no indication what "M" meant. I learned from my brother that "M" means "for mature audiences." When I explained to Robbie that the game would have to be exchanged for another with an appropriate rating, he put up some mild resistance but, much to my relief, displayed no Kung Fu moves. There appeared to be no lasting damage.
Soon he was safely out of the dark realm of Mortal Kombat and comfortably maneuvering some kind of spacemobile through a galactic morass.
The debate about the role of television violence in the development of children and adolescents has probably burned more energy than any other social debate of the second half of the 20th century. You could light a city with it.
Youth violence "is related to the world we have provided for our children to grow up in," wrote Bruce Perry of the Baylor College of Medicine and executive director of the CIVITAS ChildTrauma Initiative in his book Children, Youth and Violence: Searching for Solutions (Guilford Press, 1995). Today's world is "markedly different" from the one the brain has developed in for the past 20,000 years.
Just as the introduction of the printing press "allowed the percentage of literate (i.e. cortically-enriched, cognitively-capable) individuals to increase" and transform their society, Perry wrote, the introduction of television has had a similar revolutionary impact on the organization and functional capacity of the human brain.
By one account, the average child entering first grade has spent nearly 5,000 hours watching television. By the time that same child is 18, he or she will have spent more time watching television than attending school. These are 1980 figures, calculated about the same time the PC was being born. TV viewing by kids has slipped since then, down to an average of 90 minutes from two hours on a weekday back then. But it still occupies much more of a child's day than reading or schoolwork.
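The figures quoted above lend themselves to a quick sanity check. A minimal sketch, assuming (my assumptions, not the column's) roughly six years of life before first grade and 365 viewing days per year:

```python
# Sanity check on the 1980-era viewing figures quoted above.
hours_before_first_grade = 5_000  # figure cited in the column
years_before_first_grade = 6      # assumption: age at entering first grade

daily_hours = hours_before_first_grade / (years_before_first_grade * 365)
print(round(daily_hours, 1))  # 2.3 -- about 2.3 hours of TV per day

# The reported weekday drop, from two hours then to 90 minutes now:
drop_pct = (120 - 90) / 120 * 100
print(round(drop_pct))  # 25 -- a 25 percent decline
```

A bit over two hours a day from the crib onward is all it takes to bank 5,000 hours before the first school bell rings.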
The general implications are not yet clear, but for the relationship between mass media and youth aggression, from Bruce Perry's perspective "ominous signs abound."
The mountains of data brought to bear in the television violence debate have been gathered, processed and published largely by the social and behavioral sciences.
That is about to change.
In their article "Child Development and Neuroscience" (Child Development, October 1997) Charles Nelson and Floyd Bloom write that "a greater awareness of the neurobiological mechanisms that underlie behavior would improve our understanding both of behavioral and biological development." The time is right, they argue, for "closer alliances between studies of behavior and development."
The reason is straightforward. Never before has science had in its hands the tools it has today to see precisely how biology influences behavior and how behavior influences biology.
Nelson is a cognitive neuroscientist and distinguished McKnight professor in the University's Institute of Child Development. Bloom is a neuropharmacologist at the Scripps Research Institute in La Jolla, California and editor-in-chief of the journal Science, the prestigious weekly published by the American Association for the Advancement of Science.
They examine two emerging fields in neuroscience, neuroimaging and the basic sciences of neural development, that together offer the real possibility of getting to the bottom of matters concerning the brain.
The greatest advance in neuroimaging is functional magnetic resonance imaging, or fMRI. By tracking real-time changes in blood oxygenation and related metabolic activity, fMRI lets investigators lay out the neuroarchitecture of fundamental processes and systems. Watching which regions respond as a research subject lying in a high-field magnet views images, for example, investigators have mapped the human visual system in detail.
We know how we see, brainwise. We also know the neighborhoods of short-term memory. Now studies are taking shape that hold promise to reveal the location of the brain's emotional centers and how they are energized by graphic images and interactive media--by the likes of, say, Johnny Cage in Mortal Kombat IV beating some poor fool to smithereens. Nelson and Charles "Chip" Truwit, University associate professor of radiology and director of neuroradiology, have already identified brain regions activated when children ages 10-12 perform cognitive tasks (tasks of "working memory").
And where fMRI falls short, in telling us precisely when mental events occur as opposed to where in the brain they occur, a recording technique called event-related potentials (ERP) can be brought to bear. Firing synapses show up like minitremors on a seismograph.
The imagers and recorders have an awesome partner in the basic sciences, those neuroscientists looking at neural system development, neural plasticity, and the neurochemical mechanisms that influence behavior.
Our understanding of how the primitive brain emerges to produce what scientists like to call "higher-order mental processes" (and what we know from personal experience can also be lower-order ones) took a giant step forward with the rise of molecular biology. Once bogged down in what appeared to be "a hopelessly complicated series of events," developmental neurobiologists have moved quickly on two fronts with practical implications.
Some of the rules of brain development are encoded in terms like "experience-dependent synaptogenesis," which is where "plasticity" comes in. Synaptic fields wax and wane depending on experience.
"A good example is the information acquired by learning," Nelson and Bloom write. "Depending on a child's learning history, different information will be obtained and stored for use at a later time, giving rise to individual differences in knowledge base, memory skills, and so forth."
And synaptic fields are like other things in life, as in the adage "use it or lose it." If the activity that produced the connections drifts away, so do they.
The neurogenetic program is loaded with feedback, self-correction, and adaptation mechanisms from the day that the homeotic genes, the master controls, begin to produce the proteins that orchestrate the process by which embryonic sheets are folded and creased into a fully formed brain.
Research on the homeotic genes earned a Nobel Prize for Edward B. Lewis, a 1939 University graduate in biostatistics.
It's been a breathtaking ride. From decoding nature's origami act to revealing the mosaic of synaptic fields to illuminating the incunabula of thought, the "Decade of the Brain" is living up to its billing.
But with all we know now and will shortly learn, it could be that the harder challenge, for a society that purports to value its youth, lies just ahead, as in:
"The Decade of What We Plan To Do About It."
--William Hoffman email@example.com