I sometimes come across statements from autistic bloggers and others who describe the world as being divided into two kinds of people: autistics, who make up roughly 1 percent of the world's population, and "neurotypicals," who, according to this view, comprise the remaining 99 percent of the human species.
Although this tidy little view of the world may seem to make sense at first glance, like most black-and-white categorization schemes it turns out to be too simple: the real world is a messier and more complex place. The scientific evidence simply does not back up the popular notion that there is some sort of "typical" brain configuration shared by the vast majority of our species. On the contrary, neuroscientists and other researchers are discovering previously unrecognized cognitive differences all the time. Just as the autistic spectrum is itself a relatively recent concept, many other neurological variations were unknown to science before modern times, and probably many more remain unknown.
Last week I read an article about a research study that initially involved one woman who had extreme difficulty forming mental maps of her surroundings. Functional magnetic resonance imaging revealed specific differences in brain activity attributable to her condition. After information about her case was made public last month, about 60 other people—who presumably had gone through their lives up to that point as part of the "normal" majority—contacted the researchers to report that they had the same condition.
According to the article, "the woman seems completely normal in every other way… as if she didn’t pick up one item in the cafeteria line to create a full tray of cognition."
This cafeteria analogy represents yet another assertion of the common view that most people are "neurotypicals" with identically configured brains. It suggests that human cognitive development is like a school cafeteria in which all children are supposed to go through the same line and pick up the same set of items. But when we consider the vast extent of both the previously known and the newly discovered variations in human brain function, I think that a more accurate analogy would be a buffet with thousands of available food items from all over the world, such that a "full tray" can be any one of billions of different potential combinations.
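That "billions" is, if anything, an understatement. As a rough sanity check (the specific numbers here are my own illustrative assumptions, not anything from the article), suppose the buffet offers 1,000 distinct items and a "full tray" holds any 10 of them:

```python
import math

# Hypothetical numbers for illustration: a buffet of 1,000 items,
# with a "full tray" defined as any 10 distinct items from it.
items = 1000
tray_size = 10

# Number of distinct possible trays: "1000 choose 10"
combinations = math.comb(items, tray_size)
print(combinations)  # roughly 2.6 x 10^23 -- far beyond mere billions
```

Even under these conservative assumptions, the count of possible trays dwarfs "billions"; allowing trays of varying sizes would push it higher still.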
What does the buffet analogy imply for standards of typical social behavior? If what we call normality does not reflect the naturally occurring cognition of a vast, neurologically identical majority, then what does it reflect?
I believe that normality is the cultural midpoint where, in each generation, the behaviors of many different neurological types approach convergence. As such, consensus normality can change very rapidly as we alter our behavior in response to new technology and other changes in our environment. Its boundaries also can be much wider or narrower at various times in history and in different parts of the world, depending on how much tolerance a particular culture shows toward human diversity in general.
A provocative new study suggests that Internet use may be physically changing the brain circuitry of modern humans and rapidly altering our social behavior. "As the brain evolves and shifts its focus towards new technological skills," contends the study's author, Gary Small of UCLA's neuroscience department, "it drifts away from fundamental social skills." The study reported that regular Internet users showed twice as much signaling in brain regions responsible for decision-making and complex reasoning, compared with those who had limited Internet exposure. The latter group displayed superior ability in reading facial expressions.
Of course, the idea that reading facial expressions is a "fundamental social skill" is itself a culturally dependent characterization. We should not assume that this ability was hard-wired into the vast majority of human brains in past generations. In my opinion, it's more likely that those who lived in urban areas and regularly interacted with large numbers of people had a much better understanding of facial expressions than those who lived as peasants in small villages and spent most of their time working in the fields.
The article closes with the caveat that "modern technology, and the skills it fosters, is evolving even faster than we are… What the future brain will look like is still anybody's guess." Thus, when making policy decisions about which social behaviors to encourage in today's children, I contend that we must be careful not to perpetuate an overly narrow concept of normality that fails to appreciate the potential contributions of neurological minorities to our society.