Sunday, April 15, 2012

And the Walls Came Down, bah bum...

…. all the way to Hell... (Traveling Wilburys' Tweeter and the Monkey Man, 1988)

I finished a book a few months back, Reading in the Brain, by Stanislas Dehaene. It was an eternal slog, begun in the brief moments of respite I had while fulfilling my duties as summer school director in the district where I work, and continued through daily, 15-minute silent reading times throughout the school year. It was a tedious book, in some regards, at least to a lay person such as myself, far more interested in the implications of his findings (for reasons of both personal and professional curiosity) than in the minutiae of how he came to them.

Fortunately, the reader's patience was well rewarded, because the implications of Dehaene's findings are both fascinating and profound. He begins from a readily observable fact- children acquire spoken language automatically, with almost no direct instruction, yet learning to read requires years of intensive, deliberate instruction and practice. Clearly, the brain is primed to acquire one of these skills, while the other can only be managed with sustained, deliberate and, often, frustrating effort. As the father of an emerging reader, and as a professional charged with teaching reading to a population whose background is one of almost universal illiteracy, I can attest to the significant difference between the effort required to master these two distinct aspects of language, as I am sure can many others.

The readily observable differences in the "acquirability" of these two modes of language usage lead Dehaene to posit, without having to make too much of a leap, that while we have evolved to use spoken language (and all the evidence suggests it has been in use by the genus Homo for hundreds of thousands of years, if it does not pre-date our genus altogether), reading is a relatively new cultural artifact, a mere few thousand years old, which evolution has not had time to adapt our brains to. Thus, while the mechanisms for spoken language are built into the brain, to acquire reading, one must co-opt and modify other parts of the brain which weren't "designed" for reading at all.

But which parts? Well, this is what Dehaene, as a neuroscientist, is most interested in, and what he devotes most of the book to. However, for the lay reader, a lot of hard science concerning neurons and synapses and Broca's region and the prefrontal cortex gets a little wearisome. So for our purposes, only two of his findings concern us. First, that reading in any language utilizes almost exactly the same regions of the left hemisphere of the brain. (In a healthy, expert reader- though he spends a lot of time on what happens when a reader isn't these things.) This is true in alphabetic languages, such as English or French, as well as symbolic languages, such as Chinese and Japanese. Secondly, the neurons that the brain co-opts for reading, in an expert adult reader, are taken from other visual areas. In short, no matter what language a child is learning to read, the brain repurposes neurons from other visual processes to do so.

It's the answer to the next question that I found particularly fascinating- which parts of the brain's visual toolbox are co-opted for reading? Dehaene suggests that, over time, languages, especially alphabetic ones, have evolved to be readily identifiable and accessible to the primate brain (more on other primates later). In other words, the letters of the alphabet, probably the most efficient form of written communication in the world, consist of shapes that would have stood out in a primate's three-dimensional visual brain. It is easy, when looking at letters such as X, O, I, S, T or Y (and practically any other), to see shapes in the 3D visual field which correspond to them. More complex letters, such as R, are more likely compounds of several easily recognizable shapes.

This is part of his answer. The other part points to evidence that people who live in hunter-gatherer societies, and who are illiterate, are able to "see" signs in the natural world, in the context of hunting, for example, that a literate person, however much training they receive, simply cannot see. Some of this evidence is circumstantial and anecdotal, but it nevertheless makes a lot of sense, and is backed by a fair amount of harder evidence. Learning to read necessarily means losing the ability to see other aspects of the visual world, as the neurons that are normally designated for those aspects are co-opted for the task of reading.

So what do neurons have to do with an 80s supergroup? This: a study that shows how readily adaptable the primate brain is to the visual recognition of letters. Coming across this article this morning, it occurred to me just how well it would have fit into Dehaene's work. But there are larger implications for findings such as these.

The wall between us and other species is coming down. Almost daily, one can point to a new piece of evidence that something we once thought of as exclusively human is actually a trait or ability we share with many other species, often our cousins the other primates, but others as well. Some of the implications of this are obvious, others are more subtle.

One implication is so obvious that I barely need to mention it: as it becomes evident that we differ from other species merely in degree, not in kind, there is less and less (read: no) reason to speculate that we have some divine or eternal soul instilled in us to explain our mental abilities. Our brains are just the same as any other brains on the planet, just very highly adapted for the particular purpose of negotiating the complex social interactions that come with being human, among other things.

The crumbling of this wall also has significant, I would say devastating, consequences for those who worship "culture" or "the social" as the ultimate arbiter of all human experience. The adherents of this faith, who refuse to abandon the patently absurd notion that only Nature or (for them) only Nurture can explain human behavior, rely on this fictitious distinction between human beings and other animals as the jumping-off point for all the subsequent branches of their "theories": Marxism, radical feminism, race theory, queer theory, etc., as well as for much fruitless work in anthropology, psychology and other social "sciences," whenever they choose to ignore the origins of the very subject they are studying- human beings.

Science moved beyond this argument long ago, recognizing that while our genes endow us with possibilities, culture determines a great deal of their manifestation. Reading, as Dehaene demonstrates, is a prime example of this. Our evolutionary history has given us brains that, with the right instruction, are capable of learning symbolic written language. In every healthy human being who acquires this skill, it occurs in the same place in the brain, is acquired at roughly the same rate, moves through the same stages and transforms our brains in the same predictable way. This is genes. What language we learn to read, and, indeed, if we learn to read at all, this is culture.

This is why the article linked above caught my attention. Because in this study of baboons, the clear beginnings of the eventual human tool of reading are manifest. The most logical explanation for our shared ability with baboons to accurately recognize the same written visual shapes is that a common ancestor of ours also had this ability. (The only other explanation that fits the evidence is that it evolved separately, which is unlikely in two so closely related species.) This pushes the emergence of all the underpinnings of "culture" way back into our evolutionary past, far back beyond the emergence of our species. (Of course, this is not the only evidence for that; the increasing evidence for morality, communication, tool use, self-consciousness, and even "culture" in other, non-human, species is all part of this argument as well.)

As mentioned above, the sciences that are concerned with these questions- biology, evolutionary theory, genetics, neuroscience, etc.- have long moved past the Nature/Nurture question, to the obvious answer: Both. This is also true of the leaders in any field that had traditionally been on the Nurture side of the debate: psychology, linguistics, anthropology, etc. However, there are still a large number of students at less-than-leading universities being taught by professors who are stuck in the middle of the previous century, and this trickles out into the population at large. I'm not entirely sure why it is so essential to some in the social sciences to resist the truce that the hard sciences declared so long ago. Perhaps it is a (justified) fear that many of their traditional methods of inquiry will be exposed as faulty, or rather that acknowledging that this occurred long ago will be embarrassing.

Or perhaps it is from the (totally unjustified) fear that accepting evolution's inescapable role in shaping aspects of our nature means that any and all of those aspects are “right” because they are “natural.” This is largely due to what might be called the “Whole Foods Fallacy”- the pervasive notion in our society that what is “natural” is automatically “good.” This is so obviously erroneous that it can be dismissed with two words- scorpion sting.

Despite the fact that there is no more intrinsic connection between what is "natural" and what is "good" than there is between "might" and "right," this doesn't stop most people's brains from making the false connection between what they value in what they put in their body and the wider world as a whole. And this leads people to a difficult position- if what is "natural" is "good," but this guy is saying things like male sexual opportunism are "natural," he must be saying that they are "good." Ergo- either I must accept something which I find distasteful as "good" or I must dismiss it as not being "natural." Despite the fact that choosing the latter option requires dismissing mountains and mountains of evidence, many people choose it over letting go of one simple, false tautology.

Of course, what is missed by those who choose the latter option above, the dismissal of the evidence uncovered by natural science, is the root of their own disgust. How is it that we are all almost universally disgusted by philanderers, yet somehow "culture," and culture alone, is also responsible for their very existence, and if we could just change our "society" they would all go away? Or murderers? Or rapists? Or racists? Or religious fanatics?

The evolutionary explanation does not run into this contradiction. In an evolutionary context, it is very easy to see how an individual can be inclined to act in a way that is counter to the social norms of their tribe, norms that they themselves would reinforce if someone else were the cheat. But enough digression...

At this point, it is impossible to ignore, without intentional self-deceptive blindness, the fact that we Homo sapiens are more similar to other species than we are different. We share with them concepts of right and wrong, fairness, the in-group/out-group dichotomy, tool use, mathematics, language, communication, culture. It is no longer possible to build a wall between "us" and "them" and claim that what is true of them is not true of us, or vice versa. They have instincts, so do we. They have the capacity to learn, and so do we, just to a much greater degree. Our capacity for culture and learning has been driven by the frenzy of sexual selection, to the point where our brains have so many extravagant bells and whistles that they seem almost a different beast than those of the beasts, but this is merely one of the illusions created by having the human equivalent of a peacock's tail in your skull. (I count 17 mixed metaphors in that last sentence...)

We no longer get to use “souls” or “culture” to pretend that we are playing by different rules than any other species on the planet. Our evolved ability to transmit culture- ideas, both true and false, our learnings about the natural world, stories of our own histories- to one another, across both space and time, has made us very, very good at playing this game, and made some of the rules other species have to concern themselves with almost irrelevant. But most of those rules are still those same concerns that trouble our minds on a minute-by-minute basis: finding dinner, finding mates, acquiring status, protecting it, raising offspring.

It is honestly silly, at this juncture of our understanding of the natural world, to continue to act as if our minds were somehow implanted in us from On High, regardless of what you wish to name the source. We scratched and clawed our way out of a primordial puddle, just like the rest of the gang.

Let's stop lying to ourselves about it.


  1. Yes.

    I have many thoughts on this subject, but for now let it suffice that "Whole Foods Fallacy" is genius.

    And that I hope I didn't miss school vacation week.

  2. Why thank you. I'm glad you approve.

    And no you did not. It is this coming week (like starting tomorrow.) Coffee? Wed.?

  3. Very interesting. I have long been aware of the fallacy of an us/them paradigm when it comes to other species, but was reminded anew at just how ludicrous it is, despite its prevalence, when visiting the Natural History Museum with kiddo recently. As we walked through a display on the ancestors of modern humans and our close cousins who are now extinct, kiddo repeatedly asked, "Is this a human or an animal?" (She knows humans are animals; she meant "another animal.") The answer of course was simultaneously "both" and "neither."

  4. I've been generally under the impression that the Whole Foods Fallacy is rather that something which grows with very little effort in the dirt of your yard should cost as much as the hourly wage of the person who sold it to you.

    That being said, sure. Though the value judgment of natural/unnatural isn't a mere connotation in English (or French, or German); it's one of its denotations. There'd be a lot of mass re-education required to separate out "good/normal" from natural, and particularly with "yeah, naturally."

    I feel like it was you who first pointed out to me that nature is a vast canvas upon which anything can be inscribed, and so "nature" is not to be trusted when introduced into any argument (perhaps it was Max). I'm certain I extended it to this: all which occurs in nature is therefore part of nature, all which humans do is therefore part of the sum total of human nature.

    I appreciate the subtle nod to my recent post (though your comments were sorely missed), but I'm afraid you're missing something which actually would significantly help your sort of science take hold. That is, our knowledge derived from observations (and testing and experimentation), when coalesced or organized into theories (I mean no negative connotation there), becomes a social truth. It's unnecessary for every person who agrees with you to themselves perform all the experiments to reach the same conclusion; they need only embrace the whole as a social artifact for it to function in the realm of truth.

    The function or usefulness or importance of the social "sciences" (the last person I saw who put that in quotes was incredibly right-wing, so forgive my wince) is that they follow the production of truth. History uncomfortably reminds us how scientific consensus itself changes and is influenced by political and social utility and pressure, because even the most aloof scientist exists also as a social and political subject. Race was accepted as a scientific certainty for decades, and there's significant pressure to resurrect it (Jensen, Lynn, Herrnstein, et al).

    I'm not sure there was any such "truce" declared, as humanities programs continue to be defunded in favor of the bio-sciences (U of W receives staggering grants from bio-tech firms to produce more scientists who will turn more profits from their research). The social sciences become less useful in the realm of truth production when they stand in the way of such forces. It's charming to consider this struggle as merely another step in our march toward enlightenment, but it obscures much more than that.

  5. My lack of comments on your post was due to my failing to understand the purpose of holding up internet commenters as one side of any argument to then knock down.

    I put the "science" in "social sciences" in quotes when it is not conducted as science at all, aka Margaret Mead, etc. When it is conducted as science, I speak of it as such.

    Yes, there is a social aspect to science, of course. But the point is, yes, you could, in principle, trace back these "truth statements" to something you yourself could empirically witness, even if it is difficult in practice, and may require years of study to truly comprehend. On the other end of the spectrum is a statement such as "Marx codified it as: What is Social is Real," which, when one pushes and pushes, opening the final door just reveals the man himself sitting in a room saying, "Because I say so." And in that difference is the world.

  6. It is interesting, and somewhat disheartening to learn that we sacrifice some other way of seeing the world when we learn to read. I suppose it confirms the statements by Russell Means about rejecting writing and defending oral culture.
    I am sure that Dehaene's findings will have interesting implications for literacy programs; it is fascinating that the debate between the hard and soft sciences seems to now include the debate between the phonics and whole language methods of literacy education.
    To be fair, I am a second-rate "soft scientist". As such, I worry that this might be another episode of the Bell Curve, or some other form of Social Darwinism. After all, defining the limitations of human capacity, and even the capacity of some humans over others, has historically been the preoccupation of the hard sciences when they try to explain the human condition. Which is why the "soft sciences" arose in the first place. It was because of the devastating popularity of Social Darwinism that Franz Boas created the discipline of Anthropology as a "Soft Science"; one that asked people what they thought, rather than measuring their skulls. Boas trained a whole generation of thinkers including Alfred Kroeber, Zora Neale Hurston, and Margaret Mead, who you referenced above.
    Mead's work, though embattled, is useful here. Her "Coming of Age in Samoa" may have been methodologically problematic, but her main point, that "teenage angst" is unique to certain, usually post-industrial cultures, has been upheld in multiple recent field studies.
    I bring this up because we know that culture plays a huge role in shaping adolescence, and by culture I don't mean "peer pressure", I mean the pressure of the nuclear family, the vast array of choices and the stakes involved in those choices, and even how our society sees adolescence (though Mead said it much more eloquently). Yet none of this is mentioned in popular media concerning adolescent rebellion. Instead, we get the hard science short answer: this problem exists because we are "hardwired" that way.
    This isn't to say that understanding how the brain works and grows isn't useful to understanding human problems, but our society tends to give undue reverence to explanations which allow us to bypass any interrogation of our culture, how it works, and how it influences our behavior, and so we miss the entire "subjective" dimension of the issue.
    (more below)

  7. At stake here is what it means to be human. As you point out above, scientists continue to discover new similarities we share with animals (though this should have a greater effect on what it means to be animal, rather than human). More importantly, they debunk mythical differences between humans and animals. What this means is that empirically, we are pretty much the same as animals: We want to eat, find a mate, procreate, make war etc.
    Consequently, things which we tend to take for granted as part of our experience are being proved by scientists to be subjective rather than scientific. Now things like love, the self, the desire to create art, and even culture are sitting next to the gods and spirits in the "there is no scientific evidence for this" pile (what fascinates me about this is whether one day people will actively have to choose to believe in the self, the same way someone who believes in a god today must actively believe, but this has more to do with ideology). As a people, we will have to choose whether to accept the subjective as part of who we are, or reject it as mythical.
    What all of this has to do with reading, is that reading -- like everything else -- is a scientific and subjective process. Yes, reading is about recognizing letters and associating them with pictures or sounds. Reading is also about transforming a series of tiny symbols into a world, a voice, a perspective, and a cosmology, one that is actually different from your own, one that you might even disagree with, but you still built it when you looked at those letters.
    Finally, reading is the process by which the soul is forged. We are all aware of the subjective process by which a reader, through a lifetime of interacting with the written word, shapes who they are as a person. I would be interested to see how Dehaene would explain this process, using only data from scientific evidence, and if his description would be sufficient.

  8. Max? As in Max Max?

    I would be less "worried that this might be another episode" of this, that or the other, and more worried that we would ever admit our "worry" into the conversation surrounding which explanation might best fit the facts. Introducing a field to counter the "popularity" of an idea is precisely the problem. What I am concerned with is the most accurate way to describe the world- which way best fits observation and experience, and allows us to make the best predictions about the future. If the answers to those questions are distasteful to the way I feel things "should be," then so be it. However, I recognize that others are more concerned with other... things.

    I don't think anyone ever suggested culture DIDN'T shape our experience or development. Quite the opposite, in fact.

    And as for the statement that "our society gives undue reverence to explanations which allow us to bypass interrogations of our culture..." Are you freakin' kidding me? What else do "we," as a society, talk about? Go onto Google News and look for every article that catalogs a misdeed, or some pattern of human behavior. 99/100 will offer SOME suggestion that culture- "our" culture, "their" culture, the perp's upbringing, whatever- was a factor in what happened. Shooting on a college campus? Do we get discussions about the necessarily violent natures of our paleolithic ancestors? No, we get a discussion of American gun culture, and amateur psychoanalytical guesswork as to the shooter's upbringing and recent background.

    I'm not suggesting that this is wrong, I think it is entirely appropriate. But to say that "we bypass interrogations of our culture" is absolutely disingenuous.

    As for the "self" being tossed in the pile with gods and spirits- I have my doubts. Science has certainly called it into question, and really, dismantled it, but precisely due to what it is- the grand illusion of our existence, yet the point of view from which we experience everything- I don't think it is possible we will ever "experience" it as not existing, even if we comprehend it on an intellectual level. It is the converse of quantum mechanics- even though we understand it on an intellectual level, we will never "accept" it on an intuitive one, we just aren't evolved to comprehend things on those scales. The self is the opposite. Every fiber of our being is evolved to experience the world in that way.

    As for bandying around terms like "souls"- you've engaged in a discussion I can't participate in. You're speaking of something which you can't point to in the world where you and I can both agree on its features, and thus what we have to say to each other about it would be utterly meaningless. You're welcome to your own opinion of it, but a conversation about it cannot be had.