Friday, May 11, 2012

Going to Our Power

Teaching is an instructive experience. In the four years I have been involved in the profession, I have found that each year has brought its own lessons. It is cliché at this point to say, "The students teach me as much as I teach them," but I wouldn't go so far as that, and I wonder about the life experience of anyone who could say it in earnest. No, adolescents are what we thought they were, what we remember ourselves being: petty, over-dramatic, selfish, ignorant, inquisitive, insightful, kind, and fearless. They are the adults they will grow up to be, minus a measure of self-control, wisdom and foresight. They have something to teach, but much, much more to learn.

It is easy, as a teacher, to look at a classroom and see tomorrow's winners and tomorrow's losers. It is also utterly unacceptable to do so, lest you contribute to a self-fulfilling prophecy, though most of us, like me, are far from perfect and do contribute our unfair share. But it is easy to look at a roomful of adolescents and see society writ small, segments of the population each casting their representatives back from the future into a tidy microcosm. And in this microcosm, it is easy to put humanity under a microscope, which is an unfailingly instructive experience.

This year's lesson was about Power. Not the power inherent in relationships of authority, such as between a teacher and student, though these are instructive. No, I use "Power" here in a newer, more encompassing sense, as I have yet to come up with a better word. But before I define the term as I intend to use it, I think a narrative about how I came to think about it in this way will prove helpful.

I began the year with a group of three young men in one of my classes. The boys were friends, and began the year with a reputation for causing trouble, some of which I had witnessed the year before. One in particular I had hauled off to the principal's office on more than one occasion, yanking him out of a pregnant fellow teacher's classroom after he'd flipped several desks over in a fit of screaming rage, while he swore at and threatened me the whole way down. So, needless to say, I wasn't totally psyched at the thought of having them in my room 1.5-2 hours a day.

But I prepped myself to begin with a clean slate, which they all deserved (god knows I needed a few myself), and we started the year off pretty well. A few days in, I pulled the desk-flipper aside and said, "Look, I know we had a tough time last year, but I want to forget about that. Because I also know that the other students look up to you and imitate you. I want you to think about what kind of person you want them to see you as, and decide whether you want to use the influence you have with them for good things or for bad things." He assured me he wanted people to think of him as a good person and we left it at that.

A few weeks later, things were going well for him, but another of his friends was slipping: doing no work, sleeping through the entire day, and, shortly, blowing up at teachers who tried to correct his behavior, which led pretty quickly to detentions, parent meetings, suspensions, more frustration, more trouble, etc. He started missing more and more school, and was rarely back for more than a day or a morning before he had sucker-punched another kid in the hallway and was looking at another mandatory three-day suspension for violence. Then, as he started to miss more school, so did his pals- unexcused tardies and absences, detentions for cutting class, skipping these, getting suspended, etc. Pretty soon, it was a safe bet that if one was out, all three were out, roaming the streets... and not to plant flowers or help old ladies at the crosswalk.

And although each of these boys is a capable student, even a bright one, their productivity in class began to slip just as rapidly. Most of their time was spent staring out the window, or trying to get away with sleeping, and any free moment they had was spent whispering with their cohorts, laughing, presumably, at memories and tales of their own hijinks. And the vicious cycle that is inevitable in these situations set in- the more class time they missed, the further behind and more lost they were when they returned, which leads to embarrassment at not being able to do the work one's peers are doing, which leads one to seek a way out of the painful situation, and the easiest way to do that is to act badly as a distraction, or simply to act so badly that one is asked to leave the classroom altogether.

This is an unfortunately common situation in public education, and we spend most of our days fighting against it, though there are times when it is easy to feel like a small Dutch boy. During the first part of the year, I devoted more time, energy and concern to these three boys than I did to my other three dozen students combined. But at some point, that in itself is an egregious sin, and one has to recognize which students are ready to be helped and which will fight you no matter what you do, and give your time and energy and care to those equally, perhaps more, deserving students who are willing and eager to do whatever it takes to better themselves.

And so the boys slipped into the periphery of my attention. But the pattern continued.

Before this, I felt I had utterly failed to reach only one student into whom I had put a concerted effort. I tend, unconsciously, to focus on the hardest-nut boys, and the bright girls with low self-esteem. But here were three I had put a great deal of time into reaching, all sitting in my room, staring out the window, just waiting for the opportunity to pretend to work together on an assignment so they could plot the next "enemy" to jump and pummel, or misdemeanor to commit (allegedly).

This caused me to reflect. Why were they so unreachable? Why were they so utterly, absolutely disengaged from what we were doing, which was at least sometimes interesting (I hope), and even seemingly uninterested in what their peers were up to?

I began to realize that they simply weren't even there. When forced to sit quietly, they did the best they could to mentally remove themselves from this unpleasant situation- they stared out the window. Better than that was when they managed to get together to discuss, relive and plan their misadventures. But best of all was actually being "out there," on the streets, beating kids up (allegedly), thieving (allegedly) and dealing (allegedly). They caused some trouble in school, sometimes a great deal, but in my room they knew they would get away with absolutely nothing. And so they just sat. And waited it out. 

Out there, they were powerful. In my room, they were powerless. So they "left," in whatever way was available to them- physically, through speaking of memories and designs, or at worst, in their own daydreams and imaginations. 

As I reflected on this aspect of our nature, I began to see it everywhere. In students, observing an impulse is easy, since they are subject to nearly every whim of emotion that takes them. I saw it in the girl who was powerless before a difficult reading assignment, but powerful enough to draw the attention of several boys by wearing a tiny T-shirt. I saw it in the boy who was powerless to impress this girl with his not-yet-developed masculinity, but powerful enough to draw a laugh when he acted the fool in front of the class, drawing my ire. And I saw it in the new girl, powerless to bridge cultural, linguistic and religious divides with her peers, but powerful enough to take on her school work with commitment and determination.

This is what I mean by Power. The feeling of competence, capability and influence that we all have in some areas of life, but not in others. 

And I began to think about myself, and my own life. Where do I feel powerful? On here, writing about things I am knowledgeable about. In a discussion or argument, especially about politics, philosophy, science, history or some other academic question that I feel more knowledgeable about than the average lay person. So I provoke them, and carry them on in my head when alone. With women. So I flirt with them, and think about them often. In a video game, where a click of the mouse can blast a zombie's head into a million pieces, or swing a massive sword, or summon a rain of destruction from the sky. So I game, and think about games often. In a kitchen, with a decade of professional experience behind me. So I love to cook, and start to think about dinner right after lunch.

Where am I not powerful? On the dance floor. So I avoid it like the plague, or as Johnny Depp says, "I'd rather eat a bag of hair than dance."

I once went to a horse farm with a woman I worked with who had ridden her whole life. Our plan was to rent a couple of horses and ride them on some mild trails up a small nearby mountain. I had never ridden a horse in my life. We showed up, and I, the nerdy city boy driving in from Seattle, with Proust in his bag and in soft leather hiking boots, did not feel powerful around these men with dirt to the top of their stiff cowboy boots, hardened, weathered faces and an unflinching calm around these enormous animals that I was supposed to get on and ride.

Well, if you've ridden a horse, you know that they know exactly whether you are feeling Powerful or not, probably better than you do, and this one knew immediately that I hadn't the slightest idea what I was doing. He took off at a gallop and ran me in circles in a nearby meadow for about 5 minutes (it felt like 5 hours), and several times I thought he was going to toss me. I did finally get him under control and we had a somewhat pleasant time until my gracious mount decided he needed to stop and piss on the trail and wasn't going a step further.

We all have places we "go" that make us feel powerful. When I was young it was God. By adolescence, I had trained my faith to the point that whatever was troubling me could be quickly subdued by a few moments' reflection on God and his infinite grace and all he had to offer me. Lots of people "go" here. Others "go" to another world where they are mighty, through a character in a role-playing or video game. People turn conversations from topics they aren't knowledgeable about to ones they are. They read and learn about things they already know a great deal about. They avoid a dysfunctional marriage by going to work, where they have subordinates, and where they brought in a really good return for the company last quarter. They call in to sports-talk radio to share their take, instantly forcing themselves upon the attention of thousands.

This is Psych 101. Obviously, nothing I am saying is remotely original or revolutionary. However, as I began to perceive our behavior in this way, it seemed easier and easier to categorize the majority of our thoughts, speech and actions as a constant attempt to go toward our power- to extract ourselves, whenever possible, from situations where we have little competence and influence, and to move to situations where we do.

And most of the time, this is okay. No one wants to feel powerless, nor should they. But when we think of all the things that we avoid for fear of showing incompetence or weakness- stepping onto a dance floor, challenging someone more knowledgeable when we disagree with them, reconsidering our own long-held and cherished beliefs, picking up a challenging book, beginning a new sport or activity- we see that we are missing out on a staggering amount of life. 

We pick and choose things we show a natural strength for (usually) and we stick with them. We dig holes of our own expertise. Some of us dig them wide, and some of us dig many. But nevertheless, we dig and we dig, and the further down we go, the harder it is to get a broad view of the sky. 

Sunday, April 15, 2012

And the Walls Came Down, bah bum...



... all the way to Hell... (Traveling Wilburys, "Tweeter and the Monkey Man," 1988)

I finished a book a few months back, Reading in the Brain, by Stanislas Dehaene. It was an eternal slog, begun in the brief moments of respite I had while fulfilling my duties as summer school director in the district where I work, and continued through daily, 15-minute silent reading times throughout the school year. It was a tedious book in some regards, at least to a lay person such as myself, who was far more interested in the implications of his findings (for reasons of both personal and professional curiosity) than in the minutiae of how he came to them.

Fortunately, the reader's patience was well-rewarded. Because the implications of Dehaene's findings are both fascinating and profound. He begins from a readily observable fact- children acquire spoken language automatically, with almost no direct instruction, yet learning to read requires years of intensive, deliberate instruction and practice. Clearly, the brain is primed to acquire one of these skills, and the other can only be managed with sustained, deliberate and, often, frustrating effort. As the father of an emerging reader, and as a professional charged with teaching reading to a population whose background is one of almost universal illiteracy, I can attest to the significant difference between the effort required to master these two distinct aspects of language, as I am sure can many others.

The readily observable difference in the "acquirability" of these two modes of language use leads Dehaene to posit, without having to make too much of a leap, that while we have evolved to use spoken language (and all the evidence suggests it has been in use by members of Homo for hundreds of thousands of years, if it does not pre-date our genus altogether), reading is a relatively new cultural artifact, a mere few thousand years old, which evolution has certainly not had time to adapt our brains to. Thus, while the mechanisms for spoken language are built into the brain, to acquire reading, one must co-opt and modify other parts of the brain which weren't "designed" for reading at all.

But which parts? Well, this is what Dehaene, as a neuroscientist, is most interested in, and what he devotes most of the book to. However, for the lay reader, a lot of hard science concerning neurons and synapses and Broca's region and the prefrontal cortex gets a little wearisome. So for our purposes, only two of his findings concern us. First, that reading in any language utilizes almost exactly the same regions of the left hemisphere of the brain (in a healthy, expert reader- though he spends a lot of time on what happens when a reader isn't these things). This is true of alphabetic languages, such as English or French, as well as of logographic writing systems, such as Chinese and Japanese. Second, the neurons that the brain co-opts for the purposes of reading, in an expert adult reader, are taken from other visual areas. In short, no matter what language a child is learning to read, the brain co-opts neurons from other visual processes to do so.

It's the answer to the next question that I found particularly fascinating- what parts of the brain's visual toolbox are co-opted for reading? Dehaene suggests that, over time, languages, especially alphabetic ones, have evolved to be readily identifiable and accessible to the primate brain (more on other primates later). In other words, the letters of the alphabet, probably the most efficient form of written communication in the world, consist of shapes that would have stood out in a primate's three-dimensional visual brain. It is easy to see, when looking at letters such as X, O, I, S, T, Y and practically any other, shapes in the 3D visual field which correspond to these letters. More complex letters, such as R, are more likely compounds of several easily recognizable shapes.

This is part of his answer. The other part points to evidence that people who live in hunter-gatherer societies, and who are illiterate, are able to "see" signs in the natural world, in the context of hunting, for example, that a literate person, no matter the amount of training, simply can't see. Some of this evidence is circumstantial, and anecdotal, but it nevertheless makes a lot of sense, and is backed by a fair amount of harder evidence. Learning to read necessarily means losing the ability to see other aspects of the visual world, as the neurons that are normally designated for those aspects are co-opted for the task of reading.

So what do neurons have to do with an '80s supergroup? This: a study that shows how readily adaptable the primate brain is to the visual recognition of letters. Coming across this article this morning, it occurred to me just how well it would have fit into Dehaene's work. But there are larger implications for findings such as these.

The wall between us and other species is coming down. Almost daily, one can point to a new piece of evidence that something we once thought of as exclusively human is actually a trait or ability we share with many other species, often our cousins the other primates, but others as well. Some of the implications of this are obvious, others are more subtle.

One implication is so obvious that I barely need to mention it; that as it becomes evident that we differ from other species merely in certain degrees, not in any kind, there is less and less (read: no) reason to speculate that we have some divine or eternal soul instilled in us to explain our mental abilities. Our brains are just the same as any other brains on the planet, just very highly adapted for the particular purpose of negotiating the complex social interactions that come with being human, among other things.

The crumbling of this wall also has significant, I would say devastating, consequences for those who worship "culture" or "the social" as the ultimate arbiter of all human experience. The adherents of this faith, who refuse to abandon the patently absurd notion that only Nature or (for them) only Nurture can explain human behavior, rely on this fictitious distinction between human beings and other animals as the jumping-off point for all of the subsequent branches of their "theories": Marxism, radical feminism, race theory, queer theory, etc., as well as much fruitless work in anthropology, psychology and other social "sciences," when they choose to ignore the origins of the very subject they are studying- human beings.

Science moved beyond this argument long ago, recognizing that while our genes endow us with possibilities, culture determines a great deal of their manifestation. Reading, as Dehaene demonstrates, is a prime example of this. Our evolutionary history has given us brains that, with the right instruction, are capable of learning symbolic written language. In every healthy human being who acquires this skill, it occurs in the same place in the brain, is acquired at roughly the same rate, moves through the same stages and transforms our brains in the same predictable way. This is genes. What language we learn to read, and, indeed, if we learn to read at all, this is culture.

This is why the article linked above caught my attention. Because in this study of baboons, the clear beginnings of the eventual human tool of reading are manifest. The most logical explanation for the fact of our shared ability with baboons to recognize with accuracy the same written visual shapes is that a common ancestor of ours also had this ability. (The only other explanation that fits the evidence is that it evolved separately, which is unlikely in two so closely related species.) This pushes the emergence of all the underpinnings of “culture” way back into our evolutionary past, far back beyond the emergence of our species. (Of course, this is not the only evidence for that; the increasing evidence for morality, communication, tool use, self-consciousness, and even “culture” in other, non-human, species are all part of this argument as well.)

As mentioned above, the sciences that are concerned with these questions- biology, evolutionary theory, genetics, neuroscience, etc.- have long moved past the Nature/Nurture question, to the obvious answer: Both. This is also true of the leaders in any field that had traditionally been on the Nurture side of the debate: psychology, linguistics, anthropology, etc. However, there are still a large number of students at less-than-leading universities being taught by professors who are stuck in the middle of the previous century, and this trickles out into the population at large. I'm not entirely sure why it is so essential to some in the social sciences to resist the truce that the hard sciences declared so long ago. Perhaps it is a (justified) fear that many of their traditional methods of inquiry will be exposed as faulty, or rather that acknowledging that this occurred long ago will be embarrassing.

Or perhaps it is from the (totally unjustified) fear that accepting evolution's inescapable role in shaping aspects of our nature means that any and all of those aspects are “right” because they are “natural.” This is largely due to what might be called the “Whole Foods Fallacy”- the pervasive notion in our society that what is “natural” is automatically “good.” This is so obviously erroneous that it can be dismissed with two words- scorpion sting.

Despite the fact that there is no more intrinsic connection between what is "natural" and what is "good" than there is between "might" and "right," this doesn't stop most people's brains from making the false connection between what they value in what they put into their bodies and the wider world as a whole. And this leads people to a difficult position- if what is "natural" is "good," but this guy is saying things like male sexual opportunism are "natural," he must be saying that they are "good." Ergo: either I must accept something which I find distasteful as "good," or I must dismiss it as not being "natural." Despite the fact that choosing the latter option requires dismissing mountains and mountains of evidence, many people choose it over letting go of one simple, false equivalence.

Of course, what is missed by those who choose the latter option above, the dismissal of the evidence uncovered by natural science, is the root of their own disgust. How is it that we are almost universally disgusted by philanderers, yet somehow "culture," and culture alone, is also responsible for their very existence, and if we could just change our "society" they would all go away? Or murderers? Or rapists? Or racists? Or religious fanatics?

The evolutionary explanation does not run into this contradiction. In an evolutionary context, it is very easy to see how an individual can be inclined to act in a way that is counter to the social norms of their tribe, norms that they themselves would reinforce if someone else was the cheat. But enough digression...

At this point, it is impossible to ignore, without intentional self-deceptive blindness, the fact that we Homo sapiens are more similar to other species than we are different. We share with them concepts of right and wrong, fairness, the in-group/out-group dichotomy, tool use, mathematics, language, communication, culture. It is no longer possible to build a wall between "us" and "them" and claim that what is true of them is not true of us, or vice versa. They have instincts, so do we. They have the capacity to learn, and so do we, just to a much greater degree. Our capacity for culture and learning has been driven by the frenzy of sexual selection, to the point where our brains have a lot of bells and whistles so extravagant that they seem almost a different beast than those of the beasts, but this is merely one of the illusions created by having the human equivalent of a peacock's tail in your skull. (I count 17 mixed metaphors in that last sentence...)

We no longer get to use “souls” or “culture” to pretend that we are playing by different rules than any other species on the planet. Our evolved ability to transmit culture- ideas, both true and false, our learnings about the natural world, stories of our own histories- to one another, across both space and time, has made us very, very good at playing this game, and made some of the rules other species have to concern themselves with almost irrelevant. But most of those rules are still those same concerns that trouble our minds on a minute-by-minute basis: finding dinner, finding mates, acquiring status, protecting it, raising offspring.

It is honestly silly, at this juncture of our understanding of the natural world, to continue to act as if our minds were somehow implanted in us from On High, regardless of what you wish to name the source. We scratched and clawed our way out of a primordial puddle, just like the rest of the gang.

Let's stop lying to ourselves about it.

Thursday, March 15, 2012

Excusersizing


I've been meaning to write a follow-up post about exercise, and excuse-making, for a long time. It has been quite a while since I looked at that post, so if you have read it and some of this is redundant, please forgive me.

Before I say what I have to say however, I should repeat the qualifications that I (don't) have to be writing about this topic in the first place. I'm not a personal trainer, or a professional athlete, or anything of the sort. I'm just a guy who has somehow managed to stick to an (ever-changing) exercise routine for the last 20 years, without any significant lapses. My goal isn't to look like a Men's Health specimen, just to be able to bench-press my own weight a dozen times and do a sprint-triathlon (without actually sprinting, but merely without dying) on any given day of the week. But the commitment issue is what most people seem to struggle the most with, and so that is the one aspect of this that I might be able to shed some light on.

But I am going to start by sharing a little secret that I have learned over the last two decades, and one that is rarely, if ever, heard from those who are trying to sell you an exercise/diet/lifestyle program:

It has to hurt.

No, really, it does. Of course, you don't often hear this because if exercise guru A is selling a program that promises “no-pain,” and exercise guru B is selling one that is honest, whose program is everyone going to buy?

And this is largely the root of the reason people struggle to commit, especially in America's "where can I get a pill to make it all better?" culture. Unrealistic expectations of what it actually takes to get, be and stay fit cause most people with a half-hearted desire to be physically and mentally healthy to give up quickly when the mental challenge of commitment becomes too much.

Here is the problem with almost any approach that is being "sold" to you, whether in the form of a magazine article, a DVD set, or a 6-month deal on gym memberships: almost invariably, they try to sell you on the idea that you can get fit in only 90 days, with minimal fuss, and that you'll look like Spartacus or Mira when you're done.

You don't need me to tell you that these are lies, lies, lies. But if these are lies, what is the truth?

Not 90 days- 90 years.
There will be a great deal of fuss.
You will never, ever look like someone who is paid to look good.

The Approach

Staying fit is a lifetime commitment. It is. There is no way around it. Taking up yoga for 18 months in your 30s will not keep you trim and healthy in your 50s. Having played high school football does not mean you are in shape now. We all know this, logically, but many people fail to appreciate what it means in a practical sense. Here are some things that anyone who wants to get/be/stay healthy needs to think about:

Forgive yourself. Just because you missed a week or six, does not mean it's all done, you're through, finished, kaput. Just identify the next available time you have to resume your routine (make it soon), whatever it is, and do it. This is one of the problems with thinking of exercise as a “program” that you can “mess up.” It's not a program. It's your life. If you don't get enough sleep for a week, you don't give up on that, do you?

Adapt. We age. Circumstances change. Schedules change. When this happens, you need to change. Maybe you used to jog, but your knees can't take it anymore. So swim. Bike. Do something. Maybe you used to play tennis, but your partner moved away. Keep going and playing off the wall. It probably won't be long until you meet someone else in the same situation. Or join a league. Something. Or maybe you now have a meeting on the Mondays when your yoga class met. So do Zumba. Or cardio-kickboxing. Try something new. Anything.

Variety. I sometimes hear people say, "Well, I was trying to run every day but I got bored." Gee, you think? There are people who can run, or bike, or swim every day. But most of us can't. And why would you even want to? Unless you are training for that sport in particular, branch out. There are no exercises that work every muscle in your body, even swimming. Mix it up, so when the day for your run comes around, it has been a little while. Who knows? You may even look forward to it.

Get outside. Get off that ridiculous treadmill or stair machine or stationary bike. Are you really wondering why you are bored? Here's a hint: If you can read a magazine while you do it, you're not working hard enough. But more importantly, get some fresh air. It wakes you up. The scenery changes. Running through your local park or along the ocean reminds you why you are doing it in the first place- Because you are alive.

Get inside. On the other hand, if you live in a place like I do, where you get, at best, seven months a year where it is really comfortable to get outside, that's not an excuse for taking half the year off. Find something else to do in the winter- swim, play indoor basketball, take a class, dance, even get on one of those awful stationary machines. Whatever. Even if you slow down a bit in the winter, as most of us do, it will make starting up again when the weather clears a lot more fun and a lot easier to look forward to.

Compete against something. This is something I am not terribly good at. This can be someone else, your own personal best, or the stranger running ahead of you on the boulevard. You will find it much easier to push yourself hard if you are determined to beat your husband in racquetball than if you are just going for a jog because it is time.

Have fun. For those of us who aren't training for anything, this is probably the most important thing to remember. All of the above add up to this, but it is important to keep it in mind as a separate goal. Exercise is fun, or at least it should be. If it's not, you need to find something else. Think of it like reading a book. We know when a book is not holding our interest. So put it down and find one that does.

The Fuss

Wait a minute, I thought you said exercise had to hurt? Now you're saying it's supposed to be fun?

Yes.

There are two ways to think about this. You can be having so much fun that you ignore the pain. But also, pain is fun.

But before I go any further, I need to be clear about what I mean by "pain." I'm not talking about excruciating, about-to-keel-over-and-vomit kind of pain. That usually means you're doing something wrong. What I'm talking about is the pain of those last few repetitions in the weight room, or the sprint over the last hundred yards of a jog. Because it is precisely right before this point where most people quit. And this is precisely the point where virtually all your gains are made.

As an example, this is how I do dumbbell curls. For some reason, my arms ache if I don't work them to absolute exhaustion on a somewhat regular basis. So when I go to the weight room, I do three sets of curls with the 35 or 40 lb. dumbbells. But this only gets me to the point where I can't work those weights anymore. So when I have finished everything else, I return to the dumbbells and do 30 reps with the 20 lbs. Then I put them down and immediately pick up the 15 lbs and do another 30 reps. Then 30 with the 10 lbs, and finally 30 with the 5 lbs. By the time I am done, I can barely lift the 5 lb. dumbbells. The whole thing takes about 2 minutes, and I get more out of it than I do with the 40 lb dumbbells. And those 5 lb-ers hurt.

It's the same principle that applies when you sprint that last quarter mile, or go all-out on the last lap in the pool. That is where gains are made. If you start running three miles, three times a week, your body will quickly adapt to this. You will probably lose some weight, and you'll notice you have more energy and a better appetite (and not just for food...). But, like anything else, if you just keep doing the same thing, the same way, you'll plateau. And the gains will diminish, then cease. Then you'll get frustrated, and it won't seem worth it anymore. So you'll quit.

So you need to find a way to push harder in what you are already doing. One thing I love about weight training is that it allows you to push harder without costing more time- you just add more weight. But with endurance activities, pushing harder usually means adding distance, which means adding time. I typically only have about 45 minutes total to run, including stretching and cool-down. This means I can get about a 5-mile run in, at most. Most people face the same reality of having other duties to attend to. So instead of running further, try running harder. This is a hard thing to sustain over miles. (I can't do it.) But we can select an ever-increasing distance at the end of our run or swim or bike or whatever to really push and really try to empty the tank. If you have anything left when you finish, you're not doing it right.

This isn't to say that moderate exercise doesn't have its benefits. It does. Any exercise, even cleaning the house, is better than nothing. Walking is wonderful. But if you are looking to lose weight, or want to tone up, don't expect a whole lot if you aren't willing to give a whole lot. Exercise is like learning- you get what you give.

And this is why I am a big proponent of pain. If, like me, you only have three or four afternoons a week to exercise, an hour or so each, you really need to get the most out of it that you can. You need to find something that you enjoy, but that pushes you and challenges you to work just harder than you are really comfortable with. You should sweat. You should ache the next day.

The Point

You are NOT exercising just to look good. Stop thinking that. Right now.

If you are exercising to look good, to look like someone in the movies, just quit now. You will never get there. There are some people out there who are paid to look good. You aren't one of them. Remember, when an actor hunks-up for a role, they are in full-time training, often for 4-6 hours a day. They have a professional dietician watching everything they eat. People you see in “health” magazines are on a 24-hour dehydration plan so that their muscles “pop” for the camera. Bikini models are photoshopped up the wazoo.

Exercise makes you healthy. Being healthy looks good. In fact, the definition of “looking good,” in every species, is some display of health and fitness. Among birds this may be shown in bright plumage and robust song, but the principle is the same. The underlying attribute of “attractiveness” is always a display of vitality.

Among humans, we look at faces and bodies. We judge faces based on symmetry, which is a display of health during development, and a lack of any debilitating, disfiguring accidents or diseases. We judge bodies based on vitality- strength, tone, fitness.

In other words, worry about being healthy. The way you look will take care of itself.

(And while we're at it, not being obese is not the same thing as being healthy. Nor is finding a diet that keeps you under-weight, but leaves you without the strength or energy to do a whole lot else.)

And the health benefits of exercise extend far beyond the ability to run a 5k. The most important benefits are those that exercise grants the mind.

Recent studies have revealed two very common-sense facts. Aerobic exercise (running, swimming, etc.) boosts your ability to coordinate multiple tasks simultaneously and to make long-term plans, as well as your ability to remain mentally on-task for extended periods of time. Meanwhile, anaerobic exercise (weight training, etc.) boosts your ability to remain focused amid distractions. Both of these make a lot of sense, and the data backs them up. Results like these are also the reason I strongly (very, very strongly) advocate both types of exercise.

Both of these factors come back to pain. Mastering pain requires discipline. It requires learning to resist temptation. When exercising, the temptation is always to quit. When I am finishing my first quarter-mile in the pool, I always wonder if I am actually going to make it for three times what I have already done. Then, when I am working on finishing my third quarter-mile, I am thinking how I could probably do two miles, if I had the time. There is always a hump to get over. When you learn that, and learn to recognize that you can, and will, get to the other side, you start to see this challenge, this opportunity, everywhere.

You'll see it when your spouse has really, really infuriated you, and you've been stewing all day on what nasty thing to say to her when you get home. Until you realize that your anger is the hump. You will get over it. And you'll still have to resolve whatever issue set you off in the first place, but you will be able to do so without any emotional complications.

You see it when you realize you drink too much, but every time you try to slow down, you do well for a few days until your boss chews you out for something that isn't your fault, and then you're right back at it. That's your hump. Get over it. It's not worth slipping back for.

Really, a Lifetime Commitment

More and more, doctors and scientists are recognizing just how much of a difference exercise makes as we age. In my family we have diabetes, heart disease and Alzheimer's. What is the number one practice for staving off all of these? Not hard to guess.

But we're not just talking about stretching out your lifespan. We're talking about increasing the quality of it every second you are living it. Who doesn't want to look better, feel better, sleep better, eat better and make love better? Who doesn't want to have more energy, more self-control, more discipline, more focus?

Your body and your brain are intertwined on so many levels. And they are so similar. They are both learning machines that need to be constantly challenged and stimulated or they stagnate. You actually need to push them both, all the time, if you expect them to be there for you when you need them. Never reading anything but the sports page or young adult novels, never trying to pick up a new skill, never listening to a contrary viewpoint- these are all sure ways to a sedentary mind. Treating a weekly walk as your “exercise” is just as sure a way to let your body stagnate.

I can't see exercise as anything less than a duty. You owe it to the rest of us to not be a drain on our health-care system. You owe it to your spouse not to “let yourself go,” just because you've got one in the bag. You owe it to your children to teach them what a healthy lifestyle looks like. You owe it to your spouse, your children and your grandchildren to be there for them as you all grow older.

But lastly, you owe it to yourself. You owe it to yourself to not suffer, on a daily basis, from fatigue, illness, and lack of confidence in both your abilities and appearance. You only get one body. Why on earth would you want to throw it away?

Saturday, March 10, 2012

Lying to My Kid

Irreverent parent that I am, I have been letting my four year old daughter, Charlie, watch the Star Wars trilogy. (The real movies.) We watched A New Hope on a night when my wife was gone late to a conference. She, my daughter, had seen bits and pieces of that one at my parents' house, and she was very curious, and was always talking about Darth Vader, despite only having seen the first 20 minutes or so.

Thus, figuring that she had already been exposed to the highest body count part of the movie, the initial boarding scene, I thought it would probably be best for her to actually see the rest of the movie, and appreciate the characters and the triumph of good over evil, rather than just obsessing about Vader choke-lifting a rebel officer off of his feet, or Obi-Wan sabering the arm off some cantina scum.

And so, we sat and we watched. And thinking the most traumatic parts of the movie were already behind us, I thought nothing of letting her watch all the way to the end. And she did fine, seeming to have no problem with the rest of the movie, beyond asking a million and a half questions, and insisting Leia can't be a super-hero because she's a girl (thank youuuu, Disney)... 

...until Vader cut down Obi-Wan. And then all hell broke loose. Faceless Stormtroopers getting blasted with brightly colored beams of light and falling down is one thing. A friendly old man who had been helping Luke all along was another thing entirely. Where did he go? Is he dead? Sniffling and tears. He's dead, but he comes back, right? 

Well, it's Star Wars, and Obi-Wan Kenobi is a Jedi, so in this case, fortunately, the answer is yes. Yes, honey, just watch. He isn't gone forever. He comes back to help Luke in just a few minutes. Watch.

And so we hyper-drived our way through the rest of the series over the next couple of nights. We had some issues with The Empire Strikes Back, with Han getting frozen in carbonite, but of course, he survives that, even if you have to wait till the next one, and she absolutely would not accept that Vader was Luke's father- "He's too mean."

At this point, she has to see Return of the Jedi, because the whole brilliant arc of the six films is only leading up to one moment, the moment of Anakin's redemption, or "Darth Vader learning how to be nice again," as Charlie would say, and I'd been promising her all along that this would happen- he's mean now, but he does learn how to be nice again. (Thanks to the awesome Despicable Me, this is a story arc she is familiar with.)

But then he dies, and Luke burns his body, though fortunately, it is with the mask on, so she insists it is just Darth Vader's "costume." And then, there they are! The ghosts of all the nice Jedis who have perished over the course of the three films, Obi-Wan, Yoda and Anakin (the DVDs we have are new, so it's the young Anakin- sigh.) Not gone, not dead, just friendly ghosts, there to watch over Luke forever and ever.

At this point, Jen and I figure, why not take her to see The Phantom Menace in 3D on the big screen? She'll get to see Anakin as a cute little kid, it'll help her understand that he was once nice, and those movies are so ridiculously cartoonish, she'll probably like them even more. And she did enjoy it quite a bit...

... until Qui-Gon Jinn dies, and he gets burned on the funeral pyre, but this time there is no mask. And this is when the real, uncontrollable tears start. (This is also where we feel like the worst parents in the world. I had only ever seen the movie once, and I didn't remember much except hating Jar-Jar.) 

This is also when we start lying to her.

No, no, he's not gone. He's a ghost now, like Obi-Wan and Yoda. 

How easy it is, when your child, your precious, your heart now external to you, is sobbing uncontrollably at being confronted with death, how easy it is to just start lying through your friggin' teeth.

____________

I grew up a huge Star Wars fan, like most kids my age. I will always consider those movies among my all-time favorites. But I hadn't watched them much since I was a kid, so it has been curious to view them again with her, through a more critical eye. 

Like almost every other film that comes out of Hollywood, they play into some very familiar tropes. Good and Evil are very clear-cut. Bad guy deaths don't count. Good girls go for scoundrels. But the one that really irks me is this: skeptics are always just small-minded- faith is what counts. 

One masters the Force through faith, through believing in it. Until Luke learns to trust his instincts, to feel the Force, he can't become a Jedi. 

Solo doesn't believe in all that nonsense and hocus-pocus, and hokey religions and ancient weapons are no match for a good blaster by your side, kid. Look, he's been from one end of this galaxy to the other, and he's seen a lot of strange things, but he's never seen anything that would lead him to believe there's some all-powerful force shaping his destiny. 

Solo continues doubting the Force and Luke's potential, pretty much right up until Luke rescues him from Jabba the Hutt. And even then, he never admits that he was wrong, though of course we all know he was.

Twice in A New Hope are the Force and the ways of the Jedi referred to as a "religion." Listing "Jedi" on government census forms has become a world-wide joke, with tens of thousands of people in the English-speaking world getting in on it.  

As an adult, I have a much greater appreciation for why I was so drawn to those movies, especially to the idea of becoming a Jedi. I always played Luke. My much more skeptical (and, it turns out, wiser) brother always played Han. I left for college intending to become a priest. My brother has been agnostic pretty much since he was old enough to have an opinion. 

I so desperately wanted to believe in something bigger than myself, to be part of the cosmic battle between good and evil, to give myself over, entirely and selflessly, to the eventual triumph of good over evil. I would have much, much preferred Jedi over Christian (there was so much more action, and who doesn't want a light-saber?), but I'd make do.

____________

Back to lying to your kid.

Children are so, so fragile. We yell at them a dozen times a day because they've put their tiny bodies in some imminent danger, something you wouldn't even notice your spouse doing- walking past an open oven, jumping down two steps, waving their fork in their eye. 

But it is their minds that we work the hardest to protect. I can swallow my instincts, if I have a moment to suppress the gut-reaction, and let Charlie try ice-skating on her own, or riding a bike, knowing that falling and scraped elbows and banged heads are part of growing up. But what I can't do is see her terrified. 

So when your kid is sobbing over the death of even a character on a movie screen, repeatedly asking you, "Why? Why are they burning his body? Why is he dead?" the temptation to tell them anything, any crazy old thing, that will comfort them, is very, very powerful. Even telling them the patently absurd notion that death isn't death. It's just, well, something else. A transition. 

At her age, in the context of those movies, I am fine with her thinking that Jedis become ghosts. (Another similarity to religion here- only Jedis seem to see the afterlife. Sorry, Han.) 

But someday, that answer won't suffice, and I'll have to tell her the truth: I don't know what happens when we die. It seems like we just... die. And that's it.

_____________

There have been numerous theories over the years as to why the propensity to believe in something as irrational and unlikely as life after death is fairly universal in human cultures, and many of those explanations are much more thorough, and legitimate, than my brief foray here. 

But it is a well-established fact that children acquire their beliefs about religion, more than anything else, from their parents. Most of the rest of the ideas we hold about the world can find some referent; they can be cross-checked with reality in some way. Religion is immune to this, and so most people just take other people's word for it. And it is always easiest to swallow the words of those we trust the most already. And there is no greater trust than that which a child places in his or her parents.

So it is not inconceivable to me that the notion of life after death grew out of the pity a parent had for their child, when that child wanted to know where their dead brother or sister had gone. The instinct to protect them from the dangers and fears of the world is so strong, so universal, that it isn't even inconceivable that the same idea occurred over and over again, all over the world. 

And with no checks in the real world, there is no reason to stop believing it. And since good deception begins with self-deception, it isn't hard to see how even those who began perpetuating this idea, slowly began to actually believe it themselves.

No parent wants to see their child in fear. But we also want them to grow. Learning to ride a bike involves taking some falls. Learning to live involves facing some fears. If we want our children to face the world, and their fears, with courage and resolution, we can't give in to our own cowardice. We can't be afraid to speak the truth, to say those words that we dread will undermine our entire relationship with them, dissolve all of their trust-

"You know, honey, I just don't know. I don't think anybody really does."

Don't worry- they'll still love you. And you'll still be their entire world.

But more importantly, you'll have taught them something more valuable than anything they could ever learn from any lie.


Saturday, February 25, 2012

5 Ws


When teaching writing, we remind our students that for a writer to completely capture their subject they need to make sure they've answered the 5 Ws: Who?, What?, Where?, Why? and When? (hoW is also thrown in here sometimes.) I've had several Ws on my mind over the past few days, though I can't promise that I will hit on all of them. I can promise that there is a connection between them.

What?

The one that got me off on this train of thought, and the one that led me to reflect on the other four, came out of a conversation I had with a good friend of mine, and fellow blogger, over coffee. As we sat and sipped, she told me of a conversation she'd had with a friend of hers, where, for whatever reason, she was talking about me, and this blog. She said she'd said, “He writes this kinda atheist blog.” This made me cringe, for several reasons, most of which should be known to anyone who has been reading this blog for a bit.

The first is, I rather dislike the term “atheist,” and avoid it whenever possible. In fact, now that I'm well past the point of needing to make that aspect of my thinking as clear as possible, I've removed the term from the blog's header. It's not that I find the term offensive (I don't find any term for anything offensive). It's that I think the term "atheist" frames the ideological affirmation of reason, evidence and unrelenting questioning in a negative way. Not negative in the put-down sense, but negative in the sense of missing or lacking something. 

Of course, an atheist is "missing" something from their ideological toolkit- namely faith and/or superstition. But since these things are themselves negative concepts- belief in the absence of, or in spite of, evidence- framing atheism in a negative sense is rather like saying a meadow is "incomplete" because it is not full of holes. Of course, some will assert that the emphasis in the definition of faith should rest not on "absence" but on "belief." This fails, of course, because while there are many reasons one can come to believe something, only faith does it with no reason at all.

Why Not?

Since I have returned to the topic of terminology, I'd like to take a minute to add some thoughts on why, although I dislike the term atheist, I much prefer it to "agnostic." Now, I will be the first to admit that in a technical sense, I am an agnostic on every single question there ever was- there is nothing I would ever assert I know the answer to beyond any doubt, nor any question on which I would never change my mind. ('Cause I'm not a fundamentalist.) However, while this is sometimes thought of as "deep agnosticism" (not "deep" in the "Whoa man, that's deep," sense, but in the "very expansive" sense), it is actually the shallowest sense of agnosticism, since it only asserts the very, very obvious. This use of the term is basically the equivalent of pointing out that you are not omniscient, or pointing out that you are human, since both of these imply the same thing.

But calling oneself "agnostic" in reference to the question of the existence of God is very misleading, and inaccurate for most of the people who refer to themselves in this way. Because calling oneself "agnostic" suggests that on the question of God, specifically the Abrahamic God that is foremost in most people's minds, you think the odds are split right down the middle between his existence and non-existence. If this were the case, if someone really thought there was a 50/50 chance that the Abrahamic God existed, in all his retributive, malevolent hegemony, shouldn't you really be taking Pascal's Wager?

But most people I know who label themselves agnostic don't seem to be living their lives under this assumption. They seem to be, like me, living their lives with the conviction that the existence of the Abrahamic God, with all his thought-policing, prayer-sometimes-answering, natural-disaster-as-teaching, and eternal-"rewarding"-and-punishing, is very, very, extremely improbable. They also probably think, like me, that there is a somewhat greater chance of the existence of a non-meddling god, one who was genuinely perfect enough to create everything and not need to make any after-the-fact corrections (sending floods, sons, prophets, etc.). And then they may find more or less probable than that scenario (I find it somewhat more probable) the idea that there is no god, not in any sense that we would recognize from the way he is spoken of on our little planet. (I'm not even considering the use of the term "God" to refer to the "universe" or "all existence" because that is so broad that it loses all meaning.) And of course, there are myriad other possibilities of which we have not or cannot conceive, which we are, technically, agnostic about, to the last.

But if someone is living their life without the thought of God crossing their mind on a regular basis, and isn't spending precious hours wondering if they might be wrong, is that really agnosticism, at least in any meaningful sense? Someone attending church service or mass for the first time in many years, not sure what they believe, curious to see what this is all about, and wondering if they really have been missing something in their life, but still not convinced enough to believe- this would be agnosticism on the God Question, in a meaningful sense.

To put it another way, just to be very clear here, let's do a quick thought experiment. I am sitting in the living room of my family's house on the lake (gorgeous in the February still) writing on my laptop. There may or may not be an invisible, incorporeal elephant across the room from me. My non-omniscience makes it so that I am incapable of ever having an absolute, definitive, complete and total certitude on this question, one way or the other. I am technically agnostic about the existence of said elephant. 

But so what? The very, very, very slim possibility that there may be an invisible, incorporeal elephant sharing this view of the lake with me, is not affecting my thoughts, behavior or life in any way. (Beyond his utility for this thought experiment, of course. But the point stands.) So does it mean anything for me to say that I am "agnostic" about the elephant question? It really doesn't, not without utterly diluting the term "agnostic" itself, which is a beautiful term with a great many uses, much more purposeful than this.

Who?

In further reference to my friend's description of these pages as "an atheist blog:" 

As often as I return to the question of non-belief, and as often as I employ faith as an example of quintessentially non-critical thinking, I've tried my durndest to keep the blog from becoming as single-track as all that. I may have failed. (My wife assures me that this is not the case, but she may have just been saying that.) And it is certainly not my friend's fault for framing it in that way, if that is how she has perceived it.

However, several recent experiences have led me to not think so harshly of writing an "atheist blog," though I'd at least like to switch the term to "a freethinker blog." I've gotten more and more feedback, through the comments section of the blog itself, through email or RL conversation, that this is the aspect of the blog that has had the most positive impact on people's lives. And if this is the case, then I will continue on that theme less reluctantly than I have done recently.

So the W above refers to Who Do I Write This For? Well, I am number one. I enjoy writing, I enjoy debating, and I find that the process of writing helps me sort my own thoughts. But after that, I am writing for the number of people who find themselves in a position, as I so often do, where faith is assumed, where it is assumed that without faith, one cannot be truly good and where questioning faith is still regarded as one of the most offensive, inconsiderate, disrespectful things one can do.

And I write because it seems to help those people who haven't had the time to ponder these questions quite as much as I have, or read up on the topic quite as much as I have. Or maybe they have, but a different perspective is valuable to them. It seems to have helped some readers articulate thoughts they'd had themselves but couldn't express quite as clearly or succinctly as they wanted to.

Some people are surrounded by a family whom they love, and who loves them dearly, but who consistently fail to consider that on some of the biggest questions in life, they don't see eye to eye. Your family may pray before every meal when you get together, and while there is no harm in letting people perform rituals that are important to them, there is rarely a polite way to excuse oneself from these things if you don't believe in their efficacy. This often leaves a freethinker with the regrettable choice of being a fake or being rude.

Or your family may send you God-inspired well-wishes when you are sick, or tell you that they will be praying for you when you go in for surgery, or labor, or a job-interview. And again, while this is almost entirely harmless (although studies have shown that those who are prayed for recover more slowly), it puts a freethinker in the same undesirable position between dishonesty and rudeness. (I mean, you can say nothing, as I usually do, choosing the least rude rudeness, but even a "Thank you," lacks integrity, to those of us who are picky about such things.)

Or you may be a parent who wishes his children's education to continue free of the distortions and lies that come from a teacher injecting their superstitions into their curricula.

Or you may be a teacher in a public school who is ostracized by his colleagues for pointing out, "Well, no, technically the school can't hang 'Merry Christmas!' banners everywhere, as that violates the separation of church and state."

This blog can't get you out of those sticky situations on its own, but perhaps you, having had a little more time to read, think, and question, can.

When?

This one has come up quite a bit for me personally, lately. When should someone "come out," as it were, to family and friends who assume that everyone they care about shares their unfounded superstitions? While this may sound rather trivial, especially to those who did not grow up in a family or community of strong believers, I have enough personal experience with this to assert that it is not as trite as it sounds.

There are some particularly unfortunate folks who have legitimate reason to believe that if they were to come out and assert their non-belief in the same superstitions that those around them hold dear, that they would be irrevocably shunned by many of their friends and family. There is no easy answer to this. It is a question that each individual must decide on their own, but they need to be cognizant of what they decide between: integrity or acceptance. I would always, always (try to) choose the former, but I am fortunate enough to not feel a deep-seated need to be accepted by more than a small handful of those nearest and dearest to me. Others need more widespread acceptance to fulfill their emotional needs, and find peace and happiness in their lives, and I can respect that. 

But most of us are probably in a slightly easier situation. Our family and friends are not so hard-core that we would be cast out, but we do fear that they would be deeply hurt by the revelation. When this is a spouse or a parent, that thought can be a very strong deterrent. We can imagine the pain and distress of a mother who fears that the souls of her children are eternally damned, even if we believe that fear to be entirely unfounded.  

So here the choice is a bit different, as acceptance is not the major consideration, nor is the question of integrity so black and white. For where does integrity lie, on the side of honesty, or on the side of consideration for others, even if it means allowing them to live a delusion? 

This is essentially the choice I was faced with, when questions of faith and its place in the world reasserted themselves in my life with the birth of my daughter. And hence the blog, the most non-confrontational (medium, not content) approach I could take. I figured it would give me space to share my thoughts on this question and others, without requiring me to assert my non-belief to those who might be hurt by learning of it. Those who wish to read it, can. Those who don't aren't required to. (I do not believe that it is read by the people in my life who would be most disturbed by its contents.) And of course, I am always willing to discuss these questions, or any questions, with anyone on anything, at any time, in any place. (I mean real questions, meaningful questions.) 

This passive approach has worked for me, but others may not desire to make their thoughts quite so public, forcing them, ironically, towards a more aggressive approach if they really wish to make them known to those around them. Or not. If you are a freethinker, have shunned superstition and any show of it, chances are pretty good that those who are closest to you already know, or at least suspect, your thinking. It may be best to let them come to you. If they don't ask, it's probably because they don't want to know, and it may be best to respect that. If they do ask, be honest. Hide nothing. Dishonesty never got anyone anything of any lasting value.

Where?

Where can we do good? 

This is the question most on my mind as of late. While I think it is vitally important to know what you believe and why you believe it, to be able to articulate and defend it, that is only the very beginning. If we ever wish freethinking to be seen as more than a rejection of venerable institutions of morality, community and philosophy, we must demonstrate what we would put in its place. Destruction is easy. An earnest child who demands real answers would best the most sophisticated theologian in a debate, if it were judged objectively. We can't be content with simply trying to show people why we think they are wrong, or even that their superstitions are inherently harmful. That would be an abdication of the responsibility that comes with moral conviction.

Faith has wrought incalculable damage on this world. It has, and continues to, foster mistrust, hatred, prejudice and reliance on ritual over action. It plays into the absolute worst of the innate human tendency towards the in-group/ out-group dichotomy, where those who are "in" are loved, protected and cared for, while those who are "out" are mistrusted, feared and in extreme (but common) cases, murdered. (And please don't tell me what Jesus taught about loving everyone- when even a majority of his followers are actually practicing that, let me know.) 

But faith is not the only culprit. Life is the struggle for finite resources. Most species, and most of this one, are incapable of seeing beyond their own immediate needs and wants in this. Humans have the capacity to, if this skill is honed. Hone it. 

Because the world grows smaller every day, and there is less and less room for the irrelevant distinctions that have divided our species for the last 200,000 years. There is no more room for divisions based on the color of someone's skin, who they lie down next to at night, or which never-seen deity they bow down to. All distinctions are a product of our perception- the distinctions we envision in our mind exist no place else, even when they are shared with others. 

So let us take responsibility for putting the world back together. Neither Jesus, any other son of David, nor the twelfth Imam is going to come back in glory any time soon. Nor is global warming, financial Armageddon, or the end of fossil fuels going to utterly destroy civilization as we know it, though each will certainly be a test.

But we have to accept that we will face these tests, or others like them, and that there is no parent in the sky to make it all better on the other side. What we cannot do is hole up with like-minded fellows and expect that we will be the lucky few who will come through unscathed. This would make us no better than a dazed congregation swaying and weeping in anticipation of the Rapture. 

But we must also accept that as the world gets more and more crowded, many will adopt this cowardly attitude. There is little we can do about that, except set a better example.

So find some small bit of good you can do, and do it. You don't need to do it because you are an atheist, an agnostic or a freethinker, but because it makes the world a better place. And that's good for all of us, yourself included.

But at the same time, don't forget why you are doing it, and don't be afraid to say it. You're not doing it because you fear punishment, or because you hope for some trivial reward. You're bigger than that. You're doing it because your reason has led you to understand that those of us who have managed to shake off the atavistic prejudices of fear and superstition also have the capacity to do more. 

So do it. 

Tuesday, February 21, 2012

IQocracy

If you've been paying any attention at all, and have read this blog regularly, you'll have noticed that each post is basically a thesis paper. By the end of the first paragraph, you know that "I'll argue that X is true," or "... that we should do X instead of Y." So I'm going to be upfront with you here: I can't be so clear this time. I don't think I actually have a point to what I am writing about today, though maybe I will by the end. Today, I really just have a bunch of similarly themed thoughts that have been bumping around in my head over the past few weeks.

What I want to talk about today is intelligence, and the role it plays in human society. I've encountered a number of discussions, and had a number of personal experiences, over the last few weeks that have kept redirecting my thinking back to this question of intelligence. So, please, come along as I try to sort through them out loud.

Intelligence- more so than bipedalism, symbolic language, upright posture or opposable thumbs- is the trait that most distinguishes our species from the rest. (OMG I hate Blogger's spell-check- it didn't recognize "bipedalism" or "opposable.") And, from our species-centric perspective, we can't imagine anything more valuable. (Note that having a species-centric perspective does not distinguish us from other species at all, as most species seem to share it, each in their own limited way. If you were a bat, what could possibly be a more important trait than the ability to echolocate? How else are you gonna find dinner?) Nevertheless, this natural favoritism toward our most distinguishing trait plays out in complicated ways in human society, as it is simultaneously highly-regarded and deeply mistrusted.

(Before we go on, I don't want to leave language behind entirely. While symbolic language does not seem to be a prerequisite for intelligence up to a certain point- I just read about pigeons that had been trained to do math with numbers less than 10- it is fairly clear, from the evidence of species on Earth anyway, that intelligence beyond a certain point does require symbolic language. However, while I think language is a huge part of this discussion, incorporating it at every turn would add unnecessary complications, so I will try to simplify my points by sticking to "intelligence." When I finish Pinker's The Stuff of Thought: Language as a Window into Human Nature, I'll surely have a lot to say about language.)

So let me begin by discussing what I mean by intelligence being "simultaneously highly-regarded and deeply mistrusted." The first is more obvious- we all want to be smart, or at least to be thought of as such. Few things sting more than an insult to one's intelligence. (Except among middle-school boys, where being dumb is cool, though I always tell them that by high school, girls don't find it nearly as charming as they do in 8th grade.) When we recognize that someone is more intelligent than us in a certain area, we say in awe, "Wow, she's really smart." So while it isn't always the most highly regarded trait- some people would take looks over brains any day, despite the former's ephemeral nature- having it is generally regarded as a good thing.

But at the same time, being too smart, especially in the presence of those who are aware of the limits of their own intelligence, can be seen as something that merits distrust, suspicion, even animosity. Think about the way people talk about lawyers. While some of the distrust and dislike people have of lawyers is directed at the perception that they are greedy (surely some are, but the ones I know are the opposite of that) or that they use the law to manipulate the system for their own gain or profit (again, surely some do, but not all), most of the deep-seated distrust of lawyers stems from the fact that they are "tricky." But what people mean by "tricky" is that a good trial lawyer can force a witness to acknowledge the contradictions in their own statements- contradictions that the witness's own brain hadn't yet perceived. In other words, people distrust lawyers because they are generally smarter than the average person and their whole job is to use that intelligence to prove that what someone else is saying is false, misleading, inaccurate or impossible. (Perhaps Great Expectations' Mr. Jaggers is the epitome of what I mean here.)

This same mode of distrust toward intelligence and the people who have it can be witnessed in subtler ways in society at large. Try this: go out before dawn during hunting season in Maine, or anywhere, stop in a roadside gas station/diner, sidle up to one of the counter stools, turn to the large flannel-clad man in the orange cap next to you, and ask him what he thinks about the homo-eroticism in Shakespeare's sonnets.

Okay, there is certainly a cultural divide that factors in here as well- degree of education, field of education- but these are all part of the same discussion. And the point could have been made with something less provocative: any soliloquy on your part that tended beyond the monosyllabic would risk your leaving with fewer teeth than you arrived with.

With these admittedly extreme examples in mind, we can surely imagine more subtle instances that we actually encounter in our own lives. I think these are the factors at work here: when someone with a more sophisticated intellect tries to explain a complex idea to someone who is struggling to grasp the concept, the struggler adopts an attitude akin to someone who fears they are being conned. Even while they understand on some level that the explainer grasps something they don't, they aren't willing (somewhat understandably) to take the explainer's word for it, and so they adopt a defensive posture of mistrust, often refusing to accept an idea that would have greatly benefited them.

We see this writ large in American society. It has become something of a truism that Americans don't want their president to be too smart; they'd prefer someone they could have a beer with. (If you're reading this in another country, yes, that is accurate- the American public is willing to trust the launch codes of the world's largest nuclear arsenal to someone we think would be a hoot to shoot a few games of pool with down at the local pub.) At least to some degree, this underlies the current right-wing hatred of Obama, because many people don't feel like he is someone they could talk to, mano a mano, despite objective analysis suggesting he is the most moderate president since WWII.

While I am on the topic of the prez, it was actually the man himself who got me off on this train of thought a few weeks back. By any objective analysis, regardless of what you think of his policies, Obama is an intelligent human being (despite the common American usage of the term "idiot" to refer to anyone we disagree with). In January, Obama gave his third State of the Union address. One of the most common accusations against him is that he sounds too "professorial," that he "talks over people's heads." Well, I guess if your head was up your ass, I can see how he could be talking over it, because each of his three SOTU addresses has been given at an 8th grade level. (Please follow that link; it's fascinating, and relevant to what I am about to say.) In fact, Barry O's SOTUs are all near the bottom of the charts in terms of the complexity of the language, behind even, yes, W. Of course, then begin the accusations that he is dumbing things down too much, being insultingly simplistic (he kinda can't win)- except that the average American reads at a 7th grade level. Yes, you read that correctly. Seventh grade. In other words, Johnny Tremain or The Old Man and the Sea, both at an 8th grade Lexile level, are a bit of a struggle for your typical voter.
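(For the curious: those "grade level" numbers come from readability formulas, the best known being Flesch-Kincaid, which looks at just two things- words per sentence and syllables per word. Here's a rough Python sketch. The syllable counter is a crude vowel-group heuristic of my own, not the dictionary-based counting real tools use, so treat its output as ballpark only.)

```python
import re

def count_syllables(word):
    # Crude heuristic: each run of consecutive vowels counts as one syllable.
    # Real readability tools use dictionary lookups; this is an approximation.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def fk_grade(text):
    # Flesch-Kincaid grade level:
    #   0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

print(round(fk_grade("The state of our union is strong. We will get there."), 1))  # → 1.6
```

Short sentences and short words score low; pile on clauses and Latinate vocabulary and the grade climbs fast. Feed it a paragraph of your own writing and see where you land.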

Now, reading level and intelligence are not exactly the same thing, but there is a very high correlation. Besides the obvious fact that literacy increases vocabulary, verbal intelligence and background knowledge (which does play into intelligence- you can build a better car if you have access to more parts), literacy is tied to working memory and the ability to sustain concentration, both essential for understanding complex ideas. And as we educators like to remind our students, "Reading is thinking," and like any muscle, the more practice a brain gets, the more efficient it becomes.

What it boils down to is this- The United States of America, the most affluent and powerful nation in the history of the world, is guided by the political will of millions of people whose intellectual capacity (and arguably emotional capacity as well) was arrested when the most important thing on their mind was whether or not that zit on their chin would be gone in time for the school dance. And what we look for in our leaders is that they fit roughly into this mold as well. (I don't think the stupidity problem is limited to the US, but such intense hatred of those we think of as "elite" is a particularly American phenomenon.)

You probably find that thought as depressing as I do. But fortunately, the situation is more complex than this. First, we must recall the meaning of the term "average." George Carlin puts a hilarious spin on it with his elucidation of the term in this particular context; "Think about just how dumb the average person is. Just think about that for a minute... and then remember, half the people are even dumber than that!" Fortunately, the opposite of this is also true. Half the people are smarter than the average, and this, probably, is enough to save us from the doom of our own self-inflicted stupidities. 

Let's just talk about this from a statistical perspective for a moment. Say you take an IQ test and get a score of 138 (on the Wechsler scale). (Yes, I am aware of the limitations inherent in using IQ as the sole measure of intelligence, but it is one measure, it is widely accepted, and it shows very strong correlations with the ability to perform all sorts of mental and creative tasks. So for simplicity's sake, I will stick to it here.) This score puts you in the highest category on any conversion scale (Stanford-Binet or Cattell) and says your IQ is at the 99.5th percentile of the population. Well, doesn't that sound extraordinary! Aren't you special?

Well, not really. Being at the 99.5th percentile on any scale means that out of a sample of 200 people chosen at random, you should, statistically speaking, have the highest whatever-is-being-measured. But 200 people isn't really all that many, when you consider the population of the human race. In other words, there are about 1.5 million people as smart as you in the US alone, and 35 million of your peers in the world.
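(The arithmetic behind those numbers is simple enough to check yourself- the population figures below are the same round 2012-era estimates I'm using above.)

```python
def peers_at_percentile(percentile, population):
    # People at or above the given percentile in a population of that size.
    return round(population * (100 - percentile) / 100)

US_POP = 310_000_000       # rough 2012 US population
WORLD_POP = 7_000_000_000  # rough 2012 world population

print(peers_at_percentile(99.5, 200))        # → 1 (you're the top 1 in 200)
print(peers_at_percentile(99.5, US_POP))     # → 1550000
print(peers_at_percentile(99.5, WORLD_POP))  # → 35000000
```
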

Don't feel so awesome now, do you?

The point is, while there are certainly a lot of truly dumb people out there, there are fortunately, a fair number of intelligent ones. And that may be enough.

Because despite what we think, we human beings are not innovators. We are copiers. (I owe most of what I am about to say to Mark Pagel, an evolutionary biologist at Reading. That talk is from edge.org, "the world's smartest website." Check it out.) To use Pagel's example, think about how many truly original thoughts, ideas or innovations you have come up with in your lifetime. I'll give you a minute... Let me guess. Zero? If you are like most of the rest of us, that is very likely the number. (I actually thought I had two original ideas myself, both regarding how our evolutionary heritage factors into certain key aspects of the way we view the world, but alas, when I read Dawkins's The God Delusion I discovered that he, and likely others, had already thought of both of them. Not only that, but he even coined, in the book, the exact term I had settled on for one of those ideas. So, back to zero...)

Pagel's not talking about a poem you wrote, or some chords you put together on your six-string, because we all know how heavily influenced all art is by those who came before. He and I are talking about truly original, hand-axe, spear, wheel, fire, printing-press, telephone, light-bulb-type innovations, the kind that utterly and irrevocably change human history. Of course, that is setting the bar awfully high, but isn't that the point? Very, very few of us will be clever enough, or lucky enough, to stumble upon an idea so incredibly history-altering.

And the sort-of good news is that we don't need to. Because we have language. And we have writing. And we have long-distance communication. And now we have the Internet and mobile devices. And who knows what will come next? As Pagel points out, our species' ability to disseminate a single good idea to millions and billions of new brains is at an unprecedented peak, and is accelerating at an accelerating rate. Much has been made of this, but think back to the Arab Spring. A single fruit vendor immolates himself to protest the abuse of corrupt officials, and, within months, tyrannies that had survived decades fall.

As Pagel argues, the reason Facebook (not even an original idea itself, just a superior copy, now worth $70 billion) is so popular is that we have an innate desire, even need, to see what others are up to, particularly those who we admire, and imitate them. Not overtly, in a toady kinda way, but in the more subtle, "Oh, my friend that's in a band and knows a lot about music is listening to this band I've never heard of on Spotify. I should check them out, maybe they're good. And if they make it big, I'll be ahead of the curve for once, instead of 'discovering' an artist when they release their third album" kinda way. Of course, our thinking is never so overtly pathetic, but honesty will help you appreciate that something similar to that is going through your mind on a regular basis. 

And we all do it, every single one of us. Even those of us who like to pretend we are different, above all that. Whether you are the most mall- and Facebook-addicted wannabe or the most extreme living-off-the-grid anarcho-socialist queer pagan, you are a target market. Somewhere out there, an idea, a style, a trend is heading your way that you will snatch up greedily because it will make you feel more like what you envision yourself wanting to be, or at least thinking you want to be. And the chances are very, very slim that the idea will be your own.

I was reminded of this yesterday when my wife and I actually went to the mall, something we do about twice a year. She was getting dressed, and came down in leggings, a long shirt that came below her butt, a long open "sweater" (though it probably weighed 3 oz), and knee-high leather boots. She said, "Do I look ridiculous? This is what everyone is wearing." Well, I thought she looked smoking hot, so I told her so. But if you know my wife, you know that she is not "trendy" in any way, which is one reason I adore her. Most of her favorite clothes are from the 1940s. But nevertheless, this is the "look" right now, and the allure of that is strong.

(I am so glad to be a guy, since our styles don't change all that much. I'd wear the same perfectly innocuous, boring, one-in-a-million outfit every day if I could, because I happen to be of the opinion that if you need fashion (or piercings or tattoos) to show "who you are," there may not be much there to begin with. If an article of clothing or a spot of ink can sum you up, you must not have a lot to say. But that's just my own snotty, superior opinion, so, please, don't mind me.)

Okay, so I've gotten from intelligence to fashion, by way of discussing how ideas spread, and how readily our brains take them in. We are copiers, and in a sense, that's okay. This fact, again, as Pagel argues, has two effects. One, our ability to copy puts downward pressure on any one individual's need- and hence, ability- to innovate. Why do all the hard work of coming up with an original idea when you can just wait for one of the other 7 billion people on the planet to do it and copy it from them? It's bound to happen, and statistically speaking, bound to happen fairly soon.

For example, my wife is a photographer. Using a digital camera means storing photos on SD cards, which can be corrupted, lost or destroyed before the images are downloaded and backed up. When these photos are of a once-in-a-lifetime event, such as someone's wedding, this is a big deal. Now, when I take a picture on my Android phone, it immediately (if I have it set this way, which I do) uploads that photo to a folder in my Google+ account, where it stays private until I share it. So I asked my wife the other day, "When is your camera gonna do that? Why aren't they already making professional digital cameras with 4G connections that immediately store all your photos in the cloud, so they are automatically backed up the instant you take them?"

That's not an original idea, but it is a combination of several ideas to come up with a new solution to a relatively new problem. I hadn't heard of this yet, and neither had she, but a quick google shows that there are already plenty of cameras that do just that, and more on the way that will do it more efficiently. So even as I was thinking I had come up with something fairly original, products were already rolling off the line that anticipated what I had been thinking. So now, instead of me having to build one for her, she can just go buy one the next time she decides to upgrade. 

Now, if we had been living in the Middle Ages, or any time previous, and this had been a problem with, say, our plow, we wouldn't have had that luxury. I would have had to analyze the problem on my own, come up with a solution, manufacture the parts myself and implement the fix. Unfortunately, this would have come at the expense of valuable crop-tending time, time I probably could not afford to lose. So the innovation I had dreamed up for our plow, the one that would have saved us time in the long run and kept us better fed, would have had to perish before the more immediate needs of our bellies. If this sounds dour and extreme, recall that it took hundreds of years for Europe to make the transition from two-field rotation to the vastly superior three-field rotation.

So, am I dumber than a medieval farmer because I leave it up to others to come up with solutions to most of my problems? Well, first of all, the average person was never a better innovator at any point in history than at any other. That's the whole point. If they were, the idea of three-field rotation wouldn't have taken hundreds of years to spread, and it would have started much sooner. People are copiers; it is just that back then, ideas transmitted at the speed of donkey rather than the speed of light- a difference of about 185,999.999... miles per second. The average person has always relied on others to do most of his innovating for him.

So, despite the Flynn effect (which shows that the average IQ increases over time, though the tests are always re-normed to 100), we really aren't much better at innovation than our ancestors were, and we may be, according to Pagel, a bit worse. However, as a society, we should expect to see more innovations, and also expect to see these innovations have more rapid and widespread impact on society as a whole.

This is for two reasons. First, as the population continues to increase, hopefully leveling off around 9 billion (if the faithjobs don't have their way stifling the education of women and limiting access to contraceptives and family planning), we will continue to see a greater absolute number of people with high intelligence (though as a ratio it will stay the same). Now, intelligence isn't the sole criterion for innovation, though it would be hard to argue that it doesn't help, and the same ratios would likely hold true for "creativity," if it could be measured in a similar way, which it really can't. Second, as we get more and more "innovators" out there, more of the rest of us will have more immediate access to their ideas, as communications and travel technology continue to make the world a smaller and smaller place.

So while there is less and less need for any one individual to be innovative on their own, society as a whole is getting "smarter."

Or is it? Because of course, if individual people aren't all that sharp to begin with, what's to say that bad ideas won't spread just as quickly as good ones? Nothing, of course, and we see the spread of bad ideas taking place all the time. (Yeah, I'm just kinda going with this now. I know this post is absurdly long, but what the heck, right? I've already told Pandora "I'm still listening," about 4 times.)

Take Harold Camping. Twice in the past year, that crazy old loon got a large number of people in this country into a tizzy because he said he knew the world was going to end on a certain imminent date. He was on the news, trending on Twitter; people in my own family were discussing the likelihood of his prophecies. Think about that. An obviously bat-shit crazy old bastard- who had already pulled this stunt once before, years back, and been proven wrong then- says that he knows, absolutely knows, based on his reading of a 1700-year-old book about a guy who may or may not have lived 300 years before it was written, a book written precisely to make it look like prophecies made hundreds of years before that had come true- based on his reading of this myth, Harold says the world is going to end in a few days. And it is on the news. And people are actually a little bit nervous.

Yeah. "Holy shit" is right.

Obviously, people's bullshit detectors suck, which is a lot of what this blog has been about. And there are probably a lot of people out there thinking, "Well, I don't think a lot of people took him very seriously." No, that's probably true. But do you wanna know what the most common argument against Harold's prophecy was?

Not, "Why is anyone listening to someone who belongs in an asylum?"

Oh, no. The most common comeback I heard was Matthew 24:36, "No one knows the day or the hour of His return."

BOOM! Take that! Suck on that Harold, you false prophet you! Burn in hell, sucka!

Yeah, I know. But that was all over Facebook, all over Twitter, all over the news media and the religoblogs. I even saw two people in my own family, after Harold suffered a stroke a few weeks after one of his failed end-dates, high-fiving over God giving Harold his comeuppance.

So this little example should be a reminder of how quickly bad ideas can spread, just as easily as good ones. Sometimes even more so, because an indulgent lie is often much more comforting than a harsh truth.

So here is where we're at. A regular commentator here, who writes her own excellent blog, admitted to me that she was a little nervous about writing a post that touched on something some people might find offensive (it was religion, in that particular case). The short answer I gave her was, "If you say something that makes people uncomfortable, good. People need to be uncomfortable from time to time, or else they just get stuck on unquestioned, often very bad, assumptions."

This post is, I guess, my long answer to her question. You can't take your ideas out of the marketplace simply because you worry that people might be offended. Too much is at stake. Too often, very intelligent, thoughtful people end up being "good idea sponges" that suck suck suck up intelligent, thoughtful positions on issues that are critically important to society as a whole and then... nothing. They don't speak their mind when around others who they know are less informed, and maybe even obviously less intelligent than themselves.

They do this for several reasons. One, they fear being treated with that mistrust people have for those who are obviously smarter than the rest. Two, they know that no matter how clearly they articulate and defend their position, the hoopleheads will just shout louder, use more caps, and take even more extreme positions, rather than simply fessing up to the fact that they are talking to someone who clearly knows more than they do, and understands the topic better.

I see this all the time on Facebook, when the discussion turns political or something equally heated. You can tell the most knowledgeable commenters because they generally have the least to say. They will make a subtle point here and there, or point out a gross inconsistency or contradiction in what someone else says, but they usually drift away long before the most ignorant have tired of screaming hypocritical or irrelevant platitudes in all caps. I personally try my best to make my case when I see a discussion taking a turn like this, and it is on a topic I feel informed about, but I am also usually quick to succumb to the exhaustion of carefully picking apart every line of someone's argument only to have them utterly ignore you and just repeat the same thing in a slightly different way.

So the point is this: we all need to throw our thoughts in the ring, even at the risk of appearing "smart." Because that is how the world works. Most of our ideas aren't our own. Heck, probably none of them are. But the world needs good ideas to compete with the bad. And chances are, if you are smarter than 99.5% of the population, the ideas that got past the "contradiction!" "hypocrisy!" "untruth!" "impossibility!" filters in your own head have a certain merit. That doesn't mean you are always right, or that most people whose filters aren't as stringent will even listen to you, but we need your ideas anyway.

Or else we spend our time battling lunatics on Twitter with Bronze Age literature as our most sophisticated weapon.