8/23/08

Ron Fournier: Not A Journalist (Actually, He's A Dick)

So a guy named Ron Fournier (RonFo) is the AP's Washington Bureau Chief. This morning he put this on the wire:
He [Obama] picked a 35-year veteran of the Senate — the ultimate insider — rather than a candidate from outside Washington, such as Govs. Tim Kaine of Virginia or Kathleen Sebelius of Kansas; or from outside his party, such as Sen. Chuck Hagel of Nebraska; or from outside the mostly white male club of vice presidential candidates. Hillary Rodham Clinton didn't even make his short list.
This gives the Obama-as-neophyte meme some legs, no? But it's bullshit. Here is why:

There are two ways to consider Fournier's piece: substantively and in the broader context.

First, on the substance, Fournier's analysis seems a little lazy. By his logic, any potential running mate shows a "lack of confidence" -- picking Hillary would mean Obama lacked confidence in his ability to win over women voters; picking Bayh would mean Obama lacked confidence in his ability to win over independents and conservative Dems; picking Webb would mean Obama lacked confidence in his ability to win over voters concerned about national security; picking Kaine would mean Obama lacked confidence in his ability to win over voters in the South; etc. For that matter, "the status quo" in Washington has been conservative Republican rule. Biden may be an old pro and a DC insider, but he's anything but "the status quo."

Second, in context, Fournier's objectivity covering the presidential race continues to look shaky. We are, after all, talking about a journalist who, as recently as last year, considered working for the McCain campaign.

RonFo is a McBushie! And he sends news all over the world on the AP wire! Keep up the smackdowns!!

A Young Journalist's Take On Biden

Spencer Ackerman (formerly of TNR, stupid bastards), who writes Attackerman, has interviewed Biden a few times. Biden, like the universe, is yin and yang, dark and light, up and down, so on and so on. But what shines through is the fact that Biden doesn't bullshit around. He is a serious guy, a wonk, and it is apparently obvious to those who meet him. Here is Ackerman's best description from his post:
Something else that struck me from our interviews. Biden is many things, but he's absolutely not intellectually insecure. I've seen his key staffers argue with him on important, substantive points of policy -- war policy, even -- while I was in the room, notebook out, voice recorder on. Once Biden agreed to an interview about the war after coming back from an eleven-hour flight from Libya, and was disturbingly sharp, and so was his key foreign-policy aide, Tony Blinken. A lot of politicians keep yes-men around. Biden keeps intellectual counterweights around, both his staff and the press, to keep himself sharp. Whatever his faults, he'll be ready to govern from the start.
I am thrilled about Biden as the choice. I like the fact that he argues, has opinions, and takes no shit. Forget the other nonsense about plagiarism (not really) and credential padding (more like parsing). The guy will be an asset to an Obama administration. So, just shut up!

The Republican Machine Will Eat Your Charismatic Candidate

Mike Kinsley has a new post at the Post about the dangers of success. Money quote:
The greatest strength of this year's Democratic candidate, Barack Obama, is his eloquence, his charisma, his ability to create excitement and draw a crowd. This could be a legitimate debating point if the Republicans were saying that, on some particular issue or even many, Obama is using his charm and way with words to disguise a lack of substance in what he says. But Republican ambitions are grander: They are attacking Obama's charisma, as if popularity itself were a disqualifying factor and whoever draws the larger crowds is by definition the lesser candidate. This is truly perverse. It comes close to being an attack on democracy itself. Can the Republicans possibly score with such a preposterous argument?

Oh, probably.
Let's not fawn!

Literally!

It's nice to hear Biden fired up, literally!

Biden Helps Obama


This is Bidentown. Notice how close it is to Pennsylvania. He is an asset, and may sway some of those in need of catharsis. Anyway, I was just thinking and mapping...

I Had An Original Thought: 2

The school reformers who would "professional development" us to death would also have you believe that most teachers just aren't very smart. These are the same folks who, rightly or wrongly, put stock in IQ indicators. For argument's sake, let's stipulate that too many teachers are stoopid. Here is a chart showing the population's IQ distribution:

If 100 is the mean and a standard deviation is 15, then a couple of SDs in either direction and you're looking at someone packing, or not, so to speak. How many folks with IQs of 120+ are going into teaching? Not many. Hell, there aren't that many of them to begin with! What might lure them?
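A quick back-of-the-envelope check of that claim (a minimal sketch, not taken from the chart itself: it assumes the textbook bell curve with mean 100 and SD 15, and uses Python since the post names no tools):

from math import erf, sqrt

def iq_tail(cutoff, mean=100.0, sd=15.0):
    # Fraction of a normal(mean, sd) population scoring at or above cutoff,
    # computed from the standard normal survival function via erf.
    z = (cutoff - mean) / sd
    return 0.5 * (1 - erf(z / sqrt(2)))

print(f"{iq_tail(120):.1%}")  # ~9.1% of the population scores 120 or above
print(f"{iq_tail(130):.1%}")  # ~2.3% is two SDs out, at 130 or above

So the 120+ pool really is small: roughly one person in eleven, before you ask any of them to take a teacher's salary.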

Also, I think there are teachers out there who are incredibly smart, and do their thing their way, and it works. They are indeed, and in IQ score, exceptional. So leave them alone! Let them teach! They bring what NCLB is quickly draining from our kids--creativity! Thoughtfulness! Things that matter to humanity, not some bottom line or rung on a career ladder.

8/22/08

Like I Said...

It's Biden. The Secret Service has been dispatched to his house.
More here.
Update: It's 1:23am in CA. Where's my damn text message!
Update II: It's 1:55 and I just got the email. Sent Joe a nice welcome note. Now GO GET ELECTED!!!
Update III: It's 1:58 and I got my text message. Carry on.

Schools Aren't Businesses!

Here is another one of those school-as-business/weigh-the-pig debunking articles. Most of these, and the ones I write, come from the position that teachers and schools cannot perform miracles; parents must be involved, and be accountable for the education of their children. Read the article below.
Charlie Kyte: Don't just take aim at the schools; work with them

By CHARLIE KYTE

August 21, 2008

The only chance Minnesota has of developing a competitive future workforce lies in a closely collaborative relationship between businesspeople and educators.

That relationship, though, is far from mutually supportive. Business leaders criticize educators and demand that market-model benchmarks be met to earn what seems like grudging support. Educators resist engaging with the business community for fear of scrutiny that focuses on shortcomings rather than on successes.

Last Sunday, Bill Blazar of the Minnesota Chamber of Commerce laid out his expectations of the Minneapolis public schools in order to gain his support of their much-needed operating levy, to be voted on in November. He stated that he wanted a demonstration of accountability from the school system.

Since success will only come with a sincere effort to work together, I've developed a number of suggested benchmarks for our business community partners. This is what they need to do to earn the support of educators as we together try to have more students be successful.

•Make sure every one of your employees who is a parent of young children has their own child "learning/reading/behaviorally" ready when they begin kindergarten. Make part of their compensation dependent on this goal, and reward them for success.

•Create every possible opportunity to have teachers come inside your businesses to see firsthand the skills your employees need to be successful. Do this at "scale" for large numbers of educators.

•Create a media marketing/advertising campaign that will embed the belief in students and parents that obtaining an education will provide a better future. Tell them "do well in school and we will have a job waiting for you." Sell education as their best hope for the future.

•Begin to compliment educators when there is success. The constant "tough love" set of messages is not lifting our teachers. Rather, it is crushing their spirits.

Oh, and by the way ... we would like all these goals to be measurable.

Let's start working together.

Charlie Kyte is the executive director of the Minnesota Association of School Administrators and served for 20 years as a superintendent of schools in Minnesota.

8/21/08

The Perpetuation Of False Memes (It's Bad!)

I seem to be on an "if you say it enough, it becomes effectively true" kick. Over at LG&M they have a great little piece demonstrating how destructive such stupidity can be. Remember Al Gore in 2000? A little perspective. Here you go:

Manufactured Outrage

Since it's amazing how persistent the idea that Al Gore was an awful candidate is -- with special focus on his allegedly bad debate performance -- it's worth returning to this simple fact:

But without question, “Al Gore’s [alleged] operatic sighs” played a key role in Campaign 2000. In the wake of that first Bush-Gore debate, TV journalists put Gore’s (infrequent) sighs on a tape; jacked the volume way, way up; and played them again and again, in a loop. And yes, this seemed to affect the election. In the immediate aftermath of that debate, five polls of viewers were taken; in all five polls, viewers said that Gore had won the debate, by an average margin of ten points. But so what? After “journalists” played that loop tape—and flogged some trivial errors by Gore—judgments about the debate began changing. Within a week, Gore lost his lead in the national polls. He was forced to fight from behind right through Election Day.
To be clear: it wasn't "the public" that thought Gore's sighing was more important than the substantive matters, or that Gore's trivial errors about which particular FEMA official accompanied him on a particular visit were more important than George Bush's baldfaced howlers about his fiscal proposals. It was the press. People who actually watched the debate thought Gore won; people who learned about it from press coverage didn't. And it's not clear what a candidate can do in the face of such skewed priorities. It is, however, crucial to be alert to the ways these narratives develop and to counteract them before the fact.

More Education Myth Debunking: Learning Styles

Learning Styles. Multiple Intelligences. Blank Slates. Ghosts In Our Machines. Noble Savages. Jewish Popes. These things do not exist. They are products of our minds, and as such hold great weight, even though they are wrong. You know how these things work--something that is not true is said enough, and heard enough, that everyone just believes it, even though the something is blatantly, obviously, demonstrably false. An example: Saddam Hussein had something to do with 9-11. It was said enough that many believe(d) it. And it affected our behavior. We started a war and killed a bunch of people. So, believing things that are untrue can be very dangerous, even when everyone thinks they are true.

Now, this video by Daniel Willingham debunks the learning styles myth. It has been debunked in many places, fortunately (just Google "learning styles debunked"). Watch, and enjoy the feeling of "ahhhh, I thought so!"

Who Should Obama Pick?

Good grief! Obama should, and apparently did, pick his VP based on what he wants, not what the game demands. But he's gotta play the game, right? Unfortunately, it's looking more and more like he will have to play the game to get elected.

How horrible is it that the one candidate in my voting life whose morals I want to support (like not pandering--to anyone) may be required to pander to a bunch of whiny, catharsis-needy, foot-shooting Clinton supporters by making her his VP? I don't think it will happen, because I think Obama is pretty sure he wants to be the president when he gets elected, and not the Clintons.

But is the race so tight that Democrats risk losing to McCain? Looks like that may be so. So, do we need to play the game and start getting behind a Clinton-for-VP thing? Maybe! But you watch, it will be Biden!

What If There Were No Toys?

Kids used to play. Now they seem to compete with objects bought for them. It is disappointing. Here is a little ditty pointing out how kids morphed from imaginative to needy.
Old-Fashioned Play Builds Serious Skills

by Alix Spiegel

Morning Edition, February 21, 2008

On October 3, 1955, the Mickey Mouse Club debuted on television. As we all now know, the show quickly became a cultural icon, one of those phenomena that helped define an era.

What is less remembered but equally, if not more, important, is that another transformative cultural event happened that day: The Mattel toy company began advertising a gun called the "Thunder Burp."

I know — who's ever heard of the Thunder Burp?

Well, no one.

The reason the advertisement is significant is because it marked the first time that any toy company had attempted to peddle merchandise on television outside of the Christmas season. Until 1955, ad budgets at toy companies were minuscule, so the only time they could afford to hawk their wares on TV was during Christmas. But then came Mattel and the Thunder Burp, which, according to Howard Chudacoff, a cultural historian at Brown University, was a kind of historical watershed. Almost overnight, children's play became focused, as never before, on things — the toys themselves.

"It's interesting to me that when we talk about play today, the first thing that comes to mind are toys," says Chudacoff. "Whereas when I would think of play in the 19th century, I would think of activity rather than an object."

Chudacoff's recently published history of child's play argues that for most of human history what children did when they played was roam in packs large or small, more or less unsupervised, and engage in freewheeling imaginative play. They were pirates and princesses, aristocrats and action heroes. Basically, says Chudacoff, they spent most of their time doing what looked like nothing much at all.

"They improvised play, whether it was in the outdoors… or whether it was on a street corner or somebody's back yard," Chudacoff says. "They improvised their own play; they regulated their play; they made up their own rules."

But during the second half of the 20th century, Chudacoff argues, play changed radically. Instead of spending their time in autonomous shifting make-believe, children were supplied with ever more specific toys for play and predetermined scripts. Essentially, instead of playing pirate with a tree branch they played Star Wars with a toy light saber. Chudacoff calls this the commercialization and co-optation of child's play — a trend which begins to shrink the size of children's imaginative space.

But commercialization isn't the only reason imagination comes under siege. In the second half of the 20th century, Chudacoff says, parents became increasingly concerned about safety, and were driven to create play environments that were secure and could not be penetrated by threats of the outside world. Karate classes, gymnastics, summer camps — these create safe environments for children, Chudacoff says. And they also do something more: for middle-class parents increasingly worried about achievement, they offer to enrich a child's mind.

Change in Play, Change in Kids

Clearly the way that children spend their time has changed. Here's the issue: A growing number of psychologists believe that these changes in what children do has also changed kids' cognitive and emotional development.

It turns out that all that time spent playing make-believe actually helped children develop a critical cognitive skill called executive function. Executive function has a number of different elements, but a central one is the ability to self-regulate. Kids with good self-regulation are able to control their emotions and behavior, resist impulses, and exert self-control and discipline.

We know that children's capacity for self-regulation has diminished. A recent study replicated a study of self-regulation first done in the late 1940s, in which psychological researchers asked kids ages 3, 5 and 7 to do a number of exercises. One of those exercises included standing perfectly still without moving. The 3-year-olds couldn't stand still at all, the 5-year-olds could do it for about three minutes, and the 7-year-olds could stand pretty much as long as the researchers asked. In 2001, researchers repeated this experiment. But, psychologist Elena Bodrova at Mid-Continent Research for Education and Learning says, the results were very different.

"Today's 5-year-olds were acting at the level of 3-year-olds 60 years ago, and today's 7-year-olds were barely approaching the level of a 5-year-old 60 years ago," Bodrova explains. "So the results were very sad."

Sad because self-regulation is incredibly important. Poor executive function is associated with high dropout rates, drug use and crime. In fact, good executive function is a better predictor of success in school than a child's IQ. Children who are able to manage their feelings and pay attention are better able to learn. As executive function researcher Laura Berk explains, "Self-regulation predicts effective development in virtually every domain."

The Importance of Self-Regulation

According to Berk, one reason make-believe is such a powerful tool for building self-discipline is because during make-believe, children engage in what's called private speech: They talk to themselves about what they are going to do and how they are going to do it.

"In fact, if we compare preschoolers' activities and the amount of private speech that occurs across them, we find that this self-regulating language is highest during make-believe play," Berk says. "And this type of self-regulating language… has been shown in many studies to be predictive of executive functions."

And it's not just children who use private speech to control themselves. If we look at adult use of private speech, Berk says, "we're often using it to surmount obstacles, to master cognitive and social skills, and to manage our emotions."

Unfortunately, the more structured the play, the more children's private speech declines. Essentially, because children's play is so focused on lessons and leagues, and because kids' toys increasingly inhibit imaginative play, kids aren't getting a chance to practice policing themselves. When they have that opportunity, says Berk, the results are clear: Self-regulation improves.

"One index that researchers, including myself, have used… is the extent to which a child, for example, cleans up independently after a free-choice period in preschool," Berk says. "We find that children who are most effective at complex make-believe play take on that responsibility with… greater willingness, and even will assist others in doing so without teacher prompting."

Despite the evidence of the benefits of imaginative play, however, even in the context of preschool, young children's play is in decline. According to Yale psychological researcher Dorothy Singer, teachers and school administrators just don't see the value.

"Because of the testing, and the emphasis now that you have to really pass these tests, teachers are starting earlier and earlier to drill the kids in their basic fundamentals. Play is viewed as unnecessary, a waste of time," Singer says. "I have so many articles that have documented the shortening of free play for children, where the teachers in these schools are using the time for cognitive skills."

It seems that in the rush to give children every advantage — to protect them, to stimulate them, to enrich them — our culture has unwittingly compromised one of the activities that helped children most. All that wasted time was not such a waste after all.

(And some suggestions:)

Your Questions on Kids & Play

Organizing play for kids has never seemed like more work. But researchers Adele Diamond and Deborah Leong have good news: The best kind of play costs nothing and really only has one main requirement — imagination.

Here, they answer your questions about play.

Better Ways to Play
Self-regulation is a critical skill for kids. Unfortunately, most kids today spend a lot of time doing three things: watching television, playing video games and taking lessons. None of these activities promote self-regulation.

We asked for alternatives from three researchers: Deborah Leong, professor of psychology at Metropolitan State College of Denver, Elena Bodrova, senior researcher with Mid-Continent Research for Education and Learning, and Laura Berk, professor of psychology at Illinois State University.

Here are their suggestions:

Simon Says: Simon Says is a game that requires children to inhibit themselves. You have to think and not do something, which helps to build self-regulation.

Complex Imaginative Play: This is play where your child plans scenarios and enacts those scenarios for a fair amount of time, a half-hour at a minimum, though longer is better. Sustained play that lasts for hours is best. Realistic props are good for very young children, but otherwise encourage kids to use symbolic props that they create and make through their imaginations. For example, a stick becomes a sword.

Activities That Require Planning: Games with directions, patterns for construction, recipes for cooking, for instance.

Joint Storybook Reading: "Reading storybooks with preschoolers promotes self-regulation, not just because it fosters language development, but because children's stories are filled with characters who model effective self-regulatory strategies," says researcher Laura Berk.

She cites the classic example of Watty Piper's The Little Engine That Could, in which a little blue engine pulling a train of toys and food over a mountain breaks down and must find a way to complete its journey. The engine chants, "I think I can. I think I can. I think I can," and with persistence and effort, surmounts the challenge.

Encourage Children to Talk to Themselves: "Like adults, children spontaneously speak to themselves to guide and manage their own behavior," Berk says. "In fact, children often use self-guiding comments recently picked up from their interactions with adults, signaling that they are beginning to apply those strategies to themselves.

"Permitting and encouraging children to be verbally active — to speak to themselves while engaged in challenging tasks — fosters concentration, effort, problem-solving, and task success." — Alix Spiegel

McCain Was NOT Tortured! (According To (V)POTUS)

Andrew Sullivan says in black and white what needs to be said about the depths to which Republicans will sink to get elected. It proves again that the Repubs care about the ends, not the means!
Does Bush Believe McCain Was Tortured?

In all the discussion of John McCain's recently recovered memory of a religious epiphany in Vietnam, one thing has been missing. The torture that was deployed against McCain emerges in all the various accounts. It involved sleep deprivation, the withholding of medical treatment, stress positions, long-time standing, and beating. Sound familiar?

According to the Bush administration's definition of torture, McCain was therefore not tortured.

Cheney denies that McCain was tortured; as does Bush. So do John Yoo and David Addington and George Tenet. In the one indisputably authentic version of the story of a Vietnamese guard showing compassion, McCain talks of the agony of long-time standing. A quarter century later, Don Rumsfeld was putting his signature to memos lengthening the agony of "long-time standing" that victims of Bush's torture regime would have to endure. These torture techniques are, according to the president of the United States, merely "enhanced interrogation."

No war crimes were committed against McCain. And the techniques used are, according to the president, tools to extract accurate information. And so the false confessions that McCain was forced to make were, according to the logic of the Bush administration, as accurate as the "intelligence" we have procured from "interrogating" terror suspects. Feel safer?

The cross-in-the-dirt story - although deeply fishy to any fair observer - is in the realm of the unprovable. But the actual techniques used on McCain, and the lies they were designed to legitimize, are a matter of historical record. And the government of the United States now practices the very same techniques that the Communist government of North Vietnam once proudly used against American soldiers. When they are used against future John McCains, the victims will know, in a way McCain didn't, that their own government has no moral standing to complain.

Now the kicker: in the Military Commissions Act, McCain acquiesced to the use of these techniques against terror suspects by the CIA. And so the tortured became the enabler of torture. Someone somewhere cried out in pain for the same reasons McCain once did. And McCain let it continue.

These are the prices people pay for power.

Teachers Hate NCLB (You Would Too If It Happened To You!)


Here is an article by an elementary school teacher. She sounds like most of us! Read it and weep.
One Teacher’s Cry: Why I Hate No Child Left Behind
By Susan J. Hobart, August 2008 Issue

I’m a teacher. I’ve taught elementary school for eleven years. I’ve always told people, “I have the best job in the world.” I crafted curriculum that made students think, and they had fun while learning. At the end of the day, I felt energized. Today, more often than not, I feel demoralized.

While I still connect my lesson plans to students’ lives and work to make it real, this no longer is my sole focus. Today I have a new nickname: testbuster. Singing to the tune of “Ghostbusters,” I teach test-taking strategies similar to those taught in Stanley Kaplan prep courses for the SAT. I spend an inordinate amount of time showing students how to “bubble up,” the term for darkening those little circles that accompany multiple choice questions on standardized tests.

I am told these are invaluable skills to have.

I am told if we do a good job, our students will do well.

I am told that our district does not teach to the test.

I am told that the time we are spending preparing for and administering the tests, analyzing the results, and attending in-services to help our children become proficient on this annual measure of success will pay off by reducing the academic achievement gap between our white children and our children of color.

I am told a lot of things.

But what I know is that I’m not the teacher I used to be. And it takes a toll. I used to be the one who raved about my classroom, even after a long week. Pollyanna, people called me. Today, when I speak with former colleagues, they are amazed at the cynicism creeping into my voice.

What has changed?

No Child Left Behind is certainly a big part of the problem. The children I test are from a wide variety of abilities and backgrounds. Whether they have a cognitive disability, speak entry-level English, or have speech or language delays, everyone takes the same test and the results are posted. Special education students may have some accommodations, but they take the same test and are expected to perform at the same level as general education students. Students new to this country or with a native language other than English must also take the same test and are expected to perform at the same level as children whose native language is English. Picture yourself taking a five-day test in French after moving to Paris last year.

No Child Left Behind is one size fits all. But any experienced teacher knows how warped a yardstick that is.

I spent yesterday in a meeting discussing this year’s standardized test results. Our team was feeling less than optimistic in spite of additional targeted funds made available to our students who are low income or who perform poorly on such tests.

As an educator, I know these tests are only one measure, one snapshot, of student achievement. Unfortunately, they are the make-or-break assessment that determines our status with the Department of Education.

They are the numbers that are published in the paper.

They are the scores that homebuyers look at when deciding if they should move into a neighborhood.

They are the numbers that are pulled out and held over us, as more and greater rigidity enters the curriculum.

I was recently told we cannot buddy up with a first-grade class during our core literacy time. It does not fit the definition of core literacy, I was told. Reading with younger children has been a boon to literacy improvement for my struggling readers and my new English-speaking students. Now I must throw this tool away?

In an increasingly diverse public school setting, there is not one educational pedagogy that fits all students. We study and discuss differentiated curriculum, modify teaching strategies, and set “just right reading levels” to scaffold student learning. But No Child Left Behind doesn’t care about that. It takes no note of where they started or how much they may have progressed.

As a teacher, I measure progress and achievement for my students on a daily basis. I set the bar high, expecting a lot.

I don’t argue with the importance of assessment; it informs my instruction for each child.

I don’t argue with the importance of accountability; I believe in it strongly—for myself and my students.

I have empathy for our administrators who have to stand up and be told that we are “challenged schools.” And I have empathy for our administrators who have to turn around and drill it into our teacher heads, telling us we must do things “this” way to get results. I feel for them. They are judged on the numbers, as well.

No Child Left Behind is a symptom of a larger problem: the attack on public education itself. Like the school choice effort, which uses public funds to finance private schools and cherry-pick the best students, No Child Left Behind is designed to punish public schools and to demonstrate that private is best.

But I don’t think we’ve turned a corner that we can’t come back from. Public education has been a dynamic vehicle in our country since its inception. We must grapple with maintaining this progressive institution. Policymakers and educators know that education holds out hope as the great equalizer in this country. It can inspire and propel a student, a family, a community.

The state where I teach has a large academic achievement gap for African American and low income children. That is unacceptable. Spending time, money, energy on testing everyone with a “one size fits all test” will not eliminate or reduce that gap.

Instead, we need teacher-led professional development and more local control of school budgets and policymaking. Beyond that, we need to address the economic and social issues many children face, instead of punishing the schools that are trying to do right by these students.

We’ve got things backwards today. Children should be in the front seat, not the testing companies. And teachers should be rewarded for teaching, not for being Stanley Kaplan tutors.

Ten years ago, I taught a student named Cayla. A couple of months ago, I got a note from her, one of those things that teachers thrive on.

“Ms. Hobart was different than other teachers, in a good way,” she wrote. “We didn’t learn just from a textbook; we experienced the topics by ‘jumping into the textbook.’ We got to construct a rainforest in our classroom, have a fancy lunch on the Queen Elizabeth II, and go on a safari through Africa. What I learned ten years ago still sticks with me today. When I become a teacher, I hope to inspire my students as much as she inspired hers.”

Last week, I received a call from Niecy, another student from that class ten years ago. She was calling from southern Illinois to tell me she was graduating from high school this month and had just found out that she has won a scholarship to a college in Indiana. I was ecstatic in my happiness for her. We laughed, and I told her I was looking at a photo of her on my wall, building a pyramid out of paper bricks with her classmates.

I also had a recent conversation with Manuel in a grocery parking lot. He reminded me of my promise eight years ago to attend his high school graduation. I plan to be there.

Cayla and Niecy and Manuel are three of the reasons I teach. They are the reasons that some days this still feels like a passion and not a job.

When I pick up the broom at the end of the day to sweep my class due to budget cuts, I remember Cayla.

When I drive home demoralized after another meeting where our success is dissected with a knife manufactured in Texas, I remember Niecy.

When another new program that is going to solve the reading disparity, resulting in higher test scores, is introduced on top of another new program that was supposed to result in the same thing, I remember Manuel.

They are the fires that fuel my passion. They are the lifeboats that help me ride this current wave in education.

Eight or ten years from now, I want other former students to contact me and tell me a success story from their lives. I don’t want to be remembered as the teacher who taught them how to sing “Testbusters” or to “bubble up.” I want to be remembered as a teacher who inspired them to learn.

Susan J. Hobart, M.S. Ed., is a National Board Certified Teacher living in the Midwest.

Tolerance And Diversity: Trouble?

Here is an essay (from the Council for Secular Humanism) about some pitfalls one encounters in the "diversity/tolerance" pedagogy. I agree with most of it, though some of you may find it a bit too critical. In my classroom I have seen how exposure does not necessarily impart good feeling about others. My take on diversity is like my take on most things; you must be smart enough to see the forest for the trees.


Doubts about Celebrating Diversity
By Kenneth R. Stunkel

We like to believe that colleges and universities are unique sanctuaries for critical inquiry and mostly logical thinking and that academics resist unexamined, foolish beliefs as a professional responsibility. As an example of how shaky these assumptions have become in an atmosphere charged with political correctness, I cite my own university. A statement from the president's office has put faculty and students on notice that all beliefs are to be acknowledged and respected, and that "socially constructed" differences are to be acknowledged and celebrated. Well-meant as all this may sound, apparently, no one thought out the implications of these injunctions, had doubts about their wisdom, or raised objections. This essay attempts to do all three by focusing on an irrational, unhealthy phenomenon in higher education that has become insistent and pervasive.

American pluralism and egalitarianism have merged with identity mania, the self-esteem movement, and postmodern indeterminacy to produce a seemingly fine, idealistic notion—a celebration of differences between peoples and cultures. Diversity as a celebration must not be confused with diversity as a redress of historic inequities through just representation of women and minorities in the public life of opportunity, work, and study. Nor is it about enjoying a harmless variety of taste, style, demeanor, or aesthetic experience, as when people unlike one another in small ways (as most people are) assemble to hear an African poet or to sample cuisines at an international food festival. And on the most trivial level, the issue is not fashion anomalies like students wearing rings in their noses, eyebrows, and tongues.

The Cult of Differences

At issue is a sometimes overt but more commonly hidden assumption that differences are better and more fundamental than similarities. The idea is not new. The first theorist and champion of incommensurable cultural diversity was Johann Gottlieb Herder, who flourished near the close of the eighteenth century and argued for cultural nationalism and the accepting of differences based on a common heritage of language and custom. In the Middle Ages, nominalists and realists debated whether individual things are more real than any similarities they may share. The tension between differences and similarities is my theme as well, but what I have in mind is an inflated status for ethnicity, not physical differences of race or gender, which are unavoidable and given. Ethnicity implies traditions, beliefs, and practices rather than the anthropology of physical appearance or differences of reproductive anatomy. Academic paeans to ethnicity claim that cultural differences between groups merit spontaneous admiration. The questionable premise is that traditions, beliefs, and practices in all their ethnic and historical profusion self-authenticate their claims to truth, beauty, and goodness. Not only must all the "voices" be heard, whatever they come up with must be treated with respect, since no voice has less or more significance than any other. From this hyper-tolerant perspective, it is not good enough in a pluralistic society to cultivate forbearance or to be content with provisional civility extended to differences of belief, experience, and cultural background. Open-ended diversity is thrust upon us as a positive object of obligatory good feeling. Acceptance of differing outlooks, behavior, habits, customs, and values must be enthusiastic to ward off intolerance and confirm difference as virtue.

For converts to this doctrine of good feeling about differences, the more differences the better and all differences are equal in a spirit of radical democracy. Without an abundance of diversity, sanctified by parity, there would be no cause for revelry. A dictionary (Merriam-Webster's) defines the word celebrate along a continuum from the sacred ("to perform a sacrament") to the secular ("to hold up or play up for public notice"). A generic slant on celebration suggests a receptive attitude in which all sense of discomfort about differences is sponged away. Doubt about the value of diversity is tantamount to outright intolerance, hateful perversity, or lamentable backwardness. If any fragment of difference should provoke indifference, dislike, outrage, skepticism, or resistance, the suspicious party may face quarantine for sensitivity therapy or slip into disrepute as a reactionary. Despite these risks of dissent, I invite some reflection on the pitfalls and limitations of celebrating diversity.

The Price of Ethnicity

However one may react to various cultural practices and beliefs, it is not self-evident that diversity is either good or bad, which holds for similarity as well. Good judgment about what is desirable or not requires historical and social context and invites cautious reflection about consequences. Non-Western traditions have viewed social and cultural differences as little more than blunt facts of life, inviting exclusion, repression, or degrees of accommodation. Celebration has never been an issue. In many countries with an ethnic mix, plain, old toleration ("live and let live") is something of a miracle. Consciousness of kind, however minor the criteria for better or worse, is the mortar that binds people into cohesive groups, until education or wider perspectives crack the mold. Such bonds have the functional purpose of promoting social harmony. Should ethnic differences intervene with consciousness of kind, the outcome might be harmless enough, but it can also be disastrous. Who can argue credibly that diversity has been good for Hutu and Tutsi, Albanian and Serbian, Israeli and Palestinian? At the right historical moment, relatively unimpressive differences of tradition, perception, and interest have triggered mutual persecution and slaughter, with no sense of a common humanity that ought to take precedence over narrow tribal or ethnic identities. The historical reality is that differences have seldom been acknowledged and tolerated, much less celebrated.

The contemporary prevalence of ethnic and tribal conflict suggests it is unrealistic, irrational, and dangerous to embrace difference as an absolute good. Unqualified diversity can be as oppressive as unqualified uniformity. A decision about where to draw the line normally occurs in practice rather than theory. Nevertheless, some obvious antinomies come to mind. A patriarchal status system in which rights are gender specific, for example, is incompatible with a gender-neutral system of equality before the law. Where a gulf between competing values is less dramatic and more bridgeable, to want celebration on top of judicious, humane accommodation invites caution. Sharing the same country and history does not prevent nasty conflicts between secular and orthodox Jews in Israel. Irish Protestants and Catholics present the same spectacle of minor differences adding up to serious conflict. Across the globe, there are peoples sharing the same land and history who are eager to kill one another, like Muslims and Hindus in Kashmir, or Muslims and Muslims in Iraq.

Cultural differences can proliferate with no thought whatever of common interests. A spectacular example at hand is in Indonesia, where one tiny island out of some 17,000 (called Alor) has 140,000 people divided into fifty tribes, speaking nearly as many tongues in seven language groups. Agreement on anything touching the common good is understandably difficult. Imagine a school system in Alor trying to be "ethnically sensitive" while also laboring to impart a shared foundation of knowledge, goals, and commitments. The sensible option in modern pluralistic societies is to ask how much and what kinds of difference to accommodate before consensus becomes impossible and the social order devolves into an incoherent sheet of sand. Much the same can be said for multicultural curricula in schools and colleges, which have overwhelmed any sense of standards and coherence in many places. Eagerness to promote and vindicate diversity is usually indifferent to the social unity needed to keep a school system or a curriculum afloat.

Irreconcilable Values

Celebration of diversity in general makes it difficult to stand for anything in particular. Postmodernists, who claim that truth, meaning, and reality fluctuate with the rise and fall of individual and group perspectives and interests, exploit messy historical facts of irreconcilable or warring differences. Allegedly, no impartial viewpoint is available (a caveat that logically must include postmodern doctrines) to judge the adequacy of "stories" or "narratives" by which minds and bodies are connected to the world. A consequence of unlimited pluralism as a higher good is the demolition of shared purpose in a common world that nullifies any plausible idea of universal human rights. When a bill of particulars is requested, inclusive diversity clashes with familiar notions of impartial justice, fairness, compassion, and rationality. An appeal to human rights assumes the existence of needs and interests embedded in a human identity that takes precedence over lesser identities defined by narrow categories of race, class, gender, and ethnicity.

The world is and always has been a playground for incompatible, mutually hostile value systems. Mormons would be practicing polygamy openly as part of their religion if they had not emerged as a minority in a monogamous society. The Taliban in Afghanistan was persuaded by religious conviction to dispatch adulterous women by stoning them to death. In Morocco and Iran, a Muslim who converts to another faith is severely punished. There are still societies in the Middle East, Asia, and Africa that accept and practice human bondage, the favorite commodities being women and children. How are such cultural practices to be evaluated simultaneously from the perspective of human rights and the blanket imperatives of diversity?

Should an ethnic attachment to astrology be included as a legitimate discipline in college curricula because politicians and bureaucrats in India submit decisions bearing on public issues to readings of the stars? Should tribal shamans be licensed to practice "alternative" medicine? In postmodern jargon, is not one scientific or medical "narrative" as good as another? Japanese identity is still defined markedly by a code of duty and obligation to a group, and the Japanese are notably ethnocentric, acutely aware that someone is a foreigner (gaijin). The Western preference, traceable to the eighteenth century's Enlightenment, is for liberty and expression of the individual in a spirit of tolerant cosmopolitanism. Are both options to be celebrated equally without a murmur of skepticism? Lewis Mumford argued that all cultures and civilizations can be judged by a simple criterion—to what extent are autonomy and unimpeded development of the person respected and nurtured? In other words, do beliefs and practices of a culture enhance possibilities of human life or diminish them? If such a standard resonates, where does that leave Islam with regard to the status of women?

Learning about the "Other"

Ideological pluralists assume that education is the royal path to happy as well as peaceful diversity. Let us better understand other peoples and cultures, goes the argument, so that non-Western "voices" are heard impartially and all identities surface to stand as equals. In secondary and higher education, this belief has been codified as doctrine, but is loaded with impediments. Any teacher wanting a curriculum to mirror diversity faces an insuperable and value-laden problem of selection. Some five thousand ethnic groups are scattered across some two hundred nations, and America hosts around ninety ethnic enclaves. In educational venues, which are to be represented or left out, since no one has time or knowledge to include all of them? If differences are equal, what is the principle of choice? Is not choosing one over the other arbitrary bias and discrimination? And where in a course of study does one find relief from ethnic exposition and celebration to address priorities like reading and writing, geography and mathematics, history and science?

The root fallacy is to think that mere exposure to unfamiliar cultural traditions will promote sympathy and understanding. It may or may not, depending on depth of exposure, a recipient's aptitude, and what ends up being understood and assimilated. Toleration in the ethnic domain is not an inevitable result of understanding. Really knowing the ways and thoughts of non-Western cultures, as opposed to brushing against sanitized versions of them, may have the opposite effect and stimulate dislike. Whatever an individual takes away from cursory reading, group confessionals, show-and-tell sessions, or field trips is likely to be shallow, ephemeral, and misleading. Even if a modicum of interest results from selective exposure, it does not follow that understanding has been achieved.

Every cultural tradition has a grim, murky side that discourages celebration. Censorship of the bad stuff to avoid offending someone invites deficient understanding and later disillusionment. The more some of us understand the social basis for the widespread African practice of vaginal mutilation of young girls to protect their virtue, the more we dislike and oppose it. The Muslim practice of secluding and controlling women is hard to tolerate much less celebrate. Some teachers indirectly praise Aztec culture because it was "victimized" by predatory Europeans, but conveniently ignore brutal Aztec imperialism in old Mexico. An appreciation of Aztec temple architecture is shallow without an understanding that thousands of people had their hearts cut out at the top in religious ceremonies that culminated with bodies tumbling down the steep steps. Full understanding of what the structures were used for makes appreciation more difficult and ambiguous. Indeed, a full understanding that Aztec practices centered on propitiation of bloody, improbable deities might well induce disgust and alienation.

There is a price for understanding other cultural traditions: investment of time and effort, an immersion in chores of hard study that contemporary students resent and evade. Submission to historical settings and absorption in difficult texts are unavoidable conditions for real understanding. The imperfect but attainable attitude of suspended judgment supported by deep knowledge is sine qua non. Alien terms must be mastered. Islam cannot be understood without the Qur'an or Hinduism without the Vedas, nor can words like Sharia, jihad, Shi'ite, karma, Shaivite, and puja be ignored. Convictions and teachings in either case are not grapes plucked effortlessly from a vine, and complications abound. Quite apart from barriers to understanding ways of thinking associated with Confucian China or Buddhist Thailand, life and ideas in the West can be as mystifying in some historical periods as anything gleaned from the anthropologist's notebook. Medieval scholasticism and Renaissance kabbalism, at least in my experience, mystify and befuddle students as much as Hindu Vedanta or the Daoist yin and yang.

Humanity versus Ethnicity

If the idea of universal human rights is taken seriously, then an excess of conflicting social and cultural differences become impediments to their realization. The sociologist Karl Mannheim observed long ago that no society could expect a shared system of coherent values without a process for their creation, dissemination, reconciliation, and assimilation. Such a process in American institutional life, particularly higher education, has been notably weak; it has also been rejected outright by an assortment of ideologues in the past quarter century. The task for American democracy is to secure and sustain an accommodation between diversity and the shared beliefs and commitments that define a society and a nation. If a tradition of universal human rights transcends local ethnic traditions, ethnic diversity without limits or interference will have to yield.

A plausible aim of enlightened education is to lift people above their parochial roots to a larger view of the world, to help them transcend limitations of birth and upbringing, to hasten their liberation from shuttered windows of race, class, gender, and ethnicity. Diversity ideology supplies instead a melancholy determination of schools, scholarship, and public rhetoric to herd people more deeply into a cul-de-sac of glorified particularity. Another aim of good education is critical thinking, which was once a distinguishing mark of the academy. The fate of that ideal is ironic and bizarre. On the one hand, criticism has become a form of intellectual suicide in which "theories" like deconstruction and social construction set out to level everything in sight and end with self-immolation on their own dead-end premises. On the other hand, academic multiculturalists insist that questioning beliefs, values, and practices of the "other," whatever they may be, much less rejecting them, is "insensitive" and "intrusive." Criticism is tantamount to intolerance.

A rare spirit of criticism was codified in the Enlightenment, which flourished only once before in the Greek world, from the sixth to the fourth centuries b.c.e. Its goals were to expose errors and make way for unexpected truths. In our postmodern euphoria, radical pluralists deny the existence of truths that make us free (while, of course, claiming or implying that a truth has been enunciated). Truth is rejected as a form of bondage, because it implies whatever may be true is true for all. The present surrogate for truth is diversity. It is sad that a quintessentially European ideal like diversity is in conflict with the ideal of criticism, which requires argument and evidence to support any belief. On Enlightenment premises, all views and ways of life cannot be admitted as equals. No belief or practice, however sacred or wedded to group or individual self-esteem, is immune from examination. It is intellectual and moral cowardice to refrain from responsible criticism on the ground that offense may be taken. The discomfort of being offended is a consequence of living in a complex world while holding questionable or unsupportable beliefs. It is also a risk associated with getting a decent education and growing up (no more Tooth Fairy). It is an inevitable consequence of encountering points of view and ways of life that cannot or do not want to be reconciled. Indiscriminate "celebration" is incompatible with critical thinking.

No matter how one cuts it, tension and conflict will surface when incompatible value systems confront issues of belief and action in public life. The best hope lies in selective accommodation of differences guided by modest expectations anchored to a core of shared convictions sheltered by common sense and open to criticism. These realities about diversity are widely evaded and denied in higher education, where a dreary scene of identity seeking is being played out in exclusive, solipsistic groups, each claiming a unique version of meaning, truth, and reality, each contributing to an impenetrable social babble, all of it stoutly defended by uncritically idealistic academics—but also by campus zealots and block wardens on the lookout for heresy.

An uninformed, unsuspecting student body, awash in diversity rhetoric and pedagogy, maneuvered by solemn, earnest action plans shaped by diversity ideologues, might be led to think that ethnic violence and hatred, alive and readily visible around the world, has nothing to do with ethnicity and its inherent premise of exclusiveness.

Kenneth Stunkel is a professor of history at Monmouth University in New Jersey.
