
Wednesday, December 21, 2011

Testing and teaching are at odds

I will look at any additional evidence to confirm the opinion to which I have already come.
 --Lord Molson, British politician (1903-1991)

The Washington Post's On Leadership section ran a piece on July 18 and 19, 2011, featuring responses to the Atlanta cheating scandal from Dan Ariely, Arne Duncan, Howard Gardner and Steven Pearlstein.

Despite the mounting evidence against high-stakes, standardized testing, Arne Duncan and many other advocates of test-and-punish accountability want to stay the course with their "despite cheating scandals, testing and teaching are not at odds" mantra. So powerful is Duncan's need for consonance that his reaction to disconfirming evidence is to criticize, distort and dismiss cheating as the sole responsibility of the individuals who did the cheating.

In other words, Duncan is saying "mistakes were made, but not by me." This mental jockeying is known as confirmation bias, and at the moment Duncan is its poster boy.

False dichotomies make choosing easy. Duncan frames his argument very carefully -- either you are for accountability (with him) or you are against accountability (against him). But in reality, the situation is far from this simple.

In her book Willful Blindness, Margaret Heffernan writes about "the ostrich instruction":
We all recognize the human desire at times to prefer ignorance to knowledge, and to deal with conflict and change by imagining it out of existence... In burying our heads in the sand, we are trying to pretend the threat doesn't exist and that we don't have to change... A preference for the status quo, combined with an aversion to conflict, compels us to turn a blind eye to problems and conflict we just don't want to deal with.
Sometimes it's the leaders with the most power and responsibility who are the most blind, because they believe they know what they are doing -- or feel like they have to look like they know what they are doing.

In their book Mistakes Were Made (but not by me), Carol Tavris and Elliot Aronson write:
In a study of people who were being monitored by magnetic resonance imaging (MRI) while they were trying to process dissonant or consonant information about George Bush or John Kerry, Drew Westen and his colleagues found that the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain lit up happily when consonance was restored. These mechanisms provide a neurological basis for the observations that once our minds are made up, it is hard to change them. 
Indeed, even reading information that goes against your point of view can make you all the more convinced you are right.
In light of this, it's not surprising that when test-and-punish accountability supporters like Arne Duncan are faced with evidence that cheating is an inevitable and inherent characteristic of high-stakes testing, they simply turn to discrediting the facts and become even more committed to their own argument. To be fair to Duncan, this behavior is as predictable as it is unfortunate, especially if he believes staying the course is his only option. Because people become more certain they are right about a decision they cannot undo, nothing is more dangerous than an idea when it's the only one you have.

I'm reminded of what Edward de Bono meant when he asked:
If you never change your mind, why have one?
This is precisely why we need to listen to people like Bob Schaeffer from Fairtest who say:
The failure of NCLB and its state-level clones cannot be reversed by “staying the course,” “raising the bar” or any of the other faith-based notions frequently invoked by high-stakes testing true-believers.

Wednesday, August 31, 2011

You have to open your own eyes

I'm not here to convince anyone of anything, nor am I here to motivate others. I can't change your mind and I can't motivate you.

Only you can do either.

We don't resist change, but we do resist being changed, and we can't motivate anyone but ourselves.

The best anyone can do is create an environment where others will motivate themselves to change and improve.

This can be incredibly frustrating.

But wait. It gets worse.

In their book Mistakes Were Made (but not by me), Carol Tavris and Elliot Aronson explain a little thing called confirmation bias:
"So powerful is the need for consonance that when people are forced to look at disconfirming evidence they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief."
Why is this? Why are we so set on defending what we think we already know to be true even if it means ignoring new evidence that would allow us to improve? Why is it that we are so much more likely to seek out others who will support our current opinions at the expense of those who would challenge us and help us grow?

Why is it that conventional wisdom is shaped by so many urban myths about parenting and educating children?

I believe we have a vision problem -- both literally and metaphorically.

We cannot fix a problem that we refuse to acknowledge. We know that confronting a problem is the only way to resolve it, but any real resolution will disrupt the status quo, and disturbing the momentum of the status quo is a great way to get labelled a "troublemaker".

Paradoxically, being a "troublemaker" may be the most effective way of getting fired, but it's also the surest way of differentiating between "doing things right" and "doing the right things".

In her book Willful Blindness, Margaret Heffernan explains:
In business circles, this is known as the "status quo trap": the preference for everything to stay the same. The gravitational pull of the status quo is strong - it feels easier and less risky, and it requires less mental and emotional energy, to "leave well enough alone." Nobody likes change because the status quo feels safer, it's familiar, we're used to it. Change feels like redirecting the riverbed: effortful and risky. It's so much easier to imagine that what we don't know won't hurt us.
The slippery slope of the status quo is fueled by silence, which Heffernan calls the language of inertia. In an ever-changing world, inertia won't just keep things the same; it will guarantee that things get worse. It's like trying to stand on top of a rolling yoga ball: sure, we might find the right balance between left and right, up and down, but if we want to remain on the ball, we're going to need to constantly adjust and readjust.

What works one moment won't necessarily work the next.

Business guru Jim Collins warns, "If you notice marked decline in the quality of debate and dialogue around your workplace, things are on the decline." This kind of work environment favours compliance at the cost of engagement. It's as if nothing is wrong, yet everything is wrong. We are snookered into believing that the absence of conflict is the equivalent of happiness, and so there may be plenty of polite conversation but nothing meaningful gets said.

It's not unheard of for widespread knowledge and widespread blindness to coexist. There's a reason why some scandals or problems are known by all, yet no one will admit to them. If an entire society, institution or company is built on denial because self-preservation, survival or advancement demands blindness to the truth, disaster is imminent.

This is what Heffernan calls the paradox of blindness: We think turning a blind eye to the truth will make us safe even as it puts us in danger. As long as the work or learning environment convinces us that it is safer to say and do nothing, injustices can and will likely continue. There is a real danger in having a fixed view of the world and not being open to evidence that you're wrong until it is too late. Ironically, some of the most educated professionals can end up the most blind because they come to see their expertise as definitive.

So how can we best counter the harmful effects of confirmation bias and willful blindness?

There's no one answer to such a complex question, but the first step might be understanding that we all have blindspots.

No one is exempt.

Like the best drivers, the most successful people navigate through the hustle and bustle of their daily lives knowing that they have blindspots -- things that they just cannot or will not see. Tavris and Aronson explain:
Drivers cannot avoid having blind spots in their field of vision, but good drivers are aware of them; they know they had better be careful backing up and changing lanes if they don't want to crash into fire hydrants and other cars. Our innate biases are, as two legal scholars put it, "like optical illusions in two important respects - they lead us to wrong conclusions from data, and their apparent rightness persists even when we have been shown the trick." We cannot avoid psychological blind spots, but if we are unaware of them we may become unwittingly reckless, crossing ethical lines and making foolish decisions. Introspection alone will not help our vision, because it will simply confirm our self-justifying beliefs that we, personally, cannot be coopted or corrupted, and that our dislikes or hatreds of other groups are not irrational but reasoned and legitimate. Blind spots enhance our pride and activate our prejudices.
We are all better off when we are willing to catch ourselves sacrificing truth in the service of self-justification, but to do this we have to stop believing that we are above bias. Because we all have a personal or professional interest in how the things we do turn out, objectivity is a myth. The most successful people understand that just because they can't see something doesn't mean it doesn't exist, which is precisely why the best leaders challenge themselves to never mandate optimism, to openly and actively seek dissent, and to continually surround themselves with trusted naysayers -- all in an effort to reduce their own authority and disrupt groupthink.

If you aspire to this kind of profound leadership and acute awareness, you have to understand that the best anyone can do is tap you on the shoulder. You have to open your own eyes and choose to see.

Thursday, February 10, 2011

Trusted naysayers

I will look at any additional evidence to confirm the opinion to which I have already come.
 --Lord Molson, British politician (1903-1991)

I blog because I care.

I blog because I want to share.

I know too many people who wish I would just shut up.

I know too many people who kind of agree with me but think I'm foolish for saying it out loud.

I know too many people who have implicitly or explicitly threatened me (and others like me), warning us to "be careful" with our blogs.

These people are products of a system built on a culture of compliance. The system's use of carrots and sticks has most people drunk on incentives or scared shitless, so not much can change or improve.

When someone speaks up, they are predictably seen as someone who is up to no good. But if you understand how self-justification works, it should be no surprise that those who are comfortable with the way things are become angered by those who wish to influence change.

Carol Tavris and Elliot Aronson discuss the effects of self-justification in their book Mistakes Were Made (but not by me):

Self-justification has costs and benefits. By itself, it's not necessarily a bad thing. It lets us sleep at night. Without it, we would prolong the awful pangs of embarrassment. We would torture ourselves with regret over the road not taken or over how badly we navigated the road we did take. We would agonize in the aftermath of almost every decision: Did we do the right thing, marry the right person, buy the right house, choose the best car, enter the right career? Yet mindless self-justification, like quicksand, can draw us deeper into disaster. It blocks our ability to even see our errors, let alone correct them. It distorts reality, keeping us from getting all the information we need and assessing issues clearly. It prolongs and widens rifts between lovers, friends, and nations. It keeps us from letting go of unhealthy habits. It permits the guilty to avoid taking responsibility for their deeds. And it keeps many professionals from changing outdated attitudes and procedures that can be harmful to the public.
None of us can live without making blunders. But we do have the ability to say: "This is not working out here. This is not making sense." To err is human, but humans then have a choice between covering up or fessing up. The choice we make is crucial to what we do next. We are forever being told we should learn from our mistakes, but how can we learn unless we first admit that we made any? To do that, we have to recognize the siren song of self-justification.
If to err is human, and systems are made by humans, then systems are built on an edifice of errors. But you wouldn't know it based on how systems systematically deny their mortality.

The education system throws around the phrase life-long learning a lot. But I don't see us walking the talk. Too often, life-long learning is something for the kids to do while the adults watch from the sidelines - as if we've already crossed the learning finish line and no longer need to participate.

Adults who subscribe to learning as a linear race to the finish line are no different than a commissioner of the patent and trademark office resigning because "everything that can be invented has been invented." If we are not careful, blind self-justification can mislead us to believe that the here and now is as good as it gets.

Tavris and Aronson explain:

...the most harmful consequences of self-justification: how it exacerbates prejudice and corruption, distorts memory, turns professional confidence into arrogance, creates and perpetuates injustice, warps love, and generates feuds and rifts.
While it is true that understanding the costs and benefits of self-justification may be the first step to finding solutions that will lead to change and improvement, it is likely just as important to act on Tavris and Aronson's advice:

We need a few trusted naysayers in our lives, critics who are willing to puncture our protective bubble of self-justifications and yank us back to reality if we veer too far off. This is especially important for people in positions of power.

If those in power are the shepherds who expect everyone else to simply fall in line like good little sheep, problems are as inevitable as they are predictable.

Which is exactly why we must tread softly when we are fortunate enough to travel in a flock with a handful of colleagues who are willing to stand up and challenge the things we've always done and those who have always done them.