Showing posts with label Margaret Heffernan. Show all posts

Wednesday, August 31, 2011

You have to open your own eyes

I'm not here to convince anyone of anything, nor am I here to motivate others. I can't change your mind and I can't motivate you.

Only you can do either.

We don't resist change - but we do resist being changed, and we can't motivate anyone but ourselves.

The best anyone can do is create an environment where others will motivate themselves to change and improve.

This can be incredibly frustrating.

But wait. It gets worse.

In their book Mistakes Were Made (but not by me), Carol Tavris and Elliot Aronson explain a little thing called confirmation bias:
"So powerful is the need for consonance that when people are forced to look at disconfirming evidence they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief."
Why is this? Why are we so set on defending what we think we already know to be true even if it means ignoring new evidence that would allow us to improve? Why is it that we are so much more likely to seek out others who will support our current opinions at the expense of those who would challenge us and help us grow?

Why is it that conventional wisdom is shaped by so many urban myths about parenting and educating children?

I believe we have a vision problem -- both literally and metaphorically.

We cannot fix a problem that we refuse to acknowledge. We know that confronting a problem is the only way to resolve it, but any real resolution will disrupt the status quo, and disturbing the momentum of the status quo is a great way to get labelled a "troublemaker".

Paradoxically, being a "troublemaker" may be the most effective way to get fired, but it's also the surest way to differentiate between "doing things right" and "doing the right things".

In her book Willful Blindness, Margaret Heffernan explains:
In business circles, this is known as the "status quo trap": the preference for everything to stay the same. The gravitational pull of the status quo is strong - it feels easier and less risky, and it requires less mental and emotional energy, to "leave well enough alone." Nobody likes change because the status quo feels safer, it's familiar, we're used to it. Change feels like redirecting the riverbed: effortful and risky. It's so much easier to imagine that what we don't know won't hurt us.
The slippery slope of the status quo is fueled by silence, which Heffernan calls the language of inertia. In an ever-changing world, inertia won't just keep things the same; it will guarantee things get worse. It's like trying to stand on top of a rolling yoga ball: sure, we might find the right balance between left and right, up and down, but if we want to remain on the ball, we're going to need to constantly adjust and readjust.

What works one moment won't necessarily work the next.

Business guru Jim Collins warns, "If you notice marked decline in the quality of debate and dialogue around your workplace, things are on the decline." This kind of work environment favours compliance at the cost of engagement. It's as if nothing is wrong, yet everything is wrong. We are snookered into believing that the absence of conflict is the equivalent of happiness, and so there may be plenty of polite conversation but nothing meaningful.

It's not unheard of for widespread knowledge and widespread blindness to coexist. There's a reason why some scandals or problems are known by all, yet no one will admit to them. If an entire society, institution or company is built on denial because self-preservation, survival or advancement demands blindness to the truth, disaster is imminent.

This is what Heffernan calls the paradox of blindness: We think turning a blind eye to the truth will make us safe even as it puts us in danger. As long as the work or learning environment convinces us that it is safer to say and do nothing, injustices can and will likely continue. There is a real danger in having a fixed view of the world and not being open to evidence that you're wrong until it is too late. Ironically, some of the most educated professionals can end up the most blind because they come to see their expertise as definitive.

So how can we best counter the harmful effects of confirmation bias and willful blindness?

There's no one answer to such a complex question, but the first step might be understanding that we all have blind spots.

No one is exempt.

Like the best drivers, the most successful people navigate the hustle and bustle of their daily lives knowing that they have blind spots -- things that they just cannot or will not see. Tavris and Aronson explain:
Drivers cannot avoid having blind spots in their field of vision, but good drivers are aware of them; they know they had better be careful backing up and changing lanes if they don't want to crash into fire hydrants and other cars. Our innate biases are, as two legal scholars put it, "like optical illusions in two important respects - they lead us to wrong conclusions from data, and their apparent rightness persists even when we have been shown the trick." We cannot avoid psychological blind spots, but if we are unaware of them we may become unwittingly reckless, crossing ethical lines and making foolish decisions. Introspection alone will not help our vision, because it will simply confirm our self-justifying beliefs that we, personally, cannot be coopted or corrupted, and that our dislikes or hatreds of other groups are not irrational but reasoned and legitimate. Blind spots enhance our pride and activate our prejudices.
We are all better off when we are willing to catch ourselves sacrificing truth in the service of self-justification, but to do this we have to stop believing that we are above bias. Because we all have a personal or professional interest in how the things we do turn out, objectivity is a myth. The most successful people understand that just because they can't see something doesn't mean it doesn't exist, which is precisely why the best leaders challenge themselves to never mandate optimism, to openly and actively seek dissent, and to continually surround themselves with trusted naysayers. All of this is in an effort to reduce their authority and disrupt groupthink.

If you aspire to this kind of profound leadership and acute awareness, you have to understand that the best anyone can do is tap you on the shoulder. You have to open your own eyes and choose to see.

Sunday, June 26, 2011

The inconvenience of cognitive dissonance

Do you remember believing that the moon only came out at night? If so, do you remember the day that you saw the moon during the day?

If you can relate to this experience, you have an understanding of what cognitive dissonance feels like. In her book Willful Blindness, Margaret Heffernan describes cognitive dissonance as:
The mental turmoil that is evoked when the mind tries to hold two entirely incompatible views.
Believing that the moon only comes out at night and then seeing the moon during the day are incompatible - this is cognitive dissonance - and for a species that spends so much of its time making meaning of the world we live in, cognitive dissonance is, to say the least, inconvenient.

However, rather than seeing cognitive dissonance as a crisis to be avoided, the most successful people in the world embrace it as a remarkable opportunity. They see it as a fork in the road where they can choose to continue down the comfortable path of the status quo, or take a turn down a new, unfamiliar road.

This is exactly how trailblazing starts. There may be no other way to engage in real improvement and authentic innovation.

For educators and parents, it is time to engage in some cognitive dissonance. Traditional education has been built on a number of assumptions that we stopped questioning a long time ago. If we are prepared to make school a better place for our kids, we need to stop and reflect on some of these assumptions, which means we might have to make school look a lot less like school.

I think it's safe to say that the following items are mainstays of traditional schooling:
  • homework
  • praise or positive reinforcement
  • grading or marking
  • final exams or exit exams
  • punishment
What if:
  • the biggest problem with the homework that didn't get done was that it was assigned in the first place?
  • praise actually discourages children?
  • children not only learn less but they choose to learn less because of the grade assigned by the teacher?
  • exit exams have actually proven to reduce graduation rates?
  • punishment and discipline are the problem, not the solution?
If any of these What Ifs make you angry or if you have a sudden desire to kill this page and move on with your day, that's ok. You've just experienced the inconvenient feeling of cognitive dissonance.

The trick isn't to avoid the feeling. To be honest, because we are human, I don't think avoiding the feeling is within the realm of possibility. The trick is not to succumb to it: remain aware that you are uncomfortable while at the same time accepting the challenge that your longtime beliefs may be wrong.


Mara Sapon-Shevin writes in her book Widening the Circle:
Courage is what it takes when we leave behind something we know well and embrace (even tentatively) something unknown or frightening. Courage is what we need when we decide to do things differently... Courage is recognizing that things familiar are not necessarily right or inevitable. We mustn't mistake what is comfortable with what is good.  
After all, can you imagine how much worse off we would all be had we all chosen to turn our backs on that moon during broad daylight?

Sure, we might salvage our comfort in remaining right (at least in our own minds), but this comfort comes at an alarming cost: to remain comfortable, we have, by definition, to remain ignorant.

And there may be only one thing worse than ignorance: willful ignorance.

Monday, June 20, 2011

Finding what we look for

In their book The Myths of Standardized Tests, Phillip Harris, Bruce Smith and Joan Harris tell this story:

"What are you doing?" a helpful passerby asks.
"Looking for my car keys," answers the drunk.
"Did you drop them somewhere around here?"
"I don't think so," replies the drunk.
"Then why look here?" the puzzled would-be helper wonders.
"It's the only place where there's any light."
What we find is largely dependent on where we look. The more we tighten our focus on highly prescribed curriculums enforced by test-and-punish standardized exams, the more we miss. Ironically, an intense focus requires a kind of tunnel vision that blinds us to the wider consequences of our decisions.

Here's what I mean:

Before you read further, you might want to try out this selective attention experiment.



One of the designers of the experiment, Dr. Daniel Simons, explains what he's learned from conducting this experiment around the world:

We experience far less of our visual world than we think we do. We feel like we are going to take in what's around us. But we don't. We pay attention to what we are told to attend to, or what we're looking for, or what we already know. Top-down factors play a big role. Fashion designers will notice clothes. Engineers will notice mechanics. But what we see is amazingly limited.
In her book Willful Blindness, Margaret Heffernan puts it this way:

We see what we expect to see, what we're looking for. And we can't see all that much. 

And when Heffernan asked Simons whether some people see more than others, here was his response:

There is really limited evidence for that. People who are experienced basketball players are slightly better at seeing what's happening in the video - but that's probably because they're more accustomed to watching passes; it isn't so hard for them to read what's going on. You can train yourself to focus on more than one spot. You might improve your eye muscles somewhat. But the limits are pretty fixed. There's a physical and an evolutionary barrier. You can't change the limits of your mind.
The point to be taken here for educators is that our attention and vision are biologically limited, and the more time and effort we spend collecting and analyzing test scores, the less time and effort we can spend looking at the things that are never found on tests, like creativity, perseverance, empathy, resourcefulness and work ethic. In life, there's always too much data. The trick is knowing which to collect and which to let go. The same is true of learning. And unfortunately, today's accountability regimes are encouraging educators to become slaves to the wrong sort of data.

Here's Simons:

For the human brain attention is a zero-sum game: If we pay more attention to one place, object, or event, we necessarily pay less attention to others.
The more we focus on data-driven decisions based on measurable outcomes, the less we attend to educating the whole child. That might look something like this:
"What are you doing?" a helpful passerby asks.
"Looking for learning," answers the teacher.
"Is there learning in that test?"
"I'm not sure," replies the teacher.
"Then why look here?" the puzzled would-be helper wonders.
"This is the easiest place to look."