
Speculations on the Culture of Fear


I wondered aloud, in my last post, how we could have gotten to the point where, in direct contradiction of some very clear, very scary, not-at-all-obscure-or-complex evidence of the consequences of our actions, we have become so terrified of risk that we are actually killing our kids, after making them miserable. All in the name of love.

This is not an easy question to answer. Certainly, it won’t help to oversimplify, as Tim Gill points out in this little diagram, from his website ‘Rethinking Childhood’:




This is a tempting shorthand for what’s been going on in kids’ lives and in the minds of parents, culminating in what I believe is a real crisis in kids’ physical and mental health. But a more thoughtful approach looks for the roots of problems, and is not distracted or satisfied by proximal causes. Here’s Gill’s proposed ‘rethinking’ of the problem:

Of course, as he admits, there’s more to it even than that. Where do all these gadgets come from? Whence all the traffic? Why are parents working such long hours? And are these fears, in fact, well grounded? These questions need answers, and I’ll try to provide a few, in a minute.

But in the meantime, parental anxiety has been identified over and over again as the most proximal cause of the inactivity of our kids. Why has parental worry seemingly exploded since the days of our pragmatic, capable, depression-era parents or grandparents? A lot of the answers to this question actually intersect significantly with the answers to Tim Gill’s questions, above.

Margaret K. Nelson, author of ‘Parenting Out of Control’, looks at the phenomenon of parental anxiety and its consequences, both to anxious parents themselves, and of course to their offspring. In an impressive two paragraphs near the beginning of the book, she summarises some of the prevailing theories for its genesis, and then provides additional research of her own, which expands on those theories. Briefly (with shorthand names I’ve given them), some of them are:

The “Culture of Fear” argument: Due to media exaggerations and obsessions with violence, terrorism, and sexual predation, parents are hyper-aware of potential dangers that lurk in what they perceive as an increasingly violent, risky world.

The “Only Child” argument: Because parents (partly due to increasing urbanisation) are frequently having only one child, they never gain the perspective that comes with experience, and remain anxious “new parents” the whole time they raise their first – and only – child. This is so common that in families with multiple children, the eldest often ends up bearing some resentment toward the younger siblings, who, because their parents were more relaxed and experienced by the time they arrived, grow up with fewer anxiety-induced restrictions on their activities, and often receive privileges at earlier ages than the first-born did.

The “Little Emperor” argument: Related to the Only Child, this argument suggests that parents’ adulation of, and anxiety for, their offspring is exaggerated to unhealthy levels because of the uniqueness of a child without siblings. My name for the argument is taken from the “one child” policy of China, which has been popularly blamed for creating a generation of ‘little emperors’ – spoiled children who are treasured by their parents because the state forbids their having siblings.

The “Erosion of Adult Solidarity” argument: As our society becomes more and more individualistic, we have transferred the burden of rearing a child to the sphere of the family – or in many cases, to single individuals – whereas in more cohesive cultures around the world, this job is seen as the collective responsibility of the whole community.
The “Risk Society” arguments (broken down into sub-categories):

The “Amnesiac” argument: Anthony Giddens and others suggest that the erosion of any strong cultural or historical link to the past creates an overemphasis on, and anxiety for, the future, including notions of safety.

The “Master of My Fate” argument: As danger is redefined, from “fate”, or “chance”, to “manageable risk”, the emphasis is placed squarely on personal responsibility. The idea that the natural world is complex and fundamentally unmanageable is replaced with legal notions of ‘due diligence’ that retroactively assign blame whenever an event takes place that is deemed, in retrospect, to have been avoidable.

The “No Social Net” argument: As governments increasingly retreat from what many see as their fundamental duty to provide for their citizens, more and more of the responsibility for the health and safety of kids is placed on individual parents.

To these, Nelson, based on her research, adds some nuance in the form of:

The “Social Class” argument: Having noticed significant differences in the ways intensive parenting manifests itself between the lower and “professional-middle” classes, Nelson suggests that ideas about the future financial security of their offspring motivate parents of different classes to ‘helicopter’ in different ways. What is constant, though, is the basic assumption of an uncertain economic future, which the Boomer generation did not share, and the desire of parents to see their kids replicate or exceed their own social class, which is no longer seen as guaranteed.

But there’s more! Lenore Skenazy, in her book ‘Free Range Kids’, suggests a number of other reasons, namely:

The “Opportunistic Vendor” argument: Recognising the immense opportunity for lucrative business in pandering to the health- and security-obsessed, vendors have created whole markets that sell ‘solutions’ to so-called problems that would have been laughable even a generation ago (baby knee-pads, anyone? How about tracking devices for your teenagers? You get the idea.)

The “Know-it-all Expert” argument: Related to the one above, this argument questions the rise of the “one size fits all” brand of so-called “Parenting Experts”, whose primary function seems to be to sell books and magazines telling parents what they are doing wrong and how it will permanently damage their children. They make their money by claiming that there is a ‘right’ way to raise a child, and that only they have the secret – which they will impart to you for a price!

The “Social Pressure” argument: Caught in a media firestorm when she allowed her son to take the NYC subway on his own, and dubbed “America’s Worst Mom”, Skenazy certainly felt the pressure to conform to the new social norms. Luckily, she educated herself about the origins and viability of those norms, and stood her ground. Many parents succumb. My own sister was upbraided by a stranger in a car-park outside a bank, where she had briefly left her 10-year-old daughter in charge of her toddler-aged brother in order to run in and make a deposit. She wasn’t gone more than a few minutes, and though the day was warm, my niece was perfectly capable of opening a window at age ten! The stranger actually called the police, having apparently read one of the many media stories of tragedy involving infants or dogs left in hot cars, and being unable to distinguish the two situations. Though my sister is still adamant that she did nothing wrong, the experience was unpleasant enough that she has never repeated it. The intense pressure that mothers face from Nosey Parkers and busybodies is real – nobody wants to be called a Bad Parent. Especially when – as is increasingly the case – being branded one is likely to bring you under the cruel scrutiny of the law, which brings me to the next point:

The “Legal Pressure” argument: Laws are often reflexions of social norms, and when those social norms become bugshit crazy, so often do the laws. The ‘Free Range Kids’ blog is full of anecdotes about draconian, blinkered applications of stupid laws that have profound negative effects on the lives of parents who are trying to buck the trends and raise their kids as sanely as they know how. One Florida lawyer actually presents convincing arguments that many parents unjustly accused of negligence in the U.S. actually cannot even get fair trials anymore, because the public dialogue has been so severely compromised on the subject of child safety that jury members and even judges cannot make rational decisions on the subject in this culture of fear.

The “Lousy Judge” argument: Our brains, as Skenazy and others like Dan Gardner point out, are just phenomenally, evolutionarily predisposed to stupidity when it comes to risk assessment. Without education, our brains get the numbers wrong every time. Of course, the lack of sensibly-presented data in the media doesn’t help. See my previous post for a deeper look at this one.

The “Cultural Shut-Ins” argument: When our lack of interest in the past, combined with a cultural insularity, gives us little knowledge of how other cultures (including our own, in the past) have treated issues of child-rearing, the echo-chamber of our own modern-western-culture-specific worries grows louder and louder, with no parallel experiences to contrast or challenge them. North Americans’ fabled lack of worldliness and knowledge of history combine to make a massive handicap here, aside from just making us insufferable to people of other nations.

To these, I myself might also add two psychological arguments that, while they might not directly cause anxiety, certainly help to explain why it might be augmented under certain circumstances:

The “Self-Efficacy” argument: Related to, but distinct from, self-esteem, self-efficacy is the increased sense of personal confidence and ability to deal with difficult things that comes with…well, doing difficult things. It’s a sense of competence that comes with skill, which in turn comes with experience. The safer we become, the fewer difficulties we encounter, which means that the self-efficacy ‘muscle’ becomes atrophied, and we lose perspective about what constitutes real danger, as well as our ability to cope with simple inconveniences.

The “Crooked Barometer” argument: Related to the self-efficacy argument, this psychological argument suggests that when a high level of risk is either eliminated or otherwise subverted, as in our modern ultra-safe society, the brain has a way of “advancing the queue” of smaller anxieties to fill the space left by the genuine threat, making small worries seem comparatively larger. It promotes, in other words, molehills to the rank of mountains – but only in the absence of real mountains, which would provide perspective.

Have I missed anything? 🙂

Of course, most of these factors are linked to each other, and reinforce each other, making it more and more difficult to have a coherent, calm conversation on the subject at all. But I’d like to try to construct a narrative out of these seemingly disparate proximal causes, in the hopes of stumbling onto something closer to the root of all of them. Here’s the (probably too graphically challenging) flowchart I came up with based on the factors listed above:

I’m kind of impressed with how central the rise of corporate capitalism is in all this, as well as the brand of urbanisation that it encourages. The media, while extremely influential, is mostly just reacting to market forces when it fearmongers to the extent that it does, as well as reflecting and augmenting the elevated levels of societal fear. Dan Gardner refers to this as an “echo chamber” effect. I don’t want to “let the graphic speak for itself” when it’s so obviously complex, but nor do I want to belabour a point here. So perhaps I’ll try to articulate the narrative that this graphic suggests to me in a future post.



Student Protests: Part 2

I’d like to examine some of the assumptions that are made when one looks at the student protests in Québec through an inappropriate cultural lens.  I’ll try to take them one at a time.

1.  Student “Selfishness”

A common meme in English media and culture is that the students are simply being selfish, and unwilling to pay their ‘fair share’ of public education. In English Canada, as I mentioned in my last post, education past secondary school has become a rather corporate experience, paid for by individual consumers at great (and increasing) personal cost. It is largely seen as a means to an economic end, and thousands of students in Ontario and elsewhere end up in debt to the university system out of a feeling of necessity, thinking that a degree is a laissez-passer to a job. The education is not usually an end in itself, and the crippling debt that one incurs in this exercise in credentialism is merely seen as the price one pays for the degree itself. An extremely capitalist attitude, in other words, and one fraught with misconception, as James Côté and others have pointed out.

Under such a cultural understanding of education, the education of an individual student is just that: an opportunity for the personal economic advancement of a single person. As such, it is thought, it should be paid for by that individual, and not out of the common purse. Many commentators have advanced the opinion that “I paid through the nose for my education; why should the students in Québec get a free ride at my expense?” This completely ignores the utterly different notion of the value of education in Québec. In that province, education for the masses, and full participation in industrialised society and economies, was not always a given. They were fought for. Taxes in Québec (personal taxes, that is; corporate taxes in the province are astonishingly low) are among the highest in all of North America as a result of the decision that education (and childcare, for example) benefit society as a whole, and not just individuals. As with health care, the belief is that everyone should be able to access it, regardless of income, and that to be denied it for any reason is a violation of basic rights. We (mostly) subscribe to this argument when it comes to health care in English Canada, but for some reason not in regard to education past high school. The fallacious and extremely conservative claim that education is ‘for’ financial advancement certainly adds to this blind spot; under that model, after all, competition should be the dominant mode, not cooperation. More on that some other time.

So let’s look at the claim that students are being ‘selfish’.

Actually, by any reasonable definition of the term, students are not being selfish. The current student population will not be seriously affected by the hikes. The students who will be affected are currently not yet in high school. Ergo, the students are taking a principled stand for the future of people who are not them. They’re doing this for their future society, not for themselves. In fact, they took a risk with their own academic year, and the money they spent on it.   Hmmmm…..sounds like the OPPOSITE of selfish to me!   In fact, though the Charest government offered them the opportunity to be selfish, they did not take it.  The government offered to delay the implementation of the tuition hikes beyond the period when the current students would be at all affected. (They also simultaneously upped the fee hikes to something like 85% for that future generation of students.)  This offer was rejected on principle.

So, despite repeated demonstrations and statements by students that their protests are not selfishly motivated, that remains the dominant meme in the English media. As for the claim, resulting from this fundamental misunderstanding of the principles involved, that “These students pay less than I did for university; therefore they have no right to speak” – leaving aside for the moment my last point, that their cultural and historical context is utterly different from other Canadians’, AND leaving aside the fundamental error of fact from which such a statement proceeds – let me address the logic of that ‘argument’.

A major flaw is the fact that it represents a race to the bottom.  If your own circumstances are bad, but you can find someone somewhere whose situation is worse, it is not a strong argument to say either a) that your circumstances are therefore comparatively fine, and you have no right to try to better those circumstances, or b) that everybody’s circumstances should be as bad as that other person’s.  Sour grapes, however, are a powerful tool of division.

Let’s say that Person A pays $5 for an apple. Person B goes elsewhere and pays $10 for the same fruit. Person B finds out about Person A’s good luck, and instead of thinking, “Man, I got scammed…I’ll take it up with that unethical apple seller”, he thinks, “That stupid Person A!! I’ll make his life miserable. How dare he get a better deal than me?” And then he goes to Person A’s apple seller, and forces him to raise his prices to $10 as well. Result: everybody loses, except the apple merchants. How can we do this to each other? Is the Canadian ideal to just drag everybody down to the lowest, worst level possible?

The illogic of that position should be obvious, and the only question remains, “Why would anyone think like that?”  The question cui bono? (“who benefits?”) is a useful one.  While we bicker amongst ourselves about $5, the CEOs of Apples, Inc. make $5M bonuses. Divide and conquer.  This, by the way, is a staple of colonial education; if the colonised are fighting each other, they can hardly spare the attention it would take to fight their real oppressors.  As Ngũgĩ wa Thiong’o, a Kenyan Nobel-nominated author and political writer, says in Decolonising the Mind, 

[The colonisation of Kenya]  was effected through the sword and the bullet. But the night of the sword and the bullet was followed by the morning of the chalk and the blackboard. The physical violence of the battlefield was followed by the psychological violence of the classroom.  

[…] Thus one of the most humiliating experiences was to be caught speaking Gikuyu in the vicinity of the school. The culprit was given corporal punishment – three to five strokes of the cane on bare buttocks – or was made to carry a metal plate around the neck with inscriptions such as I AM STUPID or I AM A DONKEY. Sometimes the culprits were fined money they could hardly afford. And how did the teachers catch the culprits? A button was initially given to one pupil who was supposed to hand it over to whoever was caught speaking his mother tongue. Whoever had the button at the end of the day would sing who had given it to him and the ensuing process would bring out all the culprits of the day. Thus children were turned into witch-hunters and in the process were taught the lucrative value of being a traitor to one’s immediate community.

 […]The attitude to English was the exact opposite: any achievement in spoken or written English was highly rewarded. [In the colonial education system, which advanced by qualifying exams,] nobody could pass the exam who failed the English language paper no matter how brilliantly he had done in the other subjects. [. . .] English was the official vehicle and the magic formula to colonial elitism.

Who benefits from this division?  Those who divide, obviously.  Those who wish to undo and negate the advancements made during the Quiet Revolution.  Luckily, just because I already got scammed and paid for my $10 apple, that doesn’t mean my kids have to suffer — by the time they get to the apple cart, it’ll be $30 an apple!  I see it (as the students in Québec see it) as my duty not to let that happen, even if I am not myself going to benefit proximally from low tuition costs.

While we’re on the subject, I might mention the vast social benefits that come from a more educated population. Contrary to the capitalist consumer model, in which the only beneficiaries of education are the students themselves, on an economic level everyone benefits from high levels of good education.

Here are just a few of the big points from a study on the fiscal investment returns of education:

• Parents’ education has strong effects on children. Thus the benefits of higher education accrue over extended periods.
• Higher parental education is associated with greater family investments in children in the form of parental time and expenditures on children.
• Children of more educated parents generally perform better in school and in the labour market, and have better health. A substantial amount of research concludes that education has a causal impact on health.
• Higher parental education is also associated with lower criminal propensities in children, and less child abuse and neglect. Lochner and Moretti (2004) calculate that raising the high school graduation rate by 1% would reduce the costs of crime by approximately $1.4 billion per year in the U.S.

These estimates suggest that the social return to education is similar to the private returns associated with higher lifetime earnings, which are also in the range of 7–10 percent. Evidence suggests that the social returns to education are substantial and justify significant public subsidization of this activity. We’d be saving money in areas like health care and the justice system, in other words: sounds like a good argument against the ‘selfish’ moniker to me.





Are Lockdowns Really Necessary? And Why Can’t We Discuss This?

A little while ago, I posted on the subject of risk assessment in education, and how educators, like politicians, reporters, and all other humans, are just terrible at it.  Among the issues I mentioned in the post was the subject of school shootings.  Here is the relevant paragraph:

To my knowledge, there have only ever been ten acts of gun violence in Canadian schools since 1902.  The total death toll was 26, more than half of which came from a single incident at the École Polytechnique in Montréal.  One came from a school in Alberta where a friend of mine was teaching, eight days after the Columbine case in the U.S.  If you estimate the total number of students in Canadian schools since 1902 (hard to tell:  there are 5.2 million kids in school today, NOT counting universities and colleges; multiply that by 110 years and skim a bunch off for the smaller population in previous generations….you still get several hundreds of millions), and figure those 26 unfortunate people into that number, the chances of dying in a school shooting in Canada are too small for my calculator to measure without an error message.  But every year, we now have to suffer through “Lockdown Drills”, officiated by the police, where we all have to pretend there’s a maniac in the halls.  Time is wasted, kids are frightened, and money is spent for no good cause.  Remember, all violent crime is on the DEcrease, very dramatically.  Polls show that children’s safety at school is the single most common crime-related concern, and yet the school environment is statistically, indisputably, the safest place for kids – much safer than the home or the street.

I argued this at the school I work at, with the result that I got a lot of flak from colleagues who either a) took issue with my questioning lockdown drills, which they regarded uncritically as a clear benefit to our society, or b) took issue with my “wasting time” by thinking critically about policies that affect us all. Out of this somewhat one-sided discussion came a number of interesting points that I thought warranted a separate post. I also emailed Lenore Skenazy, the author of Free Range Kids, and she put the question to the readers of her blog, who repeated a number of the questions and assumptions made by my colleagues – questions that, to my mind, miss the point, since they are framed by the assumption that lockdown drills are the only real response to the threat of gun violence at schools. I take issue with those assumptions, and I’ll try to explain why here.

1.  Likelihood of Danger

 First of all, I’m not convinced that the threat of a school shooting is severe enough to warrant this kind of attention. I totally understand the perceived reason for lockdowns.  I do.  I really get the fear that comes with kids, and guns, and the potential for disaster and loss.  God forbid that anything should happen, as we all say.

 That said, here are some numbers that as far as I can tell are correct:

1. There have been ten incidents of gun violence in Canadian schools over the last 100 years.

2. The total casualty count is 26. More than half of those were at Polytechnique, in 1989 – a date that precedes the dramatic statistical drop in violent incidents that began in the 1990s.

3. The stranger-as-gunman situation is not the norm.

4. There are about 5.6 million Canadian students below the postsecondary level right now.

5. That gives us a pool of tens (hundreds?) of millions of students, a century of recorded time, and 26 casualties, which statistically gives us such a tiny risk that it is treated as zero. “De minimis” is the term.

6. Compare this (for example) to the possibility of dying in a car crash: 1 in 6000, approximately. Driving to school is thousands of times more dangerous – and a real risk – compared with the vanishingly small chance of a shooting in school.
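For anyone who wants to check the back-of-envelope arithmetic in the list above, it can be sketched in a few lines of Python. The exposure-pool figures are the rough assumptions from the list (current enrolment held constant over a century), not official statistics:

```python
# Rough school-shooting risk estimate, using the post's own assumptions:
# ~5.6 million students enrolled at any one time, ~100 years of records,
# and 26 total casualties. Ignores population growth for simplicity.
students_now = 5.6e6
years = 100
deaths = 26

student_years = students_now * years            # crude exposure pool
risk_per_student_year = deaths / student_years  # ~4.6e-08 per student per year

car_crash_risk = 1 / 6000                       # the post's approximate car-crash figure
ratio = car_crash_risk / risk_per_student_year  # how many times riskier the car is

print(f"risk per student-year: {risk_per_student_year:.1e}")
print(f"car crash is roughly {ratio:,.0f} times riskier")
```

On these deliberately crude numbers, the per-student-year risk comes out around 5 in 100 million, and the car-crash comparison lands in the thousands – the ‘de minimis’ territory described above.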

I can say with confidence – far more confidence than I could ever discuss safety from shark attack – that a school shooting will not happen here. There is nothing in my life that I can say for certain will never happen, but this is about as close as it comes. I cannot have that same confidence that students will survive their car ride home. We say that we are preparing for something that might happen: but the list of things that might happen is endless, and we can’t work that way. We have to work with what realistically has a probability of happening. We are actually preparing for something that, statistically, will not happen – while the drills themselves are something we really are doing, and the things we do have real impacts, far more so than things that will not happen.

Even if we do accept that school shootings are rare, people sometimes argue that lockdowns are useful in response to other incidents of violence in schools, such as knives being brandished, etc. One of my colleagues mentioned his experience of such incidents in support of this theory. But again, these incidents are extremely rare and getting rarer. ALL violent incidents are dramatically down, inside schools and outside. I think the availability heuristic might be at work here. Just because we can call an incident to mind does not mean that it is actually more likely to happen.

 People also mention the potential usefulness of a lockdown in the case of a bomb threat.   Has there ever been an incident involving an actual bomb at a school?  I can’t remember hearing of one.  There are plenty of threats; in fact, the school I teach at suffered a rash of them recently once some students found out that our reaction to such a threat (school-wide lockdown or evacuation) was so extreme and disruptive.  If you want to disturb shit, get attention and disrupt classes, what could be better?  It’s the go-to strategy of the sociopaths among our student population.  We’ve had way more fake bomb threats based on the understanding that we will react dramatically than we have had real threats.

Strangers in the building? Recently a stranger came into our school. He entered, went to the bathroom, and left. We went into lockdown, and though our principal calmly soothed fears by telling students over the P.A., once it was over, that the man had done no harm, I asked myself why on earth it would be assumed that a member of the society we all live in would automatically have dastardly intentions. He probably had to pee. Why would we assume the worst, all the time? This says way more about us, in my opinion, than about “strangers”.

 As Mark Twain said, “The trouble with the world is not that people know too little, but that they know so many things that ain’t so.”

 2.  Cost / Benefit Analyses

People have said that the costs of NOT having drills might be very high, whereas the costs of doing them are nil, aside from some time lost. The problem with this typical cost-benefit analysis of a lockdown drill (aside from its not factoring in the real monetary costs of having the police at the school for several hours while more than a hundred staff members sit in the dark, not teaching) is again that we don’t know if it’s true. Do these drills in fact help to reduce costs in terms of human life? What is their cost in terms of quality of life? Where are the studies on this? I understand that we’re mandated to do these drills, but I would really like to question the resources allocated to such things.

In addition, could we think about the perceived potential benefits of performing lockdowns, compared to the real effects of actually doing them? We might think about the possible social repercussions of normalising paranoia (that’s what it is; it’s not a realistic risk), and the anxiety that we produce. I also wonder how many other, more pertinent risks we are ignoring. Why is First Aid not a priority, for example? In the explanations I’ve read for the mandated twice-a-year lockdowns, fear of litigation is the most prominent reason given. What if something happened, and we hadn’t been seen to “do something”? My fear is that our response is the one from “Yes Minister”, in syllogistic format:

 a)  Something must be done.

b)  This is something.

c)  Therefore, this must be done.

 Could time and money be spent on stopping the bullying that some people claim produces school shootings?  Or on building community? Are we reacting emotionally (or politically) to irrational fears —  in other words to symptoms?  How can we stop doing this and get to the root of the issues which (rarely) produce problems?  I don’t have the answers, but it seems to me like we might not be asking very good questions.

 The social costs, on the other hand,  seem to be real.  “Lockdown” is a term that had its origins in jails.  Now it’s common parlance in schools, where we are MORE safe than ever.  Just look at these figures:

                a)  In the U.S., which has ten times the number of students we do (and better statistics; hence my use of data from south of the border), the rate of “serious violent crime” in 2004 was 4 per 1000 students.  That’s down to less than 1/3 of what it was in 1994.

b)  In the U.S., in 1997–1998, at the height of the statistically anomalous spike in violence during the 1990s, the average student had a 0.00006% chance of being murdered at school. That’s roughly 1 in 1,529,412. And the risk has shrunk a lot since then.

c)  Studies that I have read indicate that the kind of lockdown drills that we do, where kids are sitting or lying on the floor, are the least effective and most anxiety-raising.  Aside from the godawful lockdowns that happen in the States where people actually roleplay shooters and bloodied victims, they are the worst.

d)  Remember that the risk of your child being a victim of a school shooting is effectively zero.  But in 1997, a poll showed that 71% of Americans said it was “likely or very likely” that a school shooting would happen in their community.  One month after Columbine, 52% of parents feared for their kids’ safety at school; 5 months later this was unchanged.  Why are we allowing public policy to be decided when the data and people’s fears are so unbalanced?

e)  Significantly, media “feedback loops” continue.  Here at home, do you remember the “anniversary” episodes of both Polytechnique and (for a whole WEEK!) 9/11 on the local news?  Even the normally moderately sane CBC was guilty of this.

f)  Politicians also ought to make it clear that schools are safe.  Instead, they don’t take the political risk of appearing not to take the “problem” seriously.  Once again, schools are safe.

g)  In the U.S., vast amounts of money are spent on metal detectors, police presence, and other invasive security measures that drain money away from books and events at school.  These measures also create an oppressive atmosphere, which is unnecessary given that less than 6% of students are reported to carry weapons of any kind (even pen knives) to school.  Fewer still would ever use them violently.

h)  The adoption of “zero tolerance” policies towards violence was actually found to INCREASE bad behaviour and dropout rates.  The APA called for them to be dropped.

i)  Studies also show that schools operate best when they are connected to the community in strong ways.  Treating all strangers as homicidal maniacs does not seem to strengthen community.

j)  1 in 5 parents report “frequently” worrying that their child will come to harm at school.  Another 1 in 5 worry “occasionally”.

k)  In the U.K., a 2004 poll found that 2/3 of parents experience anxiety WHENEVER their kids are outside the home, and 1/3 of kids NEVER GO OUT ALONE.  The result is that almost half of British kids stare at screens for more than 3 hours a day; child-welfare advocates likened it to being raised like “battery chickens”.  As far as I can tell, there has been only one school shooting in the U.K., in Scotland in 1996.  It took a total of three minutes from start to finish, and I doubt a lockdown would have made any difference.

l)  In 2007, a group of 270 child psychologists from across the Commonwealth and U.S. wrote an open letter in a British newspaper, declaring that parental anxiety over “stranger danger” may be behind “an explosion in children’s diagnosable mental health problems”.  They advocated a return to unstructured unsupervised play as part of a remedy.

m)  The usual “better safe than sorry” line also doesn’t account for the prolonged angst that comes with feeling that you live in a dangerous environment.  That, we do know, causes helplessness, listlessness, and depression, all of which negatively affect learning.  This should come into the equation somewhere if we’re serious about education.
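For the numerically inclined: the percentage in item (b) and its “1 in N” restatement are just two ways of writing the same number, and you can check that they agree with a couple of lines of Python.  This sketch is mine, not from any cited study; the function names are invented for illustration.

```python
def pct_to_odds(pct):
    """Convert a percentage chance (e.g. 0.00006) to '1 in N' odds."""
    return 100.0 / pct

def odds_to_pct(n):
    """Convert '1 in N' odds back to a percentage chance."""
    return 100.0 / n

# 1 in 1 529 412 works out to roughly 0.000065%, which is consistent
# with the rounded 0.00006% quoted in item (b).
print(odds_to_pct(1_529_412))
```

Either way you write it, the risk is vanishingly small, which is the point of the list above.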

 3.  Emergency Preparedness

People say that it’s best to be prepared.  Okay.  That’s a good enough sound bite that the Boy Scouts use it as a motto.  But prepared for what?  How?  Analysis of the few relevant historical incidents seems to indicate that lockdown procedures would not have helped during Columbine or similar situations.  As Dan Gardner points out, we are really, really bad as a species at predicting major events.  The Black Swan theory of history holds that nearly every significant game-changer was unpredicted, and probably unpredictable; that’s part of why they’re so potent.  Nothing that happened during outré events like 9/11 or Columbine matched anything anyone had imagined.  Not knowing where bizarre, unpredictable events will come from is just part of life.  We really ought to prepare for things that are statistically likely to happen; jumping to the worst-case scenario just feeds the beast.

You should see how much money gets allocated to “security experts” these days.  Their job is to dream up the worst, most horrible things that could possibly happen under egregious circumstances, and they make money by doing so.  That’s not balance.  These are the people who helped kill community by teaching kids to scream “stranger danger!” and run away from people in their own neighbourhoods, when all good data tell us that children are far more likely to be abused at home.  The amount of money the TSA makes by groping airline passengers is grotesque, and I’m not convinced it increases security.  Cui bono? as they say.

We also don’t know whether THIS PARTICULAR reaction to a threat (whether that threat plausibly exists or not) is the most appropriate one.  It is not the only conceivable reaction to a perceived threat.  We don’t know that a lockdown drill increases safety or minimises anxiety, as is claimed; in fact, the only studies I’ve read concerning lockdowns come down on the other side of the question, with psychologists asking whether the drills increase anxiety, particularly in students from high-risk, high-stress backgrounds such as war zones.  Of course we’d all rather be safe than sorry.  But does this particular thing make us safe?  And are we unsafe to begin with?  From what?  Even if we are unsafe, which I don’t yet see evidence for, I want to know, as a teacher and a member of society, that we’re not just sticking bananas in our ears to keep the tigers away.  Remember the duck-and-cover drills kids had to do during the Cold War?  We laugh at those now, and I am sure we will laugh at our own foolishness in the future.

Not only that, but the idea that this kind of lockdown drill is somehow proactive is, in my opinion, silly too.  Our model remains an after-the-fact reaction to something that is already out of control, and it involves violent intervention by police SWAT teams acting in a paramilitary role.  If there’s one thing we’ve learned about violence, it’s that it tends to ramp itself up.  If you want to be truly proactive, seek out the roots of violence and address those before they add to the statistics.  Not that the statistics are even worth worrying about: the message ought to be that school is the safest place your kids can be, statistically several times safer than at home or on the street.  We’re acting as if we know this lockdown stuff works, and that it’s the only option, when in fact there are no good studies yet to tell us whether either is true.  To insist that, even if some action does turn out to be required, it must be this action and no other, is illogical.

 4.  Protection of Children

Statistically speaking, the most likely way for a kid to die in North America is in a car accident.  Every single time a kid gets in a car, she has about a 1 in 6000 chance of dying.  That’s real.  In my opinion, if we wanted to increase safety, we would focus on the daily carnage that is North American roads.  Why are we so blasé about driving?  It is actually quite likely to kill our students; I have several times had to break this news to classrooms full of students whose friends had just died in car crashes.  This is hypocrisy of the highest order.  Do we care about kids’ safety, or do we just care enough to want to look like we’re doing something, without dramatically inconveniencing ourselves in the process?  We can legislate lockdown drills for things that statistically will not happen, and “let the schools deal with the problem”, while we go about our business of driving around in our dangerous cars, killing kids.  In fact, our stupendous stupidity at risk assessment has created a situation where more kids near schools are hit by cars driven by parents dropping off their own children, out of fear for their safety, than by anybody else.  The irony drips.

So let’s just be clear here again:  I’m not anti-safety, and certainly not anti-children (thanks, black-and-white thinkers!).  I would like to increase safety by figuring out what that means and how to do it, and I would like to do it without contributing to any corrosion of the society I live in.  If we really want to make a huge difference in student safety, we might think about public transportation, for example, which would get kids out of cars going to and from school.  That would save lives for sure.  And how about resources for anti-bullying campaigns that promote acceptance and even affection between all members of the student body?  That might have saved a life in this city recently, where a young man committed suicide after being bullied for being gay.  Sadly, more kids kill themselves than kill each other; suicide is the second most likely way a young person will die, after car crashes.  There’s a danger of complacency when we think of safety in terms of lockdowns instead of these deeper matters.

Now, the good news is that the OLD reason for kids dying, i.e. disease, is mostly way down, including cancer.  Cancer rates are down, cancer in kids is down, and mortality for kids with cancer is down.  This should be a good-news story.  According to Steven Pinker’s newest book, we are living in the least violent, most peaceful, safest, healthiest, longest-lived, most leisured society on Earth since the beginning of humanity.  There are fewer wars, they are shorter, and they take fewer lives than ever.  We should be the least anxiety-ridden people ever to walk the planet.  So check it out:  we ARE safe.  We’re the safest human beings have ever been.  Anybody who tells you different may be selling something.  Despite our relative Über-safety, though, anxiety levels, particularly amongst teens, are WAY up.  We have to accept some of the societal blame for this, and for the consequences of teen anxiety and depression, which feed what, as I said, is the second most likely cause of death among young people.  We’re safe, but we make up fears to fill the gap, and pass those fears along to our kids.  And we don’t even do that well!  Here is a short list of the kinds of things we could realistically be afraid of, and spend time and resources on, based on statistical danger (again, most of this data is U.S.-specific):

The ‘flu still kills 36 000 people annually (the normal kind, not the swine or bird ‘flus, which frightened far more people than they actually killed).  Globally, the seasonal ‘flu kills about half a million people every year; the swine ‘flu killed under 20 000, putting its global death toll at less than the normal yearly rate of U.S. seasonal ‘flu deaths.

68 people are wounded by pens and pencils every year.

According to the U.S. Consumer Product Safety Commission, there were 37 known vending machine fatalities between 1978 and 1995, for an average of 2.18 deaths per year.

3 000 people are injured by chairs at work or school.

2 944 people are injured by desks.

Photocopy machines injure 497.

1 241 people are injured by computers.

Clocks injured 74 people in 2001.

212 people were sent to hospital after encounters with telephones.

73 people are killed every year by lightning.

120 people are injured by toilets DAILY!!  (Read Dave Barry’s column or blog for statistics on exploding toilets.)

You have a 1 in 150 000 chance of choking to death every time you eat.  (Not insignificant!  But the biological benefits of having your larynx in that awkward position, which gives you the power of speech, outweigh the risk, even at those odds.  Nature, at least, seems to understand risk assessment.)

Even if you just sit quietly and do nothing, your chances of dying randomly at any moment are about 1 in 450 000, given the entire population of the U.S., which of course includes the elderly and ill.
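Two of the figures in that list can be checked with nothing but grade-school arithmetic, and doing so is a nice little exercise in the kind of numeracy this post keeps asking for.  The sketch below uses only numbers quoted above; the variable names are mine.

```python
# (1) The vending-machine figure: 37 known deaths over 1978-1995,
# a span of 17 years.
vending_avg = 37 / (1995 - 1978)
print(round(vending_avg, 2))  # matches the quoted 2.18 deaths per year

# (2) Ranking some of the quoted annual U.S. figures, largest first,
# to see which "dangers" actually dominate.
annual_figures = {
    "seasonal flu deaths": 36_000,
    "chair injuries": 3_000,
    "desk injuries": 2_944,
    "computer injuries": 1_241,
    "photocopier injuries": 497,
    "telephone injuries": 212,
    "lightning deaths": 73,
}
for name, n in sorted(annual_figures.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {n}")
```

The ‘flu, unsurprisingly, dwarfs everything else on the list, which is rather the point.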

Those are some things we might worry about if we were more rational about risk.  Instead, we worry about terrorism, child abduction, and school shootings, which are about as unlikely to kill our kids as sharks are.  Okay, I know that when kids come into the picture, realistic assessment of risk goes out the window.  That’s not anybody’s fault; it seems to be a common cognitive bias.  But come on!  Are we adults?  Can we not get over this?

 5.  Lockdowns and Fire Drills

People often compare lockdown drills to fire drills, but I’m not at all sure the comparison is useful; the two risks seem to be orders of magnitude apart.  Still, it’s a fair question: have many schools burnt down in the last century?  It has happened, with tragic results, and I believe many modern fire codes, including mandatory drills, came out of such incidents.  Fires in general are common enough that I doubt school shootings are in the same league.  Let’s check the stats:

Good news: deaths by fire have been on the decline for the past several decades, though fire is still the third most common way people die at home.  On average in the United States in 2010, someone died in a fire every 169 minutes, and someone was injured every 30 minutes.  Only 15% of these fires occurred in non-residential buildings.  So no, fires and school shootings are not on the same scale at all.

And I’m afraid that the lockdown drill, although modelled on the fire drill, does not work from the same basic assumptions.  In a fire drill, you know what the situation is, and there are time-tested methods of dealing with it efficiently through behaviourist training.  Leaving a burning building doesn’t require training; leaving it quickly and in an orderly manner, while overcoming the common instinct to grab meaningful possessions, means programming counter-intuitive behaviours into people.  That’s what drills are for.  When I was in the Army, we did drills to ingrain habits that countered powerful but dangerous intuitions.  When we smelled gas, we were trained to put our own gas masks on before warning our platoon mates of the danger: something like putting on your own oxygen mask in a plane emergency before helping a child.  It’s not intuitive, but it saves lives.  In a lockdown, by contrast, we do not even know what kind of situation the response is meant to address.  Where there are no parameters, how are we to know that sitting quietly and waiting for police is actually the best strategy?
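Those “one every N minutes” fire statistics are hard to compare with the annual counts used elsewhere in this post, so here is a quick conversion.  The 169- and 30-minute intervals are the quoted 2010 U.S. figures; the conversion itself is my own arithmetic.

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525 600

def per_year(every_n_minutes):
    """How many events per year if one happens every n minutes."""
    return MINUTES_PER_YEAR / every_n_minutes

print(round(per_year(169)))  # roughly 3 110 fire deaths a year
print(round(per_year(30)))   # 17 520 fire injuries a year
```

Thousands of deaths a year versus a risk measured in the millions-to-one: that is the gap between the fire drill and the lockdown drill.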

As to the argument that fire drills and lockdowns reduce panic: there is little evidence that people panic when faced with unexpected events.  Hollywood has people screaming and running from everything from terrorists to Godzilla, but in real life this does not seem to happen.  People sometimes do fairly stupid things in emergencies, but there is not much evidence for panic of the kind many people describe.  In the one case of a suspected gun at school that I have experienced, there was no panic, and I stupidly entered the building, thinking I could help somehow (it turned out to be a kid with a toy gun).  I was teaching in London during the Underground bombings in 2005, and for nine horrible hours I thought we had lost students.  I was dreading calling parents; it was really awful.  It turned out that things were so outwardly normal in the city that the kids (who had the day off and were shopping) had no idea anything was wrong, and therefore didn’t check in.

Instead of fire drills, a better comparison might be to the “duck and cover” drills of the Cold War, when the baby boomers who are now forming public policy had to hide under their desks for fear of The Big One, courtesy of the Communists, who may or may not have been a bigger threat than the hypothetical gunmen we’re talking about.  The perceived risk of nuclear attack was always higher than the actual risk, even during the Cold War.  Keep in mind that there have only ever been two nuclear attacks on anybody anywhere, neither of them by a Communist regime, and that even the Cuban Missile Crisis, we are now learning, was a long way from the near-annihilation described in the papers.  When I was in the Army, our field manual showed us the response to a nuclear blast, which was to lie down on the ground and point our helmets at the mushroom cloud.  The whole thing is absurd, and is (mostly) remembered by sane people as absurd.  My feeling is that these drills will be too, once we’ve either calmed down or moved on to the next paranoid delusion to grip our fragile minds.  Remember, we’re talking about things that essentially, statistically, DO NOT HAPPEN, or whose odds are millions to one against, which is more or less the same thing.

 6.  A Lack of Emergency Preparedness Sank the Titanic, Didn’t It?

A criminal insouciance led the Titanic’s owners not to anticipate disaster and not to provide enough lifeboats for all the passengers, with tragic consequences.  They claimed that having lifeboats on board, and running emergency drills, would cause undue panic.  Doesn’t that prove the need for legitimate drills?  And if not this, what are the legitimate reasons a lockdown might take place?

The term ‘legitimate lockdown’ unfortunately begs the question: the effectiveness of lockdowns is exactly what is in doubt here.  Again, what assailants are we talking about?  Who are they?  We are talking hypotheticals, and “what if” is rarely a useful question when assessing risk.  Seriously, who are we afraid of?  Once we identify them, we can figure out whether they’re worth worrying about.

I also don’t see that the analogy to the Titanic is warranted.  Arrogance, not adherence to facts, made her owners under-supply the ship with lifeboats.  I’m not advocating that we do nothing in the interests of security; I’m saying we need to look carefully at what is reasonable, and address risks that 1. actually happen, and 2. we can do something about.  The Titanic is in fact a good example of NOT taking reasonable precautions against risk.  It was not all that unlikely that lifeboats would be needed, and if they were needed at all, everybody would need one.  Even in modern times, “Two large ships sink every week on average,” says Wolfgang Rosenthal of the GKSS Research Centre in Geesthacht, Germany.  That’s about 100 every year.  I imagine it was even worse leading up to 1912.  The line about causing undue panic sounds like somebody’s excuse for bad planning.

In fact, one might argue that it was complacency created by newfangled security measures (the system of bulkheads) that sank the Titanic.  Security measures were in place, and through an extremely unlikely turn of events they proved ineffective, while the backup safety measures that might have saved lives were ignored out of a feeling that safety had already been addressed.  A complex interrelationship of unlikely events, poor decisions, and human failings sank the ship, and such things are extremely difficult to plan for.  That said, the reason we all know about the Titanic in the first place is the phenomenal unlikeliness of the circumstances that led to the disaster, along with the press reaction to it.  It was in the news because it was rare.

 7.  Don’t Spout Statistics:  These Are People’s Lives!

Precisely.  So let’s think honestly and rationally about what puts people’s lives in danger, and then deal with those dangers effectively.  Once again, some perspective: we are, by all accounts, the safest people ever to walk the planet.  But this seems to make us adjust our criteria for risk downward, filling the anxiety gap with ever more trivial worries.  We are the most risk-averse society I have ever heard or read of.  Although zero risk is impossible, a study by a professor at Ottawa U. finds that most Canadians think it is achievable.  Not only that, they expect the government and other institutions to provide it for them (!).  Considering how important risk is to normal cognitive and social development, I find this very troubling as an educator.

There are good examples of how a realistic awareness of risk might have prevented tragedy, beyond the statistic I quoted above about anxious parents causing most of the traffic injuries to children around schools.  So, for everyone who thinks we ought to have a plan to deal with emergencies: you’re right.  I am all for safety.  The question is finding out what actually makes us unsafe, and then dealing with those things in a way that actually improves safety, without compromising quality of life more than necessary.

Take the TSA, for example.  I don’t think anyone has successfully shown that the horrorshow that is U.S. customs and security adds much to actual safety.  It detracts far more from quality of life, dignity, privacy, and common decency than it adds to safety, and the issue it supposedly addresses, while terrible and frightening, is astronomically remote.  Terrorism is down, too, if anyone’s wondering (except within Israel).  For taking a plane to be even close to as dangerous as driving (which we all do, and NEVER seem to question, despite the seeming recent glut of people being mowed down by cars in the city where I live and work), terrorists would have to hijack and crash a plane a week for months, AND you would have to get on a plane daily.

In fact (getting to the point), after 9/11, many people cancelled flights out of fear of crashing planes and took their trips by car instead.  Someone crunched the numbers and worked out how many people died unnecessarily in car crashes as a direct result of that decision.  It turns out to be close to 1600, or about half the total life cost of 9/11 including the terrorists, and six times the number of people on the planes that crashed.  That’s 1600 people who tried to make the right decision, but are dead because they didn’t take the actual facts into account.
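The two ratios in that paragraph are easy to verify.  The ~1600 excess road deaths is the figure quoted above; the other two numbers (the total 9/11 death toll including the hijackers, and the count of everyone aboard the four planes) are widely cited round figures I am supplying myself, not taken from this post.

```python
excess_road_deaths = 1_600
total_911_deaths = 2_996   # assumption: commonly cited total, incl. hijackers
people_on_planes = 265     # assumption: everyone aboard the four flights

print(excess_road_deaths / total_911_deaths)  # ~0.53, i.e. "about half"
print(excess_road_deaths / people_on_planes)  # ~6.0, i.e. "six times"
```

Both of the post’s ratios check out, at least against these round figures.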

It just seems that instead of reacting to risk, we could respond to it, and try to make sure that what we do to address it doesn’t miss the boat or make things worse.  Safe is good, but we have to define our terms.  Driving feels safer than flying because we think we’re in control, but it is one of the single most dangerous activities we can partake in, unless we’re deep-sea divers or active-service paratroopers.  I’m all for CPR and first aid: the chances of needing those skills are actually quite high.  In the U.S., heart disease killed 700 142 people in 2001.  If such emergencies SEEM rare, that’s part of our perceptual blind spot.  Things that seem safe are often quite dangerous, and things that seem (emotionally?) dangerous are often not worth the angst.

8.  Risk and the Media

Much of our risk aversion comes from the “if it bleeds, it leads” mentality of media coverage.  A lot more of it comes from lawyers: I would bet that your chances of being sued over some improbable event greatly outweigh the chances of the event happening in the first place.  Weirdly enough, though, I have also read articles suggesting that the number of frivolous lawsuits actually brought to court in North America is much smaller than most people assume; there seems to be some evidence that insurance companies actively encourage the popular impression of rampant frivolous litigation in order to justify higher rates.  On top of that, a major beneficiary of the Cult of Fear is the horde of manufacturers of Safety Products, who prey on irrational paranoia.  Free Range Kids details some of the more egregious examples of a manufactured crisis with expensive manufactured cures: baby kneepads, for instance, for crawling tots.  As if thousands of generations of infants had evolved to crawl “unsafely”, just waiting for the right product to correct nature’s deficiencies.  Sigh.  So, in answer to my earlier question of cui bono: “too many dubious sorts of people”.

One of the major factors affecting our minds’ perception of risk is what Dan Gardner, who wrote a book on the subject and spoke eloquently at a lecture I attended a couple of years ago, calls a “feedback loop” generated by media.  The original noise is picked up and amplified in a kind of echo chamber, which escalates the brain’s response to threat in ways it never could without the media’s involvement.  Reporters are people too, and their risk-assessment tools are just as terrible as the rest of ours.  Their choices, though, have wide-reaching social effects.

In Canada, where I live, there was a news story recently that, though local in nature, became national: a doctor at a clinic in Ottawa had improperly sterilised her colonoscopy equipment, and around 5 000 ex-patients were contacted by letter informing them of a remote risk of infection with hepatitis or HIV.  The odds of HIV infection were more than a billion to one against, which so far exceeds the “de minimis” rule that it shocked me it was even mentioned.  The media went nuts, probably because of the shock value attached to AIDS-related material.  The authorities had hesitated to contact the media at all about the matter, perhaps knowing what kind of a zoo it would become; when the media got hold of that fact, of course, it was played to the hilt.  It was made to look like a conspiracy, and at the very least the mostly risk-uneducated public felt patronised.

The next newsworthy item to come out of this story was that clinics around the country were worried about cancellations of important colonoscopic procedures by people who had heard the news and lost confidence in the procedure, thereby creating a real risk from undiagnosed conditions that could eventually prove fatal.  The situation was somewhat reminiscent of the anti-vaccine movement, which puts thousands of people at risk on the basis of ignorance, fear, and bad science.

I’ll add this: information is not automatically a good thing, though I definitely want as much of it as possible in order to make decisions.  How we use that information is just as important.  I partly agree with those who say that withholding information can seem patronising, but since the medium is the message, how information is disseminated and presented is enormously influential.  The feedback loop Gardner describes is an unfortunate but real by-product of the way media produce stories.  And until school curricula focus much more (as we have hopefully begun to do) on things like cognitive blind spots, logical fallacies, analysis of information, and the correction of lazy thinking, and until politicians’ use of language is held to a higher standard, we’re going to have to live with a question mark over how people will react to risk.



Ten Things Our Grandparents Got Right #5: They Didn’t Treat Teenagers Like Infants

Picking up from my last post’s ellipsis, I feel I need to address the infantilisation and outright ageism displayed by adults toward teenagers. This rather repugnant reincarnation of genetic determinism (for which there is no good evidence, and which Stephen Jay Gould spent much of his career combatting) is particularly dunderheaded when you take into account the plasticity of the brain, which we are just now beginning to understand. “Don’t talk for more than ten minutes on any subject”, we were told in teachers’ college, “because the adolescent brain has an attention span that tiny, and there is nothing anyone can do about it”. The contradictory complaint that attention spans are getting smaller, often voiced by the same people, never seemed to present a serious challenge to this accepted wisdom. Surely if attention spans can shrink, they can also grow.

Under the deterministic model, adolescents’ potential is viciously undercut, and a condescending attitude of pandering to existing biases, tastes, knowledge, interests, and capabilities is adopted, while real change or growth is almost completely written off. “Teenagers are lazy and surly because of physiology, or perhaps hormones”, people say, though there is no evidence of this being true either in the historical record or in other cultures existing today. And the list goes on: teenagers are incapable of making rational decisions because of brain chemistry, not because they are systematically denied the opportunity to practise making real decisions on a daily basis; teenagers are not punctual because of circadian rhythms unique to them, and not because of poor sleep and nutrition habits that are actively encouraged by our society. Here’s an article that actually suggests that teens’ supposedly biologically based inability to process risk might be the result of sleep deprivation! Knowing the effects of sleep deprivation on the human psyche (I am an ex-soldier), it surprises me that nobody made that link earlier. I have heard more than once the lament that “if we really wanted teenagers to pay attention, we wouldn’t hold classes before ten o’clock”. Recently this too has been challenged, the only question remaining being how such a parochial view could have survived this long. Anyone who travels outside North America, or who has read history (think Alexander the Great, Joan of Arc, Augustus Caesar, Mary Shelley, Louis Braille, et al.), will shake his head at such statements of the inevitability of “the teenage brain” and its limitations. The list of historically significant teenagers is as long as my arm, at least until about the middle of the twentieth century, when teenagers suddenly became incapable of surviving the most basic of situations, such as wearing hats that light up.
This photo is taken from the wonderful Free Range Kids blog, which also has so many first-hand anecdotes of infantilisation in America that it sometimes makes me afraid to read it.

Fourteen-year-olds would burst into flames.

For starters, the concept of the teenager as a separate class of individual, or a distinct stage in life, is a very recent coinage: nobody used the term before 1941. Now the invention of the ‘tween’ is pushing it even further; that word is attested only since 1988. Theories vary as to exactly what purpose the invention of such a repugnant, incapable figure was supposed to serve, but John Taylor Gatto, Neil Postman, and Dr Robert Epstein have some suggestions.

Postman, in The Disappearance of Childhood, argues eloquently that childhood itself is a social construction. He points out that the idea of childhood, a time of life in which one is supposed to be governed by a sense of shame and protected from such things as knowledge of adult sexuality, was a product of the end of the Middle Ages and the rise of the printed word. A boy of seven years old was, for all practical purposes besides “making love and war” (p. 15), capable of every meaningful act in Mediaeval society. He could speak and he could labour, and in a predominantly oral culture these were all that were needed for maturity and inclusion in the social structure. There was no need, or possibility, in a Mediaeval culture for the keeping of secrets; privacy was hardly a concept at all, and close quarters and the irrelevance of reading skill made knowledge a general commodity.

But when the written word became the new means to record, keep, and guard the culture’s knowledge base, institutions like educational systems were needed to induct the child into the world of adults. This effectively stretched the time of childhood from seven years to the end of schooling. As Postman points out, before the 16th century, there were “no books on child-rearing, and exceedingly few about women in their role as mothers […] There was no such thing as children’s literature […] , no books on pediatrics. […] Paintings consistently portrayed children as miniature adults […] The language of adults and children was also the same. There are […] no references anywhere to children’s jargon prior to the 17th century, after which they are numerous.” (18). Children did not go to school, because there was nothing to teach them. But now the definition of childhood changed, from one based on linguistic incompetence to one based on reading incompetence. Instead of just becoming an adult by ageing, children had to earn adulthood through education – and the European states invented schooling to accomplish this new process. Childhood, as Postman notes, became “a necessity” (36).

Later, with industrialisation, threats to this newfound idea of childhood emerged. The new urban demand for factory and mine workers supported the "penal" aspects of schooling, which aimed to break the will of the child and accustom him to the routine labour of factory work. In response, child labour laws were introduced, enshrining the sacrosanct nature of childhood. Though Postman sees the growth of elementary education after 1840 as evidence of the triumph of the notion of childhood over industrial capitalist concerns, J.T. Gatto sees it somewhat differently.

Gatto, an award-winning teacher who now speaks against institutionalised education, argues that the modern American education system never outgrew its penal origins, and in fact goes further, saying that the system is set up more or less deliberately to produce the class of uncritical, bored, dissatisfied consumers that the corporate model of capitalism needs in order to flourish. Children were being actively groomed by industrial influencers of education systems to become not citizens or human beings, but "human resources", to be moulded to fit something called a "workplace", "though for most of American history American children were reared to expect to create their own workplaces."

The subdivision of childhood into adolescence and now pre-adolescence (the "tween" phenomenon) is something Robert Epstein has written on. Epstein, in his book The Case Against Adolescence, picks up where Postman and Gatto leave off, during the industrialisation of America. He sees the creation of the adolescent as a kind of benevolent but destructive side effect of the social reforms that reacted against the admitted evils of the Industrial Revolution with regard to children's rights. The creation of institutions such as child labour laws, compulsory education, the juvenile justice system, and age-specific restrictions on "adult" activities such as driving, drinking alcohol, and smoking had, according to Epstein, the effect of isolating the child's world from the adults' almost totally. Adolescents are confined to a mostly (by definition) developmentally incomplete peer group, and their dependency is extended by more than a decade before they are required to enter the adult world after school, despite the fact that their sexual maturity and mental readiness for such a transition are evident from a much earlier age. In a study, Epstein found that teenagers have ten times as many restrictions placed upon their behaviour as ordinary adults, and twice as many as felons and soldiers! The rise in such restrictions exactly parallels industrialisation, and jumps significantly after World War II.

Epstein picks up an argument from Postman, suggesting that the studies that purport to "show" the biological cause of the supposedly innate surliness and incapacity of teenagers are flawed, in that they show only correlation, not causation. In fact, given the plastic nature of the brain, I myself would expect to find that such correlations run backwards: the social restrictions on teen behaviour are to blame for the state of their brains. The argument that brain scans "prove" the innate uselessness of teenagers in such areas as risk assessment or impulse control sounds to me about as useful as "scanning" the musculature of a teenager who has never lifted weights, and declaring him "unfit" and biologically incapable of ever being an athlete.

In fact, the list of famous "characteristics" of teens proves to be mostly made up. Margaret Mead points out in her studies of adolescents in Samoa that the traditional "storm and stress" of North American teenage development is nowhere to be found in that culture, or in any other preliterate culture; the majority of such societies have no term for adolescence at all. Even the list of undesirable teen behaviour in our own society, summarised by Philip Graham as "identity confusion, extreme moodiness and high rates of suicide, violent discord with parents, aggressive behaviour, intellectual and emotional immaturity, risk taking, and sexual promiscuity arising from the raised secretion of sex hormones", has been shown to be common to fewer than 20% of the age group in question. Hardly a useful list of descriptors, then.

And as for other biased assumptions about teenage behaviour, such as the idea that teens are addicted to, and misusing, technology? It turns out we ought to have a good look in the mirror here too: one study found that adults abuse technology at a higher rate than their kids.

Why are these ideas so pervasive and so tenacious? The original study of adolescence was done in 1904, by G. Stanley Hall. He observed the turmoil on American streets caused by industrialisation and massive waves of immigration unsupported by proper social structures. Concluding from this that all adolescents necessarily exhibited the nasty characteristics mentioned above, he drew on the now-long-debunked theory of biological "recapitulation", in which the development of the individual mirrors the development of the species. In that model, adolescence "recapitulated" a savage, pre-civilised phase of the development of Homo sapiens, and such a period would naturally be expected to bring turmoil with it. He borrowed the German Romantic idea of Sturm und Drang and applied it universally to all teens, claiming biology as the cause. Though the field of biology has long since abandoned such theories, the general public has not kept pace.

Of course, I have also found through my years of teaching that what is expected of a person is generally what one will get. As Eliza Doolittle says in Shaw’s Pygmalion,

“You see, really and truly, apart from the things anyone can pick up (the dressing and the proper way of speaking, and so on), the difference between a lady and a flower girl is not how she behaves, but how she’s treated. I shall always be a flower girl to Professor Higgins, because he always treats me as a flower girl, and always will; but I know I can be a lady to you, because you always treat me as a lady, and always will.”

Sadly, we live in a culture where the treatment of young adults is infantilising (do the test here!), demeaning, controlling, and stultifying. Perhaps it's not entirely adults' fault, though; as Postman points out, adults are the product of the same process of education that we subject our children to. Whereas once literacy was the dividing line between childhood and adulthood (an idea enshrined in the founding of a free state during the American Revolution), industrialisation also brought with it technologies that made actual familiarity with the written word obsolete. The telegraph, radio, television, and the Internet have taken over where literacy left off, producing generations of adults who have had unfettered access to information, but no sequential, age-appropriate introduction to discerning its meaning. The very definition of childhood as an idea, rather than just a biological stage of individual development as it is now conceived, depended on a slow indoctrination into greater knowledge through an increasingly complex mastery of literacy. Now, who actually reads anything written by people like Barack Obama, Stephen Harper, George Bush, or Ronald Reagan? Would they be rewarded if they did? Though our Canadian society has succeeded in producing generations of functionally literate people, we are increasingly reverting to a Mediaeval-style oral culture, in which even people who can read generally do not, and most of those who do cannot do so very well. The line between childhood and adulthood is blurred, and brain scans show adolescent development continuing well into the mid-twenties of North American subjects; "coincidentally", this is about the same time as many post-secondary students are leaving school. I would dearly love to see brain scans of people other than the Westernised college students who are the typical subjects of such studies. My intuition is that they would be vastly different at comparable ages.

The assumption that the tastes and interests of a teenager are equally fixed, never to grow, was made clear to me in a textbook on English grammar much in use several years ago, in which every sentence, in order to be palatable to what grammar-textbook publishers assumed teenagers' interests were, had to have something to do with skateboarding. To me, this attitude is ethically no better, and has just about as much science behind it, as the old idea of the genetic inferiority of slaves. The problem with bandwagoning is that it's difficult to get off the wagon, or out of its way. I once had a principal (a fellow with a science background, who ought to have known better) who hawked these unpleasant wares at every staff meeting and P.D. day, much to my annoyance. Years later, after his retirement, he admitted to me that he had known full well all along that it was bunk, but claimed he found it a useful management tool. He told me that "we have to work with something" – a foolish imperative that always makes me think of the show Yes Minister, where it was put in syllogistic form:

1. Something must be done.
2. This is something.
3. Therefore, this must be done. 

Teaching is often a surreal experience.

We were presented in Teachers' College with the interesting model of Howard Gardner's Multiple Intelligences, and told that we must adjust our teaching techniques to all of them, regardless of their relevance or applicability, because "students can only learn in certain ways". Every lesson had to touch on as many of the Intelligences as possible, and administrators' evaluations of teachers would be based on a handy checklist and cursory observation. Imagine trying to incorporate kinaesthetic learning into a lesson on punctuation or grammar! This led to all kinds of silliness, like hopping up and down to simulate semicolons, from which the better teachers miraculously managed to salvage some memorable learning experiences. Since then, Gardner's theory has come under closer scrutiny, and has been largely debunked, at least in the absolutist terms under which it was adopted in schools. Here's a quick video outlining the basic flaws in the theory:

Far from being deterministic learning “styles”, they appear to be mere preferences, and there is no good evidence that pounding a round lesson into one of its square holes does anything to help learning at all. Instead, a good teacher will understand which kinds of tools are applicable and effective, given the nature of the ideas or skills being taught. In other words, according to Professor Daniel Willingham, “While there’s little evidence that matching one’s teaching style to one’s students’ learning styles helps them learn, there’s much stronger evidence that matching one’s teaching style to one’s content is wise.”

Why this obviously silly meme has stuck around for so long, and had such an impact on systems of education, is a bit of a mystery, but I have the following observations, which might shed some light. The first half of the equation comes from good intentions, I think: most teachers and educators feel a calling and a social responsibility to their profession. We're often caring to a fault, and this is an example of the 'fault': our predisposition to believe that our job involves finding the "hidden learner" in every student blinds us to the lack of evidence for this particular incarnation of that impulse. A kind of Confirmation Bias, if you will. The idea of Multiple Intelligences (which is a description of ability, not of style), bent slightly to suit our notion of ourselves as teachers who care deeply about individual students' learning, is powerfully appealing. We want to believe in it because it reinforces beliefs we brought to our profession, but however admirable those beliefs might be from an ethical standpoint, if they do not fit the actual facts, they ought to be altered or abandoned. Recently, a study by Daniel B. Klein of George Mason University uncovered what he thought was a type of intellectual bias in liberal-minded respondents to a survey. When it was pointed out to him that the survey he had provided might itself be biased, he rewrote it, and found bias in those of conservative bent. He then wrote with some humility and intellectual frankness about his own Confirmation Bias – two attributes that my profession could certainly benefit from.

The second part of the reason this meme is so prevalent in schools, in my opinion, is not because it is correct, nor because it is touted by teacher-ed texts (Daniel Willingham has looked at the syllabi of teacher education courses and found no evidence of its being 'officially' sanctioned), but because of the management models of evaluating teaching ability. When a principal is charged with evaluating the prowess of the teachers in his or her school, and has to report those findings upward to his or her own "managers", the same silliness happens as when we are evaluating our students: we want to fall back on measurables. It's a lot easier to carry a clipboard into a teacher evaluation and tick off "yes" or "no" to a question like "Does the teacher address the students' learning styles individually?" than to make complex judgements about a fluid and complicated problem like evaluating "good teaching". So it's partly a question of efficiency, just as being forced to report student learning (a vastly complex and mostly abstract concept) in terms of percentile grades results in us asking stupid questions on tests that focus only on measurable, concrete facts, rather than on the rather more important aspects of higher-level thinking. I once asked a test question that required students to place in order several events from a novel we were studying in class: something that assessed both their memory of the details of their reading and their understanding of the cause-and-effect relationships between the events. I was forced to abandon this perfectly valid question because it is essentially ungradeable: as soon as one event is out of order, a domino effect makes it impossible to give a numerical evaluation of how close to 'right' the student was.
If anecdotal comments, or even a conversation, were the method of relaying to a student the quality of their understanding, I wouldn't have lost a potentially valuable assessment tool. A managerial model of reporting quantifiables upward along a chain of command, ultimately to a political bureaucracy, just does not work when dealing with something as complex as human learning.

Partly, though, it's more insidious than just the self-perpetuating efficiency of a system. Sadly, the two halves of the equation often come together in unsavoury ways: when the principal asks "Is the teacher hitting enough of the learning styles in his lessons?", the implied subtext is often, "Is the teacher caring enough toward his students?" This puts a lot of pressure on the meme to become accepted, or at least unquestioned, in teacher circles, at least when administration is present. It's an unspoken type of ad hominem: between the lines is the question, "Do you really care about children?" I think this is actually how a lot of silly educational buzzwords are preserved: they're tied to teacher performance reviews. A lot of it is just lip service, as is suggested by the number of teachers who will question the meme in private conversations, but it still has an effect.

I am calling here for a greater intellectual and moral courage on the part of teachers to stand up against policy that is not evidence-based.  Here in Canada, under a government that is apparently actively anti-evidence, this is a tall order.  But we’ve got to start.


Things Our Grandparents Got Right #4: They Didn’t Try to Educate Us for the “Future”

Part Two

 In the last post, I outlined the basic futility of trying to educate our children (“train” them, I suppose would be a better word) for a specific set of skills that would be useful under specific economic circumstances in the future.  I entered the job market, in my mid-twenties, at the very tail end of the 20th century.  My elementary school education, during the 1970s and 80s, could not possibly have prepared me for a job market within the context of a recession that nobody had predicted, and in which the major emphasis was on jobs in fields that had not yet been invented when I was going to school.  On top of that, several years later, the I.T. bubble burst, and all the jobs that were supposedly available to those with a very specific skill set suddenly disappeared.  Nobody really predicted that one, either.  In fact, there is good reason to believe that nobody will ever predict economic futures.

Employers, for their part, have been making it plain for years that it’s less important what specific software skills prospective employees come to them with than what skills in areas like problem solving, creativity, social adaptation, and communication they bring.  Training can always be done (and in my opinion, should be done, at the expense of employers, not the public) in situ for whatever tasks employees will be asked to perform.  The ability to learn quickly and efficiently from that training, by being punctual, polite, open-minded, critical, creative, and proactive is what makes prospective employers drool.  I’m not somebody who believes that the purpose of education is to provide employers with workers, but if you are, then it should matter to you that by all accounts, employers aren’t happy with the quality of worker they’re being given.    It seems that most of them would trade ten technically skilled applicants for a single well-spoken, well-socialised, clear-thinking applicant who can adapt and learn quickly.

 The problem with the future, as I’ve said, is that nobody knows what it will look like.  Its inevitability, though, makes us fill the yawning blankness in front of us with all kinds of hopes and fears – all of which come from our own past experiences, projected upon the future in a kind of collective psychological paroxysm of denial.  The future becomes a canvas upon which all of our present anxieties work themselves out in public.  There are some problems that attend the belief that we actually can educate kids for the future, though, and some of them aren’t as obvious as they should be.

First, there's the danger of disregarding good ideas because of their novelty, in favour of something that is comfortable but has no good evidence to support its use. The unconscionable refusal of schools to heed the increasingly large body of evidence suggesting that grading not only does not assist the process of learning, but is actively detrimental to it, has gone on far too long. This is an enormous subject that really deserves a whole post to itself, which I will be glad to provide sometime later. It is certainly possible to view the past with rose-coloured glasses, and to ignore real harms done by practices which have the force of habit, but not of reason. Often, the desirability of the practice in question is doubted even by its proponents, but it is urged anyway on the assumption that if it was bad enough for one generation, it ought to be bad enough for the next. Sometimes this is accompanied by what Alfie Kohn has called the "BGUTI" clause, or "Better Get Used To It", wherein the future is assumed to be filled with horrible, arbitrary uses of power, to which we must train our children to submit. This does not seem to me a noble ambition for our children.

Second, there is the danger of using this “Golden Age” of education disingenuously, as a way to discourage real progress.  Educational reformers, especially those who are advocating changes based on conserving parts of systems of education that have been proven to work well, are accused of “living in the past” and stifling innovation through their delusion.  Again, Alfie Kohn provides us with examples of the kind of “educational reform” sweeping through his nation, the United States, detailing how they are often merely disguised conservative movements, based in ideology rather than facts, and too often designed to line the pockets of those who put them forward.

Third, there is the danger of defining the ‘future’ in terms that are too narrow by far.  Too many educators see the “big picture” of the future of high school students to be the end of their four-year stint with us, and the awarding of the diploma.  After all, “studies have shown” that kids without a high school diploma are more likely to be economically and socially disadvantaged later on, right?  This is often seen to be the legitimate outcome of being deprived of the benefits of the type of education we offer, and not the result of rampant credentialism.   I always try to educate with the long-term goal of producing a thoughtful and mature human being who will continue to think and learn as long as their brains hold out.  And there seems to be good evidence that Alzheimer’s Disease can be mitigated by strong habits of thought, so I’m happy to consider the long term to be roughly “the rest of their natural lives”.  And maybe longer, if they teach their kids healthy habits of mind.

 Fourth, there’s the danger of throwing the baby out with the bathwater.  All of the posts about our grandparents’ “outdated” methods and ideas address this issue.  Certainly, they did a lot of backward, even harmful things in the name of education (many of which I abhor, and will address in later posts), but that does not mean that they had not found certain practices that actually worked.  Their nearly obsessive interest in penmanship, for example, though perhaps emphasised to the point of detriment to other aspects of learning, did have benefits that we miss, now that it’s gone from the curriculum.  Everybody has been through some sort of schooling, and everyone has had bad experiences, bad memories, and bad teaching at one point or another, all of which people insist on telling me about in detail the instant they learn that I am a teacher.  Learning has always been hard work, and ever since Shakespeare wrote about the “whining school-boy, with his satchel /  And shining morning face, creeping like snail /  Unwillingly to school” (As You Like It, II.vii.145-47), we’ve had to bear the brunt of everyone’s residual educational and social angst from high school.  The past, no matter how awkward, stressful, or frustrating, was not all bad, and it is worth preserving the better parts of what our ancestors came up with over many centuries of research and development.  This definition of conservatism in education I am all for.  But how, one asks, can we determine which parts to preserve and which parts to discard?  I would answer that anything that has been demonstrated to be harmful or detrimental in any way to the process of learning ought to be done away with as quickly as possible.  Anything that can be shown to reduce or kill hope outright, or poison students’ innate curiosity and desire to learn, ought to go.  
Anything that develops humane perspective, curiosity, and habits of mind that allow learning to be indulged in as a pleasurable (though not effortless) activity for the rest of one’s life ought to be encouraged at all costs.  Encourage flexibility, and discourage rigidity of thought and ideology; otherwise, that great unknown future will wallop our kids when it finally shows up in a form that nobody anticipated.

 Fifth, there’s the concomitant danger of bandwagoning; of jumping onto every new idea or educational movement uncritically and for the sake of novelty itself.  Talk to any teacher who’s been teaching more than a few years, and they’ll tell you some stories about this one.  Our profession is awash in buzz-words, and though the words themselves sometimes show up in different forms, the range of ideas they represent is surprisingly limited.  Often, they’ll come back in roughly ten-year cycles, re-branded and as fresh as a bad penny (to mix a metaphor).  For a period of time in the late 1990s and up until a few years ago, one of the buzz-words you’d hear everywhere, presented as a strange hybrid of Policy, Gospel, and “Best Practice” (the latest euphemism for “toe the line”) by administrators everywhere, was the astonishingly silly phrase, “Brain-Based Learning” (is there an alternative organ that could be substituted?  It’s only a matter of time before “spleen-based learning” is all the rage).  Here’s a quick video detailing the level of skepticism we need to approach this concept with:

All of which brings me to the last point:

Finally, sixth, there’s the danger of treating the future (or your limited understanding of it) as inevitable, based on physiology.  This is an important enough topic that it deserves its own entry.  To be continued . . .


Things Our Grandparents Got Right #4: They Didn't Try to Educate Us for the "Future"

  Part One

This is kind of counter-intuitive. The very process of educating children seems to rest on the idea of preparing them to meet their future. The whole concept presupposes that the end of the process will have created an educated member of society, many years down the road. That part is fine: of course we want to have a purpose in education, and it seems reasonable that it has something to do with kids becoming adults over time, which rather implies the involvement of the future. The problem comes when we start to think we know what that future will look like.

Ever wonder why so many science-fiction movies set in the future are either Utopian (rare) or Dystopian (far more common)? And have you noticed that all the fashions and hairstyles in these movies are just reflections (usually shinier, or slightly more ridiculous) of styles in vogue at the time the movie was produced? And when the movie is set in a year that we've already lived through, how utterly unlike the reality of that time it is? Further, have you noticed that these films are usually good indicators of the varieties of social angst that were current when they were made? How many "Alien Invasion" movies from the 1950s mirror Cold-War fears of foreign infiltration and invasion?

Who knew that those dresses would still be in style 400 years later?

It shouldn’t really be a shock to us that we can’t read the future.  What’s a lot more shocking to me is how often we act as if we can, and how infrequently we learn from being proved wrong.  Dan Gardner, in his book Future Babble, exposes the degree to which relying on experts, against all intuition to the contrary, actually renders us less able to predict and adapt to the future.

Gardner makes reference to studies that have been done over the years to try to verify the accuracy of expert predictions about the future.  This is, of course, a separate question from the amount of knowledge about a certain subject (gained from studying the past) any given expert possesses. The question is, “Does having a lot of knowledge about a particular subject increase your chance of being right when making predictions about the future of the area of study?”  Some of these studies have been conducted by the media (admittedly not very scientifically).  Here’s Gardner: 

“In 1984, The Economist asked sixteen people to make ten-year forecasts of economic growth rates, inflation rates, exchange rates, oil prices, and other staples of economic prognostication.  Four of the test subjects were former finance ministers, four were chairmen of multinational companies, four were economics students at Oxford University, and four were, to use the English vernacular, London dustmen.  A decade later, The Economist reviewed the forecasts and discovered they were, on average, awful.  But some were more awful than others:  The dustmen tied the corporate chairmen for first place, while the finance ministers came last.” (p.21)

Other, more recent examples have also come from the press. If anyone remembers the famous accuracy of Paul the Octopus, a cephalopod who correctly predicted the outcome of all seven of the German team's matches at the 2010 FIFA World Cup, as well as the final, they might be amused to hear of other animal 'predictions' that put our purported abilities to shame. Chippy the chimpanzee embarrassed famous American pundits by choosing flashcards indicating political outcomes at a higher rate of accuracy than the experts, two months running. In the field of meteorology, Wiarton Willie, the groundhog who predicts the onset of springtime every February second in Ontario, claims on his personal website to be accurate 90% of the time (though a larger study puts groundhog predictions in general, over the last 40 years, at about 39% accuracy). National weather bureaus claim about 60% accuracy on long-range forecasts, though many think even this is too high. Certain ancient traditions of haruspicy are still practiced: a pig farmer in North Dakota who examined the spleens of his pigs to predict the weather boasted of an 85% success rate.

None of these, of course, point to any magical powers possessed by animals. (A better candidate for a claim of that sort is perhaps the Tsunami of December 2004, in which more than 150,000 people were killed, but relatively few animals, who anecdotally seemed to know that something was about to happen and fled.) At best, they indicate that when a series of choices is made more or less randomly, the accuracy rate is higher than when experts make them. This is embarrassing enough, but to find out that one's chances of being right actually decrease as one's confidence and expertise increase is downright humbling.
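The arithmetic behind that last observation is worth making concrete. Here is a toy simulation (my own illustration, not from Gardner's book; the 30% base rate and the names are invented for the sketch): each "event" either upends the status quo or doesn't, a coin-flipper guesses at random, and an ideologue who is certain upheaval is always coming predicts it every single time.

```python
import random

def prediction_accuracy(n_events=10_000, p_change=0.3, seed=42):
    """Toy model: each event changes the status quo with probability
    p_change. The coin-flipper predicts 'change' half the time at
    random; the 'hedgehog' ideologue predicts 'change' every time."""
    rng = random.Random(seed)
    outcomes = [rng.random() < p_change for _ in range(n_events)]
    # Random guess is right about half the time, whatever the base rate.
    coin_acc = sum((rng.random() < 0.5) == o for o in outcomes) / n_events
    # The hedgehog is right only when change actually happens (~30%).
    hedgehog_acc = sum(outcomes) / n_events
    return coin_acc, hedgehog_acc

coin, hedgehog = prediction_accuracy()
print(f"coin flip: {coin:.2f}, hedgehog: {hedgehog:.2f}")
```

The point of the sketch is simply that a predictor welded to one big idea inherits the base rate of that idea, while even a dart-throwing chimpanzee sits at fifty-fifty; no animal magic required.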

Philip Tetlock, a psychologist at the University of California, conducted the largest experiment on the subject over a number of years, after the spectacular failure of anybody to predict the fall of the Berlin Wall in 1989 and the subsequent collapse of the Soviet empire. He studied 284 experts in politics, economics, and journalism, and compiled 27,450 predictions about the future. Conclusion: the experts would have been beaten by a "dart-throwing chimpanzee". Some, however, were a lot worse than others: these experts would have vastly improved their accuracy by guessing randomly. Tetlock discovered that these experts' backgrounds and education didn't explain their inaccuracy; their mode of thought did. They were particularly uncomfortable with complexity and uncertainty. They worked from an ideology and were extremely confident that it was correct. Tetlock called these experts "hedgehogs", after the fragment of the poem by Archilochus: "The fox knows many things, but the hedgehog knows one big thing". The foxes, on the other hand (the experts who had no preconceived ideology, but worked from data, synthesising multiple sources and self-critically correcting for error as they went), did much better, and did manage to beat a coin flip. Much has been made recently of studies that appear to show differences between conservatives' and liberals' ways of thinking that mirror these broad categories: conservatives tend toward hedgehoginess, and liberals to vulpine leanings.

Interestingly, hedgehogs who are more ideologically extreme are even more likely to be wrong, and their accuracy actually declines when they know a lot about their subject, as well as when they predict over a long time span. As Gardner puts it, the lesson is that "if you hear a hedgehog make a long-term prediction, it is almost certainly wrong." (27) And, of course, the problem is that we get most of our predictions from hedgehogs. They are on TV and in the news all the time: they are confident, educated, knowledgeable experts who are willing to say bold, loud, easy-to-understand things about the future. No media source wants to put foxes on TV; foxes tend to say things like "It depends," or to discuss things at length, giving a nuanced opinion. And in the end, though they do much better than hedgehogs, foxes are no prophets: the world is fundamentally complex and unpredictable. You can beat even a fox at predicting the future by predicting that "nothing will change". The things that are predicted almost always turn out to be wrong (remember Y2K? The paperless office? The list is huge), and the things that do end up happening, such as the collapse of Eastern-Bloc Communism, the Arab Spring, the housing crisis of 2008, and 9/11, leave pundits scrambling to rationalise all the reasons they hadn't seen anything coming.

So the hubris of predicting things like what the “economy of the future” will be is really just an arrogance born of fear:  we want to educate our children to face what is now, has always been, and will always be, an uncertain future.  All kinds of educational imperatives have been attempted in the name of just that.  The fact remains that we simply don’t know, and are not able to know, what will drive the economic engine of our children’s future.  If we belong to that section of society that believes that the purpose of education is largely economic, then we are pretty much out of luck.  It simply can’t play that role.

In Ontario, where there is little formal attention in the curriculum given to job-specific skill sets, this is less of a problem than elsewhere.  But we can still get sucked into the “education for the future” meme in other ways.  We often talk about education like it is “for” something, in a kind of pragmatic way.  I can’t disagree; I think so too.  I just think that I don’t know what it’s for.  I’ve had ex-students come visit me ten or more years after I taught them.  They always share their memories of the classes they had with me, and it’s a rare moment when their memories match mine.  They’ll sometimes tell me that something I said in class changed their lives – my response is often unspoken, but goes something like this:   I said that?  Huh.  I don’t remember that.  Sounds profound, though.  I’m glad it helped.  Many times the things they remember weren’t part of any official curriculum.  Just some off-the-cuff remark that stuck with them and meant something eventually.  Sometimes it isn’t even anything you say:  sometimes just the long-term effect of your character on a kid will turn things around for him.  I’m always surprised by what they say meant something to them.  It’s rarely something content-related.  That’s where a little humility goes a long way:  I don’t know what is meaningful to them, or what will become so in the future.  I don’t know what part of my experience and worldview will resonate with them.  At the time, it sometimes seems like none of it is making any impact, but they tell me different, years later.  So I teach what I think is interesting, and hope for the best.

Sometimes we answer the question of “what is education for” in a too-limited manner.  Aristotle thinks of the question like this:  Why do we do anything?  Can we follow the trail of motivation to a source?  Something we do for its own sake, and not as a step to something else?  We’re goal-oriented in the West; we often seem lost without goals.  We go to school, we think, because we want to get into university or college.  Why?  So that we can earn a certificate or degree.  Why?  So that we can use it to get a job.  Why?  To earn money.  Why?  To buy things with.  Why?  (And here’s where the trail usually ends in a capitalist society.)  Because we think they will make us happy.  But why do we want to be happy?  For no reason.  Happiness is its own end.  We think, though, in the goal-oriented rat race of the West, that happiness is an ‘end’ in a kind of final sense:  we think that retirement is the time in your life when all this will eventually pay off.  And so many of us end up waiting until we’re 65 to be happy.  In fact, by that point, many of us are so used to setting goals and postponing happiness that we don’t know what to do with ourselves after we leave our professions.  That’s obviously no way to live your life either.

So the future doesn’t seem to be the way to go when we think about education.  In the next post, I’ll go into why the alternatives, i.e., living in the “golden age” past of education, or else turning education into nothing more than a reinforcement of existing biases, aren’t viable options either.

Leave a comment

Filed under Uncategorized



In honour of Hallowe’en, I’d like to spend a moment talking about zombies. 

 What I’m referring to here are horrible, shambling, disjointed things that, though they were put in the ground years ago and really ought to be dead, keep popping back up to eat people’s brains. 

 Yes, the Zombie Idea is hard to get rid of.  We in the field of education see more than our share of them, probably because our realm of expertise is so heavily controlled by people who have little or no background in it.  A Zombie Idea is usually one which is ideologically based; these are particularly tough to eradicate.  Though studies are done to find out the reality behind certain ideas people seem to want to have about education, and the results are often as decisive as a shotgun blast to a decaying head, you can bet that within months or even days, the Zombie Idea will lurch back to life and pester you, forcing you to have to deal with it all over again.

In the field of Law, there’s such a thing as precedent.  Once something is accepted as true, it takes something really extraordinary to rehash the debate from zero.  The Nuremberg Trials, for example, once and for all buried the “just following orders” defence for committing atrocities.  Nobody can get away with that anymore.  Any lawyer trying to shrug off his clients’ guilt today by claiming that they were following orders won’t get his case heard seriously in court.  We know it’s not a reasonable defence.  We’ve been through that already.

I started this blog because I didn’t feel there was enough of a conversation going on about education.  I am strongly against the curtailing of free speech.  But there comes a time when ideology trumps evidence, and old ideas are brought out, dusted off, and set to work gnawing at our brains for the umpteenth time since whenever.  Merit pay for teachers will separate the wheat from the chaff!  More homework for students will increase their academic success!  Teaching kids about sex will just increase their chances of STDs and pregnancy!  We need to grade students in order to motivate them to do work at school!  And on and on.

 All of these ideas sound plausible.  “Someone should check to see if that’s true!” is a good response to something that has an air of plausibility.  You would think that we’d hear that response more often.  But there’s news:  we checked.  They don’t work.  They never did work and they never will work, for reasons that are complex, interesting, and fundamental to the process of learning.  They are just, plainly put, wrong.  Smart people, using good testing equipment and procedures, have examined the evidence and found that although they sound good, they are just not true. 

 And that should be that.  From there, we ought to be able to move on and find out why things that sound like they make sense turn out not to be the case.  We might learn something about reality, for instance.  Instead, we have to slog through the same mud over and over again, every time some ideologically-driven wingnut decides to use our profession as a hobby horse.  Education may be a political football, but even footballs eventually make their way down the field. I’m stunned by how the debate went during the last Provincial and Federal elections.  Facts seem to matter little.  Ideology forces many people to ignore them even if they are reported.

I’d like to call here for a much stronger system of parameters for the education debates.  I’d like for the ideas discussed to be based on evidence.  I want policy to be evidence-based, above all.  And I would like a process of precedent to be set up in the public conversation, where we don’t have to explain absolutely everything from scratch, every time.

Sadly, as long as the field of education remains in political hands, and as long as the antiquated hierarchical system within schools is not replaced with a more democratic system wherein educational leaders are elected from among teachers and researchers, this won’t happen.  It’ll just be my ideology versus the zombies’ ideology, swinging back and forth every election. 

Braaaiiiiinnnnnnsss!       Whhyyyy don’t we uuuuuuuusssse theeemmmmm???

Leave a comment

Filed under Uncategorized