Packing It In

Life has changed a lot in the last 50 years. Technologically, we’ve gone from hard-wired telephones and paper mail to an unlimited universe of advanced electronic, mobile, and Internet devices. Yet for all those technological changes, perhaps our societal attitudes have evolved even more.

Think of how dramatically our thoughts about gay rights, abortion, drinking and driving, privacy, climate change, public safety and other big-ticket issues have been altered over that time.

If you’ve ever watched the television show Mad Men, you’ll know what I’m talking about. The program is set in the early 1960s, and almost every scene involves someone a) drinking alcohol, b) cheating on their spouse, or c) smoking in a venue where it would be prohibited today.

What brought the last point to mind this week was a photo I noticed of a diplomat sitting in the White House with then-President John F. Kennedy back in 1962. Tucked in the very corner of the photo, sitting inconspicuously on a coffee table, is a fancy glass case loaded with cigarettes.

To anyone born in the last 20-30 years, the thought of being able to smoke in the White House, let alone on an airplane or in a movie theatre, doctor’s office, hospital or restaurant, is totally foreign.

Today, it is not just illegal, it’s also socially unacceptable in many circles. Reportedly, JFK was a cigar smoker in private and his wife Jackie was a heavy cigarette smoker but, even in 1962, this was not something generally acknowledged in public. However, that certainly didn’t stop the rest of North America from puffing away wherever they pleased.

According to a 2013 University of Waterloo report on smoking, in 1965, over 62% of Canadian men were smokers and about 50% of all adults in this country smoked, the all-time peak in tobacco usage. Today, just 16% of Canadians are regular smokers and the number continues to fall every single year.

That’s a phenomenal change in less than half a century. Pressure by The Canadian Cancer Society, the Non-Smokers’ Rights Association and a variety of other public and private organizations has led to more and more restrictions on where people can smoke and what age you can buy cigarettes, along with packaging changes and warning notices, plus a whole bunch of other deterrents.

Health concerns have become better known. Workplaces have banned smoking. Governments have systematically bumped up “sin” taxes. The list of hindrances has grown to the point where smokers are not just a tiny minority, they’re ostracized for taking part in an activity that, in addition to being perfectly legal, continues to be a massive source of revenue for government, accounting for over $7 billion in tax revenue annually.

Most politicians wouldn’t be caught dead smoking a cigarette in public, partly because they know their political careers would likely be dead, too. In July 1984, I was working in the Press Gallery on Parliament Hill and found myself at a picnic one Sunday afternoon, chowing down next to Brian Mulroney, who would become Canada’s Prime Minister just two months later. Seeing the writing on the wall, he told me how he’d quit smoking a short time before that, as he realized how difficult being a smoker would be while holding the highest office in the country.

Barack Obama made a similar decision in February 2011 after 30 years of being addicted to the weed. And I’m sure thousands of other politicians made the same commitment, partly for their health, but mostly because it’s become a habit the majority of people not only don’t participate in, but actually frown upon, especially when it comes to the people they elect.

There’s an interesting article in the November 2014 issue of The Walrus by longtime magazine writer Lynn Cunningham about her lifelong attempt to quit smoking, part of which details her spending time in the Mayo Clinic’s Nicotine Dependence Centre.

After 50 years and numerous attempts to rid herself of the habit, nicotine had become a vice she knew she couldn’t overcome without serious help. Serious enough to travel to Rochester, Minnesota and pay $5,500 U.S. for the Mayo’s eight-day cessation program.

Cunningham talks about the lack of residential treatment options for those who simply cannot quit on their own – and the similar lack of public sympathy for cigarette addicts. Unlike other addictions for which there are numerous support groups available, she says reformed smokers rarely have such avenues.

She comments on the fact that many recovering addicts, especially alcoholics, are often chain smokers who don’t even consider smoking an addiction.

And she even talks about how many popular movies have been made about the struggles of quitting alcohol or drugs – when nobody would even think of making a blockbuster about someone who quit smoking.

“Popular culture basically doesn’t acknowledge smoking as a dangerous addiction, nor does it lend it the patina of romantic dissolution that might garner users more sympathy – or better treatment options,” writes Cunningham.

Last week, the Canadian Cancer Society said it is taking the “next logical step” by urging Health Canada to introduce plain packaging for cigarettes, according to a Canadian Press article.

It’s already the law in Australia, where cigarettes have been sold in plain olive-brown packaging since December 2012, and cigarette use there has fallen sharply in the time since.

The CP story says similar plans are in the works in Ireland, New Zealand, the United Kingdom and France. “Plain packaging is an important and logical next step for Canada to curb tobacco marketing, reduce smoking and save lives,” says Rob Cunningham, a senior policy analyst at the Cancer Society.

As more and more pressure is applied to Canada’s remaining smokers to quit the killer weed, it’s amazing to look back at the changes that have taken place since the 1960s. When the Non-Smokers’ Rights Association (NSRA) was formed in 1974, its founders had very modest goals. They hoped to convince a few people that smoking was bad for their health and, in doing so, make them consider the idea of quitting.

As Canada gets closer and closer to being a non-smoking society, the NSRA must look back and marvel at how radically life has changed in its 40 years of existence. It’s just one example of the ways our lives in this country have evolved, but it’s a profound one.

Hand Me The Remote

Are you excited about the kick-off of the new fall television season? Or couldn’t you care less? When I was a tiny TV watcher many, many years ago, I used to love this time of year. All the new shows starting and barely enough hours in the day to view them all.

Like many people, however, I now find the new season barely seems like a big deal at all. I emailed a couple of friends last month with some random magazine list of the Top 25 new network television programs this fall, and they both told me they don’t even watch anything on regular television anymore, having switched their allegiances several years ago to specialty channels.

In the ever-evolving world of TV programming, it’s become a huge challenge for the formerly powerful American networks like ABC, NBC, CBS and FOX to bolster their sagging ratings against the formidable competition from outlets like HBO, AMC and Showtime.

Over the past several years, those latter stations have given viewers enticing choices they haven’t had in the past – commercial-free programming, full-length shows that fill most of an hour instead of being chopped up into little bits, and award-winning quality (Breaking Bad, Mad Men, Game of Thrones, etc.) without nearly as much censorship as conventional television.

Plus, with viewers’ ever-shrinking attention spans, most of these stations only produce 12-13 episodes per year and run them every week. Network television airs 22-23 episodes of prime time programs and they’re constantly pre-empted for “special events” or during holidays. They also go on hiatus for months at a time, annoying regular viewers and making it difficult to follow storylines.

It’s no wonder people like my friends have pretty much given up on the old-school television many of us grew up on – and switched to shows that are fresh, new and creative.

Now the television landscape is changing again. With the introduction of Netflix a few years ago, people have a whole new way of watching their favourite programs. If you’re unfamiliar with this service, think of it as a giant video store (remember those?) where you have instant access to thousands of movies and television series.

Rather than having to drag out a DVD or Blu-ray disc (remember those?), you flick on a box and pick out whatever the heck you want to watch. All without even having to move your lazy butt from your La-Z-Boy.

Theoretically, you could watch entire seasons of television programs all in one sitting. And that’s exactly what people are doing. Forget commercials. Forget taking a hiatus whenever the network feels like pulling your favourite show from the air. Forget even having to insert a disc in a machine and waiting for ten seconds while it loads.

Of course, not every program or movie is available on Netflix, so most viewers will continue to fill their plates with a combination of other programming from conventional networks, specialty stations, live sports and other options.

Still, there’s no doubt that Netflix has revolutionized many people’s television viewing habits – just as cable and satellite TV did decades ago, as VCRs (remember those?) did in the 1980s, and as the original specialty stations (including channels like Food Network and HGTV) have done over the last 20 years.

But Netflix hasn’t stopped there. No longer just a re-broadcaster of other networks’ movies and television shows, these feisty folks are now producing their own original programming, including the most recent season of Arrested Development, along with top-notch series like House of Cards and Orange is the New Black.

Not only are they creating some amazing, groundbreaking programs, they’re also doing something unprecedented in the history of television. They’re making entire seasons of new programs available instantly, releasing what used to take months to accumulate all in one moment.

That’s right. You can watch a whole season of these programs continuously. Depending on how much abuse your retinas or your bladders can take, you could be done with an entire season of your favourite new program in less than half-a-day. They call it “binge watching” and it’s not hard to see why.

If you think traditional networks are freaking out about this, you’re right. And they’re not the only ones. The same specialty networks that seemed so hip a few years ago are now wondering how they’re ever going to top this. Why pay for a specialty station to watch a program over several months when Netflix can deliver great shows you can be finished with in just a few hours?

It’s the perfect fit for a society where we don’t like waiting for anything and impatience rules.

Life moves pretty fast. And it won’t be slowing down anytime soon. There’s not much call anymore for waiting or anticipating or dreaming about what’s around the next corner. That next corner is already in the rear-view mirror. Sorry you missed it.

Is that a good thing – or a bad one? Heck if I know. When it comes to television, I’m a graduate of the Luddite School of Idiot Box Viewing. For several years, I lived in a northern town with one measly television station, so my viewing choices were limited to “On” and “Off.” At the time, no one could have even dreamed what the future might hold. Or probably would have cared. But, like everything else in life, things evolve and we adapt. Or we don’t.

Move forward, stand still, live in the past. It’s your choice. Hand me that remote.


Have A Great Weekend!

You don’t have to go back too many years to remember when the majority of workers had weekends off. Search your memory banks and you can probably recall sitting on the back deck with your feet up, enjoying a refreshing beverage, putting the work week behind you and letting your mind drift off to Never Never Land. Zzzzzz.

Where was I? Oh, right – weekends. Well, unless you’re one of the few fortunate souls who still works Monday to Friday from 9-5, those days are long past. In addition to the normal busy parts of life (family functions, charitable endeavours, kids’ sports activities, etc.), most of the world is now burdened with a variety of tethers that tie them to their jobs 24/7, even if they’re far away from their physical workplaces.

In the “old days,” they used to say certain types of work required people to be “on-call.” What an antiquated term that seems today. Now we’re all on-call, around the clock, wherever we are, even if we’re, technically, on vacation.

Some of that may be a requirement of our employment, but much of it is self-inflicted. It’s our choice to carry our smartphones or other technological umbilical cords with us at all times, glancing at them like Pavlovian dogs every time we’re summoned, no matter who is beckoning us. We can’t seem to turn them off – and most of us wouldn’t be inclined to do so even if we could.

It’s bad enough that most of the non-stop interruptions that keep us from doing something useful with our lives involve Facebook updates (“I just bought a hat!”), tweets (#cleaningthesink), selfies (me and a lint ball), YouTube videos (Cat licks paw!!!), or whatever.

It really starts to get sad, though, when all roads inevitably lead back to our jobs: checking our emails, making notes to ourselves, calling the office, dealing with customer concerns and, of course, actually working from home for 10, 20, 30 or more additional hours a week.

At some point, we all need to shut it down and give our weary brains a much-needed snooze. In a Scientific American article from last fall entitled Why Your Brain Needs More Downtime, author Ferris Jabr says, “Throughout history people have intuited that such puritanical devotion to perpetual busyness does not in fact translate to greater productivity and is not particularly healthy.”

Instead, we should be searching for ways to disengage ourselves from work, rather than trying to perpetually add more to our overflowing plates. Quoting an essay from The New York Times by essayist Tim Kreider, Jabr says: “Idleness is not just a vacation, an indulgence or a vice; it is as indispensable to the brain as vitamin D is to the body, and deprived of it we suffer a mental affliction as disfiguring as rickets.

“The space and quiet that idleness provides is a necessary condition for standing back from life and seeing it whole, for making unexpected connections and waiting for the wild summer lightning strikes of inspiration—it is, paradoxically, necessary to getting any work done.”

How very true. Jabr adds, “Downtime replenishes the brain’s stores of attention and motivation, encourages productivity and creativity, and is essential to both achieve our highest levels of performance and simply form stable memories in everyday life. A wandering mind unsticks us in time so that we can learn from the past and plan for the future. Moments of respite may even be necessary to keep one’s moral compass in working order and maintain a sense of self.”

As well, just what are we doing with all that additional “work time?” Not much of any true value, to be honest. Quoting a 2010 study of 1,700 white collar workers from the U.S., China, South Africa, the U.K. and Australia, Jabr says, “On average employees spend more than half their workdays receiving and managing information rather than using it to do their jobs.”

It’s easy to offer advice on how to consolidate or eliminate much of our “busywork.” In the long term, though, how effective will those efforts be if we don’t make our own commitment to downsizing our lives? That means resisting the temptation to peek at our electronic devices whenever they call out to us, choosing not to spend that extra hour or ten at our workplaces, deciding not to sacrifice our nights and weekends to “catch up” on our mountain of neglected employment spewage and, generally, making a choice to put leisure time ahead of our jobs.

Or maybe all of that is some unrealistic, out-of-date fantasy. Have we come so far in our evolution that we’re ready to give up all the things we’ve always cherished and that have provided us with an antidote to our jobs?

If so, perhaps it’s time we rewrote that 1981 Loverboy classic, Working for the Weekend. In today’s frenzied world, maybe it’s time to come up with some lyrics that truly reflect where we’re at today. In that case, we’ll just retitle the song, Working on the Weekend and be done with it.

School’s Out

“No more pencils, no more books, no more teachers’ dirty looks,” the great philosopher Alice Cooper once wrote. A recent experience where I watched a large family in a restaurant spend the majority of their meal cruising their smartphones led me to wonder how students in the digital age are managing to learn in a traditional school environment – when so much of their lives revolve around their electronic devices.

With the question of new teaching methods picking at my brain, I started searching for answers on the Internet. And, as so often happens, I ended up finding out much more than I ever wanted to know in the first place.

In the case of my search for teaching trends in the digital age, I stumbled upon a professor from the UK named Steve Wheeler. The educator recently wrote a three-part series commenting on yet another article, about three new emerging teaching trends, written by Daniel S. Christian, an information technology instructor in Grand Rapids, Michigan. Now you see what I mean about already discovering more than I really wanted to know.

In his original article, Christian argues that new teaching methods are being driven directly by the upsurge in online activity, and he identifies three key changes to support his argument.

The first trend is “a move to opening up learning, making it more accessible and flexible.” Christian says, “The classroom is no longer the unique centre of learning, based on information delivery through a lecture.”

Wheeler comments that Christian’s first trend has been happening for at least the last decade. Classroom-centred learning may be cost-effective in terms of having a teacher deliver lessons in front of the class, allowing the students to reflect on what they might have learned and then testing them afterwards. But is it still effective? With new technology, Wheeler says, learning can now take place anywhere and, more importantly, at the pace of each individual learner.

Of course, by taking teaching out of the classroom, we’re also introducing a whole series of other issues. As Wheeler asks, “Will there be a divide between learning that continues to rely on traditional learning spaces, compared to learning that takes place largely outside the walls of the traditional classroom? Moreover, if there is such a divide, will it be delineated by its cost effectiveness, its conceptual differences, or its pedagogical impact?”

Christian’s second trend involves “an increased sharing of power between the professor and the learner.” He continues: “This is manifest as a changing professorial role, towards more support and negotiation over content and methods, and a focus on developing and supporting learner autonomy.

“On the student side, this can mean an emphasis on learners supporting each other through new social media, peer assessment, discussion groups, even online study groups but with guidance, support and feedback from content experts.”

Wheeler wonders if teachers will be willing to voluntarily relinquish their position as the sole instructors in the classroom and become “co-learners.” He believes, “Some would feel justified in jealously protecting their positions as acknowledged experts and resist any calls to take a sideways step and let their students lead. Knowledge is power, and holding that position of power can be seductive.”

On the flip side, will students be willing to let their teachers into their personal digital world? “They are intimately familiar with the functionality of their devices, knowing how to use them to connect to, create and organize content. They are adept at connecting to their friends and peers too, but will they be willing to power share with their professors, take on greater autonomy and assume more responsibility to direct their own learning in the future?” wonders Wheeler.

Christian’s final changing trend in teaching centres around “an increased use of technology not only to deliver teaching, but also to support and assist students and to provide new forms of student assessment.”

Wheeler says this issue may be the stickiest of all, because learning and assessment are inseparable in education. Therefore, how do you reconcile digital learning methods with traditional grading systems? Wheeler reasons, “If students are relying increasingly on digital technology to connect them with content, peers and tutors, and to facilitate new, distributed forms of learning, then we should endeavour to assess the learning they achieve in a relevant manner.”

In his own classrooms, Wheeler often frees students from the confines of paper-based essays and allows them to submit videos, blogs and other forms of assignments. But how do you grade one form against another? Wheeler tries to determine equivalencies in effort, the sequencing of content and how well his students use the different capabilities of each technology. That’s a long, long way from having students take multiple-choice exams – and an approach that seems ripe for disagreements.

Whether we agree with Wheeler’s interpretation of Christian’s new trends or not, there can’t be any doubt that the way we teach our children is undergoing an extraordinary change because of the light-speed advancements happening in digital technology.

To return to Alice Cooper’s philosophical treatise on education: “Well, we got no choice/All the girls and boys/Makin’ all that noise/’Cause they found new toys.” New toys, for sure. And we’d better start thinking of more productive ways to make use of those toys if we intend to keep pace with the way we educate our children. If not, Mr. Cooper’s prediction that “School’s out forever” will almost certainly become increasingly true.


Back To The Future

Ronald Reagan. It’s been years since I thought about the 40th President of the United States. However, in one of those odd coincidences that happen so frequently in life, I was reminded of Reagan recently after watching an Oscar-nominated movie and reading a popular 2013 novel.

The movie is Dallas Buyers Club, which tells the horrifying story about the outbreak of the AIDS virus in early 1981, coincidentally, the first year of Reagan’s administration. The film details the struggle to identify and treat the first victims of AIDS. It’s a sad, sad story of fear and prejudice and ignorance, some of which was propagated by Reagan himself.

Famously, the President refused to utter the word “AIDS” in any of his speeches until 1985, during his second term in office, despite the fact that it had become an out-of-control epidemic by that time. In 1981, there were just 159 reported cases of the disease. By the time Reagan left office in 1989, nearly 90,000 Americans had already died of AIDS.

As the movie relates, during those first few years, the U.S. government dithered and delayed, eventually setting up blind clinical trials that dying AIDS sufferers would have to wait for a year to start. By then, if they were still living, they would have only a 50/50 chance of being prescribed the untested drug AZT. If they weren’t in that fortunate group who received the drug, they’d get a worthless placebo, instead.

Dallas Buyers Club relates the story of two very different victims, one an emaciated redneck played by Matthew McConaughey (who knew this guy could actually act?) and the other a flamboyant transgender woman, played superbly by Jared Leto. The unlikely pair join forces to purchase illegal, experimental drugs from various parts of the world, creating their own “cocktails” to help prolong their lives.

The other ’80s touchstone is the novel The Interestings by Meg Wolitzer. The book centres on a group of young people who come of age during the Reagan administration. One part of their lives deals with the sudden appearance of the AIDS virus and its effects on the members of the group, one of whom becomes involved with a victim of the disease.

Twenty-five years after he left office, Ronald Reagan routinely scores near the top in surveys about “Most Admired Presidents” and many still consider him to have had a greater impact on American life than almost any U.S. leader in the 20th century. His supporters point to the restoration of American morale following the Vietnam War, the great wealth accumulated by many, the collapse of the Soviet Union and numerous other touchstones that occurred during his administration.

On the other hand, Reagan’s tenure also saw the national debt soar, relations with Iran and other Muslim countries ruined, a massive build-up of defence spending, the attempted destruction of unions and, of course, the aforementioned devastating effects of Reagan’s inattention to the AIDS virus.

Added to that, in my opinion, there was a transformation of America into a less caring, more fearful, more isolated nation, one that’s only been made worse by subsequent Republican Presidents, including Reagan’s Vice President and successor, George Herbert Walker Bush and Bush’s son, George W.

For those who never supported Reagan, he’s considered a B-list actor (one who co-starred with a chimp in the “classic” Bedtime for Bonzo), an eccentric geezer, and a dunderheaded buffoon who championed absurd projects such as the cartoon-like Star Wars defence program, which would have seen billions or trillions of dollars spent trying to shoot enemy missiles out of the air. It also led to the President’s popular nickname, Ronnie Raygun.

Rather than viewing him as a friendly, doddering old uncle, they see him as a mean-spirited tool of the rich and powerful who gave generously to the wealthy through his failed Reaganomics program, a simplistic economic system that anticipated a trickle-down of wealth to the poor and middle class, something that never happened.

Instead, Reagan’s policies sowed the seeds for an America where the rich got richer, the gap between the haves and have-nots widened, mistrust of foreign countries grew and fear became the norm in American life. It also paved the way for creepy characters like the Bushes and Dick Cheney to build on their own wealth and power at the expense of average citizens for much of the last 30 years.

In the movie and book’s descriptions of living with the AIDS virus, Ronald Reagan’s true colours shine brightly. During his tenure, the primary goal in life was to accumulate great wealth, at the same time ostracizing those who were different, promoting fear, buckling under to the religious right and ignoring anyone who didn’t fit into the President’s narrow definition of what it meant to be an “American.”

In Reagan’s United States, the AIDS virus was considered to be God’s punishment for those whose lives didn’t conform to what was considered “normal.” It was a tragic, despicable view that ended up killing tens of thousands, many of whose lives might have been spared if Reagan had kept his eye on the physical health of his country, rather than just its wallets.

A friend reminded me last week of a quote from an unknown source that says, “People were created to be loved. Things were created to be used. The reason why the world is in chaos is because things are being loved and people are being used.” Too true.

Put on as many pairs of rose-coloured glasses as you want. No matter how hard you squint, you can’t hide the fact that this popular president did so little to help average citizens, as well as the weak, the poor, the sick or the challenged. Instead, he promoted the stockpiling of wealth for those who were already well off – at the expense of the people who truly needed his help and compassion. In my mind, that’s nothing to be admired.


Read All About It

I’ll be the first to admit, I’m not a huge fan of the CBC. To be honest, I can’t even remember the last time I turned on either CBC Television or Radio. How about you? I imagine if you enjoy hockey or The National or some of CBC’s radio programs, you can count yourself as a supporter. Certainly, I’m not an advocate of disbanding either service, as it’s always nice to know they’re there if you ever need or want them.

On the other hand, I receive several daily news summaries from CBC in my email, which help give me their perspective on what’s going on in the news, arts, etc. So, it’s not like I’ve shut the Corporation out of my life entirely.

There’s one initiative they’re involved with that does excite me, however. It’s called Canada Reads and it’s been operating on CBC Radio since 2001. Each year, the program covers a different theme and involves narrowing down a list of Canadian books that listeners and a panel choose as best representing that theme.

For Canada Reads 2014, they’re looking for the one novel that could change the nation or, perhaps, even the world. A long list of 40 books chosen by Canadians was revealed last October 24th. People voted to narrow that number down to a Top 10.

That list was given to the 2014 panelists, who have the task of defending their choice during a series of debates that air on CBC Radio and CBC-TV from March 3rd to the 6th. They’ll also be streamed online. One at a time, the panelists will narrow the list down until only the winner remains.

In the past, I’ve only glanced briefly at the nominees. However, this year I seem to have a little more invested. That’s probably because I had already read two of the five novels and was actually reading a third at the exact moment when the list was released. Since then, I’ve completed a fourth.

So, who are these mysterious nominees?

In alphabetical order by author, the first is The Year Of The Flood by Margaret Atwood, by far the most famous and recognizable name on the list and generally regarded as Canada’s finest novelist. This is the only one of the five books I haven’t read, as it’s in a genre, science fiction, that I have a lot of trouble getting my head around. Atwood’s book about a future world that emerges following a manmade pandemic will be defended by Stephen Lewis, a longtime leader of Ontario’s NDP who is now known as one of Canada’s most prominent philanthropists.

The second nominee, The Orenda by Joseph Boyden, is also the most recent, having been released last September. Boyden is probably my favourite current Canadian novelist and this is an interesting and controversial book set during the early history of Canada and involving the crossed paths of three characters: a Jesuit missionary, a Huron elder and a young Iroquois girl. It’s been attacked by segments of the religious community, the native community and just about everyone else, so you know what you’re getting into. And there’s a lot of violence, so be forewarned. It will be defended by Wab Kinew, a journalist, aboriginal activist and hip-hop artist.

Next on the list is Esi Edugyan’s Half-Blood Blues. It’s not often I remember exactly where I was when I read a book but, in this case, I recall being on vacation in the sunny south and absolutely loving this novel, which tells the story of a young, black German jazz musician’s disappearance during World War II. It’s written in a “jazz language,” if that makes sense, and is a wonderful piece of literature that deservedly won the 2011 Scotiabank Giller Prize. Two-time Olympic gold medallist Donovan Bailey will defend the book.

It’s been a long time since I read Cockroach by Rawi Hage, but it’s managed to stick with me pretty well because of its dark, unsettling nature. It captures the life of a recent immigrant during one bitterly cold month in Montreal. The lead character, who imagines himself a cockroach, lives on the edge of society as a petty criminal eking out a marginalized existence. While searching out some summaries of the book, I noticed it was described as a black comedy for teens and young adults, but I’m not sure it’s a book that youngsters would necessarily be drawn to. In any case, it will be defended by comic, actor and writer Samantha Bee, who’s been a correspondent on The Daily Show with Jon Stewart for more than a decade.

Finally, there’s Annabel by Kathleen Winter, which I just finished reading a couple of weeks ago, so I can offer a very fresh perspective on it. This is a heart-wrenching story of a child who is born hermaphroditic (both boy/girl), but raised as a boy, with disturbing and sad results. It’s a book that has moments of both extreme tenderness and ugly brutality, but one I also think will remain with me forever. Sarah Gadon, a young Canadian actress who’s starting to make a big name for herself in Hollywood (with five movies set to come out in 2014), will champion Annabel.

It will be interesting to see how each of the celebrities defends the book they’ve chosen. It’s one thing to enjoy a novel, but to debate how that book might change Canada or the world is something entirely different. Excluding Atwood’s book, which I haven’t read, I’d lean towards either Boyden’s or Winter’s, mainly because the issues of native rights and sexual equality will continue to play huge roles in our country’s future.

In any case, the choice in this battle of the books will be an interesting one, as each of the novels speaks in an entirely different voice and, without a doubt, definitely has the potential to change Canada. Read on!

 

Warm Thoughts In The Dead Of Winter

It’s hard to find much good to say about last week’s extreme frigidity. Offhand, the only thing that lightens my mood when it’s -25 C out is the reappearance of the anti-climate change Luddites. I’m talking about the diehard few who cling to the completely debunked idea that global temperatures aren’t continuing to climb at an alarming rate.

Nothing brings these folks out of hiding like a record-setting cold snap. Refrains of “Whatever happened to your global warming?” were all the rage last week, rising meteorically in equal proportion to the plummeting temperatures outside.

I appreciate the fact that many of these people are just joking. It’s their winter equivalent of “Hot enough for you?” in the dog days of summer – and, just like that popular slogan, it gets tired mighty fast.

Like ostriches with their heads buried in the sand, these “denialists” claim glaciers aren’t melting, snow cover isn’t disappearing, spring isn’t coming earlier, humidity isn’t rising, temperatures over lands and oceans aren’t increasing, sea levels aren’t rising, sea ice and ice sheets aren’t disappearing, oceans aren’t warming, species aren’t migrating and tree-lines aren’t moving poleward and upward.

Thankfully, the number of misinformed individuals continues to decline – and worldwide acceptance of manmade climate change continues to grow. Apparently, all it took was a series of epic extreme weather incidents to make a large chunk of the few remaining naysayers change their opinions.

That’s especially true of Americans, a people who often seem to thrive on dismissing everything that’s happening around them. After a series of cataclysmic events, including Hurricane Katrina (over 1,800 dead, $81 billion in damage, according to Wikipedia), Hurricane Sandy (nearly 300 dead, $68 billion in damage and massive flooding), drought (the current one is called the largest natural disaster in American history) and record-setting heat waves, the number of climate change deniers in that country may soon drop to less than 10%. Hallelujah.

It’s always nice to see Americans catching up to the rest of the world, considering the untold destruction and loss of human life that’s already occurred in other parts of the planet, directly or indirectly caused by climate change.

Personally, I knew the tide was turning when one of the last bastions of global warming denial crumbled last year. In my case, I’d be talking about my 86-year-old father, who I always assumed would drown underneath a melting polar ice cap while holding a placard that said, “Climate Change Is A Hoax.” I nearly fell off my soapbox when he informed me that, “There might be something to this global warming after all.” Miracles never cease.

Like the adherents of many popular hoaxes, the anti-climate change folks still cling to the junk science that exists on the Internet, although the remaining websites that promote this crud are starting to look like projects some college pranksters might have designed when they were both extremely wasted – and terminally bored.

It was priceless to see right-wing broadcaster Rush Limbaugh hit the crackpot jackpot last week after he claimed scientists had made up the idea of a “polar vortex” to explain the frigid temperatures. In Limbaugh’s words, “They’re relying on their total dominance of the media to lie to you each and every day about climate change and global warming.”

When the anti-climate change contingent is forced to rely on someone like Limbaugh to make its case, you know they’re in trouble. This is the same clown who regularly rants against women, African Americans, Latin Americans, Native Americans, any religion except Christianity, homosexuals, immigrants and anyone who’s not a member of the Republican Party. This is someone you want on your side? Why not hire Krusty the Clown? At least that clown’s got a sense of humour.

Thankfully for Limbaugh, he probably won’t be around in 20 years or so when the world starts to get really nasty. As if it isn’t already insufferable enough in many tropical countries where temperatures are making life nearly unlivable for much of the year, it’s going to get a whole lot worse. According to a study in the respected journal Nature, tropical countries like Indonesia will start experiencing regular, unprecedented heat waves just five years from now.

An article in USA Today from last October 10th says these heat waves will start to affect much of the U.S. just 20 years later and will create a tipping point after which the temperatures will rise every year. The figures will break every record set in the last 150 years if climate conditions continue to change at their current pace. The study’s lead author says, “Whatever climate we were used to will be a thing of the past.” Scary stuff.

It’s fun to make jokes about global warming – but man-made climate change will soon be anything except a laughing matter. Living in Canada, we may end up being insulated from some of the most radical changes for a few extra years. Right now, it’s rather enjoyable to have spring arrive earlier, winters pass faster, snowfall and rainfall lessen, and to reap some of the other benefits we’re becoming accustomed to.

But, ask people in tropical countries what their lives are like today compared to what they grew up with – or talk to Americans in the drought-stricken regions – and you’ll gain a different appreciation for why climate change is something none of us should be looking forward to.

“Hot enough for you?” will no longer be a tired, summer catchphrase. Instead, it will be an inescapable reality. As the number of those opposed to the idea of manmade climate change dwindles and the temperatures skyrocket, it will be more than hot enough for everyone. And clowns like Rush Limbaugh will, no doubt, find someone else to blame for it.