May 6, 2009

Calm before the swine

Posted in Global discourse at 9:59 am by Maggie Clark

There is reason to think positively about the strength of citizens en masse. There is reason, too, to think positively about the benefits of our new networking technologies. And one need look no farther for proof of this than the confrontation between panic and perspective in relation to the swine flu epidemic.

Swine flu had, and still has, all the earmarks for a perfect shock story: The strain, H1N1, afflicts the healthy, the strong, by over-stimulating the immune system’s response. It’s an inter-species mutant, so you can imagine the inference that it must surely be three times as strong as its avian, human, and swine strain predecessors. And the outbreak has been tied to Mexico — just one more illegal immigrant to worry about, right? (It’s even being called the “killer Mexican flu” in some circles.)

As I write this, according to the Public Health Agency of Canada, there are 165 reported cases of this H1N1 strain in humans in Canada. The U.S. claims 403 cases, and between the two of us we have exactly two confirmed deaths. According to WHO statistics (current to May 5), Mexico has 822 cases, with 29 deaths; in the whole world, 21 countries share a collective case count of 1,490, with no other confirmed deaths.

If scientists declare that the strain has established itself outside of North America, the flu will reach pandemic status. In theory, that sounds terrifying, but the meaning extends no further than the fact that the illness can be found across the globe. The term pandemic says nothing, for instance, about how lethal or non-lethal said condition is; and though some sources are fond of speculating about worst-case scenarios, the death rate so far remains very low. How low? Let’s take the U.S. numbers to illustrate: Annually, typical flu strains cause some 200,000 hospitalizations in the U.S. — and 36,000 deaths. By this measure, swine flu has a long way to go before it is anywhere near as serious a threat as its local, home-grown competitors.
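
To make the scale of that comparison concrete, here is a minimal back-of-the-envelope sketch in Python, using only the figures quoted in this post. The per-day breakdown is my own arithmetic, not a number from any health agency, and confirmed-case counts miss mild infections, so treat it as illustration rather than epidemiology.

```python
# Rough arithmetic on the figures quoted above: May 2009 confirmed counts
# for swine flu, and the standard annual U.S. estimates for seasonal flu.

seasonal_flu_deaths_us = 36000  # typical U.S. flu deaths per year (estimate)
h1n1_deaths_worldwide = 31      # 29 in Mexico + 2 in Canada/U.S., to May 5

# Seasonal flu kills roughly a hundred Americans per day on average...
deaths_per_day = seasonal_flu_deaths_us / 365
print(round(deaths_per_day, 1))  # ~98.6

# ...so one average day of seasonal flu in the U.S. alone exceeds the entire
# confirmed worldwide swine flu death toll roughly three times over.
print(round(deaths_per_day / h1n1_deaths_worldwide, 1))  # ~3.2
```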

And yet all this, for me, isn’t where it gets interesting. Not even close. Rather, what continues to surprise and impress me is our capacity for self-regulated response to the initial panic invoked around this illness. Yes, the media was talking up a storm about Influenza A H1N1. Yes, doomsday speculation abounded. And yes, many industries — sanitation and pharmaceutical companies especially — have profited greatly from all this panic, in market share and in sales.

But also abounding was — and still is — a countering force of calm. And it takes some truly extraordinary forms: mainstream news articles taking other articles to task for failing to cover the good news about Influenza A H1N1, and ethical deliberations about whether or not laughing at this illness (its name, its origins) is acceptable. And then there’s the really fun stuff: Stephan Zielinski translating the amino acid sequence of Influenza A H1N1 into ambient music. Gizmodo posting a hauntingly beautiful video demonstration of how the virus gets released. xkcd.com aptly capturing the typical range of responses to swine flu on Twitter.

In other words, for all the panic we’ve had thrown at us about this illness, many have responded with a measure of fearlessness at least a hundred times as infectious. Does this mean everyone is rid of that panic? No, of course not: these reactive trends are often regional and compartmentalized, shaped by varying interests and complex investments. Take the mass killing of all pig herds in Egypt — a perfectly rational response, surely, to a disease that at the time had produced not one case of pig-to-human infection anywhere in the world, and not one case of human infection in Egypt itself. The cull has had huge consequences for the country’s pig farmers, who, with 300,000 animals killed, have lashed back at the government in protest; doubtless this panic attack on the part of officials will leave a long list of social consequences in its wake.

But think back, for comparison’s sake, to our global reaction to SARS — the extreme panic, the devaluation of tourism in heavily affected cities and regions, the dramatic quarantining procedures. Globally, the disease racked up 8,273 cases, with 775 direct deaths (a death rate of roughly 9.4 percent by those figures, weighted heavily toward seniors). Though SARS was clearly a more serious disease than Influenza A H1N1, seasonal influenza still killed far more Americans each year; and yet our panic was long-standing and far-reaching, in large part because we were given no room for doubt: only more panic.
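
For the record, that 9.4 percent is just the straight division of the two cited totals; a quick check, in the same sketch style as above:

```python
# Crude SARS case-fatality proportion, computed from the totals cited above.
sars_cases, sars_deaths = 8273, 775
print(round(sars_deaths / sars_cases * 100, 1))  # 9.4 (percent)
```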

Similarly, I’m not convinced the relative calm in this case emerged from the ground up: rather, I suspect news articles first had to plant seeds of doubt about this issue, as forwarded by scientists reacting to the extent of media spin. I think room for doubt had to emerge from these sources first; and then the average reader, artist, and blogger could follow after — in turn serving to create more room to manoeuvre, rhetoric-wise, in future works by the mainstream media. But regardless of speculation about just how, and in what order, these groups fed off each other — the scientists, the media, and the participatory citizenry as a whole — what’s more striking is that they fed off each other at all to produce this ultimately calming effect.

We have, in the last 8 years, kicked ourselves over and over again for allowing flimsy excuses for war-mongering to stand; for allowing freedoms to be stripped from us in the name of security; for permitting, in general, the hard polemics of with-us-or-against-us to divide the population. And rightly so: When we go along with fear-mongering, we can be, en masse, pathetic excuses for an advanced and critically thinking civilization.

But cases like our reaction to swine flu should likewise give us cause for hope — and should be treated as such, with praise for measured response wherever it emerges. For as much as we can act like sheep if treated like sheep, it nonetheless takes precious little in the way of tempered social rhetoric for us to realize our own, independent engagements — fearless, inquisitive, and inspired alike — with the world instead.

May 1, 2009

Death by any other name

Posted in Military matters, Public discourse at 9:57 am by Maggie Clark

Major Michelle Mendes, a Canadian soldier stationed in Afghanistan, was on her second tour in the region when she was found dead in her sleeping quarters at Kandahar Airfield. Hers marks the third death of a Canadian woman, and the 118th fallen Canadian, in Afghanistan since our involvement in the conflict began. The media has done an exemplary job of presenting Mendes in the respectful light afforded all Canadian soldiers lost in this conflict — and perhaps with extra care, too, because hers is the second female fatality in as many weeks — but one word is pointedly absent from all talk of her “non-combat death”:

Suicide.

According to the Canadian military, an investigation into the circumstances of her death is still ongoing: evidently the possibility of her firearm accidentally discharging has not been entirely ruled out, though The Globe and Mail reports that “a Canadian government source said ‘all evidence points toward a self-inflicted gunshot wound.'”

The prominence of this story, and the blatancy of the aforementioned omission, have piqued my interest. The debate about whether to discuss suicide in newspapers — and in what ways, with which emphases — has been waged for decades. The argument ultimately centers on two points: the quest for greater public understanding, and the fear of inducing a copycat effect among readers. On each point there are fierce defenders of different approaches, each backed by their own body of research and professional opinion. Last year The Newspaper Tree published an editorial responding to reader concerns over the term’s use in relation to one case: it noted that certain organizations of mental health professionals agreed it was better to tell readers the cause of death, but that such stories needed to be presented with the “valuable input of well-informed suicide-prevention specialists” in order to be effective. In that same year, Media Standards Trust published a firm condemnation of suicide stories, citing the high statistical correlation between published stories and copycat suicides.

My problem with the omission approach, however, is its selectivity: Suicides are deemed taboo, but the publishing of violent domestic deaths, murder-suicides, and school shootings isn’t — and all of these stories arguably pertain to people in even more disturbed mindsets (one, because I do not hold that everyone who commits suicide is “disturbed” in the sense of having lost the ability to reason; and two, because these acts take the lives of others as well). A recent Time article asked whether the copycat effect was being felt here, too, pointing to the lone study completed to date on the theme. The article also offered a short history of the copycat effect in media, which reads as follows:

The copycat theory was first conceived by a criminologist in 1912, after the London newspapers’ wall-to-wall coverage of the brutal crimes of Jack the Ripper in the late 1800s led to a wave of copycat rapes and murders throughout England. Since then, there has been much research into copycat events — mostly copycat suicides, which appear to be most common — but, taken together, the findings are inconclusive.

In a 2005 review of 105 previously published studies, Stack found that about 40% of the studies suggested an association between media coverage of suicide, particularly celebrity suicide, and suicide rates in the general public. He also found a dose-response effect: The more coverage of a suicide, the greater the number of copycat deaths.

But 60% of past research found no such link, according to Stack’s study. He explains that the studies that were able to find associations were those that tended to involve celebrity death or heavy media coverage — factors that, unsurprisingly, tend to co-occur. “The stories that are most likely to have an impact are ones that concern entertainment and political celebrities. Coverage of these suicides is 5.2 times more likely to produce a copycat effect than coverage of ordinary people’s suicides,” Stack says. In the month after Marilyn Monroe’s death, for example, the suicide rate in the U.S. rose by 12%.

Journalists have a responsibility to the living. We have a responsibility to give readers the best means available to make informed decisions about the world around them. That also means doing the least possible harm. In the case of suicide, this measure of harm is difficult to assess at the outset, as even the very language of the event works against us. To “commit suicide” carries the weight of an age when suicide was deemed a crime, not a tragedy — and not, in some cases, a release from untreatable pain. To “take one’s own life” is a step up — dramatic, but delicately put — though it is unclear whether either term is preferable in keeping the copycat effect to a minimum.

That effect itself also plagues me, because I have to wonder if it occurs in part because there isn’t enough reporting: if all suicides were listed as such (3,613 in Canada in 2004; 32,439 in the U.S. — roughly 10 per 100,000 for each population), and those suicides were contextualized by a similar tallying of all deaths (drownings, the flu, and other causes of death with much higher population tolls), would that copycat effect drastically diminish over time?
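
A quick check on that “roughly 10 per 100,000” figure: the per-capita arithmetic below uses population denominators that are my own rough approximations, since the post doesn’t supply them.

```python
# Per-capita suicide rates from the 2004 counts quoted above. The population
# figures are my own rough assumptions for 2004, not from the original sources.

suicides = {"Canada": 3613, "U.S.": 32439}
population = {"Canada": 32_000_000, "U.S.": 293_000_000}  # assumed, ~2004

for country, count in suicides.items():
    rate = count / population[country] * 100000
    print(country, round(rate, 1), "per 100,000")  # ~11.3 and ~11.1
```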

I can only speculate. Meanwhile, another telling question has a more interesting answer: Can the news provide the requisite depth and breadth of coverage on mental health issues without directly mentioning suicide? In answer, I refer you to this piece from The Globe and Mail, which delicately tackles mental health in the Canadian military as a hot topic arising from Mendes’ “non-combat death,” while the Canadian Press approaches the issue from the vantage point of the female chaplain who presided over Mendes’ ramp ceremony.

There are, then, ways to nod to the issues surrounding suicide without using that word directly. But are they enough? Or does the omission of the word, in conjunction with so much open commentary about related issues, create a different reality — one in which suicide, lacking its public face, becomes at best a vague and theoretical matter?

These are difficult questions, and they grow more difficult when addressing systemic suicides — as exist among many Aboriginal communities in Canada, as well as among military personnel — and when suicide strikes the very young. To whom does the journalist owe her ultimate allegiance: the grief-stricken families, the immediately affected communities, or the public at large? How can we use the fact of suicide to better our understanding of this world we live in? Are we forever doomed to make things worse by the mere mention of suicide’s existence?

Two days ago I watched Rachel Getting Married, a film about a woman who comes home from rehab to take part in her sister’s wedding. A great many difficulties unfold as this woman struggles with guilt and self-hatred, coupled with depression and suicidal tendencies. Watching this film, I registered numerous “triggers” in myself, and cycled for a day and a half back into certain, terribly familiar mental routines. It was then, as I reminded myself that most people likely wouldn’t have had the same reaction to this stimulus, that it struck me: I will never be completely rid of these thoughts, these propensities to cycle between contentment and depression. Anything — a movie, a newspaper article, an off word from a close friend — might trigger them, and then it will be my responsibility to take control of these impulses: acknowledge them, experience them, and move past them.

I know, too, that eight percent of Canadians live with depression, and that at least 16 percent will experience a period of depression at some point in their lives. I know I’m on the lucky side of this spectrum: I’ve learned how to counter the anxiety that often pushes depression to the brink, and after years of hard engagement with my mental health issues, they are now manageable for me. I know this isn’t the case for everyone. I think to myself: what if someone in a much more agitated or suggestible state of mind watched this film instead — or others, with far more tragic endings? What if that was all it took, and the film pushed them over the brink?

Yes, a film or song or book could move someone to suicide. Most likely, it already has, many times over. In short, anything could be a trigger; anything might be the last straw. But art, like the media, has as its higher purpose the construction of conversations about the world we live in, and how we live within it. So if there is a way to address suicide directly in the news — with the aid of suicide-prevention experts; with a fully conveyed understanding of the context in which suicide operates; and with absolute respect for the families and friends each instance affects — I think we need to take it. To do otherwise, for me, is to leave each victim as alone in death as they surely felt in the lives they chose to end.

And honestly, that’s just not something I can live with.

April 18, 2009

The Heart of the Matter: A Shifting Social Discourse

Posted in Global discourse, Public discourse at 2:57 pm by Maggie Clark

A very important transition is occurring in North America, and I suspect it will be another year or so until we grasp its full implications. Just a few weeks back, Chinese financial leaders suggested replacing the dollar as the world’s reserve currency with a global reserve currency, and UN economists have since backed this proposition. Such a move would mark a shift away from the U.S. as the source of global financial stability, and towards a preexisting global discourse that would at last be given its own voice, even if North America still played a large role in the debate.

I suspect the same is very much true for socio-religious discourse: While George W. Bush was in office, the rise of right-wing Christianity, in conjunction with the U.S.’s wars in Afghanistan and Iraq, launched a polemic debate between Christians and Muslims — a West-meets-Islam, “U.S.” vs. them affair. Moreover, the rise of a particular brand of Christianity — politically motivated Evangelical Christians — created in its own right a series of related conflicts on the home front, such that Evangelical resistance to the theory of evolution in classrooms, to global warming in government policy-making, to expansive rights for women and the LGBT/IQQ community, and to various issues pertaining to “morally acceptable” content on national airwaves garnered excessive media attention and political sway.

Now, though the politically motivated Evangelical Christian community still amounts to a sizable social force, the media portrays a very different, longer-standing socio-religious battle: the conflict between Israel and the Arab world.

In this ideological warfare, North America undoubtedly still plays a crucial role, but in the last few years that role has shifted from proactive engagement to passive response. The U.S. has always been deemed pro-Israel, regarding that country as a beacon of hope for stability and the eventual spread of democracy in the Middle East. At the same time, the U.S. relies upon strong business relations with nations in the Arab world, and to this end has supplied many such countries with arms and money, and has maintained dictatorships that suited U.S. interests. Its involvement in the region has always been self-motivated.

Post-9/11, that involvement necessitated a stronger alliance with those who would fight against U.S. enemies in Afghanistan; later, it also meant stronger alliances with those who would support Americans in Iraq. But times have changed. Immigration from the Arab world into Europe has created stresses from which controversial national leaders and extreme anti-foreigner stances have emerged. The two-state solution between Israelis and Palestinians, once a viable discourse with its very own “road map” to peace, is no longer a welcome solution for many in the region. And here in North America, every political decision is becoming increasingly mired in questions of perceived Islamophobic, Zionist, anti-Semitic, pro-Israeli, pro-Palestinian, anti-Israeli, anti-Palestinian, pro-terrorist, and anti-terrorist allegiances.

This is not, by any stretch of the imagination, to argue that these terms weren’t bandied about before — of course they were. But what has been lost in recent months, from a socio-religious context, is any sense that North American values retain relevance in the debate. Even terrorism is no longer engaged as something to be feared on home soil; rather, those terms, like their aforementioned brethren, time and again reroute discussion to the matter of the Middle East.

An excellent example arose quite recently, in the matter of George Galloway. Galloway is a five-time British MP expelled from the Labour party for extremely controversial comments made in response to Britain’s invasion of Iraq. He has toured Britain and the U.S., working with many causes: some clearly humanitarian, many others complicated by statements that have brought UN condemnation upon him, and by actions that have blurred the lines between humanitarian aid and front organizations for personal gain. (I won’t make a habit of this, but there are so many controversies pertaining to his views, actions, and travels that I’m going to recommend reading his Wikipedia entry — no single mainstream article on the man comes close to covering them all.) On March 20, 2009, he was denied entry into Canada on the basis of his ties to Hamas: though he has gone on record stating that he does not agree with Hamas, Galloway gave the Hamas-led government $45,000. As Hamas is on Canada’s list of terrorist organizations, this was enough to deny him entry, though Canadian immigration ministry spokesman Alykhan Velshi’s comment on the issue is a little more dramatic than that:

The Telegraph — Immigration ministry spokesman Alykhan Velshi said the act was designed to protect Canadians from people who fund, support or engage in terrorism.

Mr Velshi said: “We’re going to uphold the law, not give special treatment to this infandous street-corner Cromwell who actually brags about giving ‘financial support’ to Hamas, a terrorist organisation banned in Canada.

“I’m sure Galloway has a large Rolodex of friends in regimes elsewhere in the world willing to roll out the red carpet for him. Canada, however, won’t be one of them.”

Galloway contested the ban and lost, but he got around the ruling by being broadcast via video-link from New York to Canadian venues. And so life went on, with the news turning to “Tea Parties” in the U.S. and Canadian outrage over the Afghan rape law. Yes, we have plenty of political matters to attend to at home; there is no shortage of issues. But the question posed by the high-profile case of Galloway — to say nothing of audience reactions to North American portrayals of recent Israeli-Palestinian disputes and Somali pirates — remains: Which conflict matters most? Not in the world at large, per se, as so many cultural wars are played out on that stage every day — but here, at home, in North America? Does our ultimate socio-political investment lie with home turf, and all the multicultural challenges upon it, or quite literally with foreign lands, and the conflicts waged there instead? If the latter, does this tie our future directly to their outcome? What are the implications (not necessarily negative!) of a national discourse set primarily by events on foreign soil?

March 14, 2009

Why Aren’t We Standing Up To Ad Hominem Attacks?

Posted in Public discourse at 11:42 am by Maggie Clark

In the wake of the Jon Stewart / Jim Cramer controversy, which I feel was not so much overly hyped as, in its polemic framework, erroneously hyped, a striking point remains unmade: Where was the condemnation of mainstream ad hominem attacks?

Specifically, Joe Scarborough of Morning Joe, by launching a heated attack on Stewart on his own show, provided a good example of the kind of argument made by Stewart’s critics all throughout the week of this controversy: Many called him out for being a comedian, and condemned him for having, in that capacity, critical opinions about the statements of others.

People have responded to this condemnation, yes. They have done so by arguing Stewart isn’t just a comedian, noting his strong history of media criticism and notable appeals to journalistic ethics. Not one person in the mainstream media has said, however, “Even if he were just a comedian, would that make his criticism any less valid?”

And that’s a problem. It’s a problem because while many fallacies are very difficult to police (being of the subtler variety), ad hominem attacks are pretty straightforward. Moreover, the ad hominem fallacy in this case is an attack on freedom of speech (in the U.S.) and freedom of expression (in Canada), because by its very nature it implies that some people’s arguments are less valid simply because of who is making them. Worse, it is being wielded by people in positions of power — people who, as members of the media, should be empowering everyone to hold them accountable for failure.

So while Scarborough was attacking Stewart for daring to critique CNBC while simultaneously being an entertainer, he (and others like him, in print as well as on TV) was also encouraging the unquestioned use of this fallacy. And that’s dangerous, because Scarborough has privileged access to the airwaves (which gives him a reach of millions) and, through his association with a notable news organization, a measure of legitimacy (which gives him an edge over pundit bloggers). He is part of a system that sets the standard for casual, daily discourse in North America — and he, like many of the people in these roles, is failing to promote fair, reasoned, empowering conversation in this realm.

The ways in which print, TV, and online articles have used this fallacy are often indirect: Headlines reading “the clown won” after the Stewart/Cramer conversation on The Daily Show are as damaging to the cause of coherent, empowering media discourse as any direct, unchecked statement of “What right does a comedian have to criticize?” could ever be.

And that’s where things get confusing: Why on earth are these statements going unchecked? Where is the dominant culture of critical analysis that curtails, both institutionally and on a case-by-case basis, statements that feed into this “dis-empowerment” of individual viewers?

There was a time when we had few on-air personalities; now we have an excess of them, and the depressing catch-22 is that if the bulk of these personalities don’t regularly remind their viewers about formal argumentative structures, fair comment, and journalistic ethics (which they don’t), those viewers will come to view whatever kind of argument exists instead as the right one — fallacies and all. And why should these on-air personalities do otherwise? They were hired because their companies know that entertainment sells; those companies don’t grasp why Jon Stewart is so successful at providing both entertainment and analysis, so they treat their entertainment with all the gravitas of serious journalism, even when it is nothing of the kind. And if a comedian — someone who readily acknowledges that he’s doing entertainment, but maintains a core sense of journalistic right and wrong distinct from his role as entertainer — calls them on it? Well, they’ve got ample public access, and they can condemn him to their hearts’ content for speaking in the first place, rather than addressing his comments. And no one will call them on it, because they’re the ones setting the discourse in the first place, and the discourse they’ve set refuses the legitimacy of a comment on the basis of the person who makes it.

… Except that there are people who do ostensibly toil to protect U.S. citizens from media abuses. The FCC is vigilant about calling out “public indecency” wherever it (or rather, the loudest of the interest groups that pressure the FCC) perceives it. And so we see justice meted out swiftly when a woman’s nipple is shown on national prime-time television, or a children’s cartoon features a character with two moms, or an expletive is used in the wrong time-slot. In all these ways, the general public is kept safe from the excesses of media.

But the unchecked use of fallacies that, by implication, strip viewers of an awareness of their own power — pushing the essence of American discourse away from what was said and toward who said it, and encouraging others to do the same? That is let stand.

I’m not saying the FCC should fine people for unsatiric use of the ad hominem fallacy. I’m just saying, Christ, wouldn’t it be great if someone in a position of media authority at least condemned it?