Topics

A veteran journalist's take on such diverse subjects as religion and religious violence, democracy, freedom of expression, sociology, journalism, criticism, travel, philosophy, Southeast Asia, politics, economics, and even parenthood, the supernatural, film criticism, and cooking. Please don't hesitate to participate by starting a comment thread if you have an interest in any of these subjects...or anything else, for that matter... p.write@gmail.com

Language in the era of uncertainty

Declining Standards or a Sea Change?

Pagun

(VANCOUVER ISLAND) I use the phrase “Sea Change” ironically to point out the declining standards also referred to in the slug above this essay.

Although Shakespeare used the phrase in Ariel’s Song in The Tempest, the expression “sea change” is actually a reference to Shakespeare’s Hamlet. In the bard’s greatest tragedy, Prince Hamlet of Denmark undergoes a nearly inexplicable modification of his personality and character during an off-stage adventure at sea, one that is never performed but only described to the audience through the play’s dialogue. Unobserved by the audience, Hamlet embarks on the adventure as the melancholy Dane, an indecisive, dithering mass of uncertainty. When he returns, he is strong, focused, stalwart, and determined. He has undergone a sea change.

That phrase is commonly employed by writers today who mean little more than a modification of the status quo; as a metaphor, it has lost its punch through misuse and overuse.

I only mention that single and not terribly important example as an indicator of an increasing degradation of the English language. Unexamined metaphors are one thing, but the erosion of the fundamentals of the language is something else again. The inclination to disregard rules and conventions of usage is a clear trend, and it is snowballing, gaining both momentum and acceptability. I doubt that anyone who is proficient in grammar, spelling, and word usage has failed to notice, and even decry, the way the English language has lost much of its beauty and elegance in an unequal trade for a naturalistic sound in published prose.

Even a cursory look through the Internet editions of highly respected publications like The New York Times, The Atlantic, or Time Magazine will reveal typo after typo, syntactical errors, and misused words. Just today, I saw an article by a professional journalist who used the word “clique” where she clearly meant “cliché”. An error on the part of an auto-correct feature? Perhaps. A slip as a result of writing under the pressure of deadline? Could be. But the point is that errors of all types are increasingly common. What is significant, though, is not so much the frequency of errors, but the indifference displayed even by professionals to their appearance in print or online.

It is exceedingly rare to read anything online today that is entirely free of errors. What is becoming increasingly common, though, is the deliberate employment of chatty colloquialisms, Web-jargon, and acronyms. Serious articles contain expressions like LOL or WTF; even presidential candidates rely on social media familiarity by posting phrases like “delete your account”, which would have been meaningless even a decade ago.

Now this essay isn’t intended to be a crotchety, pedantic lament from a professional writer for “the good old days”. On the contrary. I’m writing this piece to suggest that we may just be at a watershed point in the history of the English language. Historically, it wouldn’t be the first time the language has undergone a process of rapidly overhauling itself.

The Great Vowel Shift of the 15th to 17th Centuries was a process of radical modification in the way English was pronounced; essentially, before the shift English sounded like Chaucer’s, and afterwards like Shakespeare’s. At roughly the same time the pronunciation was changing, the language itself was changing from Middle English to Modern English. A quick way to get a sense of the scope of the change is to compare the readability of something by Sir Thomas Malory from the 15th Century to the ease of access enjoyed by the King James version of The Bible, written between 1604 and 1611.

Two great dictionary-makers also had a marked impact on the English language. In 1746 in Great Britain, Samuel Johnson was contracted to produce a more definitive English dictionary than the haphazardly researched ones then available. Over nine years, he single-handedly researched, compiled, and wrote A Dictionary of the English Language, sometimes printed as Johnson’s Dictionary. Widely recognised as being among the greatest scholarly achievements in history, Dr. Johnson’s magnum opus remained the standard English language reference until The Oxford English Dictionary was completed over 170 years later. Thanks to Johnson, spelling became standardised. Prior to his dictionary, spelling was idiosyncratic and capricious, with words being written however the writer heard them in his head at that moment. William Shakespeare famously even spelled his own name differently on different occasions. Now there was a correct and an incorrect way of writing the spoken language.

On the other side of the pond, in 1806 Noah Webster’s first dictionary, A Compendious Dictionary of the English Language, was published. Webster can be credited with standardising the American spelling of English language words, and formalising the differences between British and American spelling. In America, centre became center; labour, neighbour, and favour all lost their Us and became labor, neighbor, and favor; specialise swapped its second S for a Z (now called zee rather than zed) and was spelled more like it is pronounced: specialize; and so with civilize, vitalize, recognize, and harmonize.

And all of these modifications, shifts, and changes are in the relatively modern history of the language. Before the Norman Conquest in 1066, a modern English speaker would not have understood a Briton’s language at all; the introduction of Norman French to the language of the Angles and Saxons altered the language of the British Isles and sent it off on a trajectory that culminated in today’s version.

That we are in a new period of flux and uncertainty regarding English usage can be attributed to the Internet, of course, but there is also a sociological and even political component at work.

The Internet has provided an audience undreamed of by writers even 20 years ago. And that audience is available to literally anyone who has something to say and access to a computer. Those of us who started our careers in print media were second-guessed on matters of adherence to the publication’s style manual, on spelling, and on usage, and our editors always had a spike waiting if we wrote crap. Nothing was set in type without having been seen by at least five sets of eyes. Mistakes in print, therefore, were rare. In contrast, for the vast majority of people who post on social media, or even personal blogs and websites, there is no third party filter; there is no editor to exercise control over content, there are no copy editors to impose style, spelling, and usage standards, and there are no proofreaders to provide a final check for errors. What they post can be stream of consciousness, straight from their keyboards to a potential audience of hundreds of millions of people. And no matter how unhinged the copy is, it can find an audience within that vast amorphous crowd that wants to read more of it. That’s how The Drudge Report and Breitbart manage to stay in business.

People are becoming accustomed to sloppy syntax and to the casual employment of neologisms, unconventional grammar, usage, and spelling. The unprofessional writing standards of even very successful Internet outlets have become the new normal. And that’s where the sociological and political component comes in.

For about 20 years now, and increasingly every day, a movement within conservative circles has deliberately disparaged and undermined any hint of intellectualism or expertise. Part of the new conservatism, especially as represented by the so-called “alt right”, is an insistence that an expert opinion is just an opinion and that anybody, however ignorant of the subject, can have an equally valid one. Scientific claims can be refuted, in the conservative zeitgeist, by anyone who makes a contrary claim; accepting an expert’s claim is seen as elitism unless even the least educated counterclaim is given equal standing.

That, of course, crosses over into the field of writing. A professional who automatically avoids splitting infinitives, who ensures that verb and subject agree, chooses words with care, and is consistently accurate in his spelling, is condescended to as an “elitist”. The anti-elitist thinking goes so far as to suggest that it is safe to reject the arguments of someone who frames them logically, presents them with precision and care, and supports his points with factual evidence. To criticise the quality of writing in the blog of someone who is barely literate is to invite a rebuttal that would argue that it must be good writing because of the number of hits it racks up. Popularity justifies bad writing. Bad writing is becoming standardised. But this may be the dawn of another shift; this time from Modern English to a 21st Century English language.

For a new and updated version of the English language to become the agreed upon standard, perhaps a period of fluidity, of flux, is a necessary precondition. Before a Samuel Johnson can emerge and rewrite the standards of acceptability in the new New English, it may be necessary for people to become dissatisfied with the capricious way the language is spoken and written. If so, that probably won’t take very long. The problem with people using language willy-nilly, without reference to conventions or rules, is that communication suffers.

People simply won’t be able to understand one another clearly without conventions of usage. We have already seen the bastardisation of the English language leading to confusion in the current US presidential election. The mutability of words allows Paul Ryan, for example, to refuse to defend his party’s nominee, to acknowledge that Donald Trump is the very definition of a racist, to report being “sickened” by his misogyny, and yet somehow continue to endorse him. When words do not have a clearly understood shared meaning, every statement made can be weaseled out of as having been misunderstood. That’s where we are now.

But soon, one can hope, people will reject the muddy, indefinite, and vacillating use of language that is so common today. A new set of conventions will, one can hope, emerge, and lead to a comfortable period of clarity, understanding, and perhaps even elegance of expression in the English language. Perhaps then public figures won’t be able to get away so easily with claiming they didn’t say something they were recorded saying. Words have power; it’s incumbent upon us all to insist that the power be used with care.

ENDITEM….

 

Splitting the Baby

A Tale of Two Opinions

Pagun

(VANCOUVER ISLAND) A favourite technique of the right wing seems to be to attempt to influence public opinion by pretending that there is a serious debate on a subject of importance when in fact there isn’t.

An obvious example of that tactic is the right wing’s insistence that the question of anthropogenic climate change is a controversial issue; that there is genuine disagreement as to whether human activity is contributing to climate change. Not that they’d ever admit it, but even that position represents a retreat from their original argument that climate change (the phenomenon formerly known as “global warming”) simply didn’t exist outside of the fevered imaginations of leftist socialist tree-hugging alarmists. When the elephant in the room started to fart and trumpet loudly enough that its existence could no longer be ignored or denied, the argument became: sure, the climate is warming up, but it’s part of a natural cycle; dumping millions of tons of greenhouse gases into the atmosphere has no effect on the planet. And, of course, that “argument” came from politicians who, entirely coincidentally I’m sure, accepted huge contributions from Big Oil and, also coincidentally, voted to give those very companies billions of dollars annually in corporate welfare. Where that spurious argument did not come from was any actual climate scientist.

The level of public discussion actually included everyday conservatives pointing to every record snowfall and unseasonably cold day and shouting out that here was evidence that global warming was a liberal hoax. Rather than becoming involved in a hopeless attempt to explain the distinction between climate and weather, or to explain how planetary warming could lead to anomalous weather events in some areas, climate scientists started to use the phrase “climate change” to make the truth a little easier to grasp. Still, in an attempt to demonstrate to the public at large that there was a serious debate on the issue, at one point the shills for Big Oil managed to put together a list of “scientists” who held that there was no such thing as human-generated climate change. It took about twenty-four hours for that ploy to be exposed as a fraud. Among the deniers were high school science teachers who had never published in peer-reviewed journals and a wide selection of experts in fields like anthropology and dentistry. What was absent was any representation of climate scientists. Despite the rhetoric, there has rarely been, at any time in history, so solid a consensus among scientists; only crackpots and non-experts dispute anthropogenic climate change in 2016. And conservative politicians who have sold their constituents a bill of goods.

As far back as 2001, actual climate scientists published their consensus and their scientific opinion on climate change in the Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC). Its conclusions were summarised as follows:

  1. The global average surface temperature has risen 0.6 ± 0.2 °C since the late 19th century, and 0.17 °C per decade in the last 30 years.
  2. “There is new and stronger evidence that most of the warming observed over the last 50 years is attributable to human activities, in particular emissions of the greenhouse gases carbon dioxide and methane.”
  3. If greenhouse gas emissions continue, the warming will also continue, with temperatures projected to increase by 1.4 °C to 5.8 °C between 1990 and 2100. Accompanying this temperature increase will be increases in some types of extreme weather and a projected sea level rise. The balance of impacts of global warming becomes significantly negative at larger values of warming.

 

These findings were and continue to be recognised by the national science academies of all industrialised nations. Since then, climate change has become more pronounced and the scientific consensus has become a virtually unanimous voice. There is, in other words, no meaningful debate.

Nevertheless, people with a vested interest in maintaining the current level of hydrocarbon consumption insist that there is a genuine debate to be had and insist that no serious action be taken until the “controversy” is resolved. That technique is known by logicians and rhetoricians as “the fallacy of the middle ground”: the mistake of believing or asserting that if there are two sides to a dispute or two competing opinions, the truth is to be found somewhere in the middle. While that may be true on some occasions, and while it may seem intuitively democratic and fair, it is an affront to critical thinking. The tactic employed here is to stake out a position absolutely contrary to reality and try to force people to move away from the truth and toward an artificial middle ground.

However, simply asserting something does not give the assertion legitimacy or any intellectual standing. Despite the right wing’s anti-intellectualism and dismissal of expertise as elitism, an expert opinion carries more weight than an uneducated, unsupported claim. That is most particularly true when we are speaking of scientific propositions being contradicted by insisting that anybody’s unsupported opinion is as valid as a scientific conclusion.

There is no serious debate on climate change. There are scientific conclusions, and there are uninformed opinions and wishful thinking based on, of all things, political views. That is not a debate. The only debate is how to deal with the reality that the world is facing a clear and present danger that we continue to exacerbate while we keep our heads in the sand. And we do that so the most profitable corporations in the history of the world can continue to increase their revenues and power, and collect more billions of welfare dollars, courtesy of the politicians they own.

Another example of the technique of insisting there is a controversy where none exists is the increasingly nonsensical insistence on the part of conservative, and especially evangelical, Christians that creationism (or its uptown cousin, intelligent design or ID) should be taught alongside evolution as a competing scientific theory. Their argument is simply this: evolution is a theory; so is ID. They should have equal prominence in schools, and refusing to teach ID on an equal footing is one more example of the modern persecution of Christians and Christianity.

This, of course, is another non-debate. The fact that it is even discussed is evidence of the lack of education of the ID proponents; they don’t understand what, in science, a theory actually is. Having read little but publications that offer theories like: Elvis is alive and living on life support in a cryogenic chamber in Area 51, or President Obama is a Kenyan Muslim plotting the destruction of America, or the moon landings were faked by Steven Spielberg as a final project to graduate from film school, they don’t understand what a real theory is. They don’t understand that a scientific theory is an explanation of phenomena; an explanation that has been examined, scrutinised, and subjected to a series of repeatable experiments and has survived all attempts to falsify it. A scientific explanation is only considered to be a theory if it is testable by experiment or other empirical method. And those tests must, to be valid, be attempts to disprove or falsify the proposed explanation (or, in scientific jargon, the hypothesis); it is easy to find evidence to support a hypothesis, but for the hypothesis to become accepted as a theory, it must survive every attempt not to prove it, but to falsify it.

Evolution is a theory. Anthropogenic climate change is a theory. Gravitation is a theory. Even the existence of atoms and subatomic particles is a theory. In fact, most of those things are regarded as facts by any educated person. They are called theories simply because explanations of phenomena can always be refined and tweaked; to call them facts would be sloppy science. Intelligent design, in contrast, doesn’t even qualify as a hypothesis. It is merely an assertion based on an interpretation of a compilation of folk tales told by illiterate late Neolithic Middle Eastern nomadic goat herders and written down sometime over two thousand years ago. To call it a theory is to misunderstand and misuse the word.

At first glance, it is bewildering that scientific questions have become political debates, with sides lining up along the liberal/conservative divide. But when one considers that the denial of human-caused climate change is supported by the same people, and for the same reason, as was the denial of the connection between tobacco smoke and lung disease, it becomes a little clearer. Their phony arguments are sponsored by industry and insisted upon because doing so makes right wing politicians a lot of money. The same is true of the refusal to make any real attempt to stem the flood of deaths by gunshot; politicians have been paid to insist that guns don’t kill people.

The outlier here is the insistence on the propagation of intelligent design as science. Nevertheless, there is a political element in that phony debate; conservative and evangelical Christians, the proponents of ID, tend to be on the right of the political spectrum, so conservative politicians find themselves pandering to their crackpot notions in an effort to ingratiate themselves. In reasonably rational politicians, this is merely cynical; in the devout, it is simply one more example of the religious right’s constant pressure to undermine democracy and create a Christian theocracy.

Critical thinking, self-education, broad reading, and constant vigilance are all needed to push back against the forces that would twist the facts to fit a political agenda. I believe very strongly that, rather than expending enormous amounts of money and energy to teach particular doctrines in schools, there should be a combined effort, from both sides of the political spectrum, to include critical thinking as an integral part of the curriculum. When rational scepticism, an understanding of rhetoric, and the ability to recognise sophistry and logical fallacies are all part of students’ arsenal, it would be interesting to see how many of these idiotic non-debates simply fizzle and disappear.

Enditem…

Religion, Logic, and Correspondence

Helping with critical thinking….

Pagun

VANCOUVER ISLAND, CANADA – I just thought I’d run this little bit of correspondence as an example of the kind of thing that occasionally arrives in my email. This one, despite its tone of superiority and hostility, was less offensive than most that come from people who hold similar points of view, and since he or she claims to be a high school student, I didn’t simply dismiss the writer as arrogant and obnoxious. Nevertheless, the writer demonstrates a very worrying tendency that is all too common among older versions of him or herself: an inclination to extend philosophical disagreement to personal animosity. Despite my decision to address my interlocutor with civility (okay, and a little condescension), the hostility in his/her original email is palpable.

I should note that I received no response to my reply.

Pagun

Hello,

Religion: a personal set or institutionalized system of religious attitudes, beliefs, and practices. (according to http://www.merriam-webster.com/dictionary/religion).

According to this definition atheism in itself is a religion. In your articles (that lack logical arguments and correct spelling) you state many “reasons” why religion is “obscene” and should not be allowed and should be turned against. According to this definition your entire beliefs should not be encouraged and your arguments are hurting your own opinion. By informing people of how bad religion is you are saying that your beliefs are also wrong.

If you would like to gain proper knowledge of the subject you pretend to know so much about i am sure the internet would help in that research. Until then enjoy posting the defamation of your own beliefs.

Sincerely,
A high school student.


 

I emailed my interlocutor the following response: (Pagun)

Dear “A high school student”:

First of all, I would like to offer my sincere thanks for reading the articles on my website, and for giving them such intense consideration. I am very grateful, as I want to try to make people think about subjects that are of interest to me; it appears that I have succeeded, in your case at least!

Let me start by addressing your main point.

The definition of “religion” with which you start your email serves to demonstrate precisely that atheism is NOT a religion.

Let’s start with religious “attitudes”. Atheism doesn’t meet the definition provided since it espouses no religious attitudes; rather, it denies the validity of any god and therefore most religion. Oh, to be sure, most atheists have an attitude toward religion, but that is something quite different from having a religious attitude. But since that question of religious attitudes is the most subtle of the three benchmarks you’re using, let’s move on.

As to your (Merriam-Webster’s) second test, beliefs: atheism by definition is an absence of belief (in gods, specifically), so it doesn’t meet that standard either. Lack of belief is not a belief.


And finally: practices. Atheists have no defining set of practices; atheists are only discernible by their absence of belief, and then, only if they tell you about it. (Unless you consider critical thinking a religious practice, in which case I would argue that religions actually discourage genuine critical thinking, so I don’t see that as a way to make the pieces fit.)

In any case, the definition you provided requires that atheism meet all three tests; it quite palpably doesn’t meet even one. I’m sorry to say, therefore, that your argument from definition fails at its first premise.

As has been said before, “Atheism is a religion like not collecting stamps is a hobby.”

Therefore, as an atheist (a person without a belief in god and with no religion), I repeat my assertion that religion is a plague and that humankind would be far better off without it.

Now, as to the rest of your letter: I will simply dismiss your proposition that “by informing people of how bad religion is you are saying that your beliefs are also wrong” as it rests on a faulty premise. Beyond that there doesn’t seem to be much content other than your expressed desire that I stop posting until I have researched. You’re not entirely clear about just what you would have me research before I post again, but my suspicion is that you mean I should know more about religion if I am going to speak about it in such negative terms. If that suspicion is correct, I have two responses: In the first place, if I am going to proclaim a disbelief in fairies, there is no prevailing requirement for me to read every “expert’s” opinion on the length, span, colour, and translucency of their wings (thank you, Richard Dawkins). In the second place, my suspicion (again) is that I have done considerably more research on the subject of religion and am better versed in its intricacies than most people, including the most vocal true believers, and that quite likely includes you.

I will address briefly your parenthetical criticism of my spelling and logical arguments. As to spelling: I have no doubt that a spelling error – or many – have escaped my editorial eye; for that I apologise. I might caution you, however, about provincialism in your use of the English language. I tend to adhere to British conventions in my spelling, hence neighbour with a “U” and specialise with an “S” rather than a “Z”. Microsoft’s default spellchecker uses U.S. spelling; I don’t use it.

This brings us to logical arguments. I see you have tried to formulate one and for that I congratulate you. It doesn’t survive even casual scrutiny, but at least it represents an attempt to think rationally, something which I encourage heartily. Moreover, logical arguments are always welcome on my website…that is the place for discussion of the subjects broached in my articles.

When I get a personal, anonymous email criticising something on Pagunview, it is my normal practice just to delete and ignore it. On Pagunview, I have a comment section at the end of each of my articles and I always respond to rational argumentation; I’ve never censored anything except spam and raw abuse. In your case I chose to answer in this way because I admire your passion, and I hope that you fan that spark of critical thinking into a small flame and eventually a genuine fire.

I’d be happy to recommend some sources for learning about logic, argument, and critical thinking if you wish.

Once again, thank you for taking the time to read and think about what I have written.

Sincerely,
Pagun


 

…enditem…

I have seen the future, and it’s murder…

Loonies, and teabags, and prayers…oh my!

Pagun

VANCOUVER ISLAND, CANADA – Since it is now true that most people in North America use the Internet as their primary source of news, I’ve been trying to take the pulse of the Internet surfing public. To that end, I’ve been following news commentary on Internet news providers like Yahoo; I’ve even posted a couple of comments on a sampling of news stories to get a sense of the level of news discussion in which the general public engages. I’m here to tell you that if I have seen the face of the future, we’re in for a rough ride.

A few weeks of using the most popular Internet portals for daily news and commentary is a sobering experience. In the first place, the editors at Yahoo, to use the most popular source as an example, don’t seem to draw a distinction between news and commentary; click on a headline and your chances of opening an opinion piece by one of Yahoo’s bloggers are about the same as getting a Canadian Press, or Reuters, or other newswire piece. I have no idea how Yahoo chooses who is to be one of their contract bloggers, but they certainly have strong opinions, with, it seems to me, a right-leaning predisposition. This is fair enough, of course – unless the piece is run without clearly acknowledging that the opinions expressed are personal views and not news reporting. Imagine if my opinions in the posts on this site were run without being distinguished from news! Even I would object to an unbalanced, partisan op-ed – like most of my pieces – being run under a news headline on the news section of a news site.

I won’t even bother going on about the preponderance of celebrity gossip, gotcha photos of “celebrities” I have never heard of since I don’t watch reality shows, and intensive analysis of the wardrobe choices of virtually anyone who has ever had a picture taken. There is no need to click on those headlines, and to maintain one’s self-respect, one simply doesn’t.

I’m not even going to spend time bemoaning the wretched quality of the reporting and writing of the actual news they run between their lists of “10 Things Men Hate About Women” and “12 Foods That Will Reduce Stress”. Let us just say that the content of the news blogs is supermarket tabloid level and the form is barely literate.

But for a glimpse into the heart of darkness that seems to be at the centre of the Internet surfing experience, you need to follow one of the interactive threads provided for readers’ commentary after each piece. Now that can be truly frightening. A casual or even a serious look into these threads reveals a subculture dominated by vicious, hate-spewing, intolerant, uneducated, right wing bigots. If you want to challenge this observation, just pick a Yahoo News story on any high profile issue. Make a mild comment that suggests tolerance, or compassion, or human decency, then sit back and watch the replies come flooding in.


I read a piece on Hillary Clinton’s release from the hospital after she was treated for the blood clot she incurred when she recently fell; I commented that I was happy she had recovered and hoped that she was in renewed good health. The very first comment that was posted was a carefully thought out discussion opener. I quote it verbatim: <<Pagun, your a scrotum sucking Liberal %$#@*& who needs to be frickin shot. You and every other *&^%@#$ dont understand freedom or democracy!!!>> (No, my interlocutor wasn’t sparing my sensibilities with that collection of symbols…Yahoo apparently runs an algorithm that censors unacceptable words. Perhaps to avoid racist comments it won’t let you post the word “white”. This led me to read one of my own posts after it was cleansed and I found that I had referred to the President’s dwelling as the @#$% House.)

Apart from the clear stupidity in the response to my somewhat innocuous comment, there is a worrisome undercurrent that runs through the Internet news forums. The right wing’s violent rage is palpable and it manifests itself in outbursts of venom at the slightest hint that someone may hold a differing point of view on even the least contentious issue. For the right wing, it seems, it’s not enough to disagree with Hillary’s politics; it’s not enough to resent her bitterly; it’s not even enough to despise her; they have to wish violent death upon anyone who even treats her with a modicum of courtesy.

Imagine the fun if you comment favourably about the @#$% House’s proposals for gun controls. Since I post comments using my “Pagun” handle, it’s fairly easy for even the none-too-bright trailer trash to find this website; one mild comment supportive of the need to rein in the gun violence in the US and I was inundated with death threats apparently intended to persuade me that they were from responsible gun owners. I know they were responsible gun owners because they told me so, and then promised to use their assault weapons to <<*&^%#$  shoot (my) mother&^%$*  Liberal  &^%$  off and teach (me) about being a real man>> since I am <<a frickin fudgepacking %$#& hole>>.

Hand in hand with this extreme intolerance is an inclination to politicise virtually everything. A woman stabbed her husband and two children to death; the first response in the midst of that tragedy and the overlapping mourning of the children who died in the Sandy Hook school massacre? <<Now Obammy’s gonna want to take away knives from law obiding citizen’s>>

Something I am learning is that there are two categories of people, according to the audience who chooses to engage in public news analysis on the Internet. If you say anything vaguely positive about environmental efforts, you cannot escape your categorisation as a gay, hippy, Marxist, unemployed, welfare-sucking, intellectual, abortion-pushing, gun-hating, deluded, atheistic anarchist. And on the other side, if you are a fiscal conservative, you find it necessary to espouse unfettered civilian access to weapons of war, killing the poor, rejecting all science, Christian fundamentalism, life beginning at conception, eating the whales, drilling and fracking in Banff, paving the forests, torturing prisoners, invading every annoying country, and arming teachers. No middle ground; compromise is failure; shout the others down and deny their right, not just to an opinion, but to live.

My journey through the muck of the lowest common denominator on the web was profoundly depressing. I know there is more out there; I also look at genuine sites with actual news and therefore actual discussion, and I am sometimes refreshed by the thoughtful comments and occasionally inspired by the insights found there. What is depressing is that such reasonable discussion is hard to find, whereas the easily accessed surface stuff would embarrass Jerry Springer. This seems to me to be a perfect example of what I used to call the Pagun Principle when I taught critical thinking to first-year university classes: Ninety percent of everything is crap.

And, judging by the level of stupidity of Internet news content and of those who weigh in on it, that principle needs to include people. Yes, as Leonard Cohen put it, I have seen the future and it is murder.

…enditem…

 

 

Death merchants

The NRA comes out, guns blazing

Pagun

VANCOUVER ISLAND, CANADA – “The only way to stop a bad guy with a gun is a good guy with a gun.” Never mind the fractured syntax; it’s the tortured logic of Wayne LaPierre, executive vice president of the NRA, that is terrifying. In the NRA’s first public statement since 27 people, including 20 first-grade children and six staff members of an elementary school, were slaughtered by a gunman using a legally acquired assault rifle and several legal semi-automatic handguns, the tone was combative and defiant. The general message was that it was the media’s fault for demonising guns and that if only the teachers – and presumably the children – had been armed with their own weapons, lives would have been saved. Not ensuring that teachers carry weapons, it seems, is irresponsible, and gun control advocates can shoulder the blame for the deaths of the children. And of course, gun control advocates are further to blame for “politicising” the question of gun control legislation, according to the lobbyist against gun control legislation.

It wasn’t really a press conference – LaPierre read a prepared statement and took no questions – it was more of a commercial advertisement for the gun lobbyist’s clients, the weapons manufacturers. After suggesting that teachers ought to carry weapons to class, LaPierre dwelt on his frankly bizarre argument that since we see fit to protect the president with the Secret Service and our money in banks with armed security, we ought to ensure that there is armed security in every school. He went on to call on the government (this small-government arch-conservative gun lobbyist) to place at least one heavily armed security guard in every school in the country; the clear implication was that the government was to blame for the death of those children. It wasn’t mentioned that there was an armed guard at Columbine whose effectiveness was nil. Nevertheless it wasn’t lost on many that to implement his plan would result in a spike in weapons sales in the United States. Presumably creating a business opportunity for his clients out of the unthinkable tragedy is less cynical than seeking legislation to prevent it from happening again.

Continuing the cynical tone of the advertisement, LaPierre blamed gun violence on everything he could think of except, hardly surprisingly, guns. He blamed video games, he blamed movies, he blamed rap music, and he blamed mental illness. It never came up (as I mentioned, there were no questions permitted) that games, movies, music, and nutjobs are available everywhere in the world, but this level of gun violence is still a uniquely American phenomenon. Somehow the fact that, largely because of his organisation, the United States, with five percent of the world’s population, has fifty percent of the world’s guns just didn’t get a mention. Moreover, the fact that eighty-four percent of gun homicides in the entire world are American didn’t get any play. And the NRA sees the solution as arming more Americans.

While the world shakes its head in dismay, and while the people of the United States bury more bullet-riddled children and wonder why they are being told that more guns will solve the problem, the NRA has leapt upon this latest marketing opportunity. The NRA sells death and pimps its product by peddling fear.

The NRA has blood on its hands, but it isn’t satisfied yet; it wants more. It’s time to shut that organisation down.

…enditem…

 

The neglected survival skill

Critical thinking

Patrick Guntensperger

VANCOUVER ISLAND, CANADA

Recently I was asked to give a seminar for senior high school students; the idea was that I was to introduce them to what they would confront when they first started attending university. This wasn’t a big stretch, as I have frequently given a “Welcome to University, You’re Not in High School Any More” address to freshmen at universities where I lectured. It was largely just a matter of changing many of my observations from the present progressive tense to the future progressive.

The content by now is relatively familiar to me: choosing an institution; course selection criteria; techniques for note-taking and studying; time management strategies; pointing out the need for self-reliance and the fact that there won’t be anybody to nag them about completing assignments or attending classes; balancing social and academic activities; the value of identifying the best campus pub prior to the first day of class. (At my alma mater, York University, that is clearly the Absinthe Pub in Winters College.) So, since I had heard all of this before, the most interesting part for me was the interactive portion of the seminar, which started with a Q & A. I always like that bit because I get to know the students personally and my confidence in the survival of the human race is constantly renewed by the intelligence and enthusiasm of a few of the students. (Of course my natural cynicism is also frequently affirmed by others, but on a good day I choose to wallow in denial on that score.)

What was really driven home to me at the last seminar was the desperate need in high schools, and preferably from 1st grade onward, for some education in critical thinking.

It always surprises and dismays me when I consider that school curricula are written under the assumption that thinking skills are esoteric and suitable only for higher education, and then only for those specialising in such rarefied fields as philosophy. It’s hard to understand how our educators can believe that students can be presented with a virtual infinity of information, sort through it, master the subject to which it pertains, pass an exam on it, and compete on a world level with other students…all without having been taught first of all how to think clearly and to discriminate between genuine information and crap. Given the Pagun Principle (90% of everything is crap), that skill, the ability to discriminate, is becoming more and more crucial.

I think it’s important that the fundamentals of critical thinking be taught even in the earliest years of formal education so that all students, not just those who are university bound, are exposed to its principles. Critical thinking is a life skill. Having just watched the United States presidential election and campaigns, I was struck by how lacking in critical thinking the general public seems to be; certainly at least one of the parties running was convinced that the general public lacked that skill entirely. A knowledge of some basic logical fallacies, one component of critical thinking, would have encouraged more people to query some of the claims made, reject many of the arguments put forward, and spot the manipulation attempted in much of the advertising.

Logical fallacies are identified and catalogued errors in reasoning; they can be used inadvertently when we think instinctively rather than in a critical fashion, and they can also be used deliberately to manipulate the views and opinions of others. It is perhaps worth looking at one or two logical fallacies at this point to illustrate how important a familiarity with critical thinking can be.

One of the most common fallacies employed in everyday life is known as “The Straw Man”. That is a rhetorical device in which one participant in a debate attacks a position that is not actually held by the opposition. Debater A articulates an exaggerated, distorted, or non-existent claim or argument of debater B and then proceeds to demolish it. If done well, this can leave debater B trying and failing to defend a position he doesn’t hold. Every high school student is familiar with that fallacious argument; who hasn’t been involved in something like this?

Mom: Are you really going to wear that (shirt, skirt, sweater)?

Teen: Yes. It’s really comfortable.

Mom: Why do you always want to dress like a (bum, slut, slob)?

Mom has achieved her aim; the teen is now defending a position that involves whether he/she always wants to look like a bum, slut, or slob; in fact, the position put forward by the teen was that the item in question was comfortable. Trivial? Certainly. Mom, however, probably committed the Straw Man fallacy unknowingly and the teen didn’t spot it, so the scene is likely to escalate. A little critical thinking could have avoided the confrontational aspects of a simple wardrobe discussion.

The Straw Man is important in the larger scheme of things, though, because it is a favourite of politicians. In the recent United States presidential election, for example, Republican candidate Mitt Romney committed this fallacy; in his case he did it deliberately and to great effect.

A lie supporting a Straw Man

Specifically, the Romney campaign put out a series of television ads which condemned Obama for having eliminated the work requirement for welfare recipients, then went on to build a case for cutting benefits generally with the justification that the United States was becoming a country of takers and freeloaders at the encouragement of the president. The problem was that Obama had not eliminated the work requirement; he had – at the request of several Republican state governors – allowed the welfare regulations to be administered by the states themselves so that the work requirements could be increased if the states so desired. A quintessential Straw Man, supported by the oldest fallacy of all: barefaced lying. Someone with critical thinking skills would have identified the manipulation and questioned the entire edifice that was based upon the original Straw Man.

Another very common logical fallacy carries the somewhat forbidding sobriquet “post hoc ergo propter hoc” (Latin for “after this, therefore because of this”).  Simply put, post hoc is the error of assuming that because one event follows another it was caused by the first. We do it all the time instinctively; it is part of the way we learned to understand the world. We turn the stove on, the element heats up; we treat our toys harshly, they get broken; we eat too much candy, we get a stomach ache. The problem is that we often extend the principle to situations in which there is no causal connection between two events. Examples abound.

“I wore these socks and I played a great game; I’m going to wear them every time I play.”

“A virgin fell into the volcano and it didn’t erupt. We ought to toss one in every year.”

“I prayed that my son would come home safe and he did. Now I’m going to pray that I win the lottery.”

While the foregoing are obvious examples of a misplaced causal connection, one need only take another look at the US presidential election to find a glaringly evident use of post hoc ergo propter hoc to deceive and mislead.

Misidentifying the ACTUAL cause of the economic crisis

The Romney campaign was largely based on the Republican argument that the American economy is in terrible shape; much worse shape than it was on the day Obama was elected four years ago. In a nutshell: 1) Obama is elected, 2) The economy tanks. From that sequence, the inference is supposed to be drawn that Obama’s presidency is responsible for the economic crisis. Of course, that is quite simply wrong; the economic crisis was the result of the previous administration’s irresponsible fiscal policies, as well as its commitments – including two wars – that required the Obama administration to spend excessively in order to honour them.

There are some forty logical fallacies that most logicians agree on. There are about ten that are as common as the dirt thrown in an election campaign; it is a tragedy that students are not taught to identify them. A systematic understanding of the rhetorical devices, logical manipulations, and duplicitous methods employed by the marketers of products and political candidates, one would think, is a necessary survival skill. It becomes ever more important as information becomes easier to access.

One of the problems with the abundance of information available and the ease with which we can access it is that utter bullshit comes up in a search engine’s results just as easily as does accurate, reliable data. The fundamental thing that critical thinking does for us is to provide a framework for deciding what to believe and what to reject; given the Pagun Principle, there is a great deal to reject and only a relatively small amount to believe. And that large percentage that ought to be rejected increases exponentially during election years.

An education system that places an emphasis on critical thinking will be doing its job far better than one that emphasises rote learning and unquestioning acceptance of that which is placed in front of the students. Surely with lying and deceit as the new paradigms in politics and marketing, it is time we realised that what our students really need is not to be taught what to think, but how to think.

…enditem…

 

Problem solving 101…leave judgment out of the equation

Somebody brought this to my attention recently. It’s a piece I published in the Jakarta Post several years ago at a time when Indonesia was struggling with these issues. There was an enormous backlash against a suggestion in parliament that a harm-reduction model be applied in Jakarta to combat the growing problem of HIV/AIDS. Predictably, it was the religious right who bleated about this approach encouraging drug use, sexual promiscuity, and homosexual practices. The only difference between their reaction and the ones we have all seen here in the West is that theirs came primarily from Islamic fundamentalists rather than Christian fundamentalists.

 

Harm reduction and the fight against HIV

Patrick Guntensperger

Jakarta

It’s a well-worn cliché that education about HIV is needed in order to combat the lethal virus. However, it’s not just a lack of understanding of the scientific details of HIV and AIDS, but also a failure to understand the disease’s moral, social and legal implications that impedes our attempts to control the spread of the virus.

AIDS and HIV, the virus that causes it, come with considerable emotional baggage that muddies the waters when we are trying to deal with the disease. Unlike, say, diabetes or cancer, the victims of HIV are demonised and all too often treated as pariahs; the suffering caused by the virus itself can be less acute than the suffering caused by the stigma attached to the disease. When AIDS was first detected and identified in North America, it was thought of as a “gay disease”. It appeared to have vectored out from a very promiscuous gay flight attendant, so the first identifiable group of sufferers were his sexual contacts: other gay men throughout the world.

Little was known about AIDS at that time, not even that it was caused by the virus that would soon be known as HIV. How it was spread was not even known, although sexual contact was quickly pegged as the probable mechanism. With that pedigree, AIDS was not considered by most governments to be deserving of much investigation or research funding; after all, it was seen as a disease restricted to gay men and spread by sexual activity.

The general public’s perception was that it was of little concern to them and that if the homosexual population wanted to avoid the disease, they ought simply to stop having homosexual sex. A more radical view, expressed by a number of influential American religious leaders, was that the disease was God’s punishment for the “sin” of homosexuality. The implication was that to combat the disease, therefore, would be to attempt to thwart God’s holy retribution.

But the disease started to be found among yet another marginalised group: intravenous drug users. And the rate of the disease’s spread was beginning to alarm epidemiologists. Unfortunately, in many people’s view, junkies were no more deserving of support than queers. The result has been that the search for a cure and the battle to control the spread of HIV and AIDS have carried an enormous social and political burden that simply doesn’t exist with equally lethal but non-communicable diseases.

Now that HIV is known to be spread by blood and, to a lesser degree, other body fluids, and is not restricted to homosexuals or drug addicts, there is a stronger social will to find a cure. Great steps have been made in developing drugs that extend the life expectancy of those infected, but a cure has not yet been found. It is the spread of the virus that must be addressed in order to give the researchers time to find that cure. Unfortunately, mitigating the social and moral stigma attached to HIV and AIDS has not kept pace with the work done on the purely medical aspects of the disease.

Harm reduction techniques can unquestionably slow the spread of HIV. But there is great social resistance to the implementation of those techniques as a result of completely non-medical attitudes toward the virus and the disease itself. Since the virus can be spread by behaviour that is morally unacceptable to many, efforts to diminish the impact of that behaviour are met with disfavour.

It is widely known that the routine use of condoms would have an enormous impact on the spread of HIV. And yet there are those who fight strenuously against sexually active teenagers being encouraged to use them, the argument being that to encourage their use would be to condone or even encourage sexual behaviour among young people.

And now that the spread of HIV is most evident among intravenous drug users, harm reduction methods provided to drug addicts would be enormously beneficial. If we could erase the stigma of the disease and leave any moral objections to drug use out of the equation, we would see that efforts have to be made to ensure that injection drug addicts have access to sterile needles and are educated in antiseptic injection techniques.

Moral objections to hard drug use have to be put aside and it must be recognised that denying addicts the use of sterile needles encourages the spread of a deadly disease. And this disease has no social or sexual boundaries. It will not stay among those of whom society disapproves; it crosses every border and every social stratum and reaches out to kill our friends, neighbours and family members.

To ensure that drug addicts have sterile injection devices is neither to condone nor to encourage drug addiction. People do not decide against becoming a junky because they determine that it would be too hard to find a clean syringe, nor do they choose to become an addict because they think they know of a source of needles. But if a source of sterile needles is available to those who are addicted, the spread of this disease can be slowed down; maybe even slowed down enough to allow us to find a cure before someone in our family dies.

 …enditem…

 

Specialisation or ignorance?

The limits and purposes of education

Pagun 

(VANCOUVER ISLAND) It isn’t exactly original to point out that we live in an age of specialisation. Early in the game, young people have to make decisions that will affect their entire lives; even before they graduate from high school, students are confronted with life-defining choices. Academic or vocational? Maths and sciences, or liberal arts? The selections a student makes while still undergoing the rites of puberty will determine the course of his or her life; and some of these choices are close to being irrevocable.

Being a generalist isn’t something people aspire to and it isn’t actually possible in any real sense any more. At one time high school students and even university undergraduates were expected to learn something about everything; specialisation came later and was the result of having enough knowledge to make informed choices. Now, however, students have started to specialise so early that it isn’t uncommon for a high school graduate in math to be completely unaware of who the Duke of Wellington was, or for a first year university Arts student to be unable to tell you what a square root is, much less calculate one. This isn’t laziness or poor teaching; this is a side effect of the nearly infinite availability of information.

It has been said that Thomas Young (1773-1829) was the last man to have read everything published. That’s also been said of Samuel Taylor Coleridge (1772-1834) and Voltaire (1694-1778). None of these candidates for the crown of “last omni-auto-didact” strikes me as being very likely; I suspect that there was too much already published even in Voltaire’s time for anybody to have absorbed it all. Aristotle (384-322 BCE), some two thousand years earlier, is perhaps a more realistic candidate. In any case, the point is that we no longer live in a world where it is possible to have a general knowledge of even a representative sampling of every field of study. Perhaps in recognition of this, people, while still children, have started neglecting areas that don’t immediately fascinate them in favour of focusing on a field they think will both interest and occupy them for the rest of their lives.

This has a number of negative consequences.

 One of the most obvious problems with making such self-defining decisions so early is that children change as they grow; most men my age would have started training as astronauts or cowboys if they had been required to decide their future occupations at too early an age. To require such decisions of adolescents is only marginally better, if it is better at all. Anyone who remembers adolescence will remember the emotional volatility they endured and the self-destructive choices young teens are prone to make.

It is perhaps better to stop thinking of education as job training. Education, right up until the end of the teen years, would better serve its purpose if it were broad, even all-encompassing. A young person who understands the significance of the Napoleonic wars and the basics of Kantian ethical reasoning, recognises the logic of the periodic table of the elements, and grasps the fundamentals of evolution is a more interesting and, in most ways, a smarter person than one to whom these are all opaque areas. That person is also in a far, far better position to make any decision at all, particularly one that involves a personal, life-determining choice.

Essentially, we are raising contestants on The Price is Right. The focus of their education is on turning young people into proficient practitioners in their narrow fields; beyond that, they become nothing but consumers – ignorant, unthinking, uncritical pursuers of gadgets and status symbols. They are rewarded for screaming in ecstasy and collapsing in bliss at the thought of a side-by-side refrigerator, and for shivering with delight at the idea of a new washer/dryer. From their school years they are taught that those things are the Holy Grail of adulthood, but they are not taught, or even given an opportunity to learn, about the legends of the Holy Grail in literature and history.

We need to adjust society's expectations so that children have their childhood for growing up, not just for training. Education, until it is time to specialise in specific money-earning skills, should focus first on literacy and numeracy and then, with those as the foundation, on critical thinking. Information pervades the world; there is no shortage of sources of knowledge. The most important thing we can teach our children is to distinguish between genuine knowledge and bullshit; between fact and fantasy; between what a politician says to get elected and what is true. If logic and critical thinking were taught in schools as fundamental tools, all other forms of skill acquisition would be both easier and more effective.

As critical thinking becomes a habit, broad reading and a spirit of inquiry also become habits; learning about our past, about great thought, other cultures, scientific achievements, all these become passions. A broad understanding of the world is an almost inevitable result of learning to think clearly. A narrow mind is one of the saddest and most common human disabilities.

A great familiarity with debentures and stock options is good practical knowledge and can perhaps help a young adult make a living. Marketing expertise, ditto. Being able to solve the problem when a computer crashes may be indispensable, and ought to be learned. But never having heard of Voltaire is a tragedy, and not knowing who Socrates was is an intellectual crime.

The world suffers from a growing suspicion of real science and of anyone with knowledge of history, culture, and philosophy. When, as was recently the case in the United States, a poll can show that 46% of adults do not believe in evolution, and their leaders can run for public office on platforms built around the conviction that global warming is a liberal myth, it is clear that ignorance has progressed far enough. It is time to turn back the tide.

It is time that adults started to acquire knowledge, and time for children to learn to think…not because it will make them wealthy, but because ignorance in a world of information is the greatest sin there is.

…enditem…

 

 

 

Is Indonesia waking up?

Doing a guest spot

Patrick Guntensperger

Jakarta, Indonesia

We have now applied for a visa for JJ to go to Canada, where we intend to have his certificate of Canadian citizenship issued. Since he can't get a Canadian passport until he has that certificate, he needs a tourist visa to enter Canada; getting one entailed spending hours at a visa application centre, outsourced to a local entrepreneur, answering pointless questions. Without exaggeration, what follows is some of the verbatim conversation I had with a visa application officer:

Q: “What is your three-year-old’s current occupation?”

A: “Child.”

Q: “Previous occupation?”

A: “Foetus.”

Q: “Why is there no letter of permission from the child’s birth parents?”

A: “Because they’re dead. That’s why you have their death certificates and a Court Order of Adoption in your hand. WE are his parents.”

Q: “I still need their written and notarised permission.”

A: “Please let me speak to someone with an IQ.”

Ah well. Some things never change.

Or maybe they do.

A good friend asked me if I'd fill in for him at Bina Nusantara University for an afternoon, and, being bored senseless, I was happy to do it. It was a four-hour class in Academic English, a course and a school with which I am very familiar; preparation was minimal, and my friend Charles is very good, so I knew it would be a piece of cake. I put on my professorial face and attitude and showed up early, sober, and unhungover. Now here's the weird part: I was awestruck at the general improvement in the quality of the students at an Indonesian university.

I shit you not. It was a relatively small class, but they were almost all there – not just on time, but early. The one missing girl showed up about a minute late, apologised profusely, took a seat, and was ready to learn. The class went well and we all had a lot of fun. The kids followed my reasoning during some of the more abstruse sections on informal logic and its application to essay writing, had no apparent problems taking notes and asking reasonable questions, and, with one minor exception, abstained from laptop, tablet, and cellphone use during the lecture part of the class.

After they had been told to go ahead and use their laptops: a new generation solving real problems

I took proposals for the topics of their next assignment, which was to be an essay that describes a problem, offers a solution, anticipates objections, addresses those objections, and concludes by advocating the proposed solution. That assignment is deployed in that elementary academic writing course for first-year students every semester, and I've gone through the drill more times than I care to remember.

My past experience in that same school, with students of similar ages, backgrounds, and levels of intelligence, had routinely included young women proposing to address such issues as dry skin, hair that was too curly, being the subject of malicious gossip, or parents who were reluctant to spring for their own car and driver. Meanwhile, the young men traditionally offered to address problems including parents who were reluctant to spring for their own car and driver, the poor performance of one or another football team, or the size of the portions served in the university's cafeteria.

I was gobsmacked when the small groups they were working in came up with the problems they wished to address. They included the deforestation of Papua, the endemic poverty in Jakarta, the routine mistreatment of Indonesian migrant workers, and the human rights abuses against local populations practised by the Indonesian military when deployed as mercenary security forces for US mining companies. We spent the rest of the four-hour class engaged in lively discussion of these problems, brainstorming solutions, and anticipating objections; when the somewhat gruelling day was over, some even lingered to continue the discussions or to ask genuine questions – not one of which was whether it would be okay to hand something in late because they had a wedding to attend.

 I’m not sure whether this admittedly anecdotal experience represents anything larger; I couldn’t say whether the apparent sea change in the maturity of a small group of young Indonesians is even significant. But it is sure as hell refreshing.

I attribute a great deal of this encouraging development of social consciousness and general social maturity to their regular teacher, Charles Schuster, for whom I was substituting. Charlie is a good friend and drinking buddy; he is a long-time US expat and Indonesian resident, and he is first and foremost an artist of considerable, perhaps great, talent; certainly he has a very solid reputation. But like most artists, he actually has to work to support his art. Indonesia can be thankful that his chosen employment is that of university lecturer.

Deep in my heart, I am sincerely optimistic that I may have seen the beginnings of the sea change that will move Indonesia into the ranks of civilised countries; maybe it won't be such a no-brainer for my son to choose whether to maintain his Canadian or his Indonesian citizenship when he reaches the age of eighteen. Or better yet, maybe by then Indonesia will have developed sufficient self-respect and self-confidence that she will recognise dual citizenship like the rest of the civilised world and not force her own people to cut themselves off from their homeland in their quest for a better – or different – standard of living.

But one way or another, here I sit at 6.30 am in a 24-hour cafe, drinking warm beer and eating dim sum for breakfast, trying to work before the morning heat becomes intolerable, with more hope for this country than I have had for years.

 

…enditem…

 

 

Education should be a right

Student Debt

Patrick Guntensperger

Jakarta, Indonesia

 

As getting JJ back to Canada within the next few months becomes more and more likely, the usual parental worries are starting to replace the more immediate ones of ensuring that his adoption is ironclad and his Canadian citizenship unassailable. Those immediate concerns are incrementally giving way to worries about his long-term future: his education, his choice of métier as he passes through the horrors of adolescence, and ultimately his security and happiness as an adult.

 I recently got an email from a very good friend in Canada. To put things in perspective, she is, shall we say, of an age to have two young adult sons from a first marriage, neither of whom had a teenaged mother. She is now married to a terrific and very successful guy, runs her own thriving business and does some consulting on the side. She and her second husband know how to have fun (I met up with them on their honeymoon in Bali a few years ago and can testify to this), enjoy life, and seek out opportunities to live it to the hilt.

Nevertheless, there is a fly in the ointment. You see, almost two decades ago, my friend D hit a rough patch on the path her life was taking. She was pregnant, with a toddler to care for; she had just been licensed and was starting a career in a highly competitive business while going through an acrimonious and very messy divorce. Like many Canadians, and more every year, she was faithfully paying off a student loan. As a result of some dirty pool played in her divorce proceedings, her personal assets were frozen and her income attached, and she ended up on welfare for a brief period. During that time she defaulted on her student loan.

The student loan system in Canada, despite being well-intentioned, or at least politically correct, is an appalling boondoggle. Its very conception and fundamental structure guarantee that the system is inefficient, an immense burden on those least able to carry it, and a source of absolutely risk-free profit for those who need it least.

Neither Canada nor the provinces (which run a supplementary, parallel system) actually lends students money. Not for tuition, books, learning materials, research or living expenses, or anything else. What the government does is send the bright-eyed, promising young scholar off to its favourite Canadian – not a citizen, but a corporation – and tell the student to ask for money. Meanwhile, the government has told that corporation, one of Canada's chartered banks, to go ahead and lend the kid a pile of money. The bank is also told not to take payments on the loan until six months after the student has ceased to be enrolled full-time in an approved institution of tertiary education. After that…they're all yours.

It doesn’t take a rocket scientist to see that, even in the ideal scenario, this system is a profoundly flawed one. The ideal scenario, of course, is that the six months referred to above are the first six months of work in the student’s chosen profession after having successfully completed a course of study that has qualified that young person for full-time employment in his chosen field. Six months of steady paycheques, a chance to settle into life as an adult, and then monthly payments to the bank to compensate for all those years of education at their expense.

 Yeah, right.

What should be apparent to anyone with the intelligence of a doorstop is that the banks have a vested interest in seeing the student default on their payments. These loans, let's not forget, are guaranteed by the government: principal and interest. These loans are absolute government gifts in the event that a student doesn't graduate and therefore can't get a high-paying job, or can't find work in their chosen profession within six months of graduation, or runs into any one of an infinite number of financial obstacles. They're way better than mortgages or any loan secured by assets of any sort.

If a mortgage or other secured loan goes south, we all know that the bank can and will foreclose or seize assets; they'll take ownership of the property, sell it for as little as the court will let them get away with, and pocket the money plus costs plus interest. But all this, in financial terminology, is a pain in the ass. Banks don't want an inventory of three-bedroom bungalows, nor do they want warehouses full of RVs and motorboats, all of which need to be sold before they can make their profit; they're neither realtors nor used car salesmen…they have neither the ethics nor the consciences of either, much less the skill sets required.

Student loans, on the other hand, are an effort-free goldmine. If a recent graduate misses a single payment, the bank holding the loan does two things at the same time: it cracks open the champagne, and it sends in a demand note to the government for the entire principal, the interest on the loan, and payment of the default penalties. This can amount to a hundred thousand dollars or more. At the moment, student debt greatly exceeds credit card debt. We're talking real money here. And in the event of a default, the bank doesn't have to do a damn thing to suck it in.

But it gets better. Now the debt is transferred to the government; it becomes a Crown debt. And most people who aren't going through the system aren't aware that Crown debts have certain very nice features – for the Crown. Unlike other consumer debt, there is generally no statute of limitations on a Crown debt. If you are a bank that is owed a thousand dollars past due on a credit card, you have six years (minus one day, for some reason) to attempt to collect it or sue. If any payments are made or the debt acknowledged in any way during that period, the debt recovery game goes on, and includes the possibility of a lawsuit. If no acknowledgement or payment is made and no suit is filed by the bank, the statute of limitations kicks in; the debt must be written off as uncollectable and it is barred from litigation. But not a Crown debt. Like a diamond, a Crown debt is forever.

But it gets still better. The bankruptcy procedure is part of the system of laws governing financial matters in all civilised countries. In a nutshell, it provides that if a person or corporation gets so deep in debt that, after all reasonable attempts to negotiate and restructure the debts, there is no serious likelihood that they can ever be repaid in full, bankruptcy is available as a remedy. It allows a court to supervise the disposal of the corporation's or, in a personal bankruptcy, the person's assets, leaving only the necessities to live with a degree of dignity. The court will also attach a portion of the bankrupt's income until the bankruptcy is discharged…usually six months to a year or two. After that, the spoils are divided proportionally among the creditors and the debts are wiped clean. The bankrupt person starts over with a lousy credit rating but can turn all that around in a few years of careful financial management.

But guess what? That student loan doesn't usually get included in the bankruptcy. That's right, folks; even after the pain, loss, and humiliation of a personal bankruptcy, you probably still have to pay back that debt…the one that was risk-free to the bank. You don't start clean: you're still a financial pariah, just not a debt-free one; you still owe the money. You suffer all the pain, and you still emerge with the debt that probably drove you to bankruptcy in the first place.

And here's the cool part. The government outsources the collection of these student debts. If the government's first phone call doesn't end with an acceptable repayment schedule, the debt is handed to a collection agency. And believe me, collection agencies are homes for sociopaths. You would rather be stalked by Jason Voorhees.

Bill collectors, by and large, are bottom feeders: uneducated, otherwise unemployable social misfits – junkies, drunks, aged whores who have lost any looks they might have had, retards, bullies, and other dregs of the human race. Their training consists of being told that every debtor assigned to them can make payments on the debt but is simply being stubborn, and that their function as collectors is to make it more uncomfortable to owe the money than it would be to pay it. Then they are told that the collector with the lowest numbers on the board at the end of the month is fired. Now go get 'em.

Pay up, bitch! Your friendly neighbourhood debt collector

They'll try to get the welfare mother, the terminal cancer victim with two infant children, or the sole caregiver to dying parents with no income of her own to commit to a payment schedule that will either be impossible to meet or make their already crushing existences even more miserable. If they can't do that, if they can't collect from someone so impoverished that they worry constantly about feeding themselves and their children, they send the file to an even lower form of humanity…they send it to a lawyer to sue.

There are law firms whose entire income is based on the prosecution of these lawsuits. They work for the government and they work in bulk. These firms are the scum of the earth, the most appallingly distasteful bottom-feeding slime that ever stepped into a civil courtroom; and they thrive. They'll sue and they'll get their judgements. And they can ensure not only that the once bright-eyed and hopeful student's life is unbearable now, but that it will never get better.

 You’ve got a student loan judgement out against you and you need a car to get to interviews? Maybe a family member gives you an old beater to help you out. The scum of the bar association will send thugs to seize it, sell it at auction, and run up costs that are more than they sell the car for, increasing your debt by the amount of the shortfall. You have a special needs child who can’t leave his bedroom for long periods? Your friends and neighbours put together their few cents from bottle returns and piggy banks and buy you a television for him at a thrift shop. If you have another TV in the apartment, those same parasitical slime bags will seize it and sell it. Two TVs is a luxury, you see.

If you start a new job, the first thing that happens is that your new employer gets served with a court order requiring them to remit a percentage of your pay (40% is fairly standard) to the law firm; a wage garnishment is not an auspicious start to a career anywhere. And no employer likes the extra accounting, or the knowledge that any error on their part makes them responsible for the shortfall. In most cases, so long, new career.

I have personally seen each of these scenarios play out. In Canada. I have seen people sued for a few thousand dollars after having paid back, over their entire careers, several hundred thousand…many times the amount borrowed. This costs the economy and this costs the taxpayer. This destroys lives. But the filth in suits who barely passed the bar make out like bandits, the banks always come out on top without breaking a sweat, and the government gets kudos for its dedication to educating the next generation. It's a racket, and it has to stop.

Years ago, parents used to point out that they had started out with nothing, and look what they accomplished. Yes, look! Their children start out with a debt burden their parents couldn't even have imagined. These young people (some of us not so young anymore…the racket has been successful for a long time now) would have loved the opportunity to start out with nothing.

Canada needs to demonstrate a genuine commitment to education. That means that every student who makes passing grades in high school should have a first year of post-secondary education and expenses paid for by our government – not by its pet corporations, the chartered banks. Then every student who maintains a sufficiently high grade point average should continue to receive a government-subsidised education for as long as they stay in school.

The funny part about this is that everywhere it has been tried, it has not cost the taxpayer a single penny over the first ten years. The social benefits and the returns generated by having educated, taxpaying graduates in high-paying jobs have grossly outweighed the initial cost of educating them. But explaining that to parliamentarians who need to get elected is a waste of breath. Nobody in our current government will buck the tide of coattail-riding neocon American wannabes. So unless you're wealthy, you start your career with a debt burden that is unconscionable. Of course, if you're not wealthy, those who are would prefer that you know your place in society and forgo higher education entirely; they will be needing people to drive them about, clean their homes, and mow their lawns.

 Canada needs to implement a system of tertiary education subsidy. But more urgently it needs to forgive existing student debt.

My friend D? Her email was to tell me that she had thought she retired this debt fifteen years ago; through a mistake by the scumbag Law Society member who sued her at the depth of her once-desperate life, it turns out that about one thousand dollars was never collected. Now that she's successful, the tapeworms have woken up, tracked her down, and are placing demands upon her for payment, backed up by threats of asset seizures, garnishment of contract payments due, and any other slimy threat they can come up with. This is great for them because, through the miracle of compound interest, the amount demanded is now more than the initial debt. And they have the law on their side.
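For readers who want to see the arithmetic, here is a minimal sketch of how compounding alone can inflate a small uncollected residual; the principal, rate, and time span below are purely illustrative assumptions, not D's actual figures.

# Minimal sketch of compound interest on an uncollected residual.
# All figures are hypothetical assumptions for illustration only.

def compound(principal: float, annual_rate: float, years: int) -> float:
    """Return the balance after compounding once a year at a fixed rate."""
    return principal * (1 + annual_rate) ** years

residual = 1_000.00   # hypothetical uncollected amount
rate = 0.05           # hypothetical annual interest rate
years = 15            # roughly the span described above

print(f"${compound(residual, rate, years):,.2f}")  # prints $2,078.93

Even at that modest hypothetical rate the residual roughly doubles in fifteen years; add penalties and collection costs and the demand grows faster still.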

 It’s long since time for reason to step in.

…enditem…
