Monday, January 30, 2012

Rather Horse-Riding than Horse....

Last night I spent three hours watching British-produced mysteries on PBS. I enjoy trying to solve their intricate plotting. How can England turn out quality drama like this, while America creates amateurish soft porn loaded with adolescent humor?

Now there’s a mystery I can’t solve.

If we have succumbed to the “dumbing-down” of America, I don’t think TV is the origin; I just think it’s a symptom.

For the root cause, I look to our educational system. Back in 2008, I wrote a series of columns on the lack of writing ability in many of our high school graduates. And I don’t mean their lack of verbal brilliance—although that is a problem.

No, I mean they can’t write! They print—everything! I learned to print in the first and second grade. I learned to write—what today’s educators call “cursive”—in subsequent grades.

But many, if not most, of today’s public high school graduates can neither read cursive writing nor write in longhand themselves. Some cannot sign their name. Really—they print their signature!

In writing those columns in 2008, I interviewed several educators at both the college and grammar school levels. They offered several reasons why people in their teens and even in their twenties were unskilled at reading cursive documents—hand-held printing devices prime among them. (And isn’t it ironic that we call a lot of those “smart phones”?)

But again, those devices, like TV, are more symptomatic of the problem, rather than a reason why little Johnny and Janey can’t read a hand-written letter, let alone write one.

Look, I’ve used a computer for 25 years. I have texted, e-mailed, surfed the Web… But Lord knows, I can also read a hand-written note. So what’s changed?

Those educators I mentioned interviewing? Well, one question got two different answers, depending on who was asked. Cursive handwriting was not taught in any of the public schools, but all of the Catholic grammar schools continued to teach the skill of handwriting.

And now, Catholic schools are closing everywhere.

I don’t have all the answers, but it seems to me that public schools have shown no inclination to improve their teaching methods—at least if the results they’ve presented over the last generation are any indication.

One middle-aged administrative executive I talked to about this didn’t see the problem as critical. “It’s (cursive writing and reading) an unnecessary skill today—like horseback riding,” he said, shrugging his shoulders. “Today’s educators don’t waste their time teaching our grammar school students how to ride a horse, so why should they teach cursive writing?”

They may not be teaching horseback riding, but they seem to be wasting their students’ time with a lot of other solid material associated with the horse.

Monday, January 23, 2012

Change Your Verb; Change Your Attitude

One of the most noteworthy concerns of the last 25 years has been the health care problem, especially with the aging population. I call it a problem simply because I choose that word to describe this concern, and, being a writer, I take words seriously.

And when it comes to health, there’s one little, four-letter word that really causes me to react.

I’ve had my own health care since I was 17 years old. That’s when I entered the Air Force. After my military service, I again acquired my own health care when I joined the Philadelphia Police Department.

People often told me how fortunate I was that the city of Philadelphia “gave” me health benefits while I was employed there. Just like it “gave” me a salary, I suppose.

I didn’t think of my health benefits as something given; I was naive enough to believe I had worked for them. That word—give—is the one that has long drawn a reaction from me.

I first started to have trouble with the word “give” back in the late 1960s. People tend to forget (or ignore) what was happening back then, but I was a cop at the time, and many major US cities were literally set aflame during the upheaval of the ’60s. By the time the decade was ending, Congress was holding televised hearings to ascertain just why so many of our major cities were the targets of urban riots.

I especially remember two very different citizens who testified in front of that panel.

One, a middle-aged naturalized American, explained that when he first came to this country, there was no work to be had, so he picked up bricks and learned the bricklaying trade, starting a career that enabled him to support himself.

The other, a young urban resident who was born here, said, “Nobody gave me any bricks to build with, so that’s why I’ve been throwing them.”

I was still in my 20s at the time, but I immediately saw the flaw in the latter’s reasoning, even though the Congressional panel sided with his viewpoint.  (No surprise there!)

There were two things wrong with his statement, as I saw it. First, if he had the bricks to throw, where did he get them? And why throw them? Why not build with them as the first man did?

Second, and most erroneous, I felt, was his dependency on the verb “give”—that four-letter word that could “push my button,” as today’s cliché puts it. And that young man used the word quite a bit, as I recall.

“Nobody gives me a chance.”
“Nobody gives me a job.”
“Nobody gives me respect.”

I personally would have advised him (and anyone with those verbal inclinations) to jettison the word “give” from his everyday vocabulary, because it was—and still is—seriously overworked.

You see, I believe he was using the wrong verb. As young as I then was, I knew that nobody gives you a chance.
You take a chance.

Nobody gives you a job.
You win a job.

And most assuredly, nobody—no one—gives you respect.
That, you must earn.

Tuesday, January 17, 2012

Two Words to the Wise

I don’t make resolutions every January, but I do hope others will make a few, specifically, to clean up their English. Not in a “thou shalt not take the name of the Lord in vain” sort of way; I just wish certain over-used or ill-used language habits would cease.

Number one on my 2012 list should surprise no one. In fact, anyone with an I.Q. above double digits should have this on their “clean-up-the-language” wish list: how about if we just use the word “awesome” correctly?

A woman called my house last month to set up an appointment. “Will you be home tomorrow at 10 a.m.?” she asked. I assured her I would be. “Awesome!” she replied.

Really? It inspired awe in this woman to know that I would be in my home? Now here are some things that are awesome: God, the birth of a child, a sunset over the Grand Canyon… I can see all of these inspiring awe.

But not a trip to the mall, as I once heard a teenage girl put it. “You going to the mall?” she asked her friend. Her friend said she was, and the first girl simply responded, “Awesome!”

This is just one more example of our penchant for language inflation. As with monetary inflation, over-used and ill-used words like “awesome” are just not worth what they once were.

Number two on my list needs a little groundwork first. If homophobia is irrational fear of, aversion to, or discrimination against homosexuals, then logically, heterophobia is irrational fear of, aversion to, or discrimination against heterosexuals.

Just as not all heterosexuals are homophobic, not all homosexuals are heterophobic.

So, would heterophobics please stop trying to usurp the word “marriage”? For thousands and thousands of years—millennia—marriage has meant the spiritual, social, physical, and legal union of female with male, for the ultimate purpose of the perpetuation of the species.

If two people of the same sex want to join in a civil union, that’s between them, the state, and their principles. It’s therefore not my business; not my concern. Go in peace.

But please, get your own word!

Call it a civil union, call it a partnership, call it a krempfelder for all I care, but don’t steal a word that has meant something entirely different since before Moses walked the earth.

Abraham Lincoln once asked a man, “How many legs does your dog have?”
“Four,” answered the puzzled man.
“And if you were to call your dog’s tail a leg, how many legs would your dog then have?” Lincoln continued.
“Why then he’d have five,” the man answered, quite sure of himself.
“No,” Lincoln corrected him. “He would still only have four legs, because just calling a tail a leg doesn’t make it one.”

Calling a union between two people of the same sex a marriage doesn’t make it one.

Get your own word.

Wednesday, January 11, 2012

Hello, Newman! Welcome to the New Decade…Almost

And why seest thou the speck that is in thy brother’s eye, and seest not the beam that is in thy own eye?
—Matthew 7:3

Admittedly, I’m one of those guys who tends to see the speck in my brother’s eye while missing the beam in my own. It’s one of my imperfections, and, as familiar as I am with the Sermon on the Mount, I too often yield to this temptation toward superiority.

I have to work on that.

But I’m just a weak, stupid man, and consequently spend a good deal of time fighting off temptation. The best way to fight temptation is to avoid it, but these are the times that try weak, stupid men’s souls.

I first ran head-on into one of these times at the close of 1999, when I was editing a children’s newspaper geared to kindergarten through sixth grade.

We had an editorial board consisting of educators and parents, and at one meeting I asked if the schools were teaching students that the year 2000 was not in fact the start of a new millennium, as was popularly being trumpeted, but rather the last year of both the 20th century and the second millennium.

I got a collective, silent, blank stare from the board. Not one educator knew that mathematically, the year 2001 was the start of a new century and a new millennium.

I was flabbergasted! (Do people still say flabbergasted? They should; it’s delightfully onomatopoeic.)

Our newspaper’s Production Department chief leaned over and whispered in my ear, “See, I told you that you were the only one who cared about this.”

Why wouldn’t teachers (and parents) want proper mathematics taught?

Maybe it was my years of arithmetic instruction at the hands of the IHM (Immaculate Heart of Mary) nuns that gave me an appreciation for sums, quotients, divisors, multipliers, ratios, proportions, and the elegance of this perfect science; but to find that today’s “educators” didn’t even know what the devil I was talking about…well, the Mighty Macs (as we lovingly referred to our IHM teachers) would have never stood for such ineptitude.

The 10 ensuing years (1999-2009) had temporarily dulled my memory of those days of unawareness, but that troublesome “9” on the end of the waning year back in 2009 stirred up those feelings once more, and I found myself again concentrating on that speck in my brother’s eye.

In December 2009, a popular TV sports-talk show commentator spent the morning asking fans to submit their choices for the greatest athletic accomplishments of the decade, until a responder noted that we had one more year to go in the decade (he must have had the Mighty Macs).

“Oh, don’t be that guy!” the commentator demanded in a tantrum. “Don’t be that guy that’s always correcting people about the year starting with one, and not zero. You’re like Newman in Seinfeld.”

As I felt the old rage returning, I kept telling myself that sports was his forte, not math. I shouldn’t expect him to know that a decade is 10 complete years, not nine.

But, having followed sports all my life, I am acutely aware of how critical statistics are to sport fans. Did he think that 11 home runs equaled a dozen? Does first-and-10 mean you need nine yards for another first down? When you list the 10 best of anything, is the first one number zero and the last one number nine?

I suppose I can forgive him his computational trespass, for, like me, he is just a weak, stupid man.
But…to botch a Seinfeld reference! That borders on profanity.

Newman, you see, was not the guy who corrected the error of believing that the new millennium began in 2000 (see episode 20, season eight, entitled “The Millennium”). He was the guy who made the mistake by mis-scheduling his ‘Newmanium’ party.

It was Jerry Seinfeld himself who executed the counting coup d'état by explaining to the irrational postman that there was no year zero, hence the new millennium would begin in 2001.
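
For anyone who would rather check the arithmetic than take Jerry’s (or my) word for it, here is a minimal sketch in Python. The little function and its name are purely my own illustration, not anything from the show or the sportscaster:

    # With no year 0, the calendar starts at year 1, so the Nth period of a
    # given length runs from (N - 1) * length + 1 through N * length.
    def ordinal_period(year, length):
        """Return which decade/century/millennium (1-based) a calendar year falls in."""
        return (year - 1) // length + 1

    print(ordinal_period(2000, 1000))  # 2   -> the year 2000 is still in the 2nd millennium
    print(ordinal_period(2001, 1000))  # 3   -> the 3rd millennium begins with 2001
    print(ordinal_period(2010, 10))    # 201 -> 2010 closes the 201st decade; the next one starts in 2011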

Newman’s reaction was predictable—he seethed as he realized his mistake, noting that Seinfeld had bested him once again. Yes, even Newman, the representative TV sitcom loser of the 90s, immediately recognized his error. It needed only to be explained to him—mathematically.

What a shame a sportscaster can’t see the forest for the trees, and what a downright disgrace that educators can’t see the sum for the numbers.

So when newspapers and TV gossip shows parade out their ‘best of’ and ‘worst of’ lists for the past decade at the end of 2019, I’ve decided to overlook those specks in their eyes and let them revel in their inaccuracies. I’ll just augment my reading with a little more Matthew 7.

And wonder if Newman had the Mighty Macs.

Thursday, January 5, 2012

Still Not Guilty After All These Years

Writers naturally want to have an impact on their readers. That’s why we write. Sometimes that impact is lasting, even when it might not be immediately clear.

Tom Wicker recently passed away at the age of 85. A journalist for the New York Times, he rode in a press vehicle as part of President Kennedy's motorcade when the president was assassinated on November 22, 1963.

When he penned his memoirs years later, Wicker labeled that day as a turning point for the country: "The shots ringing out in Dealey Plaza marked the beginning of the end of innocence," he wrote.

Really? I know that phrase has become increasingly overused in the last forty years. I saw it in a news article shortly after the Twin Towers were destroyed in 2001. And every time I see it used as a declaration, I wander (in my mind) back in time.

How could a country that lived through World War Two have any innocence to lose a mere 18 years later, when Kennedy was shot? Did we still have innocence after losing 300,000 men in that war?

Was there any innocence left after dropping not one but two atomic bombs to end the war? Did we still have innocence after surviving the Great Depression? After living nearly 14 years with Prohibition? After seeing the carnage of World War One?

And seriously, how could a country that cut itself in half during the War Between the States—with one segment enslaving its fellow men, while the other fought to free them—call itself innocent? Especially after 600,000 young men died fighting for both of those “causes”?

This country was founded through a war with its British overseers from 1775 to 1783; then, before another century had passed, it fought itself in a bloody Civil War. And shortly thereafter, it had its first presidential assassination when Lincoln was killed in 1865. Did we lose innocence then? Or had we lost it during the Civil War slaughter?

Maybe we lost it again in 1881 when President Garfield was assassinated. And again in 1901 when President McKinley was murdered. How much innocence did we have left to lose when we suffered our fourth presidential assassination in 1963?

I guess I just believe that countries cannot have a collective innocence. And they certainly cannot keep losing it every few decades. So Tom Wicker may have penned that flowery homage to our lost purity, but I suspect it was merely hyperbole on his part. Writers tend to do that also.

If you care to log on to http://www.jimvanore.com/, you’ll find further access to my writing, which, I confess, contains very little hyperbole.

When it comes to that—I’m innocent.