Electing Time Travelers

Some of the people we elected yesterday will decide how we travel through time.

This weekend we fell back from daylight saving time to standard time. Officially the change came at 2 a.m. Sunday. There always are some folks who don’t get the message or forget the message and find themselves arriving at the end of church services instead of at the beginning, or an hour late for tee time if they worship the putter instead.

There are a lot of folks who think we should have daylight saving time year-round. Going back to standard time gives us more daylight in the mornings, but we’ll be in the dark an hour earlier in the evening. The Hill reported last week on the efforts in Congress to keep daylight time year-round: “Most Americans want to abandon the time change we endure twice a year, with polls showing as much as 63 to 75 percent of Americans supporting an end to the practice. But, even if the country does do away with the time change, the question still remains whether the U.S. should permanently adapt to Daylight Saving Time (DST) or Standard Time (ST).”

Most of the country is on daylight time eight months of the year and switches to standard time for four months. There are always some contrarians, of course. Hawaii and Arizona stay on standard time all year. Hawaii decided the Uniform Time Act of 1966 meant nothing to a state that is so close to the equator that sunrise and sunset are about the same time all year.

Arizona has a different reason.  It doesn’t want to lose an hour of morning time when it’s cool enough for people to go outdoors in the summer.

Residents of or visitors to Puerto Rico, Guam, the Northern Mariana Islands, the Virgin Islands and American Samoa don’t tinker with their clocks twice a year either.

And there’s the rub, as Hamlet says in his soliloquy. Some folks like permanent standard time because it’s more in line with our circadian rhythms and helps stave off disease. But in March, the U.S. Senate passed a bill that would make DST permanent—the Sunshine Protection Act (who thinks up these insipid names for bills?)—citing its economic benefits: more Americans would go shopping if it remains lighter in the early evening hours.

The movement to protect the sunshine has been led by Senator Marco Rubio of the Sunshine State of Florida. He says the change would reduce the risk of seasonal depression. That strikes us as a little silly and reminds us of 1970, when Missouri decided to adopt DST and some of the ladies who were regular listeners of “Missouri Party Line” on the local radio station where I worked were vitally concerned that their flowers would not get enough sunlight if we tried to “save” daylight.

The Senate has passed the bill, as we have noted. Final approval is iffy because the Lame Duck Congress has only seventeen working days left before it becomes history.  But if the House approves it, permanent DST would go into effect a year from now.

—Except in states that now operate on Standard Time. They won’t have to switch.  We recall the days before DST became more common when we had to change our watches when we crossed certain state lines.  Our annual trips from Central DST Missouri to Eastern ST Indiana in May always left us uncertain about whether to change our watches until we stopped some place with a clock and learned that CDST was the same as EST.

At least, I think that’s how it went.

Polling has found no consensus on which time should be the permanent time.

If we eliminate switching back and forth, we could be endangering our safety. Various safety officials tell us that we should replace the batteries in our smoke and carbon monoxide detectors when we change our clocks. To keep some battery life from being wasted, it is suggested they be changed just once a year, either when clocks are adjusted to DST or when they’re adjusted back to plain ST. That assumes the battery-changer remembers which change is the one for new batteries. We know of no one who marks a calendar for such events.

The article in The Hill’s series “Changing America” delves into the pros and the cons:

Sleep experts say the health benefits that could come from a permanent ST are crucial for a chronically sleep-deprived nation. In response to darkness, the body naturally produces melatonin, a hormone that helps promote sleep but is suppressed by light. Thus, having too much sunlight in the evening can actually work against a good night’s sleep. 

The status quo leads to circadian misalignment, or “social jetlag,” says Beth Malow, a professor of neurology and pediatrics and director of the Vanderbilt sleep division. Malow also authored the Sleep Research Society’s position statement advocating for a permanent ST. 

Under DST, our work and school schedules dictate our actions; while in an ideal scenario, environmental changes like lighter mornings and darker evenings would regulate sleep patterns, Malow explained in an interview with Changing America. 

“There’s a disconnect when we have to wake up early for work or school and it’s still dark outside and we want to sleep,” she said.

Light in the morning wakes humans up, provides us with energy, and sets our mood for the day. “It actually aligns us so that our body clocks are in sync with what’s going on in our environment,” Malow said.

Having more energy in the morning can also make it easier to fall asleep at night when it’s darker outside. 

Overall, ST “maximizes our morning light and minimizes light too late at night,” Malow said. 

When the body doesn’t get enough sleep, risks of developing heart disease, diabetes, and weight gain all increase.  Insufficient sleep is also linked to some forms of cancer.

Polls show younger individuals are less likely to support abolishing the clock change, largely because they’re more flexible than their older counterparts who support nixing the practice. 

But teenagers and young adults are at a higher risk of negative impacts from permanent DST, partially because they’re already primed for sleep deprivation.

“What happens when you go through puberty and you become a teenager is…your natural melatonin levels shift by about two hours, so it takes you longer to fall asleep,” said Malow. “[Teenagers] end up going to bed or being tired at 11 o’clock at night, even midnight sometimes, but they have to wake up early for school.” 

Students who wake up in darker mornings and drive to school could be at a greater risk of car accidents. The same is true for workers with early commutes and individuals in the north or on western edges of time zones who tend to experience more darkness overall.

“Sleep is really, really important to our health. And right now, what we’re doing is imposing mandatory social jetlag for eight months out of the year,” Malow said. “And we’d like to—rather than going to mandatory social jetlag for 12 months out of the year—to stop the clock and go back to Standard Time which is much more natural.” 

Despite the myriad of health benefits that come from adopting ST year-round, having more sunlight in the evenings if DST were permanently adopted is a tempting prospect for many Americans, especially those who work or attend school indoors all day.

Who got us into this mess?  The Washington Post says we can blame two guys. George Hudson, from New Zealand, wanted more daylight time in the late afternoon to collect bugs.  Britisher William Willett wanted more time to play golf late in the day.

Their idea didn’t catch on until World War I when Germany, bogged down in trench warfare with the French and the British, adopted it to save coal. England soon followed suit. It didn’t catch on in this country until 1917 when stockbrokers and industries lobbied for it. The Post says they overcame opposition from railroads that feared the time change would confuse people and lead to some bad crashes. And farmers opposed it because their day already was regulated by the sun and they saw no reason to fiddle with the clocks. David Prerau, who wrote Seize the Daylight: The Curious and Contentious Story of Daylight Saving Time, told the Post dairy farmers didn’t want it because they’d have to start their milking in the dark if they wanted to ship their product out on the trains. “Plus, the sun, besides giving light, gives heat, and it drives off the dew on a lot of things that have to be harvested. And you can’t harvest things when they’re wet.” Getting up an hour early didn’t solve that problem.

This country adopted DST in 1918 with the Standard Time Act. DST was repealed the next year and wasn’t seen again until FDR reinstated it during WWII for the same reason it was instituted in The Great War—to save fuel.

In 1966, Congress passed the Uniform Time Act. In the 1970s we got permanent DST for a while, also billed as an energy-saving measure because we were in the midst of an energy crisis caused by the Middle East Oil Embargo. That situation caused major inflation and sent energy prices—at the gasoline pumps and in home heating and electric bills—skyrocketing. The great minds in Congress decided we needed permanent DST to reduce excess utility costs. But the public didn’t like it and the experiment ended after ten months.

Then George W. Bush got the Uniform Time Act amended to change the dates when clocks were to spring ahead from April to March, and we’ve had our present system since then.

Does it really work, or is it just something for politicians to fiddle around with from time to time?

A 2008 Department of Energy report said the Bush change cut the national use of electricity by one-half of one percent a day. Ten years or so later, an analysis of more than forty papers assessing the impact of the change found that electricity use declined by about one-third of a percent because of the 2007 change.

More contemporary studies show similar small changes in behavior when DST kicks in.

One study supporting the economic advantage of permanent DST was done by JPMorgan Chase six years ago. The study looked at credit card purchases in the month after the start of DST in Los Angeles and found they increased by nine-tenths of a percent. They dropped 3.5% when DST ended. That was good enough for the study to recommend full-time DST.

Another report showed robberies dropped by 7% during DST daytimes, and by 27% in the evening hour that gained additional sunlight. That’s in Los Angeles.

Rubio maintains that having more daylight in the evening could mean kids would be more inclined to get their noses out of their cell phones, tablets, and computers and go outside and run around playing sports.

Maybe they could take up golf.  Or looking for bugs that proliferate in the twilight. Imagine a parent suggesting those ideas for their nimble-thumbed children.

So what’s better—having kids standing in the dark waiting for the morning school bus or riding the school bus into the darkening evening and arriving at home where the lights are all on?

The people we elected yesterday are likely to make this decision sooner or later. Let us hope they’re up to it.

 

Woke

I’m Woke.  At least I think I am.  If it means being aware of the world around me and not being afraid to learn the world around me is something other than what I have thought it to be, I’m Woke.

Woke is a carelessly used pejorative, employed to blindly attack progressive views of almost any kind. Not just progressive views, either. It’s been thrown around in public and private arguments about what we should know about our history and what history our children should be taught.

It is a one-word example of today’s bumper sticker politics in which it is easier to call someone a name or disparage their ideas rather than have the courtesy or curiosity to discuss differences.  It is perceived as coming from someone with a “my way or the highway” attitude that replaces thoughtful dialogue with a one-word dismissal.

It’s childish.  Name-calling is a refuge of fools with nothing substantial to say.

A challenge to those who label others as Woke has come from a report by the United Kingdom version of the Huffington Post (HuffPost UK).

Rakie Ayola is an award-winning Welsh actress and producer born of a mother from Sierra Leone and a father from Nigeria. She is now starring in a six-part BBC series called The Pact about some friends who are tied together by a secret. On the BBC Breakfast show the other day, one of the co-hosts suggested some viewers would consider the program “a ‘woke’ version of the Welsh family.”

“If anybody wants to say that to me,” Ayola said, “what I would say first is, ‘explain what you mean by woke – and then we can have the conversation.’”

“If you can’t explain it, don’t hand me that word.

“Don’t use a word you cannot describe.

“Or maybe you know exactly what you mean, and you’re afraid to say what you mean, then let’s have that conversation.

“Not even afraid – you daren’t. Do you know what I mean.

“Sit there and tell me what you mean by ‘woke,’ and then we can talk about whether this show is woke or not.

“Then I can introduce you to a family just like this one – so are you saying they don’t exist, when they clearly do? Are you saying that they’re not allowed to exist? What do you mean by that?

“Let’s have a proper conversation. Don’t throw words around willy-nilly when you don’t know what they mean.

“If you don’t know, then please be quiet because you are incredibly boring.”

Seems to be pretty good advice.

You can watch that part of her interview at:

Rakie Ayola Has The Perfect Response To Anyone Who Uses The Word ‘Woke’ (msn.com)

She makes a good point. Those who throw the word around should be able to define it. And there is some doubt that most can.

Part of the problem with Woke is that most of us are not aware of the word’s history and the reasons for it. So let’s discuss that a little bit.

A significant part of the history of Woke is related to the Ferguson killing of Michael Brown in 2014, in fact.

Vox published an excellent article about the history of Woke two years ago. For most of its history, it has been a word of caution within the Black community, not a weapon of division of society in general.

https://www.vox.com/culture/21437879/stay-woke-wokeness-history-origin-evolution-controversy

White folks’ understanding of the history of Woke is part of understanding Black culture and, perhaps, in understanding it, respecting it.

Seeing other cultures and understanding how they see the dominant white cultural history of our country is a matter of respect. Unfortunately, some in our political world find it more profitable to denigrate those efforts. History will prove their short-term infliction of politically advantageous pain to have been an unsuccessful bump in the road for the people our grandchildren’s grandchildren will be.

Facing our history, celebrating the noble parts and acknowledging and correcting the bad parts, can be difficult.  But we need not be afraid to do both.

A few days ago I picked up a copy of a 2015 American Book Award winner, An Indigenous Peoples’ History of the United States by Roxanne Dunbar-Ortiz, whose mother—born in Joplin—probably was part Cherokee. Early in her book, she talks of an exercise she gave her students in Native American Studies at California State University-Hayward. She asked students to draw a rough outline of the United States when it gained independence from Britain. “Invariably most draw the approximate present shape of the United States from the Atlantic to the Pacific,” she writes. When she reminded students the only things that became independent in 1783 were the thirteen colonies, the students often were embarrassed. “This test reflects the seeming inevitability of US extent and power, its destiny, with an implication that the continent had previously been seen as terra nullius, a land without people…The extension of the United States from sea to shining sea was the intention and design of the country’s founders. ‘Free’ land was the magnet that attracted European settlers.”

“…In the United States, the founding and development of the Anglo-American settler-state involves a narrative about Puritan settlers who had a covenant with God to take the land.”

“Indigenous peoples were…credited with corn, beans, buckskin, log cabins, parkas, maple syrup, canoes, hundreds of place names, Thanksgiving, and even the concepts of democracy and federalism. But this idea of the gift-giving Indian helping to establish and enrich the development of the United States is an insidious smoke screen meant to obscure the fact that the very existence of the country is a result of the looting of an entire continent and its resources.”

—And the destruction of dozens of Indian nations, a truth that’s hard to accept in a country in which the cowboys always defeat the savages and the cavalry always arrives to drive them away.

The fact is that this was not “a land without people” at all.  They just weren’t the right kind of people. (Go back to our July 25th comments if you would like more background.)

The insistence by some that we are better if we see our history through the eyes of those who were enslaved or driven from their lands is too often dismissed as “Woke.”

If we are afraid to see ourselves as we really are, and as we really have been, we short-change our opportunities for what we can be.

The Colonies and the Mother Country

The coverage of the change in the British monarchy has rekindled some interest in the comparisons of the United Kingdom with the United States.

Oscar Wilde, the 19th Century wit and playwright, had a British character in The Canterville Ghost comment, “We have really everything in common with America nowadays, except, of course, language.”

Through the years, George Bernard Shaw has been credited with turning that comment into, “England and America are two countries separated by the same language!”

The other day, we came across a My Day newspaper column written by former First Lady Eleanor Roosevelt and syndicated nationwide by United Feature Syndicate. She wrote on August 17, 1946 that the relationship between this country and the United Kingdom is “a little like a family relationship where the younger generation breaks completely away from the older generation with the result that relations for a time are very strained.

“In most families, however, when either the younger or the older generation is threatened by real disaster, they come together and present a solid front. That doesn’t mean that they will see things in the same light in the future, and it does not necessarily mean approval on either side of the actions of the other—nor even that they might not quarrel again. But it makes future quarreling less probable. It is a kind of ‘blood is thicker than water’ attitude which makes them stand together when a crisis occurs and, year by year, brings better mutual understanding.”

She contrasted the characters of our peoples—Americans being people of light exaggeration and the British being people of understatement. Americans are more “dashing and perhaps more volatile” while the British are “more stolid and tenacious.”

Remember this was just after World War Two. She recalled a British soldier who said the Americans did not enter the war until they developed an interest in winning, at which point they capitalized on “the hard work and the losses which we have sustained.”

And while Americans might not approve of many things important to the British, she wrote, there is a belief that we can find ways to live and work together.

In fact, she thought, that attitude is basic to our foreign policy—that “we can find ways to live and work together.”

The Colonies, us, are the kids who leave home.  But when there’s a family crisis, we get together.

Even in today’s world, three-quarters of a century later, she seems to have identified us.

 

Why Hasn’t Ukraine Lost?

Ukraine’s counterattack against Russian invaders appears to have stunned a lot of Russian soldiers and their commanders—and a growing number of influential people in Moscow who are starting to openly criticize Vladimir Putin for his unprovoked invasion of Ukraine.

Putin expected a quick conquest.  Why didn’t he get it?  And why is he, as of this writing, getting his butt kicked by a supposedly smaller, inferior, force?

You might find it interesting to explore a book that explains why.  It’s the same reason Hitler didn’t conquer England, why the United States fled from Vietnam, and probably why the Taliban controls Afghanistan.

The book is Malcolm Gladwell’s David and Goliath, a study of why bigger is not always best, why stronger does not always prevail, and why—believe it or not—the underdog wins so often.

While most analyses of military actions focus on military capabilities and/or failures, Gladwell focuses on people and what happens when their country is attacked by a seemingly overwhelming force.

He writes that the British government worried, as Europe sank into World War II, that there was no way to stop a German air offensive against the country. The country’s leading military theorists feared devastating attacks on London would leave 600,000 dead and 1.2-million people wounded and cause mass panic among the survivors, leaving the Army unable to fight invaders because it would be trying to keep order among the civilians.

The eight-month Blitz began in the latter part of 1940 and included fifty-seven consecutive nights of bombing.

But the people did not panic. Military leaders were surprised to see courage and something close to indifference. The reaction puzzled them as well as the psychiatric workers expecting the worst.

And they discovered the same things were happening in other countries under attack.

What was going on?

Gladwell writes that a Canadian psychiatrist, J. T. MacCurdy, determined that the bombings divided the populace into three categories: the people killed; the near misses—people who survived the bombs; and the remote misses—people not in the bombed areas. MacCurdy said the people in the third category developed “a feeling of excitement with a flavour of invulnerability.”

While the toll in the London bombings was, indeed, great (40,000 dead and 46,000 injured), those casualties were small in a community of eight-million people, leaving hundreds of thousands of “emboldened” near misses, people that MacCurdy said became “afraid of being afraid,” a feeling that produced exhilaration and led them to conquering fear and developing self-confidence “that is the very father and mother of courage.”

Hitler, like the British military command, had assumed that a populace that had never been bombed before would be terrified. It wasn’t. Instead, it was emboldened.

“Courage is not something that you already have that makes you brave when the tough times start,” writes Gladwell. “Courage is what you earn when you’ve been through the tough times and you discover they aren’t so tough after all.” He maintains that the German expectation that the bombings would terrorize the people and destroy their courage was a “catastrophic error” because it produced the opposite result. He concludes the Germans “would have been better off not bombing London at all.”

Gladwell explores the “catastrophic error” this country made in Viet Nam when its political and military leaders believed they could bomb the Viet Cong into submission.  Thousands of pages of interviews of Viet Cong prisoners indicated the result instead was that the bombings made people “hate you so much that they never stop fighting.”

Many of the prisoners maintained no thoughts of winning but they didn’t think the Americans would win either.  Nor did they think they would lose. “An enemy indifferent to the outcome of a battle is the most dangerous enemy of all,” Gladwell writes, and leads to a shift in advantage and power to the underdog.

His thoughts might help us understand why, after 30 years, the Gulf War has failed to install democracy in that area and instead has left Iran, Iraq, and Afghanistan far from what we dreamed they would become.

We hope the ideas are not tested on Taiwan.

Those who go to war expecting to win through might and power alone are Goliaths. And, as Gladwell sees it, all they’re doing is creating a lot of Davids.  And—although Russia’s invasion is not mentioned—in Ukraine, the shepherds with slings are swarming.

(The book is David and Goliath: Underdogs, Misfits, and the Art of Battling Giants, New York, Little, Brown and Company, 2013, with a revised paperback edition by Back Bay Books, 2015. His thought-challenging musings also cover such topics as class size, prestigious colleges, art, dyslexia, and crime. If you want a sample of his perceptive interpretation of how underdogs so often prevail, go to: https://www.youtube.com/watch?v=ziGD7vQOwl8 and if you want more on other topics: https://www.youtube.com/watch?v=7RGB78oREhM)

(Photo credit: YouTube TED Talk)

 

RACE

In various forms we are tied up, politically, socially, economically and about every other kind of “ically” with the subject of race.

It provokes anger, fear, and uncertainty.

Am I racist?  Is someone else a racist, too, although they don’t look the same way I do?

Am I a victim? Am I a perpetrator?

What should I do?  Admit it?  Feel guilty about it?  Demand something from somebody? Be afraid of somebody?  Organize and try to stamp it out or stamp out discussions of it?

And where did it come from?

There are those who prefer not to discuss this issue. They have turned the word “woke” into a pejorative, disparaging those who, as the Oxford English Dictionary tells us, “originally” were “well-informed, up-to-date” but now “chiefly” are those “alert to racial or social discrimination and injustice.”

A few weeks ago (July 25), we wrote about “Two Popes and Christian Nationalism.”  Recently we listened to a talk by John Biewen, the director of Duke University’s Center for Documentary Studies, a podcaster, and an author. He called his remarks “The lie that invented racism,” and offered suggestions for solving a racial injustice that began about 170 years before 1619, the date cited by a much-attacked New York Times article that (erroneously, we think) sets the date for racism in America.

Biewen’s talk supplements that July 25th exploration. We do not fear being called “woke” by recommending you watch Biewen’s presentation. Frankly, we are more likely to take it as a compliment, which might only make an accuser more angry. Too bad.

https://www.youtube.com/watch?v=oIZDtqWX6Fk

I jotted down three quotes while listening to his talk—not a speech, mind you, a talk.  Racism “is a tool to divide us and to prop up systems.”

“It’s about pocketbooks and power.”

“White guilt doesn’t get a lot of anything done.”

His presentation is one of many TED Talks posted on the web.  Talks such as these began with a 1984 conference on Technology, Entertainment, and Design (thus, TED).  The program focuses on “ideas worth spreading.”  The talks, limited to no more than eighteen minutes, cover a huge range of topics within the broad fields of science and culture.

Some famous people have made them. There are many whose names are meaningless to most of us—but whose words are worth hearing.  This is a forum for people unafraid to think outside their personal box, not for those who prefer to box out thoughts different from theirs.

There is afoot in our land an effort to ban discussion of race. Some say discussing race is an effort to make white people feel guilty about being white. The greater danger is from those who find no guilt in continuing to consider people of another color as lesser people.

We cannot escape history and we do not serve our country if we try to hide from it, obscure it, or ignore it.

The mere fact that we are discussing this issue as much as we are is proof enough that race remains one of the greatest overarching problems in our society and in our country. It remains a problem, and a problem is never fixed by denying that it exists or insisting there never was one.

I hope Mr. Biewen’s remarks make you think.

(Photo credit: TED talks/youtube.com)

 

Irreverence

I was talking with one of my friends at the Y last Friday morning and the conversation drifted, as it always does, all over the place.

We eventually started talking about family heirlooms and how the current generation—Nancy and I have two members of one, she doesn’t—has no interest in them.  The silver service grandma used to dig out of the bottom dresser drawer when people were coming over for a special occasion, the doilies great aunt Marge made, the quilt (oh, lord, the quilts!) from who knows?

The knick-knacks from the places we and our forebears visited—the ash tray from the Great Smokies (a clever pun of a souvenir), the paperweight with a picture of an erupting Old Faithful embedded in it, matchbooks galore from hotels and motels long closed and either rotted or demolished, dried-up pens from the same places, an old felt pennant that says “Rock City.”

All of that STUFF.

The coal oil lamp from the days before farms had electricity, the radio with a built-in 78 rpm record player, the salters that used to be placed on the dinner table for special occasions so people could dip their radishes in some salt before eating them, the stiff old baseball glove that great uncle Herb used in the 1920s.

My mother-in-law, Yuba Hanson, referred to STUFF as things having a “sedimental value,” being as meaningful to someone else as the dust that gathers in the corners of seldom-used and thus seldom-cleaned rooms, like sediment.

And then we slid into discussing disposing of this or that relative’s clothes after their deaths—parceling things out to surviving relatives who find something close to still being in style and giving the rest to Goodwill or the Salvation Army, and taking dishes and cooking utensils to this or that re-sell-it shop.

And I asked—“What do you think will happen to Queen Elizabeth’s clothes?”

Yes, we really should be more reverential about the late Her Majesty (by the way, how long do you have to be dead before you are no longer “late”?). There are millions of people, probably, in the United Kingdom who would take umbrage at such a comment. But this is the United States and we cut to the chase.

We do not expect to see a sign on Buckingham Palace Road with an arrow pointing the way to London SW1A 1AA reading “Garage Sale.” It’s not uncommon to see a few racks of no-longer-fitting clothes in garage sales.  But we’re not going to see anything of the sort at Buckingham Palace.

Queen Elizabeth was known for her hats—which matched the rest of her attire when she was out in public.  What is to become of them?

This grossly irreverent thought, which should offend so many people, has occurred to us:

We understand that it is customary within the Catholic Church for the galero, the red ceremonial wide-brimmed tasseled silk hat of Cardinals, to be suspended, a month after their deaths, from the rafters of the cathedral in which they served.

The first Queen Elizabeth was the daughter of King Henry VIII, the king who broke with the Catholic Church and created the Anglican, or Episcopal, Church as the Church of England. Perhaps her large collection of hats could be distributed to the oldest Anglican churches in England, one to each, and be lifted to the rafters as a tribute to the person who headed the Church of England longer than anyone in its 488-year history.

We are aware that some will find this discussion unsavory.  But to common folks such as most of us who deal with the disposal of the worldly goods of family members who have left us, the question might lurk somewhere in the recesses of our minds but we are afraid to ask.

And she had an irreverent side to her, too. Ten years ago, some might remember, she opened the London Olympics by “parachuting” into the stadium. She did a video with James Bond (Daniel Craig), who went to Buckingham Palace to provide her security as she went to the royal helicopter and headed to the stadium, where a stunt double jumped out of the chopper and, moments later, the real Elizabeth was introduced in the stadium.

Or there is the video she shot of tea with Paddington Bear in which he offered her a marmalade sandwich only to see her reach into her ever-present purse and pull out one she claimed she always kept for emergencies.

Both are on YouTube along with other moments when the Queen was just Elizabeth. I have a feeling she would have enjoyed doing a turn on Downton Abbey if the story line were to continue another eighty years beyond where the latest movie left off.

We probably would not have written this irreverent entry if we had not seen three news stories the day after Her Majesty’s death.  One asked what would become of her beloved dogs?  She had four or five dogs, “two Corgis named Muick and Sandy, a Dorgi called Candy, and two Cocker Spaniels,” as Newsweek reported them.  There was much speculation already.

The second news story reported that the producers of the Netflix television series “The Crown,” a biopic inspired by the life of Queen Elizabeth II, had decided to pause the filming of the sixth and apparent final year of the series “as a mark of respect” on the day she died. We have seen no date for resumption of the filming although it appears it won’t happen until after her funeral. The series’ website says it is about “the political rivalries and romance of Queen Elizabeth II’s reign and the events that shaped the second half of the twentieth century.”  The writer of the series, Peter Morgan, says it is “a love letter to her.”

And ABC was quick to assure subjects of the United Kingdom that their money with Her Majesty’s face on it would still be the currency of the realm. She was the first British monarch to have her portrait on paper bills, in 1960. The Bank of England has indicated more details about changes in currency will be announced after the 10-day mourning period.

A spokesman for the Bank of Canada says there are no plans to change the face on that country’s currency. The same is true in Australia although a new $5 note with the image of King Charles will be issued at some undetermined date.  New Zealand has the same plans although its new bill will be a $20 bill.

That’s paper money.  Coinage?

The custom of issuing new coins with the new monarch facing the opposite direction from the immediate past monarch began with the last King Charles, the 17th Century Charles II.

It is said she had a “wicked” sense of humor—or humour as her people would spell it.

I wonder if she ever counted the number of hats she had and laughed.

(photo credit: elle.com)

The debt

In these times, when self-aggrandizement appears to be an admired quality in some who are or who want to be our leaders, we want to highlight someone we find much more admirable.

Giles H. Stilwell was the president of the Chamber of Commerce in Syracuse, New York for 1929-30. When he stepped down, he had an observation for those who thought their city owed them something. Not so, Stilwell said. It’s just the opposite.

My city owes me nothing.  If accounts were balanced at this date, I would be the debtor. Haven’t I, all these years, lived within the limits of the city and shared all its benefits?  Haven’t I had the benefit of its schools, churches and hospitals?  Haven’t I had the use of its library, parks and public places?  Haven’t I had the protection of its fire, police and health department?  Haven’t its people, during all this time, been gathering for me, from the four corners of the earth, food for my table, clothing for my body, and material for my home?  Hasn’t this city furnished the patronage by which I have succeeded in my business?  Hasn’t it furnished the best friends of my life, whose ideals have been my inspiration, whose kind words have been my cheer and whose helpfulness has carried me over my greatest difficulties? What shall I give in return?  Not simply taxes which cover so small a part of what I have received.  I want to give more, I want, of my own free will, to say, “This is my city,” so that I  can take pride in its prosperity, in the honors which come to its citizens, and in all that makes it greater and better.  I can do this only by becoming a part of the city—by giving to it generously of myself. In this way only can I, even in small part, pay the great debt I owe.

A similar, shorter sentiment was expressed by the headmaster George St. John at Choate Academy, a prep school in Connecticut, who quoted a Harvard dean’s statement to his students, “As has often been said, the youth who loves his Alma Mater will always ask not ‘what can she do for me?’ but ‘what can I do for her?’”  One of St. John’s students was a kid named John F. Kennedy, who made a modified version of the phrase famous in 1961.

Some might find Stilwell’s speech pretty sappy.  Some might think substituting “state”  or “nation” for “city” would work as well.

Something to think about in our present climate, we suppose.

Tread Carefully

The Missouri General Assembly convenes in a special session in a few days to consider a significant cut in the state’s income tax and other issues.

The past and the present and two seemingly unrelated situations suggest this is a time to tread carefully—although, this being an election year, politics could take a higher priority in considering the tax cut than it should.

Let’s set aside politics for a few minutes and raise some concerns based on years of watching state tax policy be shaped.

Days after Governor Parson announced he was calling the legislature back to cut the income tax, President Biden announced his program to eliminate a lot of college student loan debt.  The two issues, seemingly wide apart, actually are related in this context. It will take some time to explain.

We begin with the Hancock Amendment. In 1980, Springfield burglary alarm salesman—later Congressman—Mel Hancock seized on a tax limitation movement sweeping the country and got voters to approve a change to the state constitution that tied state government income to economic growth.  If the state’s tax collections exceeded the calculated amount, the state had to send refund checks to income taxpayers.

Some of the Hancock Amendment was modeled on Michigan’s Headlee Amendment adopted two years earlier. But the timing of Hancock could not have been worse.  While Michigan’s amendment was passed during good economic times, Missouri’s Hancock Amendment went into effect during a severe economic recession considered to be the worst since World War II.

Missouri therefore established a limit that had a low bar. There are those who think the state has suffered significantly because of that.

Except for one year the Hancock Amendment has worked well.  Too well, some think, because it has encouraged state policy makers to underfund some vital state programs already hampered by Hancock’s low fiscal bar.

In 1998, the state revenues exceeded the Hancock limit, forcing the Revenue Department to issue about one-billion dollars in refund checks (averaging about forty dollars per household).

The legislature decided it did not want to repeat that. So it decided to cut taxes to keep from hitting the Hancock limit again.  Not a bad idea, except that when the national and state economies took a dive, financing of state institutions and services was severely lowered.  Had the refund program remained in effect, the economic downturn would have meant no refunds but institutions and services would have been hurt far less because the tax base would have stabilized funding.

The MOST (Missouri Science and Technology) Policy Initiative, a fiscal think tank, has recorded twenty tax cuts from 1993-2013.  The result is that Missouri is almost four-billion dollars under the revenue limit set by Hancock, according to the latest annual study done by the state auditor.

Missouri is unable to do a lot of things it could be doing because the legislature eroded the state tax base instead of issuing checks.

Now the legislature will consider an even deeper tax cut.

Nobody likes to pay taxes. But there has been cultivated in our state and nation a culture that seems to think the benefits of government—education, public safety, infrastructure, care for the sick and elderly and indigent, and other parts of our lives we take for granted—should be free.  Or, to the way of thinking of some people who don’t need those things, eliminated.

How does the Biden program to forgive billions of dollars in student loans provide a cautionary element to consideration of the Parson tax cut?

When I was in college in the previous century, I knew many people who worked their way through school. Some could do it with part-time jobs on campus or in the community. I had one friend who worked for a semester and then took classes for a semester. I have one friend who financed his college education by selling thousands of dollars’ worth of Bibles and other religious books during the summer.

But the expense of a college education today makes that kind of self-financing impossible, or almost impossible.  And here is a major reason why.

Back when my generation, and probably the generation after us, could work our way through school, the state covered a substantial share of the cost of higher education. Today, the percentage is much lower.

Last year, one of Missouri’s most distinguished attorneys—who also was appointed by Governor Parson to the Coordinating Board for Higher Education—W. Dudley McCarter, noted in The Columbia Missourian, “After striving to attain this goal over the past 10 years, the state of Missouri has now succeeded in becoming the state that is at the very bottom in funding for higher education. No, it is not Mississippi, Arkansas or Alabama — it is Missouri. Over the last 10 years, state funding for higher education has increased nationwide at an average of 12.40% with some states increasing funding by over 40%. In Missouri, however, funding has decreased during that same period by 13.70% — the only state that had reduced funding. When adjusted for inflation, the decrease is actually over 26%. The national average for funding is $304 per student, with some states providing over $700 per student. In Missouri, the funding is less than $200 per student.”

The downward trend has been going on far longer than that. The internet site Ballotpedia has noted that state appropriations per full-time student dropped by 26.1% in the first decade of this century, about twice the national average.

A study done a few years ago for Missouri State University showed that, nationally, student higher education tuitions made up 30.8% of higher education revenues in 1993. By 2018, tuition was financing 46.6% of higher education costs—and the costs were higher. It was during that time that student borrowing ballooned to offset declining percentages of state and federal higher education support.

The Biden student loan forgiveness program deals with those who have debts already. It does nothing to prevent current or future students from incurring crippling student debts, because government has reduced its support for higher education.  And now, the legislature is being asked to reduce state revenues even more.

We lack the expertise to get too far into the weeds of economic nuance. But reducing the state’s ability to meet its fiscal responsibilities in the future, whether in higher education or numerous other fields, is a long-term issue that must be approached with great caution.

Things are flush right now, thanks partly to inflation and the massive injection of federal Covid relief funds in the last few years.  But Missouri still is far short of its own limit on the state tax burden and still far short in funding numerous human-service needs.

It is politically popular in an election year to cut taxes. The public seldom recognizes the long-term penalties that might result.  Tomorrow’s college graduates might be among those paying a high price for today’s popular tax cut and incurring new student debt burdens.  And if a recession hits next year, as some economists keep predicting, some unfortunate results of this year’s tax cut could become painfully clear.

Governor Parson has taken a wise step in meeting with members of both parties to explain why he thinks a tax cut is appropriate today. We suspect he had an easier sell with members of his own party than he did with the other side. We expect some passionate discussion of this issue during the special session.

We also expect a cut will be enacted.  We hope, however, that we do not have a repeat of the unfortunate post-refund tax cuts of decades ago. We must be careful as we consider what we might do to ourselves, our children, and our friends.  Tread carefully.

Unprecedented

“Unprecedented” is a word frequently heard these days in our national political discussions.  We thought it might be interesting to see what other times “unprecedented” has been applied to our Presidents.   “Unpresidented,” if you will, although it isn’t a real word.

It was unprecedented when the nation selected its first President who was not a member of an organized political party.  He also was the first President unanimously elected, a truly unprecedented feat: George Washington.

The idea that a President would never veto a bill while in office was unprecedented when John Adams did, or didn’t, do it. Adams had a lot of “not” precedents: the first President who did not own slaves; the first President who was a lawyer; the first President to lose a re-election bid and the first President who did not attend the inauguration of his successor.

Thomas Jefferson’s defeat of an incumbent President (Adams) was unprecedented. (So was the method of his election. In those days the President and Vice-President each accumulated electoral votes. Jefferson and his running mate, Aaron Burr, each got 73 electoral votes. Incumbent John Adams had 65 but his running mate, Charles Pinckney, had only 64. The House of Representatives cast 36 ballots before Jefferson won 10 of the 16 state ballots. Burr had four, and the Maryland and Vermont delegations were tied internally. All of this was unprecedented, too, of course.)

James Madison took the unprecedented step of asking Congress for a declaration of war.

The election of Senator James Monroe to the presidency was unprecedented.

John Quincy Adams’ election was unprecedented because he was the first President who lost the popular vote.  (None of the candidates got a majority of the electoral vote, throwing the election into the House of Representatives under the 12th Amendment. Thirteen state delegations favored Adams, seven favored Andrew Jackson and four favored William H. Crawford.)

Andrew Jackson’s administration was the first administration to pay off the entire national debt.

Martin Van Buren’s presidency was unprecedented because he was the first President who was born an American citizen (all of his predecessors had been born as British subjects).

The death of William Henry Harrison while in office was unprecedented.

The House of Representatives took an unprecedented vote to impeach President John Tyler.  It failed.

James K. Polk took the unprecedented step of refusing to seek a second term.

Zachary Taylor had never held a public office before becoming President, an unprecedented event.

Millard Fillmore took the unprecedented step of installing a kitchen stove in the White House.

His successor, Franklin Pierce, took the unprecedented step of installing central heating in the White House.

James Buchanan was our first bachelor president. Historians debate whether he was gay.

No president had been murdered until John Wilkes Booth took the unprecedented step with Abraham Lincoln, who is the only president to hold a United States patent.

The House of Representatives held a successful unprecedented impeachment vote against Andrew Johnson.  The Senate held an unprecedented trial and failed to convict him.

U. S.  Grant vetoed more than fifty bills, an unprecedented number.

It was unprecedented in modern election history when Rutherford B. Hayes won the electoral vote but not the popular vote.

James Garfield was an unprecedented President because he was left-handed or ambidextrous.

Chester Arthur took the unprecedented step of having an elevator installed in the White House.

Grover Cleveland set several precedents—the first President married in the White House; the first to have a child while President, and the first President to veto more than 100 bills.

Benjamin Harrison set a precedent by being the first President to have his voice recorded.

William McKinley was the first president to ride in an automobile.

Teddy Roosevelt set a precedent by becoming the first president to ride in an airplane. (He got aboard a Wright Brothers airplane piloted by Arch Hoxsey and flew for about four minutes at Kinloch Field in St. Louis. https://www.youtube.com/watch?v=NaFulqGGkwk). He also took an unprecedented trip on a submarine.

The first president to throw out the first ceremonial pitch of the baseball season: William Howard Taft.

The first president to hold regular news briefings was Woodrow Wilson. He also took the unprecedented step of appointing a Jew to the U.S. Supreme Court, Louis Brandeis.

Warren G. Harding learned of his election in an unprecedented way—he heard about it on the radio.

In 1927 the Lakota Sioux tribe took the unprecedented step of adopting a U.S. President as a member of the Lakota nation. Calvin Coolidge.

Herbert Hoover took the unprecedented step of having a telephone installed on his desk.

Franklin D. Roosevelt set a precedent by serving more than two terms. Among his other precedents—the first to fly across the Atlantic and the first to establish 100 days as the first benchmark for accomplishments in office.

The Secret Service set a precedent when it made Harry Truman the first President to have a code name (General). Television set a precedent by televising his 1949 inauguration.

Television set a precedent when it gave one of its Emmy Awards to President Eisenhower who was the first President to appear on color television.

First President who was a Catholic: John F. Kennedy. He also set a precedent by being the first former Boy Scout elected to the office.

The first President to be inaugurated on an airplane was Lyndon Johnson. He also set precedents by appointing the first African-American to the U.S. Supreme Court and appointing the first African-American to serve in a cabinet position.

Richard Nixon set a precedent when he attended a National Football League game. Also: First President to resign.

First President never elected to the office or to the office of Vice-President: Gerald Ford.

Jimmy Carter broke precedent when he went by a nickname instead of the formal James E. Carter Jr.  As we write this, he moves into unprecedented territory by living longer than 97 years and being married for more than 75 of them.

Ronald Reagan set a precedent when he was re-elected, the first President re-elected older than 70 (73 at the time). He also set a precedent by nominating a woman to the U.S. Supreme Court.

George H. W. Bush set a precedent when he became the first President to pardon a Thanksgiving turkey.

First President who was a Rhodes Scholar, to have an official White House website, and to perform at a jazz festival (saxophone): Bill Clinton.

First President to achieve a 90% approval rating in modern polling: George W. Bush.

America set a precedent by electing African-American Barack Obama, who was the first president born outside the 48 continental United States (Hawaii) and who was the first to endorse same-sex marriage.

First President with no prior public service experience, first to be impeached twice, first president to never see an approval rating above 50%, first president to refuse to publicly acknowledge re-election defeat: Donald Trump.

Joe Biden has set a precedent by being in office past his 77th birthday. He’s the first President to get more than 80-million votes.

First President to be indicted by a grand jury?  The first President to be brought to trial on criminal charges?  The first President to wear a prison uniform?  These are unprecedented possibilities that many hope never come to pass while many others hope come true.

That’s because we are living in unprecedented times.

 

Franklin W. Dixon and Carolyn Keene

The names might ring a bell for some of our readers.  “They” wrote books that have sold millions of copies and are still being published after more than a century.

For a short time, Franklin and Carolyn were the same person.   His name was Leslie McFarlane and I came across his second autobiography during a recent visit to a bookstore in Michigan.

Did you ever read or hear about The Bobbsey Twins?

Your grandfather or great-grandfather might have read  the Tom Swift novels or The Rover Boys, or perhaps novels featuring the heroics of Dave Fearless or the sleuthing of The Dana Girls. I have some copies of The Radio Boys. There also was a companion series, The Radio Girls. All were among the 109 juvenile fiction book series published by the Stratemeyer Syndicate which hired writers and gave them story outlines and paid them small amounts to churn out books, the best known of which are The Hardy Boys and Nancy Drew. 

Their contracts required that they never admit they were ghost writers of any of these books, using names assigned them by the syndicate.

McFarlane wrote 22 of the adventures of Frank and Joe Hardy and, as Carolyn Keene, the first four volumes of The Dana Girls, a Nancy Drew spinoff.

The book I picked up in Michigan is Ghost of the Hardy Boys. If you grew up reading any of the syndicate’s series, you’ll enjoy reading McFarlane’s story—which is far more than the story of the Hardy Boys books. His accounts of the small Canadian town where he grew up and his stories of his early jobs with small-town newspapers are wonderfully written.

Not even his son knew he had written that shelf of books in the family bookcase. McFarlane, who considered his authorship just a job, never paid attention to what happened to his books after he wrote them and did not realize until the closing years of his life the significance of his efforts.

(I read several of the Hardy/Nancy novels but the real juvenile fiction author of my youth was Fran Striker, who created The Lone Ranger novels. I have all of them about ten feet from where I have written most of the literary gems such as the one you are now reading.)

McFarlane struck a chord with your book reviewer a couple of times when he wrote about writing.  Here are a couple of excerpts:

When my young wife told her friends that she had married a writer, their good wishes sounded more like condolences…One good woman said, “God help you, my dear!” with compassion. We thought it amusing at the time. Later we realized what she meant.

Writers are not good husband material. (I am not qualified to speak for the husbands of female writers.) Not because they are worse characters than men of other occupations. They aren’t. Not because they are impractical and untidy. They are. Not because their income is chancy. It is. But they are always underfoot…Who can blame her if she envies her sisters whose husbands clear out every morning and stay the hell out until dinner time, returning with fascinating accounts of their adventures in the great world, of the installation of a new water cooler and how he told off the assistant manager? My life has been blessed by two remarkably happy marriages, each happy because of a woman who had the cheerful courage and devotion to put up with an existence calculated to drive most wives to a psychiatric hospital or divorce court…

The other day someone asked my friend, MacKinlay Kantor, when he planned to retire. Our paths in life have differed vastly but we both are of the same age, began on small-town newspapers, made a living from the pulps, and are still writing. “Writers,” replied Kantor, in a voice that came mighty close to a snarl, “never retire.  Real writers, that is.” And we wouldn’t have it any other way. It is a survival course that never ends for any of us. I will be freelancing until someone draws the cover over my typewriter for me for the last time.

I wish more people were writers.  Of their own stories.  Many people are intimidated by the thought, never sure “where to start,” thinking a story has to begin at the beginning.

Hogwash.

A story just has to begin. Earlier or later accounts will fill in the before-and-after holes. All life stories are worth telling. It is unfortunate that the main accounts of the lives people have lived are woefully inadequately summarized in the last newspaper article that will ever mention them.

Some people who retire worry about what they will do without a job and the social contacts that are part of employment.  The answer is simple.

Become a writer.  Write about the things you know best.  And the one thing you know best is yourself.  Abandon any pretense of modesty. Enroll in McFarlane’s “survival course that never ends for any of us.”

Descendants you will never meet will meet you.  And they will be enriched by what they read.

I was enriched by reading about Franklin W. Dixon and Carolyn Keene and discovering how much more they were than a couple of names.