
Rarely has anyone asked me a question that I felt more certain about while answering.

Several times in the first week of the month, someone asked me the same question, and each time I felt the confidence swell up like a warm balloon inside of me. How dare they even ask? The answer was so obvious that I all but openly scoffed at the questioner.

“Do you think we’ll get any snow?” a co-worker asked.

Pfft.

My eyes narrowed and the corners of my mouth rose into a slight, cynical grin. My back stiffened. I felt like a sage asked to impart wisdom upon the uneducated masses. I waited a moment, letting the pause settle to the ground between us, before answering in a tone as calm and placid as the surface of a lake on a windless day.

“No,” I said. “Or we might get snow, but there is no way – absolutely no way – we are getting anything like a foot of it.”

I cited the lower end of the forecast, which at the time was around 6 inches, and said I’d be happily surprised if we got that much.

That was all the wiggle room I left myself.

I could easily remember all the times forecasters predicted the possibility of calamity – whether hurricanes, floods or blizzards – that never materialized, and times when predictions of tiny weather events fell disastrously short of what happened, as with last month’s ice storm.

More than that, I remembered all the times I hoped for big snowstorms, only to be disappointed.

Those memories fueled my sense of certainty. Those forecasters. They weren’t going to get my hopes up this time.

Early in the week, the forecast shifted from day to day, and it further fueled my certainty.

The shifting more or less stopped by Thursday, but I was not deterred.

“Do you think we’ll get any snow?” a co-worker asked me on Friday.

My eyes rolled so far back in my head I could see my brain pan.

“No,” I said, trying not to sneer, “certainly not 10 to 16 inches.”

And I added that since the highs were going to be in the 40s in the days before any snow fell, the ground would be warm and it would melt pretty quickly. There was no sense wringing hands about it.

I intentionally avoided the grocery store. I would not be held hostage in long lines of hysterics loading up for Snowmageddon.

My lone concession to the forecast was to agree it would be prudent to send last Sunday’s paper to press earlier than usual Saturday evening, just in case.

I woke after midnight that night and looked outside to see a dusting of snow on the grass, and a steady amount of new snow falling. I retrieved my News-Topic from the front sidewalk, shaking the snow from it, and went back to bed.

Several hours later I woke and looked outside to see that something close to 6 inches had fallen and piled up in the trees, and it was still snowing steadily. I checked my phone’s weather app, and it said there was a 100 percent chance of snow until early afternoon.

It appeared that I might have been wrong.

As the morning went on and the snow grew deeper, I began to worry about the amount of food in the refrigerator.

Around noon, when there clearly was much more than a foot of snow on the back patio, I worried about the power going out.

When the snow finally stopped, I went outside with an 18-inch ruler and pushed it down into the snow on my car. It sank to the tip.

I was wrong. Man, oh man, was I wrong.

You may ask, did I learn a lesson about acting so haughty?

Based on experience, I can answer with nearly absolute certainty, and I will be succinct: No, I learned nothing. No way.


You are changed by how you read

Reading is vital to the development of the human brain, but how we read – whether we read words printed on paper or words lit electronically on a digital device – may be more important still. The question is whether you should find that chilling.

Maryanne Wolf, a professor in UCLA’s Graduate School of Education and Information Studies, recently wrote in an article for The Guardian – “Skim reading is the new normal. The effect on society is profound” – about research by her and others that has disturbing implications for the ability of people to comprehend what they are reading, to think critically and to act rationally.

“My research depicts how the present reading brain enables the development of some of our most important intellectual and affective processes: internalized knowledge, analogical reasoning, and inference; perspective-taking and empathy; critical analysis and the generation of insight,” Wolf wrote. “Research surfacing in many parts of the world now cautions that each of these essential ‘deep reading’ processes may be under threat as we move into digital-based modes of reading.”

Why should it make such a difference whether you are holding a paper book and turning physical pages rather than holding a Kindle and swiping left?

In part, Wolf wrote, some research suggests that the physical sense of holding a book or newspaper and turning a physical page adds a spatial sense that helps the brain file the information away.

Other research suggests that it may be related to what paper does NOT do: enable you to stop reading and check Facebook, or text messages, or Twitter, or anything else you can do on an internet-connected device. Such multi-tasking trains the brain’s “reading circuit” how to behave, Wolf wrote.

“If the dominant medium advantages processes that are fast, multi-task oriented and well-suited for large volumes of information, like the current digital medium, so will the reading circuit. As UCLA psychologist Patricia Greenfield writes, the result is that less attention and time will be allocated to slower, time-demanding deep reading processes, like inference, critical analysis and empathy, all of which are indispensable to learning at any age,” she wrote.

This has enormous implications for what people will and won’t be able to do in all spheres of life – at school, at work, in personal interactions, in daily life. As Wolf wrote, people become impatient for quick bites of information and can’t devote the time it takes to understand something complex – including not just literature but such things as wills and contracts.

More disturbing, think of what this means for our ability to maintain a unified and relatively civil society. Consider all we know now about disinformation campaigns on social media. How much worse could things be as the ability to critically analyze information becomes increasingly rare?

“The subtle atrophy of critical analysis and empathy affects us all. It affects our ability to navigate a constant bombardment of information. It incentivizes a retreat to the most familiar silos of unchecked information, which require and receive no analysis, leaving us susceptible to false information and demagoguery,” Wolf wrote.

Despite all that, Wolf sounded a hopeful note: “There’s an old rule in neuroscience that does not alter with age: use it or lose it. It is a very hopeful principle when applied to critical thought in the reading brain because it implies choice.”

Cynical journalist that I am, though, I can’t help but see Wolf’s article through the lens of how the innovations of the digital revolution have disrupted my own industry and left it perhaps permanently diminished. My reading brain lingers on this passage:

“As MIT scholar Sherry Turkle has written, we do not err as a society when we innovate, but when we ignore what we disrupt or diminish while innovating.”

Statistically speaking, I probably reached my peak desirability three years ago.

I would have guessed it was longer ago than that, but science says otherwise. If only I had known so I could savor the year of my peak sexiness. Maybe I could have gotten someone else to buy most of my beer. Maybe I could have gotten some big discounts by fluttering my eyelashes. Sometimes people let me cut in line at Food Lion. I’d hate to think that’s all I got to show for it. Alas, the opportunity for more has passed.

The Washington Post reported this week on a study in the journal Science Advances that analyzed data from thousands of users of an unidentified “popular, free online dating service” in four major U.S. cities: Boston, Chicago, New York and Seattle.

The user data did not include names, personal details or message content. The scientists involved analyzed how many messages users sent and received, how long those messages were and whether they got a response, and cross-referenced that information with users’ age, ethnicity and education.

The study established “a hierarchy of desirability” defined by the number of messages someone received, and it compared that to the desirability of the people sending those messages. In other words, a person who received a lot of messages from people wanting to get a date rated as highly desirable.

The study found that men’s desirability increased with age – up to a point. The peak desirability was at 50.

Maybe that’s why you don’t see George Clooney in movies anymore. For a while he was everywhere, box office gold, but now he is 57 and the luster has been fading for seven years. That’s four years past my age, so it would make him even less desirable than I am. (Wouldn’t it? Don’t answer that.)

The study also found that men are shallow and insecure. At least that’s how I interpret the information that women were most desirable at age 18 and less so from then on — and that more highly educated women were particularly less desirable.

Elizabeth Bruch, lead author of the study and a sociologist at the University of Michigan, told the Post that this data means scientists can now answer the question, “What would it mean scientifically for someone to be ‘out of your league?’ ”

The answer is that if you are the one initiating contact, you’re already pushing the upper limits of your league.

Both men and women sent first messages to potential partners who were on average 25 percent more desirable than they were, and men wrote more first messages than women did.

The length of the messages also corresponded to how much more desirable the message’s recipient was than the sender. So, if you are trying to ask someone out on a first date and find yourself going on and on, babbling, unable to stop yourself, recognize that on some level you know you are seriously out of your league.

I take my analysis of this study’s information a step further than the Post’s story does: Even though men seemed most interested in very young women, at all age levels they tended to initiate contact, which means at all age levels the women they contacted were still on average 25 percent more desirable than they were, often much more than that. The likelihood, then, is that on any resulting date, the man should have felt lucky even to be at the table because he was probably out of his league.

In other words, science now confirms what all smart men openly acknowledge: Almost all of us marry up.


There may well be a sucker born every minute, but don’t place the credit or blame for that observation on P.T. Barnum.

Phineas Taylor Barnum, the showman perhaps best known for founding the Barnum & Bailey Circus, was the source of a number of pithy sayings about human nature and business, but perhaps the most widely circulated saying attributed to him is the cynical, “There’s a sucker born every minute.” It is cited often in laments about the gullibility of the public.

A friend who posted another of Barnum’s quotes on Facebook also posted that the “sucker” quote was not actually Barnum’s. That set me searching.

According to the Quote Investigator website, there is “no persuasive evidence that Phineas Taylor Barnum who died in 1891 spoke or wrote this saying.”

“Researcher Ralph Keyes presented a skeptical stance with his assertion in ‘The Quote Verifier’ that ‘No modern historian takes seriously the routine attribution of this slogan to P. T. Barnum,’” Quote Investigator said.

The website posted a list of related sayings that had been documented, the oldest appearing in 1806. Barnum wasn’t even born until 1810.

An 1806 article titled “Essay on False Genius” in “The European Magazine and London Review” had this fictional account involving the reply of a salesman “to whom some person had expressed his astonishment at his being able to sell his damaged and worthless commodities, ‘That there vash von fool born every minute.’ And perhaps the calculation might be brought to the proof, that not more than fifty men of genius are born in half a century.”

Without the phonetic spelling: There was one fool born every minute.

Another website, Brook Browse, says that the “sucker” quote was attributed to Barnum in 1868 by a business rival, David Hannum. Hannum had been drawing large crowds to see a “fossilized giant” he had bought, and Barnum created his own giant out of plaster and drew crowds away, infuriating Hannum. It turns out that Hannum’s giant also was fake, created by an Iowa man — so in the end Hannum was the sucker because he had believed it was real and bought it.

The website Brainy Quote gives a long list of Barnum’s quotes, which cover a variety of topics, only a few of them about business or making money, but even those are much more eloquent than “there’s a sucker born every minute.”

The one that comes closest to making money by drawing people in is, “Every crowd has a silver lining.”

One that I like about money is, “Money is in some respects life’s fire: it is a very excellent servant, but a terrible master.”

But the one that began my research, pulling me in on Facebook in my friend’s post, has perhaps more resonance for me than all the others:

“He who is without a newspaper is cut off from his species.”

It was true before Barnum said it, but I will happily credit him for that observation.

We have begun election season, and candidates should heed the advice of experienced political consultants that putting out a ton of yard signs doesn’t work.

The only thing it accomplishes is creating lots of visual clutter and post-election litter.

If you want voters to remember your name, there is solid evidence of where candidates get the most bang for their buck: in elevators.

The evidence comes in the form of a recent poll by Elon University testing how well registered voters know who their elected officials are.

Overall, voters’ knowledge is pretty bad.

People generally know the name of the president, vice president, and probably the governor and at least one U.S. senator, but after that, the poll shows, their knowledge goes off a cliff.

Only 22 percent can identify who represents them in the N.C. House of Representatives. Around here, that could be understandable. Destin Hall was first elected only a year and a half ago, and he’s young enough (31) that he hadn’t had much time to make a public impression before he ran for office.

Only 17 percent can identify who represents them in the state Senate. Again, around here that could be understandable, but for different reasons. Caldwell County keeps getting shifted to different Senate districts as the legislature and the courts tussle over redistricting maps. Until a few weeks ago, our senator was Deanna Ballard, who is from Watauga County and like Hall was first elected in 2016. Ballard replaced another Watauga County resident who resigned. (Nothing against Watauga County residents, but people are less likely to recognize the name of out-of-towners who show up mainly for ribbon-cuttings and ceremonies.) For the past few weeks our senator has been Warren Daniel of Burke County – who had been our senator before a previous round of redistricting.

Only 11 percent know the name of the president of the state Senate, who many observers convincingly argue is the most powerful politician in North Carolina at the moment. His name is Phil Berger, he is from Rockingham County, and if you were on the email list to receive his press releases you surely wouldn’t forget him because almost everything issued by his office is like digital napalm employed in a constant political war.

A big exception to this lack of knowledge about the state’s elected leaders, Elon’s poll said, is that 49 percent can identify the state’s commissioner of labor. That’s a slightly higher percentage than can identify their local sheriff.

But the reason people stand about a 50-50 chance of identifying her is the unofficial title people give her: “Elevator Lady.”

Cherie Berry’s name and photograph appear in the little window every elevator in the state has for displaying its inspection certificate.

Berry was the first N.C. labor commissioner to put her photo with her signature on the certificates. Critics complained, but clearly the tactic worked. She has now been in office for nearly two decades.

The conclusion we can draw, then, is that constant exposure to a candidate’s name on signs displayed in residential yards and in the medians of heavily traveled roads does little to sway voters. But putting a person’s name and face in the line of sight where people will spend a few quiet moments riding in awkward silence, scanning the walls for anything to divert their attention from the strangers around them, creates a lasting impression.

Unfortunately, Caldwell County does not have many elevators. This leaves local candidates with just one real option: Spend most of the campaign riding up and down inside Caldwell Memorial Hospital.

I promise you, candidates, it will have an effect: The hospital has the county’s highest elevator, therefore the longest rides, and the added awkwardness of the hospital setting will make you and your steady smile truly unforgettable to each voter you encounter.

And those of us who don’t visit the hospital will appreciate the respite from campaigning.

The unthinkable happens every day, somewhere.

Some ordinary thing, some routine activity, goes horribly awry.

Something such as going down the stairs at home.

Ashley Moss must have gone up and down the stairs in her apartment in Granite Falls hundreds of times, and gone up and down countless other stairs thousands of times since she was a child.

Yet something went badly wrong Tuesday, and she was found dead at those stairs. Police said it appeared to be just an accident. She fractured her neck in a fall.

My house has stairs, and every now and then I’m reading while walking, or thinking about something going on at work, and I miss a step, or I hit the edge of the step and slip down, flailing for the rail. I curse myself for not paying attention, but I don’t think much more about it.

Most of the time when we talk about “the unthinkable” we mean someone close to us dying suddenly in something like a car wreck, a boating accident, a random shooting. But those are things that we know are dangerous and potentially deadly events. We think about the possibility. That’s why we wear seat belts and why boats have to have life jackets on board, and why dark alleys in the city scare us. That’s why mothers ask us to call when we reach our destination so they know we are safe — and they can stop worrying.

We mean they are unthinkable because we hate to think about them.

But no one thinks about death when approaching a flight of stairs.

Falling down the stairs is depicted two ways in movies and TV shows: as comedy and as deadly – more often as comedy. In “Get Shorty,” John Travolta’s gangster character throws a stuntman down the stairs in a restaurant. The stuntman is humiliated but not injured. In “Die Hard,” Bruce Willis’ character and a terrorist he is fighting both tumble down a single flight of stairs. Willis is unharmed; the terrorist is killed — deadly for the bad guy, no big deal for the good guy.

Perhaps the number of times we have seen falling down the stairs portrayed without serious injury contributes to us not thinking about what could happen.

But it’s not the only thing we never think about that can go fatally wrong.

How many times have any of us had the flu? We don’t think about that as something that could be fatal except for young children or adults who are frail, but the flu can turn from what we think of as an incapacitating but temporary illness to pneumonia and then to sepsis, when chemicals released into the bloodstream to fight an infection can trigger a cascade of changes that can damage multiple organ systems. Less than two months ago a 21-year-old bodybuilder in Pennsylvania developed sepsis from the flu and died.

We don’t think about how fragile life can be, the myriad tiny hazards we cross each day, or how close we are to being in the situation of Ashley Moss’ parents, Mark and Cherry Moss. That’s something Mrs. Moss said in a story we ran Friday:

“The one thing that Mark and I are getting across to people is, hug your kids tight, … because you don’t know.”

Journalists think journalism is the main point of “The Post.” It seemed kind of secondary to me when I saw the movie earlier this week.

The movie is about events surrounding the publication of what came to be known as the Pentagon Papers, a secret, Pentagon-compiled history of the conflict in Vietnam – which predated the U.S. troop presence there, with U.S. involvement stemming from decisions made during the Truman administration. That history catalogued decades of lies that the U.S. government told the public about what it was doing in Vietnam and how successful those efforts were.

The New York Times obtained that history and found itself in the legal crosshairs of the Nixon administration after it began publishing stories detailing the lies.

The movie, though, is about what key figures at the Washington Post did after that.

Some, including journalists at the Times, complain that the movie should have focused on the Times and how it got the papers. But the movie doesn’t focus on the journalism surrounding the Pentagon Papers. There would be good drama to be found there, but that’s not what this story is.

The primary story “The Post” tells is that of Katharine Graham, her struggle to grow into the role of publisher and the decisions she had to make that could have destroyed her and, perhaps most significantly in her mind, her children’s inheritance.

The movie takes some license with the reality of Graham, portraying her as a much more fragile person than she was, and with how extensive the pushback was against publishing the story. The situation portrayed is largely accurate, but the drama is ramped up markedly, as often happens with movies.

If this is a movie primarily about journalism, then why is there so little actual journalism portrayed? The initial act that led to the Pentagon Papers coming into the Times’ possession is portrayed as that of a whistleblower, with no involvement of any journalist; the viewer doesn’t even know, at that point, where the documents will land. Later, the Post reporter Ben Bagdikian is shown tracking down the whistleblower, but he gets less screen time than Graham does as either the lone woman in a room full of male bankers and lawyers or the lone businesswoman in a room full of housewives and secretaries.

Time after time, there are scenes contrasting Graham’s roles – the socialite, publisher’s wife role into which she was raised, and the business owner/publisher role into which she was thrust. The scenes illustrate the man’s world of 1971, where a woman making important decisions was treated by men like, to borrow a phrase of Graham’s from one scene, the sight of a dog walking upright.

And other than the scene of the socialite housewives, in all the other scenes the women are young. They are the next generation. They are the ones following, looking to the example of Graham’s generation of women.

To be sure, this is a movie partly about journalism as a necessary means of holding government accountable – thus director Steven Spielberg’s emphasis on a key part of the Supreme Court decision: “In the First Amendment, the Founding Fathers gave the free press the protection it must have to fulfill its essential role in our democracy. The press was to serve the governed, not the governors.”

But if this is not a movie primarily about Graham, why so much emphasis on scenes where Graham is an object of obvious veneration by young women? This happens in crowds of women twice, and both times the young women part for her like the Red Sea for Moses, all of the young women gazing upon her in open admiration, and it happens once with a young woman working for the U.S. attorney general. The movie hammers the point, as Spielberg movies tend to do with their points, that Graham was a trailblazer for women.

And as if to further drive home the point for all of the men who still think it’s a story about journalism, Spielberg has editor Ben Bradlee’s wife explain to Bradlee why Graham had the most to lose and showed the greatest bravery. Bradlee does not argue with her.

So, for the question, why does the movie focus on the Post rather than the Times? Precisely because Graham found herself in a crucible like no one else did (or at least, since in reality it all happened so quickly that she didn’t have time to agonize over the decision, Graham’s situation posed for the movie makers an irresistible potential for a crucible). That her crucible was a key moment in journalistic history would be beside the point if there were not powerful people who today believe the government should control what the media can publish.

This is a movie about women, and more than anything else this is a movie made for this new “Year of the Woman.”

———-

NOTE: One thing I’ve not read about “The Post” anywhere: If the Post had only 4,000 pages of the Pentagon Papers, as stated in the movie, there’s no way they would fill two giant boxes, not unless paper in 1971 was many times thicker than it is now. Pick up a ream (500 pages) of paper; 4,000 pages is eight of those. That’s less than one box.