
The sixtieth anniversary of the premiere of one of the most beloved television shows in American history--“I Love Lucy,” which debuted on October 15, 1951--gives us occasion to think about the social impact of TV over those same six decades.

The stubborn reality is that for all the criticism hurled at TV--as the “boob tube,” as a destructive force undermining culture, even rotting our brains--television is enduring and popular precisely because it is so friendly. How bad can a friend be?

Lucille Ball was a friend to millions. The original “I Love Lucy” series ran on CBS for six seasons, although Ball continued portraying her “Lucy” character on various sitcoms into the 1980s.

In August, on the centennial of her birth, Fox’s own Cal Thomas wrote a fond piece, recalling her “everywoman” appeal. “People identified with Lucy,” Cal reminded us.

Were they wrong to identify with Lucy? Probably not. As they say, the camera doesn’t lie.

It might be possible for a movie actor, wrapped in secrecy, to fool an audience, but television, rolling on week after week--well, that’s a different story. The real Ball might have had her moods, but who could doubt that she would be the life of any party? In other words, television didn’t fool people; instead, the new medium shared the truth about the stars.

And in that sharing spirit, a common form of television has been the simplest and most human form of communication imaginable--the talk show.

Talk has changed little over the eons, and it has changed little over the decades on TV. And so, for example, Larry King’s low-key style on CNN over the last quarter century was little different from that of Dave Garroway, host of NBC’s “Today Show” in the ’50s and early ’60s.

Yet if TV is a simple extension of human nature into a new electronic technology, it has never pleased the critics. In the minds of the elite, we might conclude, anything so popular with the American bourgeoisie was automatically suspect.

So in 1961, a decade after “Lucy” was launched, Newton Minow, then the chairman of the Federal Communications Commission, declared in a speech to the National Association of Broadcasters that TV had a few good moments--but mostly bad moments. “When television is good,” Minow declared, “nothing is better.” But then came the punchline--and the punch:

But when television is bad, nothing is worse. I invite each of you to sit down in front of your television set when your station goes on the air and stay there, for a day, without a book, without a magazine, without a newspaper, without a profit and loss sheet or a rating book to distract you. Keep your eyes glued to that set until the station signs off. I can assure you that what you will observe is a vast wasteland.

With those two words, “vast wasteland,” Minow immediately entered the Snob Hall of Fame.

A half-century later, every self-declared highbrow cultural critic still knows Minow’s phrase; “vast wasteland” has been the standard point of departure for a billion attacks on TV. And we all know the further pejorative phrases applied to TV, such as “chewing gum for the eyes,” as well as to its viewers: “couch potatoes,” “zombies.”

But is TV really that bad? Is it really such an insidious force? The answer is “no.”

TV, after all, is mostly people talking--along with lots of jokes, pratfalls, mystery-solving, and shoot-‘em-ups. But mostly, TV is talk, whether it’s Oprah Winfrey, Barbara Walters, or Bill O’Reilly. Is that so terrible?

People have been talking forever--they must like it.

Indeed, for almost all of human history, few people could read, and so virtually all communication occurred through human interaction, showing and telling. In other words, we are “wired” to receive information and entertainment through these primal behavior patterns.

The break in the visual-and-oral tradition came in the 15th century, when Gutenberg invented the printing press, making books more widely available.

Yet even so, it was not until the 19th century that any appreciable number of people had access to printed material, or the ability to read it; most people still received their communication through speeches, sermons, and shows.

And then, in the 20th century, came movies, radio, and TV, at which point the visual-and-oral culture made a roaring comeback.

In other words, when ordinary folks are given a choice, they prefer to receive news and entertainment the way that humans have always received it--from another person.

Today, of course, people have access to all manner of communication technologies--most obviously, the Internet--and yet people are still watching TV, big time. People still read, but most often, most people prefer to watch.

It’s impossible to imagine, for example, not watching sports live--that is, in real time. So whether sports-consumption is on a familiar TV or on some newer kind of “smart device,” it will still be essentially a TV experience, as sports fans live through “the thrill of victory, and the agony of defeat” as it happens.

And as for the news, the same imperative toward “live” applies. Whether it’s a car chase, an election, or a terrorist attack, people still want to know what happened as soon as it happens. And that means TV and TV networks. Moreover, there’s nothing more natural than someone you trust explaining what is happening, even as you are watching the images.

Everyone understands that point, including, most recently, the “Occupy Wall Street” protesters. When they say, “The whole world is watching,” what are they referring to? Why, TV, of course. Only television brings the immediacy of the situation into living rooms across the world.

And that’s the way it is, for comedy, for drama, for sports, and for news.

It was that way in the mid-20th century, and it will be that way in the 21st century, and beyond.

James P. Pinkerton is a writer, Fox News contributor and the editor/founder of SeriousMedicineStrategy.