Let Our People Tweet!

In a recent interview, Barack Obama made the following observation regarding the promise—and pitfalls—associated with the rapid growth of the use of social media in our hyper-politicized age: “The question has to do with how do we harness this technology in a way that allows a multiplicity of voices, allows a diversity of views, but doesn’t lead to a Balkanization of society and allows ways of finding common ground.” This is a good question, but it may miss the mark just slightly—as many perfectly reasonable questions sometimes do.

The ever-expanding range of social media—everything from Facebook to Twitter to Snapchat and beyond—has fundamentally changed our political, personal, and social discourse in ways we are still struggling to understand. Who, for example, had heard of “hashtag activism” a scant few years ago or would have foreseen the manner in which a political neophyte could leverage his love of “tweeting” into the highest elected office in our nation?

Politicians, reporters, businesspeople, celebrities, athletes, and others now race to provide their instantaneous reactions—we cannot possibly call it analysis—regarding every twitch in the fabric of our world. No event or statement—no matter how momentous or mundane—seems beyond comment, and YouTube personalities now rake in six- and seven-figure incomes for sharing (or perhaps oversharing) every aspect of their daily lives. Our planet’s population has become a global network of symbiotic exhibitionists and voyeurs, each dependent upon the other for the peculiar gratifications of either posing or peering. It is sometimes a wonder that anyone finds the time to brush their teeth between checking online, posting, and anxiously waiting for the “likes” to appear.

As a result, privacy is now nearly synonymous with invisibility, which has both individual and cultural consequences we can only begin to fathom today. We should, however, by now recognize the drawbacks inherent in engaging with social media in a manner that slices and dices individuals into ever-smaller subgroups based upon identities, interests, and political leanings. Although shared community can certainly result from, for example, finding Facebook “friends” who are just like you—and actively “unfriending” those who are not—this can easily slip into the Balkanization that concerns Mr. Obama. The myopic view of the world that results from communing exclusively with those who agree with everything you say produces the mental flabbiness and smug certitude that have helped to poison so many of our national conversations. Speaking only to those like ourselves surely separates us from one another—and impedes honest discussion.

However, this being acknowledged, I believe that Mr. Obama neglected to emphasize perhaps the greatest benefit of social media: the removal of mediators and filters that decide how information is transmitted—or whether it is transmitted at all. I am old enough to remember when a mere handful of major networks and newspapers were able to impose a virtual information hegemony upon our nation, which turned them into arbiters, gatekeepers, and kingmakers—and drastically narrowed the range of information and opinions available. Perhaps the most startling—or, for some, terrifying—aspect of last year’s Presidential election was that Donald Trump won without a single endorsement from a major news outlet and slogged on to victory while thumbing his nose at their repeated disparagements. This was, no matter how it might otherwise be spun, a stunning populist victory that would most certainly have been stopped in its tracks by the mainstream media in years past. It will be up to historians to determine the merits of Donald Trump’s presidency, but his success at the ballot box would have been impossible before the advent of social media.

Of course, right now a Trump opponent is rolling his or her eyes at the President’s use—some would say manipulation—of his Twitter account, but it should be remembered that there would be no #MeToo moment or #BlackLivesMatter tidal wave revealing decades of pain and abuse were it not for the enormous power and reach of social media. In both of these instances, the entrenched establishment lost control of the narrative because millions of voices were suddenly able to speak and be heard. This is what most terrifies those in positions of previously unassailable power and influence: The average person can now wield a mighty sword to cut them down to size with just a fingertip tapping on a screen.

The nascent effort to combat “fake news” by empowering corporations and government agencies to ferret out information they deem unreliable—or perhaps embarrassing—seems to me to be nothing but a thinly veiled attempt by the establishment to reassert its control over what information is available in order to maintain its crumbling authority. Rumors, gossip, and pettiness have been baked into humanity since the dawn of civilization, but the official lies that have driven disastrous misadventures (we never did find those “weapons of mass destruction” in Iraq, did we?) are too numerous to count and have caused vastly more damage to our nation and its people.

We are likely much better off with a wild and uncontrollable social media environment that asks uncomfortable questions and attacks complacent assumptions. If people are sometimes insulted and misinformation is occasionally spread, this is a small price to pay for the incredibly free and open discussion that is now possible, and we would be fools indeed to have this wrested away from us because some are more comfortable with the hollow silence that would soon follow.

The common ground we find after free-wheeling debate is a firmer foundation than the shaky consensus forced upon us by stilling voices of dissent. We must, of course, learn how to avoid ad hominem attacks and cruel invective as we discuss difficult and divisive issues, but the Balkanization that so concerns Mr. Obama also might be characterized as the messy and maddening freedom to speak truth to power and challenge a status quo that many find unacceptable. It is normal and healthy for citizens in a democracy to disagree, and those who yearn for the good old days when those who owned the television broadcast licenses or printing presses decided what we would be allowed to hear or say are simply hoping that taking away the voices of the many will protect the power of the few.

No matter how many times experts and insiders assure us that strict social media censorship will produce peace, harmony, or security, don’t believe it for a second. We are much better off with the sloppy cacophony of voices and viewpoints that we have right now, and those who are pushing for more curated conformity and crass control deserve nothing other than a good kick in the pants—on social media.


The Problems Posed By To Kill A Mockingbird

Recent media reports regarding efforts by a school district in Biloxi, Mississippi, to drop To Kill A Mockingbird from its curriculum have generated understandable concern. As schools continue to grapple with both disorienting societal changes and increasing political polarization, we are inevitably going to see more challenges to specific classroom content and practices, which should concern any professional educator. Anger rarely results in good policy decisions.

Our societal discord certainly connects to broader questions regarding what we expect of our K-12 schools. That fine line between education and indoctrination will be ever more difficult to discern as educators struggle to find ways to challenge students to think without falling into the trap of preaching to them. However, given the well-documented deficiencies in critical thinking skills that colleges and employers must grapple with today, it is more important than ever to encourage our K-12 schools to shake students from their easy assumptions and comfortable mental inertia. The question is, of course, how best to do this.

I’ve taught To Kill A Mockingbird to high school students in the past, and they were often shocked to read about the routine degradations inherent in the entrenched racial discrimination of our nation’s history. If nothing else, the novel served as a lesson that allowed us to ladder into discussions about what has—and still has not—changed in America today. It has been many years since I’ve had the opportunity to teach this particular novel, but I suspect that my classroom lessons and activities regarding To Kill A Mockingbird would need to be very different now because I would be compelled to address uncomfortable changes in our perceptions of the characters and their motivations.

The cartoonish delineation between the heroes and villains in To Kill A Mockingbird has always posed pedagogical problems, although it eases reading comprehension for an audience often composed of 8th or 9th graders. On the one side we have the Ewell family, who are a caricature of what we expect—and perhaps prefer—our racists to be, an ignorant and violent clan devoid of even an iota of decency or honesty. Facing off against them, we have Atticus Finch, a caring and compassionate lawyer and tragic widower raising two intelligent and inquisitive children who are miraculously free of the least taint of racism. Caught in the middle we have Tom Robinson, falsely accused of rape by the evil Ewells, and the very personification of stoic dignity in the face of injustice. There are no shades of gray among these main characters; there are only, if I may be forgiven this analogy, broad strokes of black and white.

To Kill A Mockingbird, were it to be published today, would likely face a somewhat more mixed critical reception. Aunt Alexandra’s desperate efforts to put a gloss of girlishness on the tomboyish Scout would likely be more harshly judged by contemporary feminist critics. Mr. Dolphus Raymond’s sexual relationships with African-American women would raise questions regarding power differentials and consent. Boo Radley’s peculiar interest in his prepubescent neighbors, which obviously includes covertly observing them and following them outside the house at night, might not be so wondrously free of any question of pedophilia—or at least “stranger danger”—in today’s less innocent world. It may well be that the year of the novel’s publication back in the mists of 1960 was the very last moment in our cultural and social history when the questions and answers seemed quite obvious and easy, so complexity and nuance could be blithely set aside in the pursuit of an uplifting fable.

I’ve always been a bit leery of joining in the chorus of hosannas regarding To Kill A Mockingbird, and perhaps this is because I have always found Atticus Finch a bit less than admirable—which I realize is near to sacrilege to some. Although he has the best possible intentions in the worst possible situation, Atticus Finch and his legal machinations, in a final and flinty-eyed analysis of outcomes, actually come to nothing. Tom Robinson is dead, no minds are changed, and the Jim Crow system that informs the actions of the town and its people is wholly unaffected.

Atticus Finch’s attitudes and actions are in many respects a foreshadowing of the well-meaning (but ultimately ineffectual) white liberals of the 1960s whose best intentions would be overrun by the flame and fury that finally destroyed Jim Crow segregation and its many local permutations. Although the novel suggests that readers should derive some cosmic satisfaction from the death of the thoroughly despicable Bob Ewell, which also allowed Boo Radley to finally reveal his essential human decency (although it might reasonably be observed that manslaughter is a mighty odd plot device to get there), it would be impossible to argue that the trial of Tom Robinson produced any significant changes in the town or its people.

Of course, all of this speaks to the many moral compromises that inform the book. The worst of the town of Maycomb and its racist attitudes is on display, but the many small yet significant accommodations that even decent people must make each day to survive in an indecent world also bear examination. It could be argued, if one really was looking for hope for a better future, that the most moral course of action Atticus Finch could have pursued would have been to refuse to represent Tom Robinson, thereby removing the thin veneer of respectability that placates those whose mute compliance is needed. Imagine how different the novel would have been if Judge Taylor had not been able to use Atticus’ stirring but pointless speech to soothe the consciences of those who knew just how profound an injustice was being done. Moral but meaningless victories serve tyrannies that need to smooth over the rawness of oppression, and we should not fail to recognize that Atticus’ carefully restrained outrage sounded lovely but changed nothing at all.

All of this is, of course, beside the point of why the novel is now often banned. The norms that now rule in many communities judge the politically incorrect—but historically accurate—usage of the “N-Word” as both insult and casual descriptor to be too much to bear in our sensitive school and social climates. This is understandable, but that very language also opens up opportunities for classroom discussion of the novel and its context. If we are going to crusade to excise every questionable bit of U.S. history from our schools instead of engaging in the conversation, research, and exploration of our past that is a core mission of education, we condemn our children to facile sloganeering instead of intelligent and well-rounded inquiry that will prepare them for a future where the answers will be neither obvious nor easy.

Perhaps the key to continuing to use To Kill A Mockingbird in our nation’s classrooms is to gently remove it from its pedestal and recognize its limitations—just as acknowledging our own human limitations is the precursor to a better understanding of our world and ourselves. To Kill A Mockingbird is not a perfect novel, and the tiresome insistence on canonizing it impedes an honest engagement with what can be learned from a thoughtful and critical reading. Just as a person can be wonderful yet flawed, so too can a book. If we can accept this, perhaps we can finally move forward instead of squabbling without end, which ultimately does nothing to improve the education of our children.


Can We Survive If “The Center” Is Gone?

We are all defined by our life experiences.

Wherever we grow up, whatever individual circumstances shape our lives, and whomever we interact with all combine to form our perceptions of ourselves and the world in which we live. Moreover, understanding and sharing our life stories can instruct—and sometimes inspire—others. To forget the influences that made us who we are is to, in a sense, forget ourselves, and our personal narratives also help to enhance our understanding of history by giving it a human face. This all makes it important to collect, preserve, and celebrate our life stories and the life stories of those around us.

For example, when I was growing up, one of my favorite books was Reach For The Sky by Paul Brickhill. This account of the life of Douglas Bader, a Royal Air Force pilot who lost both of his legs in an airplane crash, was forced to leave the service on disability—and yet persevered to return to the R.A.F. and become one of Britain’s greatest military leaders and fighter aces during World War II—is an amazing tribute to both personal bravery and resilience under the most difficult of life circumstances. It certainly put whatever adolescent concerns I might have had about a stray pimple in their proper perspective and taught me a valuable lesson about never giving up no matter what obstacles life or fate might throw in your path.

Globally speaking, personal narratives—or at least the illusion of them—have been both entertainment and moral instruction since the dawn of civilization. The Iliad and The Odyssey, the Mahabharata, The Holy Bible, Beowulf, The Song of Roland, Le Morte D’Arthur, and so many others have taught countless generations right from wrong, honor from disgrace, and good from evil. These narratives and others like them, whether sacred text or epic tale, have served as the essential glue binding together societies, nations, continents, and our entire planet by both transmitting shared values and creating institutions that have served as the foundations of governance and justice up until the present day. To put it plainly, without the many life lessons gleaned from these texts and our reactions to them, who we are today would simply not exist.

Today we live in the golden age of the personal narrative, and the advent of powerful and omnipresent technology now allows us to share our stories with a worldwide audience. For perhaps the first time in human history all voices can be heard, all stories shared, and all lives celebrated via an iPhone or an Internet connection. What an amazing world it is.

However, the downside of this multiplicity of voices and viewpoints is that our common cultures and shared values are being rapidly obliterated by the combined opinions of an entire planet of individuals who are all asserting the primacy and correctness of their particular needs and wants. Now more people than ever—especially those who live in our large urban media centers—essentially curate their own idiosyncratic set of personal values from all that is available. Given the infinite possibilities inherent in the cafeteria-style morality now available via Google, that which separates or unites many people is less dependent than ever on national boundaries, traditional cultural beliefs, or religious institutions. There is instead a new globalized system in their place, bypassing and supplanting that which bound us to our immediate neighbors for many, many centuries.

Given that traditions and institutions that once acted as arbiters and guidelines regarding taste and social norms have now been discarded in favor of what could—with perhaps a trace of irony—be called “crowdsourced individuality”, we find that those most comfortable with the norms of this fluid and ever-changing milieu—actors, entertainers, and media personalities—are now most often called upon to pronounce judgment on the issues facing us. What is truly remarkable about the world we live in today is that celebrities are routinely asked to offer opinions on matters of war and peace, the stewardship of resources, international diplomacy, immigration policy, and a host of other issues—and their opinions are dutifully reported as actual news on front pages around the globe. Think carefully for a moment: Do you recall anyone checking with Humphrey Bogart or Katharine Hepburn before we declared war on Japan after Pearl Harbor? Did President Kennedy worry whether Elvis was on his side during the Cuban Missile Crisis? Behavior that is, if you stop a moment to think, utterly bizarre has become quite commonplace.

Unsurprisingly, some find shrugging off the societal shackles of many millennia incredibly liberating—and insist that we all celebrate their personal paths toward whatever lifestyles or experiences will maximize their happiness. However, others obviously find the erasure of long-held cultural and moral norms to be either stressful or troubling. Nonetheless, ditching all that created a common humanity so a relative few can pursue their personal journeys does not seem to concern the media elites that now drive our national conversations. Considering the matter broadly, we could ask whether we are living in a wonderful moment in human history or acting as the avatars of the end of national, cultural, and societal cohesion—but few seem to care to inquire further.

So this is where we are today. Our personal narratives and individual judgments have now become the unassailable—and sole—guides to how to live life for an ever-growing portion of our global population. Therefore, conversations about what is right or wrong, honorable or disgraceful, and good or evil have become impossible. In fact, merely to assert that some behavior is right, wrong, honorable, disgraceful, good, or evil is to make a judgment about someone else’s idiosyncratic curation of their values that is often considered insulting or intolerant, which makes reasoned discussions about any issue or concern very, very difficult indeed.

I am not against embracing our personal narratives or pursuing personal self-fulfillment; I am, however, concerned that our zeal for elevating the needs of the individual over those of the group is a prescription for the unending paralysis of direction and purpose—at a time when definitive and perhaps painful actions are needed to meet a host of challenges. There will, given the enormity and complexity of the problems facing our nation and world, be a time in the very near future when cooperative sacrifices will be necessary for the common good, and I am not at all certain we are going to be able to muster up anything beyond endless bickering about the solutions—if we can even manage to agree on the problems. With apologies to William Butler Yeats, no civilization can continue to exist unless a boring, stable—and perhaps to some slightly judgmental—center is allowed to hold.

Is it really a problem that we are fixated on individual stories and personal dramas that grab our attention rather than national and global matters that will assuredly impact our country? Perhaps some comparisons will prove instructive. Think just a moment, for example, about the time wasted on news articles about the age disparity between the new President of France and his wife versus the coverage of the pension crisis right here in the United States. Have you heard more about recent—and ominous—test firings of ballistic missiles by North Korea or the marital or financial woes of any one of a dozen Hollywood stars? Would it, sad to say, be easier for most Americans to name the nine starters on their favorite baseball team or the nine Justices of the U.S. Supreme Court?

We might be able to muddle along wrapped in our oblivious self-absorption a bit longer, but I fear a day of reckoning is at hand that we are wholly unprepared to meet because many of us can see no further than the tips of our own lovely noses. This will be too bad for us—and for the generations to follow who will likely be stuck with cleaning up the many problems we happily ignored while updating our Facebook pages.

What Can Be Done To Improve Teaching In Our Public Schools?

If you Google the terms “Teacher Shortage” and “Teacher Turnover”, your hits will light up rather forebodingly. Obviously local conditions affect individual districts in a variety of ways, so not all schools or regions are suffering to the same degree. However, there does seem to be a fairly broad-based national problem with the recruitment and retention of K-12 teachers, yet one more burden weighing on our public schools.

Talented individuals leave the teaching profession—or avoid it altogether—for a variety of reasons. Poor pay, stress, lack of professional support, workplace dysfunction, administrative micro-management, long hours of bureaucratic busywork, disrespect and abuse from students and their parents, and many other factors have made—and will continue to make—it difficult to recruit and retain top-quality elementary and secondary educators. Moreover, our current model of training and credentialing teachers is costly and time-consuming, yet it still leaves many graduates lacking the basic pedagogical and student management skills necessary for creating a respectful and successful classroom environment.

I remember my own trip through “teacher education” after I left the advertising business—what an incredible waste of time and money. Although alternative teacher training programs obviously exist, the profession is still dominated by the traditional model of teacher training and licensing, which—despite its documented shortcomings—persists because it is a cash cow for colleges and universities and creates lots and lots of jobs for local and state education bureaucrats.

The meandering and pointless journey through Ed School coursework that often seems only tangentially related to actual classroom practice also serves as a gigantic disincentive to mid-career entry for those who can bring real world experience to their teaching. This seems an obvious drawback—and it is generally acknowledged to be so—but we still blindly press forward with a model that pushes twenty-two-year-olds who have done nothing much other than sit in a classroom for their entire lives into yet another classroom—where they will now try to educate our children. Doesn’t make much sense, does it?

Even worse, the pot of gold at the end of the rainbow no longer seems to be attractive enough to hang onto many who go through the process—as evidenced by the many problems with recruitment and retention. What can be done?

I have a modest proposal… four, really.

Close all Colleges of Education.

Why continue to support a system that produces graduates who often don’t turn out to be very good teachers—or who quickly quit the profession altogether because they just can’t cut it? Our schools of “mis-education” are clearly not up to the job, and noodling around the edges with some cosmetic changes after year upon year of study and discussion is just a further waste of time, money, and human capital.

A simpler and more direct system of teacher training and licensure can certainly be devised, but it will run into a brick wall of bureaucratic resistance unless the public is willing to push for change. Obviously, I believe our students and society will benefit if we improve and streamline teacher training, but I fear that the political will—and this is all about politics—to do what is necessary is lacking. As long as one of our two major political parties is a wholly-owned subsidiary of the National Education Association, bold and inventive thinking will certainly be discouraged by many.

Require that all new teachers have worked in jobs outside of education for five years before they step into a classroom.

Maybe I’m just looking at this all wrong, but I want people in my public schools who have lived some kind of life outside of a classroom before teaching my children. Work in a restaurant. Sell insurance. Join the Army. Presenting our public schools with truckloads of fresh college graduates who more than likely still had their moms doing their laundry the week before they begin teaching for the first time is kind of insane. We need individuals with a little grit under their nails and life experience under their belts so they are better prepared to deal with the challenges that now daily face our K-12 teachers.

License teachers based on proven performance—not college or continuing education credits alone.

Yes, teachers should go to college—and preferably graduate school—to earn degrees in the subject area that they plan to teach (no more “education” majors, please!). In addition, teachers should continue to sharpen their skills with classes and workshops throughout their careers.

However, how cool would it be to unlock the schoolhouse doors and get some real world experience into our nation’s classrooms? What parent wouldn’t be thrilled to have a chemist teaching their child Chemistry, a Physician Assistant teaching anatomy in their local high school, an editor helping their child learn how to write, or a local farmer showing that lucky student the practical aspects of how the business actually works? The possibilities would be endless, but it might obviously endanger an ossified status quo that likes everything just the way it is because it privileges paper credentials over job-proven competence.

Community colleges, for example, make outstanding use of practitioners in their classrooms—which is a great benefit to the students who are there to learn the skills they need to succeed. Recruiting teachers from the workplace is a proven winner at the two-year college level, so why not extend this strategy to K-12?

If these working professionals can teach at their local public school only part of the day or the year, offer them a prorated salary and don’t waste their valuable time making them jump through a million hoops in order to share their hard-won experience—and please don’t assign them to bus duty or lunch supervision. We desperately need more practitioners and fewer pretenders in our classrooms—particularly in our middle and high schools where content knowledge is so important. Assign the brain-dead busywork to minimum wage hires or parent volunteers.

End teacher tenure and pay based on seniority.

I know changing to a free market for teacher hires runs counter to the civil service model that has dominated public education for many, many decades, but it would be a game changer. It would encourage excellence, create desperately needed fluidity in the job market, and incentivize mid-career entrants who could bring job skills and life experience into the classroom. If we start to pay teachers based on their value rather than how many years they’ve sat in a school building—a measure that is typically divorced from actual performance—we can start to address the many problems caused by our highly uncompetitive system.

Can’t find a Math teacher for your district? Hire a local engineer who can show students how mathematics is used in the real world—and has the actual work experience to teach it. Need a Business teacher? Hire a manager at a local manufacturer—and also build a bridge with a local employer that might hire your graduates. Have a truly wonderful Music teacher you want to hold onto? Find a way to adjust their duties so they’ll want to stick around—instead of treating that talented person like just another replaceable cog.

In other words, instead of running public schools like hermetically-sealed vaults, open the doors and innovate. It will be scary as hell for some and drive the “edu-crats” crazy because they will lose control—and perhaps their jobs as well—but the alternative is to continue to sacrifice our children on an altar built out of rulebooks and dusty theories about education that do nothing but ensure that little learning actually happens. Think this isn’t true? Google the “college and career-ready” or “college preparedness” statistics for your local school or entire state and decide for yourself. The actual data can be a real eye-opener.

I think the time for a real education revolution is long overdue—and maybe we are now ready for it!

The Free Speech Conundrum

One would need to have been living under an extraordinarily large rock over the past couple of decades to be unaware of the ongoing war between the advocates of free speech and those of “political correctness”. On the one hand we have those who insist that it is sometimes both useful and necessary to express thoughts and opinions that differ—and, as a result, may offend some—because listening to all sides of an issue is a necessary precursor to the intellectual rigor that leads to good judgment and decisions. On the other hand we have those who are equally insistent that we must banish all words and thoughts that make anyone uncomfortable in order to create a society where all feel welcome and valued because to cause anyone to feel criticized or excluded is both wrong and wrongheaded.

These battle lines have formed virtually everywhere we look, and the controversies these opposing ideas cause inevitably crop up in our neighborhoods, schools, houses of worship, and workplaces. When these conflicts arise, government is often called upon to act as an arbiter and write rules that govern our daily interactions with one another—an increasingly expansive mandate guaranteed to offend those who dislike restrictions on their freedom of speech. Although “speech codes” are increasingly a facet of our lives, I generally frown upon them because, while recognizing there is a clear difference between discussing and insulting, I agree with those who believe that we sometimes need to suffer the idiocy of the few to protect the rights of the many.

Many claim that the surprising—some might say shocking—victory of Donald Trump was fueled by the rejection of politically correct cultural norms on the part of a large portion of the electorate. This has, in turn, led many to conclude that racists and misogynists determined the outcome of the election, which seems to have hardened opinions on both sides. Rather than lead to a reasoned discussion of the appropriate balance between open expression and respectful behavior, the 2016 Presidential election has caused advocates on both sides to simply snarl at one another while lobbing charges of “intolerance” across the great divide.

It would be foolish to claim that, for example, racism and misogyny do not exist in our nation; there is no doubt that some voted as they did because of Hillary Clinton’s gender or Barack Obama’s race. However, it is equally nonsensical to assert that Donald Trump’s victory was due to nothing other than sheer bigotry. To do so would be to ignore widespread discontent with the outsourcing of jobs, the cost and management of government programs, terrorism at home and abroad, immigration policy, and the cataclysmic failures of the Affordable Care Act. The positions of most voters, if you take the time to speak to them, are quite nuanced and thoughtful, and to paint those who voted one way or another with an overly broad brush is, I believe, a demonstration of one’s inability to recognize the validity of opposing viewpoints.

I will readily admit that I am a fierce proponent of free speech, but I also recognize that people sometimes deliberately use words to wound rather than enlighten. Simply as a matter of common courtesy and human consideration for the feelings of others, we should always frame our disagreements and discussions in a manner that avoids unnecessary hurt and pointless invective. Insult is the shortcut of the intellectually weak, and it should not be a surprise—although it is scarcely a comfort—when we are subjected to a barrage of f-bombs from someone who cannot otherwise figure out how to express their feelings or beliefs.

However, insult is not solely the purview of the uneducated; our college and university campuses are far too well known today for the flame-throwing rhetoric aimed at those who have the audacity to challenge the herd. If, despite your education, you are unable to convince someone of your point of view on the merits of your argument, characterizing those who believe differently as idiots or bigots is often a sign of intellectual laziness not much different from that of the buffoon who showers curses upon those who disagree.

Explaining what you think—and why—to others is time-consuming and occasionally maddening. Even worse, sometimes our cherished values and beliefs collide with an alternate reality that shocks and angers us because we have not been exposed to viewpoints that are different from our own. Our entirely understandable egocentricity leads us to believe that we are right and others are wrong, but this natural bias seems to me to have been exacerbated in recent years by cultural patterns that encourage insularity. We more and more listen only to people and information that reinforce our existing viewpoints, and we are increasingly confident that wrapping ourselves in a smug bubble is appropriate because those who think differently are not merely people with contrary ideas—they are ignorant and downright nasty.

The various bubble worlds that we inhabit are intensely self-comforting, but they are also dangerous and damaging. Any time we close our ears and our minds to ideas other than our own, we put ourselves and others at risk, eliminate the possibility of functional compromises, ratchet up the level of societal discord—and wall ourselves off from the possibility of personal growth.

Most of us probably don’t have to work very hard to think of examples of people and situations where we felt that our viewpoints were dismissed out of hand because they contradicted someone’s accepted narrative. I’ve experienced this a number of times recently, and I can tell you that it’s pretty darned upsetting to be denigrated because you believe something different. Still worse, it can be terribly disheartening to watch the other person grow only more inflexible the more reasonably you try to explain your viewpoint.

I believe most would agree that to refuse to listen to someone and insist that you are absolutely correct is not only incredibly disrespectful—it is also quite arrogant and annoying—and we should all avoid behaving in this manner. Speaking as an educator, I find it frustrating if this happens with students, but it is doubly disturbing when I encounter this problem with colleagues whose job is, by definition, supposed to be about keeping an open mind and seeking to broaden the understanding of others. When the teachers decide that believing or teaching only one side of an issue is A-OK, we have crossed a line that calls into question the very purpose of our profession.