Conspiracy Theories Or Reasonable Questions?

As long as humanity has had a toehold on terra firma, we have looked for someone to blame for our woes. Our many problems, which for much of our history were blamed on either the disfavor or caprices of the gods, now are typically blamed on human agents—who are usually part of some cabal out to fool and manipulate us.

Whether we are seeking those behind the JFK assassination, the true story behind 9/11, those UFOs parked in Area 51, or the UN office behind Agenda 21 (perhaps our better conspiracies end with the number 1!), many are convinced that dark forces with malevolent motivations are controlling our world in pursuit of one dastardly agenda or another.

It is, of course, simple human nature to demand a simple explanation for catastrophe. Placating powerful gods at one time consumed those portions of our short and often brutal lives when we were not already engrossed with scratching our meager livings from the earth. Those who claimed to be able to divine and communicate with forces beyond our understanding always were able to win favor, and if some degree of protection from pain or horror might be secured through either ritual or avoiding proscribed behaviors, there would always be a ready audience for such notions. Our compelling interests in avoiding famine, flood, fire, and disease baked a certain degree of easy credulity into humanity’s DNA over the course of many thousands of years, and we must recognize this inheritance is within us all.

Today the thunder of the gods has receded somewhat, and our shamans are typically scientists. Based upon their sage advice, we gulp supplements, avoid bacon and cigarettes, run on treadmills like hamsters, slather on sunscreen, and assiduously attempt to forestall the inevitable deaths of both ourselves and those whom we love by seeking out the secrets to our ever elusive immortality. These behaviors are, by and large, fairly benign and typically work to our benefit. When, for example, is the last time you met someone sporting a large and unsightly goiter—and do most of us even know what this is anymore?

However, the flip side of our credulous belief in the wonders of science as an agent for individual improvement is our bizarre belief in the perfectibility of humanity itself. Hence our willingness to embrace ideas based on the crudest eugenic theories and our obsession with elevating ourselves—while degrading others—based upon what are ultimately the most minute variations in our genes. The hatreds and warfare that have soaked our species in blood now more typically manifest themselves in cartoonish characterizations that are more laughable than dangerous—although ethnic and racial slaughters still pop up around the world with depressing regularity. We obviously still have quite a way to go before we entirely stamp out stupidity.

Recognizing our twin desires to both avoid disaster—and to know whom to blame when it befalls us—is necessary if we are to fully understand many of the political and social problems besetting our nation and our world. As our global affairs have become more complex and interdependent, the opportunities for exploitation have multiplied exponentially, and government and multinational corporations—often working hand in glove—have become the golden idols at the center of our lives. Far more powerful, intrusive, and frightening than the supposedly omnipotent gods of old, the power of government and industry to grant stupendous wealth, poison our bodies and minds, destroy our planet, provide uncounted comforts and distractions, take away our property and livelihoods, either greatly extend or savagely shorten our lives, and ultimately control every facet of our existences is unprecedented in human history. Zeus and Apollo were mere amateurs compared to Google and Goldman Sachs.

It should not be much of a surprise that our fear and wonder drive us to anxiously search for patterns and clues to help avoid the wrath of these new and implacable gods—and seek the reasons why they insist on punishing so many of us. Some call them conspiracy theories. More times than we realize, they may be remarkably reasonable questions about our remarkably unreasonable world.

Our grim awareness of the naked and shameless lust for wealth and power that drives so many who now control our lives makes the construction of the conspiracy theories/querulous narratives that animate our discussions all the easier. Understanding the extremist ideologies that undergirded so much of the Cold War, it is easier to imagine whispered instructions from a secretive group ordering the murder of a President. Knowing of the desire of multinational corporations and their government cronies to secure control of Mideast oil supplies, one need not work too hard to see a stupendous plot to fake a terrorist attack against America in order to justify endless war. Having been kept in the dark about so many secret government military projects, it is easy to see little green men in flying saucers as a plausible explanation for so many of those bright lights in the night sky. Observing the never-ending violence and drug traffic in our inner cities involving African-Americans, it is little wonder that so many are certain this is being facilitated by the government as part of a genocidal war of extermination.

Given the craven and corrupt behavior that is now so common among government officials and business executives, are we paranoid to believe that our needs come far behind the interests of those in power who are chasing riches and influence? One could, of course, argue that dishonesty and avarice have defined the leaders of every age—why else, after all, would one chase high office in government or business? However, we perhaps have a confluence of circumstances today that heightens the stench that often emanates from the halls of power.

The Information Age has been a boon to the average American citizen—and a bane to those in power. Although political scientists often trace our loss of faith in our leadership back to the twin national traumas of the Vietnam War and Watergate, I suspect the crux of the problem is the more adversarial style of journalism these scandals helped to create and the rise of alternative news sources—first in print and later through the World Wide Web. Just as putting a brighter lightbulb in a room causes one to suddenly notice the stained carpet and peeling paint, so has the variety and visibility of news and opinion targeted to—and now increasingly produced by—the masses led to an enormous range of information that speaks to the fears and concerns of virtually everyone.

Although muckrakers and iconoclasts like Ida Tarbell and I. F. Stone played influential roles in shaping opinion earlier in the 20th century, the 24/7 news and information cycle—and the many to whom cheap and powerful technology has now given an unsanctioned and unrestrained voice—has made it virtually impossible for the crooked and corrupt to fly beneath the radar undetected. This visibility produces a higher degree of accountability, but it also calls the motives and methods of business and government—today’s omnipotent yet mysterious gods—into question on a daily basis.

If this scrutiny produces more “conspiracy theories”, so be it. The rich and the powerful are perfectly able to defend themselves if the suspicions of impropriety are unwarranted. If we are compelled to listen to outlandish notions on occasion—only to have them later debunked—I do not find this too high a price to pay for the ongoing oversight that is now possible. If those in charge want our trust, perhaps they had best conduct themselves in a manner that is above suspicion. If not, we should be free to arrive at our own judgments concerning their veracity and good intentions.

Let Our People Tweet!

In a recent interview, Barack Obama made the following observation regarding the promise—and pitfalls—of the rapid growth of social media in our hyper-politicized age: “The question has to do with how do we harness this technology in a way that allows a multiplicity of voices, allows a diversity of views, but doesn’t lead to a Balkanization of society and allows ways of finding common ground.” This is a good question, but it may miss the mark just slightly—as many perfectly reasonable questions sometimes do.

The ever-expanding range of social media—everything from Facebook to Twitter to Snapchat and beyond—has fundamentally changed our political, personal, and social discourse in ways we are still struggling to understand. Who, for example, had heard of “hashtag activism” a scant few years ago or would have foreseen the manner in which a political neophyte could leverage his love of “tweeting” into the highest elected office in our nation?

Politicians, reporters, businesspeople, celebrities, athletes, and others now race to provide their instantaneous reactions—we cannot possibly call them analysis—regarding every twitch in the fabric of our world. No event or statement—no matter how momentous or mundane—seems beyond comment, and YouTube personalities now rake in six- and seven-figure incomes for sharing (or perhaps oversharing) every aspect of their daily lives. Our planet’s population has become a global network of symbiotic exhibitionists and voyeurs, each dependent upon the other for the peculiar gratifications of either posing or peering. It is sometimes a wonder that anyone finds the time to brush their teeth between checking online, posting, and anxiously waiting for the “likes” to appear.

As a result, privacy is now nearly synonymous with invisibility, which has both individual and cultural consequences we can only begin to fathom today. We should, however, by now recognize the drawbacks inherent in engaging with social media in a manner that slices and dices individuals into ever-smaller subgroups based upon identities, interests, and political leanings. Although shared community can certainly result from, for example, finding Facebook “friends” who are just like you—and actively “unfriending” those who are not—this can easily slip into the Balkanization that concerns Mr. Obama. The myopic view of the world that results from communing exclusively with those who agree with everything you say produces the mental flabbiness and smug certitude that have helped to poison so many of our national conversations. Speaking only to those like ourselves surely separates us from one another—and impedes honest discussion.

However, this being acknowledged, I believe that Mr. Obama neglected to emphasize perhaps the greatest benefit of social media: the removal of mediators and filters that decide how information is transmitted—or whether it is transmitted at all. I am old enough to remember when a mere handful of major networks and newspapers were able to impose a virtual information hegemony upon our nation, which turned them into arbiters, gatekeepers, and kingmakers—and drastically narrowed the range of information and opinions available. Perhaps the most startling—or, for some, terrifying—aspect of last year’s Presidential election was that Donald Trump won without a single endorsement from a major news outlet and slogged on to victory while thumbing his nose at their repeated disparagements. This was, no matter how it might otherwise be spun, a stunning populist victory that would most certainly have been stopped in its tracks by the mainstream media in years past. It will be up to historians to determine the merits of Donald Trump’s presidency, but his success at the ballot box would have been impossible before the advent of social media.

Of course, right now a Trump opponent is rolling his or her eyes at his use—some would say manipulation—of his Twitter account, but it should be remembered that there would be no #MeToo moment or #BlackLivesMatter tidal wave revealing decades of pain and abuse were it not for the enormous power and reach of social media. In both of these instances, the entrenched establishment lost control of the narrative because millions of voices were suddenly able to speak and be heard. This is what most terrifies those in positions of previously unassailable power and influence: The average person can now wield a mighty sword to cut them down to size with just the tip of their finger tapping on a screen.

The nascent effort to combat “fake news” by empowering corporations and government agencies to ferret out information they deem unreliable—or perhaps embarrassing—seems to me to be nothing but a thinly veiled attempt by the establishment to reassert their control over what information is available in order to maintain their crumbling authority. Rumors, gossip, and pettiness have been baked into humanity since the dawn of civilization, but the official lies that have driven disastrous misadventures (we never did find those “weapons of mass destruction” in Iraq, did we?) are too numerous to enumerate and have caused vastly more damage to our nation and its people.

We are likely much better off with a wild and uncontrollable social media environment that asks uncomfortable questions and attacks complacent assumptions. If people are sometimes insulted and misinformation is occasionally spread, this is a small price to pay for the incredibly free and open discussion that is now possible, and we would be fools indeed to have this wrested away from us because some are more comfortable with the hollow silence that would soon follow.

The common ground we find after free-wheeling debate is a firmer foundation than the shaky consensus forced upon us by stilling voices of dissent. We must, of course, learn how to avoid ad hominem attacks and cruel invective as we discuss difficult and divisive issues, but the Balkanization that so concerns Mr. Obama also might be characterized as the messy and maddening freedom to speak truth to power and challenge a status quo that many find unacceptable. It is normal and healthy for citizens in a democracy to disagree, and those who yearn for the good old days when those who owned the television broadcast licenses or printing presses decided what we would be allowed to hear or say are simply hoping that taking away the voices of the many will protect the power of the few.

No matter how many times experts and insiders assure us that strict social media censorship will produce peace, harmony, or security, don’t believe it for a second. We are much better off with the sloppy cacophony of voices and viewpoints that we have right now, and those who are pushing for more curated conformity and crass control deserve nothing other than a good kick in the pants—on social media.

The Problems Posed By To Kill A Mockingbird

Recent media reports regarding efforts by a school district in Biloxi, Mississippi, to drop To Kill A Mockingbird from its curriculum have generated understandable concern. As schools continue to grapple with both disorienting societal changes and increasing political polarization, we are inevitably going to see more challenges to specific classroom content and practices, which should concern any professional educator. Anger rarely results in good policy decisions.

Our societal discord certainly connects to broader questions regarding what we expect of our K-12 schools. That fine line between education and indoctrination will be ever more difficult to discern as educators struggle to find ways to challenge students to think without falling into the trap of preaching to them. However, given the well-documented deficiencies in critical thinking skills that colleges and employers must grapple with today, it is more important than ever to encourage our K-12 schools to shake students from their easy assumptions and comfortable mental inertia. The question is, of course, how best to do this.

I’ve taught To Kill A Mockingbird to high school students in the past, and they were often shocked to read about the routine degradations inherent in the entrenched racial discrimination of our nation’s history. If nothing else, the novel served as a lesson that allowed us to ladder into discussions about what has—and still has not—changed in America today. It has been many years since I’ve had the opportunity to teach this particular novel, but I suspect that my classroom lessons and activities regarding To Kill A Mockingbird would need to be very different now because I would be compelled to address uncomfortable changes in our perceptions of the characters and their motivations.

The cartoonish delineation between the heroes and villains in To Kill A Mockingbird has always posed pedagogical problems, although it eases reading comprehension for an audience often composed of 8th or 9th graders. On the one side we have the Ewell family, who are a caricature of what we expect—and perhaps prefer—our racists to be, an ignorant and violent clan devoid of even an iota of decency or honesty. Facing off against them, we have Atticus Finch, a caring and compassionate lawyer and tragic widower raising two intelligent and inquisitive children who are miraculously free of the least taint of racism. Caught in the middle we have Tom Robinson, falsely accused of rape by the evil Ewells, and the very personification of stoic dignity in the face of injustice. There are no shades of gray among these main characters; there are only, if I may be forgiven this analogy, broad strokes of black and white.

To Kill A Mockingbird, were it to be published today, would likely face a somewhat more mixed critical reception. Aunt Alexandra’s desperate efforts to put a gloss of girlishness on the tomboyish Scout would likely be more harshly judged by contemporary feminist critics. Mr. Dolphus Raymond’s sexual relationships with African-American women would raise questions regarding power differentials and consent. Boo Radley’s peculiar interest in his prepubescent neighbors, which obviously includes covertly observing them and following them outside the house at night, might not be so wondrously free of any question of pedophilia—or at least “stranger danger”—in today’s less innocent world. It may well be that the year of the novel’s publication back in the mists of 1960 was the very last moment in our cultural and social history when the questions and answers seemed quite obvious and easy, so complexity and nuance could be blithely set aside in the pursuit of an uplifting fable.

I’ve always been a bit leery of joining in the chorus of hosannas regarding To Kill A Mockingbird, and perhaps this is because I have always found Atticus Finch a bit less than admirable—which I realize is near to sacrilege to some. Although he has the best possible intentions in the worst possible situation, Atticus Finch and his legal machinations, in a final and flinty-eyed analysis of outcomes, actually come to nothing. Tom Robinson is dead, no minds are changed, and the Jim Crow system that informs the actions of the town and its people is wholly unaffected.

Atticus Finch’s attitudes and actions are in many respects a foreshadowing of the well-meaning (but ultimately ineffectual) white liberals in the 1960s whose best intentions would be overrun by the flame and fury that finally destroyed Jim Crow segregation and its many local permutations. Although the novel suggests that readers should derive some cosmic satisfaction from the death of the thoroughly despicable Bob Ewell, which also allowed Boo Radley to finally reveal his essential human decency (although it might be reasonably observed that manslaughter is a mighty odd plot device to get there), it would be impossible to argue that the trial of Tom Robinson produced any significant changes in the town or its people.

Of course, all of this speaks to the many moral compromises that inform the book. The worst of the town of Maycomb and its racist attitudes is on display, but the best of the many small but significant accommodations that decent people must make each day to survive in an indecent world also bears our examination. It could be argued, if one really was looking for hope for a better future, that the most moral course of action Atticus Finch could have pursued would have been to refuse to represent Tom Robinson, thereby removing the thin veneer of respectability that placates those whose mute compliance is needed. Imagine how different the novel would have been if Judge Taylor had not been able to use Atticus’ stirring but pointless speech to soothe the consciences of those who knew just how profound an injustice was being done. Moral but meaningless victories serve the needs of tyrannies that must smooth over the rawness of oppression, and we should not fail to recognize that Atticus’ carefully restrained outrage sounded lovely but changed nothing at all.

All of this is, of course, beside the point of why the novel is now often banned. The norms that now rule in many communities judge the politically incorrect—but historically accurate—usage of the “N-Word” as both insult and casual descriptor to be too much to bear in our sensitive school and social climates. This is understandable, but it also opens up opportunities for classroom discussion of the novel and its context. If we are going to crusade to excise every questionable bit of U.S. history from our schools instead of engaging in the conversation, research, and exploration of our past that is a core mission of education, we condemn our children to facile sloganeering instead of intelligent and well-rounded inquiry that will prepare them for a future where the answers will be neither obvious nor easy.

Perhaps the key to continuing to use To Kill A Mockingbird in our nation’s classrooms is to gently remove it from its pedestal and recognize its limitations—just as acknowledging our own human limitations is the precursor to a better understanding of our world and ourselves. To Kill A Mockingbird is not a perfect novel, and the tiresome insistence on canonizing it impedes an honest engagement with what can be learned from a thoughtful and critical reading. Just as a person can be wonderful but flawed, so can a book fall into that same category. If we can accept this, perhaps we can finally move forward instead of squabbling without end, which ultimately does nothing to improve the education of our children.


Can We Survive If “The Center” Is Gone?

We are all defined by our life experiences.

Wherever we grow up, whatever individual circumstances shape our lives, and whomever we interact with all combine to form our perceptions of ourselves and the world in which we live. Moreover, understanding and sharing our life stories can instruct—and sometimes inspire—others. To forget the influences that made us who we are is to, in a sense, forget ourselves, and our personal narratives also help to enhance our understanding of history by giving it a human face. This all makes it important to collect, preserve, and celebrate our life stories and the life stories of those around us.

For example, when I was growing up, one of my favorite books was Reach For The Sky by Paul Brickhill. This account of the life of Douglas Bader, a Royal Air Force pilot who lost both of his legs in an airplane crash, was forced to leave the service on disability—and yet persevered to return to the R.A.F. and become one of Britain’s greatest military leaders and fighter aces during World War II—is an amazing tribute to both personal bravery and resilience under the most difficult of life circumstances. It certainly put whatever adolescent concerns I might have had about a stray pimple in their proper perspective and taught me a valuable lesson about never giving up no matter what obstacles life or fate might throw in your path.

Globally speaking, personal narratives—or at least the illusion of them—have been both entertainment and moral instruction since the dawn of civilization. The Iliad and The Odyssey, the Mahabharata, The Holy Bible, Beowulf, The Song of Roland, Le Morte D’Arthur, and so many others have taught countless generations right from wrong, honor from disgrace, and good from evil. These narratives and others like them, whether sacred text or epic tale, have served as the essential glue binding together societies, nations, continents, and our entire planet by both transmitting shared values and creating institutions that have formed the foundations of governance and justice up to the present day. To put it plainly, without the many life lessons gleaned from these texts and our reactions to them, who we are today would simply not exist.

Today we live in the golden age of the personal narrative, and the advent of powerful and omnipresent technology now allows us to share our stories with a worldwide audience. For perhaps the first time in human history all voices can be heard, all stories shared, and all lives celebrated via an iPhone or an Internet connection. What an amazing world it is.

However, the downside of this multiplicity of voices and viewpoints is that our common cultures and shared values are being rapidly obliterated by the combined opinions of an entire planet of individuals who are all asserting the primacy and correctness of their particular needs and wants. Now more people than ever—especially those who live in our large urban media centers—essentially curate their own idiosyncratic set of personal values from all that is available. Given the infinite possibilities inherent in the cafeteria-style morality now available via Google, that which separates or unites many people is less dependent than ever on national boundaries, traditional cultural beliefs, or religious institutions. A new globalized system has instead taken their place, bypassing and supplanting that which bound us to our immediate neighbors for many, many centuries.

Given that traditions and institutions that once acted as arbiters and guidelines regarding taste and social norms have now been discarded in favor of what could—with perhaps a trace of irony—be called “crowdsourced individuality”, we find that those most comfortable with the norms of this fluid and ever-changing milieu—actors, entertainers, and media personalities—are now most often called upon to pronounce judgment on the issues facing us. What is truly remarkable about the world we live in today is that celebrities are routinely asked to offer opinions on matters of war and peace, the stewardship of resources, international diplomacy, immigration policy, and a host of other issues—and their opinions are dutifully reported as actual news on front pages around the globe. Think carefully for a moment: Do you recall anyone checking with Humphrey Bogart or Katharine Hepburn before we declared war on Japan after Pearl Harbor? Did President Kennedy worry whether Elvis was on his side during the Cuban Missile Crisis? That which, if you stop a moment to think, is utterly bizarre is now quite commonplace.

Unsurprisingly, some find shrugging off the societal shackles of the many millennia incredibly liberating—and insist that we all celebrate their personal paths toward whatever lifestyles or experiences will maximize their happiness. However, others obviously find the erasure of long-held cultural and moral norms to be either stressful or troubling. Nonetheless, ditching all that created a common humanity so a relative few can pursue their personal journeys does not seem a concern for the media elites that now drive our national conversations. Considering the matter broadly, we could question whether we are living a wonderful moment in human history or acting as the avatars of the end of national, cultural, and societal cohesion—but few seem to care to inquire further regarding this.

So this is where we are today. Our personal narratives and individual judgments have now become the unassailable—and sole—guides to how to live life for an ever-growing portion of our global population. Therefore, conversations about what is right or wrong, honorable or disgraceful, and good or evil have become impossible. In fact, merely to assert that some behavior is right, wrong, honorable, disgraceful, good, or evil is to make a judgment about someone else’s idiosyncratic curation of their values that is often considered insulting or intolerant, which makes reasoned discussions about any issue or concern very, very difficult indeed.

I am not against embracing our personal narratives or pursuing personal self-fulfillment; I am, however, concerned that our zeal for elevating the needs of the individual over those of the group is a prescription for the unending paralysis of direction and purpose—at a time when definitive and perhaps painful actions are needed to meet a host of challenges. There will, given the enormity and complexity of the problems facing our nation and world, be a time in the very near future when cooperative sacrifices will be necessary for the common good, and I am not at all certain we are going to be able to muster up anything beyond endless bickering about the solutions—if we can even manage to agree on the problems. With apologies to William Butler Yeats, no civilization can continue to exist unless a boring, stable—and perhaps to some slightly judgmental—center is allowed to hold.

Is it really a problem that we are fixated on individual stories and personal dramas that grab our attention rather than national and global matters that will assuredly impact our country? Perhaps some comparisons will prove instructive. Think just a moment, for example, about the time wasted on news articles about the age disparity between the new President of France and his wife versus the coverage of the pension crisis right here in the United States. Have you heard more about recent—and ominous—test firings of ballistic missiles by North Korea or the marital or financial woes of any one of a dozen Hollywood stars? Would it, sad to say, be easier for most Americans to name the nine starters on their favorite baseball team or the nine Justices of the U.S. Supreme Court?

We might be able to muddle along wrapped in our oblivious self-absorption a bit longer, but I fear a day of reckoning is at hand that we are wholly unprepared to meet because many of us can see no further than the tips of our own lovely noses. This will be too bad for us—and for the generations to follow who will likely be stuck with cleaning up the many problems we happily ignored while updating our Facebook pages.

What Can Be Done To Improve Teaching In Our Public Schools?

If you Google the terms “Teacher Shortage” and “Teacher Turnover”, your hits will light up rather forebodingly. Obviously local conditions affect individual districts in a variety of ways, so not all schools or regions are suffering to the same degree. However, there does seem to be a fairly broad-based national problem of recruitment and retention of K-12 teachers that is becoming yet one more problem affecting our public schools.

Talented individuals leave the teaching profession—or avoid it altogether—for a variety of reasons. Poor pay, stress, lack of professional support, workplace dysfunction, administrative micro-management, long hours of bureaucratic busywork, disrespect and abuse from students and their parents, and many other factors have made it difficult—and will continue to make it difficult—to recruit and retain top-quality elementary and secondary educators. Moreover, our current model of training and credentialing teachers is costly and time-consuming, yet it still leaves many graduates lacking the basic pedagogical and student management skills necessary for creating a respectful and successful classroom environment.

I remember my own trip through “teacher education” after I left the advertising business—what an incredible waste of time and money. Although alternative teacher training programs obviously exist, the profession is still dominated by the traditional model of teacher training and licensing, which—despite its documented shortcomings—persists because it is a cash cow for colleges and universities and creates lots and lots of jobs for local and state education bureaucrats.

The meandering and pointless journey through Ed School coursework that often seems only tangentially related to actual classroom practice also serves as a gigantic disincentive to mid-career entry for those who can bring real world experience to their teaching. This seems an obvious drawback—and it is generally acknowledged to be so—but we still blindly press forward with a model that pushes twenty-two-year-old young adults who have done nothing much other than sit in a classroom for their entire lives into yet another classroom—where they will now try to educate our children. Doesn’t make much sense, does it?

Even worse, the pot of gold at the end of the rainbow no longer seems attractive enough to retain many of those who go through the process—as evidenced by the many problems with recruitment and retention. What can be done?

I have a modest proposal… four, really.

Close all Colleges of Education.

Why continue to support a system that produces graduates who often don’t turn out to be very good teachers—or who quickly quit the profession altogether because they just can’t cut it? Our schools of “mis-education” are clearly not up to the job, and noodling around the edges with some cosmetic changes after year upon year of study and discussion is just a further waste of time, money, and human capital.

A simpler and more direct system of teacher training and licensure can certainly be devised, but it will run into a brick wall of bureaucratic resistance unless the public is willing to push for change. Obviously, I believe our students and society will benefit if we improve and streamline teacher training, but I fear that the political will—and this is all about politics—to do what is necessary is lacking. As long as one of our two major political parties is a wholly-owned subsidiary of the National Education Association, bold and inventive thinking will certainly be discouraged by many.

Require that all new teachers have worked in jobs outside of education for five years before they step into a classroom.

Maybe I’m just looking at this all wrong, but I want people in my public schools who have lived some kind of life outside of a classroom before teaching my children. Work in a restaurant. Sell insurance. Join the Army. Presenting our public schools with truckloads of fresh college graduates who more than likely still had their moms doing their laundry the week before they begin teaching for the first time is kind of insane. We need individuals with a little grit under their nails and life experience under their belts so they are better prepared to deal with the challenges that now daily face our K-12 teachers.

License teachers based on proven performance—not college or continuing education credits alone.

Yes, teachers should go to college—and preferably graduate school—to earn degrees in the subject area that they plan to teach (no more “education” majors, please!). In addition, teachers should continue to sharpen their skills with classes and workshops throughout their careers.

However, how cool would it be to unlock the schoolhouse doors and get some real world experience into our nation’s classrooms? What parent wouldn’t be thrilled to have a chemist teaching their child Chemistry, a Physician Assistant teaching anatomy in their local high school, an editor helping their child learn how to write, or a local farmer showing that lucky student the practical aspects of how the business actually works? The possibilities would be endless, but it might obviously endanger an ossified status quo that likes everything just the way it is because it privileges paper credentials over job-proven competence.

Community colleges, for example, make outstanding use of practitioners in their classrooms—which is a great benefit to the students who are there to learn the skills they need to succeed. Recruiting teachers from the workplace is a proven winner at the two-year college level, so why not extend this strategy to K-12?

If these working professionals can teach at their local public school only part of the day or the year, offer them a prorated salary and don’t waste their time making them jump through a million hoops in order to share their valuable experience—and please don’t assign them to bus duty or lunch supervision. We desperately need more practitioners and fewer pretenders in our classrooms—particularly in our middle and high schools where content knowledge is so important. Assign the brain-dead busywork to minimum wage hires or parent volunteers.

End teacher tenure and pay based on seniority.

I know changing to a free market for teacher hires runs counter to the civil service model that has dominated public education for many, many decades, but it would be a game changer. It would encourage excellence, create desperately needed fluidity in the job market, and incentivize mid-career entrants who could bring job skills and life experience into the classroom. If we start to pay teachers based on their value rather than how many years they’ve sat in a school building—a measure that is typically divorced from actual performance—we can start to address the many problems caused by our highly uncompetitive system.

Can’t find a Math teacher for your district? Hire a local engineer who can show students how mathematics is used in the real world—and has the actual work experience to teach it. Need a Business teacher? Hire a manager at a local manufacturer—and also build a bridge with a local employer that might hire your graduates. Have a truly wonderful Music teacher you want to hold onto? Find a way to adjust their duties so they’ll want to stick around—instead of treating that talented person like just another replaceable cog.

In other words, instead of running public schools like hermetically-sealed vaults, open the doors and innovate. It will be scary as hell for some and drive the “edu-crats” crazy because they will lose control—and perhaps their jobs as well—but the alternative is to continue to sacrifice our children on an altar built out of rulebooks and dusty theories about education that do nothing but ensure that little learning actually happens. Think this isn’t true? Google the “college and career-ready” or “college preparedness” statistics for your local school or entire state and decide for yourself. The actual data can be a real eye-opener.

I think the time for a real education revolution is long overdue—and maybe we are now ready for it!