Who Will Teach Our Children?

The so-called “school to prison pipeline” has been a significant aspect of many discussions among education policymakers over the past several years. The idea that overly harsh or capriciously applied school discipline policies are priming students to fail later in life has led to a variety of local, state, and federal initiatives and laws designed to reduce the number of suspensions and expulsions meted out for even the most flagrant and repeated infractions of school rules. Those who support this new direction—which is a stark contrast to the “zero tolerance” policies of only a few years ago—are certain that a less consequence-laden environment will benefit a broad spectrum of our public school students.

I have always questioned the underlying logic of this new approach. Back in 2016, when the passage of SB 100 here in Illinois mandated a reduction in school punishments, I was not the only educator who wondered about the outcome, and I shared my concerns in a commentary published on my own blog and elsewhere titled “Illinois Is Trying Out A New School Discipline Law, But Will It Make Schools Safer?” Although I am certain there are many who still advocate for these new policies, the serious and ongoing teacher shortages here in Illinois, which now affect 80% of the districts in our state, have been exacerbated by teachers leaving the profession in droves. This speaks to a crisis that many studiously choose to ignore.

However, teacher shortages are not only an Illinois problem. National statistics show that far fewer college students are majoring in education—and efforts to increase the pool of teachers through alternative certification programs have had only a marginal impact. Many districts struggle to even keep enough substitute teachers on board to cover normal daily teacher absences.

Proposals to increase teacher salaries will hopefully encourage some to consider careers in education, but I do not believe a few more dollars in pay is going to be the magical incentive that many believe it will be. Except for a relative handful of egregiously overpaid administrators, K-12 education has never been a road to riches. Looking back over time, very few people became teachers because they were expecting stock options. Most entered the field—and stuck with it—because they enjoyed their students and derived great personal satisfaction from helping young people to learn in a safe and respectful school environment.

How much has this changed in today’s classrooms? National statistics from 2015-16, which I am certain grossly underreport the problem, indicate that 5.8% of teachers were physically assaulted by their students, and close to 10% were threatened with physical injury. These statistics fail to capture the ongoing and pernicious psychic toll of the rude, insulting, and slanderous treatment that so many teachers must endure from students—who know the consequences for their misbehavior will be slight. Too many teachers can tell depressing stories of students being sent to the principal’s office after unloading a tidal wave of curse words—only to be sent right back to do it again. If, by chance, the student is actually punished, teachers often are then subjected to harsh criticism from a parent—one who will think nothing of continuing to harass that teacher online or troll them on social media.

In addition, the inevitable outcomes of decades of broken homes and societal dysfunction also land right on the school doorstep each day. Students who are depressed, traumatized, or abused are now a daily facet of the work lives of many teachers, who are given neither the tools nor the training to deal with problems that in many cases legitimately warrant hospital care. Throw in a smattering of pregnant students or teen parents, add a smidgen of suicidal ideation in essay assignments, a dash of cognitively damaged children, a splash of prescription and illegal drug use, and a soupçon of sexually aggressive and inappropriate classroom behavior, and a reasonable individual might wonder about the sanity of their career choice. Oh, we should not forget about all those “non-working” hours at home and over the summers that are consumed with grading and lesson planning. Why would you not stick around in the classroom—for thirty or more years?

Let’s have a reality check: Is the promise of, say, a 5% raise really going to persuade our nation’s overworked and overstressed teachers to stay in the classroom? The price increases for Chardonnay and Xanax alone run far ahead of what cash-strapped districts can possibly offer to attract and retain effective teachers, who now can add the remote—but still frightening—potential for school shootings to their already expansive list of worries.

Sadly, what would likely convince more teachers to stay in the classroom is what most school districts are least likely to provide: tougher discipline policies that include long suspensions or expulsions for repeat or flagrant offenders. Most teachers would like a raise (Who wouldn’t?), but most would likely much prefer a safer and more respectful classroom and school environment where they can focus on doing their jobs without fear of a student throwing a chair at their head, cursing them out, or miming oral sex with a knowing smirk on their face. Continuing to condone misbehavior out of some misguided desire to end the fabled “school to prison pipeline” robs the students who actually want to learn of their educations, reinforces the worst behaviors by a handful of students—and drives all but the most desperate or masochistic from the teaching profession. It is not the job of our nation’s teachers to be punching bags, and fatter paychecks will not solve our rapidly worsening teacher shortages.

We need to rethink both the daily practices and long-term goals of our nation’s public schools if we expect the system to survive. If we do not, the problems will only worsen.

A Supreme Problem

The three co-equal branches of the United States government—executive, legislative, and judicial—each have their roles to play in the management and mission of our nation. However, the federal judiciary and its judges, whose role current Chief Justice John Roberts famously (and perhaps disingenuously) characterized as one of simply “calling balls and strikes” regarding the matters before them, have until recently clung to an air of impartiality—but those days are now gone.

People who study the Supreme Court assert that 5-4 split decisions are no more common than they once were, but now every close or controversial decision has become another component of the partisan battles that are the background music of our hyper-politicized nation. Moreover, the celebrity, notoriety, and visibility of today’s Supreme Court justices invite speculation regarding their personal and legal agendas. Unfortunately, the near anonymity that the justices once cultivated has been replaced by a public advocacy for which those on both sides of the many issues dividing the Court and our country are equally culpable.

It would have been much better if the late Justice Antonin Scalia had been a little less fond of celebrating his own conservative viewpoints and linguistic cleverness in his speeches and writing. Justice Ruth Bader Ginsburg—the “Notorious RBG” to her fans among liberals—foolishly interjected the Supreme Court into electoral politics in 2016 by openly criticizing the candidacy of Donald Trump and joking about moving to New Zealand if he were elected.

The abandonment of the circumspect silence that was once the glory of those who served on our nation’s highest court has thrilled some advocates, but it has also served to reduce the status and credibility of this branch of our government. This disintegration of the dignity once associated with the Supreme Court is evident in the ever more contentious confirmation battles of the past couple of decades. Supreme Court nominations are now yet one more piece of raw meat for partisan attack dogs to fight and growl over—and the perceived integrity of all our judicial processes is harmed as a result.

All of this makes me wary of the upcoming fight over seating a replacement for Justice Anthony Kennedy, who announced his retirement from the Supreme Court this week. Due to his unique position as the swing vote on so many cases before the court during his thirty-year tenure, his replacement will likely become the deciding factor in a great many 5-4 split decisions in the years—and perhaps decades—to come. Given what is at stake, partisan fervor regarding the confirmation of President Trump’s nominee is likely to rise to levels that will make all our other fractious arguments seem mild by comparison. The net effect of this pitched combat will be to cement the public perception of the Supreme Court as just another governmental outpost of politicized and polarizing discord, which will likely irreparably damage its already tattered status and cause it to lose more of its most precious asset—the nation’s trust.

Given the vast and often unbridgeable social, political, cultural, religious, economic, and regional divides in our nation at the present time, it is not surprising that our nation’s courts have been asked to arbitrate the fights around the table at Thanksgiving. Because so many disagreements do not easily lend themselves to compromise—a woman cannot, for example, have half an abortion—and communal values have been largely replaced by assertions of unfettered individual rights unprecedented in history, judges are more and more trapped in the unenviable position of acting as the arbiters of our nation’s morals. Setting aside the basic reality that humans tend to disagree about everything, this task is made yet more thankless and impossible by the fact that significant segments of our population are openly and loudly averse to the very idea of morality, viewing it as either a vestigial annoyance or a pointless guilt trip.

Courts can—and should—mediate regarding the application of laws, but can—or should—the courts continue to mediate in ever more granular and quotidian aspects of our daily lives? The evidence would tend to suggest they should not, but our nation’s courts have, nonetheless, tried their best to solve the conundrum of differing moral and ethical values by simply granting more and more “rights” that are divorced from any notion of responsibility. The problem with this approach—which has become more and more obvious over time—is that trying to create a civil society by allowing everyone to do as they please is like trying to fix the economy by printing more money. A period of euphoric happiness follows, but an inevitable and catastrophic crash will ensue—and the problems that follow are certain to be beyond easy or painless remedy.

We now live in a rudderless nation where we are free to be as self-centered, spoiled, and entitled as we want without fear of either consequence or rebuke from individuals, institutions, or government. To express even the mildest disagreement with the behavior of others is today a sure sign of hateful intolerance—which must, of course, be adjudicated through the courts. To a certain extent I suppose inventing more and more rights is wonderful new business development for lawyers and judges, but it is also guaranteed to facilitate every sort of dysfunction, infuriate those who act responsibly, and destroy any sense of community and common purpose by privileging the few at the expense of the many.

Supreme Court nominations matter. The tone the Justices set for the entire judiciary matters. However, unless the rulings at all levels of the courts re-establish some balance between what individuals contribute to society and what society can reasonably provide to individuals, expect the worst.

Secrets and Lies

The recent arrest of a former Senate Intelligence Committee staff member—a veteran of almost 30 years in government service—on charges of lying to FBI agents investigating leaks of classified information surprised some.  However, what really churned the waters was the concurrent seizure of the phone and email records of The New York Times reporter to whom he had been allegedly leaking—but with whom he was most definitely having sex.  They don’t call Washington “The Swamp” for nothing.

This incident and so many like it speak to the inherent tension between government secrecy and a free press in a democracy.  That which government would prefer remains hidden has always been catnip for reporters, but it appears more and more the case that a symbiotic and worrisome relationship has developed between those in government and those working in the press—each seemingly tethered to fewer and fewer institutional norms or traditions.  Given that government cannot operate effectively in a glass house, both the leakers and those reporters who are anxious to disseminate secrets are playing a dangerous game that could have catastrophic consequences.

We generally find government information falls into three broad categories.  

First, we have information that can and should be made readily available to all: the cost of contracts, specific legislative and regulatory actions, court rulings, or initiatives of the Executive branch are obvious examples of information that is critical to the smooth functioning of democratic processes.  There are also categories of information that need to be carefully evaluated before they are made public; troop movements in wartime and active criminal investigations are obvious examples.  We don’t want to either compromise military operations and put lives at risk or allow crooks to escape before they can be apprehended and put on trial.

There remains, however, a third category of information that causes the most practical and ideological problems in an open and democratic society: that which cannot be revealed under any circumstances without causing perhaps irreversible harm to our nation and its people.  

The very existence of this final category of information is offensive to those who believe in absolute government transparency and deeply distrust the idea of government secrecy.  It must be acknowledged that the United States government—like every government in history—has sometimes tried to drop a veil of secrecy over information that would reveal neglect, malfeasance, or plain stupidity.  The question then arises whether revealing this information serves any public good or just causes further damage by either unnecessarily eroding public trust or politicizing what are, in the final analysis, nothing more than instances of human weakness or misjudgment.

Likely the two most famous examples of closely held secrets revealed during the course of my own lifetime are the publication of the so-called “Pentagon Papers”, which allowed the general public to read the unvarnished political and military deliberations concerning the conduct of the Vietnam War, and the revelations surrounding President Nixon’s role in encouraging spying upon—and sabotage of—his political opponents, which led to his resignation in the face of near-certain impeachment.  

In both of these cases the news media decided that our country and our citizens were best served by revealing the secrets and lies of our government officials.  We saw a long-term drop in our faith in government as a result—which is either healthy or harmful, depending on your point of view—but the issues at hand were clearly pertinent to both public policy and the operations of democratic government, so we needed to know the truth.  However, the facts associated with each case had far-reaching and long-term consequences for our country, so the editorial decisions to publicize this information were made only after long and careful internal deliberations concerning the complex balance between press freedom and our national interests.

That was then—and this is now.

Over the past 30-40 years journalistic standards have joined floppy disks on the scrap heap of history.  Our internet-driven 24/7 news cycle has produced a crazed bazaar of half-truths and one-sided opinions presented as facts.  As articles regarding personalities and perceptions—and snarky reactions to both—have continued to crowd out simple reporting in the quest for clickbait, any sense of proportion and decency has more and more been discarded.  Hence, “news” has devolved into just one more facet of our wacky entertainment culture rather than an enterprise where careful fact-checking and an unbiased presentation—combined with a deeply entrenched sense of reportorial responsibility—are considered normal and laudable.

Imagine, for example, if our current journalistic practices had been in place in the past.  Would the Manhattan Project, which developed the first atomic bombs during World War II, have stayed off the front pages of The Washington Post for long?  Would news websites be breathlessly reporting every twist and turn of the Cuban Missile Crisis based on leaks and the wildest unsubstantiated speculation—thereby driving our world even closer to the brink of nuclear war?  On a less elevated level, would some mistress of President Kennedy be providing a slurp by slurp account of their liaisons to 60 Minutes or The Tonight Show—perhaps while simultaneously hawking her new web store with its own line of “Presidential” lingerie for sale?

We need a responsible and inquiring press in a democracy—and many news outlets are still doing important investigative reporting that provides necessary accountability for government and government officials.  However, the disdain much of the American public feels toward journalism and journalists—which President Trump channels and amplifies for his own political purposes—is a direct outcome of the damage done by reporters who have turned themselves into partisans and provocateurs in order to advance their own careers.

There is an old saying in Washington: “Those who know don’t talk, and those who talk don’t know.”  We can add a codicil to this saying that is both a reflection of today’s reality and a warning: “and the public doesn’t know why so much talk leaves them knowing nothing at all….”

Jimmy Obama?

Leo Durocher, the baseball player and manager, once famously observed that “nice guys finish last”. His contention was that winning required one to get down in the dirt—and play dirty—when the situation required it.

I’ve been mulling over this comment as I consider the presidencies of Jimmy Carter and Barack Obama. Both had their successes and failures. Each was considered by his contemporaries to be a very smart and able individual—as well as a very good and compassionate man. Each was, however, replaced by a successor who ran as his antithesis, promised an American economic renewal and a shrinking of government power—and set about methodically erasing his imprint upon our nation.

Jimmy Carter’s first and only term found him clashing frequently with the entrenched powers in Washington and beyond, and his manner was frequently mocked by his political opponents, who characterized him as weak—and at times condescending. His presidency was sidetracked by the seizure of hostages at the American embassy in Tehran, and his chances for re-election were dealt a mortal blow by the catastrophic failure of the military rescue mission of those American hostages in 1980, which ended in an ignominious helicopter crash in the Iranian desert. Despite voter doubts about the bellicose temperament and character of his opponent in 1980, Ronald Reagan won a smashing victory, and over the course of his two terms in office put a deeply conservative stamp on domestic politics while pursuing a massive military buildup, high risk foreign policy adventures, and deregulatory actions that ushered in an unprecedented economic boom.

Unlike Jimmy Carter, Barack Obama was both embraced and celebrated by the entrenched establishment in Washington and beyond, and his smooth style and golden public speaking won him great favor with the nation’s media and entertainment elites. Seen as a new type of leader whose personal qualities transcended the muck of mere politics, he was able to inspire his allies—but he often foundered when the need for bare knuckle, backroom deal making was required to bully his opponents into submission. Gliding above the fray with a deeply cerebral (and at times condescending) manner, he often resorted to the use of executive orders rather than legislation to pursue his policies—certain that his legacy would be secured by the sheer infallibility of his ideas.

However, despite voter doubts about the bellicose temperament and character of his hand-picked successor’s opponent in 2016, Donald Trump won a smashing victory, and over the course of his (very possible) two terms in office will put a deeply conservative stamp on domestic politics while pursuing a massive military buildup, high risk foreign policy adventures, and deregulatory actions that have already ushered in an unprecedented economic boom.

Coincidence?

Direct comparisons between the presidencies of Jimmy Carter and Barack Obama are difficult because they were operating in completely different political environments with completely different economic and strategic challenges. I still remember wrapping myself in a blanket in order to study in my freezing college dorm room because oil was so ruinously costly and scarce that heat was considered a luxury; it is a far different energy environment today—America is now one of the world’s largest exporters of oil. Old school corporate and industrial muscle still ruled during the Carter presidency; Barack Obama was the “information economy” President who claimed American manufacturing jobs were gone for good, so everyone now needed to learn how to code. Jimmy Carter still had to contend with a confrontational and expansionist Soviet Union; the revamped Russian Federation is a still powerful but far less threatening presence—now our eyes are turned to the dangers posed by China and North Korea.

Jimmy Carter has built an influential and successful ex-Presidency that focuses on peacemaking, the eradication of tropical diseases, and building—often with his own hands—houses for the poor and homeless. However, our airport in Washington is Reagan National, and we still refer to the “Reagan Revolution” as part of our political discourse; for all his good intentions, Jimmy Carter often now feels like a placeholder rather than a President.

It is too early to assess the legacy of President Obama, but his penchant for executive orders over the hurly-burly of legislation left many of his signature accomplishments subject to reversal at the stroke of a pen. Just sketching some of the highlights of Donald Trump’s first 500 days in office offers a jaw-dropping litany of stark policy changes:

• The Paris Climate Accord—Gone
• The Iran Nuclear Agreement—Gone
• The Trans-Pacific Partnership Trade Deal—Gone
• The Affordable Care Act Individual Mandate—Gone

In addition, conservative federal judges (including one Supreme Court Justice) have been confirmed in record numbers, and President Obama’s patient and cooperative approach to economic and military affairs has been replaced by an extraordinarily combative style that is challenging international norms regarding long-established trade agreements with our allies—and is driving nuclear North Korea to the negotiating table with open threats of “annihilation”. Commentators have now begun referring to “Trump Time” to describe the hyper-accelerated pace of so much of his Presidency so far.

It is far too soon to evaluate the ultimate political impact of Barack Obama’s two terms in office, but it is perhaps not too early to wonder if his legacy will consist almost wholly of being our first African-American President. The anger that so many liberals feel over President Trump’s reversals of President Obama’s accomplishments is perhaps more and more tinged with fear. His brutal—and at times brutish—Presidential style has already reshaped the political landscape of our nation in ways that will be felt for generations to come, and the prospect of a second term—or even the mere completion of his first—fills his political opponents with terror.

Nice—Donald Trump is not. He may, however, be the living incarnation of Leo Durocher’s aphorism, and he could condemn Barack Obama to Jimmy Carter’s fate: a very nice guy whose Presidential legacy was gleefully stomped on by his successor—and then discarded.

The Consequence Of No Consequences

If there is any connective tissue between the many scandals and strife that fill our world today, it is this: People sure do hate being judged.

This is, of course, a very human reaction. Trying to bluster one’s way out of difficulty by proclaiming your actions were either innocent or misunderstood—which is, of course, sometimes true—has probably been a facet of human behavior from the dawn of civilization. However, what has now become a conspicuous characteristic of our troubled times is that both a belief in our own blamelessness and an embrace of utter shamelessness are now woven into the fabric of our modern culture.

A component of this is certainly based on our ongoing societal and political efforts to relegate shame to the dustbin of human history. Given that we now pretty much determine for ourselves what is right or wrong because the concept of social norms tends to annoy many, the only way you can really find yourself in hot water these days is to be critical of another person’s behavior. To attempt to cause anyone to feel shame is—ironically enough—considered shameful. This circular bit of ethical entrapment disables any possible discussion of right and wrong because, as is now the dominant doctrine in many quarters, right and wrong are nothing but social constructs meant to oppress us. Thankfully, we seem at least able to agree that child abuse is wrong, although even this issue collides on occasion with our desperation to celebrate non-Western or non-traditional child rearing practices.

Think about the news or commentary that we all read on a regular basis. It is incredible how often the stories today are less about actual events and more about criticisms of the reactions (or lack thereof) by others. As a result, we find ourselves trapped in an echo chamber of denunciations, which allows us to avoid any thoughtful discussion of blame, shame, or culpability. If those who disagree with us are themselves bad—because they either criticized us or failed to properly exalt us—we are able to deflect any shame our actions might bring and be held blameless. This is, unfortunately, a perpetual motion machine of insult and outrage that contributes very little to problem-solving but does much—far too much—to degrade and demean our public discourse.

The net outcome of these deflections of blame and shame is that all discussions dissolve into debates about whose interests are being helped or harmed—our lives reduced to nothing but a series of transactions devoid of values—and no effort is expended examining the basic morality of the actions or intentions of the parties involved.

A telling example of the confines of our cultural and political norms at the present time is the anger that erupted over the passage of a package of federal laws known as FOSTA-SESTA, which now holds websites liable for advertising sexual services online. Opponents of these laws lament that sex workers will find themselves at greater personal risk and suffer professional inconvenience because they can no longer advertise their services easily and cheaply through the internet.

Lost in all the discussion of the law’s impact, which has been immediate and substantial, was perhaps a more fundamental issue few wanted to discuss because it would be considered judgmental or—to use a favorite expression of many—“slut shaming” of a subset of women who are, after all, simply trying to make a living: Does our nation have an obligation to facilitate—and therefore tacitly legitimize—the world’s oldest profession, prostitution?

Is it possible in today’s America to simply say that prostitution is immoral and damaging to all involved? Would we ever expect those in charge of our major news and media outlets in New York and California to criticize or condemn prostitutes and prostitution in an effort to improve public and private morals and behavior? Such questions are considered so old-fashioned and retrograde by those who sit at the pinnacles of our elite sources of opinion and commentary as to be unworthy even of note. Imagine if the New York City Police Department and FBI were to launch a crackdown on prostitution—which seems extraordinarily unlikely. Would The New York Times, for example, endorse this effort or resort to running sympathetic profiles of all the valiant women being persecuted by police and prosecutors for simply plying their trade?

Morality is, of course, a tricky business, and over the past several thousand years of civilization we have expended incredible time and energy attempting to distinguish right from wrong. Our ideas of what is moral and what is not have certainly undergone some revisions—but much of the essential framework has remained the same. Ignoring discussions of morality and immorality because they might make some feel uncomfortable or judged for their beliefs or behavior is a foundational problem that afflicts broad swathes of our nation and might explain the persistence and magnitude of at least some of the issues afflicting many communities, families, and individuals.

There are, to be sure, many difficulties we must today address, but most will likely remain unresolved if even the most basic issues of right and wrong are banned from the discussions because they might make some feel excluded—or bad about themselves. Perhaps this needs to change.