Who Will Teach Our Children?

The so-called “school to prison pipeline” has been a significant aspect of many discussions among education policymakers over the past several years. The idea that overly harsh or capriciously applied school discipline policies are priming students to fail later in life has led to a variety of local, state, and federal initiatives and laws designed to reduce the number of suspensions and expulsions meted out for even the most flagrant and repeated infractions of school rules. Those who support this new direction—which is a stark contrast to the “zero tolerance” policies of only a few years ago—are certain that a less consequence-laden environment will benefit a broad spectrum of our public school students.

I always questioned the underlying logic of this new approach. Back in 2016, when legislative passage of SB 100 here in Illinois mandated a reduction in school punishments, I was not the only educator who wondered about the outcome, and I shared my concerns in a commentary published on my own blog and elsewhere titled “Illinois Is Trying Out A New School Discipline Law, But Will It Make Schools Safer?” Although I am certain many still advocate for these new policies, the ongoing and serious teacher shortages here in Illinois, which now affect 80% of the districts in our state, have been exacerbated by teachers leaving the profession in droves. This speaks to a crisis that many studiously choose to ignore.

However, teacher shortages are not only an Illinois problem. National statistics show that far fewer college students are majoring in education—and efforts to increase the pool of teachers through alternative certification programs have had only a marginal impact. Many districts struggle to even keep enough substitute teachers on board to cover normal daily teacher absences.

Proposals to increase teacher salaries will hopefully encourage some to consider careers in education, but I do not believe a few more dollars in pay is going to be the magical incentive that many believe it will be. Except for a relative handful of egregiously overpaid administrators, K-12 education has never been a road to riches. Looking back over time, very few people became teachers because they were expecting stock options. Most entered the field—and stuck with it—because they enjoyed their students and derived great personal satisfaction from helping young people to learn in a safe and respectful school environment.

How much has this changed in today’s classrooms? National statistics from 2015-16, which I am certain grossly underreport the problem, indicate that 5.8% of teachers were physically assaulted by their students, and close to 10% were threatened with physical injury. These statistics fail to capture the ongoing and pernicious psychic toll of the rude, insulting, and slanderous treatment that so many teachers must endure from students—who know the consequences for their misbehavior will be slight. Too many teachers can tell depressing stories of students being sent to the principal’s office after unloading a tidal wave of curse words—only to be sent right back to do it again. If, by chance, the student is actually punished, teachers are often then subjected to harsh criticism from a parent—one who will think nothing of continuing to harass that teacher online or troll them on social media.

In addition, the inevitable outcomes of decades of broken homes and societal dysfunction also land right on the school doorstep each day. Students who are depressed, traumatized, or abused are now a daily facet of the work lives of many teachers, who are given neither the tools nor the training to deal with problems that in many cases legitimately warrant hospital care. Throw in a smattering of pregnant students or teen parents, add a smidgen of suicidal ideation in essay assignments, a dash of cognitively damaged children, a splash of prescription and illegal drug use, and a soupçon of sexually aggressive and inappropriate classroom behavior, and a reasonable individual might wonder about the sanity of their career choice. Oh, we should not forget about all those “non-working” hours at home and over the summers that are consumed with grading and lesson planning. Why would you not stick around in the classroom—for thirty or more years?

Let’s have a reality check: Is the promise of, say, a 5% raise really going to persuade our nation’s overworked and overstressed teachers to stay in the classroom? The price increases for Chardonnay and Xanax alone run far ahead of what cash-strapped districts can possibly offer to attract and retain effective teachers, who now can add the remote—but still frightening—potential for school shootings to their already expansive list of worries.

Sadly, what would likely convince more teachers to stay in the classroom is what most school districts are least likely to provide: tougher discipline policies that include long suspensions or expulsions for repeat or flagrant offenders. Most teachers would like a raise (Who wouldn’t?), but most would likely much prefer a safer and more respectful classroom and school environment where they can focus on doing their jobs without fear of a student throwing a chair at their head, cursing them out, or miming oral sex with a knowing smirk on their face. Continuing to condone misbehavior out of some misguided desire to end the fabled “school to prison pipeline” robs the students who want to actually learn of their educations, reinforces the worst behaviors by a handful of students—and drives all but the most desperate or masochistic from the teaching profession. It is not the job of our nation’s teachers to be punching bags, and fatter paychecks will not solve our rapidly worsening teacher shortages.

We need to rethink both the daily practices and the long-term goals of our nation’s public schools if we expect the system to survive. If we do not, the problems will only worsen.


The Waste Land

Philip Roth recently died. During his long career as a novelist, he won every major award for his work except the Nobel Prize, and he is considered one of the preeminent writers of the late 20th century. However, with all due respect to Mr. Roth’s life and career, I don’t believe very many people outside of the rarified literary salons of the Boston-Washington corridor or a handful of PhD programs elsewhere actually read many of his novels—and he is an apt symbol for the wrong turn our cultural elites took in the post-WW II period.

In order to quickly illustrate my point and avoid a protracted explanation, please allow me to quote directly from Mr. Roth’s obituary in The New York Times: “His creations include Alexander Portnoy, a teenager so libidinous he has sex with both his baseball mitt and the family dinner, and David Kepesh, a professor who turns into an exquisitely sensitive 155-pound female breast.”

How could he have failed to win the Nobel Prize for Literature, you might well ask….

The literary novel—which was once, a long time ago now, built around characters wrestling with weighty matters of personal or social morality—has surrendered its purpose and lost its way. Our prevailing creative norm—in not only novels but movies and television as well—is now to sanctimoniously celebrate the triumphs of individuals over those family, foes, or institutions that fail to allow them to live just as they please. For an audience apparently content to be reassured that anyone who might pass moral judgment is simply hateful, this is somehow sufficient to make a story. Hence, there are generations of readers who, for reasons surpassing all understanding, find it entertaining that Holden Caulfield, the teenaged narrator of The Catcher in the Rye, calls every adult he meets a “phony”. When I had to inflict this novel on my own high school students, I sometimes wondered why this was considered a good use of instructional time, but keener minds than mine had long before determined this was a literary classic worthy of their attention.

The dramatic tension inherent in parsing issues of right and wrong (concepts utterly alien to much of our culture today) once gave the novel its power and cultural significance. Today these are reduced to a predictable polemic pitting the pure-hearted protagonists against an oppressive society that fails to properly recognize their uniqueness and sensitivity. It is little wonder that so much of our artistic output is now snark, pastiche, meta-fiction, satire—or comic book superheroes. To simply and seriously discuss the many complexities of morals or values today is to be hopelessly old-fashioned and overly judgmental.

Imagine our literary classics rewritten for our tolerant—and tech-savvy—modern world. Prince Hamlet today would be furiously and ineffectually tweeting about what a jerk his stepfather was, Ophelia would simply sext with Hamlet behind her father’s back, and Queen Gertrude would be busily working on her next palace podcast about her wonderful remarriage and her own journey of personal self-discovery. Given that all choices are now equally valid and correct, there would be no need for dramatic resolution. Everyone could simply do what they pleased, secure in the knowledge that their individual choices were unassailable, and we could sit back and enjoy the farce inherent in blowhards like Polonius futilely attempting to rein them all in. Ha-ha-ha.

Individual wants and needs are, of course, important; I am not advocating for a world run according to a hive mind mentality that neglects the critical importance of individuals within a larger community or society. However, there comes a point when a single-minded emphasis on individual wonderfulness becomes an empty intellectual exercise because it eventually will exclude any notions of shared duty or self-sacrifice for the common good—which, inconveniently enough, are necessary for a functioning and healthy society.

Adolescent self-satisfaction is, sad to say, now our predominant cultural characteristic, and just as any teenager typically does, we get awfully surly when someone points out that our selfish self-focus might be negatively affecting others. As much as we might want to sit in our rooms and just ignore all those other pesky people in our lives who somehow seem not to understand the importance of our needs, we do sometimes have to acknowledge the needs of others. It sucks, I know, but that’s what adulthood is all about. I might be ruining someone’s day by pointing this out, but a country composed of preening and self-involved individualists can cause as much damage to its citizens and their overall well-being as the most oppressive totalitarian state.

Please allow me to offer another related radical suggestion: That which is outré is not necessarily interesting or worthwhile. Circus “freak shows”, a blessedly discarded component of our entertainment culture, at one time offered viewers a chance to gawk at the physically afflicted. Sadly, we have not progressed much beyond this. Our late 20th and early 21st century cultural and artistic life has become overly enamored with the notion that examining characters and ideas occupying the fringes of our society will reveal heretofore untold truths about ourselves, an approach that, like the circus freak show, offers titillation but no illumination.

Which brings me back to modern literature, which has managed to write itself into irrelevance by mistaking the bizarre and obscure for the profound and life-affirming. There is a reason that so many still love the plays of William Shakespeare, find life lessons in the Iliad and Odyssey, revel in the novels and short stories of F. Scott Fitzgerald, or continue to lose themselves in the adventures of Arthur Conan Doyle’s fictional creation, Sherlock Holmes. These works have survived the test of time because they engage with our minds and souls rather than attempting to shock and repel the average reader. Even those characters who are less than admirable are presented as fully formed—but deeply flawed—human beings rather than two-dimensional caricatures of corruption and dysfunction.

If you want people to read your books and—perhaps more importantly—you want your work to be part of our daily cultural dialogue, it might be worth giving your readers a reason to continue to turn the page. Setting up straw men and knocking them down might be satisfying on some simplistic level, but it will only rarely sustain reader interest over the long term because there is no recognition of the difficulties that even the most seemingly insignificant life choices entail. Having your main character furiously masturbate into a piece of liver his family will later consume will shock us—but there is no knowledge or insight to be gained beyond this.

Spiritually and morally bankrupt cultures often privilege the sensational over the conversational. Good authors realize this. The “Two Minutes Hate” in George Orwell’s 1984 existed in a fictional culture devoid of humanity. The “feelies” in Aldous Huxley’s Brave New World were mass entertainment that stimulated rather than engaged their emotionally empty audiences. Our own two-minute hates and feelies—now brought to us by our major literary publishers as well as cable television and the internet—are signs of how spiritually and morally bankrupt our culture has become, and we need to seriously discuss just how we can move literature and entertainment back in a direction that can again engage a mass audience in a broader discussion of the values that inform our lives.

A Modest Proposal For Our Public Schools

We live in the age of “big ideas” regarding how we can improve K-12 education in America.

We need personalized learning. Flipped classrooms would help. Teachers and students need to practice mindfulness. We could use more classroom technology—or perhaps less. No child will be left behind. Every student will succeed. I anxiously await the Lake Wobegon Education Act of 2035, which will mandate that every child be certifiably above average.

Let’s face the hard truth right here and now: All these many, many decades of reforms later, real and lasting improvements in K-12 academic outcomes are hard to find, and much of the available evidence points to further systemic declines.

Standardized tests continue to show that huge numbers of students are failing to learn, but apparently we should pay no attention to these test scores because they are nothing but a “snapshot” that fails to capture the “whole child”. As a result, hordes of high school graduates will continue to enroll in college each year—yet be wholly unprepared for college work—and flunk out after a semester or two. This is, however, not a reflection of the work being done (or not being done) at your local public schools. These danged kids must be partying too much.

Local news media—which pretty much operate as transcription services these days—will continue to report that their public schools are doing a fine job because these local television stations and newspapers really have no alternative but to do so. To report honestly about deficient academic measures and outcomes runs the risk of angering homeowners who are worried about their property values and contractors who are equally worried that the latest school construction bond might not pass and hence screw them out of lovely, fat paychecks. Any national or governmental data on broader problems with our country’s public schools do not, of course, apply to the schools in your own community, which the local news media have assured you are doing an excellent job preparing your children for successful futures. The circular logic of it all is a wonder to behold.

However, if a child is willing to sit in a classroom—or anywhere inside the building—so that your local district can collect their daily apportionment of state tax dollars, all will be well. If a student doesn’t like to write, that child can complete an “alternative” assignment—draw something, perhaps? If a child flunks a test, there is no need for worry—the school will likely allow unlimited test retakes. Hate to take notes or study? A student need have no concerns about that—count on a “study guide” the day before the quiz that contains all the answers. If nothing else works, your child can always enroll in a “credit recovery” course where, after watching a few movies and jotting down some random thoughts, full course credit will be expeditiously granted.

There are, of course, still public schools where some standards are maintained—and more and more charter schools are opening to provide alternatives for frustrated parents and students—but the daily reality for many children and adolescents throughout the length and breadth of our nation is maximum busywork and minimum learning. These problems later wash up on the doorsteps of our nation’s beleaguered community colleges, which are expected to somehow remediate 13 empty years of schooling within the span of a single semester.

I have a suggestion so radical that to speak it out loud almost tempts a bolt of lightning to strike: Start flunking students who cannot perform to a minimal level of competence, which should translate into skills that would give that student a 50/50 chance of earning a C in a first year college course.

This does not seem an unreasonably high standard to set, and it would both bring some much-needed rigor back into our nation’s public schools and provide some reward for hard work. Our current system of striving to pass any student who can fog a mirror has turned much of our core coursework into a joke and has convinced everyone—students and teachers alike—that caring about learning is a waste of time.

Our unrealistically high graduation rates would obviously dip were we to adopt this standard throughout our nation’s schools, but those who thereafter received a diploma would at least have some assurance they possessed a good portion of the skills necessary to succeed in college or job training—and would not be condemned to a life of nothing other than the most minimally skilled jobs.

As odd as it might be to say this to those many Americans who are unaware of the diploma mills that so many of our public schools have become, implementing and sticking to this standard would entail a shock to the system akin to violent revolution. Rather than just pencil-whipping students through the grades, it would involve actual teaching, assessment, learning, and the many stresses of hard and sustained work—with no guarantee of success—that were once common in our nation’s public schools. Those teachers and administrators who cannot adjust to this new reality would need to be pushed aside, the happy nonsense that consumes so much of the average school day would need to be discarded, and both students and parents would need to face up to the fact that failure is sometimes a necessary stop on the path to actual learning.

Our other option is, of course, to continue to chase every educational fad that comes along, make excuses, and keep right on cheating many, many eighteen year olds of their futures while giving them nothing but an utterly false sense of their own competencies. A renewed commitment to teaching and learning seems an obvious choice to make, but one should never underestimate the corrosive powers of the inertia, laziness, petty politics, and bureaucratic timidity that are the hallmarks of American public education today.

Would Emergency Micro-Grants Help More Community College Students Succeed?

“Persistence” is a buzzword most educators at community colleges hear a great deal. We know too many students enter classes in the fall and—often before even the full academic year has passed—are gone from campus. Not surprisingly, a great deal of thought goes into what can be done to help students—especially the many who are older and re-entering the classroom or the first in their family to attend college—to complete their classes and secure a degree. There are a variety of ways to massage and tweak the data on degree attainment, but widely reported national percentages for completion tend to land in the low twenties. Obviously, everyone who works with community college students wants to do much, much better than this.

A great many good and helpful programs have already been implemented, and most boil down to providing more cocoon-like and intrusive advisory or educational interventions. Whether we are talking about mandated tutoring, individualized study tools, academic coaching, or even wake-up calls to encourage students out of bed in the morning, most initiatives are some variation on the theme of hand holding. Truthfully, some students who lack confidence or independent life skills need exactly this because that which would seem obvious to many—attend classes regularly, complete the required readings, ask questions in class, and take careful notes—might not be so for students who attended academically deficient public schools or have no college-educated family members or close friends to act as mentors or role models.

It is also, of course, often the case that a simple lack of college-level skills in reading, writing, and math places many students into remedial coursework where they struggle to catch up. This is a continuing and largely avoidable tragedy that speaks to our national failure to provide every child with the opportunity for a quality education. The grievous dropout rate at our nation’s community colleges will continue to be inflated by inadequate public schools as long as we insist on handing high school diplomas to the equivalent of functional illiterates.

There is, however, another category of community college dropout whose problems I believe bear closer examination: those who are adequately prepared and motivated to succeed but are dragged down by relatively minor financial circumstances beyond their control.

I am thinking of the single mother who has an unexpected expense and cannot pay her daycare provider at the moment. Unable to attend class for a week or more, she falls behind and grows frustrated. Upon her return, even though she has tried to work on her own and emailed her instructors for help, she needs to work much harder than her classmates to catch up—if she ever does.

I am thinking of the young man who has car problems and lives beyond the range where public transportation is possible. He misses classes while scrambling for a way to pay for the repair that will allow him to return to class. He emails his instructors, he knows what he is missing in class, and by the time he finally finds a relative or friend who can help pay for the necessary repair or provide transportation, the possibility of a successful semester is already slipping away.

I am thinking of the young woman who has a part-time hourly job in retail to help cover her living expenses while she is in school. Unfortunately, she falls seriously ill and misses over a week of school and work. There is no issue with her school absences beyond the assignments she needs to catch up on because she was medically excused from class, but the missed hours at work are a tremendous problem regarding her budget—so she takes on additional shifts when she is barely back on her feet to help cover her rent and food. As a result, she loses time to study and to fully recover from her illness, which has the inevitable negative impact on her wellbeing, classwork, and grades.

The three examples I have sketched from my experiences with my own community college students have a common theme: a minor financial setback becomes an academic catastrophe.

In all of these cases and many others like them, the amount of money necessary to keep these students in their classes and on track to graduate was shockingly modest—perhaps a couple of hundred dollars could have saved their semesters and helped them to succeed in school. The question I have when I see adequately prepared and motivated students fall by the wayside due to a financial glitch that is relatively minor and utterly beyond their control is this: Should community colleges “invest” a bit of money in these students today to help them to graduate tomorrow?

Compared to the staff, facility, travel, and advertising expenses associated with continuing to recruit new students to replace those who are lost—but might have been saved at the cost of a few hundred dollars at a critical juncture in their educational lives—there might be a very good dollars and cents argument to be made here. Moreover, the availability of this sort of emergency grant—some portion of which could be tailored to assist students who fit a particularly high-risk profile—could also draw more students into reaching out for other help offered by the college rather than just disappearing. If $200 for a new starter motor for a car today is going to help a student walk across the stage and collect an Associate degree a few years in the future, I cannot but believe this is a worthy—and worthwhile—expense.

I well understand the reasons community colleges will be wary of setting up programs to make emergency micro-grants. Community college trustees and state administrators would be understandably fearful of the negative publicity and investigations that would certainly result from this type of initiative were it to be poorly managed. No one wants to open the newspaper and read about sneaky students with sob stories scamming their local community college for weed money.

However, appropriate guidelines and management controls could certainly be developed by community colleges that would greatly minimize—although admittedly not eliminate entirely—the possibilities for abuse and misuse of the funds set aside for this purpose. Any such program should certainly start small and scale up as experience working with students provides the feedback necessary to fine-tune the process of disbursing funds, but it must not fall into bureaucratic deadlock if it is to be truly helpful to students facing a short-term financial crisis.

Although no reasonable person is going to suggest simply handing out cash from a shoebox in the Dean’s office, a program of this type will be effective if—and only if—funds can be provided within a business day. The more time that passes between the articulation of the financial emergency and its resolution, the fewer students who actually will be helped.

Is this idea worth a shot? That would be up to an individual community college to decide. However, it might be worth asking what is currently being spent on all the programs at that college connected to student recruitment and retention, gathering those figures, doing some rough calculations, and asking whether a $250 grant that has, for the sake of argument, a 50/50 chance of keeping a student in school is a bargain when balanced against all the expenses on the other side of the ledger.
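For readers who want to run the rough calculation themselves, the comparison can be sketched in a few lines. Every figure below—the grant size, the 50/50 retention odds, and the cost of recruiting a replacement student—is a hypothetical placeholder for the sake of illustration, not data from any actual college:

```python
# Back-of-the-envelope comparison: emergency micro-grant vs. the cost of
# recruiting a replacement student. All dollar figures and probabilities
# here are hypothetical placeholders, not real institutional data.

def expected_cost_per_retained_student(grant, p_retained):
    """Expected spending to keep one student enrolled via micro-grants,
    assuming each grant has probability p_retained of saving the semester."""
    return grant / p_retained

grant = 250          # hypothetical grant size in dollars
p_retained = 0.5     # assumed 50/50 chance the grant keeps the student
recruit_cost = 2000  # hypothetical cost to recruit one replacement student

cost_via_grant = expected_cost_per_retained_student(grant, p_retained)
print(f"Expected cost per student retained via grants: ${cost_via_grant:.0f}")
print(f"Assumed cost to recruit a replacement student: ${recruit_cost}")
if cost_via_grant < recruit_cost:
    print("Under these assumptions, the micro-grant is the bargain.")
```

Even if only half of the grants work, the effective cost per student retained ($500 in this sketch) can still come in well under what a college spends to recruit and enroll a new student to replace the one who dropped out; each college would, of course, need to plug in its own numbers.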

I offer this idea for consideration because I believe we need to challenge ourselves to think outside the box to find solutions that will better serve our students—including those who are motivated but lack, for a variety of reasons, the economic safety net other students might possess. Given the well-documented crisis of non-completion at our nation’s community colleges, perhaps it is time for some innovative initiatives that are based upon the real world challenges that so many of our economically vulnerable students actually face. If we do not stretch beyond the tried and true (but perhaps not entirely effective) solutions of the past, we risk losing more and more students of modest means—but big dreams—who are trying to use community college as a stepping stone to a better life.

A version of this article was also published on Education Post (educationpost.org) on January 19, 2018 under the title “Too Many Students Drop Out of Community Colleges. Here’s How We Fix It.”

The Problems Posed By To Kill A Mockingbird

Recent media reports regarding efforts by a school district in Biloxi, Mississippi to drop To Kill A Mockingbird from their curriculum have generated understandable concern. As schools continue to grapple with both disorienting societal changes and increasing political polarization, we are inevitably going to see more challenges to specific classroom content and practices, which should concern any professional educator. Anger rarely results in good policy decisions.

Our societal discord certainly connects to broader questions regarding what we expect of our K-12 schools. That fine line between education and indoctrination will be ever more difficult to discern as educators struggle to find ways to challenge students to think without falling into the trap of preaching to them. However, given the well-documented deficiencies in critical thinking skills that colleges and employers must grapple with today, it is more important than ever to encourage our K-12 schools to shake students from their easy assumptions and comfortable mental inertia. The question is, of course, how best to do this.

I’ve taught To Kill A Mockingbird to high school students in the past, and they were often shocked to read about the routine degradations inherent in the entrenched racial discrimination of our nation’s history. If nothing else, the novel served as a lesson that allowed us to ladder into discussions about what has—and still has not—changed in America today. It has been many years since I’ve had the opportunity to teach this particular novel, but I suspect that my classroom lessons and activities regarding To Kill A Mockingbird would need to be very different now because I would be compelled to address uncomfortable changes in our perceptions of the characters and their motivations.

The cartoonish delineation between the heroes and villains in To Kill A Mockingbird has always posed pedagogical problems, although it eases reading comprehension for an audience often composed of 8th or 9th graders. On the one side we have the Ewell family, who are a caricature of what we expect—and perhaps prefer—our racists to be, an ignorant and violent clan devoid of even an iota of decency or honesty. Facing off against them, we have Atticus Finch, a caring and compassionate lawyer and tragic widower raising two intelligent and inquisitive children who are miraculously free of the least taint of racism. Caught in the middle we have Tom Robinson, falsely accused of rape by the evil Ewells, and the very personification of stoic dignity in the face of injustice. There are no shades of gray among these main characters; there are only, if I may be forgiven this analogy, broad strokes of black and white.

To Kill A Mockingbird, were it to be published today, would likely face a somewhat more mixed critical reception. Aunt Alexandra’s desperate efforts to put a gloss of girlishness on the tomboyish Scout would likely be more harshly judged by contemporary feminist critics. Mr. Dolphus Raymond’s sexual relationships with African-American women would raise questions regarding power differentials and consent. Boo Radley’s peculiar interest in his prepubescent neighbors, which obviously includes covertly observing them and following them outside the house at night, might not be so wondrously free of any question of pedophilia—or at least “stranger danger”—in today’s less innocent world. It may well be that the year of the novel’s publication back in the mists of 1960 was the very last moment in our cultural and social history when the questions and answers seemed quite obvious and easy, so complexity and nuance could be blithely set aside in the pursuit of an uplifting fable.

I’ve always been a bit leery of joining in the chorus of hosannas regarding To Kill A Mockingbird, and perhaps this is because I have always found Atticus Finch a bit less than admirable—which I realize is near to sacrilege to some. Although he has the best possible intentions in the worst possible situation, Atticus Finch and his legal machinations, in a final and flinty-eyed analysis of outcomes, actually come to nothing. Tom Robinson is dead, no minds are changed, and the Jim Crow system that informs the actions of the town and its people is wholly unaffected.

Atticus Finch’s attitudes and actions are in many respects a foreshadowing of the well-meaning (but ultimately ineffectual) white liberals in the 1960s whose best intentions would be overrun by the flame and fury that finally destroyed Jim Crow segregation and its many local permutations. Although the novel suggests that readers should derive some cosmic satisfaction from the death of the thoroughly despicable Bob Ewell, which also allowed Boo Radley to finally reveal his essential human decency (although it might be reasonably observed that manslaughter is a mighty odd plot device to get there), it would be impossible to argue the trial of Tom Robinson produced any significant changes in the town or its people.

Of course, all of this speaks to the many moral compromises that inform the book. The worst of the town of Maycomb and its racist attitudes is on display, but the many small but significant accommodations that decent people must make each day to survive in an indecent world also bear our examination. It could be argued, if one really was looking for hope for a better future, that the most moral course of action Atticus Finch could have pursued would have been to refuse to represent Tom Robinson, thereby removing the thin veneer of respectability that placates those whose mute compliance is needed. Imagine how different the novel would have been if Judge Taylor had not been able to use Atticus’ stirring but pointless speech to soothe the consciences of those who knew just how profound an injustice was being done. Moral but meaningless victories serve the needs of tyrannies that need to smooth over the rawness of oppression, and we should not fail to recognize that Atticus’ carefully restrained outrage sounded lovely but changed nothing at all.

All of this is, of course, beside the point of why the novel is now often banned. The norms that now rule in many communities judge the politically incorrect—but historically accurate—usage of the “N-Word” as both insult and casual descriptor to be too much to bear in our sensitive school and social climates. This is understandable, but it also opens up opportunities for classroom discussion of the novel and its context. If we are going to crusade to excise every questionable bit of U.S. history from our schools instead of engaging in the conversation, research, and exploration of our past that is a core mission of education, we condemn our children to facile sloganeering instead of intelligent and well-rounded inquiry that will prepare them for a future where the answers will be neither obvious nor easy.

Perhaps the key to continuing to use To Kill A Mockingbird in our nation’s classrooms is to gently remove it from its pedestal and recognize its limitations—just as acknowledging our own human limitations is the precursor to a better understanding of our world and ourselves. To Kill A Mockingbird is not a perfect novel, and the tiresome insistence on canonizing it impedes an honest engagement with what can be learned from a thoughtful and critical reading. Just as a person can be wonderful but flawed, so can a book fall into that same category. If we can accept this, perhaps we can finally move forward instead of squabbling without end, which ultimately does nothing to improve the education of our children.