
Wednesday, October 9, 2024

Post-America has no use for reading books

 There are three layers to what I'm presenting in this post. 

The first is the cover story in the current issue of The Atlantic titled "The Elite College Students Who Can't Read Books."

I'll let a fairly generous excerpt serve to make the piece's point:

Nicholas Dames has taught Literature Humanities, Columbia University’s required great-books course, since 1998. He loves the job, but it has changed. Over the past decade, students have become overwhelmed by the reading. College kids have never read everything they’re assigned, of course, but this feels different. Dames’s students now seem bewildered by the thought of finishing multiple books a semester. His colleagues have noticed the same problem. Many students no longer arrive at college—even at highly selective, elite colleges—prepared to read books.

This development puzzled Dames until one day during the fall 2022 semester, when a first-year student came to his office hours to share how challenging she had found the early assignments. Lit Hum often requires students to read a book, sometimes a very long and dense one, in just a week or two. But the student told Dames that, at her public high school, she had never been required to read an entire book. She had been assigned excerpts, poetry, and news articles, but not a single book cover to cover.

“My jaw dropped,” Dames told me. The anecdote helped explain the change he was seeing in his students: It’s not that they don’t want to do the reading. It’s that they don’t know how. Middle and high schools have stopped asking them to.

In 1979, Martha Maxwell, an influential literacy scholar, wrote, “Every generation, at some point, discovers that students cannot read as well as they would like or as well as professors expect.” Dames, who studies the history of the novel, acknowledged the longevity of the complaint. “Part of me is always tempted to be very skeptical about the idea that this is something new,” he said.

And yet, “I think there is a phenomenon that we’re noticing that I’m also hesitant to ignore.” Twenty years ago, Dames’s classes had no problem engaging in sophisticated discussions of Pride and Prejudice one week and Crime and Punishment the next. Now his students tell him up front that the reading load feels impossible. It’s not just the frenetic pace; they struggle to attend to small details while keeping track of the overall plot.

No comprehensive data exist on this trend, but the majority of the 33 professors I spoke with relayed similar experiences. Many had discussed the change at faculty meetings and in conversations with fellow instructors. Anthony Grafton, a Princeton historian, said his students arrive on campus with a narrower vocabulary and less understanding of language than they used to have. There are always students who “read insightfully and easily and write beautifully,” he said, “but they are now more exceptions.” Jack Chen, a Chinese-literature professor at the University of Virginia, finds his students “shutting down” when confronted with ideas they don’t understand . . .

The second layer is a National Review piece by Ian Tuttle which expands on the larger cultural implications of what the Atlantic story presents.

Tuttle begins with a look at the precipitating factors at the education level. What he comes up with is a damning indictment of what post-America considers education to be:

Horowitch notes, correctly, that the problem begins long before college. “In 1976, about 40 percent of high-school seniors said they had read at least six books for fun in the previous year, compared with 11.5 percent who hadn’t read any. By 2022, those percentages had flipped.” Reading for pleasure is even seen as a niche interest: “A couple of professors told me that their students see reading books as akin to listening to vinyl records — something that a small subculture may still enjoy, but that’s mostly a relic of an earlier time.”

No single cause is behind such a trend, but it is not hard to see that nearly every aspect of our educational culture discourages patient, attentive reading. High schools and middle schools have spent years phasing out books, often in response to the imposition of standardized testing. (As one teacher tells Horowitch: “There’s no testing skill that can be related to . . . Can you sit down and read Tolstoy?”) This trend is abetted by the widely adopted “college- and career-ready” educational program that has left many students prepared for neither.

And then there is post-American society's warped notion of "getting ahead":

Among students headed to elite colleges, there are additional pressures. Ferocious competition for acceptance to prestigious institutions, driven by a sense that long-term success is impossible without an Ivy League degree, promotes GPA obsession. For the same reason, students are subjected, often beginning in elementary school, to a punishing regime of extracurricular activities in the attempt to compose a résumé that can survive the gimlet eye of the nation’s last true gatekeepers: admissions counselors.

But, okay, why is reading dense books such a big deal?

Reading, a bit like faith, admits of many justifications — it increases empathy, enhances imagination, provides pleasure — but none of them is especially compelling to the nonreader. Yet we tend to take seriously what we see the people we love or respect taking seriously. Which is why Horowitch’s article is not primarily a story about kids but about adults. The observation that students, even at elite institutions, are struggling to read books implicates not just a few schools or school systems but an entire educational culture, along with families and parenting practices that, albeit well meaning, have trained students in a narrow, instrumentalist view of education.

That's right. Mom and Dad - and K-12 teachers - are major factors:

The students Horowitch writes about are not failed learners. On the contrary: They have learned exactly what they were taught. Children are growing up, perhaps more than ever before, in environments where reading books is simply not a priority. At school, their teachers assign only excerpts from books and of necessity “teach to the test.” Children come home to parents who spend much of their leisure time responding to after-hours emails, scrolling their phones, or watching television. Their own leisure — what little they have after clubs, practices, rehearsals, volunteering, tutoring, and the rest — is easily co-opted by the distractions and addictions of TikTok and YouTube.

We prioritize what we see being prioritized. And for many, that is the grinding labor of getting ahead. Where thoughtful, attentive reading cannot be bent to this task, it goes by the wayside. But estrangement from that kind of reading makes it even more difficult to see that this all-consuming economy of achievement is ultimately intolerable to the soul, which exists in a different economy altogether.

Tuttle's mention of the soul is of paramount importance. He fleshes it out further:

Reading literature is one point of entry to a world not judged by test scores and résumé items. But teachers and parents and mentors must be the ones to make that invitation attractive. We can say to students, “Tolle, lege!” But we have to do it ourselves, first.

Okay, now for the third layer: my own observations.

The whole families-don't-sit-down-to-dinner-anymore conversation has been happening for decades, and for good reason: as Tuttle enumerates above, families with school-age kids are pressed for time.

I've written before about how my relationship with my father was fraught. He was a willful, demonstrative, and pretty much absolutist man. Because I was raised right on the cusp, just as the tectonic shift took place in our society, I bristled at what he was trying to impart.

But he also had an intellectual bent. Our family had quite an impressive book collection, which I've inherited. (Great record collection, too.) He was the first to expose me to the giants of Austrian economics - Mises, Hayek - and the letters of Lord Chesterfield to his son. He also impressed upon me what made the pivotal points in history so pivotal.

Our dinner table conversations were more often than not about the Big Ideas. Those repasts were an essential element in my formation, I now realize. They honed my reasoning powers and my commitment to taking all facets of a situation into consideration before drawing a conclusion. 

There are still undoubtedly some family dinner hours that are enriching in that manner. But it's pretty clear they are now a rarity.

I'm not an elite-institution professor. I'm an adjunct lecturer in jazz history and rock and roll history at the local campus of our biggest state university. But I'm seeing exactly what the sources in the Atlantic piece describe.

And even beyond my students' poor compositional skills and obvious lack of acquaintance with reading full-length books, what dismays me most are the blank looks on their faces. It's clear they cannot just sit still and focus solely on my lectures or presentations. They look uneasy, as if they can't wait for the hour and fifteen minutes to be over. They don't exude the kind of social comfort on which a stable classroom environment is predicated.

Reading - like other forms of communication and expression, such as music, visual art, and drama - is how we humanize ourselves.

Maybe there ought to be a mandatory high school course, taught in the junior year, when students are first looking at what comes after graduation, called "Why Would the Brass Ring Be Valuable?"

It seems to me to be rich with possibilities. It could be the door-opener to what the great minds of Western history have had to say about how we ought to go about appraising possible paths for our lives.

What we can say is that this is a problem that bodes very ill for our prospects. 

Saturday, December 9, 2023

The House hearing, the three university presidents, and the rot of post-American higher education on full display

 I've waited to weigh in on this, because there was assuredly going to be a first wave of reaction to the disgusting way presidents Claudine Gay of Harvard, Elizabeth Magill of the University of Pennsylvania, and Sally Kornbluth of MIT conducted themselves as they appeared before the House Education and Workforce Committee last Tuesday. And there was, from columnists, radio talk show hosts, cable-TV personalities and ordinary post-Americans conversing among themselves.

I probably don't have anything startlingly original to add to the discussion, but I know where on the landscape I want to position myself.

Let's start with the after-the-fact apology phenomenon. We see it a lot these days. The social climate in our country affords little regard for the internal filters that might make someone think twice about taking a self-congratulatory stance that nonetheless draws heavy opposition from people of influence. If President Gay's cocksure depends-on-the-context response to Representative Stefanik's questioning about calls for genocide of Jews was one hundred percent sincere, then her walk-back has to be seen as full of ka-ka, does it not?

Or is the reverse the case? There's at least a theoretical case to be made for that. After all, she was still dealing with the mid-November letter she received from 100 faculty members who did not at all care for the "Combating Antisemitism" statement she issued in response to donors and alums speaking up about campus Jew-hatred.

Either way, the only conclusion to be reached is that she's a phony.

And if these university presidents want to talk about context, we can gladly revisit the whole leftward drift of higher education over the past umpteen decades. We can trace the role of the Gramscian long march through the institutions by which 1960s radicals became tenured professors. We can point out the fact that William F. Buckley launched his career as an author with the 1952 tome God and Man at Yale, which examined his alma mater's complete secularization. Timothy Dwight, call your office.

I am also not the first to note that Gay, Magill and Kornbluth would have come down on similar calls for extermination of just about any demographic group other than Jews.

The "just about" qualifier was not thrown into the previous sentence idly. We all know which group would not incur their ire. 

And that's what this really comes down to, isn't it? A key component of the above-mentioned leftward drift is the assumption that there is something fundamentally problematic about being white. 

And there's a global dimension to this. Russia's Putin and China's Xi are licking their chops at the prospect of a BRICS expansion that would bring the bloc's role as a voice for the "global south" into sharper focus. What an exquisitely effective way to nudge the West, and the United States in particular, out of its role as guarantor of the rules-based post-World War II international order.

So the ramifications of the way these three ladies conducted themselves last Tuesday are numerous.

It was one more confirmation that we've moved past the peak of human advancement and are descending back into the grim way human beings have treated each other for most of our species' history. 

Saturday, November 11, 2023

Saturday roundup

 As I've said before, a good book review stands alone as a thought-provoking essay from which one can glean insights whether one has read the book in question or not. Of course, it will also spur one to put that book on one's to-read list. 

Came across a couple of those recently.

At Front Porch Republic, Christian McNamara sets the table for his discussion of Seth D. Kaplan's Fragile Neighborhoods: Repairing American Society, One Zip Code at a Time by recounting a parade in the community where he'd lived for almost five years. He didn't recognize the mayor, something that took him aback:

How is it that someone comes to live in a place for almost half a decade without being able to recognize its mayor? Or to recall his name once he has been identified as such? Or even to be sure of his political party (which I could guess at given our town’s pronounced leanings in national elections, but that I wouldn’t have bet the farm on given the possibility of idiosyncratic results at the local level)? In particular, how do these things happen when the clueless someone in question is a person who had always considered himself politically and socially engaged—aware of important legislation being considered in Congress, familiar with major cases pending before the Supreme Court, well read on the significant public policy issues of our day?

Like the reviewer, the author of the book being reviewed had ostensibly been more preoccupied with larger-scale developments:

It is against such shortsightedness—the tendency of all too many of us to ignore what is going on in our own backyards—that Seth D. Kaplan delivers a desperately needed warning in his new book Fragile Neighborhoods: Repairing American Society, One Zip Code at a Time. At first blush, Kaplan would seem an unlikely evangelist for the importance of focusing on the local. An expert on fragile nation-states, Kaplan has spent his career working with organizations like the World Bank and the U.S. State Department in places like Nigeria, Colombia, Libya, and Yemen. Surely this is a man for whom what happens in individual neighborhoods is small beer as compared with the important work of running a country? Yet for Kaplan, when comparing two countries and asking why one has succeeded where the other has failed, what matters most is not national policies but “societal dynamics—the strength of the social glue, the nature of relationships across groups, and the role of social institutions.” These are things that manifest (or fail to manifest) at the local level. The social health of our neighborhoods “determines how safe we are, the quality of the schools our kids go to, what resources we have access to daily, the kinds of job opportunities we have, our psychological well-being, and even . . . how long we live.” It also shapes, in Kaplan’s view, the state of the nation. 

Which is bad news for the United States given Kaplan’s assessment that “the social decay we are experiencing in neighborhoods across America is unlike anything [he has] seen elsewhere.” This is a startling statement given the many troubled corners of the globe where Kaplan has hung his hat. As distressed as those places are, the people in them “are simply much warmer, their relationships much thicker than what [he has] experienced in countless neighborhoods here in the US.” Americans “don’t feel obligated to help our neighbors, give back to our community, or even (in many cases) care for members of our own family—and we resist joining any group or association that might create such obligations.” The result? We are “some of the most depressed, anxious, addicted, alienated, and untethered people in the world.” Not even material wealth is sufficient to protect against the effects of social poverty, with many of these problems plaguing middle- and upper-class neighborhoods as well.

The fundamental flaw besetting traditional approaches to social reform, according to Kaplan, is that they are typically top-down, one-size-supposedly-fits-all “solutions” that take no account of the unique dynamics of the specific places that are to be reformed—the particular challenges facing a given community and the assets already at its disposal for meeting those challenges. As Kaplan notes, even where initiatives succeed at relieving distress in the short run, they will ultimately have done more harm than good if they undermine the local social institutions necessary for a community to thrive over the long haul. It is not that there is no role for politics or national policy. According to Kaplan, both “[g]overnment assistance (a tool of the left) and more efficient markets (as favored by the right)” are necessary. But these interventions will be effective only insofar as they work through and are supportive of local social institutions.

At the Acton Institute's website, Brian A. Smith looks at the ongoing relevance of Walker Percy's Lost in the Cosmos: The Last Self-Help Book:

Forty years ago, the philosopher and novelist Walker Percy published what is easily the strangest book of his writing career. Lost in the Cosmos distills the major themes of both his novels and his philosophical essays into a little over 250 pages of multiple-choice questions (and peculiar answers), hypotheticals, and brief stories. Billing it the “last self-help book,” Percy assailed virtually everything ordinary Americans take for granted about themselves—and issued stark challenges to the practitioners of the human sciences that very few scholars have bothered to take up.

The early reviewers for major newspapers loathed the book, finding Percy’s approach to be a confusing “mishmash of satire and seriousness” and “neither good philosophy nor a good read nor yet a book likely to help any Self I Know of, including its author.” Like all Percy’s works, Lost in the Cosmos has remained continuously in print and, far more than most of his novels, retains a strong following. It is probably the book that resonates most clearly with our present discontents and may well be the only one of his works that will continue to be read widely in decades to come. The critics suggest that the menu of answers Percy offers to his multiple-choice questions somehow imposes his views or pronounces his judgment upon the readers. But this is unfair: Percy isn’t exempting himself from condemnation or his own parody. Part of the discomforting joy of the book is in seeing many of our own thoughts laid bare in all their strangeness.

One cannot accuse Percy of failing to alert the reader to the kind of intellectual assault that awaits them. He opens with a “preliminary short quiz” to determine whether one is, in fact, lost to oneself. This mix of open-ended and multiple-choice questions offers some challenges to our everyday experience. For example, “Why is it that one can look at a lion or a plant or an owl or at someone’s finger as long as one pleases, but looking into the eyes of another person is, if prolonged past a second, a perilous affair?” Readers are encouraged to reflect on aspects of human experience that are familiar but also strange. This then prompts them to wonder whether they understand the human condition at all.

This isn’t to say that aspects of the book aren’t dated. Readers will be excused for looking up Phil Donahue, Leo Buscaglia, and a handful of other references that would have been immediately familiar to Percy’s readers in the 1980s. But these callouts appear in the course of questions and scenarios we still face today—and the American craze for self-help guides, experts, and shortcuts “to a better you” certainly hasn’t abated. It isn’t the cultural references that pose problems so much as Percy’s own peculiar method of leading the reader to grapple with the depth of the human predicament.

Smith says that Percy's essential question to the reader is, "Who are you?"

In the book’s opening pages, Percy asks readers to evaluate which view of the “consciousness of self,” if any, they think explains one’s sense of the human condition. These run a gamut from pagan to theistic to modern-philosophic: Are you a cosmological self; a Brahmin-Buddhist, Jew, or Christian; a “role-taker”; a scientist or an artist; a fully autonomous being; or perhaps a totalitarian? Some of the options might have been plausible in the quite recent past, even. Consider the “standard American-Jeffersonian high-school-commencement Republican-and-Democratic-platform self”:

The self is an individual entity created by God and endowed with certain inalienable rights and the freedom to pursue happiness and fulfill its potential. It achieves itself through work, participation in society, family, the marketplace, the political process, cultural activities, sports, the sciences, and the arts. It follows that in a free and affluent society the self should succeed more often than not in fulfilling itself. Happiness can be pursued and to a degree caught.

This and all his descriptions nudge the reader to address a challenge: Is this really good enough to explain you, much less help you live well?

Percy’s questions help us see the real deficiency of virtually all self-help literature: these works presuppose that by simply learning the “habits of effective people” or practicing some slate of life management strategies, we will emerge as better versions of ourselves. What most people learn from embracing these fads is that even if we succeed in living out the advice, the self we help is still human and remains stuck in an inescapable predicament—a crisis driven by the inadequacy of our self-understanding.

Percy pushes the boundaries of what most people are usually willing to contemplate. Lost in the Cosmos relentlessly forces us to probe the limits of our conventional explanations for “extreme” or “dangerous” behavior. He suggests that even most religious believers lack an adequate grasp of how to grapple with the challenges of our times and are just as prone to seek escape from their everyday lives in what Percy calls immanence and transcendence.

We escape ourselves on the path of immanence through a variety of means. Among these are shopping, television, drugs, sex, and violence. But why? One possible answer:

The Self since the time of Descartes has been stranded, split off from everything else in the Cosmos, a mind which professes to understand bodies and galaxies but is by the very act of understanding marooned in the Cosmos, with which it has no connection. It therefore needs to exercise every option in order to reassure itself that it is not a ghost but is rather a self among other selves. One such option is a sexual encounter. Another is war. The pleasure of a sexual encounter derives not only from physical gratification but also from the demonstration to oneself that, despite one’s own ghostliness, one is, for the moment at least, a sexual being.

Just as stark are Percy’s explorations of how we seek to transcend our ordinary condition. Artists express what we hope and feel; scientists can grasp the causal relations between objects in the natural world. For both, he suggests, there is a kind of escape: “The pleasure of such transcendence derives not from the recovery of the self but from the loss of self.” We can lose ourselves in a variety of ways.

Human beings don’t follow a straight course. We oscillate between one extreme and the next. A mathematician might spend eight hours barely noticing the needs of the body, then escape from work into a night of drug-fueled carousing, never considering for a moment anything about the peculiarity of being a complete person, both body and soul. Percy fears these kinds of individuals can become unmoored from everyone and everything:

None is as murderous as the autonomous self who, believing in nothing, can fall prey to ideology and kill millions of people—unwanted people, old people, sick people, useless people, unborn people, enemies of the state—and do so reasonably, without passion, even decently, certainly with the least obnoxiousness.

Lost in the Cosmos does a great service to its readers by helping outline the mental state that so often accompanies modern boredom or everydayness, that leads us into yearning for disasters or bad news or fleeing from our ordinary existence through consumerism, travel, sex, or other enthusiasms. Percy’s achievement is to suggest what we really need: a better sense of who and what we really are—and he approaches this in a manner aimed at persuading Americans who lean into either spiritualism or science to see where they need a better sense of the self.

It's come to my attention that Lost in the Cosmos is one of theologian Peter Kreeft's favorite books, a further selling point for me.  

Frederick Hess of the American Enterprise Institute looks at how woefully skewed the whole field of high school civics is:

Last month, the American Educational Research Association (AERA) touted a new study reporting that, as the press release headline had it, “State-Mandated Civics Test Policy Does Not Improve Youth Voter Turnout.” With more than a little evident glee, given the education school community’s hostility to anything that smacks of testing, the Penn State researchers reported that requiring high schoolers to pass a civics test didn’t lead to statewide increases in self-reported voting by 18- to 22-year-olds.

Now, there are several issues with this study, including the fact that boiling millions of students down to several dozen state-level aggregates made it unlikely that the researchers would find an impact. Indeed, one might wonder why the AERA chose to highlight the non-findings of a not-very-compelling study.

But let’s focus on the larger issue: The study fundamentally misconstrues the point of civics instruction. In American education today, it’s widely assumed that voting, advocacy, and “speaking out” are the ultimate aim of civic education.

There’s something odd about that premise. In a landscape pocked by hyperbolic social media, pro-terrorist theatrics on campus, and performative MAGA lawmakers in the U.S. House of Representatives, does it look like America’s problem is a lack of activism? Last week, Students for Justice in Palestine published an op-ed in Columbia University’s newspaper “celebrating” their having held “one of the largest campus protests” in Columbia’s history. The presumption is that a big protest is innately deserving of celebration. This is what follows from the premise that “activism is good,” even when it’s on behalf of rapists, kidnappers, and murderers.

As a one-time high school civics teacher, I wholly embrace the need to prepare students for democratic citizenship. But democratic government is about a lot more than activism and voting. It’s also about respect for rules, personal responsibility, patience, and a willingness to work with those who see things differently.

And those are the things that are getting neglected. After all, voter participation is at record highs. Impassioned activists and “small-money” donors are calling the shots in party primaries. The nation’s most visible lawmakers are those who have the least interest in the job of actually crafting laws. The U.S. citizenry is lacking not political participation but restraint, trust, knowledge, and respect for institutions and norms.

Democratic self-government is secured less by getting students to pull a lever (or mail in a ballot) than by helping them develop a proper respect for due process, free speech and a free press, canvassing boards that faithfully review vote tallies, election officials who resist political pressure, public agencies that maintain public trust, independent courts, responsible legislators, and limited executive authority. This is what civics needs to teach.

Today, civics education isn’t doing that. Teachers don’t even realize they’re supposed to teach those basic values. How do I know this? Because teachers themselves say so. The RAND Corporation has found that barely half of social studies teachers think it essential that students understand concepts like the separation of powers or checks and balances. In 2022, another RAND survey of K-12 teachers found that more thought a key aim of civics education is promoting environmental activism than “knowledge of social, political, and civic institutions.”


Longtime readers of LITD know that my view of Donald Trump is that, as long as he was merely a cartoonish media personality, he was no more culturally or societally harmful than any other outrageous, amoral celebrity our culture has been producing for the last half-century, but that, with his entry into the political realm, he's wreaked ruin on our nation on an unprecedented scale.

Now that he's had a taste of the Oval Office, his appetite for yet another foray into authoritarianism has been whetted. Jack Shafer of Politico has the details:

According to a page one story in The Washington Post Monday, Trump plans on the first day of his new administration to invoke the Insurrection Act so he can dispatch the military to counter any demonstrations that might resist his policies.

Why might he need the Insurrection Act? Well, the piece also notes Trump intends to turbo-politicize the Department of Justice and order prosecutions of his former aides and officials who have criticized him. Perhaps he thinks the country won’t let him go buck wild on the rule of law without a stink, so he wants to be ready to sic troops on the inevitable protestors. Fingered by Trump for legal beat-downs, the Post reports, are one-time Trump stalwarts and staffers like former chief of staff John F. Kelly, former attorney general William Barr, his ex-attorney Ty Cobb, and former Joint Chiefs of Staff Chair Mark A. Milley. Trump has singled out other officials at the FBI and the Department of Justice for prosecution, the piece adds, as well as President Joe Biden and his family.

Leading Trump’s Insurrection Act initiative is Jeffrey Clark, a Trump-era Department of Justice official currently being prosecuted for his part in an alleged scheme to overturn Georgia’s 2020 election results. According to the Post piece, Trump intends to leaven the entire federal bureaucracy with appointees like Clark who are willing to do his bidding. (Told by a colleague that there would be riots in the streets if Trump sought to stay in office despite losing the 2020 election, Clark is said to have responded, “That’s why there’s an Insurrection Act.”)

How much of this Trump power lust is new? Recall that he called for the Constitution’s termination in December 2022 so he could return to the presidency. Also, he’s always loved to entertain himself and his followers by talking about throwing opponents in jail. Over the summer, ABC News compiled a list of plenty of people he wanted indicted or jailed for their crimes, including ex-FBI Director James Comey, former special counsel Robert Mueller, Steele dossier author Christopher Steele, Bill and Hillary Clinton, former national security adviser John Bolton. You may recall that locking up Hillary Clinton was elemental to his 2016 campaign. As for testing the limits of presidential power, that’s old hat, too. During his first administration, he banned Muslim visitors, issued an emergency declaration to build a border wall after Congress refused to pay for it, and sought to overturn the 2020 election results.

But this new round of bombast and threats is not just a matter of Trump being Trump. What’s different this time is that Trump’s building an extra-legal foundation of declarations and appointments to make his 2017-2021 aspirations, which sounded like off-the-cuff ravings at the time, come true. Recall the scary preview of his ambitions he gave in a March 2023 speech at the Conservative Political Action Conference, in which he promised his “wronged and betrayed” supporters that he would be their “justice” and “retribution.”

However critically you think of the team Trump assembled in his first administration, he could never convince them to conduct prosecutions of either his political opponents or officials who defied him. The next time around we won’t be lucky if he succeeds in peppering the Department of Justice and other agencies with his yes-men. Can he get away with it? It’s not illegal for a president to instruct an attorney general on how to do his job as long as those instructions are consistent with the law. But lining up presidential critics for prosecution, as Trump appears ready to do, makes a mockery of that consistency — especially when no laws appear to have been broken!

I am pleased to announce that I am now a contributor to The Freemen News-Letter, a project of The Freemen Foundation, the mission of which is "to conserve and renew American constitutionalism." There are several newsletters within it.

For Self-Evident, I wrote "Conservatism and Immutable Verities."

In the run-up to Halloween, The Seeing Place did a series on scary movies that raise broad considerations. To that, I contributed "Points of No Return," in which I examine scenes from two classic Universal horror films. 

For The Daily Saucer, I wrote "The Conflation Problem."  It's about a subject I've visited many times, because I feel it's crucial to an understanding of our bleak predicament, namely, that the general public in post-America now assumes that Trumpist bombast is a key element of the conservative project, which it most definitely is not.

I've been busy over at Precipice (and may I ask here if you're a subscriber yet?):

"The Pervasiveness of Human Waywardness" is a thought process kicked off by the role of abortion in last Tuesday's Democrat electoral success:

What would it take to uphold the ideal of family, that most basic of social units, which, when it’s in a healthy condition, is the environment in which we learn how to lovingly interact with other people? What about venerating nurturing, guidance, encouragement, team spirit, humor, and generosity?

There seems something bitter at the core of a pro-choice position. Its inclination is to respond to what I’m saying in the above paragraph with, “Yeah, show me an actual family that unfailingly venerates those things, that sustains the happiness of everyone in it, that isn’t fraught with underlying issues.”

And there’s a valid point there. Any family anywhere is composed of fallen human beings. That’s why I used the term “ideal” rather than “actual.”

It doesn’t help that a lot of Christians go about their family-venerating in the most boneheaded way they possibly could.

In 2023, we’re not going to be able to avoid the hot potato of patriarchy and complementarianism. Not only is it going to make a society-wide conversation difficult if not impossible, given our polarization, but Christians are at a standoff about it among themselves.

The Council on Biblical Manhood and Womanhood is a lightning rod within institutional Christianity. Its Nashville Statement looks at first glance like a tidy summation of proper relations and dynamics between the sexes. But men being what they are, they can indulge their inclination to imperiously shoot off their mouths and alienate a whole lot of their sisters in Christ. Witness John MacArthur’s advice to Beth Moore to “go home.” The ripple effects of that one are still being felt.


In "I'm a Non-Voter Precisely Because I'm Engaged With Policy and Culture," I say that I'm still planning to stay home next May and November.  

"Jews, the West, and Whether Our Species Still Deserves the Description 'Human'" was inspired by the big matter on our plate for the last month. 

"Maybe Some Seriousness Is In Order" is a brisk heads-up about the present moment. 

That ought to keep you out of trouble this weekend.

Saturday, August 5, 2023

Saturday roundup

Gonna excerpt generously from this one. "Ruler Over All: Notes Toward the Restoration of a Christian Culture" by Ken Myers at Touchstone is full of important insights that must be made available for contemplation and discussion.

And I'll try not to reprint basically the whole thing, but the way he unfolds his overall point is impressive.

He begins thusly:

One hundred years ago, in September 1923, the Hogarth Press published the first English book edition of T. S. Eliot’s 434-line poem, The Waste Land. The type was set by hand by Eliot’s friend Virginia Woolf, who with her husband Leonard had founded the small publishing venture. The previous autumn, The Waste Land had appeared in the inaugural issue of Eliot’s own journal, The Criterion, and then in the U.S. in the prestigious literary magazine, The Dial.

The Waste Land has been judged by many to be the most influential English-language poem of the twentieth century. Often analyzed as a depiction of the turmoil and fragmentation of Eliot’s own inner life, its continued power after a century is surely because of its account of public—not just private—dislocation. When the poem first appeared in 1922, the second volume of Oswald Spengler’s The Decline of the West had just been published. Spengler predicted the twenty-first-century collapse of Western civilization following decades of decay and concomitant tyranny.

Eliot’s poem was published in a time haunted by a sense of global chaos unleashed by the destruction of the First World War and the social and political uncertainties that were both its causes and effects. Writing in the 1950s, the French diplomat and critic Georges Cattaui described Eliot’s expression of “a shipwrecked world” and “a longing for order.” In biographer Alzina Stone Dale’s judgment, “The Waste Land portrays failed civilization, or St. Augustine’s ‘earthly city’ doomed by its sterility and loss of spiritual power.”

Social and literary critic Russell Kirk engaged Eliot’s ideas about society and spirituality in depth in his 1971 book, Eliot and His Age: T. S. Eliot’s Moral Imagination in the Twentieth Century. Commenting on the state of the West in the decade preceding The Waste Land, Kirk noted:

This decay of order and justice and freedom within the old European community was paralleled by the decadence of the old moral order, the Church falling into disrepute and the governing motive of many eminent men being merely ‘put money in thy purse.’ For the charlatan and the cheat, large opportunities were opened everywhere; while the old motives to integrity were fearfully shaken. Out of the War’s brutality had emerged gross appetites and violent ambitions, and everywhere egoism swaggered.

In one of the drafts of the poem, Eliot affixed a blunt epigraph from Conrad’s Heart of Darkness: “The Horror, the Horror.” Decades later, when he received the Nobel Prize for Literature in 1948, the presentation address cited The Waste Land’s “melancholy and sombre rhapsody [which] aims at describing the aridity and impotence of modern civilization.”

Eliot himself commented, not long after The Waste Land’s publication, that

the present situation is radically different from any in which poetry has been produced in the past: namely, that now there is nothing in which to believe, that Belief itself is dead; and that therefore my poem is the first to respond properly to the modern situation and not call upon Make-Believe.

We can be grateful that, five years after writing The Waste Land, Eliot converted from the austere Unitarianism of his New England ancestors to Christianity, specifically the faith as practiced in the Anglo-Catholic tradition of the Church of England. With his newly acquired recognition of Christ as the still point of the turning world, he would go on to write some of the most profound Christian poetry of the twentieth century, including Four Quartets, written between 1936 and 1942 and published in book form in 1943, about the time Eliot began writing the chapters in the essay that would become Notes Towards the Definition of Culture.

He then brings Russell Kirk into the mix, via Richard Nixon:

President Richard Nixon once asked Russell Kirk to recommend one important book that he should read; Kirk named Eliot’s Notes Towards the Definition of Culture. In Kirk’s judgment, “Eliot might well have set upon his title page a sentence that James Fitzjames Stephen had written in 1873: ‘The waters are out and no human force can turn them back, but I do not see why, as we go with the stream, we need sing Hallelujah to the river god.’” Eliot’s title page did include a quotation, but in the spirit of his book’s title, it was from the Oxford English Dictionary: “definition: 1. The setting of bounds; limitation (rare)—1483.”



A little later, he brings Philip Rieff into the proceedings:

In 1966, sociologist Philip Rieff presented in The Triumph of the Therapeutic both a description of how a culture typically functions—orienting the moral compasses of its members—and a diagnosis of how Western societies were going down a path toward what he called an anti-culture. For millennia, human beings within traditional cultures were given guidance about how to live, guidance that served like roadmaps and guardrails. The work of a culture was to sustain an “inherited organization of permissions and restraints.” A culture was a moral legacy, a received and communally shared way of life, not just a set of options for personalized lifestyles, to be adopted or discarded at whim.

But Rieff saw that modern societies were increasingly committed to abandoning the presentation of authoritative guidance, preferring that social institutions embrace the role of expanding the possibilities for free expression. During the 1960s, the structures of societies were being radically reordered to encourage and accommodate the radical individualism and subjectivity that defines the modern project.

“A culture survives principally,” Rieff wrote,

by the power of its institutions to bind and loose men in the conduct of their affairs with reasons which sink so deep into the self that they become commonly and implicitly understood—with that understanding of which explicit belief and precise knowledge of externals would show outwardly like the tip of an iceberg.

A culture must, he argued, 

communicate ideals, setting as internalities those distinctions between right actions and wrong that unite men and permit them the fundamental pleasure of agreement. Culture is another name for a design of motives directing the self outward, toward those communal purposes in which alone the self can be realized and satisfied.

Rieff affirmed the ancient (classical and Christian) belief that human persons can only be fully themselves within a society in which there are—in St. Augustine’s phrase—common objects of love.

But he saw how, in the first half of the twentieth century, there was a “reorganization of those dialectical expressions of Yes and No the interplay of which constitutes culture.” He saw that the rapid and radically changing standards of moral life in the twentieth century were creating conditions whereby “all communications of ideals [would] come under permanent and easy suspicion. The question is no longer as Dostoevski put it: ‘Can civilized men believe?’, but rather: Can unbelieving men be civilized?”

We moderns believe, he continued, “that we know something our predecessors did not: that we can live freely at last, enjoying all our senses—except that of the past—as unremembering, honest, and friendly barbarians all, in a technological Eden.” And so, “a new and dynamic acceptance of disorder, in love with life and destructive of it, has been loosed upon the world.”

A bit further into his train of thought, Myers insists that we have to get back to basic questions about what particular things are - in their essences - in order to have any way of evaluating them: 

D. C. Schindler has pointed out that not only in public discourse but even in private conversation, we avoid asking fundamental ontological questions, basic questions that begin with the words, “What is . . .”:

At the highest intellectual level, we will discuss economic conditions, for example, and the focus will be on how to improve them, how to stimulate growth, how to make possible a more equitable distribution of wealth, and so forth. But we do not ask what an economy, after all, is, or what wealth is. We discuss education, its cost, its availability, its effectiveness, and so forth. But we do not ask: What is education? We discuss foreign policy, the question of immigration, of the just use of force, but we do not ask: What is a nation? What is a citizen? What does it mean to belong to a political community? What is justice? (Love and the Postmodern Predicament: Rediscovering the Real in Beauty, Goodness, and Truth, 2018, p. 27)

As a society, we not only do not agree about such things; we assume that achieving agreement on such things is unnecessary and undesirable. What “pluralism” means for us is that no one should ever feel obliged to change his mind about questions of value, since all judgments about value are, after all, not rational, but merely expressions of personal preference.

And so, most public speech about contested matters seems designed to denigrate and humiliate one’s opponents, not to persuade them. We no longer even seek to come to agreement about the principles that matter most. We assume that all values are personal and subjective, which means that we can’t talk about them publicly and rationally and hope to change anyone’s mind.

We further assume that our public institutions can nevertheless be run like value-free machines, serving everyone in the nation with mechanistic indifference to questions of what things really are. We assume that institutions of government and finance and education and journalism and manufacturing and healthcare can all be run without any reference to higher purposes or ultimate good, and still perform to everyone’s satisfaction, if only we could get clever and disinterested people running things. At least, this is what we profess to assume, as good modern heirs of Enlightenment liberalism. This is what our public institutions encourage us—and sometimes coerce us—to profess. 

Confused About Reality

This vision of pluralism and diversity, and its commitment to public institutions that somehow maintain absolute neutrality, is deeply mistaken. First of all, it ignores how substantive ideas about what is good or true are frequently—necessarily—smuggled into allegedly neutral policies and procedures.

Second, it misrepresents some fundamental facts about human nature. We are creatures made to pursue what is really good and really true, not made to invent our own personalized account of reality. Modern men and women are deeply confused about what things really are, not just about what is true and what is good, but about what is real, a confusion that is sustained by many of our cultural institutions. G. K. Chesterton once quipped that modern man has not only lost his way; he’s forgotten his address. Modern culture and its institutions sustain a view of the kinds of creatures we are and the kind of world we live in that is more misleading, more fundamentally false than the view of St. Paul’s contemporaries. Our contemporaries have a false understanding of the very nature of reality, not just a false set of assumptions about how they should live. The challenges we face have to do with metaphysics, not just morality.


Okay, hopefully that's enough of a taste to get you over there to read the whole thing.

In her essay "Filling Time Filling Minds" at Front Porch Republic, Nadya Williams cautions us against entertaining ourselves at every available moment:

. . .  I haven’t seen a movie in about a year. For several years now, in fact, I have been averaging about one movie per year, usually choosing materials I considered essential for teaching. Now that I have quit academia, I may never need to see a film again, which is a rather appealing thought. Of course, to be fair, I do spend a good two or three hours a day on my computer—writing, editing, responding to emails, and checking social media. My life is assuredly not screen-free. But it is free of on-screen entertainment, and the same largely applies to my children, who only get to watch a film once every few weeks as a special treat. Although, apparently, in the meanwhile, they get to enjoy the best that the rear-view cam in the van has to offer.

Second, relationships matter. Dan and I try to prioritize the things we can do for and with others, whether in the family or outside the home, over solitary activities. We were made to live in community, in relationship with family, friends, neighbors. One of the problems of a passive entertainment activity, like Netflix, is that it takes us away from interactions with others. Watching something can be done in company with others, but it will never be the same kind of togetherness as we have when we sit down for dinner with family and friends, play a board game, read a book out loud for each other, or go for a walk as a family. 

My point, ultimately, is not to wage a war on Netflix. I am merely using the most famous and most popular of the American streaming services as a representative of the larger problem of the desire to disengage. The idea of relaxing and enjoying one’s well-earned rest has biblical undertones—the Sabbath is a key theme; even God rested. But the Sabbath was never meant to be a time of selfish retirement from all relationships, nor was it made to be spent on diversion or amusement. At the same time, however, I want to be clear that I am not advocating here for the industrial-era idol of squeezing the utmost out of every minute. Rather, I want us to think about who we become, and (for parents) who we cause our children to become, through how we use our time.

In "Augustine and the Order of Love: Debunking a Dumb Christian Nationalist Argument," Jake Meador, writing at Mere Orthodoxy, looks at the question of how we should love everybody in a world where we personally know, or even have some kind of categorical relation to, only a small percentage of the entire human species.

Good book reviews are works that make attention-deserving points all their own. Such is the case with Mark K. Spencer's review of Wendy Brown's Nihilistic Times at Law & Liberty:

“If you live today, you breathe in nihilism,” wrote Flannery O’Connor in 1955. “In or out of the Church, it’s the gas you breathe.” What makes Wendy Brown’s Nihilistic Times worth reading is that she convincingly reminds us of this situation. The attitude that nothing has essential meaning or value is pervasive in our times, always in the background of other lines of thought. “Values” make things important, worthy of care and esteem, in various ways; justice, beauty, holiness, and pleasure are examples of values. Right and left, religious and secular alike speak the language of promoting and defending values. And yet, an underlying malaise of nihilism pervades our world. While most people fluently speak the language of values, there is also a ubiquitous suspicion or fear that all values have been reduced to mere instruments, used by individuals, corporations, and governments to achieve and maintain power and personal gratification.

Brown, a political theorist of the anti-liberal left, is, of course, not the first to make these claims; they’ve been made at least since the nineteenth century. But Brown’s book—a reworking of her 2019 Tanner Lectures on Human Values at Yale University—is a timely reminder of the nihilistic air we breathe. It’s easy to lose sight of this situation, especially if we’re caught up in defending some particular worldview or policy proposal. A well-crafted reminder of fundamental features of the contemporary human condition is always beneficial. While Brown frequently signals her adherence to left-wing orthodoxy on hot-button issues, she also repeatedly shows how all sides in current political and cultural debates have imbibed a nihilistic attitude. Politicians increasingly manipulate information to sway voters, “woke capitalists” use progressive values to sell products, and students are told to treat themselves as “human capital,” who should “invest” in their own future through education, reducing themselves to mere instruments for their own gratification. 

In our society’s quest to maximize utility and power, we have created systems that seem to act automatically for the perpetuation of their own power, without any real consideration of what is of fundamental importance in human life. Such systems include artificial intelligence, government and corporate bureaucracies, the surveillance state, and the global finance system. Brown’s main opponent is “neoliberalism,” the system built on a combination of market forces and “traditional morality” that she thinks drives the contemporary global order. But she is equally critical of any automated, self-propagating system that makes persons mere instruments of its pursuit of power. Nihilistic attitudes underlie ideologies from capitalist consumerism to social justice progressivism. What they have in common is that they reduce persons to being mere “cogs in economic machines and superficial individualists.” We live to serve these systems and to achieve purely “trivial, immediate, and personal” gratification. We tend to lack any sense of anything larger than our private selves, which we could love and esteem for its own sake, rather than for what we get out of it. 

In our fraught political environment, I often find myself wondering who is really allied with whom, as traditional partisan and ideological alliances are increasingly shaken up. On many political and moral issues, Brown and I are vastly divided. But her book nevertheless supported an intuition of mine: one fundamental divide in our political culture, which cuts across traditional left-right or religious-secular divides, is between two attitudes towards persons. 

A bit later, he brings in the go-to guy for pointing us toward a way out of our twilight moment:

 The thinker who best saw that values must be discovered, not invented, is Plato. In my view, we cannot overcome nihilism without rediscovering the truths that Plato saw. Both Weber and Brown are haunted by Plato, referring to him at key points in their arguments. Brown invokes Plato’s Socrates as an ideal for the political and academic vocations: he combines selfless charisma with a vocation to educate his students’ desires. But she fails to notice that Socrates has these characteristics because he has seen objective beauty, goodness, justice, and other values. He inculcates desire for these given values in his students. Weber invokes Plato’s idea of the experience of inspiration as the source of the two vocations. But he fails to note how the truly good politician doesn’t just have a feeling of being inspired or called, but sees the given value of political flourishing and seeks to bring it about. The one really called to be an academic perceives the given value of knowledge and responds by seeking it out. 

Brown, like Plato, points us toward a personalist vision of being called to respond to higher values, but she defects from this vision, refusing to submit to given values and instead seeking to invent them. She is right to caution us about how a religious vision of reality, like Plato’s, can succumb to cynical nihilism. But ultimately, if we want to avoid nihilism, there is no other route available than a “religious” view, in Plato’s and O’Connor’s sense of a life that presupposes that there is “something given and accepted before it is experienced.” One can be a personalist and live in such a way that one moves upward to higher values, or one can be a nihilist, stuck in one’s will to power and self-gratification, and in the self-perpetuating systems such attitudes engender. There are no other options.

Daniel Buck, writing at the Fordham Institute's website, brings to our attention a governor who is actually doing constructive things on the education front:

Not since former Governor Scott Walker bludgeoned the unions in my home state of Wisconsin has there been such national outrage over state-level education policies. Historically, state-scale education has been a secondary affair, rarely topping the list of people’s substantive or political priorities, and most decisions have been left to local decision-making. There, too, school board meetings had as much political intrigue as a local knitting group. Not so in recent years.

California recently adopted a contentious new mathematics framework that emphasizes a glorified choose-your-own-adventure approach to instruction. It has earned bipartisan opprobrium, including parental petitions and open letters with hundreds of scholarly signatories.

Far more controversially, Florida governor (and presidential contender) Ron DeSantis almost weekly kicks an education hornet’s nest—most recently with a revision of his state’s history curriculum that includes a line about the “personal benefit” some slaves drew from the “peculiar institution.” And while I’m sympathetic to previous DeSantis policies that banned the instruction of divisive concepts, they’re misdirected, too. Bans will accomplish little unless a robust curriculum takes its place.

Consider, instead, Virginia Governor Glenn Youngkin, who has achieved some productive, bipartisan education wins that could provide guidance for other conservative governors, as well as real victories for American students.

Most notably, his state board of education voted in spring to approve new K–12 history standards. While the rest of the country is clutched in a mutual chokehold argument over how to frame American history, Youngkin’s administration updated the standards of the Old Dominion in a way that balances competing pressures. Advancing neither a blinkered idealism about the nation’s past, nor unrelenting criticism of it, the standards open with a commitment that “students will know our nation’s exceptional strengths, including individual innovation, moral character, ingenuity and adventure, while learning from terrible periods and actions in direct conflict with these ideals.”

After that, the new standards detail both the specific content and skills that students ought to know, clearly listing historical figures, from Frederick Douglass to Teddy Roosevelt, and specific events, such as the War of 1812 and the Louisiana Purchase. Over sixty-one pages, it details clear goals for every student—everything from identifying the key components of the Declaration of Independence to the most important events and leaders of the Cold War, including the Bay of Pigs, President John F. Kennedy, Nikita Khrushchev, and much more. Any student graduating from Virginia's system with this knowledge would leave with a robust understanding of American history.

A comparison to flawed standards helps to explicate the strengths of this one. In place of concrete knowledge, Wisconsin lists wishy-washy skills and aptitudes. In place of events and figures, it names meaningless goals in inscrutable language: “[S]tudents will analyze, recognize, and evaluate patterns of continuity and change over time and contextualization of historical events.” That provides as much guidance to a teacher as map-less driving directions spoken in gibberish. Where it tries to identify concrete knowledge, it spends one page suggesting ambiguous concepts like “the modern era” or “the meeting of peoples and cultures” without any specifics or timing.

I've been getting after it over at Precipice. Here are my most recent posts:

Thoughts on Principles

The Price of Our Frivolity

Defining Human Flourishing

Narcissism, Attitude, and the Smoldering Rubble of Post-America

Place

More Thoughts on Seriousness

Happy reading!