Saturday, August 5, 2023

Saturday roundup

Gonna excerpt generously from this one. "Ruler Over All: Notes Toward the Restoration of a Christian Culture" by Ken Myers at Touchstone is full of important insights that deserve contemplation and discussion.

And I'll try not to - well, basically reprint the whole thing, but the way he unfolds his overall point is impressive.

He begins thusly:

One hundred years ago, in September 1923, the Hogarth Press published the first English book edition of T. S. Eliot’s 434-line poem, The Waste Land. The type was set by hand by Eliot’s friend Virginia Woolf, who with her husband Leonard had founded the small publishing venture. The previous autumn, The Waste Land had appeared in the inaugural issue of Eliot’s own journal, The Criterion, and then in the U.S. in the prestigious literary magazine, The Dial.

The Waste Land has been judged by many to be the most influential English-language poem of the twentieth century. Often analyzed as a depiction of the turmoil and fragmentation of Eliot’s own inner life, its continued power after a century is surely because of its account of public—not just private—dislocation. When the poem first appeared in 1922, the second volume of Oswald Spengler’s The Decline of the West had just been published. Spengler predicted the twenty-first-century collapse of Western civilization following decades of decay and concomitant tyranny.

Eliot’s poem was published in a time haunted by a sense of global chaos unleashed by the destruction of the First World War and the social and political uncertainties that were both its causes and effects. Writing in the 1950s, the French diplomat and critic Georges Cattaui described Eliot’s expression of “a shipwrecked world” and “a longing for order.” In biographer Alzina Stone Dale’s judgment, “The Waste Land portrays failed civilization, or St. Augustine’s ‘earthly city’ doomed by its sterility and loss of spiritual power.”

Social and literary critic Russell Kirk engaged Eliot’s ideas about society and spirituality in depth in his 1971 book, Eliot and His Age: T. S. Eliot’s Moral Imagination in the Twentieth Century. Commenting on the state of the West in the decade preceding The Waste Land, Kirk noted:

This decay of order and justice and freedom within the old European community was paralleled by the decadence of the old moral order, the Church falling into disrepute and the governing motive of many eminent men being merely ‘put money in thy purse.’ For the charlatan and the cheat, large opportunities were opened everywhere; while the old motives to integrity were fearfully shaken. Out of the War’s brutality had emerged gross appetites and violent ambitions, and everywhere egoism swaggered.

In one of the drafts of the poem, Eliot affixed a blunt epigraph from Conrad’s Heart of Darkness: “The Horror, the Horror.” Decades later, when he received the Nobel Prize for Literature in 1948, the presentation address cited The Waste Land’s “melancholy and sombre rhapsody [which] aims at describing the aridity and impotence of modern civilization.”

Eliot himself commented, not long after The Waste Land’s publication, that

the present situation is radically different from any in which poetry has been produced in the past: namely, that now there is nothing in which to believe, that Belief itself is dead; and that therefore my poem is the first to respond properly to the modern situation and not call upon Make-Believe.

We can be grateful that, five years after writing The Waste Land, Eliot converted from the austere Unitarianism of his New England ancestors to Christianity, specifically the faith as practiced in the Anglo-Catholic tradition of the Church of England. With his newly acquired recognition of Christ as the still point of the turning world, he would go on to write some of the most profound Christian poetry of the twentieth century, including Four Quartets, written between 1936 and 1942 and published in book form in 1943, about the time Eliot began writing the chapters in the essay that would become Notes Towards the Definition of Culture.

He then brings Russell Kirk into the mix, via Richard Nixon:

President Richard Nixon once asked Russell Kirk to recommend one important book that he should read; Kirk named Eliot’s Notes Towards the Definition of Culture. In Kirk’s judgment, “Eliot might well have set upon his title page a sentence that James Fitzjames Stephen had written in 1873: ‘The waters are out and no human force can turn them back, but I do not see why, as we go with the stream, we need sing Hallelujah to the river god.’” Eliot’s title page did include a quotation, but in the spirit of his book’s title, it was from the Oxford English Dictionary: “definition: 1. The setting of bounds; limitation (rare)—1483.”

A little later, he brings Philip Rieff into the proceedings:

In 1966, sociologist Philip Rieff presented in The Triumph of the Therapeutic both a description of how a culture typically functions—orienting the moral compasses of its members—and a diagnosis of how Western societies were going down a path toward what he called an anti-culture. For millennia, human beings within traditional cultures were given guidance about how to live, guidance that served like roadmaps and guardrails. The work of a culture was to sustain an “inherited organization of permissions and restraints.” A culture was a moral legacy, a received and communally shared way of life, not just a set of options for personalized lifestyles, to be adopted or discarded at whim.

But Rieff saw that modern societies were increasingly committed to abandoning the presentation of authoritative guidance, preferring that social institutions embrace the role of expanding the possibilities for free expression. During the 1960s, the structures of societies were being radically reordered to encourage and accommodate the radical individualism and subjectivity that defines the modern project.

“A culture survives principally,” Rieff wrote,

by the power of its institutions to bind and loose men in the conduct of their affairs with reasons which sink so deep into the self that they become commonly and implicitly understood—with that understanding of which explicit belief and precise knowledge of externals would show outwardly like the tip of an iceberg.

A culture must, he argued, 

communicate ideals, setting as internalities those distinctions between right actions and wrong that unite men and permit them the fundamental pleasure of agreement. Culture is another name for a design of motives directing the self outward, toward those communal purposes in which alone the self can be realized and satisfied.

Rieff affirmed the ancient (classical and Christian) belief that human persons can only be fully themselves within a society in which there are—in St. Augustine’s phrase—common objects of love.

But he saw how, in the first half of the twentieth century, there was a “reorganization of those dialectical expressions of Yes and No the interplay of which constitutes culture.” He saw that the rapid and radically changing standards of moral life in the twentieth century were creating conditions whereby “all communications of ideals [would] come under permanent and easy suspicion. The question is no longer as Dostoevski put it: ‘Can civilized men believe?’, but rather: Can unbelieving men be civilized?”

We moderns believe, he continued, “that we know something our predecessors did not: that we can live freely at last, enjoying all our senses—except that of the past—as unremembering, honest, and friendly barbarians all, in a technological Eden.” And so, “a new and dynamic acceptance of disorder, in love with life and destructive of it, has been loosed upon the world.”

A bit further into his train of thought, Myers insists that we have to get back to basic questions about what particular things are - in their essences - in order to have any way of evaluating them: 

D. C. Schindler has pointed out that not only in public discourse but even in private conversation, we avoid asking fundamental ontological questions, basic questions that begin with the words, “What is . . .”:

At the highest intellectual level, we will discuss economic conditions, for example, and the focus will be on how to improve them, how to stimulate growth, how to make possible a more equitable distribution of wealth, and so forth. But we do not ask what an economy, after all, is, or what wealth is. We discuss education, its cost, its availability, its effectiveness, and so forth. But we do not ask: What is education? We discuss foreign policy, the question of immigration, of the just use of force, but we do not ask: What is a nation? What is a citizen? What does it mean to belong to a political community? What is justice? (Love and the Postmodern Predicament: Rediscovering the Real in Beauty, Goodness, and Truth, 2018, p. 27)

As a society, we not only do not agree about such things; we assume that achieving agreement on such things is unnecessary and undesirable. What “pluralism” means for us is that no one should ever feel obliged to change his mind about questions of value, since all judgments about value are, after all, not rational, but merely expressions of personal preference.

And so, most public speech about contested matters seems designed to denigrate and humiliate one’s opponents, not to persuade them. We no longer even seek to come to agreement about the principles that matter most. We assume that all values are personal and subjective, which means that we can’t talk about them publicly and rationally and hope to change anyone’s mind.

We further assume that our public institutions can nevertheless be run like value-free machines, serving everyone in the nation with mechanistic indifference to questions of what things really are. We assume that institutions of government and finance and education and journalism and manufacturing and healthcare can all be run without any reference to higher purposes or ultimate good, and still perform to everyone’s satisfaction, if only we could get clever and disinterested people running things. At least, this is what we profess to assume, as good modern heirs of Enlightenment liberalism. This is what our public institutions encourage us—and sometimes coerce us—to profess. 

Confused About Reality

This vision of pluralism and diversity, and its commitment to public institutions that somehow maintain absolute neutrality, is deeply mistaken. First of all, it ignores how substantive ideas about what is good or true are frequently—necessarily—smuggled into allegedly neutral policies and procedures.

Second, it misrepresents some fundamental facts about human nature. We are creatures made to pursue what is really good and really true, not made to invent our own personalized account of reality. Modern men and women are deeply confused about what things really are, not just about what is true and what is good, but about what is real, a confusion that is sustained by many of our cultural institutions. G. K. Chesterton once quipped that modern man has not only lost his way; he’s forgotten his address. Modern culture and its institutions sustain a view of the kinds of creatures we are and the kind of world we live in that is more misleading, more fundamentally false than the view of St. Paul’s contemporaries. Our contemporaries have a false understanding of the very nature of reality, not just a false set of assumptions about how they should live. The challenges we face have to do with metaphysics, not just morality.


Okay, hopefully that's enough of a taste to get you over there to read the whole thing.

In her essay "Filling Time Filling Minds" at Front Porch Republic, Nadya Williams cautions us against endlessly entertaining ourselves:

. . . I haven’t seen a movie in about a year. For several years now, in fact, I have been averaging about one movie per year, usually choosing materials I considered essential for teaching. Now that I have quit academia, I may never need to see a film again, which is a rather appealing thought. Of course, to be fair, I do spend a good two or three hours a day on my computer—writing, editing, responding to emails, and checking social media. My life is assuredly not screen-free. But it is free of on-screen entertainment, and the same largely applies to my children, who only get to watch a film once every few weeks as a special treat. Although, apparently, in the meanwhile, they get to enjoy the best that the rear-view cam in the van has to offer.

Second, relationships matter. Dan and I try to prioritize the things we can do for and with others, whether in the family or outside the home, over solitary activities. We were made to live in community, in relationship with family, friends, neighbors. One of the problems of a passive entertainment activity, like Netflix, is that it takes us away from interactions with others. Watching something can be done in company with others, but it will never be the same kind of togetherness as we have when we sit down for dinner with family and friends, play a board game, read a book out loud for each other, or go for a walk as a family. 

My point, ultimately, is not to wage a war on Netflix. I am merely using the most famous and most popular of the American streaming services as a representative of the larger problem of the desire to disengage. The idea of relaxing and enjoying one’s well-earned rest has biblical undertones—the Sabbath is a key theme; even God rested. But the Sabbath was never meant to be a time of selfish retirement from all relationships, nor was it made to be spent on diversion or amusement. At the same time, however, I want to be clear that I am not advocating here for the industrial-era idol of squeezing the utmost out of every minute. Rather, I want us to think about who we become, and (for parents) who we cause our children to become, through how we use our time.

In "Augustine and the Order of Love: Debunking a Dumb Christian Nationalist Argument," Jake Meador, writing at Mere Orthodoxy, looks at the question of how we are to love everybody in a world where we personally know, or even have some kind of categorical relation to, only a small percentage of the entire human species.

Good book reviews make attention-deserving points of their own. Such is the case with Mark K. Spencer's review of Wendy Brown's Nihilistic Times at Law & Liberty:

“If you live today, you breathe in nihilism,” wrote Flannery O’Connor in 1955. “In or out of the Church, it’s the gas you breathe.” What makes Wendy Brown’s Nihilistic Times worth reading is that she convincingly reminds us of this situation. The attitude that nothing has essential meaning or value is pervasive in our times, always in the background of other lines of thought. “Values” make things important, worthy of care and esteem, in various ways; justice, beauty, holiness, and pleasure are examples of values. Right and left, religious and secular alike speak the language of promoting and defending values. And yet, an underlying malaise of nihilism pervades our world. While most people fluently speak the language of values, there is also a ubiquitous suspicion or fear that all values have been reduced to mere instruments, used by individuals, corporations, and governments to achieve and maintain power and personal gratification.

Brown, a political theorist of the anti-liberal left, is, of course, not the first to make these claims; they’ve been made at least since the nineteenth century. But Brown’s book—a reworking of her 2019 Tanner Lectures on Human Values at Yale University—is a timely reminder of the nihilistic air we breathe. It’s easy to lose sight of this situation, especially if we’re caught up in defending some particular worldview or policy proposal. A well-crafted reminder of fundamental features of the contemporary human condition is always beneficial. While Brown frequently signals her adherence to left-wing orthodoxy on hot-button issues, she also repeatedly shows how all sides in current political and cultural debates have imbibed a nihilistic attitude. Politicians increasingly manipulate information to sway voters, “woke capitalists” use progressive values to sell products, and students are told to treat themselves as “human capital,” who should “invest” in their own future through education, reducing themselves to mere instruments for their own gratification. 

In our society’s quest to maximize utility and power, we have created systems that seem to act automatically for the perpetuation of their own power, without any real consideration of what is of fundamental importance in human life. Such systems include artificial intelligence, government and corporate bureaucracies, the surveillance state, and the global finance system. Brown’s main opponent is “neoliberalism,” the system built on a combination of market forces and “traditional morality” that she thinks drives the contemporary global order. But she is equally critical of any automated, self-propagating system that makes persons mere instruments of its pursuit of power. Nihilistic attitudes underlie ideologies from capitalist consumerism to social justice progressivism. What they have in common is that they reduce persons to being mere “cogs in economic machines and superficial individualists.” We live to serve these systems and to achieve purely “trivial, immediate, and personal” gratification. We tend to lack any sense of anything larger than our private selves, which we could love and esteem for its own sake, rather than for what we get out of it. 

In our fraught political environment, I often find myself wondering who is really allied with whom, as traditional partisan and ideological alliances are increasingly shaken up. On many political and moral issues, Brown and I are vastly divided. But her book nevertheless supported an intuition of mine: one fundamental divide in our political culture, which cuts across traditional left-right or religious-secular divides, is between two attitudes towards persons. 

A bit later, he brings in the go-to guy for pointing us toward a way out of our twilight moment:

 The thinker who best saw that values must be discovered, not invented, is Plato. In my view, we cannot overcome nihilism without rediscovering the truths that Plato saw. Both Weber and Brown are haunted by Plato, referring to him at key points in their arguments. Brown invokes Plato’s Socrates as an ideal for the political and academic vocations: he combines selfless charisma with a vocation to educate his students’ desires. But she fails to notice that Socrates has these characteristics because he has seen objective beauty, goodness, justice, and other values. He inculcates desire for these given values in his students. Weber invokes Plato’s idea of the experience of inspiration as the source of the two vocations. But he fails to note how the truly good politician doesn’t just have a feeling of being inspired or called, but sees the given value of political flourishing and seeks to bring it about. The one really called to be an academic perceives the given value of knowledge and responds by seeking it out. 

Brown, like Plato, points us toward a personalist vision of being called to respond to higher values, but she defects from this vision, refusing to submit to given values and instead seeking to invent them. She is right to caution us about how a religious vision of reality, like Plato’s, can succumb to cynical nihilism. But ultimately, if we want to avoid nihilism, there is no other route available than a “religious” view, in Plato’s and O’Connor’s sense of a life that presupposes that there is “something given and accepted before it is experienced.” One can be a personalist and live in such a way that one moves upward to higher values, or one can be a nihilist, stuck in one’s will to power and self-gratification, and in the self-perpetuating systems such attitudes engender. There are no other options.

Daniel Buck, writing at the Fordham Institute's website, brings to our attention a governor who is actually doing constructive things on the education front:

Not since former Governor Scott Walker bludgeoned the unions in my home state of Wisconsin has there been such national outrage over state-level education policies. Historically, state-scale education has been a secondary affair, rarely topping the list of people’s substantive or political priorities, and most decisions have been left to local decision-making. There, too, school board meetings had as much political intrigue as a local knitting group. Not so in recent years.

California recently adopted a contentious new mathematics framework that emphasizes a glorified choose-your-own-adventure approach to instruction. It has earned bipartisan opprobrium, including parental petitions and open letters with hundreds of scholarly signatories.

Far more controversially, Florida governor (and presidential contender) Ron DeSantis almost weekly kicks an education hornet’s nest—most recently with a revision of his state’s history curriculum that includes a line about the “personal benefit” some slaves drew from the “peculiar institution.” And while I’m sympathetic to previous DeSantis policies that banned the instruction of divisive concepts, they’re misdirected, too. Bans will accomplish little unless a robust curriculum takes its place.

Consider, instead, Virginia Governor Glenn Youngkin, who has achieved some productive, bipartisan education wins that could provide guidance for other conservative governors, as well as real victories for American students.

Most notably, his state board of education voted in spring to approve new K–12 history standards. While the rest of the country is clutched in a mutual chokehold argument over how to frame American history, Youngkin’s administration updated the standards of the Old Dominion in a way that balances competing pressures. Advancing neither a blinkered idealism about the nation’s past, nor unrelenting criticism of it, the standards open with a commitment that “students will know our nation’s exceptional strengths, including individual innovation, moral character, ingenuity and adventure, while learning from terrible periods and actions in direct conflict with these ideals.”

After that, the new standards detail both the specific content and skills that students ought to know, clearly listing historical figures, from Frederick Douglass to Teddy Roosevelt, and specific events, such as the War of 1812 and the Louisiana Purchase. Over sixty-one pages, it details clear goals for every student—everything from identifying the key components of the Declaration of Independence to the most important events and leaders of the Cold War, including the Bay of Pigs, President John F. Kennedy, Nikita Khrushchev, and much more. Any student graduating from Virginia's system with this knowledge would leave with a robust understanding of American history.

A comparison to flawed standards helps to explicate the strengths of this one. In place of concrete knowledge, Wisconsin lists wishy-washy skills and aptitudes. In place of events and figures, it names meaningless goals in inscrutable language: “[S]tudents will analyze, recognize, and evaluate patterns of continuity and change over time and contextualization of historical events.” That provides as much guidance to a teacher as map-less driving directions spoken in gibberish. Where it tries to identify concrete knowledge, it spends one page suggesting ambiguous concepts like “the modern era” or “the meeting of peoples and cultures” without any specifics or timing.

I've been getting after it over at Precipice. Here are my most recent posts:

Thoughts on Principles

The Price of Our Frivolity

Defining Human Flourishing

Narcissism, Attitude, and the Smoldering Rubble of Post-America

Place

More Thoughts on Seriousness

Happy reading!