MANY CONSERVATIVE cultural critics look back to the 1950s as America’s Golden Age, the last decade of cultural consensus before Selma, the pill, Vietnam, and other uglinesses that, in the 1960s, set America on a path to cultural decline. This tendency is especially pronounced among Catholic conservatives such as George Weigel, Ross Douthat, and Robert George, for whom the 1950s was a time of Catholic advancement as well as conservative cultural hegemony. Their arguments tend to read like a more sophisticated elaboration of the Archie Bunker theme song, “Those were the days.”
George Marsden has always been more scholar than culture warrior — a man with a perspective, not an agenda; critical, engaged, and thoughtful. His analysis of the 1950s, elaborated in his new book, The Twilight of the American Enlightenment: The 1950s and the Crisis of Liberal Belief, reads more like an account of kayakers who suddenly find themselves in heavy rapids, the water moving ever faster. They begin to wonder why they chose this trip, why this river. All the while, the roar of an approaching waterfall grows louder and louder.
In addition to the anxieties provoked by the Cold War and the Bomb, Marsden’s book focuses on a different source of cultural agita. “For the most thoughtful observers of American life of the time, the most basic question was whether this civilization could be saved from itself,” he writes. The artists who created, and the audiences that devoured, the most prominent cultural icons of the age — Willy Loman and Holden Caulfield, The Man in the Gray Flannel Suit and Rebel Without a Cause — would be surprised to see the 1950s presented as a quiet time of cross-cultural and interreligious consensus. As Marsden convincingly demonstrates, in the 1950s there was a deep concern about the effects of materialism and mass consumer culture on the moral fiber of the nation, a rigorous, multifaceted debate about “national purpose” and “national character,” and profound changes in the relationship of religion to culture and politics. These discussions were carried on in the pages of middle-brow magazines like Time and Life as much as in the academy.
In 1953, Dwight Macdonald, a prominent New York intellectual, wrote an essay decrying popular culture, which he considered a new opiate for the people. Macdonald thought that mass culture was, in Marsden’s words, “inevitably degrading” because, “unlike folk art, it did not arise from the people but was manufactured and distributed from the top down.” Macdonald was not alone. In 1959, John Steinbeck wrote to his friend Adlai Stevenson, “if I wanted to destroy a nation, I would give it too much and I would have it on its knees, miserable, greedy, and sick.” Many intellectuals worried that the degradation of culture was a sure road to totalitarianism.
Hannah Arendt did not share this totalitarian fear, but she did believe that “societies needed intellectual, artistic, and literary leadership.” Marsden quotes her approvingly:
Consumerism had become a nearly all-controlling force in the modern era. Everything had come to be valued in terms of its function. What was being lost was the ability to love the world for its own sake, to value art simply for its beauty. “We can say without exaggeration,” Arendt wrote, “that a society obsessed with consumption cannot at the same time be cultured or produce a culture.”
David Manning White did not share these nearly apocalyptic diagnoses. Writing in The Saturday Review, he wondered what those who warned about the cultural decline manifested by television foolishness would make of entertainments in earlier centuries, like bear-baiting. White celebrated the fact that paperbacks made classics available to the masses. As for the threat of totalitarianism, he noted that Germany possessed as much high culture as a nation could circa 1932. Others, such as Bernard Rosenberg, of the journal Dissent, celebrated the fact that “manual labor is becoming obsolete” and millions did not have to work merely to survive. Yet his perceptions were not all rose-colored. While the “precondition for transfiguring Homo sapiens into a higher species begins to exist,” one should perhaps be afraid that “before man can transcend himself he is being dehumanized…. Freedom is being placed before him and snatched away. The rich and varied life he might lead is standardized.”
Amid all the anxiety, “freedom” was, according to Marsden, “a word one could use without explanation or argument” and all Americans invoked it as their shared ideal, the essence of our national character and purpose. He writes:
What is fascinating and revealing is how easily talk about the unassailable ideal of “freedom” in a political sense blended into an ideal of personal attitudes of independence from social authorities and restraints. A key word that was often used to express this taken-for-granted ideal was “autonomy” […] The opposite of autonomy was “conformity.” Everyone, it seemed, agreed that one should not be a conformist.
Here is the crux of the matter: autonomy versus authority. Except that in an age of mass consumer culture, autonomy and its kissin’ cousin, “authenticity,” are not as easy to come by as most think. I read this passage in Marsden’s book and recalled interviewing a bartender who pointed to her many tattoos and piercings and leather accessories as evidence of her non-conformist credentials. Her individuality was expressed entirely by means of things she had purchased.
The other authority of the time was science, and no science was more popular in this age of emerging mass culture than psychology. B.F. Skinner’s stimulus-response theories, premised on the belief that the human person develops in response to external controls, and Carl Rogers’s theory of an innate human power to achieve self-actualization, an internal control mechanism, were in obvious conflict, but they were oddly melded by Dr. Spock, whose child-rearing book became a best-seller while vacillating between the Skinnerian external and Rogerian internal approaches. Parents were told to reinforce desired behavior, but also to trust their own instincts.
Science then, as now, was often given a role beyond its own epistemic borders and, in a kind of drag, became “scientism”: a belief system, not a method of naturalistic inquiry. As Leon Wieseltier would observe in the 1990s, “there is not a chart in the world that can explain the significance of charts in the world,” but few in the 1950s perceived the difficulty. This was an age when scientific expertise served as a trump card in intellectual argument. The confidence in humankind’s ability to discern scientific solutions to its age-old problems was higher than ever.
Marsden then pivots his analysis of the 1950s to examine what he terms “the latter days of the Protestant Establishment.” There is no denying that the decade was a high-water mark for non-denominational, mainstream-Protestant-inflected civil religion. In 1954, the words “under God” were added to the Pledge of Allegiance. President Eisenhower famously observed that “our form of government has no sense unless it is founded in a deeply felt religious faith, and I don’t care what it is.” Church attendance rates in the United States hit an all-time high. But the seeds of decline were evident. Will Herberg’s Protestant-Catholic-Jew, published in 1955, applauded the acceptance of Jews and Catholics into the American mainstream, “but he believed [such acceptance] was at the price of effectively subordinating their traditional religious beliefs and practices to the operative religion of most Americans, ‘the American way of life.’ ” Religion became a vehicle for non-religious values as Americanism and religiosity became fused.
Reinhold Niebuhr, arguably the most influential intellectual of the times, confronted the conflation of religion and patriotism in his 1952 book The Irony of American History. At the height of McCarthyism, Niebuhr pointed out that there were many similarities between Soviet communism and American capitalism, arguing that while religious Americans condemned the Soviets for their “materialism,” we are “rather more successful practitioners of materialism as a working creed than the communists.” Niebuhr’s keen sense of the power of original sin made him suspicious of the dominant cultural belief that man was the creator of his own destiny. More than any other commentator, Niebuhr focused on the paradox that the United States was “at once the most religious and the most secular of Western nations.”
Marsden correctly notes that religion had become “privatized” for most Americans. “In the religiously diverse United States,” he says, “it has typically been considered fine to practice a specific religious faith as a private option, but one’s faith is not supposed to intrude in any substantive way into the spheres of one’s public activity.” There were exceptions: abolitionism, the temperance movement, the Social Gospel. Yet these exceptions proved the rule. The privatization of religion was obscured in the 1950s by the veneer of civic religion and, in the 1960s, the civil rights movement, led by Dr. Martin Luther King Jr., would serve as a notable exception to this privatized religion. King, however, tapped into the emerging liberal value of equality and, just so, found allies in the secular world he would not have acquired if his campaign had been focused on, say, a revival of chastity or temperance.
Marsden is right to recognize that a privatized religion would prove incapable of withstanding the whirlwinds of cultural change in the 1960s. But there was another problem, which he does not identify: the reduction of religion to ethics. In an effort to keep religious peace in a pluralistic society, U.S. Christians not only privatized their faith, but when they did enter the public square it was in the role of an ethical authority. This reduction of religion to ethics has its roots in the Reformation, as ably demonstrated by Marsden’s colleague at Notre Dame, the historian Brad Gregory, in his influential The Unintended Reformation: How a Religious Revolution Secularized Society. Divorced from a doctrinally based worldview, the ethical teachings of the Christian church were little match for the moneymaking permissiveness of the counterculture.
Combined, the reduction of religion to ethics and the privatization of religion made religion simultaneously synonymous with American life and practically irrelevant. Americans liked the trappings of religiosity but were loath to permit religion to affect their economic and commercial life. This had been a sufficiently dominant cultural force to gain the attention of Tocqueville, and by the 1950s it was utterly comprehensive. The business of America was business. In the name of technological efficiency, all manner of social norms were cast aside. Capitalism was, and is, as Commonweal’s Matthew Boudway recently pegged it, “the great disruptor.” In the 1950s, Americans may have gone to church in record numbers on Sundays, but there was no protest as Christmas was turned from a religious holiday about God’s graciously sending His Son as a savior into an opportunity to educate youngsters in acquisitiveness and greed. When the cultural turmoil of the late 1960s finally exposed the emptiness of the civic religion, less was lost than was imagined.
That did not prevent conservative evangelical Christians from representing the 1950s as a time of grace and the 1960s as the embodiment of perdition. These conservative Christians adopted a restorationist program in which religion would be anything but privatized. They refashioned American history into a providential tale, with Christianity setting the tone and aspirations for the nation from its founding. This took a great deal of license with the historical record, and with the rise of the Moral Majority and the Christian Coalition, it yielded an agenda that was as political as it was religious.
The culture wars, then, were on. Religion was in the public square to be sure, but with the intellectual limitations of fundamentalism, the still-dominant reduction of religion to ethics, and a championing, rather than a challenging, of capitalism, there was little chance the religious right could build a national consensus. Additionally, the long-standing secular habits of Americans took on a more aggressive, cohesive intellectual force after the 1950s, and, combined with the explicit exclusion of religious symbols and teachings from the public square, led many Americans to see religion as essentially divisive. In the face of what was perceived as a secularist onslaught, the evangelicals found allies among conservative Catholics, and today religion has become almost as balkanized and fractious as politics.
Marsden’s conclusion, no doubt reflecting his years teaching at Notre Dame more than his evangelical roots, amounts to a “Hail Mary” pass. He thinks the answer, or at least the beginning of an answer, to the problem of religious pluralism can be found in the writings and life of the great Dutch philosopher and politician Abraham Kuyper (1837–1920). Kuyper argued for a “confessional pluralism,” in which the possibility of consensus on first principles was rejected as a goal and, instead, a vibrant society could be achieved by protecting and nourishing mediating institutions and subcultures which could come together for common purposes at times, while maintaining their integrity and differences.
Kuyper’s views are similar to those of his contemporary, Pope Leo XIII, on subsidiarity: the notion that societal tasks should be addressed at the lowest level of societal organization, with the state entering the fray when those lower levels fail to achieve a necessary societal objective. Subsidiarity was invoked by certain conservative Catholics like Congressman Paul Ryan in their opposition to the Affordable Care Act and other federal government initiatives, but they neglected to recognize that subsidiarity does not prohibit state action, especially when a basic social right such as access to health care is not achieved by private markets or local levels of government. Subsidiarity is a two-way street. Conservatives also overlooked the actual meaning of the word, based on the Latin root subsidium (help). There is nothing in the social doctrine of the Catholic Church that privileges competition among interests as a means to achieve desired goals, as Americans, especially conservative politicians, do. Indeed, part of the rationale for Catholic social doctrine was to protect both the individual and society from over-powerful market forces.
Marsden correctly notes another key difference between the intellectuals of the 1950s and the Kuyperian vision. He writes:
[I]n the dominant midcentury American liberal-moderate view of building towards consensus, scientific outlooks were often presented as ideologically neutral […] In the Kuyperian approach to pluralism, there is no conceding that modern scientific methods are objective so far as they go and hence could serve as neutral ways to view religious faiths. Rather, the outlook recognizes from the outset that the modern world is divided by fundamental differences in underlying faiths and commitments, some of which have nontheistic naturalism as their starting points and some of which have various forms of theism and openness to the supernatural as their starting points.
As noted above, this recognition of pluralism in regard to starting points is quite different from the desire for a scientifically informed, religiously inflected, cultural consensus, even if the religious inflections were superficial. There is something deeper at work here, an understanding that worldviews shape the way discrete facts are recognized and appreciated, and that any pluralism worth the name must take account of religious worldviews and not relegate them to the margins of society.
Still, the secularizing market forces march on. It is a remarkable fact that religious leaders in the United States very rarely entertain the conversations about the dangers of materialism that animated so much public discussion in the 1950s. Remarkable, too, is the fact that as secularists embraced equality as an overarching goal in the past 40 years, they tended to focus on social equality for blacks, women, and gays, and only recently on income inequalities. Perhaps Pope Francis, with his astonishingly blunt criticisms of modern capitalism, will raise questions that American religious leaders have been reluctant to ask, let alone answer. Perhaps the research of scholars such as Robert Putnam on the importance of social capital will gain renewed interest for Kuyper’s alternative vision.
Marsden’s brilliant little book concludes that there is no going back, no way to re-constitute a perceived cultural consensus based on an admittedly superficial mixture of civic, mostly mainstream Protestant religion and an Enlightenment commitment to universal reason. That consensus did not survive the mid-20th century and holds little promise for the 21st. Unless Americans find better ways to navigate their differences than those offered by the culture wars of the past 40 years, we will be little able to confront the large challenges the nation faces. Marsden has not given us a map, really, but he has pointed us in some potentially fruitful directions.