It is on this basis that Du Bois was able to defend African Americans from the accusations of the scientific racists of his day even while accepting certain aspects of that science—such as race essentialism—that most scientists today would reject. In so doing, Du Bois raises important questions regarding the relationship between the sciences and the humanities. While Du Bois was able to make an argument against racism through his scientific approach to humane disciplines like history and philosophy, the arguments he formulated rebutted ideas that were accepted as scientific fact in his day. By framing his life as a refutation of the scientific racism popular in the United States at the turn of the twentieth century and by applying scientific methods to the humane arts in an attempt to rebut racial pseudoscience, Du Bois helped to define the relationship between the humanities and the sciences and pointed to a healthy engagement between the two in which each can inform the other. A scientism which ignores the human element severs itself from the experiential facts of the lives it hopes to explain, while a humanism that disvalues scientific ways of knowing is incomplete and likely to result in navel-gazing prognostications with little meaning for the real world. By bringing the two together, Du Bois used his training in the humanities and his knowledge of the sciences as means by which to explain, and to change for the better, the lives of millions of people.
The span of W. E. B. Du Bois’s life runs nearly from the end of slavery in the United States to Martin Luther King, Jr.’s March on Washington. Born in 1868, less than three full years after the passage of the thirteenth amendment outlawing slavery, Du Bois died on the eve of the March on Washington in 1963. At the time he was born, most African Americans were illiterate and lived in rural areas of the South. It was commonly assumed, even by those who had been dedicated abolitionists just a few years before, that these black peasants in the South were naturally inferior to whites and would ultimately be unable to rise from or greatly improve their condition. From an early point in his life, Du Bois resolved to dedicate himself to fighting against the negative assumptions and low expectations attached to African Americans. He intended to do this both by himself becoming so well educated as to act as living evidence against the innate intellectual inferiority of African Americans and by applying his abilities to demonstrating the real reasons behind the low social status of African Americans. In so doing, Du Bois became one of the first African American classicists, a founding figure in the then-incipient science of sociology, and a pioneer in research into African American history.
The idea of race has been a defining feature of American social and cultural life since long before the independence of the United States. As historian Nell Irvin Painter has noted in her masterful history of the origins and development of the idea of a white race, “most racial thought in the United States served to justify slavery,” arising, in part, as an ex post facto justification for the Atlantic Slave Trade and the subjugation of people of African descent. The existence of a supposed cultural and scientific “racial hierarchy” which “placed the darkest-skinned and poorest people—Africans and Australians—at the bottom” became a standard feature in the rhetoric of the justification of slavery in the eighteenth and nineteenth centuries.
Among the features which those at the bottom of this hierarchy were supposed to possess were low intelligence and a natural servility. John C. Calhoun, vice-president of the United States from 1825 to 1832 and a United States senator from South Carolina for much of the period from 1832 to 1850, announced in a speech on the Senate floor that he would not “believe that the Negro was a human being and should be treated as a man” until he could “find a Negro who knew the Greek syntax.” Clearly, his assumption was that he would never find such a Negro. Similarly, in his famous speech proclaiming slavery and racial hierarchy to be the “corner-stone” of the newly formed Confederate government, Alexander H. Stephens, the vice-president of the Confederate States of America, argued that this new government was “the first government ever instituted upon the principles in strict conformity to nature, and the ordination of Providence, in furnishing the materials of human society.” The subordination of inferior races to superior races, he said, was a scientific truth like those discovered by the great scientists of the recent past:
As I have stated, the truth of this principle may be slow in development, as all truths are and ever have been, in the various branches of science. It was so with the principles announced by Galileo; it was so with Adam Smith and his principles of political economy. It was so with Harvey, and his theory of the circulation of the blood. It is stated that not a single one of the medical profession, living at the time of the announcement of the truths made by him, admitted them. Now, they are universally acknowledged. May we not, therefore, look with confidence to the ultimate universal acknowledgment of the truths upon which our system rests?
Just as Galileo had discovered the structure of the solar system, Adam Smith the laws of economics, and Harvey the movements of the heart and the circulation of blood, so the American South, says Stephens, had discovered the laws that properly govern the relations between the races.
Trained scientists sympathetic to Southern slavery were eager to lend their authority to such claims. In 1854, for example, George Gliddon and Josiah Nott published their Types of Mankind: Or, Ethnological Researches, Based upon the Ancient Monuments, Paintings, Sculptures, and Crania of Races, and upon their Natural, Geographical, Philological, and Biblical History, which, among the other supposed evidence it provided, featured a chart of skulls that placed the skull of an African American between those of a European and a chimpanzee. “Gliddon and Nott and others” like them not only insisted that people of African descent were inherently inferior to people of European descent, but went as far as attempting “to prove that the Negro was of a different species from the white man.” As Robert J. C. Young points out, by the middle of the nineteenth century, Southern slaveholders and their supporters “could claim that Southern slavery was a time-honored institution, authorized by history and science alike.”
Even after the end of slavery with the passage of the thirteenth amendment in December of 1865, theories of the inherent intellectual and moral inferiority of people of African descent persisted as a means by which to justify segregation laws and the withholding of opportunity for social and educational advancement from African Americans. The groundwork of the ostensibly scientific and historical research and writing that had been laid in defense of slavery now became a means by which to perpetuate the racial hierarchy of the United States in its new segregationist forms. As Carol M. Taylor writes, at the turn of the twentieth century, there was “virtual unanimity by the leading figures in American social science” as well as among “biologists, psychologists, and sociologists” on the subject of “the inherent and immutable inferiority of the black race.”
 Nell Irvin Painter, The History of White People (New York: W. W. Norton and Company, 2011), 190.
 Ibid., 180.
 Margaret Malamud, African Americans and the Classics: Antiquity, Abolition and Activism (New York: I. B. Tauris, 2016), 10.
 Alexander H. Stephens, “‘Corner Stone’ Speech,” Savannah, Georgia (March 21, 1861), http://teachingamericanhistory.org/library/document/cornerstone-speech/ (accessed March 28, 2017).
 Malamud, 179–181.
 Alexander Crummell, “The Attitude of the American Mind toward the Negro Intellect,” in Destiny and Race: Selected Writings, 1840–1898, ed. Wilson Jeremiah Moses (Amherst: University of Massachusetts Press, 1992), 292.
 Robert J. C. Young, “The Afterlives of Black Athena,” in Daniel Orrells, Gurminder K. Bhambra, and Tessa Roynon, eds., African Athena: New Agendas (Oxford: Oxford University Press, 2011), 182.
 Carol M. Taylor, “W. E. B. Du Bois’s Challenge to Scientific Racism,” Journal of Black Studies 11, no. 4 (June 1981), 449.
Human beings, by their nature, seek to understand themselves and the world around them. Each of us is placed into a world which we neither created nor comprehend. It is as if we have woken up in a dark room with no knowledge of who we are or how we got here. As our eyes gradually adjust to the dark, we glimpse a variety of unknown objects, clues to our origins, the origins of the room, and the task we have been placed there to complete. Before anything else can be done, we must answer the questions: who am I and what am I doing here? Throughout history, many answers to these questions, of varying validity, have been offered.
Today, and indeed since the Enlightenment, one way of answering these questions, the scientific, has come to predominate to the detriment of other ways of answering. While the methods of science have yielded numerous benefits, they have proven incomplete and unsatisfactory when taken alone. The scientific method may be able to measure the speed and quantity of the water pouring over a waterfall, its chemical composition, and its erosive effects, but scientists can say relatively little about its beauty and its evocation of a sense of sublimity in its human observers. This, rather, is the place of the poet and the artist, whose ways of understanding do not contradict those of the scientist but complete and even surpass them. Knowledge is the imposition of human order, through the bodily senses and the faculties of the mind, onto an otherwise apparently disorderly experience of disparate phenomena. Genius, then, is the ability to form connections between what appear to others to be entirely unrelated experiences. With these definitions in mind, the poet is the genius par excellence; he is a creator of cosmos out of chaos through the use of metaphor.
Richard Wilbur is undoubtedly an outstanding modern example of such a genius. For Wilbur, in his poetry, there is nothing that is not both significant and signifying; each experience is both valuable in itself and valuable in its ability to represent or otherwise point beyond itself to something else, entering thereby into the cohesive network of all created (and, perhaps, uncreated) things. With this dual relevance of each thing as his axiom, Wilbur is able to transform the mundane into the infinitely meaningful and thereby imbue the mundane itself with infinite meaning. In “Transit,” Wilbur begins with a chance sighting of “a woman I have never seen before” exiting her townhouse on a city street. He describes her as “so beautiful that she or time must fade,” thereby entering through an otherwise prosaic event into a poetic meditation on beauty and time. In “Love Calls Us to the Things of This World,” Wilbur again exhibits his ability to begin with the banal and end in the eternal. The poem begins as Wilbur sees laundry drying on the line “outside the open window.” He begins immediately to imagine that the drying laundry is “angels,” some of whom “are in bed-sheets, some are in blouses, some are in smocks.” Nearly at the climax of the poem, Wilbur records the cry of his soul: “Oh, let there be nothing on earth but laundry, / Nothing but rosy hands in the rising steam / And clear dances done in the sight of heaven.” In the poetic genius of Wilbur, the daily drudgery of cleaning clothes and sheets has become a celebration of life, a spotting of angelic beings and an affirmation of the inherent goodness of the created world as it stands.
That all of this may be far from the way most people experience the world, with all of its necessities and drudgeries, is precisely an argument in favor of Wilbur’s genius. He has taken up our shared sense impressions and the ideations they produce and reoriented them in an exuberant and original way. The laundry is indeed still laundry and the laundry must be done, but it is also something else; it is fuel for the often forgotten but most essential aspect of man: his eternal soul. Wilbur himself provides the most succinct, and, of course, poetical, description of his genius in his poem “A Wood”:
Given a source of light so far away
That nothing, short or tall, comes very near it,
Would it not take a proper fool to say
That any tree has not the proper spirit?
Air, water, earth and fire are to be blended,
But no one style, I think, is recommended.
Wilbur has here avoided an error reciprocal to scientism. He has not asserted the tyranny of his position but rather acknowledged that if his understanding is correct, if indeed each thing is both significant and signifying, then there must be as many ways of metaphoring, that is, of establishing connections between apparently disparate elements, and thus as many ways of knowing, as there are ways of being human; which is to say, they must be as numerous as human beings themselves.
Just as the Copernican Revolution several centuries earlier had displaced the earth and its inhabitants from the center of the universe, so the Darwinism of the nineteenth century unseated man from the throne he had claimed for himself. With the earth removed from the center of the universe by Copernicus and man removed from the zenith of the created order by Darwin, the old understanding of human beings and their place in the cosmos was overthrown. The task taken up by thinkers of the generation after Darwin was to understand the implications of Darwin’s theory for humanity and to formulate a cohesive philosophy capable of imbuing human life with meaning while taking the new scientific discoveries into account. In the words of historian Ruth C. Crocker, as in European thought, “American intellectual life in the Gilded Age is often viewed primarily in terms of a response to Darwinism.”1
Perhaps the most ubiquitous element of this response was a newfound impetus for the idea of progress. Westerners, particularly Americans, had made the idea of progress a central aspect of their self-understanding since the Enlightenment. In fact, Darwin himself was one of the inheritors of this idea and his theories in large part presuppose and depend upon it. In short, “the idea of evolution gets some of its moral, social, and even cosmic significance from its implication that the general motion in the world of living things, perhaps in the universe, is a progress from lower to higher forms.”2 All of the various Gilded Age responses to Darwin’s ideas, no matter how much they may differ from each other on their particulars, share in this belief in and focus upon progress. In their beliefs about what constituted progress and precisely what man and the cosmos were progressing toward, however, the various responses differed radically from one another.
European responses to Darwinism were often attempts at a synthesis with Hegelianism, another philosophy, very popular and influential throughout Europe, which placed a strong emphasis on the idea of progress. According to historian Richard Tarnas, “metaphysically inclined scientists such as Henri Bergson, Alfred North Whitehead, and Pierre Teilhard de Chardin sought to conjoin the scientific picture of evolution with philosophies similar to Hegel.”3 These philosophies tended to see the process of evolution as oriented toward a divinely-directed goal and a point of unity between God, the cosmos, and man in the future. American responses, however, as well as later European responses, tended in the opposite direction of denying the possibility of formulating any “metaphysical system claiming the existence of a universal order accessible to human awareness” and emphasizing the disunity, and even enmity, between human beings and between all creatures.
The philosophy of pragmatism, the product of the thought of the American philosophers and psychologists William James and John Dewey, which “question[ed] whether there was such a thing as universal truth,” is one example of the former of these two American tendencies.4 According to James, Dewey, and the other pragmatists, ideas and beliefs were similar to the biological components of a species. None were true in an absolute sense, or at least none were discernible as such by biological beings such as humans, but some were “true” in a contingent sense in that they had demonstrated value for the current state of the species. This cast all ideas, as well as the very concept of and search for truth, into question.
Social Darwinism is perhaps the greatest example of the latter type of American response to Darwinism in its emphasis on the competition between individual men as well as between races and social classes. One of the most extreme proponents of a philosophy of pure Social Darwinism was the sociologist William Graham Sumner. Sumner spent a large portion of his career defending the thesis that social policy should adhere to the concept of survival of the fittest. To this end, Sumner attacked any program which attempted to aid the poor through charity or to redistribute wealth as contrary to nature and detrimental to the future of humanity. He believed that “feeding the hungry and unemployed” impeded the progress of human evolution and that “unfit people” should be allowed “to die, or at least not reproduce.”5 Although Sumner was one of the most outspoken and extreme advocates of Social Darwinism, the philosophy itself was popular throughout the American elite and was used by such figures as John D. Rockefeller and Andrew Carnegie to justify their tenacious pursuit of financial success to the detriment of others.
The various reactions to and extensions of Darwinism during the Gilded Age, including the European attempts at a synthesis between Darwin and Hegel, as well as American pragmatism and Social Darwinism, all demonstrate the disorienting effect Darwinism had on Western thought at the close of the 19th century. For some, as with the pragmatists, this displacement in ideas was an impetus to abandon the very search for truth. For many, such as the Social Darwinists, this displacement prompted a kind of conservative synthesis, in which older ideas were combined with Darwinism in order to present a firmer ideological basis for the status quo. For all, Darwinism forever changed the nature of Western thought.
Calhoun, Charles W. The Gilded Age: Perspectives on the Origins of Modern America. Lanham: Rowman and Littlefield Publishers, Inc., 2007.
Edwards, Rebecca. New Spirits: Americans in the “Gilded Age,” 1865-1905. New York and Oxford: Oxford University Press, 2011.
Hutchins, Robert Maynard, ed. The Great Books of the Western World, Volume 3: The Great Ideas: II. Chicago: William Benton, 1952.
Tarnas, Richard. The Passion of the Western Mind: Understanding the Ideas That Have Shaped Our World View. New York: Ballantine Books, 1993.
There are few issues more important to the future of the United States than education. It is through the nation’s educational systems that its future is being built. The boys and girls who are studying and learning in American schools today will be the men and women who lead this country and even the world tomorrow. And yet, American students have been steadily falling behind their international counterparts in standardized test scores and overall academic performance. If we are going to do the right thing for our children and secure the future of the United States, this nation needs to reorient its priorities, stop merely throwing money at the problem, and be willing to work hard and take the necessary steps to drastically overhaul American education.
Gallup Polls conducted in the month before each of the United States’ most recent presidential elections have found that the percentage of American voters who name education as their primary concern in the election has decreased dramatically over the last decade (Saad, “Economy is Dominant Issue for Americans as Election Nears”). Before the 2000 presidential election, 17% of voters stated that education was their number one concern. Before the 2004, 2008, and 2012 presidential elections, however, a mere 5%, 3%, and 4%, respectively (statistically similar figures), said that education was their primary concern. Instead, a majority of Americans have designated issues such as defense, healthcare, and the economy as their central concerns.
While these are valid and important concerns, education is the more important issue because it forms the baseline and background for the others. To take one example, Americans primarily concerned with defense should be equally concerned about education: if the United States is to maintain its global military superiority, it requires well-educated people, especially experts in technology, science, and mathematics, fields in which the country is falling behind. In a recent speech, Secretary of Defense Leon Panetta made this point clear, saying, “Just as DoD developed the world’s finest counterterrorism force over the past decade, we need to build and maintain the finest cyber force and operations. We’re recruiting, we’re training, we’re retaining the best and the brightest in order to stay ahead of other nations” (Panetta, “Remarks”). Without an educational system that adequately prepares young people to enter fields such as cyber operations, the United States will lose its military dominance in the next generation.
Some might wonder, in response to all of this, whether the American school systems really are all that bad. Are education systems in the United States really failing that badly to prepare students for the future and are they really falling that far behind their peers in other nations? A recent study by Public Agenda, for instance, found that most American parents “say the amount of science and math their child studies now is sufficient” (“Preparing Today’s Students for Tomorrow’s Workforce”).
The reality, however, is that the education American students are receiving is far from sufficient. “Scores from the 2009 Programme for International Student Assessment,” for instance, found that “out of 34 countries” ranked in a recent study of standardized test scores, “the U.S. ranked 14th in reading, 17th in science and 25th in math” (Armario, “Wake-up call”). This places the United States “far behind the highest scoring countries, including South Korea, Finland and Singapore, Hong Kong and Shanghai in China and Canada” (ibid.). What this means for the next generation in terms of military and economic superiority is both obvious and alarming.
There is no simple solution to this problem. Americans have tried for years merely to throw money at the issue and have seen little in the way of lasting results. What is necessary is a complete overhaul of the American public education system. While holding teachers accountable, raising budgets, and other popularly proposed solutions are all part of what it will take to make a real and lasting change for the better, they do not address the underlying issue. The underlying issue, and what ultimately needs the most reform, is the current approach to education in America; the United States needs a revamped and updated perspective and curriculum able to provide the education the modern world demands. The old system, based on the ideas of philosophers of education such as John Dewey, focused essentially on providing just enough learning to allow the average student to enter a workforce of laborers and servers. The future demands that we provide more than “just enough” learning, that we strive for an above-average education for above-average children, and that education be focused on molding innovators, creators, and thinkers (Hutchins, The Great Conversation). This overhaul will no doubt be an expensive and often painful effort requiring a great deal of sacrifice from all of us, but we are speaking about our future and our children, and I believe we can all agree that no price is too high to pay to do the very best we can for future generations of Americans.
Armario, Christine. “’Wake-up call’: U.S. students trail global leaders.” MSNBC.com. 7 December 2010. Web. 9 December 2012.
Hutchins, Robert M. The Great Conversation: The Substance of a Liberal Education. New York: William Benton, 1952. Print.
Panetta, Leon E. “Remarks by Secretary Panetta on Cybersecurity to the Business Executives for National Security, New York City.” U.S. Department of Defense. 11 October 2012. Web. 9 December 2012.
“Preparing Today’s Students for Tomorrow’s Workforce. (cover story).” NSTA Reports! Jan. 2007: 1+. Education Research Complete. Web. 8 Dec. 2012.
Saad, Lydia. “Economy is Dominant Issue for Americans as Election Nears.” Gallup Politics. 22 October 2012. Web. 9 December 2012.
One of the defining features of the United States both historically and today is its unique religious landscape. Particularly prominent in this landscape is the Christian Fundamentalist movement, a movement that has largely taken shape in the United States in the 20th century and has had a major effect on the United States in its political, cultural, educational, and social life during that time. One aspect of the influence that Christian Fundamentalism has had on the United States is in the debate over science education, human origins, and evolution. The so-called “Scopes ‘Monkey Trial’” is a landmark in this debate and an important case study in the ongoing struggles of communities of faith and doubt to define themselves and shape America according to their respective ideals.
While there are certain earlier antecedents in Christian thought that point towards the development of Christian Fundamentalism, its roots are most readily located in the 19th century. The 19th century was a period of rapid and profound change in both Europe and the United States. The rise of the Industrial Revolution brought about a great deal of new technology, which changed the way people lived their daily lives both at work and at home. Simultaneously, new ideas, which had simmered under the surface and had been largely the purview only of certain educated minorities until that point, began to gain popular currency. As A.N. Wilson succinctly states it in his history of doubt in Victorian England, God’s Funeral, “the ideas which undermined nineteenth-century religion took shape in the eighteenth century.”1
Among these ideas were the scathing attacks of Edward Gibbon upon the history of the Christian Church. His Decline and Fall of the Roman Empire, especially in its fifteenth and sixteenth chapters, which discussed the rise of Christianity in the Roman Empire, became infamous for its attacks upon some of the most revered figures and sacred ideals of Christianity.2 In addition to these attacks upon the mythology that had developed around Christian history as a whole, more specific attacks were launched against the sacred center point and beginning of Christian history as it was recorded in the New Testament. David Friedrich Strauss’s Life of Jesus, originally published in German in 1835–6 and translated shortly thereafter into English, became a surprisingly popular read in England and the United States.3 Through the book, Strauss popularized ideas that had been circulating in German academic circles, ideas which treated the Gospels and the other sacred writings of Christianity like any other ancient works and which led to the claim that much of the life of Christ as recorded in the Gospels was myth, including the miracles and the very central claim of Christianity: the resurrection. Perhaps the biggest shock of all to 19th century Christians was a new scientific theory introduced to the public in 1859 with the publication of Charles Darwin’s Origin of Species. Darwin’s theory of evolution, which posited that all species, including human beings, held common descent and had evolved through the process of natural selection, called into question the account of creation found in Genesis, the idea of a provident creator-god, and the very concept of human uniqueness. Viewed by many in the 19th century and since as “modern science’s culminating triumph over traditional religion,” Darwin’s theory of evolution was the deepest blow to 19th century Christian faith.4
The responses by Christians to these new challenges were various. The Roman Catholic Church, in an attempt to evade another affair like the 17th century trial of Galileo, a permanent source of criticism and mockery, assumed an officially moderate stance in which it affirmed the traditional and central claims of Christianity while allowing that modern scientific theory and biblical criticism might be correct within their own spheres of concern. The Orthodox Church, largely cut off from the currents of Western thought by a combination of geography and historical circumstance, remained largely unaffected by these new ideas and assumed no official stance, though reaction among individual thinkers within the Orthodox Church was largely consonant with the Catholic stance. It was among Protestants that these new ideas made the greatest ripples. Reactions among Protestants generally took one of two forms: either accommodation and adaptation or retrenchment and counterattack.
Those who adopted the former course of action came to be labeled “liberals” or “modernists.” This group accepted the new theories, often in their totality, and altered their central message to fit accordingly. In so doing, according to historian Harold Carl, they “believed they were rescuing religion from doctrinal bondage and obscurity” and making “Christianity palatable to modern people.”5 Many of them abandoned the belief in miracles, even in the resurrection of Christ, and the traditional Christian dogmas of sin, redemption, and salvation, in favor of a version of Christianity in line with modern science and higher criticism of the Bible. They focused instead on the social implications of the message of the Bible, such as egalitarianism and care for the poor and oppressed, often ignoring the dogmatic and doctrinal altogether. In his 1938 book The Kingdom of God in America, Protestant Neo-Orthodox theologian H. Richard Niebuhr satirically summarized the Gospel of the liberals as the belief that “a God without wrath brought men without sin into a kingdom without judgment through the ministrations of a Christ without a cross.”6
Those who assumed the latter course of retrenchment and counterattack saw the liberals as traitors to the Christian faith. “It is this group,” says Carl, “—the vocal and the intransigent—who began to publicly attack liberalism in the early 1900s and who eventually took on the name ‘fundamentalists.’”7 Originally emerging from the ranks of clergy of the Presbyterian Church but later encompassing a variety of denominations, this group “would not budge on any point.”8 Even Christians who were not liberals had been willing to concede certain points of modern science and higher criticism as acceptable, but the Fundamentalists would have none of it.
A series of books published from 1910 to 1915 by the Bible Institute of Los Angeles entitled The Fundamentals: A Testimony to the Truth provided the name for this movement.9 The books in the series consist of essays written by a large group of theologians, professors, and clergymen aligning themselves with this new conservative movement in Christianity. The included essays addressed such topics as “the Mosaic Authorship of the Pentateuch,” “Internal Evidence of the Fourth Gospel,” “the Recent Testimony of Archeology to the Scriptures,” and “the Decadence of Darwinism.”10 Nearly every perceived threat, from Darwinism to liberalism to Roman Catholicism, was attacked, and the unwavering position of the authors in clinging to Protestant orthodoxy was clearly affirmed; Christian Fundamentalism was born.
The 1920s were a decade largely marked by conservatism in American politics and culture. Following the brutality and upheaval of World War I and the Progressive politics of the previous two decades, Americans longed for a simpler time. According to historian John Milton Cooper, Jr., President Warren G. Harding was elected on a platform that promised a return to “normalcy,” a word he popularized, meaning a return to “pre-war quiescence and detachment in foreign policy, and of calmer times at home.”11 Manipulating the same distant memories of a better past, the Ku Klux Klan gained enormous popularity; as many as 40,000 members demonstrated in front of the White House in 1925. Christian Fundamentalism found a natural home in the minds of many American Christians of this era, including many in positions of power and influence.
Through this combination of popular conservatism and the presence of Fundamentalist adherents in positions of power, Fundamentalism was able to have a major effect on American culture and politics from a very early date in its history. On 13 March 1925, the state legislature of Tennessee passed a law, the Butler Act, ordering
that it shall be unlawful for any teacher in any of the Universities, Normals and all other public schools of the State which are supported in whole or in part by the public school funds of the State, to teach any theory that denies the story of the Divine Creation of man as taught in the Bible, and to teach instead that man has descended from a lower order of animals.12
John Washington Butler, a Christian Fundamentalist and Tennessee legislator who had introduced the law and for whom the law was named, knew very little about the science behind evolutionary theory but was influenced to oppose it by the work of William Jennings Bryan, an influential politician who had been a presidential candidate as well as a secretary of state. Bryan, a conservative Presbyterian who aligned himself with the Fundamentalist movement, had supported a number of conservative Christian causes throughout his career in politics, including prohibitionism and pacifism; he had now turned his sights on Darwinism.
Following the passage of the law, the American Civil Liberties Union set out to challenge it. In May 1925, John T. Scopes, a high school sports coach who sometimes acted as a substitute teacher for a biology class, agreed to be charged with violating the law in order to bring it to court. Scopes, however, quickly took a backseat in his own trial, as two far more imposing figures took center stage: William Jennings Bryan agreed to participate in the trial on behalf of the prosecution, and Clarence Darrow, a famous trial lawyer and self-identified agnostic, agreed to enter on behalf of the defense. Media across the country began following the trial and reporting on it as if it were an epic battle between faith and disbelief; poised on one side was Bryan, the man of faith and an emerging spokesman for the Fundamentalist movement, and on the other was Darrow, the rationalistic freethinker and opponent of biblical faith.
In spite of all else that occurred during the eight days of the trial, what has been remembered is “a heated, two-hour exchange” between Darrow and Bryan “that, in the end, did not affect the case as much as it did the nation.”13 Inherit the Wind, a 1955 play (adapted into a film in 1960) offering a fictionalized dramatization of the trial, worked particularly to crystallize this exchange as the defining moment of the trial by portraying the confrontation between the two men as its climax. The popular record has also remembered Darrow as outsmarting Bryan during their exchange and Bryan as narrow-minded and ignorant. This is the version of events presented in Inherit the Wind, and it is certainly the image that Darrow sought to create in the debate.
The actual exchange, however, indicates a more nuanced and complex picture. In fact, Darrow often appears to be the narrow-minded bigot, whereas Bryan appears more ready for compromise and dialogue. Darrow returns several times, for example, to the question of the age of the earth in spite of Bryan’s willingness to concede that he does not know the age of the earth and that it may in fact be “six million years or … six hundred million years” old.14 Similarly, Darrow seems at several points in their exchange to insist that the Bible be interpreted even more literally than Bryan interprets it. He questions Bryan several times, for instance, concerning the length of the days of creation in the opening chapter of Genesis, seeming to insist that Bryan interpret them as literal days and ignoring Bryan’s clear statements that he does not believe them to be such. One example of this recurrent line of questioning is in this bizarre exchange:
MR. DARROW–Do you think those were literal days?
MR. BRYAN–My impression is they were periods, but I would not attempt to argue as against anybody who wanted to believe in literal days.
MR. DARROW–Have you any idea of the length of the periods?
MR. BRYAN–No, I don’t.
MR. DARROW–Do you think the sun was made on the fourth day?
MR. DARROW–And they had evening and morning without the sun?
MR. BRYAN–I am simply saying it is a period.
MR. DARROW–They had evening and morning for four periods without the sun, do you think?15
The perception of Darrow’s cross-examination of Bryan as one in which the unbeliever outsmarted the believer, oversimplified as the actual content of the trial transcript shows it to be, has colored subsequent understandings of the trial as well as subsequent debates between believers and unbelievers. In many ways, this misunderstanding of the exchange between Bryan and Darrow has come to characterize the entire debate between Fundamentalists and other conservative believers on the one hand and unbelievers and liberal Christians on the other. It has also colored subsequent debates over religion’s place in American society, politics, and especially education. The view of Bryan as simpleminded and backwards has become a caricature applied to Christian Fundamentalists in general.
A recent example of this recurring caricature, and of the continuation of some of the themes present in Darrow’s cross-examination of Bryan, even outside the United States, is the debate between Richard Dawkins, a scientist and prominent atheist, and Rowan Williams, the current Archbishop of Canterbury and head of the Anglican Church. The article about the event published on the website of The Independent, a popular London newspaper, is indicative of this caricature. The very title of the article, “God vs Science: Richard Dawkins takes on Archbishop of Canterbury,” implies that the Christian participant stands opposed to scientific ideas.16 During the course of the debate itself, Dawkins seemed surprised that Williams, who is neither a Fundamentalist nor a modernist, was willing to state that he did not believe in a literal Adam and Eve and that humans had non-human ancestors. Dawkins admitted that he was “baffled by the way sophisticated theologians who know Adam and Eve never existed still keep talking about it,” to which the Archbishop countered that the Genesis narrative is not about scientific theories but about deeper truths about God and man.17 Such an exchange is highly reminiscent of Darrow’s adoption of and insistence upon a more literal understanding of Genesis than that of Bryan and his subsequent bafflement at Bryan’s refusal to adopt that narrow, literalistic understanding.
The stereotyping of each side by the other in debates over faith and doubt continues to fall into the narrow categories represented by Darrow and Bryan in the popular remembrance of the Scopes Trial and presented by each in the accusations hurled at the other. Bryan’s claim that skeptics “have no other purpose than ridiculing every person who believes in the Bible” remains a refrain of many on the side of faith, and especially in the Fundamentalist camp, today, whereas Darrow’s characterization of Bryan and his party as “bigots and ignoramuses” remains the common view of many unbelievers of all believers generally but especially of Fundamentalists.18 Just as in the Scopes Trial, however, the reality is never so simple. On the contrary, as was exhibited by the remarkably cordial and thoughtful nature of the exchange between Dawkins and Williams, at which nearly every media outlet that reported on the debate expressed surprise, there are clearly intelligent and well-meaning people on both sides of the issues. As this debate, which began in the Enlightenment and has run through Western popular thought and culture for nearly two centuries, continues, and as each side attempts to reshape culture according to its own view, overcoming the legacy of the Scopes “Monkey” Trial and remembering that the other side consists not of “bigots and ignoramuses” but of others who have simply reached different conclusions may be the most important thing any participant can do.
One thing that is abundantly clear from Nazi actions, propaganda, and literature is that they were obsessed with concepts like race, racial purity, and “racial hygiene.” Among the central tenets of Nazism were the beliefs in a pure Aryan race and in the innately inferior, and even insidious, nature of the blood of other races, especially that of the Jews. These ideas, like all ideas, have a genealogy, and what is perhaps most remarkable about them is that very genealogy. The Nazi obsession with race and the uniquely Nazi twists on and responses to that idea are the product of a kind of “perfect storm,” a chance collision of a variety of otherwise unrelated ideas and events which led to catastrophic consequences. Foremost among these disparate concepts, as well as most important for an examination of why this Nazi obsession with race developed in the first place, are the European heritages of anti-Judaism and the scientific outlook that emerged from the Enlightenment.
Anti-Judaism, which must be distinguished from Antisemitism as a separate but related historical antecedent, began very early in European antiquity. The Greek conquerors and overlords of Judea in the fourth through second centuries BCE viewed the Jews, with their unique ritual and social practices such as circumcision and their insistence upon religious exclusiveness, with a great measure of suspicion and skepticism. While most were willing to tolerate and even protect the Jews as an exceptional people, some rulers, such as Antiochus IV Epiphanes, attempted, however unsuccessfully, to force the Jews to Hellenize and renounce their unique religious practices and beliefs.1
The Greek distrust and dislike of the Jews was continued among the Romans, who conquered both the Greeks and the Jews in the second and first centuries BCE. While the Romans were willing to accept and make exceptions for unique Jewish beliefs and practices and large numbers of Jews emigrated throughout the Roman Empire, Jews were consistently mocked and looked down upon by Romans, who saw practices like circumcision as barbaric and the exclusive Jewish monotheism as potentially seditious.2 This negative view of Judaism continued, and was even strengthened in many ways, when the Roman Empire gradually became Christianized beginning in the fourth century CE.
Christianity had emerged from a particularly unpleasant split with Judaism in the first century CE. Christians were viewed by the Jews as treacherous and heretical and, as a result, often suffered persecution and expulsion from the synagogues. This hostility on the part of mainstream Jews toward the Christians in their midst precipitated a final split between Judaism and Christianity. It also led to a great deal of vociferously hostile language about the other entering the mainstreams of both Jewish and Christian literature and thought. When Christians began to assume power in the Roman Empire several centuries later, these ideas about the Jews combined with the popular Roman prejudices to strengthen Roman anti-Jewish attitudes.3 These anti-Jewish attitudes, a combination of the Greco-Roman prejudices and Christian theological and historical disagreements, became the predominant view of Judaism throughout Europe for many centuries.
It is notable in all of this that none of these prejudices revolve around Judaism or Jews as a race or ethnicity, but as a specific religious group which one can join and leave by changing belief and custom. This began to change, however, in the early modern period. One element of the Reconquista in Spain was the forced conversion or expulsion of the Jewish population.4 When given the option of converting to Christianity or leaving, many Spanish Jews chose to convert. These conversos, as they were called, came to be viewed with a great deal of envy and suspicion by their Christian neighbors. Many suspected that, because they had converted under duress, their conversion had been effected only for appearances and that they secretly continued to practice Judaism. In addition, many whose families had been Christians for centuries viewed with envy the children and grandchildren of conversos who were able to attain important positions in government and in the Church. As a result, the name of converso came to be applied, however improperly, even to those whose grandparents had converted to Christianity, and the stigma of sedition attributed to the Jews continued to be attached to these conversos even after generations as Christians. What had been a difference in religion was coming to be viewed as a difference in race.
With the era of the Enlightenment in the seventeenth and eighteenth centuries, Europeans came to focus more attention and importance on science than on religion. Whereas the emphasis of the Middle Ages had been primarily religious, an emphasis the thinkers of the Enlightenment dismissed as superstitious, the emphasis of the Enlightenment was on science and rationality. Rather than actually shedding superstition, however, many instead simply adopted a new set of superstitions or rephrased old superstitions in the new, more acceptable terminology.
This can be seen especially in the rise of Antisemitism out of anti-Judaism, as constructed by figures like Wilhelm Marr. Marr was among the first who “assigned to Jews the attributes of a race” and was the first, in 1873, to use the term “anti-Semitism” to describe this position.5 While an intellectual living in the wake of the Enlightenment could not take religious differences seriously, or, at least, as seriously as they had been taken previously, he could take supposedly scientific ideas like race seriously; Judaism, then, became no longer a religion, but a race, and all of the same superstitions and conspiracies which had formerly surrounded the Jewish religion were transferred to the new Jewish race.
One of the greatest ironies of the Nazi obsession with race is that they, while taking up this “scientific” view of Judaism as a race, re-translated it into religious terms. For the Nazis, race became a religious concept. As one Nazi ideologist, Alfred Rosenberg, wrote in his The Myth of the 20th Century: “A new faith is awakening today: The faith that blood will defend the divine essence of man; the faith, supported by pure science, that Nordic blood embodies the new mystery which will supplant the outworn sacrament.”6 The Greek incredulity at what they saw as the bizarre customs of the Jews, the Roman suspicions toward Jewish exclusivity, and the Christian theological and historical differences with Judaism, all of which had been matters of religious and cultural opposition, became, for the Nazis, attributed to an insidiousness inherent in Jewish blood. This was contrasted with the inherent superiority and goodness of pure Aryan blood. The Nazis took up a heritage of anti-Judaism and a pseudoscience of race to create their own unique racial religiosity, which lay at the heart of their entire philosophy and practice.
The Christian Church of the Middle Ages has become something of a boogeyman in the modern popular imagination. It is fairly typical to hear even supposedly educated individuals claim that Christianity quashed all science, philosophy, and learning, aspects of civilization that would reemerge from the darkness of the “Dark Ages” only with the Renaissance and, still later, the Enlightenment.1 The destruction of the Great Library of Alexandria, supposedly at the hands of a violently anti-intellectual Christian mob; the gruesome murder of the Alexandrian female mathematician Hypatia, supposedly at the hands of a similarly violently anti-intellectual (and anti-woman) mob of Christian monastics; and the supposed stagnation of scientific knowledge are, along with other similar examples, paraded out as evidence for this assertion. However, many of these examples, such as the destruction of the Great Library of Alexandria, are myths;2 others, such as the murder of Hypatia, are vastly exaggerated and wildly misreported;3 and still others, such as the decline of scientific knowledge, are outright fabrications of Christianity’s Enlightenment and post-Enlightenment detractors, such as Edward Gibbon and John William Draper.4 Contrary to the common misconception of history, the advent and eventual triumph of Christianity was a great boon to the intellectual tradition of the Greco-Roman world, as it freed this tradition from superstitious presuppositions and encouraged its proliferation within a more logical worldview.
As much as the pre-Christian scientific tradition of the Greco-Roman world has been hailed and lauded in the course of criticizing medieval Christians, if there is anything for which modern people can blame the Christians of the Middle Ages, it is that they held on for so long to so many of the methods and notions of the Greco-Roman world, a world whose intellectual tradition had been in decline for many years before the triumph of Christianity.5 As David C. Lindberg, a historian of science, observed, “It is agreed by most historians of ancient science that creative Greek science was on the wane, perhaps as early as 200 B.C., certainly by A.D. 200.”6 The field of cosmology is a notable example.
Aristotle’s model of the universe, based upon his philosophical concepts and not upon anything even remotely resembling modern scientific research, posited that the universe was composed of a series of concentric “celestial spheres” which moved in a perfectly circular motion around a perfectly spherical earth and “that the heaven as a whole neither came into being nor admits of destruction … but is one and eternal.”7 It was only with the advent of Christianity that these assumptions of an eternal and perfect symmetry and harmony in the universe, assumptions now shown by modern science to be untenable, began to be questioned. Importantly, the questioning of these ancient pagan presuppositions proceeded on the basis of uniquely Judeo-Christian concepts.
The Judeo-Christian beliefs that only God is inherently eternal, that he created all that exists ex nihilo, and that all things continue to exist only because he sustains them, not because of any inherent immortality on their part, clearly stood in stark contradiction to Aristotelian cosmology. It was upon this uniquely Judeo-Christian basis that the assumptions of Aristotle and the many who had followed him were criticized by philosophers and scientists such as the Byzantine Christian philosopher John Philoponus (490-570 CE).8 Philoponus would be read, admired, and heavily borrowed from by Galileo Galilei (1564-1642 CE), whose theory of a heliocentric universe, in spite of its infamous and habitually misrepresented condemnation by the Inquisition of the Roman Catholic Church, would be foundational for modern scientific views of cosmology.9
Medieval Islam, by contrast, would never produce such a flowering of scientific thought as did the Christian world in spite of handling the same Greco-Roman texts and observing the same astronomical phenomena as the Christians for a nearly equal period of time. Although Muslims, such as the theologian Ghazali (1058-1111 CE), did question certain aspects of the cosmological models received from the Greco-Roman tradition, they typically did so only by arguing from another aspect of the Greco-Roman tradition.10 For instance, the Muslim polymath Averroës (1126-1198 CE) opposed the Ptolemaic model of the universe primarily by arguing for the superiority of the Aristotelian model.11
In spite of the mythology propagated by Christianity’s fashionable enemies during the Enlightenment and since and still held in the popular consciousness today, Christianity not only is not responsible for any kind of disappearance or weakening of the Greco-Roman intellectual tradition, it is in fact responsible for having saved that intellectual tradition, in many ways from itself. As the modern Christian philosopher and historian David Bentley Hart has pointed out, “despite all our vague talk of ancient or medieval ‘science,’ pagan, Muslim, or Christian, what we mean today by science … came into existence, for whatever reasons, and for better or worse, only within Christendom, and under the hands of believing Christians.”12
1 A popular recent example of such erroneous thinking can be found in Charles Freeman, The Closing of the Western Mind: The Rise and Fall of Reason (New York: Knopf, 2003).
2 David Bentley Hart, The Story of Christianity: An Illustrated History of 2000 Years of the Christian Faith (London: Quercus, 2007), 47.
3 Ibid., 97.
4 David C. Lindberg, The Beginnings of Western Science: The European Scientific Tradition in Philosophical, Religious, and Institutional Context, 600 B.C. To A.D. 1450 (Chicago: University of Chicago Press, 1992).
6 David C. Lindberg, “Science and the Early Church,” in God and Nature: Historical Essays on the Encounter between Christianity and Science, eds. David C. Lindberg and Ronald L. Numbers (Berkeley: University of California Press, 1986), 30.
9 Edward Grant, Science and Religion, 400 B.C. to A.D. 1550: From Aristotle to Copernicus (Baltimore: Johns Hopkins University Press, 2004).
11 David Bentley Hart, Atheist Delusions: The Christian Revolution and Its Fashionable Enemies (Ann Arbor: Sheridan Books, 2009), 59.
12 Ibid., 63.