In Defense of Dead White Men, Part 1: Western Civilization and Higher Education

“One of the chief defects of modern education has been its failure to find an adequate method for the study of our own civilization.”1 So wrote Christopher Dawson in 1961, almost precisely at the inception of the trends in thought about education and culture which would not only exacerbate this pre-existing problem but, by pretending that the problem did not exist at all, would ignore it altogether, thus allowing a minor illness to ripen into a full-blown plague. While Dawson sought to reform the way in which the study of Western Civilization was approached in schools, he could still, in 1961, take it for granted that all primary and secondary school students in the United States would be inducted into the history, thought, and culture of Western Civilization. He could still, at that time, assume that all American college undergraduates would be required to take at least one course in Western Civilization and would be exposed through other classes to the great literary, artistic, philosophical, scientific, and other products of that culture, which is, in fact, their culture. In the same paragraph, Dawson laments the dominance in educational institutions of the twin forces of “the democratic utilitarianism of compulsory state education, on the one hand, and … scientific specialization, on the other.”2 These two forces have continued to dominate American education in the more than half a century since Dawson wrote The Crisis of Western Education. The result is that the crisis has reached such proportions that the students of each new graduating class are further restricted within fields of technical specialization and further alienated from their heritage as the children of Western Civilization and citizens of the United States.

In colleges throughout the United States, courses in Western Civilization, once required for majors in all areas of study, have not only ceased to exist as a requirement but have ceased to exist altogether. In a study published in May 2011, the National Association of Scholars documented the decline of Western Civilization courses in American colleges since 1964.3 The study found that throughout both public and private institutions of higher learning across the United States, including top-ranked universities, the survey course in Western Civilization has all but gone extinct as a requirement for undergraduates. Very few institutions even require such a course for students majoring in history. Instead, the trend has been to replace the study of Western Civilization with a class dedicated to a more general study of world history, downplaying the importance of Western Civilization and downgrading it to the status of just one civilization among many.

When universities do choose to teach their students about Western Civilization, it generally comes packaged with vitriolic criticism. While Harvard University offers no introductory course in Western Civilization to its undergraduates, for example, it does offer a graduate course targeted at its teaching fellows entitled “Western Ascendancy: Historiography and Pedagogy.” The description of the course in Harvard’s course catalogue states its purpose without equivocation:

The purpose of this graduate seminar is to get Teaching Fellows and other graduates to engage with the historiographical and pedagogical challenges of the General Education course, Societies of the World 19: Western Ascendancy. Courses in Western Civilization are nowadays widely seen as outmoded and excessively Eurocentric. The aim of SW 19 is to address questions of global economic and political divergence in a fresh way, taking advantage of more recent literature on economic history, for example.4

The trend in favor of the degradation of Western Civilization has penetrated academia so deeply that those institutions which have resisted it have become the targets of governmental organizations dedicated to enforcing educational homogeneity. Larry P. Arnn, the president of Hillsdale College, a private liberal arts college in Hillsdale, Michigan, for example, has documented in his book Liberty and Learning his institution’s struggle against government agencies tasked with the imposition of conformity to current trends. One example he provides concerns the Michigan Department of Education’s criticism of Hillsdale’s stated mission to act as “a trustee of modern man’s intellectual and spiritual heritage from the Judeo-Christian faith and Greco-Roman culture.”5 The Michigan Department of Education insisted, under threat that it would refuse to issue teaching certifications to graduates of Hillsdale, that Hillsdale College reform its introductory courses in Western Civilization, which it requires of all undergraduates in every major, so that the “intent is to point out the limitation to Western culture.”6 The Department asserted, in addition, that “the Hillsdale program, based on the principles of Western culture, does not incorporate global perspectives by design. It is unclear how to resolve this weakness.”7 In other words, Hillsdale College’s focus “on the principles of Western culture” is, in the eyes of the state of Michigan, a “weakness” that must be “resolve[d]” by incorporating a more multicultural approach, to the detriment both of Western Civilization as a course of study and of the students who are receiving this education.


Notes

1 Christopher Dawson, The Crisis of Western Education (New York: Sheed and Ward, 1961), 119.

2 Ibid.

3 Glenn Ricketts, Peter W. Wood, Stephen H. Balch, and Ashley Thorne, “The Vanishing West: 1964-2010, The Disappearance of Western Civilization from the American Undergraduate Curriculum,” National Association of Scholars (May 2011), http://www.nas.org/articles/The_Vanishing_West_1964-2010.

4 “History 2921 – Western Ascendancy: Historiography and Pedagogy: Seminar,” Harvard University Course Catalogue: 2013-2014.

5 Larry P. Arnn, Liberty and Learning: The Evolution of American Education (Hillsdale: Hillsdale College Press, 2010), 53.

6 Ibid.

7 Ibid.


Defining Western Civilization: Christendom By Any Other Name

There can be little doubt that Western Civilization is and will for the foreseeable future remain the dominant civilization of the world. The nations of Western Civilization have, over the past several centuries, spread their languages, their cultures, their ideologies, and their political rule to every continent. Despite the decline of Europe, the home of this civilization for the bulk of its lifespan thus far, the ideas of the West continue to be the major shaping influences of the modern world, though the focal points of that world have since moved to North America and are now moving to Asia. Ideas such as communism, democracy, and human rights are finding new homes in India, China, and Japan, far from their birthplaces in Germany, Greece, and Italy. While this renders the term “Western Civilization,” with its directional emphasis, a quaint anachronism, the ideas themselves have taken on a renewed vigor in their current host nations. The first step toward understanding the reasons for the dominance of Western Civilization and for responding to its movement into new and foreign terrain is defining Western Civilization itself.

To define Western Civilization, the term itself must, in a sense, be dismissed. It is clearly not merely “Western,” meaning European, but rather universal in its embrace and pertinence. The “Western” idea of liberty is equally true and meaningful in both France and China. A close look at the history of Western Civilization even before its globalization in the modern era reveals that it has never been strictly “Western.” Its most ancient ancestors, in fact, lie altogether outside of the borders of Europe. The genetics of Western Civilization reveal that it is and has been since its inception an amalgam of peoples and cultures, often with widely divergent worldviews and geographies.

Ancient Greece is generally, and rightly, credited as the birthplace of many distinctively Western ideas, including its political and philosophical systems, its art and literature, its science and medicine, and much else. The Greeks themselves, however, often credited their forebears among the Egyptians and the Babylonians as the progenitors of a great deal of their knowledge. A sizeable portion of this credit is undeserved and may be attributed to the desire, common until fairly recently, to link one’s original ideas with the respectability of antiquity;1 these attributions, however, do demonstrate a Greek admiration for and imitation of the knowledge of the Egyptians and Babylonians.

Fittingly, these two nations also figure prominently among the shaping influences upon the other great early strand in the DNA of Western Civilization, the Jews. Genesis 11:31 claims the Mesopotamian city of Ur as the birthplace of Abraham, the patriarch of the Jewish people, and the stories that make up much of the Jewish scriptures exhibit a common origin with, or perhaps an improvement upon, the traditional stories of Mesopotamia, such as the creation story of the Enuma Elish and the flood story of the Epic of Gilgamesh. Similarly, Jewish law reflects an improved and universalized application of the rule of lex talionis evident in Mesopotamian law codes such as the Babylonian Code of Hammurabi.2 Egyptian influence on the Jews is demonstrated by the Jews’ own record, in the Book of Exodus, of their period of enslavement in Egypt and their subsequent escape therefrom.

The commingling of these two cultures, the Greek and the Jewish, began in earnest with the conquest of the Israelite lands by Alexander the Great in 332 BC. Although the relationship between the two was often a tumultuous one, as in the suppression of a distinctively Jewish identity under Antiochus IV Epiphanes and the subsequent revolt of the Jews against Seleucid Greek rule under the Maccabees, it nonetheless bore spectacular fruit, particularly in the Roman period. The Septuagint translation of the Jewish Scriptures into the Greek language and the Jewish-Hellenic philosophical synthesis of Philo of Alexandria are two noteworthy early examples among many. By far the most important fruit of this contact between the Greek and Jewish cultural systems was the Christian Church. Early Christians employed Greek language and ideas to convey the events of the life of a Jewish man and their understanding of the significance of those events, which they saw as the culmination of the history and hopes of the Jewish people. When the early Christian author Tertullian asked, in his blustering attack on Christian heretics, “what indeed does Athens have to do with Jerusalem?” he hoped for a negative response.3 Had he stopped to consider the origins of his own faith, however, or been able to see its later developments, he would have heard his question resoundingly answered to the contrary of his expectations. The Christian Church, and Christians more generally, would continue this grand synthesis of the Greek and the Jewish throughout the Middle Ages, incorporating along with them a number of other cultures as well, most notably the Germanic culture of the Northern European peoples. Indeed, as Christopher Dawson has described it, Western Civilization is the product of “several peoples, composed of different racial elements, all co-operating in the development of a common cultural heritage.”4

When using the term “Western Civilization” one is referring to a great amalgam of cultures and peoples, ideas and worldviews, including but by no means limited to the Egyptians, the Babylonians, the Greeks, the Jews, the Romans, and the Germans, all brought together within the framework of Christianity. Early Christian writers, the great majority of whom were Romans writing in the Greek language, were fond of bragging about the expansion of their religion well outside of the bounds of the Roman Empire among the various barbarian nations which surrounded it. They were not, of course, conscious of the great civilization which would be forged by the unity they were bringing to these peoples. Christianity was able to provide a framework which united such disparate cultures while sustaining their local customs because of its emphasis on one particular and central idea, namely, the Incarnation. As Dawson explains, Western Civilization’s “religious ideal,” unlike that of the Chinese, Indian, and other great civilizations, “has not been the worship of timeless and changeless perfection, but a spirit that strives to incorporate itself in humanity and to change the world.”5 Western Civilization has had the marked tendency to regard all knowledge as worthy and to absorb this knowledge into itself, further accreting ever more peoples and their traditions while widening its own civilizational embrace. This is why theories of the dominance of Western Civilization which have seen race or, more recently, geography as the primary impetus fall far short of possessing full explanatory power.

Jared Diamond’s thesis in his 1997 book Guns, Germs, and Steel, for example, that the success of the West in comparison with other cultures is the result of European geography’s capacity to absorb and combine elements from surrounding civilizations fails to account for a number of points which must be considered. Diamond’s thesis, for example, does not account for the history of locations such as Alexandria, Egypt, which was a center for the combination, incubation, and distribution of ideas in Western Civilization but has since fallen into stagnation after being acquired and enculturated by another civilization. More importantly, his theory ignores altogether the human factor, or what Dawson calls the “psychological factor,” the place of people and their ideas, which is the primary factor in the shaping of a civilization.6 It was the “psychological factor” of the Christian belief in the Incarnation which provided the glue to hold together peoples and traditions as divergent and disparate as those of which Western Civilization consists.

From an early point, and perhaps because of Christianity’s dual parentage in Greek and Jewish civilizations, Christians demonstrated a unique openness to the beliefs and practices of a variety of peoples. In the words of the late historian Roland N. Stromberg, “no other civilization … has ever possessed the capacity for change that ours has shown. This was probably the result of its complex inheritance, which came to it from several sources.”7 With some exceptions (such as Tertullian, quoted previously), Christians generally viewed their faith not only as the fulfillment of Jewish messianic expectations, but as the completion of the philosophies of non-Jews as well. The second century Christian apologist Justin Martyr unequivocally asserted that Christian “doctrines … appear to be greater than all human teaching; because Christ, who appeared for our sakes, became the whole rational being, both body, and reason, and soul.”8 From this centrality of the Incarnation, Justin was able simultaneously to assert that the body, reason, and soul of man, which were taken on and redeemed by God in the Incarnation, were also given by God to man as tools for man’s use in acquiring wisdom and virtue.9 With this foundation in the Incarnation and its implications, Justin found it acceptable to commend a number of ideas of the Platonists, the Stoics, the Greek poets, and others as both wise in themselves and consonant with Christian teaching.10 This Christian openness to foreign ideas continued throughout the history of Western Civilization and allowed it both to absorb ideas from outside, such as the medieval Islamic translations of and commentaries upon Aristotelian texts, and to find new homes in a stunning variety of ethno-linguistic and cultural groups, transforming each of these to meet its own requirements while not displacing their native heritages.

From the foregoing, a definition of Western Civilization can be formulated which removes the misguided focus on geography and favors instead a more complete understanding of the history and nature of the civilization itself. Western Civilization is not strictly European or entirely Western. It is, rather, that collection of disparate cultures which has united itself around the fundamental notion of the Incarnation. Western Civilization is, in short, Christendom.

The immediate objection to such a formulation is the observation that Western Civilization has, beginning with the Enlightenment, entered a period of turning away from its Christian heritage which has resulted in the modern so-called post-Christian societies of Europe and the emergent post-Christian societies of North America. With such a turn to secularism in the former domains of Christendom and with such nations as India and China, which are not now and never have been majority Christian nations, taking on and internalizing ideas which originated in the West, some may see the designation of Western Civilization as Christendom as unnecessary and antiquated. To adopt such a position, however, is to ignore or to be ignorant of the overwhelming influence Christianity has had upon the formation of this civilization. As Dawson points out,

In fact, no civilization, not even that of ancient Greece, has ever undergone such a continuous and profound process of change as Western Europe has done during the last nine hundred years. It is impossible to explain this fact in purely economic terms by a materialistic interpretation of history. The principle of change has been a spiritual one and the progress of Western civilization is intimately related to the dynamic ethos of Christianity, which has gradually made Western man conscious of his moral responsibility and his duty to change the world.11

Although Christianity may be in the process of becoming a minority religion even within the historical borders of Christendom and although the ideas of Christendom are now put into practice with more vigor and among nations with far larger populations in lands yet unbaptized, the force of Christianity in the shaping of Western Civilization cannot be ignored or downplayed. Even the very ideas which are replacing traditional Christian religiosity among those living within Christendom’s native lands are the product, or perhaps the byproduct, of Christianity. Scientific materialism, for example, would hardly be a tenable worldview without the development of scientific thought in the West, a process which largely occurred not only at the hands and in the minds of believing Christians but also, and more importantly, as a result of the impact of Christian ideas. The Christian scholastics of the Middle Ages, for example, in their attempts to reconcile the contents of the Christian faith with the philosophy of Aristotle, “laid a solid foundation of logical thought on which later science could build.”12 The early giant of the Scientific Revolution, Galileo Galilei, was himself inspired and driven by his belief that “this grand book, the universe, … is written in the language of mathematics.”13 This Platonic notion, refracted through the lens of his medieval Christian heritage, drove Galileo to attempt to formulate mathematical proofs for Copernicus’s heliocentric theory. There are, in addition, more subtle ways in which Christianity made modern science and its sickly cousin, philosophical naturalism, possible; for example, the idea of monotheism renders the cosmos intelligible, as natural forces are freed from the province of various competing deities and instead placed under the providence of a single divine entity, thereby imbuing the universe with an orderliness and meaningfulness it could not formerly possess.

Whatever Western Civilization may become in the future, it remains the product of Christianity and is as yet inseparable from that foundation. That many of its members are turning away from that foundation and that other civilizations are attempting to adopt its ideas in a piecemeal manner without also adopting that foundation is a challenge Western Civilization is only now beginning to face for the first time. How radically Western Civilization will be altered, whether its products can survive outside of their natural habitat and without the food sources they have hitherto depended upon, and, indeed, whether Western Civilization can survive these upheavals at all are yet to be seen. Until that time, Western Civilization remains what it has been since its inception two thousand years ago in the incipient stage of that great synthesis of Judaism and Greece; it is Christendom.

Notes

1 The attribution to the Babylonians of the astronomical knowledge which enabled Thales of Miletus’s famous prediction of the solar eclipse of 28 May 585 BC, for example, is almost certainly false. See Dmitri Panchenko, “Thales’s Prediction of a Solar Eclipse,” Journal for the History of Astronomy (November 1994): 275-288.

2 Where the two most notably diverge, and where the Jewish law exhibits an improvement over other Mesopotamian law codes like that of Hammurabi, is in its application of the law to all people. Leviticus 24:22, for example, makes explicit that there is to be one law which applies to all people. Whereas Hammurabi prescribes lex talionis only for offenses among equals, the Jewish law prescribes this standard for nearly all offenses by any party against any party. The difference is undoubtedly the result of the Jewish creation story’s earlier improvement over the Mesopotamian: in the former, man is created as a child of God (in his “image” and “likeness,” according to Genesis 1:26-27) and his co-operator, whereas in the latter man is created as the slave of the gods. This Jewish emphasis on equality would enter deeply into the DNA of Western Civilization.

3 Tertullian, “The Prescription Against Heretics,” 7.

4 Christopher Dawson, Dynamics of World History (Wilmington: ISI Books, 2002), 399.

5 Christopher Dawson, “Christianity and the New Age,” in Jacques Maritain, Peter Wust, and Christopher Dawson, Essays in Order (New York: Macmillan, 1931), 228.

6 Christopher Dawson, The Age of the Gods (Washington, D.C.: Catholic University of America Press, 2012), xxiv.

7 Roland N. Stromberg, An Intellectual History of Modern Europe (Englewood Cliffs: Prentice-Hall, 1975), 8-9.

8 Justin Martyr, “Second Apology,” 10.

9 Ibid., 7.

10 Justin Martyr, “First Apology,” 20.

11 Christopher Dawson, The Judgment of the Nations (Washington, D.C.: Catholic University of America Press, 2012), 23.

12 Stromberg, 32.

13 Galileo, The Assayer.

Personhood in Medieval Philosophy (Personhood Part VI)

The history of medieval thought is largely a history of attempts by various thinkers to bridge the gap between and create a synthesis of biblical faith and Greco-Roman philosophy within the context of the Christian Church. As is to be expected from any attempt to reconcile such disparate sources as Plato, Aristotle, and Genesis, and to create a coherent whole out of this reconciliation, this medieval synthesis of Western thought was often an uncomfortable amalgam of contradictory elements. Medieval ideas about personhood are largely the result of this tension and combination.

One relatively early example of this tension in Christian thought is found in the words of the fourth century bishop Gregory of Nyssa in his work “On Infants’ Early Deaths.” In that work, Gregory refers to a newborn who has died shortly after birth as passing away “before he is even human,” adding to this statement the parenthetical explanation that “the gift of reason is man’s peculiarity, and he has never had it in him.”69 For his belief that reason is the defining feature of humanity, Gregory drew upon the ideas of the extremely influential late second and early third century Christian author Origen, who asserted that “we hold the resemblance to God to be preserved in the reasonable soul.”70 Origen, who drew heavily on Greek philosophy to explain biblical ideas, in turn drew on that philosophy for this explanation of the content of the Imago Dei. The Bible itself, however, offers no such identification of human reason with the Imago Dei. In bringing together the Greek philosophical idea that reason is the defining feature of personhood and the biblical idea of the Imago Dei, Origen and Gregory demonstrate the beginning of the uncomfortable synthesis of the Greco-Roman with the biblical. In spite of his denial of full personhood to an infant, however, an apparent departure from previous Christian understandings, Gregory nonetheless does not express doubt in the same work that such infants possess immortal and complete human souls.

Another fairly early example of this uncomfortable synthesis that marked medieval Christian thought occurs in Augustine of Hippo’s early fifth century work “On the Holy Trinity.” In that work, as in much else that he wrote, Augustine exhibits a bizarre mix of Platonism, Judaism, and Christianity. This amalgam leads him, in a discussion of women, to draw simultaneously on the opening chapters of Genesis and on 1 Corinthians 11:3-12, interpreting both through the lens of Neo-Platonic philosophy. The rather strange conclusion that he reaches is that a woman herself does not bear the Imago Dei but is the Imago Dei only in conjunction with her husband. According to Augustine, “woman herself alone … is not the image of God; but as regards the man alone, he is the image of God as fully and completely as when the woman too is joined with him in one.”71 The uncomfortable mixture of the biblical and Platonic in Augustine’s thought runs throughout his discussion of the Imago Dei and reaches its high point when he, along with Origen and Gregory before him, identifies the Imago Dei with a “rational mind.”72 He is forced to admit, in order to remain true to the biblical text and to traditional Christian anthropology and soteriology but clearly in contradiction to what his previously stated views on women imply, that “it is clear, not men only, but also women have” full possession of this “rational mind.”73

Perhaps the most conspicuous example of the tension between the biblical and the Greco-Roman in medieval Christian thought on personhood is found in the ideas of the thirteenth century theologian Thomas Aquinas, whose influence on Western Christianity is arguably second only to that of Paul and Augustine. Whereas Augustine struggled to find a synthesis between the Neo-Platonic and the biblical, Aquinas sought to bring Aristotle’s philosophy together with the Bible. Just as in Augustine’s work, this attempted synthesis creates a tension that is a palpable and ubiquitous presence in Aquinas’s works. His thoughts on women present an outstanding example of this uncomfortable synthesis, as is exhibited by his discussion of women in Question 92 of his Summa Theologica.74 There, Aquinas almost desperately attempts to make the statements of Genesis regarding the creation and dignity of women agree with Aristotle’s thought on women in his work On the Generation of Animals. In order to make two very different and ultimately mutually exclusive accounts agree, however, Aquinas is forced to perform strenuous mental gymnastics. In his First Article, Reply to Objection 1 in that section, for instance, he is forced to affirm both that woman is a good and complete creation of God, as Genesis claims, and that she is “defective and misbegotten,” as Aristotle claims. In spite of his best efforts, Aquinas is clearly unable to make Genesis and Aristotle agree.75

Notes

69 Gregory of Nyssa, “On Infants’ Early Deaths,” in Nicene and Post-Nicene Fathers, 2nd series, Vol. 5 (Grand Rapids: Eerdmans Publishing Company, 2004).

70 Origen, Against Celsus, book 7, ch. 66.

71 Augustine of Hippo, On the Holy Trinity, ch. 7, in Nicene and Post-Nicene Fathers, 1st series, Vol. 3 (Grand Rapids: Eerdmans Publishing Company, 2004).

72 Ibid.

73 Ibid.

74 Thomas Aquinas, Summa Theologica, Question 92, in Thomas Aquinas: I, ed. Robert Maynard Hutchins (Chicago: William Benton, 1952).

75 I have adapted most of the preceding paragraph from a post to my blog. David Withun, “Aquinas’s uncomfortable synthesis,” Pious Fabrications, 4 April 2013, http://www.piousfabrications.com/2013/04/aquinass-uncomfortable-synthesis.html (accessed 20 April 2013).

Personhood in Roman Law (Personhood Part V)

The incorporation of early Christian beliefs about personhood into the law of the Roman Empire began very early in the reign of Constantine. On 21 March 315, for instance, only two years after he issued the Edict of Milan, which granted official religious toleration to Christianity following the worst persecution the Church had yet endured, Constantine promulgated a law which ordered that “if any person should be condemned to the arena or to the mines … he shall not be branded on his face … so that the face, which has been made in the likeness of celestial beauty, may not be disfigured.”62 Although the interpretation of the doctrine of the Imago Dei which this law offers is rather haphazard and peculiar, it is nonetheless significant that Christian anthropology, even in an incomplete form, was being used as a source for Roman law at this early date. Just two months later, on 13 May 315, Constantine promulgated another law which made infanticide and the exposure of infants illegal in the Roman Empire and directed that money from the imperial treasury be used to feed children whose parents could not feed them.63 Similarly, four years later, on 11 May 319, Constantine issued another law which forbade masters from mistreating or killing their slaves.64 Constantine also published a number of laws whose intent was to encourage slave owners to manumit their slaves and to make the process of manumission, formerly complicated under Roman law, as easy and desirable as possible for them. A law promulgated on 18 April 321, for instance, grants Christian clergy the right to legally free slaves whose owners wish to manumit them.65 Another law, promulgated in an attempt to prevent poor parents from selling their children into slavery and published on 6 July 322, stipulated that children whose parents were too poor to support them should receive their support from the imperial treasury.66 As significant as these and other laws promulgated by Constantine are, the most significant reform of Roman law in accordance with Christian beliefs came under the Emperor Justinian in the sixth century. Under the influence of his powerful wife Theodora, Justinian included in his extensive and thorough reforms of Roman law the promulgation of many laws protecting the rights of women and children. Among them were laws prohibiting forced prostitution, allowing marriages between members of any social class, banning infanticide, granting women guardianship over their children, and allowing women to leave prostitution more easily without being subject to continuing legal or social handicaps. In justifying the promulgation of such laws, Justinian echoed the words of Paul, proclaiming that, “in the service of God, there is no male nor female, nor freeman nor slave.”67 The influence of the Corpus Juris Civilis, the massive product of Justinian’s comprehensive reform of Roman law, continues to the modern day. Later, in 797-802, a woman, Irene of Athens, would reign for the first time as empress regnant of the Roman Empire.68 She also convoked the Seventh Ecumenical Council of the Christian Church at Nicaea in 787.

Notes

62 Codex Theodosianus 9.40.2, in Joseph Story, ed., Conflict of Laws (Clark: Lawbook Exchange, Ltd., 1841).

63 Codex Theodosianus 11.27.1.

64 Codex Theodosianus 9.12.1.

65 Codex Theodosianus 4.8.1.

66 Codex Theodosianus 11.27.2.

67 Justinian, quoted in J. A. S. Evans, The Empress Theodora: Partner of Justinian (Austin: University of Texas Press, 2003), 37.

68 Lynda Garland, Byzantine Empresses: Women and Power in Byzantium AD 527-1204 (London: Routledge, 1999), 73-94.

Personhood in Greco-Roman Thought and Practice (Personhood, Part II)

The very narrow understanding of personhood in Greek thought is evident from the earliest texts of Western civilization, the Iliad and the Odyssey, both attributed to the poet Homer and composed in about the eighth century BC.1 Both works limit their purview to the lives of male Greek aristocrats. The concerns of women and children are treated only insofar as they affect the men. The concerns of slaves, of the poor, of the handicapped, and of other such groups are never considered at all. The world of Homer is the world of a small but powerful elite class.

Later developments in Greek thought served to justify this narrow definition of personhood. Aristotle, for instance, writing in the fourth century BC, provided in his Politics a succinct list of groups explicitly excluded from the category of personhood, as well as a justification for the exclusion of each: “Although the parts of the soul are present in all of them, they are present in different degrees. For the slave has no deliberative faculty at all; the woman has, but it is without authority; and the child has, but it is immature.”2 Because of their lack of “the deliberative faculty,” Aristotle claims that slaves, along with “brute animals[,] … have no share in happiness or in a life based on choice.”3 Similarly, says Aristotle, “the female is, as it were, a mutilated male.”4 Aristotle likewise excluded the lower classes, the poor, and even laborers from his definition of personhood, arguing, for instance, that “the life of mechanics and shopkeepers … is ignoble and inimical to goodness.”5 Finally, Aristotle placed the entirety of the non-Greek population into the category of those lacking “the deliberative faculty,” asserting that “barbarians … are a community of slaves” who should rightfully be ruled by the Greeks.6

These negative assessments of the personhood of women, slaves, children, barbarians, and others in the writings of Aristotle can be taken as representative of Greco-Roman thought more generally. The Leges Duodecim Tabularum, or Law of the Twelve Tables, for instance, a document of the fifth century BC which formed the foundation of Roman law, institutionalized the systematic marginalization and oppression of these groups within Roman society.7 In the Twelve Tables, the male head of household was granted the right to dispose of the women, children, and slaves within his household in the same manner as the animals and other property under his control, including the right to sell them and even to kill them; he was, in fact, ordered by the Tables to kill any children born with deformities (Table IV). Women, being property themselves, were denied the rights of property ownership (Table VI). Marriages between members of the aristocracy and members of the lower classes were banned outright (Table XI). In short, only an adult male member of the Roman aristocracy was granted full personhood in this initial document which governed and defined Roman society. This narrow understanding of personhood remained the standard in the Roman Empire until the fourth century.

Notes


1 Harold Bloom, Homer (New York: Infobase Publishing, Inc., 2009), 205.

2 Aristotle, Politics, in Aristotle: II, ed. Robert Maynard Hutchins (Chicago: William Benton, 1952), 1260a10-14.

3 Ibid., 1280a32-34.

4 Aristotle, On the Generation of Animals, in Aristotle: I, ed. Robert Maynard Hutchins (Chicago: William Benton, 1952), 737a26-7.

5 Aristotle, Politics, 1328b39-40.

6 Ibid., 1252b4-8.

7 The Laws of the Twelve Tables, http://www.constitution.org/sps/sps01_1.htm (accessed 24 March 2013).

Personhood in Late Antiquity: How Barbarians, Slaves, Women, and Children Became Persons (Personhood in Late Antiquity, Part I)

The Greco-Roman world, whose Hellenistic culture and thought dominated the West throughout Antiquity, possessed a very narrow definition of what constituted a person, a full and equal member of the human political and legal community with all of the rights and responsibilities that status confers. In large part, the full application of that term and the concept it represented were limited to free adult male Greek, or, later, Roman, aristocrats. Groups such as slaves, children, women, men who were not Roman citizens, the poor, and others who did not fit into this narrow category were excluded from full participation in personhood. Slaves alone constituted a third of the population of the Roman Empire, and women made up approximately half. The majority of the population of the Roman Empire, then, was seen as possessing less than full personhood. Groups that were denied full personhood were often subject to disdain, abuse, brutality, and even execution with no legal recourse. The Jews, on the other hand, who made up a small but visible minority of subjects and citizens under Greek and Roman rule in Antiquity, held, because of their doctrine of the Imago Dei, a much wider understanding of personhood, including under that concept all members of the human species regardless of social status, age, gender, or nationality. As a result, Jewish law conferred upon slaves, women, children, the poor, and other such groups the status of full personhood and the rights associated with that status. Christianity emerged from Judaism in the first century AD and carried with it the idea of the Imago Dei, coupling with that idea its own original ideas of the Incarnation of God as man and the availability of salvation for all people through recapitulation. Already heavily influenced by Hellenistic thought from its inception, Christianity in large part became a point of synthesis between Judaism and Hellenism beginning in the second century, as an increasing number of converts came from segments of the Roman Empire outside of the Jewish community. Because of its message of the full personhood of women, children, slaves, and other marginalized and oppressed classes in Roman society, it drew its converts especially from these groups. In the fourth century, Christianity became the official, dominant, and popular religion of the Roman Empire and began to exert a major influence on law, thought, and culture in the West. Although it continued to struggle with the process of reconciling and synthesizing the Judaic and Hellenistic elements it had inherited, Christianity introduced a new and wider understanding of who was fully a person, a definition which included even unborn children and the lowest and most degraded segments of society. Popularized and refined throughout Late Antiquity and the Middle Ages, this definition became the standard understanding of what constitutes a human being in Western thought and, although it has been and continues to be challenged from various quarters, it remains the standard understanding today.

The Cold War and Modern Identity

Although the 20th century was a period of great trials and tribulations throughout the world, including the two world wars, the anti-colonialist movements throughout Asia, Africa, and elsewhere, and the many massacres and genocides, such as the Turkish massacre of Armenians and the Holocaust carried out in Nazi-occupied Europe, if a single defining event must be pinpointed, it must undoubtedly be the Cold War. The Cold War, which lasted for nearly half of the 20th century, saw first Europe and then most of the rest of the world divided into two camps, communist and authoritarian on one side and capitalist and democratic on the other. The split between these two groups of powers, the former headed by the Soviet Union and the latter led by the United States, was viewed by both sides as an apocalyptic struggle of good versus evil, liberty versus oppression, and democracy versus tyranny. Both sides of the Cold War, the communist and authoritarian as well as the capitalist and democratic, have deep roots in the history of Western civilization; the Cold War, then, represented a kind of coming of age and decision point in Western culture, in which sets of principles which had been in tension with one another nearly since the inception of Western thought finally reached a point at which one had to triumph over the other. Although, of course, the capitalist and democratic ideas won out over the communist and authoritarian, as with nearly any conflict of such a clearly Hegelian nature, the struggle produced a kind of synthesis in which the representatives of capitalism also absorbed portions of communism and the representatives of democracy also absorbed or made peace with elements of authoritarianism. In the end, the Cold War was not so much a victory for either side as an exercise in Hegelian dialectic, in which the final result was, while dominated by one side, a synthesis of both.

Although the birth of communism is most readily associated with the labor movements of the 19th century, and especially with the thought of Karl Marx and Friedrich Engels, the authors of the famous, or perhaps infamous, Manifesto of the Communist Party, the roots of communism, as even Marx and Engels point out in the Manifesto, run much deeper in history, extending to the very origins of Western thought in both of its earliest contributors, Greek philosophy and Jewish religion.1 The similarities between Marx’s ideas and the communal utopia expounded by Plato in his Republic are glaring and have been noted by many commentators in the past. Desmond Lee, a scholar of classics and ancient philosophy, for instance, has drawn attention to Plato’s injunction that “both private property and the family are to be abolished” in his utopia.2 The abolition of private property is, of course, a cornerstone of Marxist philosophy. Although the attempt would later be abandoned, especially during and following World War II, during its earlier, more idealistic phase the leadership of the Soviet Union, in hopes of creating a communist utopia, also made “a sustained effort … to undermine the family,” which included “establish[ing] collective kitchens and day care centers.”3 According to Nicholas V. Riasanovsky and Mark D. Steinberg, both professors of Russian history, “some Bolshevik leaders even spoke of ‘free love,’” a practice and principle which also bears a similarity to the counsel of Plato.4

In regard to the Jewish antecedents of communist thought, the prolific 20th century philosopher Bertrand Russell, among many others, has pointed out that the “soteriology” and “eschatology” of Marxism are essentially biblical in character; Russell even provides a handy “dictionary” for Marx’s ideas:

Yahweh=Dialectical Materialism
The Messiah=Marx
The Elect=The Proletariat
The Church=The Communist Party
The Second Coming=The Revolution
Hell=Punishment of the Capitalists
The Millennium=The Communist Commonwealth5

Marxist communism, both in the form developed by Marx himself and in its later developments in the Soviet Union, represents a combination of these and other similar elements in Western thought.

Similarly, democracy and capitalism in their modern liberal forms, which largely emerged from the thought of the Enlightenment, also have deep roots in Western thought. In the first book of history by the West’s first historian, The History of Herodotus, the wars between the Persians and the Greeks in the 5th century BC are identified as struggles between “freedom” and “slavery” and are consistently portrayed in such terms throughout.6 The Greek polis of Athens is, of course, generally identified as the world’s first democracy, and even Sparta, with its characteristically militaristic and authoritarian society, has traditionally been granted a measure of respect as in some sense embodying the first fundaments of later Western democratic ideals, as, for instance, in its insistence on multiple rulers who had to reach unanimous agreement in matters of policy so that no one individual could hold absolute power or unilateral decision-making authority.

Just as with communism, democracy and capitalism also had their antecedents in Jewish thought. Historian Thomas Cahill, for instance, has pointed out that “capitalism, communism, and democracy” are all in some sense

children of the Bible, … modeled on biblical faith and demanding of their adherents that they always hold in their hearts a belief in the future and keep before their eyes the vision of a better tomorrow, whether that tomorrow contains a larger gross domestic product or a workers’ paradise. … Democracy … grows directly out of the Israelite vision of individuals, subjects of value because they are images of God, each with a unique and personal destiny. There is no way that it could ever have been ‘self-evident that all men are created equal’ without the intervention of the Jews.7

While democracy, capitalism, and communism, as well as the measure of authoritarianism which the latter implies, all have roots in the very earliest origins of Western thought and have existed alongside each other in that thought as well as in practice since their inception, they have clearly existed in tension and in competition. With the onset of the Cold War, this tension took on new proportions and finally demanded a resolution.

The American poet Walt Whitman once poignantly wrote that it was on the United States that the “Earth’s résumé entire floats” and, addressing the United States itself, added “the antecedent nations sink or swim with thee.”8 In other words, the United States, in the view of Whitman, acts as the heir and representative of the entirety of the tradition of Western civilization. While there may be those who would debate Whitman’s point, there is undoubtedly a great measure of truth to it. The United States, more than any other nation, enshrined the democratic principles of Western thought in its founding documents and principles. No nation embodies Enlightenment thought on politics and economics, as well as in other areas, more than the United States. The principles of the equality of all men before the law, of popular participation in government and the insistence that the state possess the consent of the governed, of the freedom of the individual human conscience, and other similar principles which are essentially unique to Western thought all entered into the Declaration of Independence and the Constitution, two documents which might, not inaccurately, be referred to as American scripture.

In 1917, the Bolshevik Revolution and the transformation of the Russian Empire into the Soviet Union established what was, in a sense, the equal and opposite of the United States. If the United States can be considered the representative of the democratic and capitalist principles of Western thought, the Soviet Union can be seen as the embodiment of the authoritarian and communist principles. The Soviet government almost immediately set about trying to build an ostensibly more egalitarian society, “a new realm of freedom and equality, free of conflict.”9

The age-old dream of such a utopia was alluring even to those who lived in the capitalist democracies and republics of the United States and Western Europe. This is particularly true of Marxism’s claim that “the proletarian revolution marks the end of … [the] historic process.”10 David Gress, a historian whose work has focused on Western identity, has pointed out that this view of communism as replacing and surpassing, perhaps in some sense fulfilling, capitalist democracy drew the admiration of Western intellectuals for the Soviet Union. Following World War II, the collapse of European fascism, and the witnessing of worldwide atrocities, the conscience of the West was pricked. According to Gress, “what they needed was the secularized religious impulse that impelled political and intellectual leaders to continue the search for the perfect society, for the revolutionary transformation of all existing conditions, for the place and the moment of the leap into the kingdom of freedom.”11 It was this that allowed the Soviet Union to attain the “moral high ground of anticapitalism” in the minds of its own leaders as well as in the minds of many Westerners.12

Although the two had been rather cordial allies during World War II and had defeated Nazi Germany and its fascist ideology through their combined efforts, the United States and the Soviet Union were doomed to a deep split from one another. Almost immediately after their mutual victory over Germany, the two powers retreated from each other and entrenched themselves in their respective ideological camps. As early as 5 March 1946, less than a full year after the surrender of Nazi Germany to the Allied powers, Winston Churchill, who had served as Prime Minister of the United Kingdom during the majority of World War II, described this ideological split using the phrase “iron curtain,” which would later become popular parlance for describing the situation of the Cold War:

From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the Continent. Behind that line lie all the capitals of the ancient states of Central and Eastern Europe. Warsaw, Berlin, Prague, Vienna, Budapest, Belgrade, Bucharest and Sofia, all these famous cities and the populations around them lie in what I must call the Soviet sphere, and all are subject in one form or another, not only to Soviet influence but to a very high and, in many cases, increasing measure of control from Moscow.13

On the other side of that “iron curtain,” of course, were the United States and its democratic and capitalistic allies in Europe, including Churchill’s own United Kingdom. A line had been drawn in the proverbial sand. In the words of Louis J. Halle, a political scientist who worked in the U.S. State Department during the Cold War:

In ideological terms, the Cold War presented itself as a worldwide contest between liberal democracy and Communism. Each side looked forward to the eventual supremacy of its system all over the earth. The official Communist goal was the liberation of mankind from capitalist oppression. Ideologically minded Westerners interpreted this as signifying that Moscow was trying to impose its own authoritarian system on a world it meant to rule. Americans, for their part, had traditionally looked forward to the liberation of mankind from the oppression of autocracy, and to the consequent establishment of their own liberal system throughout the world. To the ideologists in Moscow this meant that “the imperialist ruling circles” in America were trying to enslave all mankind under the yoke of Wall Street.14

This ideological split and the consequent perceptions on either side of it would lead to one of the world’s most protracted and widespread conflicts, which played itself out on nearly every continent in wars both “hot” and “cold.”

The Cold War would, of course, end with the collapse of the Soviet Union in 1991. This collapse is popularly viewed as the final triumph of liberal democracy and capitalism over communism and authoritarianism. Some commentators, such as Francis Fukuyama, a former deputy director of the U.S. State Department’s policy planning staff, have even gone so far as to declare the end of the Cold War to be “the end of history,” in an ironic application of the same Hegelian ideas Marx used in declaring communism to be the final result of the historical dialectic.15

The truth of the situation, however, is that, in a far more Hegelian fashion, the result of the dialectic between the two antitheses was a synthesis. The United States, even while expounding the virtues of democracy, supported autocratic regimes throughout the world, such as that of Shah Mohammad Reza Pahlavi in Iran, on the condition that they opposed communism. While it could be argued that such support was hypocritical, it may also, more positively, be portrayed as an acknowledgement of the value of authoritarian rule in some cultural contexts. In addition, throughout the Cold War, the United States and, to an arguably greater extent, its European allies adopted a number of reforms which reflected the social ideals of communism, including protections for workers’ rights, social welfare systems, universal healthcare, and others. In the end, these concessions to communism are a large part of what brought down the Soviet Union; in granting that the communists had a point in their criticisms of wealth and poverty in the Western world and of the exploitation of the laboring class, the capitalist democratic nations regained the moral high ground and won the war of ideas. The West became the synthesis, rendering the antithesis obsolete.

Notes

1 Karl Marx and Friedrich Engels, Manifesto of the Communist Party, in Robert Maynard Hutchins, ed., Great Books of the Western World, Vol. 50: Marx (Chicago: William Benton, 1952), 419.

2 Desmond Lee, “Translator’s Introduction” in Plato, The Republic (New York: Penguin Books, 2003), xliv.

3 Nicholas V. Riasanovsky and Mark D. Steinberg, A History of Russia, Eighth Edition (New York: Oxford University Press, 2011), 595.

4 Ibid.

5 Bertrand Russell, A History of Western Philosophy (New York: Simon & Schuster, 1972), 364.

6 Herodotus, The History, Book IX, 45, in Robert Maynard Hutchins, ed., Great Books of the Western World, Vol. 6: Herodotus and Thucydides (Chicago: William Benton, 1952), 298.

7 Thomas Cahill, The Gifts of the Jews: How a Tribe of Desert Nomads Changed the Way Everyone Thinks and Feels (New York: Anchor Books, 1998), 249.

8 Walt Whitman, “Thou Mother With Thy Equal Brood,” 4, Leaves of Grass (New York: The Modern Library, 2001), 564.

9 Riasanovsky and Steinberg, History of Russia, 482.

10 Ibid., 481.

11 David Gress, From Plato to NATO: The Idea of the West and Its Opponents (New York: Simon & Schuster, 1998), 404.

12 Ibid.

13 Winston Churchill, “The Sinews of Peace,” http://www.nato.int/docu/speech/1946/s460305a_e.htm (accessed 30 December 2012).

14 Louis J. Halle, “The Cold War as History,” in Kevin Reilly, Readings in World Civilizations, Volume 2: The Development of the Modern World (New York: St. Martin’s Press, 1988), 265.

15 Francis Fukuyama, “The End of History?” in Marc A. Genest, ed., Conflict and Cooperation: Evolving Theories of International Relations, Second Edition (Belmont: Wadsworth, 2004), 393.


Was the American Civil War a Just War?

Introduction 

The American Civil War was a defining moment not only in the history of the United States but in the history of the world. As Walt Whitman, an eyewitness of the Civil War, poignantly wrote in his book of poetry Leaves of Grass, it was on the United States that the “Earth’s résumé entire floats” and “the antecedent nations sink or swim with thee.”1 In other words, the United States acted, and arguably still acts, as the heir and representative of the entirety of the tradition of Western civilization. In the insistence of the founders of the United States that the underlying, central, and governing principles of the new nation were to be that “all men are created equal, that they are endowed by their Creator with certain unalienable Rights,” and that governments “deriv[e] their just powers from the consent of the governed,” the formation of the United States became a culminating moment in the history of Western thought.2 Principles that were primary in and essentially unique to Western culture, such as the equality of all men before God and the law, the belief that all human beings are entitled to certain rights by virtue of being members of the human race, and the insistence that a government must have the consent of the governed, were identified as the principles upon which the United States would stand. The Civil War, then, represents a summarizing event in Western civilization; it stands in line with the Peloponnesian War, the triumph of Christianity in Late Antiquity, the split between Eastern and Western Christendom in 1054, and the Protestant Reformation as one of the greatest schisms in Western civilization. Both sides of the Civil War, the federal government and the incipient Confederate States of America, represented this common heritage in all its contradiction and complexity. Part of this common heritage is the Just War theory first developed by Greco-Roman thinkers like Aristotle and Cicero and brought to its culmination by medieval and early modern Christian thinkers such as St. Augustine of Hippo, St. Thomas Aquinas, and Hugo Grotius. Ironically, although both belligerents represented this common heritage, fought for different aspects of Western civilization, and saw themselves as fighting for a just cause, neither participant in the American Civil War can be said to have fought a just war, as both failed to meet the criteria of Just War theory.

Jus Ad Bellum

When considering whether a war met the criteria of Just War theory, the first question that must be answered is whether the reasons for going to war were just in the first place. In Latin, this stage of consideration is referred to as “Jus Ad Bellum,” meaning “right to war.” Traditionally, Just War theorists have identified four criteria that must be satisfied for a power to possess the right to go to war, namely, (1) just authority, (2) just cause, (3) just intention, and (4) last resort.3

1. Just Authority

The first criterion, just authority, requires that the powers initiating and engaging in hostilities possess the legitimate authority to do so. Thomas Aquinas summarizes this point in his Summa Theologica in his claim that “in order for a war to be just” there must be a “sovereign” with valid authority “by whose command the war is to be waged” because “it is not the business of a private person to declare war” nor “the business of a private person to summon together the people, which has to be done in wartime.”4 While it is apparent that the federal government of the United States meets this criterion, the government of the Confederacy does not appear to do so.5 Theoretically, it could be argued that the central government of the Confederacy derived its authority from the states which chose to enter into it, states which were themselves undoubtedly legitimate governing authorities, and that their legitimacy in turn lent legitimacy to the Confederate government as a kind of conglomerate government of these states. The Constitution of the United States of America, however, which all of the constituent states of the Confederacy had ratified, specifically grants the right “to raise and support Armies” to the federal government alone.6 Furthermore, the Constitution provided no means for, and did not even seem to envision the possibility of, any state or group of states deciding to leave the Union, a fact which Abraham Lincoln himself pointed out in his First Inaugural Address, delivered on 4 March 1861:

It is safe to assert that no government proper ever had a provision in its organic law for its own termination. Continue to execute all the express provisions of our National Constitution, and the Union will endure forever, it being impossible to destroy it except by some action not provided for in the instrument itself.7

In addition, as Charles Guthrie and Michael Quinlan point out in their treatment of Just War theory in the modern world, “historically,” the criterion of just or competent authority “has usually meant the ruler or government of a sovereign state, as opposed to an internal warlord or faction.”8 In other words, traditional Just War theory does not seem to countenance a civil war, no matter how ostensibly just the cause. The Confederacy, then, fails to meet the criterion of just authority.

2. Just Cause

A just cause for war is perhaps the most central and important of the criteria of Jus Ad Bellum. Even those who are entirely unversed in the niceties of Just War theory and international law generally demand that there be a just cause for the initiation of military action by one nation upon another. To determine whether either or both sides of the Civil War possessed a just cause for war, the reasons for the conflict as viewed and enunciated by each side must be examined. Although a variety of causes led to the Civil War, two overarching issues stand behind them all: (1) a dispute over the role of the federal government in relation to the rights of the states to govern themselves and (2) slavery, arguably the deepest of all the war’s underlying causes.

From a Southern perspective, the ultimate cause of the Civil War was the infringement on the rights of the states by the federal government. As the website of the Civil War Trust, a non-profit organization dedicated to the preservation of historical sites related to the Civil War, succinctly states, “Southerners were sure that the North meant to take away their right to govern themselves, abolish slavery, and destroy the Southern economy.”9 From this perspective, it is possible to see the Civil War as a struggle by the Confederacy against the tyranny of the United States government, which would seem to indicate a just cause. If the implications of and reasons for the Southern cry of “states’ rights” are examined more deeply, however, the roots thus uncovered overturn such a conclusion.

Ultimately, the states’ right that Southerners demanded was the right to determine the legality of slavery. According to Alexander Hamilton Stephens, the vice-president of the Confederacy,

our new government is founded upon exactly the opposite idea [from abolition]; its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man, that slavery — subordination to the superior race — is his natural and normal condition.10

While Union leaders, on the other hand, identified the war primarily as “a struggle to preserve the Union” early in the conflict, they quickly realized that the only way to preserve the Union was to agree with the Confederate leadership that the war was primarily about slavery and to adopt the equal and opposite position of those like Stephens, seeking instead to “reconstruct the Union into the nation it should have been without slavery.”11 In its very essence, then, at its deepest roots, the Civil War was a war about slavery.

Although slavery has been practiced throughout most of the history of the world, including in those societies that make up Western civilization, the practice has also drawn unequivocal condemnation from within the Western tradition. As historian Thomas Cahill notes, “in the prescriptions of Jewish law we cannot but note a presumption that all people, even slaves, are human and that all human lives are sacred.”12 From these ancient Jewish roots, Christianity derived its “claim that all were equal before God and all equally precious to him,” a claim which “ran through class-conscious, minority-despising, weakness-ridiculing Greco-Roman society like a charged current” and overturned the previous ideological foundations upon which Western society had based its belief in the legitimacy of the practice of slavery.13 14 As a result of this claim, early Christian thinkers and leaders like the late fourth-century bishop St. Gregory of Nyssa became among the first writers in the world to adopt a truly abolitionist position towards slavery and to oppose the practice on principle.15 As Cahill has pointed out, it is only within the context of this strain of thought that a claim like that of the American Declaration of Independence that it is “self-evident that all men are created equal” can make any sense at all or, for that matter, “could ever have been” made in the first place.16 If such a claim is accepted as true, whether self-evidently or not, slavery must, by implication, be viewed as immoral per se.

If the ultimate and underlying cause of the Civil War for both sides is indeed slavery, it is this issue which must determine which side, if either, had a just cause for the initiation of hostilities. According to St. Augustine of Hippo, as quoted by Thomas Aquinas in his discussion of Just War, “a just war is … one that avenges wrongs, when a nation or state has to be punished, for refusing to make amends for the wrongs inflicted by its subjects, or to restore what it has seized unjustly.”17 Given that slavery is a “wrong” in need of “punishment” and that the Confederacy had “unjustly” seceded from the Union for this cause, the federal government here again meets the criterion of just cause whereas the Confederacy fails to do so.

3. Just Intention

The third criterion of Jus Ad Bellum, just intention, requires that the belligerents involved in a war have the correct intentions in commencing hostilities. In the succinct phrasing of Aquinas, just intention requires that belligerent powers “intend the advancement of good, or the avoidance of evil.”18 They cannot intend to inflict hateful or undue punishments upon their opponents but only to redress the wrongs for which they are going to war. In this regard, again, the Union seems to have the advantage over the Confederacy in meeting the criteria of Just War theory.

Although, as will be discussed in the section on Jus In Bello, the Union often failed to live up to its intentions, it is clear from both his words and his actions that President Abraham Lincoln, as the leader of the Union, desired “to achieve peace as expeditiously as possible.”19 His goal from the beginning of the war and throughout its duration was to end the conflict and reintegrate the South into the Union as quickly and easily as possible. To this end, he opposed those members of his own political party who called for more radical measures in punishing the South’s political and military leadership as well as its economic aristocracy at the end of the war. Although he insisted upon the emancipation of blacks and the abolition of slavery throughout the United States, he was, not to his credit, even willing to compromise on the enfranchisement of former slaves and other blacks as full citizens with voting rights in order to satisfy the prejudices and alleviate the fears of Southern whites, stating in his final speech before his assassination that he desired that, among blacks, only “the very intelligent” and Union veterans of the Civil War be granted the right to vote.20

In contrast to these rather amicable intentions on the part of the highest leadership in the federal government stands the rancor that dominated the intentions of the highest leadership in the Confederate government. In his Normans and Saxons, an intellectual history of the idea of race in its relation to the Civil War, Ritchie Devon Watson, Jr., demonstrates that the rhetoric of white Southerners against blacks, Northern whites, and other target groups exceeded mere polemic and entered the realm of vitriolic demonization.21 One example of the existence and nature of such hatred even among the highest ranks in the Confederacy may be found in the apparent approval of Jefferson Davis, the president of the Confederacy, for the assassination of Lincoln.22 On this point of Just War theory, as on those previously considered, the Union once again meets the criterion whereas the Confederacy fails to measure up.

4. Last Resort

The final essential ingredient of Jus Ad Bellum, according to classical formulations of Just War theory, is that the resort to armed conflict be a last resort. Even if just authority, just cause, and just intention all exist, warfare must itself be the final and even unavoidable course of action in order for engagement in warfare to be deemed just. Augustine goes so far as to say that in order for a war to be just the nation which engages in it and its leader must be compelled by force of necessity to enter into warfare, claiming that “it is the wrongdoing of the opposing party which compels the wise man to wage just wars.”23 If either side in the Civil War can be said to have been compelled to enter the war by force of necessity, it must be the Union.

While there are many events which contributed to the eventual outbreak of open conflict between North and South, the 1860 election of Abraham Lincoln to the presidency was undoubtedly the spark that ignited it. The crisis created by the Kansas-Nebraska Act of 1854, in which antislavery and proslavery factions vied to populate the territories with their own members, and, by extension, to depopulate them of members of the other faction, in order to ensure that the new territories entered the Union as free or slave states, respectively; the 1859 attack of John Brown and his men upon the federal arsenal at Harpers Ferry, Virginia, by which Brown hoped to spark a war over slavery; and other similar events created a tension which hung thick in the air in 1860. The election of Lincoln, who had run for the Senate from Illinois only two years earlier on what has been described as “a strong anti-slavery ticket,” was the final straw as far as Southerners were concerned.24

Although he did not receive a majority of the popular vote, Lincoln did receive a strong plurality among the four candidates for the presidency. Whereas his Democratic opponent, Stephen Douglas, carried 29.5% of the vote, Lincoln took 39.9%, more than enough to represent a decisive victory.25 In the words of historian William E. Gienapp, “the northern majority possessed the power to which it was entitled. Yet southerners refused to accept the popular verdict.”26 According to historian William C. Harris, who, in turn, relies upon the account of historian John William Draper, Jefferson Davis himself, when asked by two Northerners during the war why the South had seceded, plainly replied, “we seceded to rid ourselves of the rule of the majority.”27 In short, in the words of Harris, “Southern failure to abide by majority rule was at the center of the secession crisis.”28 29

Lincoln, on the other hand, tried to prevent Southern secession and the outbreak of war. Although he was portrayed by those who wanted to stoke Southern fears as a “black Republican” and an “abolitionist,” and although he had voiced opposition to slavery in the past, Lincoln continually reassured those who would listen to him that he was no radical and did not plan to drastically overturn the state of things in the United States.30 His priorities, as he himself said, were to maintain the Union, to enforce its laws as they stood, and to seek peaceful resolutions to the conflicts and complexities that plagued it. The South, however, hardly gave him the opportunity to even begin taking action. Only “one month after Lincoln was elected president, the state of South Carolina announced its secession from the Union” and “within a few weeks, Mississippi, Florida, Alabama, Georgia, Louisiana, and Texas followed suit.”31 The Confederates were also the first to engage in violence against the other side, firing the opening shots of the Civil War at Fort Sumter, South Carolina, on 12 April 1861. Even in his Second Inaugural Address, delivered on 4 March 1865 as the war was drawing to a close, Lincoln expressed a belief, perhaps solidified over the course of a war he had first sought to prevent and then tried desperately to shorten and soften, failing in both aims, that the United States had been inexorably drawn into the war by divine mandate:

If we shall suppose that American slavery is one of those offenses which, in the providence of God, must needs come, but which, having continued through His appointed time, He now wills to remove, and that He gives to both North and South this terrible war as the woe due to those by whom the offense came.32

In its rush to and insistence upon secession, the Confederacy yet again failed to meet the standard set by Just War theory. The Union, on the other hand, especially in its leader’s willingness to continue negotiating through the differences of ideology and practice that separated the two major regions of the nation and in his stated commitment to place the peace and preservation of the Union foremost among his desires, successfully satisfied the criterion of last resort. There can be little doubt that, in meeting Augustine’s requirement that a just war be one which a national power is compelled to enter by force of necessity, the Confederacy fell far short while the Union succeeded.

5. Conclusion

In final consideration of the four criteria of Jus Ad Bellum, the Union is shown to have had the “right to war” in the Civil War whereas the Confederacy did not. Whereas the federal government was a legitimate and sovereign governing authority, the Confederacy, as a rebellious group rising against its legitimate government, failed to meet the criterion of just authority. The federal government also satisfied the criterion of just cause in its desire both to preserve its sovereign territory and to end the gravely unjust practice of slavery within its borders, whereas the Confederacy’s quest to uphold the institution of slavery, an institution unjust per se, is clearly an unjust cause for war. The Union’s aim of reintegrating the Southern states quickly and peacefully also satisfied the criterion of just intention, whereas the vitriolic hatred exhibited by the highest ranks of Confederate leadership for blacks, Northern whites, and anyone else opposed to its cause runs obviously contrary to that criterion. Finally, the South’s overeager rush to war presents a stark contrast with the nearly desperate pleas of the leadership of the federal government for a peaceful resolution to the internal dissensions of the United States, demonstrating that only the federal government meets the criterion of last resort. In short, the Union adequately satisfied the criteria of Jus Ad Bellum, whereas the Confederacy did not.

Jus In Bello

The next series of points which must be considered in a discussion of whether a specific war can be considered just according to traditional formulations of Just War theory is the set of criteria which fall under the category “Jus In Bello,” a Latin phrase meaning “justice in war.”33 As the name of this set of criteria indicates, Jus In Bello involves the consideration of whether the actual conduct of a particular belligerent in a war was just. The three criteria of Jus In Bello are (1) proportionality, (2) discrimination, and (3) responsibility. Whereas the federal government adequately satisfied all of the criteria of Jus Ad Bellum, both the Confederacy and the Union failed to satisfy any of the three criteria of Jus In Bello. The actions of the Union army which entered into and crossed through Georgia under General William Tecumseh Sherman perhaps best demonstrate the failures of both sides in the Civil War to conduct a just war. Sherman’s infamous March to the Sea, which has been remembered by subsequent generations largely for its brutality, serves as an outstanding case study in the failure of both powers in the Civil War to practice just conduct within warfare.

1. Proportionality

The first criterion of Jus In Bello is proportionality, which requires that the methods and amount of force used during warfare be proportionate to their desired effect. In other words, even a belligerent power possessing just cause and just intention may use only the minimum force necessary to achieve its intention and satisfy its cause. The actions of the Union army under General Sherman, in flagrant defiance of this criterion, exemplify disproportionality in wartime conduct.

Even before their March to the Sea, more properly referred to as the Savannah Campaign, the Union troops led by Sherman proved their preference for cruelty and their penchant for disproportionality. The burning of Atlanta, Georgia, is one example. On 14 November 1864, just over two months after his army had captured the city, Sherman ordered the destruction of the entire city of Atlanta. According to historian Russell S. Bonds, approximately 4000 homes and businesses were burned to the ground; only about 400 buildings, roughly a tenth of the city, remained standing.34 In a description reminiscent of the common, even if probably false, depiction of the burning of Rome, during which the Emperor Nero, ostensibly the perpetrator of the crime, arrayed himself in a stage costume and sang a song, Union officer Captain Daniel Oakey reported that, while Atlanta burned, the Second Massachusetts’s “post band and that of the Thirty-third Massachusetts played martial airs and operatic selections.”35 36

Whatever the accuracy or lack thereof of this grotesque picture, there can be little doubt that the burning of Atlanta was an act of gross disproportionality in the conduct of warfare. It was, however, only the beginning. The March to the Sea that commenced with the burning of Atlanta continued for more than a month, with the federal troops under Sherman “creating a charred avenue over 40 miles wide through the unprotected State [of Georgia], destroying the railroads, seizing all provisions, pillaging, plundering and burning.”37 Sherman’s actions were drastically disproportionate to the cause and intentions of the federal government; the Union and its leaders, then, especially Sherman, failed to meet the criterion of proportionality.

2. Discrimination

The second criterion of Jus In Bello is discrimination, which refers to the responsibility of the belligerent power to discriminate between military and civilian targets, striking only the former while avoiding as much as possible any damage to the latter. Sherman’s burning of Atlanta and the entirety of his Savannah Campaign once again demonstrate the failure of the federal forces engaged in the Civil War to conduct themselves justly on this point. Not only did Sherman fail to distinguish between military and civilian targets, he actively ordered and encouraged his troops to raid and attack civilian targets.

While his troops were in the Carolinas, for instance, following their march through Georgia, Sherman sent out foraging parties, known as “Sherman’s bummers,” which became a well-known and much-despised presence among the civilian population.38 These “bummers” were notorious among the civilian populations of the Carolinas for their lewd and disrespectful demeanor and for “pillaging and burning” food and other necessary supplies that were often extremely scarce in the South during the war.39

When his “bummers” began to be found murdered wearing signs indicating “death to all foragers,” Sherman offered pale and unacceptable excuses for their behavior. He wrote to one of the generals under him, for instance, that “I contend if the enemy fails to defend his country we may rightfully appropriate what we want.”40 41 He added the further justification that he believed his troops had the right to “destroy cotton and tobacco,” in spite of the fact that these crops were grown by civilians on privately-owned property and often represented the livelihood of those who grew them, “because these things are assumed by the rebel Government to belong to it, and are used as a valuable source of revenue.”42 For Sherman, nearly every Southerner was in some sense an enemy, complicit in the Confederate rebellion against the federal government and liable to punishment for his or her complicity. Every target, then, was, in some sense, a civilian target.

Using a similar line of reasoning, Sherman justified his burning of Atlanta by claiming that the city had been, and could again be after the departure of his troops to continue their march, put to military use.43 This is hardly a valid reason, however, to destroy nearly an entire city, including thousands of private homes and businesses. Years after the Civil War, Sherman would, perhaps in an attempt, whether conscious or not, to justify his actions during the war, tell a crowd of listeners, “there is many a boy here today who looks on war as all glory. But boys it is all hell.”44 45 In the end, it is abundantly clear that Sherman and the Union forces of which he was a leader refused to distinguish between combatants and noncombatants; as a result, they failed to meet the criterion of discrimination.

3. Responsibility

According to Jon Dorbolo, the third and final criterion of Jus In Bello, responsibility, itself divides into three parts.46 According to this criterion, a belligerent power is not responsible for the negative consequences of a war, and therefore not itself unjust in spite of the injustice which inevitably accompanies armed conflict, if (a) the course of action which caused the negative consequences was intended for good, (b) that course of action was not intended for bad, and (c) the overall good outweighs the bad.

It could be argued that even Sherman’s March to the Sea, in spite of all its apparent brutality, does in fact fit the criterion of responsibility and therefore qualifies as Jus In Bello. It was, so this argument runs, only what was necessary to end the war as quickly as possible. By demoralizing Southerners and destroying their means of subsistence in addition to their military supplies, Sherman stripped them of their will to fight and so brought about the end of the war. If this is true, it can be argued that Sherman’s actions were intended for good, were not intended for bad, and, given that he accomplished his goal of bringing about the end of the war, that this good outweighs all of the bad he did in order to achieve it. Even Sherman himself, after all, once said, only a few months after his brutal Atlanta and Savannah campaigns, that “the legitimate object of war is a more perfect peace.”47

Such a line of reasoning, however, does not stand up to scrutiny. In the end, it amounts to little more than a Machiavellian assertion that the ends justify the means. If Sherman’s March to the Sea is allowed as somehow “just” simply because it contributed to the eventual Confederate surrender and Union victory in the Civil War, nearly any conduct within warfare can be twisted to fit the definition of Jus In Bello. While it can be admitted that Sherman’s actions contributed substantially to the fall of the Confederacy and the triumph of the Union, this admission can in no way be used to justify those actions as having been just per se.

4. Conclusion

The only sound conclusion that can be reached with regard to Jus In Bello and the Civil War is that neither belligerent power met any of the criteria. Both sides in the Civil War failed to practice proportionality and discrimination. As a result, both sides bear the full burden of responsibility for the negative consequences of their actions.

Jus Post Bellum

Although not included in the classical treatments of Just War theory, the concept of Jus Post Bellum, or “justice after war,” has become a standard aspect of modern formulations of Just War theory and seems a fitting conclusion to any discussion of the subject.48 Brian Orend, one of the first of the modern Just War theorists to discuss the concept of Jus Post Bellum, outlined two criteria in particular: (1) compensation and (2) rehabilitation. Drawing upon earlier and generally accepted formulations of Just War theory, Orend posits that, in short, the victor in a war must not exact undue punishment from the losing power but should instead assist in its attempts to rebuild and rehabilitate.

While the era of Reconstruction which followed the Civil War had both its accomplishments and its failures, a fair assessment would conclude that Reconstruction largely met the criteria of Jus Post Bellum as outlined by Orend. The Union succeeded in reintegrating the South into the United States in a relatively expeditious manner. Efforts were made to rebuild the South, and what few punishments were imposed upon the former Confederacy and its leaders, such as the disenfranchisement of many Southerners and the imprisonment of leaders like Jefferson Davis, were generally, for better or worse, short-lived. The failure that lingers over Reconstruction is, ultimately, its inability to integrate the newly freed slaves and other blacks throughout the United States while simultaneously reintegrating the whites of the South into the fabric of American life and politics. These two goals appear to have been mutually exclusive in practice. As a result, the unequivocal recognition of full citizenship for black Americans was delayed for nearly 100 years, and a long era of segregation, lynching, second-class citizenship, distrust, and hatred settled into Southern life and into American life as a whole. In consideration of this, it could be said that the United States also failed to accomplish Jus Post Bellum in that it did not fully satisfy the criterion of rehabilitation, or at least took an inordinately long time to do so.

Conclusion

The American Civil War, as the outbreak of armed conflict over a rift that had existed in the fabric of Western civilization nearly since its infancy, embodied a certain tension in Western thought and finally determined the course that Western civilization would take on the questions of slavery, liberty, equality, and democracy. Although the Civil War, on both sides, was truly representative of the heritage of the Western tradition, neither belligerent satisfied all of the criteria of Just War theory, a central aspect of Western thought on warfare and international relations.

While the Union met the criteria of Jus Ad Bellum, qualifying as having just reason and authority to engage in warfare, it failed to maintain justice throughout the war and so to satisfy the criteria of Jus In Bello. In addition, although it could be argued that the efforts of the federal government to reintegrate white Southerners into the mainstream of the United States indicate that the Union satisfied the criteria of Jus Post Bellum, it should also be pointed out that in allowing the reentrance of Southern whites into American life, a very large number of human beings, namely freed slaves and other blacks, were excluded from meaningful participation in that life and denied justice. Injustice was also allowed to continue in the South, in spite of the end of slavery, in the form of segregation and oppression targeting blacks and other ethnic and religious minorities. The other belligerent power in the war, the Confederacy, failed to satisfy any of the criteria of Just War theory. In the final analysis, then, although the Civil War accomplished the good of finally ending slavery in the United States, a power representative of and at the helm of Western civilization, it must be concluded that the American Civil War was not a just war.

Notes

1 Walt Whitman, “Thou Mother With Thy Equal Brood,” 4, Leaves of Grass (New York: The Modern Library, 2001), 564.

2 Declaration of Independence, http://www.ushistory.org/declaration/document/index.htm (accessed 23 December 2012).

3 Jon Dorbolo, “Just War Theory,” Oregon State University (2010) http://oregonstate.edu/instruct/phl201/modules/just_war_theory/criteria_intro.html (accessed 23 December 2012).

4 Saint Thomas Aquinas, Summa Theologica, Part II, Section II, Q. 40, Art. 2, ed. Robert Maynard Hutchins, Great Books of the Western World, Vol. 20 (Chicago: William Benton, 1952), 578.

5 Although it could be and has been argued that the incipient American government failed to meet this criterion in the Revolutionary War, the consequences of such a determination for the Civil War are ambiguous. One could even use the assumed inherent righteousness of the American cause in the Revolution, coupled with the lack of support for any revolution at all in Just War theory, as a reductio ad absurdum against Just War theory. Free of the assumption of the justness of the American cause against the British monarchy, however, the case could also be made that the American Revolution was in fact unjust. One example of a paper which argues that the American Revolutionary War was an unjust war is John Keown, “America’s War for Independence: Just or Unjust?,” Kennedy Institute of Ethics, Georgetown University, http://kennedyinstitute.georgetown.edu/files/KeownAmericasWar.pdf (accessed 23 December 2012).

6 The United States Constitution, Article I, Section 8, item 12, http://constitutionus.com/ (accessed 23 December 2012).

7 Abraham Lincoln, “First Inaugural Address,” http://www.bartleby.com/124/pres31.html (accessed 23 December 2012).

8 Charles Guthrie and Michael Quinlan, Just War: The Just War Tradition: Ethics in Modern Warfare (New York: Walker & Company, 2007), 13.

9 “States’ Rights: The Rallying Cry of Secession,” Civil War Trust (2011) http://www.civilwar.org/education/history/civil-war-overview/statesrights.html (accessed 23 December 2012).

10 Alexander Hamilton Stephens, in David J. Eicher, The Longest Night: A Military History of the Civil War (New York: Simon & Schuster, 2002), 49.

11 Ibid., 364-5.

12 Thomas Cahill, The Gifts of the Jews: How a Tribe of Desert Nomads Changed the Way Everyone Thinks and Feels (New York: Anchor Books, 1998), 154.

13 Thomas Cahill, Mysteries of the Middle Ages: The Rise of Feminism, Science, and Art from the Cults of Catholic Europe (New York: Doubleday, 2008), 44.

14 Aristotle, for example, argues in his Politics, Book I, Chapters 3-6, as elsewhere, that there are those who are “intended by nature to be a slave” and those, on the other hand, who are naturally masters. The Confederate racial ideology as elucidated by Stephens, though never fully developed, seems to have been a revival of this way of reasoning, which further exhibits the nature of the American Civil War as a civil war in Western civilization as a whole, perhaps between the Hebraic and Greco-Roman strands thereof. (Aristotle, Politics, in Aristotle II, ed. Robert Maynard Hutchins, Great Books of the Western World, Vol. 9 (Chicago: William Benton, 1952), 446-9.)

15 Although the sentiment is common to many early Christian writers, Gregory of Nyssa is singled out for having issued one of the clearest calls for abolition in the ancient world in his fourth homily on Ecclesiastes; see Eric Denby, “The First Abolitionist? Gregory of Nyssa on Ancient Roman Slavery,” 9 May 2011, http://www.academia.edu/1485109/The_First_Abolitionist_Gregory_of_Nyssa_on_Ancient_Roman_Slavery (accessed 23 December 2012).

16 Cahill, Gifts of the Jews, 249.

17 Augustine of Hippo, in Aquinas, Summa Theologica.

18 Aquinas, Summa Theologica.

19 Eric Foner, Reconstruction: America’s Unfinished Revolution, 1863-1877 (New York: HarperCollins Publishers, Inc., 2002), 73-4.

20 Abraham Lincoln, in Foner, Reconstruction, 74.

21 Ritchie Devon Watson, Jr., Normans and Saxons: Southern Race Mythology and the Intellectual History of the American Civil War (Baton Rouge: Louisiana State University Press, 2008).

22 “Jefferson Davis and the Assassination,” University of Missouri – Kansas City School of Law, http://law2.umkc.edu/faculty/projects/ftrials/lincolnconspiracy/davistestimony.html (accessed 23 December 2012).

23 St. Augustine, The City of God, Book 4, Chapter 14, tr. Marcus Dods, in Robert Maynard Hutchins, ed., Augustine (Chicago: William Benton, 1952), 196.

24 Thomas H. Flaherty, ed., The Colonial Overlords (TimeFrame AD 1850-1900) (Alexandria: Time-Life Books, 1990), 140.

25 “Election of 1860,” The American Presidency Project, http://www.presidency.ucsb.edu/showelection.php?year=1860 (accessed 23 December 2012).

26 William E. Gienapp, “The Republican Party and the Slave Power,” in Robert H. Abzug and Stephen E. Maizlish, eds., New Perspectives on Slavery and Race in America: Essays in Honor of Kenneth M. Stampp (Lexington: University Press of Kentucky, 1986), 64-65.

27 John William Draper, in William C. Harris, “Abraham Lincoln and Secession,” The Lincoln Institute Presents: Abraham Lincoln’s Classroom, http://www.abrahamlincolnsclassroom.org/library/newsletter.asp?ID=140&CRLI=197 (accessed 23 December 2012).

28 Harris, “Abraham Lincoln.”

29 This conflict between the democratic principle of majority rule, enshrined in the Constitution, and the interests of the wealthy and powerful Southern aristocracy exhibits another way in which the American Civil War represents the summarizing of a conflict that had long troubled Western civilization as a whole, namely the conflict between the oligarchic and democratic forms of government. This rift in Western thought makes perhaps its first appearance in a written document with Herodotus, The History, Book III, pars. 80-3, in which passage the respective merits and demerits of monarchy, democracy, and oligarchy are discussed and debated. The history of Athens, arguably the world’s first democracy, also exhibits this tension. (Herodotus, The History, in Herodotus and Thucydides, ed. Robert Maynard Hutchins, Great Books of the Western World, Vol. 6 (Chicago: William Benton, 1952), 107-8.)

30 Harris, “Abraham Lincoln.”

31 Flaherty, Colonial Overlords, 140.

32 Abraham Lincoln, “Second Inaugural Address,” http://www.bartleby.com/124/pres32.html (accessed 23 December 2012).

33 Dorbolo, “Just War Theory.”

34 Russell S. Bonds, War Like the Thunderbolt: The Battle and Burning of Atlanta (Yardley: Westholme Publishing, 2009), 363.

35 For a classical presentation of the common depiction of the burning of Rome, see Suetonius, “The Life of Nero,” 38, in The Lives of the Caesars, http://penelope.uchicago.edu/Thayer/E/Roman/Texts/Suetonius/12Caesars/Nero*.html (accessed 23 December 2012).

36 Daniel Oakey, in “Sherman in Georgia!,” Home of the American Civil War (10 February 2002) http://www.civilwarhome.com/shermangeorgia.htm (accessed 23 December 2012).

37 “Sherman in Georgia!”

38 “The Carolinas Campaign: Death To All Foragers,” Wade Hampton Camp, http://www.wadehamptoncamp.org/hist-hvs.html (accessed 23 December 2012).

39 John G. Barrett, Sherman’s March Through the Carolinas (Chapel Hill: University of North Carolina Press, 1956), 96.

40 William T. Sherman, in “The Carolinas Campaign.”

41 Sherman’s statement sounds very similar to the claim of Aristotle in his Politics, Book I, Chapter 8, in which he asserts that “the art of war is a natural art of acquisition, an art which we ought to practise … against men who, though they be intended by nature to be governed, will not submit; for war of such a kind is naturally just.” In short, Aristotle, in a foreshadowing of Sherman, claims that it is right to take what one’s enemy cannot prevent one from taking and that the ability to acquire indicates that it is naturally just to do so. A similar sentiment is expressed in the famous Melian dialogue recorded in Thucydides’s account of The History of the Peloponnesian War, Book V, par. 89, in which the Athenians nonchalantly inform the Melians that “the strong do what they can and the weak suffer what they must.” This ethic of “might makes right” perhaps indicates the similarity of Sherman’s ideas of warfare to those developed before the advent of a full-fledged Just War theory following the triumph of Christianity in the Roman Empire. (Thucydides, The History of the Peloponnesian War, in Herodotus and Thucydides, ed. Robert Maynard Hutchins, Great Books of the Western World, Vol. 6 (Chicago: William Benton, 1952), 505.)

42 Ibid.

43 “Sherman’s March to the Sea,” Home of the American Civil War (16 February 2002) http://www.civilwarhome.com/marchtothesea.htm (accessed 23 December 2012).

44 Sherman, in Eicher, Longest Night, 847.

45 This statement presents an interesting contrast with the claim of the Presocratic Greek philosopher Democritus, as recorded by Plutarch, that men “ought to be instructed in the art of war … which is a source of great and glorious things for men,” in Plutarch, Against Colotes, 1126A. It demonstrates that even in the case of someone like Sherman, whose approach to warfare was far more in line with combat before the full flowering of Just War theory in the Christian era, perspectives had been altered and shaped by the introduction of new ideas on warfare. (Jonathan Barnes, Early Greek Philosophy (New York: Penguin Books, 2001), 229.)

46 Dorbolo, “Just War Theory.”

47 Sherman, in Eicher, Longest Night, 847.

48 Brian Orend, “Justice after War,” Carnegie Council for Ethics in International Affairs, http://www.carnegiecouncil.org/publications/journal/16_1/articles/277.html/_res/id=sa_File1/277_orend.pdf (accessed 23 December 2012).
