Personhood in Greco-Roman Thought and Practice (Personhood, Part II)

The very narrow understanding of personhood in Greek thought is evident already in the earliest texts of Western civilization, the Iliad and the Odyssey, both attributed to the poet Homer and composed in about the eighth century BC.1 Both works limit their purview to the lives of male Greek aristocrats. The concerns of women and children are treated only insofar as they affect the men; the concerns of slaves, of the poor, of the handicapped, and of other such groups are never considered at all. The world of Homer is the world of a small but powerful elite class.

Later developments in Greek thought served to justify this narrow definition of personhood. Aristotle, for instance, writing in the fourth century BC, provided in his Politics a succinct list of groups explicitly excluded from the category of personhood, as well as a justification for the exclusion of each: “Although the parts of the soul are present in all of them, they are present in different degrees. For the slave has no deliberative faculty at all; the woman has, but it is without authority; and the child has, but it is immature.”2 Because of their lack of “the deliberative faculty,” Aristotle claims that slaves, along with “brute animals[,] … have no share in happiness or in a life based on choice.”3 Similarly, says Aristotle, “the female is, as it were, a mutilated male.”4 Aristotle also excluded the lower classes, the poor, and even laborers from his definition of personhood, arguing, for instance, that “the life of mechanics and shopkeepers … is ignoble and inimical to goodness.”5 He likewise placed the entirety of the non-Greek population into the category of those lacking “the deliberative faculty,” asserting that “barbarians … are a community of slaves” who should rightfully be ruled by the Greeks.6

These negative assessments regarding the personhood of women, slaves, children, barbarians, and others in the writings of Aristotle can be taken as representative of Greco-Roman thought more generally. The Leges Duodecim Tabularum, or Law of the Twelve Tables, for instance, a document of the fifth century BC which formed the foundation of Roman law, institutionalized the systematic marginalization and oppression of these groups within Roman society.7 In the Twelve Tables, the male head of household was granted the right to dispose of the women, children, and slaves within his household in the same manner as the animals and other property under his control, including the right to sell them and even to kill them; the Tables, in fact, ordered him to kill any children born with deformities (Table IV). Women, being property themselves, were denied the rights of property ownership (Table VI). Marriages between members of the aristocracy and members of the lower classes were banned outright (Table XI). In short, only an adult male member of the Roman aristocracy was granted full personhood in this initial document which governed and defined Roman society. This narrow understanding of personhood remained the standard in the Roman Empire until the fourth century.

Notes


1 Harold Bloom, Homer (New York: Infobase Publishing, Inc., 2009), 205.

2 Aristotle, Politics, in Aristotle: II, ed. Robert Maynard Hutchins (Chicago: William Benton, 1952), 1260a10-14.

3 Ibid., 1280a32-34.

4 Aristotle, On the Generation of Animals, in Aristotle: I, ed. Robert Maynard Hutchins (Chicago: William Benton, 1952), 737a26-7.

5 Aristotle, Politics, 1328b39-40.

6 Ibid., 1252b4-8.

7 The Laws of the Twelve Tables, http://www.constitution.org/sps/sps01_1.htm (accessed 24 March 2013).

Personhood in Late Antiquity: How Barbarians, Slaves, Women, and Children Became Persons (Personhood in Late Antiquity, Part I)

The Greco-Roman world, whose Hellenistic culture and thought dominated the West throughout Antiquity, possessed a very narrow definition of what constituted a person, a full and equal member of the human political and legal community with all of the rights and responsibilities that status confers. In large part, the full application of that term and the concept it represented were limited to free adult male Greek, or, later, Roman, aristocrats. Groups such as slaves, children, women, men who were not Roman citizens, the poor, and others who did not fit into this narrow category were excluded from full participation in personhood. Slaves alone constituted a third of the population of the Roman Empire, and women made up approximately half. The majority of the population of the Roman Empire, then, was seen as possessing less than full personhood. Groups that were denied full personhood were often subject to disdain, abuse, brutality, and even execution with no legal recourse.

The Jews, on the other hand, who made up a small but visible minority of subjects and citizens under Greek and Roman rule in Antiquity, held, because of their doctrine of the Imago Dei, a much wider understanding of personhood, one which included all members of the human species regardless of social status, age, gender, or nationality. As a result, Jewish law conferred upon slaves, women, children, the poor, and other such groups the status of full personhood and the rights associated with that status. Christianity emerged from Judaism in the first century AD and carried with it the idea of the Imago Dei, coupling with it its own original ideas of the Incarnation of God as man and the availability of salvation for all people through recapitulation.

Already heavily influenced by Hellenistic thought from its inception, Christianity in large part became a point of synthesis between Judaism and Hellenism beginning in the second century, as an increasing number of converts to the incipient religion came from segments of the Roman Empire outside of the Jewish community. Because of its message of the full personhood of women, children, slaves, and other marginalized and oppressed classes in Roman society, it drew its converts especially from these groups. In the fourth century, Christianity became the official, dominant, and popular religion of the Roman Empire and began to exert a major influence on law, thought, and culture in the West. Although it continued to struggle with the process of reconciling and synthesizing the Judaic and Hellenistic elements it had inherited, Christianity introduced a new and wider understanding of who was fully a person, a definition which included even unborn children and the lowest and most degraded segments of society. Popularized and refined throughout Late Antiquity and the Middle Ages, this definition became the standard understanding of what constitutes a human being in Western thought and, although it has been and continues to be challenged from various quarters, it remains the standard understanding today.

Race and Representation in the Gilded Age: Popular Culture and Depictions of Marginalized Racial Groups

The end of the nineteenth century and beginning of the twentieth saw the rise of the first genuine popular culture in the United States. New advances in technology coupled with an increase of leisure time and extra money among a significant portion of the American population made this popular culture possible. Popular culture exerted a major influence on American life as its various mediums were used to create a cultural homogeneity which had not previously existed as well as to reinforce white cultural hegemony through propagating stereotypes about marginal groups.

An important distinction that must be made is that between folk culture and popular culture. As American cultural commentator Dwight Macdonald pointed out, “folk art grew from below” as the “spontaneous, autochthonous expression of the people” who were “without the benefit of High Culture.”1 In other words, in the absence of access to more refined artistic and cultural forms, folk art was a natural aesthetic outgrowth from people who wished to express themselves artistically. Popular culture, on the other hand, Macdonald goes on, “is imposed from above.”2 Its creation and dissemination are controlled by capitalists who “exploit the cultural needs of the masses in order to make a profit and/or to maintain their class-rule.”3 Popular culture, then, acts as “an instrument of political domination.”4 In this way, popular culture becomes the vehicle for the imposition of cultural homogeneity and the maintenance of hegemony.

The growth of the concept of “whiteness” in opposition to an ostensibly antithetical “blackness” during the Gilded Age provides one clear example of a case in which popular culture served this function. Richard L. Hughes, a historian whose work has focused on the history of American culture and society, has pointed out how the creation and dissemination of stereotypes about blacks created a sense of unity among the white audiences who viewed minstrel shows. According to Hughes, portrayals of blacks in popular culture “contributed to the growing sense of ‘whiteness’ among an ethnically diverse population in the urban North and … to a sense of a unique, albeit problematic, American national identity.”5 Blacks were often portrayed in minstrel shows and other venues of popular culture in ways that were comically over-the-top: black characters were often bumbling, hopelessly ignorant, and obsessed with sex. An audience at a minstrel show might consist of individuals who were immigrants or the children of immigrants from such diverse nations as Italy, Poland, and England, nations with different languages, religious beliefs, and cultural traditions. The black man, however, as he was caricatured at the minstrel show, presented so obvious a contrast with anything that any of them would consider normal or acceptable that he created a sense of unity among those of European descent. Thus, the concept of “whiteness” came to encompass a broad swathe of people with little else in common than ancestors who had come from the same continent and who now defined themselves in opposition to the similarly fabricated concept of “blackness.”

One ironic feature of popular culture, in light of its functions and effects as a vehicle for white solidarity and black marginalization, is that many of its elements derived from earlier expressions of black folk culture. Ragtime, for example, a form of music and dance that was particularly popular among young people during the Gilded Age, was derived from black folk music and dance. In other words, the origins of ragtime lay in what Dwight Macdonald identified as genuine folk art; it was the product of people whose social status isolated them from the cultivated aesthetics of High Culture but who simultaneously felt the need for artistic expression. This authentic folk culture, however, was transformed into popular culture through its appropriation and adaptation by whites. Ragtime’s origins in black culture served both to attract the attention and cultivate the awe of white youths and to excite the repugnance of members of older generations. Ragtime was seen as shocking, immoral, and even dangerous.6 The lyrics of ragtime songs, as its detractors never tired of pointing out, included such themes as “‘hot town,’ ‘warm babies,’ and ‘blear-eyed coons’ armed with ‘blood-letting razors’” as well as other topics similarly offensive to bourgeois tastes.7 In addition, the dances associated with these songs often involved jerking movements of the hips and close contact between dance partners of opposite genders, which appeared lascivious and immoral in contrast with the more tame and subdued dances common among previous generations of the American bourgeoisie. All of these elements, as well as their origins in African and African-American culture, were viewed, according to Ellen M. Litwicki, a professor of American history, as a potential source of “moral depravity” for white youth who partook of popular culture.8 This identification of black culture with immorality was also used as a means by which to reinforce stereotypes of blacks and propagate racism, reinforcing the established atmosphere of subjugation and marginalization.

African-Americans reacted in varied ways to the appropriation and transformation of black folk culture by white capitalists whose product was targeted primarily to audiences of white youth. Some African-Americans sought to work within the new milieu afforded to them by popular culture in order to secure a modicum of social respectability and a means of acquiring wealth that had not formerly been available to them. Ernest Hogan, for instance, an African-American man who was one of the founding figures of ragtime, built his career on writing songs that portrayed stereotypes of blacks. One of his most popular songs declared in its title that “All Coons Look Alike to Me.”9 Shortly before his death in 1909, Hogan expressed some ambivalence about his role in creating ragtime and about that song in particular. “With nothing but time on my hands now, I often wonder if I was right or wrong,” he told a friend.10 He concluded that in spite of the negative stereotypes such songs helped to propagate, the popularization of black folk culture in which he played such an important role was, in the end, a great boon to the culture itself, which “would have been lost to the world” had it not been popularized, as well as to the many black songwriters whose careers he made possible.11

Other African-Americans, however, particularly those of the middle class, viewed ragtime, along with minstrelsy and vaudeville, in overwhelmingly negative terms. According to historian Matthew Mooney in his survey of responses to American popular music in the first quarter of the twentieth century, “popular music in all its permutations was often subject to sweeping condemnations by … arbiters of Black middle-class propriety.”12 Black members of the bourgeoisie saw popular culture as a vehicle for “demeaning racial stereotypes” which served to undermine the progress that African-Americans had made since the Civil War and emancipation.13 In response to the new popular culture, the African-American bourgeoisie sought to displace blame for the creation and popularization of such musical forms as ragtime from blacks alone to the uncultured in general, black and white alike.14 They also sought to cultivate an appreciation for, and African-American participation in, venues of High Culture, such as more respectable forms of music and performance like opera. In large part, the vociferous opposition to popular culture espoused by many in the black bourgeoisie arose from a desire to minimize differences between themselves and whites by distancing themselves from supposedly low-class blacks and from traditional black culture. In so doing, they hoped to attain the measure of social respectability that might result from identification with the values and mores of the white bourgeoisie and thereby uplift the black race in general. A noteworthy similarity between those members of the African-American bourgeoisie who opposed popular culture and those African-Americans, such as Ernest Hogan, who actively participated in it is that each attributed its stance on the issue to the desire of blacks to enter the American mainstream by attaining prestige and wealth. In spite of the divergence in approaches, the motivation was essentially identical for both parties.

Such prestige and wealth were also the motivation for those Native Americans who chose to participate in popular culture venues which presented the stereotype of the Indian as a warlike savage. Included among the Native Americans who participated in Wild West shows, for example, are such prominent figures as Sitting Bull and Black Elk.15 According to Litwicki, the stereotyped roles in which Native Americans were depicted in the Wild West shows, “while degrading in many respects, were never as completely negative as those African Americans had to work within.”16 Indeed, unlike their black counterparts in minstrelsy and vaudeville, who were forced to behave in ways that were entirely the product of white imaginations and which distorted the nature of black culture to an extreme degree, Native Americans were often able, and delighted in the opportunity, to share authentic representations of their heritage and lifestyle with white audiences, including their prowess as “warriors, riders, marksmen, and hunters” as well as traditional “dances, songs, and other aspects of their cultures.”17 Nonetheless, Native Americans were subject to the same disfiguring white consciousness as African-Americans and were expected to behave in stereotyped ways. Through their representations in popular culture, both Native Americans and African-Americans were dehumanized, stripped of individuality and personality, and replaced with caricatures that met white expectations, reinforced white superiority, and justified the continued marginalization of these groups through their exclusion from bourgeois respectability. This subjugation and marginalization frequently determined the course of government policy. The Wild West shows’ depictions of Native Americans as savages and of their culture as backwards and primitive, for example, justified the federal government’s continued attempts to eradicate their traditional ways of life, cultural traditions, and tribal units by removing tribes from their ancestral homelands and children from their families, forcing young Native Americans to receive propagandistic education in which they were encouraged to act in accordance with white social expectations, and encouraging Native Americans to adopt the agricultural lifestyle of rural white farmers.18

Similarly, the stereotyped depictions of blacks in popular culture as comically ignorant, ugly, immoral, and sexually promiscuous, and the idea these depictions created and perpetuated of a “blackness” which differed ontologically from and stood existentially opposed to “whiteness,” justified the exclusion of African-Americans from the white mainstream of American society as well as the separation of blacks from whites more generally. This exclusion and separation were made law with the Supreme Court decision in Plessy v. Ferguson (1896), which gave federal sanction to segregation as a constitutional practice.19

The origins, content, and effects of popular culture in the Gilded Age present an important comparison with those of more recent American popular culture. Hip hop music, for instance, presents an insightful parallel to the story of ragtime. Just as ragtime emerged from black folk art, hip hop began as a genuine folk cultural form among African-American youth in impoverished urban centers. Just as ragtime was adopted, digested, and popularized by the incipient popular culture industry of the late nineteenth century, hip hop similarly became a product of popular culture at the hands of bourgeois, and generally white, capitalists. Both were viewed as repellent by parents and others of older generations because of their perceived immoral content and their link with the criminality associated with black culture, both were consumed by eager white youths, and both served to bring a measure of fame, wealth, and even respectability to certain African-American individuals involved in their production while simultaneously reinforcing stereotypes of African-Americans more generally. In addition to this clear parallel between ragtime and hip hop, depictions of other marginal groups in contemporary popular culture also present interesting and insightful comparisons. Just as depictions of Native Americans in popular culture served to justify their exclusion from the mainstream of American society and the systematic destruction of their traditional way of life at the hands of the federal government, depictions of Hispanics in contemporary popular culture often reinforce stereotypes of Hispanics as ignorant, religious to the point of superstition, linked to the criminal drug trade, and, in the case of women, extremely sexualized. These depictions, in turn, influence laws and policies pertinent to, for example, immigration and education.

The impact that stereotyped depictions can have on laws, on lives, and on the individual psyches of members of marginalized and subjugated groups as well as on those of their hegemons should be carefully considered by the producers, distributors, and consumers of popular culture. The nearly ubiquitous presence of popular culture today makes a thorough examination of the influence of its content all the more important. Such an examination is most properly conducted in the light of the insights that can be afforded by an understanding of the origins of American popular culture in the Gilded Age and its perpetual use since that time as a tool for the creation of a false cultural homogeneity and the imposition of a cultural hegemony which is far more the product of the imaginations and aspirations of the moneyed classes and establishment power structure than an authentic democratic movement in aesthetics.

Notes


1 Dwight Macdonald, “A Theory of Mass Culture,” in John Storey, ed., Cultural Theory and Popular Culture: A Reader (Harlow: Prentice Hall, 1998), 23.

2 Ibid.

3 Ibid.

4 Ibid.

5 Richard L. Hughes, “Minstrel Music: The Sounds and Images of Race in Antebellum America,” The History Teacher 40:1 (Nov. 2006): 29.

6 Rebecca Edwards, New Spirits: Americans in the “Gilded Age,” 1865-1905 (New York and Oxford: Oxford University Press, 2011), 118.

7 “Musical Impurity,” Etude (January 1900): 16.

8 Ellen M. Litwicki, “The Influence of Commerce, Technology, and Race on Popular Culture in the Gilded Age,” in Charles W. Calhoun, ed., The Gilded Age: Perspectives on the Origins of Modern America (Lanham: Rowman and Littlefield Publishers, Inc., 2007), 194.

9 Ibid., 196.

10 Karen Sotiropoulos, Staging Race: Black Performers in Turn of the Century America (Cambridge: Harvard University Press, 2008), 118.

11 Ibid., 120.

12 Matthew Mooney, “An ‘Invasion of Vulgarity’: American Popular Music and Modernity in Print Media Discourse, 1900-1925,” in Leslie Wilson, ed., Americana: Readings in Popular Culture (Hollywood and Los Angeles: Press Americana, 2010), 7.

13 Ibid.

14 Ibid., 8.

15 Litwicki, 202.

16 Ibid.

17 Ibid.

18 Edmund J. Danziger Jr., “Native American Resistance and Accommodation during the Late Nineteenth Century,” in Calhoun, Gilded Age, 180.

19 Leslie H. Fishel Jr., “The African-American Experience,” in Calhoun, Gilded Age, 157.

Reconstruction Under Lincoln

The greatest mark of the Reconstruction Era is perhaps its failure to effectively unite and rebuild the United States after the Civil War. If Abraham Lincoln had lived to serve out his second term as president, Reconstruction would have been smoother in its goal of reintegrating the South into the Union but would have been the same as that under Andrew Johnson in its failure to fully account for, reckon with, and make amends for the evils of the past. In this failure, it would have created a situation similar to the one that did occur, in which oppression and disenfranchisement followed slavery and in which the real work of achieving equality and justice for all was slowed and delayed until a much later date.

As historian Eric Foner points out, “Lincoln did not … believe that Reconstruction entailed social and political changes beyond the abolition of slavery.”1 In this belief, Lincoln either failed miserably to understand human nature and societies or ignored reality in favor of his own hopes and ideals. Whatever the reason for his belief, such a course of action would have been a recipe for disaster. To simply end the war and end slavery without simultaneously working to eliminate the root and underlying causes that had allowed a clearly unjust institution like slavery to flourish in the American South in the first place, without attempting to redress the injustice by providing some form of monetary compensation and/or education as well as full citizenship rights to those who had suffered it, and without instituting the proper laws and organizations for preventing future injustice is a remarkably great oversight on the part of someone remembered for his wisdom and thoughtfulness.

Lincoln had begun his first term as president expressing a desire to maintain the Union in peace at nearly any cost. His approach throughout the Civil War had indicated “a desire to achieve peace as expeditiously as possible.”2 Similarly, his approach to Reconstruction was largely one without any “fixed plan” aside from reattaching the South to the United States as quickly and easily as possible. For the most part, this did not mean fighting to procure social justice for former slaves nor, for that matter, any significant change in Southern culture, in which a deeply entrenched and violently hateful racism inhered.

This unwillingness by Lincoln to “rock the boat” is reflected in his views concerning black voting rights. In modern liberal democracies and republics like the United States, full citizenship is reflected in one’s right to participate in one’s government by voting and by running for political office. If one cannot participate in government, one is not, in any meaningful sense, a full citizen of a democracy. Lincoln’s rejection, then, of full political enfranchisement for freed slaves was a rejection of their full citizenship and, by implication, of their full personhood.3

Although Lincoln is often hailed as a hero for having ended slavery in the United States, and although this heroic image and reputation leads many to believe the post-war years under him would have seen greater achievements and improvements, the truth seems rather to be that Reconstruction would not have taken place much differently under Lincoln than under Johnson. Lincoln’s policies before and during the Civil War reflect first and foremost a desire to restore the Union. No doubt his post-war policies would have reflected the same desire. Reconstruction under Lincoln, then, might have seen a smoother transition of the South into the Union than occurred under Johnson but would have seen a similar, if not greater, willful neglect of justice for former slaves.

1 Eric Foner, Reconstruction: America’s Unfinished Revolution, 1863-1877 (New York: HarperCollins, 2002), 36.

2 Ibid., 73-4.

3 Ibid., 74.

Was the American Civil War a Just War?

Introduction 

The American Civil War was a defining moment not only in the history of the United States but in the history of the world. As Walt Whitman, an eyewitness of the Civil War, poignantly wrote in his book of poetry Leaves of Grass, it was on the United States that the “Earth’s résumé entire floats” and “the antecedent nations sink or swim with thee.”1 In other words, the United States acted, and arguably still acts, as the heir and representative of the entirety of the tradition of Western civilization. In the insistence of the founders of the United States that the underlying, central, and governing principles of the new nation were to be that “all men are created equal, that they are endowed by their Creator with certain unalienable Rights,” and that governments “deriv[e] their just powers from the consent of the governed,” the formation of the United States became a culminating moment in the history of Western thought.2 Principles that were primary in and essentially unique to Western culture, such as the equality of all men before God and the law, the belief that all human beings are entitled to certain rights by virtue of being members of the human race, and the belief that a government must have the consent of the governed, were identified as the principles upon which the United States would stand.

The Civil War, then, represents a summarizing event in Western civilization; it stands in line with the Peloponnesian War, the triumph of Christianity in Late Antiquity, the split between Eastern and Western Christendom in 1054, and the Protestant Reformation as one of the greatest schisms in Western civilization. Both sides of the Civil War, the federal government and the incipient Confederate States of America, represent this common heritage in all its contradiction and complexity. Part of this common heritage is the Just War theory developed by Greco-Roman thinkers like Aristotle and Cicero, which culminated in the thought of medieval and early modern Christian thinkers such as St. Augustine of Hippo, St. Thomas Aquinas, and Hugo Grotius. Ironically, although both belligerents represented this common heritage, fought for different aspects of Western civilization, and saw themselves as fighting for a just cause, neither participant in the American Civil War can be said to have fought a just war, as both failed to meet the criteria of Just War theory.

Jus Ad Bellum

When considering whether a war met the criteria of Just War theory, the first consideration that must be made is whether the reasons for going to war in the first place were just. In Latin, this stage of consideration is referred to as “Jus Ad Bellum,” meaning “the right to war.” Traditionally, four criteria have been identified by Just War theorists as creating a situation in which a power has the right to go to war, namely, (1) just authority, (2) just cause, (3) just intention, and (4) last resort.3

1. Just Authority

The first criterion, just authority, requires that the powers initiating and engaging in hostilities possess the legitimate authority to do so. Thomas Aquinas summarizes this point in his Summa Theologica in his claim that “in order for a war to be just” there must be a “sovereign” with valid authority “by whose command the war is to be waged” because “it is not the business of a private person to declare war” nor “the business of a private person to summon together the people, which has to be done in wartime.”4 While it is apparent that the federal government of the United States meets this criterion, the government of the Confederacy does not appear to do so.5 Theoretically, it could be argued that the central government of the Confederacy derived its authority from the states which chose to enter into it and which were undoubtedly legitimate governing authorities, which in turn lends legitimacy to the government of the Confederacy as a kind of conglomerate government of these states. The Constitution of the United States of America, however, of which all of the constituent states of the Confederacy were signers, specifically grants the right “to raise and support Armies” only to the federal government.6 Furthermore, the Constitution also did not provide for the means nor even seem to envision the possibility of any state or group of states to decide to leave the Union, a fact which Abraham Lincoln himself pointed out in his First Inaugural Address, delivered on 4 March 1861:

It is safe to assert that no government proper ever had a provision in its organic law for its own termination. Continue to execute all the express provisions of our National Constitution, and the Union will endure forever, it being impossible to destroy it except by some action not provided for in the instrument itself.7

In addition, as Charles Guthrie and Michael Quinlan point out in their treatment of Just War theory in the modern world, “historically,” the criterion of just or competent authority “has usually meant the ruler or government of a sovereign state, as opposed to an internal warlord or faction.”8 In other words, traditional Just War theory does not seem to countenance a civil war, no matter for how ostensibly just a cause. The Confederacy, then, fails to meet the criterion of just authority.

2. Just Cause

A just cause for war is perhaps the most central and important of the criteria of Jus Ad Bellum. Even those who are entirely unversed in the niceties of Just War theory and international law generally demand that there be a just cause for the initiation of military action by one nation upon another. To determine if either or both sides of the Civil War possessed a just cause for war, the reasons for the conflict as viewed and enunciated by each side must be examined; although there are a variety of causes which led to the Civil War, there are two overarching reasons behind all of the causes: (1) a dispute over the role of the federal government in relation to the rights of the states to govern themselves and (2) slavery, arguably the deepest of all underlying issues and causes of the war.

From a Southern perspective, the ultimate cause of the Civil War was the infringement on the rights of the states by the federal government. As the website of the Civil War Trust, a non-profit organization dedicated to the preservation of historical sites related to the Civil War, succinctly states, “Southerners were sure that the North meant to take away their right to govern themselves, abolish slavery, and destroy the Southern economy.”9 From this perspective, it is possible to see the Civil War as a struggle by the Confederacy against the tyranny of the United States government, which would seem to indicate a just cause. If the implications of and reasons for the cry of “states’ rights” on the part of Southerners are examined more deeply, however, the uncovered roots overturn such a conclusion.

Ultimately, for Southerners, the right of the states that was being demanded was the right to determine the legality of slavery. According to Alexander Hamilton Stephens, the vice-president of the Confederacy,

our new government is founded upon exactly the opposite idea [from abolition]; its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man, that slavery — subordination to the superior race — is his natural and normal condition.10

While Union leaders, on the other hand, identified the war primarily as “a struggle to preserve the Union” early in the conflict, they quickly realized that the only way to preserve the Union was to agree with the Confederate leadership that the war was primarily about slavery and to adopt the equal and opposite position of those like Stephens, seeking instead to “reconstruct the Union into the nation it should have been without slavery.”11 In its very essence, then, at its deepest roots, the Civil War was a war about slavery.

Although slavery has been practiced throughout most of the history of the world, including those segments of the world and its history that make up Western civilization, slavery has also received an unequivocal condemnation by this tradition. As historian Thomas Cahill notes, “in the prescriptions of Jewish law we cannot but note a presumption that all people, even slaves, are human and that all human lives are sacred.”12 From these ancient Jewish roots, Christianity derived its “claim that all were equal before God and all equally precious to him,” a claim which “ran through class-conscious, minority-despising, weakness-ridiculing Greco-Roman society like a charged current” and overturned the previous ideological foundations upon which Western society had based its belief in the legitimacy of the practice of slavery.13 14 As a result of this claim, early Christian thinkers and leaders like late fourth century bishop St. Gregory of Nyssa became among the first writers in the world to adopt a truly abolitionist position towards slavery and to oppose the practice on principle.15 As Cahill has pointed out, it is only within the context of this strain of thought that a claim like that of the American Declaration of Independence that it is “self-evident that all men are created equal” can make any sense at all or, for that matter, “could ever have been” made in the first place.16 If such a claim is accepted as true, whether self-evidently or not, slavery must, by implication, be viewed as immoral per se.

If the ultimate and underlying cause of the Civil War for both sides thereof is indeed slavery, it is this issue which must determine which side, if either, had a just cause for the initiation of hostilities. According to St. Augustine of Hippo, as quoted by Thomas Aquinas in his discussion of Just War, “a just war is … one that avenges wrongs, when a nation or state has to be punished, for refusing to make amends for the wrongs inflicted by its subjects, or to restore what it has seized unjustly.”17 Given that slavery is a “wrong” in need of “punishment” and that the Confederacy had “unjustly” seceded from the Union for this cause, the federal government here again seems to meet the criteria of Jus Ad Bellum whereas the Confederacy fails to do so.

3. Just Intention

The third criterion of Jus Ad Bellum, just intention, requires that the belligerents involved in a war have the correct intentions in commencing hostilities. In the succinct phrasing of Aquinas, the criterion of just intention is the criterion that belligerent powers “intend the advancement of good, or the avoidance of evil.”18 They cannot intend to inflict hateful or undue punishments upon their opponents but only to redress the wrongs for which they are going to war. In this regard, again, the Union seems to have the advantage over the Confederacy in meeting the criteria of Just War theory.

Although, as will be discussed in the section on Jus in Bello, the Union often failed to live up to its intentions, it is clear from both his words and his actions that President Abraham Lincoln, as the leader of the Union, desired “to achieve peace as expeditiously as possible.”19 His goal from the beginning of the war and throughout its duration was to end the conflict and reintegrate the South back into the Union as quickly and easily as possible. To this end, he opposed those members of his own political party who called for more radical measures in punishing the South’s political and military leadership as well as its economic aristocracy at the end of the war. Although he insisted upon the emancipation of blacks and the abolition of slavery throughout the United States, he was, not to his credit, even willing to compromise on the enfranchisement of former slaves and other blacks as full citizens with voting rights in order to satisfy the prejudices and alleviate the fears of Southern whites, stating in his final speech before his assassination that he desired that, among blacks, only “the very intelligent” and Union veterans of the Civil War be granted the right to vote.20

In contrast to these rather amicable intentions on the part of the highest leadership in the federal government stands the rancor that dominated the intentions of the highest leadership in the Confederate government. In his Normans and Saxons, an intellectual history of the idea of race in its relation to the Civil War, Ritchie Devon Watson, Jr., demonstrates that the rhetoric of white Southerners against blacks, Northern whites, and other target groups exceeded mere polemic and entered the realm of vitriolic demonization.21 One example of the existence and nature of such hatred even among the highest ranks in the Confederacy may be found in the apparent approval of Jefferson Davis, the president of the Confederacy, for the assassination of Lincoln.22 In this point of Just War theory as in those previously considered, the Union once again meets this criterion whereas the Confederacy fails to measure up.

4. Last Resort

The final essential ingredient of Jus Ad Bellum, according to classical formulations of Just War theory, is that the resort to armed conflict be a last resort. Even if just authority, just cause, and just intention all exist, warfare must itself be the final and even unavoidable course of action in order for engagement in warfare to be deemed just. Augustine goes as far as saying that in order for a war to be just, the nation which engages in it and its leader must be compelled by force of necessity to enter into warfare, claiming that “it is the wrongdoing of the opposing party which compels the wise man to wage just wars.”23 If either side in the Civil War can be said to have been compelled to enter the war by force of necessity, it must be the Union.

While there are many events which contributed to the eventual outbreak of open conflict between North and South, the 1860 election of Abraham Lincoln to the presidency was undoubtedly the spark that ignited the flame. The crisis created by the Kansas-Nebraska Bill of 1854, in which antislavery and proslavery factions vied to populate the territories with their own members and, by extension, to depopulate them of members of the other faction in order to ensure that the new territories entered the Union as free or slave states, respectively; the 1859 attack of John Brown and his men upon the federal arsenal at Harpers Ferry, Virginia, by which he hoped to spark a war over slavery; and other similar events created a tension which hung thick in the air in 1860. The election of Lincoln, who had run for the Senate from Illinois only two years earlier on “a strong anti-slavery ticket,” as it has been described, was the final straw as far as Southerners were concerned.24

Although he did not receive a majority of the vote, Lincoln did receive a strong plurality among the four candidates for the presidency. Whereas his Democratic opponent, Stephen Douglas, carried 29.5% of the vote, Lincoln took 39.9%, more than enough to represent a decisive victory.25 In the words of historian William E. Gienapp, “the northern majority possessed the power to which it was entitled. Yet southerners refused to accept the popular verdict.”26 According to historian William C. Harris, who, in turn, relies upon the account of historian John William Draper, Jefferson Davis himself once plainly informed two Northerners who inquired of him the reasons for secession during the Civil War, “we seceded to rid ourselves of the rule of the majority.”27 In short, in the words of Harris, “Southern failure to abide by majority rule was at the center of the secession crisis.”28 29

Lincoln, on the other hand, tried to prevent Southern secession and the outbreak of war. Although he was portrayed by those who wanted to stoke Southern fears as a “black Republican” and an “abolitionist,” and although he had voiced opposition to slavery in the past, Lincoln continually reassured those who would listen to him that he was no radical and did not plan to drastically overturn the state of things in the United States.30 His priorities, as he himself said, were to maintain the Union, to enforce its laws as they stood, and to seek peaceful resolutions to the conflicts and complexities that plagued it. The South, however, hardly gave him the opportunity even to begin taking action. Only “one month after Lincoln was elected president, the state of South Carolina announced its secession from the Union” and “within a few weeks, Mississippi, Florida, Alabama, Georgia, Louisiana, and Texas followed suit.”31 The Confederates were also the first to engage in violence against the other side, firing the opening shots of the Civil War at Fort Sumter, South Carolina, on 12 April 1861. Even in his Second Inaugural Address, delivered on 4 March 1865 as the war was drawing to a close, Lincoln expressed a belief, perhaps solidified over the course of a war he had first sought to prevent and then tried, without success, to shorten and to soften, that the United States had been inexorably drawn into the war by divine mandate:

If we shall suppose that American slavery is one of those offenses which, in the providence of God, must needs come, but which, having continued through His appointed time, He now wills to remove, and that He gives to both North and South this terrible war as the woe due to those by whom the offense came.32

In its rush to and insistence upon secession, the Confederacy yet again failed to meet the standard set by Just War theory. The Union, on the other hand, especially in its leader’s willingness to continue to attempt to negotiate through the differences of ideology and practice that separated the two major regions of the nation and in his stated commitment to place the peace and preservation of the Union foremost in his desires, successfully satisfied the criterion of last resort. There can be little doubt that in meeting the requirement of Augustine that a just war be a war in which a national power is compelled to participate by force of necessity the Confederacy fell far short and the Union succeeded.

5. Conclusion

In final consideration of the four criteria of Jus Ad Bellum, the Union is shown to have had the “right to war” in the Civil War whereas the Confederacy did not. Whereas the federal government was a legitimate and sovereign governing authority, the Confederacy, as a rebellious group rising against its legitimate government, failed to meet the criterion of just authority. The federal government also satisfied the criterion of just cause in its desire both to preserve its sovereign territory and to end the gravely unjust practice of slavery within its borders, whereas the Confederacy’s quest to uphold the institution of slavery, unjust per se, was clearly an unjust cause for war. The Union’s intention of repatriating the Southern states quickly and peacefully likewise satisfied the criterion of just intention, whereas the vitriolic hatred exhibited by all ranks of Confederate leadership for blacks, Northern whites, and anyone else opposed to its cause runs obviously contrary to that criterion. Finally, the South’s overeager rush to war presents a stark contrast with the nearly desperate pleas of the leadership of the federal government for a peaceful resolution to the internal dissensions of the United States, demonstrating that only the federal government meets the criterion of last resort. In short, the Union adequately satisfied the criteria of Jus Ad Bellum, whereas the Confederacy did not.

Jus In Bello

The next series of points which must be considered in a discussion of whether a specific war can be considered just in accordance with traditional formulations of Just War theory is that set of criteria which fall under the category “Jus In Bello,” a Latin phrase meaning “justice in war.”33 As the name of this set of criteria indicates, Jus In Bello involves the consideration of whether the actual conduct of a particular belligerent in a war was just. The three criteria of Jus In Bello are (1) proportionality, (2) discrimination, and (3) responsibility. Whereas the federal government adequately satisfied all of the criteria for Jus Ad Bellum, both the Confederacy and the Union failed to satisfy any of the three criteria of Jus In Bello. The actions of the Union army which entered into and crossed through Georgia under General William Tecumseh Sherman perhaps best demonstrate the failures of both sides in the Civil War to conduct a just war. Sherman’s infamous March to the Sea, which has been remembered by subsequent generations largely for its brutality, serves as a particularly outstanding case study in the failure of both powers in the Civil War to practice just conduct within warfare.

1. Proportionality

The first criterion of Jus In Bello is proportionality; proportionality requires that the methods and amount of force used during warfare be proportionate to their desired effect. In other words, given that a belligerent power has just cause and just intention, said belligerent power may only use the minimum amount of force necessary to achieve its intention and satisfy its cause. The actions of the Union army under General Sherman, in flagrant defiance of this criterion, exemplify disproportionality in wartime conduct.

Even before their March to the Sea, more properly referred to as the Savannah Campaign, the Union troops led by Sherman proved their preference for cruelty and their penchant for disproportionality. The burning of Atlanta, Georgia, is one example. On 14 November 1864, just over two months after his army had captured the city, Sherman ordered the entire destruction of the city of Atlanta. According to historian Russell S. Bonds, approximately 4,000 homes and businesses were burned to the ground; only about 400 buildings, roughly a tenth of the city, remained standing.34 In a description reminiscent of the common, even if probably false, depiction of the burning of Rome, during which the Emperor Nero, ostensibly the perpetrator of the crime, arrayed himself in a stage costume and sang a song, Union officer Captain Daniel Oakey reported that, while Atlanta burned, the Second Massachusetts’s “post band and that of the Thirty-third Massachusetts played martial airs and operatic selections.”35 36

Whatever the accuracy or lack thereof in this grotesque picture, there can be little doubt that the burning of Atlanta was an act of gross disproportionality in the conduct of warfare. The burning of Atlanta, however, was only the beginning. The March to the Sea that commenced with the burning of Atlanta continued for more than a month, with the federal troops under Sherman “creating a charred avenue over 40 miles wide through the unprotected State [of Georgia], destroying the railroads, seizing all provisions, pillaging, plundering and burning.”37 Sherman’s actions were drastically disproportionate to the cause and intentions of the federal government; the Union and its leaders, then, especially Sherman, failed to succeed in meeting the criterion of proportionality.

2. Discrimination

The second criterion of Jus In Bello is discrimination, which refers to the responsibility of a belligerent power to discriminate between military and civilian targets and to strike only the former while avoiding, as much as possible, any damage to the latter. Sherman’s burning of Atlanta and the entirety of his Savannah Campaign once again demonstrate the failure of the federal forces engaged in the Civil War to conduct themselves justly on this point. Not only did Sherman fail to distinguish between military and civilian targets, he actively ordered and encouraged his troops to raid and attack civilian targets.

While his troops were in the Carolinas, for instance, after leaving Georgia, Sherman sent out foraging parties, which became known as “Sherman’s bummers” and a well-known and much-despised presence among the civilian population for their behavior.38 These “bummers” became notorious among the civilian populations of the Carolinas for their lewd and disrespectful demeanor and for “pillaging and burning” food and other necessary supplies that were often extremely scarce in the South during the war.39

When his “bummers” began to be found murdered wearing signs indicating “death to all foragers,” Sherman offered pale and unacceptable excuses for their behavior. He wrote to one of the generals under him, for instance, that “I contend if the enemy fails to defend his country we may rightfully appropriate what we want.”40 41 He added the further justification that he believed his troops had the right to “destroy cotton and tobacco,” in spite of the fact that these crops were grown by civilians on privately-owned property and often represented the livelihood of those who grew them, “because these things are assumed by the rebel Government to belong to it, and are used as a valuable source of revenue.”42 For Sherman, nearly every Southerner was in some sense an enemy, complicit in the Confederate rebellion against the federal government and liable to punishment for his or her complicity. Every target, then, was, in some sense, a civilian target.

Using a similar line of reasoning, Sherman justified his burning of Atlanta by claiming that the city had been, and could again be after the departure of his troops to continue their march, put to military use.43 This is hardly a valid reason, however, to destroy nearly an entire city, including thousands of private homes and businesses. Years after the Civil War, Sherman would, perhaps in an attempt, whether conscious or not, to justify his actions during the war, tell a crowd of listeners, “there is many a boy here today who looks on war as all glory. But boys it is all hell.”44 45 In the end, it is abundantly clear that Sherman and the Union forces of which he was a leader refused to distinguish between combatants and noncombatants; as a result, they failed to meet the criterion of discrimination.

3. Responsibility

According to Jon Dorbolo, the third and final criterion of Jus In Bello, responsibility, itself divides into three parts.46 According to this criterion, a belligerent power is not responsible for the negative consequences of the war and therefore not itself unjust in spite of the injustice which inevitably accompanies armed conflict if (a) the particular course of action which caused the negative consequences was intended for good, (b) the particular course of action which caused the negative consequences was not intended for bad, and (c) the overall good outweighs the bad.

It could be argued that even Sherman’s March to the Sea, in spite of all its apparent brutality, does in fact fit the criterion of responsibility and therefore qualifies as Jus In Bello. It was, after all, so it could be argued, only what was necessary to end the war as quickly as possible. By demoralizing Southerners and destroying their means of subsistence in addition to their military supplies, Sherman stripped them of their will to war and so brought about the end of the war. If this is true, it can be seen that Sherman’s actions were intended for good, were not intended for bad, and, given that he accomplished his goal of bringing about the end of the war, this good outweighs all of the bad he did in order to achieve it. Even Sherman himself, after all, once said, only a few months after his brutal Atlanta and Savannah campaigns, that “the legitimate object of war is a more perfect peace.”47

Such a line of reasoning, however, does not stand up to the light of scrutiny and thorough, thoughtful consideration. In the end, this line of reasoning amounts to little more than a Machiavellian assertion that the ends justify the means. If Sherman’s March to the Sea is allowed as somehow “just” simply because it contributed to the eventual Confederate surrender and Union victory in the Civil War, nearly any conduct within warfare can be twisted to fit the definition of Jus In Bello. While it can be admitted that Sherman’s actions contributed substantially to the fall of the Confederacy and the triumph of the Union, this admission can in no way be used to justify the actions as having been just per se.

4. Conclusion

The only sound conclusion that can be reached in regards to Jus In Bello and the Civil War is that neither belligerent power met any of the criteria. Both sides in the Civil War failed to practice proportionality and discrimination. As a result, both sides bear the full burden of responsibility for the negative consequences of their actions.

Jus Post Bellum

Although not included in the classical treatments of Just War theory, the concept of Jus Post Bellum, or “justice after war,” has become a standard aspect of formulations of Just War theory in the modern world and seems a fitting conclusion to any discussion of Just War theory.48 Brian Orend, one of the first of the modern Just War theorists to discuss the concept of Jus Post Bellum, outlined two criteria in particular for Jus Post Bellum: (1) compensation and (2) rehabilitation. Drawing upon earlier and generally accepted formulations of Just War theory, Orend posits that, in short, the victor in a war must not exact undue punishment from the losing power but should instead assist in its attempts to rebuild and rehabilitate.

While the era of Reconstruction which followed the Civil War had both its accomplishments and its failures, a fair assessment would conclude that Reconstruction largely met the criteria of Jus Post Bellum as outlined by Orend. The Union succeeded in reintegrating the South into the United States in a relatively expeditious manner. Efforts were made to rebuild the South, and what few punishments were exacted upon the former Confederacy and its leaders, such as the disenfranchisement of many Southerners and the imprisonment of leaders like Jefferson Davis, were generally, for better or worse, short-lived. The failure that lingers over Reconstruction is, ultimately, its inability to integrate the newly freed slaves and other blacks throughout the United States while simultaneously reintegrating the whites of the South into the fabric of American life and politics; these two goals appear to have been mutually exclusive in practice. As a result, the unequivocal recognition of full citizenship for black Americans was delayed for nearly 100 years, and a long era of segregation, lynching, second-class citizenship, distrust, and hatred settled into Southern life and into American life as a whole. In consideration of this, it could be said that the United States also failed to accomplish Jus Post Bellum in that it did not fully satisfy the criterion of rehabilitation, or at least took an inordinately long time to do so.

Conclusion

The American Civil War, as the outbreak of armed conflict over a rift that had existed in the fabric of Western civilization nearly since its infancy, embodied a long-standing tension in Western thought and finally determined the course that Western civilization would take on the questions of slavery, liberty, equality, and democracy. Although both sides in the Civil War were truly representative of the heritage of the Western tradition, neither belligerent satisfied all of the criteria of Just War theory, a central aspect of Western thought on warfare and international relations.

While the Union met the criteria of Jus Ad Bellum, having both just reason and the ability to engage in warfare, it failed to maintain justice throughout the war and so did not satisfy the criteria of Jus In Bello. In addition, although it could be argued that the efforts of the federal government to reintegrate white Southerners into the mainstream of the United States indicate that the Union satisfied the criteria of Jus Post Bellum, it should also be pointed out that in allowing the reentry of Southern whites into American life, a very large number of human beings, namely freed slaves and other blacks, were excluded from meaningful participation in that life and denied justice. Injustice was also allowed to continue in the South, in spite of the end of slavery, in the form of segregation and oppression targeting blacks and other ethnic and religious minorities. The other belligerent power in the war, the Confederacy, failed to satisfy any of the criteria of Just War theory. In the final analysis, then, although the Civil War accomplished the good of finally ending slavery in the United States, a power representative of and at the helm of Western civilization, it must be concluded that the American Civil War was not a just war.

Notes

1 Walt Whitman, “Thou Mother With Thy Equal Brood,” 4, Leaves of Grass (New York: The Modern Library, 2001), 564.

2 Declaration of Independence, http://www.ushistory.org/declaration/document/index.htm (accessed 23 December 2012).

3 Jon Dorbolo, “Just War Theory,” Oregon State University (2010) http://oregonstate.edu/instruct/phl201/modules/just_war_theory/criteria_intro.html (accessed 23 December 2012).

4 Saint Thomas Aquinas, Summa Theologica, Part II, Section II, Q. 40. Art. 2., ed. Robert Maynard Hutchins, Great Books of the Western World, Vol. 20 (Chicago: William Benton, 1952), 578.

5 Although it could be and has been argued that the incipient American government failed to meet this criterion in the Revolutionary War, the consequences of such a determination for the Civil War are ambiguous. There is the potential for using the assumption of the inherent righteousness of the American cause in the Revolution coupled with the lack of support for any revolution at all in Just War theory as an argumentum ad absurdum against Just War theory. Free of the assumption of the justness of the American cause against the British monarchy, however, the case could also be made that the American Revolution was in fact unjust. One example of a paper which argues that the American Revolutionary War was an unjust war is John Keown, “America’s War for Independence: Just or Unjust?,” Kennedy Institute of Ethics, Georgetown University, http://kennedyinstitute.georgetown.edu/files/KeownAmericasWar.pdf (accessed 23 December 2012).

6 The United States Constitution, Article I, Section 8, item 12, http://constitutionus.com/ (accessed 23 December 2012).

7 Abraham Lincoln, “First Inaugural Address,” http://www.bartleby.com/124/pres31.html (accessed 23 December 2012).

8 Charles Guthrie and Michael Quinlan, Just War: The Just War Tradition: Ethics in Modern Warfare (New York: Walker & Company, 2007), 13.

9 “States’ Rights: The Rallying Cry of Secession,” Civil War Trust (2011) http://www.civilwar.org/education/history/civil-war-overview/statesrights.html (accessed 23 December 2012).

10 Alexander Hamilton Stephens, in David J. Eicher, The Longest Night: A Military History of the Civil War (New York: Simon & Schuster, 2002), 49.

11 Ibid., 364-5.

12 Thomas Cahill, The Gifts of the Jews: How a Tribe of Desert Nomads Changed the Way Everyone Thinks and Feels (New York: Anchor Books, 1998), 154.

13 Thomas Cahill, Mysteries of the Middle Ages: The Rise of Feminism, Science, and Art from the Cults of Catholic Europe (New York: Doubleday, 2008), 44.

14 Aristotle, for example, argues in his Politics, Book I, Chapters 3-6, as elsewhere, that there are those who are “intended by nature to be a slave” and those, on the other hand, who are naturally masters. The Confederate racial ideology as elucidated by Stephens, though never fully developed, seems to have been a revival of this way of reasoning, which further exhibits the nature of the American Civil War as a civil war in Western civilization as a whole, perhaps between the Hebraic and Greco-Roman strands thereof. (Aristotle, Politics, in Aristotle: II, ed. Robert Maynard Hutchins, Great Books of the Western World, Vol. 9 (Chicago: William Benton, 1952), 446-9.)

15 Although the sentiment is common to many early Christian writers, Gregory of Nyssa is singled out for having issued one of the clearest calls for abolition in the ancient world in his fourth homily on Ecclesiastes; see Eric Denby, “The First Abolitionist? Gregory of Nyssa on Ancient Roman Slavery,” 9 May 2011, http://www.academia.edu/1485109/The_First_Abolitionist_Gregory_of_Nyssa_on_Ancient_Roman_Slavery (accessed 23 December 2012).

16 Cahill, Gifts of the Jews, 249.

17 Augustine of Hippo, in Aquinas, Summa Theologica.

18 Aquinas, Summa Theologica.

19 Eric Foner, Reconstruction: America’s Unfinished Revolution, 1863-1877 (New York: HarperCollins Publishers, Inc., 2002), 73-4.

20 Abraham Lincoln, in Foner, Reconstruction, 74.

21 Ritchie Devon Watson, Jr., Normans and Saxons: Southern Race Mythology and the Intellectual History of the American Civil War (Baton Rouge: Louisiana State University Press, 2008).

22 “Jefferson Davis and the Assassination,” University of Missouri – Kansas City School of Law, http://law2.umkc.edu/faculty/projects/ftrials/lincolnconspiracy/davistestimony.html (accessed 23 December 2012).

23 St. Augustine, The City of God, Book 4, Chapter 14, tr. Marcus Dods, in Robert Maynard Hutchins, Augustine (Chicago: William Benton, 1952), 196.

24 Thomas H. Flaherty, ed., The Colonial Overlords (TimeFrame AD 1850-1900) (Alexandria: Time-Life Books, 1990), 140.

25 “Election of 1860,” The American Presidency Project, http://www.presidency.ucsb.edu/showelection.php?year=1860 (accessed 23 December 2012).

26 William E. Gienapp, “The Republican Party and the Slave Power,” in Robert H. Abzug and Stephen E. Maizlish, editors, New Perspectives on Slavery and Race in America: Essays in Honor of Kenneth M. Stampp (Lexington: University Press of Kentucky, 1986), 64-65.

27 John William Draper, in William C. Harris, “Abraham Lincoln and Secession,” The Lincoln Institute Presents: Abraham Lincoln’s Classroom, http://www.abrahamlincolnsclassroom.org/library/newsletter.asp?ID=140&CRLI=197 (accessed 23 December 2012).

28 Harris, “Abraham Lincoln.”

29 This conflict between the democratic principle of majority rule, enshrined in the Constitution, and the interests of the wealthy and powerful Southern aristocracy exhibits another way in which the American Civil War represents the summarizing of a conflict that had long troubled Western civilization as a whole, namely the conflict between the oligarchic and democratic forms of government. This rift in Western thought makes perhaps its first appearance in a written document with Herodotus, The History, Book III, pars. 80-3, in which passage the respective merits and demerits of monarchy, democracy, and oligarchy are discussed and debated. The history of Athens, arguably the world’s first democracy, also exhibits this tension. (Herodotus, The History, in Herodotus and Thucydides, ed. Robert Maynard Hutchins, Great Books of the Western World, Vol. 6 (Chicago: William Benton, 1952), 107-8.)

30 Harris, “Abraham Lincoln.”

31 Flaherty, Colonial Overlords, 140.

32 Abraham Lincoln, “Second Inaugural Address,” http://www.bartleby.com/124/pres32.html (accessed 23 December 2012).

33 Dorbolo, “Just War Theory.”

34 Russell S. Bonds, War Like the Thunderbolt: The Battle and Burning of Atlanta (Yardley: Westholme Publishing, 2009), 363.

35 For a classical presentation of the common depiction of the burning of Rome, see Suetonius, “The Life of Nero,” 38, in The Lives of the Caesars, http://penelope.uchicago.edu/Thayer/E/Roman/Texts/Suetonius/12Caesars/Nero*.html (accessed 23 December 2012).

36 Daniel Oakey, in “Sherman in Georgia!,” Home of the American Civil War (10 February 2002) http://www.civilwarhome.com/shermangeorgia.htm (accessed 23 December 2012).

37 “Sherman in Georgia!”

38 “The Carolinas Campaign: Death To All Foragers,” Wade Hampton Camp, http://www.wadehamptoncamp.org/hist-hvs.html (accessed 23 December 2012).

39 John G. Barrett, Sherman’s March Through the Carolinas (Chapel Hill: University of North Carolina Press, 1956), 96.

40 William T. Sherman, in “The Carolinas Campaign.”

41 Sherman’s statement sounds very similar to the claim of Aristotle in his Politics, Book I, Chapter 8, in which he asserts that “the art of war is a natural art of acquisition, an art which we ought to practise … against men who, though they be intended by nature to be governed, will not submit; for war of such a kind is naturally just.” In short, Aristotle, in a foreshadowing of Sherman, claims that it is right to take what one’s enemy cannot prevent one from taking and that the ability to acquire indicates that it is naturally just to do so. A similar sentiment is expressed in the famous Melian dialogue recorded in Thucydides’s account of The History of the Peloponnesian War, Book V, par. 89, in which the Athenians nonchalantly inform the Melians that “the strong do what they can and the weak suffer what they must.” This ethic of “might makes right” perhaps indicates the similarity of Sherman’s ideas of warfare to those developed before the advent of a full-fledged Just War theory following the triumph of Christianity in the Roman Empire. (Thucydides, The History of the Peloponnesian War, in Herodotus and Thucydides, ed. Robert Maynard Hutchins, Great Books of the Western World, Vol. 6 (Chicago: William Benton, 1952), 505.)

42 Ibid.

43 “Sherman’s March to the Sea,” Home of the American Civil War (16 February 2002) http://www.civilwarhome.com/marchtothesea.htm (accessed 23 December 2012).

44 Sherman, in Eicher, Longest Night, 847.

45 This statement presents an interesting contrast with the claim of the Presocratic Greek philosopher Democritus, as recorded by Plutarch, that men “ought to be instructed in the art of war … which is a source of great and glorious things for men,” in Plutarch, Against Colotes, 1126A. It demonstrates that even in the case of someone like Sherman, whose approach to warfare was far more in line with combat before the full flowering of Just War theory in the Christian era, perspectives had been altered and shaped by the introduction of new ideas on warfare. (Jonathan Barnes, Early Greek Philosophy (New York: Penguin Books, 2001), 229.)

46 Dorbolo, “Just War Theory.”

47 Sherman, in Eicher, Longest Night, 847.

48 Brian Orend, “Justice after War,” Carnegie Council for Ethics in International Affairs, http://www.carnegiecouncil.org/publications/journal/16_1/articles/277.html/_res/id=sa_File1/277_orend.pdf (accessed 23 December 2012).

Diplomacy and International Relations in the 20th Century

Diplomacy and international relations dominated the daily lives of average people more in the 20th century than in perhaps any previous century. Whereas earlier generations could live their lives free of such concerns, escaping the state of international relations in the 20th century was a near impossibility for the majority of the world’s population; the state of international relations and diplomacy was instead their ever-present concern. This heightened importance of diplomacy and international relations to nearly all people in the 20th century is largely attributable to two phenomena that arose essentially side by side: the rise of modern republican and democratic nation-states, in which every citizen plays a part in determining the policies of the government, and the increase in technology, especially the technology of warfare, which in a sense made the world simultaneously a “smaller” place and a more dangerous one.

Earlier generations had been able to live lives largely independent of any concern with diplomacy, international relations, or even politics in a more general sense. This was true of the ancient and medieval worlds as well as of the early modern period, essentially right up to the beginning of the 19th century. Although warfare has existed throughout human history and various peoples have no doubt been subject to the vicissitudes of politics, the whims of rulers, war, and diplomacy, change was generally gradual, and, given the limitations of communication and travel, generations could pass their lives with little or no knowledge of the political situation of the kingdom of which they were ostensibly subjects. Historian William Chester Jordan notes in his history of Europe in the High Middle Ages, for instance, that in that period few in France outside of Paris would have considered themselves “French.”1

The change from this situation to the one that predominated in the 20th century largely occurred in the 19th century. As with so much that distinguishes the 20th century from previous eras, the 19th century was the transition point. It was during this period, under the influence of such events as the American Revolution and the French Revolution, both of which occurred near the close of the 18th century, that the subjects of the various kingdoms of the world began the transition to becoming citizens of the nations of the world, a very important difference in terminology. Individuals of all ranks, races, and economic statuses came to have a greater say in the policies of their governments than ever before. Political decisions were now in the hands of the people as a whole rather than being the purview of only kings and the aristocrats and nobles who surrounded them. As a result, politics became a greater concern for the average person than it had been at any previous point in history.

The 19th century was also in large part the transition point for the second, and equally consequential, major change that distinguished diplomacy and international relations in the 20th century from those of previous centuries: the advent of a great deal of new technology, especially in travel, communications, and warfare.

New travel technology that arose in the 19th century and advanced significantly in the 20th includes trains, airplanes, and motor vehicles. Railroad travel enabled materials and men to move greater distances at greater speeds than ever before. Airplanes likewise increased the ability to move people and materials quickly and effectively, as well as to carry the war behind enemy lines in combat and reconnaissance. The reconnaissance balloons of the American Civil War in the 1860s led eventually to the stealth craft used by the opposing powers of the Cold War to spy on each other, and to the omnipresent danger of bombs falling suddenly and unexpectedly from the sky in any given place, making the matters of diplomacy an ever-present reality for all people.2 Similarly, motor vehicles made people all over the world more mobile than ever before.

In addition to moving people and things faster than ever before over great distances, messages also traveled with unprecedented speed. The telegraph changed the nature of warfare in the 19th century, and in the 20th century the advent of telephones, radios, and, later, computers and the internet made it possible to communicate around the world in a matter of seconds. Allied radio messages sent behind Nazi lines during World War II demonstrate the effectiveness of these new communication tools in shaping ideas, diplomacy, and warfare.3

Military technology was perhaps the greatest single force in shaping the realities of diplomacy and international relations in the 20th century and in bringing these subjects into the homes of otherwise average people all over the world. The Cold War was largely the product of a mutual fear between the Soviet Union and the United States that the other would use nuclear weapons to advance its side in the conflict of ideas. Even after the fall of the Soviet Union in 1991, the threat of nuclear weapons falling into the hands of Islamic terrorist groups or rogue nations such as Iran and North Korea continued to shape diplomacy at the highest levels as well as to bring the concerns of international relations to the minds of average people.

As a result of these two factors, the rise of individual participation in politics and the spread of technology that brought the realities of international relations into homes all over the world, a further element that defined diplomacy in the 20th century emerged: the focus on nearly all-encompassing ideological conflicts between large blocs of nations. Though it may seem ironic at first glance, individual participation in politics, by spreading concern with these issues more widely than ever before, created a situation in which international relations took on larger proportions than ever before. This can be seen in World War I, World War II, and the Cold War, the three conflicts which arguably defined international relations in the 20th century, all of which involved alliances of dozens of nations arranged against an “equal and opposite” alliance, with every nation participating ostensibly out of a conflict of ideology coupled with a perceived existential threat from the other side.

The defining feature of diplomacy and international relations in the 20th century, as with so much of what makes the 20th century distinctive, is ultimately the figurative shrinking of the world. The concerns of the government became the concerns of the average person, and, simultaneously, the realities and concerns of far-off lands came into the purview of people everywhere. These advances in individual political participation and in technology created the unique diplomatic situation of the 20th century.


1 William Chester Jordan, Europe in the High Middle Ages (New York: Penguin Books, 2002), 229.

2 Amrom H. Katz, Some Notes on the History of Aerial Reconnaissance (Santa Monica: RAND Corporation, 1966).

3 Robert Rowen, “Gray and Black Radio Propaganda against Nazi Germany,” New York Military Affairs Symposium, 18 April 2003 (accessed 2 December 2012), http://bobrowen.com/nymas/radioproppaper.htm.

The American Civil War: A Case Study in 19th Century Diplomacy

Just as in previous eras, diplomacy and international relations in the 19th century functioned as tools by which nations sought to advance their own interests relative to those of other nations. One event of the 19th century that serves as an example of many of the features and facets of diplomacy and international relations as they were practiced in that period is the American Civil War. In the Civil War, a part of the United States broke from the rest of the nation and formed its own ostensibly independent nation, the Confederate States of America. In a complex situation which combined domestic affairs with international relations, the struggle between the two sides included negotiations for prisoner exchanges and attempts by the Confederacy to draw certain European powers into the conflict on its side.

One feature of the Civil War which makes it an interesting case study in diplomacy is that the two powers primarily involved were two halves of the same nation, sharing a common history and identity, and yet one of those powers, the Confederacy, tried to separate itself from the other and regard itself as a distinct entity. The other power, the Union, attempted to keep the states which had joined the Confederacy from breaking away but was forced by circumstance to interact with the Confederacy as if it were a separate power. This created a unique situation for both powers, one in which domestic affairs and international relations had to be combined and, in some sense, treated as synonymous.

One example of this situation may be found in the prisoner exchanges attempted between the Union and the Confederacy. The weapons used by both belligerents in the war, like nearly all weapons before the 20th century, were notoriously ineffective. The soldiers behind the weapons were also often undertrained and sometimes entirely untrained. As a result, far more soldiers were wounded than killed, and far more enemy soldiers were captured by either side than were wounded or killed. Very early in the war “the ranks of prisoners began to swell.”1 In total, by the end of the war, the Union had “captured and held about 220,000 prisoners” and the Confederacy had taken approximately 210,000 prisoners.2

Because of these very large numbers of captured soldiers, the two sides found it difficult to adequately provide for those whom they held captive and devised a complex system of values by which to exchange the enemy’s prisoners for their own. Each prisoner was assigned a value determined by his rank and was traded to the enemy based on that value. A captured noncommissioned officer, for instance, was worth two privates. A captured general, on the other hand, was worth as many as 60 privates.
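To make the arithmetic of this exchange system concrete, the following minimal sketch (in Python, purely for illustration) assumes a lookup table of exchange values expressed in privates. Only the two figures given above, two privates for a noncommissioned officer and as many as sixty for a general, come from this essay; the private’s value of one and the helper function are hypothetical.

```python
# Minimal illustrative sketch of an exchange-value table; not a historical source.
# Only the values for "noncommissioned officer" (2) and "general" (60) come from
# the text above; the private's value of 1 and this helper are hypothetical.
EXCHANGE_VALUES = {
    "private": 1,
    "noncommissioned officer": 2,
    "general": 60,
}

def exchange_value(ranks):
    """Total value, in privates, of a list of captured soldiers' ranks."""
    return sum(EXCHANGE_VALUES[rank] for rank in ranks)

# Example: one captured general balances sixty captured privates.
print(exchange_value(["general"]) == exchange_value(["private"] * 60))  # True
```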

Of particular significance with regard to the complexities of mixing domestic with international affairs is the treatment prisoners received at the hands of the Confederacy versus that under the Union. Confederate soldiers captured by Union forces found far better conditions than Union soldiers captured by Confederate forces. The Union had hopes of restoring the Confederate states to itself and so tended to treat prisoners better, with a view to repatriating them in the future. The Union was also more willing to parole prisoners than the Confederacy, as can be seen in the 329,963 soldiers the Union had “paroled or exchanged” by war’s end versus the 152,015 prisoners the Confederacy had “paroled or exchanged.”3

Also demonstrative of these complexities are the failed attempts of the Confederacy to gain the recognition and support of European governments. Immediately after secession, Confederate leaders believed that European dependence on cotton from the states of the Confederacy would lead the nations of Europe to support the Confederate cause. Contrary to their hopes, however, the British government issued an official “proclamation of neutrality, which the other European powers followed” within only about a month of the beginning of the war.4

The Confederacy made several attempts throughout the war to gain legitimacy by securing the recognition of European governments and possibly even bringing them in on its side. It sent ambassadors, for instance, to the French and English capitals in the hope of persuading those nations’ leaders to support the Confederacy. It also, in part, determined battlefield tactics on the belief that the Europeans might be swayed by what they saw on the battlefield. General Robert E. Lee, for instance, justified his strike into Northern territory, which seemed to go against the stated Confederate desire not to conquer the entire United States but to establish an independent nation in the South, by reasoning that “a victory on Northern soil might spark foreign recognition for the young Confederate States, particularly from Britain and/or France.”5

Britain and France, for their part, both exercised shrewd diplomacy with regard to the war, which they saw as a regional conflict from which they might be able to secure some profit. To this end, both European nations refused to grant official recognition to the Confederacy, believing that doing so would alienate the United States. They did, however, agree to trade with the Confederacy as well as with the Union. In this way, they were able to secure financial gain from both sides in the conflict and to position themselves for future diplomatic success no matter which side won the war.

The American Civil War was a complex situation involving a strange combination of domestic and foreign affairs, and it exhibits the intricacies of both as they were practiced in the 19th century. The issues of prisoner exchange and of the involvement of European powers both serve as examples of this complexity.


1 David J. Eicher, The Longest Night: A Military History of the Civil War (New York: Touchstone, 2001), 629.
2 Ibid., 628.
3 Ibid., 629.
4 U.S. Department of State, “Preventing Diplomatic Recognition of the Confederacy,” accessed 18 November 2012, http://future.state.gov/when/timeline/1861_timeline/prevent_confederacy.html.
5 Eicher, Longest Night, 337.