Essays/Addresses

Index:

  1. On the Uses of History in Foreign Affairs
  2. On the Original and Continuing Meanings of Memorial Day

1) On the Uses of History in Foreign Affairs

An address delivered to the Sons of the American Revolution, June 2, 2016

By Michael F. Duggan, Ph.D.

A favorite bone of contention in both history and foreign policy concerns the role of history in policy: what kind of history should be used, and, if not history, what should take its place as a basis for informing policy decisions. In some respects, this divide between applied and pure history is part of the “realism versus theory/ideology/morality” debate. Some historians see history as primarily an intellectual or cultural pursuit, comparable perhaps to the study of fine art or literature for personal enrichment and edification. Other historians, actually working in the field of foreign affairs, make a powerful case that a broad, deep, and intimate understanding of history should be a, and perhaps the, primary basis for policy—i.e., to understand what worked and what did not, and why (1). Given the record of historian-policy advisers like George F. Kennan, they would seem to have a point (2).

Whenever I would begin a course on history or historiography, I would ask my class the deceptively simple-sounding question: why do we study history? Although there are any number of answers to this question (and my students came up with many good ones), I tended to limit my own replies to three.

The first was a kind of metaphysical assertion: we study the past because it is a fundamentally and intrinsically worthwhile human activity. This would include the history-as-an-end-in-itself thesis. Second, I would tell them that without a realistic understanding of what came before—without a grounding in a liberal education that includes philosophy, literature, and art, as well as history—we have a less complete understanding of who we are and of our place in the natural and human world in which we exist, however limited or imperfect that understanding might be. We study history, therefore, in order to be more fully developed and integrated conscious beings.

Finally, I would tell them, we study history to learn about past instances of problem-solving, and that a critical understanding of historical periods, episodes, and incidents may help us better navigate current problems. While history never repeats itself exactly, some events and sequences of events are sufficiently similar to provide meaningfully comparable, and therefore applicable, historical lessons. The trick, then, is to know when to apply a lesson and when to ignore it, or rather, when to honor it in the breach. Sound judgment and a flexible and realistic reading of history therefore provide a better basis for policy than does a rigid adherence to ideology or morality, and history helps develop the nuances of strategic judgment through an understanding of the past and of human nature.

With the Middle East in flames and the American public living with a vague, open-ended foreboding of impending terrorist attacks, it is important to ask how we got to where we are. The answer may be that, with the possible exception of 1989-1993, most United States foreign policy of the past half-century has ranged from the merely bad to the catastrophic.

Part of the problem is cultural and educational: how students are taught these days, because today’s students are tomorrow’s policy advisors, statesmen and women, commentators, academics, and diplomats. Rather than embracing the rigors and difficulty of analysis, the academy and the halls of power have become captivated by utopian narratives like globalization, neoliberalism, and neoconservatism, which have dubious bases in historical reality. Above all, we live in an age of often purblind overspecialization.

There was a time when universities taught students how to think, how to analyze questions, problems, and situations. If you went to Harvard in 1940 or 1960, you were taught to accept nothing on faith—you were taught to criticize and to evaluate ideas. At Georgetown you became a student of the Jesuit pedagogical tradition, learned analytical rigor, and came to fear no question, no matter where it led. Although some distinctive schools still retain what makes them unique (St. John’s College in Annapolis still bases its curriculum on the Great Books), most schools have become standardized.

Today higher education is becoming more and more homogenized as colleges and universities shift to a business model that sees students as customers and professors as the temporary help. Students going to a good state school are likely to get approximately the same education as they would at a good private school, and rather than being taught how to think, they will be taught what to think, with an eye to narrow, down-in-the-weeds professional application.

The result has been well-behaved, myopic careerists, credentialists, and conventionalists, their heads full of popular oracular wisdom and the million details of success (assuming they chose a lucrative major to begin with), but with little idea of the big picture or how the details fit into it. Students are as smart as ever—perhaps even smarter—and with such powerful tools as laptops, smartphones, and the Internet, there is virtually no limit to the information they can quickly access. But it has been my experience that many of them have little historical or wide-angle context for evaluating this information. They know about globalization as a mostly unexamined generality, but do they know that it leads to disparities of power and wealth that resemble the worst of nineteenth-century imperialism and plutocracy? Do they know that unrestricted free trade ultimately makes powerful nations less competitive relative to other nations of similar GNP that rely on trade bilateralism rather than multilateralism? They might if they had studied history.

There was a time when educated people were conversant in any number of topics, from modern European literature to ancient Greek philosophy to the newest discoveries in science. Many could read and even speak Latin and Greek. People took pride in their personal libraries and in the specific authors and titles they contained. In recent decades, however, the trend in American higher education has been to educate specialists at the expense of general knowledge. As the joke goes, we will continue to know more and more about less and less until we know everything about nothing. American professionals may or may not be among the best specialists in the world in any given area of formal application, but as regards policy, narrowness of focus at the expense of understanding how the pieces fit into the big picture can be a severe liability.

One implication of the homogenization and over-specialization of American higher education in recent decades has been that all of our baby boomer presidents have been wonderful politicians but inferior statesmen. As candidates, two of them were genuine whiz kids in terms of details and thinking on their feet in debates or at press conferences. Unless you were specifically against their policies, they could probably win you over with their oratory. Yet all three have shown themselves to be out of their depth in foreign affairs, relying on advisors of equally questionable judgment. Political style and an ideology of neoliberal economics via neoconservative means appear to have triumphed over substance and real understanding.

By contrast, consider that John Kennedy had an appreciation for history, had come of age with a front-row seat to the Court of St. James during a critical period of world history, and had actually written two books on historical topics. He also read The Guns of August—Barbara Tuchman’s masterpiece on how the great powers stumbled into the First World War as the momentum of events went beyond their ability to control them—only months before the Cuban Missile Crisis (3). Consider too that Llewellyn “Tommy” Thompson knew Khrushchev personally and understood that the Soviets were more-or-less rational opponents who would welcome a way out of the crisis with a reasonable veneer of face-saving cover. Thompson firmly asserted his views to the president, and his perspective prevailed (4). The Cuban Missile Crisis ended in a resolution that worked out well for everybody. Thus the knowledge of even specialists must go beyond mere formal understanding, and we ignore the importance of the personal at our own peril.

Consider also that some of the best Southeast Asia area experts at the State Department were weeded out by the McCarthy hearings of the 1950s (5). With a dearth of intimate historical insight, American policymakers and strategists of the 1960s tended to see Vietnam as little more than a venue for fighting the Cold War, a blank square on a chessboard. If, in the early 1960s, we had had a better understanding of Vietnamese history—if we had known that they had defeated the Mongols, the Chinese, and the Japanese in addition to the French, and were unlikely ever to surrender short of total annihilation—American choices about attempting to save Vietnam during the 1960s might have been very different.

This underscores the fact that there is also a kind of experiential knowledge and education that one cannot learn in the classroom: intimate (as opposed to formal) knowledge. How many present-day policy experts on the Middle East have advanced degrees in Islamic Studies or Middle Eastern Regional Studies from Ivy League universities? A lot. How many speak Arabic, Farsi, or Turkish? Quite a few. How many have lived among the ordinary people in this region for an extended period of time? How many have a good idea what these people say about us when we leave the room and when the microphones and cameras are gone? I wonder.

Why is this kind of experiential education important? Because it teaches lessons about what is really going on beyond the verbiage and appearances of policy, it informs judgment, and it helps us see ourselves the way others do. In the Middle East and elsewhere, it is far more important to watch what people do than simply to listen to what they tell us, or to imagine idealistically what parts of Western life and outlook they might or might not envy or emulate. That people are willing to take our money and tell us what we want to hear hardly makes them our friends, and an intimate understanding of a region is the best way to gain this kind of insight and to ward off the possibility of misunderstanding.

In 1946 George Kennan framed the overarching policy of the United States toward the Soviet Union, a grand strategy that came to be known as Containment. Sure, Kennan had gone to Princeton and had formally studied Russian (and a number of other languages), as well as Russian literature, culture, and history. But he also lived in the USSR for many years—decades, really—and knew its people, its leaders, and, for lack of a better term, its national psychology. He also knew the commonalities and irrationalities of human nature. In spite of much modification, tampering, and outright vandalism, Containment worked, and between 1989 and 1991 the Soviet Union imploded, much as Kennan had predicted it would more than forty years before.

Both science and policy deal with solving problems. But the more theoretical science gets, the clearer and more focused it becomes and the more solid its conclusions, while the more formally theoretical policy becomes, the more clouded and ineffectual it becomes and the more purblind and dogmatic its practitioners. No human situations repeat themselves exactly, but to the degree that they do, the best way of addressing current problems is to see how similar problems were successfully resolved in the past, or to recognize when analogous lessons apply to differing situations that share a few key aspects.

Knowledge of the past is by no means a guarantee for predicting or securing the course of future events, but a diplomat or negotiator armed with such understanding has a better chance of arriving at the optimal outcome than one without it. An understanding of human nature based on history (as opposed to purely idealistic, moral, or utopian theories that are untested, or that have been tried and failed) should therefore be a primary basis—perhaps the primary basis—for policy. As George Santayana reminds us, “[t]hose who cannot remember the past are condemned to repeat it.” It would seem that this lesson applies equally to those who never knew it.

Notes

  1. George F. Kennan is probably the best example.
  2. Going back to the American Revolution, there has been a split in American foreign affairs between a kind of proselytizing idealistic universalism of Thomas Jefferson and Thomas Paine and a kind of moderate realism originally advocated by Benjamin Franklin, Alexander Hamilton, George Washington, and John Quincy Adams. See, for instance, Alexander Hamilton’s Pacificus IV essay dated July 10, 1793, and Adams’s “Monsters to Destroy” speech of July 4, 1821. But how does moderate realism (as opposed to hardball realpolitik) compare to the record of policy based on ideological or moral considerations? It is fair to say that victory in WWII was based on a realistic approach, as was the strategy of containment that won the Cold War, the Marshall Plan, the rebuilding of Japan, and the Powell Doctrine of the Persian Gulf War. The Cold War was a complex litany of realistic measures, such as the Berlin Airlift and the Korean War (other than MacArthur’s foray north of the 38th parallel), and of the overreactions of ideology, “crackpot” realism, and idealism, such as NSC-68, the unnecessary escalations of the nuclear arms race generally, the plots to overthrow Castro as well as the governments of Guatemala and Iran, the Vietnam War, and some of Reagan’s “Evil Empire” talk during the early 1980s.

More recently, the U.S. intervention in Somalia, the Afghan War (other than the initial invasion to root out al Qaeda), and the Iraq War are instances of policy based on moral or ideological considerations.

American policymakers interpreted the so-called Arab Spring within the intellectual frames of globalization, pan-democratization, and other neoliberal beliefs and assertions. Rather than analyze each instance as a phenomenon unique to the nation in which it was taking place (Tunisia being different from Egypt, Egypt being different from Yemen, Yemen being different from Libya, Libya being different from…), they saw it in terms akin to a vitalistic Hegelian unfolding of historicist inevitability—a wave sweeping a region akin to the European revolutions of 1848. A NATO air campaign was launched over Libya as a humanitarian intervention that resulted in a humanitarian crisis that continues to this day.

In a similar spirit, the United States encouraged Sunnis in Syria to rise up against the Alawite Assad regime. But again historical inevitability did not deliver, and, more than a quarter of a million lives later, hundreds of thousands of Syrians are flooding into an already overcrowded Europe.

Interestingly, there are times when the practical thing to do would also appear to be the moral thing to do—when the “right thing to do” would be so startling in its results relative to the cost that it may be in a nation’s interest to do it. For example, it is possible that a few battalions of paratroopers or marines with infantry weapons and air support could have prevented the Rwandan genocide. If this is true, then the result would have been worth the effort. As it is, it remains little more than a haunting counterfactual.

The Marshall Plan, the rebuilding of Japan after the Second World War, and the Berlin Airlift were realistic measures with moral consequences. It could be argued that even ostensibly idealistic initiatives such as the Peace Corps and the Alliance for Progress had realistic consequences.

  3. James Giglio, The Presidency of John F. Kennedy (2nd ed.), Lawrence, Kansas: University of Kansas Press, p. 213, 2006.

  4. This incident is recalled by Robert McNamara in Errol Morris’s 2003 documentary film, The Fog of War.

  5. Specifically, I am referring to John Paton Davies, Jr., John Stewart Service, and John Carter Vincent. See Robert McNamara, In Retrospect, New York: Random House, pp. 32-33, 1995.

2) On the Original and Continuing Meanings of Memorial Day

An address presented to the General William Smallwood Chapter of the Sons of the American Revolution, May 30, 2016

By Michael F. Duggan, Ph.D.

For those of us who have been around a little while, the workaday year becomes an increasingly rapid cycle. Federal holidays become the mileposts of the year’s progress, and it is easy to lapse into a mindset in which a day off from work is just a day off from work—an opportunity to run errands, cook out, and go on day trips or long weekend getaways. I do not say this with particular judgment, much less condemnation. Who among us has not gone to a Fourth of July picnic? Who has not had a beer on Labor Day to toast the end of another summer?

But each holiday embodies an idea, a dedication—some celebratory, others solemn—and some involve a particular kind of remembrance. Remembrance is a tricky thing, though, especially when those who observe a day have no living memory, no living connection, and only a tenuous historical or cultural understanding of the event that occasioned the original purpose of the day, and of the original purpose itself. Although I would wager that this august assembly—dedicated in purpose to historical remembrance—has a far better understanding of the origins of Memorial Day than do most of our citizens, it is still useful and desirable from time to time to remind ourselves of the tragic origins of what many take simply as a day of leisure.

What we now call Memorial Day finds its origins, of course, in the catastrophic loss, suffering, and national disruption of our Civil War. But even the formal history of the day does not tell its full story; Memorial Day was a very personal thing—Decoration Day, as it was sometimes called—when those who had lost loved ones in the war would decorate the graves of the fallen. Presumably it was also an occasion for veterans to wear their decorations and to affirm their associations with the effort. In spite of common recognition and purpose, the day meant something distinctive to everyone who participated in its observance.

I have taught the Civil War at Georgetown University and have tried to impress upon my students the sense of loss the war produced among the populace. In fact, it was the first course I ever taught—a mere 16 months after the attacks of September 11, 2001. Although losses were far from uniform over the course of the Civil War—there were long periods of relative calm, and oftentimes deaths occurred days, weeks, and even months after battles—I told them that in order to understand the magnitude of the impact the war had on the country, they should imagine losses approximately equivalent to one 9/11-scale attack once a week for about 200 weeks, in a country with probably one-twelfth of our population.

The traditional statistic is that by the war’s end one in 200 Americans had died, although this ratio may be on the low side, given that the total losses have been revised upward in recent years to more than 700,000 killed. Some have posited even larger numbers. The South was profoundly defeated—some of its largest cities lay in ashes, and the region would not fully recover for a century or more.

Harvard professor Drew Gilpin Faust attempted to capture this all-pervasive sense of loss in her 2008 social history This Republic of Suffering. In its broadest terms, she sees the day as a kind of national project of reunification—the official and personal use of loss to unify the country in the decades that followed the war. In this sense, the men who fought to save the Union, and those who fought to break away from it, might in death serve in memory to mend the rifts and help reconcile the nation.

Today is also the 132nd anniversary of the famous Memorial Day address of the great Supreme Court justice and Civil War veteran Oliver Wendell Holmes, Jr. Holmes had been an officer in the 20th Massachusetts Volunteer Regiment of Infantry—the celebrated “Harvard Regiment” that was all but destroyed between its first battle in the fall of 1861 and Grant’s Virginia campaign of 1864. Holmes was mustered out during the summer of ’64.

Holmes’s address is remarkable in its intimacy, yet he harbored no illusions: he knew that the direct connection to the day would eventually be lost. He understood the inevitable obscuring effect of time on the original purpose of the day, and he believed, or hoped, that it would become a more generalized day of national observance.

We must remember that Holmes himself was shot within an inch of his life on two occasions: through the chest at Ball’s Bluff near Leesburg, and through the neck at Antietam. He almost lost his foot to a Confederate artillery shell at Chancellorsville—a wound that was slow to heal and which kept him out of Gettysburg a month-and-a-half later. At Gettysburg, in the center of the Union line just south of the Copse of Trees, 10 of the regiment’s 13 officers were either killed or wounded.

One can see from this speech that the day had a far more direct and personal connection for men like Holmes and his generation than it does for most of us today. In the address he calls out a kind of roster of the fallen—all friends, mostly killed. What is particularly striking about this impressive speech is that he did not mention the killed and wounded by name but rather by deed—it is a roster of deeds—knowing that his audience would instantly recognize them. For those of us who have not studied this unit in detail, their names, as we would expect, are unfamiliar other than as generalities—the names of prominent New England families like Revere, Abbott, and Putnam. For those who are interested, their names are emblazoned on the walls of the great Memorial Hall in Cambridge under the names of battles, large and small, that are seared into the consciousness of the historically aware. A fair number of Irish and German immigrants and Massachusetts farmers and fishermen also fought and died with the 20th Mass.

A born skeptic, Holmes was transformed by the war from a rebellious youth into something harder: a stern—some might say a dark—realist about life and the law. But he never lost his sense of sentiment about those with whom he fought. He would toast the dead, sometimes alone, on the anniversary of Antietam, just before he would begin his annual journey from Beverly Farms to Washington, D.C., and the beginning of the Supreme Court term in early October. Although he disagreed with their cause, he also praised the qualities of his enemy, with an almost Nietzschean circumspection and a realization that sometimes our foes are reflections of ourselves.

The rigors of war, and more than three years of some of the worst combat ever experienced by American fighting men, instilled in Holmes an atheistic, almost existentialist outlook with a kind of terrestrial faith. To him, life was a campaign toward an unknown end in which the individual—like the individual soldier—could see only a small portion of the field directly in front of him, with little idea of the bigger picture—the “great campaign,” as he put it. This echoes Lincoln’s famous “last full measure of devotion.”

I have my own personal associations with Memorial Day. My father—now in retirement—was an Army combat officer who served two tours in Vietnam. He was West Point ’59, Airborne, Ranger, Special Forces. Growing up in a military family, one becomes especially cognizant of Memorial Day, Veterans Day, and Armed Forces Day. As a young child I had friends whose fathers had been killed in Vietnam—my father was wounded during his first tour—and I distinctly remember him leaving for the theater of operations. Upon retirement from the military, he made a second career at the National Headquarters of the American Legion, fighting for the rights and benefits of veterans.

Unlike the Civil War and the World Wars—and even Vietnam—there is no draft today and therefore little shared sacrifice. In the Civil War, __% of men of combat age in the Northern States served in the Union Army and Navy. In the Second World War, about 18 million served out of a national population of around 140 million. About 12 million, or about 9% of the total population, were in uniform at war’s end. More than 61% of those who served between 1941 and 1945 were draftees. People my parents’ age can still recall the blue stars in the windows of neighbors and the Gold Star mothers. Most Americans had a father, uncle, brother, or son in military service.

Today, by contrast, those who fight make up a little over 1% of the overall population—they fight and die, and, other than the occasional story on the news, their losses do not affect most of us in a very personal way. This is why remembrance and Memorial Day continue to be especially important.

We live in a time in which undeclared wars have increasingly been used as a basis for foreign and economic policy. U.S. service men and women may be sent into harm’s way in furtherance of policy, but with little shared sacrifice; because so few of us feel the loss, there is relatively little criticism of policy. This is why the remembrance of the original purpose is not only important today, it is also topical.

I am not calling for a national draft—a proposition fraught with political difficulties left, right, and center—but I am suggesting that when a nation is made up of people with little, if any, shared sacrifice—when the line between those who serve and those who do not becomes too distinct—bad things can happen. Please be assured that I tread this ground lightly and do not excuse myself, for although I attended military school and registered for the draft, there was no pending war or call-up, and I never went through anything like the severe trials my father experienced—trials that the rest of us can only guess at.

Shared sacrifice is not only a watchdog on policy and policymakers, it is also a way to make sure that wars are not just fought by “other people’s children,” and a guarantee that days like today will be remembered for something like their original purpose. Otherwise, it becomes all too easy to mouth well-worn patriotic sayings about supporting the troops—what some British soldiers in the Great War called “newspaper patriotism”—while not realizing the true sacrifice of war. More simply, it is just nice when people remember the day for its original, deeply felt purpose. But speeches like this one are also very easy, and it is a source of cognitive dissonance to live in a nation that constantly praises our fighting men and women while our VA hospitals have languished for so long.

I will conclude with a story about a friend of mine who worked on the National Mall for the Park Service. One day, while posted at the Vietnam War Memorial, a group of visitors to the site—obviously swept up in the aesthetic and emotional power of the Wall—asked him what they could do to help, presumably as volunteers. He replied that they should consider volunteering at a veterans’ hospital. It was not the answer they expected or liked very much, but it is sometimes the answer we need, and something I need to work on myself. What I think he was saying is that it is our responsibility to be sure that the commitment to the ideas of Memorial Day—that remembrance—should consciously or unconsciously linger in proximity to our daily lives—that it should last beyond the day itself, that it should last the busy, livelong year.