George F. Kennan died 15 years ago yesterday. See posting of March 11, 2018.
The Wisdom and Sanity of Andrew Bacevich
Book Review (Unedited)
By Michael F. Duggan
Andrew J. Bacevich, Twilight of the American Century, University of Notre Dame Press, 2018. 492 pages.
What do you call a rational man in an increasingly irrational time? An anomaly? An anachronism? A voice in the wilderness? A glimmer of hope?
For those of us who devour each new article and book by Andrew J. Bacevich, his latest volume, Twilight of the American Century—a collection of his post-9/11 articles and essays (2001-2017)—is not only a welcome addition to the oeuvre, but something of an event. In these abnormal times, Bacevich, a former Army colonel who describes himself as a traditional conservative, is nothing short of a bomb thrower against the national security establishment. The ominous title of the present collection does not look out of place among the apocalyptic titles of a New Left history professor (Alfred W. McCoy/In the Shadows of the American Century), an apostate New York Times journalist flirting with socialism (Chris Hedges/America: The Farewell Tour), an economics professor from Brandeis University (Robert Kuttner/Can Democracy Survive Global Capitalism?), and a prophetic legend (Jane Jacobs/Dark Age Ahead).
The new book was worth the wait.
A collection by a prolific author with broad, deep, and nuanced historical knowledge and understanding, Twilight of the American Century lends powerful insight over a wide territory of issues, events, and personalities. The brevity of the pieces makes it possible to pick up the book at any point or to jump ahead to areas of personal interest. Bacevich, a generalist with depth and a distinctive voice, offers what is without a doubt one of the most sensible takes on foreign policy and military affairs today.
In terms of outlook, he is a throwback to a time when “conservatism” meant Burkean gradualism—a careful, moderate outlook advocating terraced progress over the jolts and whiplash of radical change and destabilizing shifts in policy. This perspective is based on a realistic understanding of human nature, the idea that people are flawed and that traditions, the law, strong government, and the separation of powers are necessary to accommodate—to contain and balance—the impulses of a largely irrational animal and what Peter Viereck calls the “satanic pride” of the “unchecked ego.”
As regards policy, traditional (read “true”) conservatism is fairly non-ideological. It holds that rapid fundamental change results in instability and eventually violence. Those who have studied utopian projects or events like the Terror of the French Revolution, the Russian Revolution, or the Cultural Revolution may realize that this perspective is on to something. Traditional conservatives like Viereck believe that a nation should keep those policies that work while progressing gradually in areas in need of reform. They also embrace progressive initiatives when they appear to be working or when a more conservative approach is insufficient (Viereck supported the New Deal). The question is whether or not gradualistic change is possible in a time of great division in popular politics and lockstep conformity and conventionalism among the members of the Washington elite.
From his shorter works as well as books like The Limits of Power, Washington Rules, and America’s War for the Greater Middle East (to name a few) one gets two opposite impressions about Bacevich and his perspective. The first is that he never abandoned conservatism, it abandoned him and became something very different—a bellicose radicalism of the right that is odious to true conservatives (Viereck describes this as “the smug reactionary misuse of conservatism”). The second is more personal, that, like a hero from Greek tragedy, he realized in midlife that what he had believed to be true was wrong. At the beginning of his brutally honest and introspective introduction to the present book, he writes:
“Everyone makes mistakes. Among mine was choosing at age seventeen to attend the United States Military Academy, an ill-advised decision made with little appreciation for any longer-term implications that might ensue. My excuse? I was young and foolish.”
The implication of such a stark admission is that when one errs so profoundly, so early in life, it puts everything that follows on a mistaken trajectory. While this seems to be tragic in the classical sense (and is certainly “tragic” in more common usage as a synonym for personally catastrophic), it also appears to be what has made Bacevich the powerful critic he has become. After all, to the wise, truth comes out of the honest realization and admission of error. His previous “erroneous” life gives him a template of uncritical assumptions against which to judge the insights hard-bought through experience and independent learning after he arrived at his epiphany, his moment of peripeteia. The “mistake” (more like an object lesson in harsh self-criticism) and the realizing of it with such clarity of vision and disillusioned historical understanding make him the superb and principled critic he has become (and to be frank, his career as an army combat officer gives him a certain street cred that cannot be easily dismissed and which he could not have earned elsewhere). It seems unlikely that Bacevich would have arrived at his current perspective as just another university professor.
One can only speculate about whether or not he makes the truth of his early “error” out to be more tragic than it really is. A more charitable reading is that this admission casts him as the hero in a Popperian success story of one who has taken the correct lessons from his own experiences, from trial and error. One can hardly imagine a more fruitful intellectual awakening in midlife. It is also difficult to imagine how he would have arrived at his depth as a mature commentator via a more traditional academic route. Apologies, I draw close to psychologizing my subject.
In order to be a commentator of the first rank, a writer must know human nature—its attributes as the paragon of animals, its foolishness, its willfulness, its murderous animal irrationality—and must have judgment and a sense of circumspection that comes from historical understanding. One must know when to criticize and when to forgive, lest one become mean. Twain was a great commentator because he forgave foibles while telling the truth. Mencken is sometimes mean because he does not always distinguish between forgivable failings or weaknesses and genuine faults, and exempts himself from his spot-on criticism of others.
An emeritus professor at Boston University, Bacevich knows history as well as any contemporary public intellectual and better than most. His historical understanding far exceeds that of the neocon/lib critics and policymakers of the Washington foreign policy Blob. He carries off his criticism so effectively, not by a lightness of touch, but by frank honesty and frequent humor and irony. It is apparent from the first line of the book that he holds himself to the same standards and one senses that he is his own toughest critic. His introduction is self-critical to the point of open confession. Bacevich is tough, but he is one of those rare people who is able to keep himself unblinkingly honest by not exempting himself from the world’s imperfections.
He dominates discussion then, not by raising his voice, but by reason and clarity of vision, sequences of surprising observations and interpretations that expose historical mythologies, false narratives, and mistaken perceptions, with an articulate and nuanced, if at times dour voice. Frank to the point of bluntness, he calls things by their proper name and has what Hemingway calls “the most essential gift for a good writer… a built-in, shockproof, bullshit detector,” the importance of which goes double if the writer is a historian. In less salty language, and in a time when so many commentators tend to defend questionable positions, Bacevich’s articles are a tonic because he simply tells the truth.
In his review of Frank Costigliola’s The Kennan Diaries, he flirts with meanness and overkill, but perhaps I am being oversensitive. Like many geniuses—assuming that he is one—Kennan was an eccentric and a neurotic, and it is all too easy to enumerate his many obvious quirks (if we judge great artists, thinkers, and leaders by their foibles and failures, one can only wonder how Mozart, Beethoven, Byron, van Gogh, Churchill, Fitzgerald, Frank Lloyd Wright, Hemingway, and Jackson Pollock would fare; even The Bard would not escape whipping if we judge him by Henry VIII). As a Kennan partisan who tends to rationalize the late ambassador’s personal flaws, perhaps I am just reacting as one whose ox is being gored. I am not saying that Bacevich gets the facts wrong, only that his interpretation lacks charity. He rightly calls out Kennan’s views on race.*
This outlining of Kennan’s shortcomings also struck me as ironic and perhaps counterproductive in that Bacevich is arguably the closest living analog or successor to Mr. X as a commentator on policy, both in terms of a realistic outlook and in the role of historian-as-Cassandra who is likely to be right and unlikely to be heeded by the powers that be. Both men fill the role(s) of the conservative as liberal-minded realist, historian as tough critic, and critic as honest broker in times desperately in need of correction (i.e. a sane man in insane times). As regards temperament, there are notable differences between the two: Bacevich strikes one as a stoical (Augustinian?) Catholic where Kennan, at least in his diaries, comes across as a Presbyterian kvetch and perhaps a clinical depressive with some ugly social views. Like Kennan too, Bacevich is right about many, perhaps most things, but not about everything; perfection is too much to ask of any commentator and we should never seek spotless heroes. The grounded historical realism and clear-sighted adumbration of both men is immune to the seduction of bubbles a la mode, the conventionalist clichés of liberal interventionism and neoconservatism. Such insight is a rare gift that deserves our consideration and admiration.
The book is structured into four parts: Part 1. Poseurs and Prophets, Part 2. History and Myth, Part 3. War and Empire, and Part 4. Politics and Culture. The first part is made up of book reviews and thumbnail character studies. If you have any sacred cows among the chapter titles or in the index, you may find your loyalties strongly tested and if you have anything like an open mind, there is a reasonable chance that your faith in a personal favorite will be destroyed. Charlatans, true believers, puppet masters, and bona fide villains, as well as mere scoundrels and cranks including the likes of David Brooks, Tom Clancy, Tommy Franks, Robert Kagan, Donald Rumsfeld, Paul Wolfowitz, Albert and Roberta Wohlstetter, and, yes, George Kennan, all take their lumps and are stripped of their New Clothes for all to see. Throughout the rest of the book there is a broad cast of characters that receive a similar treatment.
This is not to say that Bacevich does not sing the praises of his own chosen few, including Randolph Bourne, Charles and Mary Beard, Christopher Lasch, C. Wright Mills, Reinhold Niebuhr, and William Appleman Williams, but here too he is completely honest and provides a list of his favorites up front in his introduction (his inclusion of the humorless misanthrope Henry Adams—another Kennan-like prophet, historian, and WASPy whiner—is both surprising and not).
Where to begin? Bacevich’s essays are wide-ranging and yet embody a consistent outlook. Certain themes inevitably overlap or repeat themselves in other guises. He has a Twain-like antipathy for frauds and fakes and is adept at laying bare their folly (minus Twain’s punchlines and folksy persona). The problem with our time is that these people have come to dominate and their outlooks have become an unquestioned orthodoxy among their followers and in policy circles in spite of a record of catastrophe that promises more of the same.
To read Bacevich’s criticism is to realize that things have gone well beyond an Establishment wedded to an ideology of mistaken beliefs and into a realm of group psychosis. One comes away with the feeling that the Establishment of our time has become a delusional cult beyond the reaches of reason and perhaps sanity. Hume observes that “reason is the slave of the passions” and it is striking and frustrating to read powerful arguments and interpretations that are unlikely to change anything. If anything, Bacevich’s clarity of vision, common sense, and impressive historical fluency tend to disprove the observation attributed to Desiderius Erasmus that “in the land of the blind, the one-eyed man is king.” Rather, in a kingdom of the blind, a clear-sighted person will be ignored as a lunatic or else a marginal threat. If the kingdom is a theocracy, he will be burned as a heretic, if caught.
Are there any criticisms of Bacevich himself? Sure. For instance, one wonders if, like a gifted prosecutor, at times he makes the truth out to be clearer than it may really be. In this sense his brilliant Washington Rules is a powerful historical polemic as well as an interpretive short treatment of a period (less pointed surveys of the Cold War would include Robert Dallek’s The Lost Peace, Tony Judt’s Postwar, and James T. Patterson’s Grand Expectations). Thus it is fair to regard him as a historian with a strong jab (again, this is not to suggest that he is wrong or even that he exaggerates). Or to put it in another, perhaps more accurate way, Professor Bacevich is one of the great interpretive historians of our time; it is just that the cynicism and abnormality of the period since 1945, and especially since 1989, make an honest accounting seem polemical. Getting history right is important, and whether one is an interpretive historian or a two-fisted counter-puncher (or both) is ultimately trivial.
Also, given the imminent threat posed by the unfolding environmental crises, I found myself hoping that he would wade further into topics related to climate change—the emerging Anthropocene (i.e. issues of population/migration, human-generated carbon dioxide, loss of habitat/biodiversity, soil depletion, the plastics crisis, etc.)—and wondering how he might fit in with commentators like John Gray, Elizabeth Kolbert, Jed Purdy, Roy Scranton, Edward O. Wilson, and Malthus himself.
The only other criticism is that Bacevich is so prolific that one laments not finding his most recent articles in this collection. This is obviously a First World complaint.
Unlike a singular monograph, there is no one moral to this collection but a legion of lessons: that events do not occur in a vacuum—that events like Pearl Harbor, the Cuban Missile Crisis, and 9/11, and the numerous U.S. wars in the Near East all had notable pedigrees of error—and that bad policy in the present will continue to send butterfly effect-like ripples far into the future; that the stated reasons for policy are never the only reasons and often not the real ones; that some of the smartest people believe the dumbest things and that just because you are smart doesn’t necessarily mean that you are sensible or even sane; that the majority opinion of experts is often wrong; that bad arguments sometimes resonate broadly and masquerade as good ones and that without a nuanced understanding of history it is impossible to distinguish between them (even an intimate historical understanding of past events is no guarantee of sensible policy). If there is an overarching lesson from this book it is that the United States has made numerous wrong turns over the past decades that have put it on a perilous course on which it continues today at an even greater pace: we have topped the great parabolic curve of our national history and are heading down. Thus the title.
In short, Bacevich, along with Barlett and Steele, and a handful of other commentators on foreign policy, economics, and the environment, is one of the contemporary critics whose honesty and rigor can be trusted. As a matter of principle, we should always read critically and with an open mind, but in my experience, here is an author whose analysis can be taken as earnest, sensible, and insightful. He is also a writer of the first order, and the book is a triumph of applied history.
My recommendation is that if you have even the slightest feeling that things are amiss in this nation, its governance and policy, or if you are simply earnest about testing the validity of your own beliefs, whatever they are, you should read this book. If you think that everything is fine with the country and its policy course, then you should buy it today and read it cover to cover. After all, there is nothing more dangerous than a true believer and we arrive at wisdom by correcting our mistaken beliefs in light of experience, good faith discussion, and more powerful arguments to the contrary.
Postscript
Having had a chance to read Professor Costigliola’s recent (2023) biography, Kennan, A Life between Worlds, I now believe that Bacevich’s criticisms are entirely warranted.
New Article: Climate Change and the Limits of Reason
A new article of mine, “Climate Change and the Limits of Reason,” is currently available on CounterPunch at:
Climate Change and the Limits of Reason
A Wonderful Life?
By Michael F. Duggan
I have always loved the Capra holiday classic It’s a Wonderful Life, but have long suspected that it is a sadder story than most people realize (in a similar but more profound sense than Goodbye, Mr. Chips). One gets the impression from the early part of the film that George Bailey could have done anything but was held back at every opportunity. After watching it last year, I tried to get my ideas about the film organized and wrote this analysis.
In spite of its heart-warming ending, the 1946 Christmas mainstay by Frank Capra, It’s a Wonderful Life, is in some ways an ambiguous film and likely a sad story. George Bailey, the film’s protagonist played by Jimmy Stewart (in spite of his real-life Republican leanings), is the kind of person who gave the United States its most imaginative set of political programs from 1933 to 1945, policies that shepherded the country through the Depression, won WWII, and resulted in the greatest period of economic prosperity, from 1945 until the early 1970s. Bailey wants to do “something big and something important”—to “build things” to “plan modern cities, build skyscrapers 100 stories high… bridges a mile long… airfields…” George Bailey is the big thinker—a “big picture guy”—and his father, Peter Bailey, the staunch, sensible, and fundamentally decent local hero. We need both kinds today.
In a moment of frankness bordering on insensitivity, George tells his father that he does not want to work in the Bailey Building and Loan, that he “couldn’t face being cooped up in a shabby little office… counting nickels and dimes.” His father recognizes the restlessness, the boundless talent and quality, the bridled energy, the wide-angle and high-minded ambition of his son. Wounded, the senior Mr. Bailey agrees with George, saying “You get yourself an education and get out of here,” and dies of a stroke the same night (his strategically-placed photo remains a moral omnipresence for the rest of the film, along with photos of General Pershing and U.S. presidents to link local events to broader historical currents).
One local crisis or turn of events after another stymies all of George’s plans to go abroad and change the world just as they are on the cusp of fruition. Rather than a world-changer, he ends up as the local fixer for the good—a better and more vigorous version of a local hero, a status that confirms his “wonderful life” at the film’s exuberant ending where a 1945 yuletide flash mob descends on the Bailey household, thus saving the situation by returning years’ worth of good faith, deeds, and subsequent material wealth and prosperity. But what is it that sets George apart from the rest of the town that comes to depend upon him over the years?
At the age of 12 he saves his younger brother Harry from drowning (and by extension, a U.S. troopship in the South Pacific a quarter of a century later), leaving him deaf in one ear. Shortly thereafter, his keen perception prevents Mr. Gower, the pharmacist (distracted by the news of the death of his college student son during the Spanish Flu pandemic of 1918-1919), from accidentally poisoning a customer. As a young adult, George’s speculating about making plastics from soybeans by reviving a local defunct factory adds to the town’s prosperity and makes a fortune for his ambitious but less visionary friend, Sam “hee-haw” Wainwright, but not himself.
Other than saving the Building and Loan from liquidation, George’s primary victory is marrying his beautiful and wholesome sweetheart—“Marty’s kid sister”—Mary (Donna Reed) and raising a family. With a cool head, insight, and the help of his wife, George stops a run on the Building and Loan in its tracks with their in-hand honeymoon funds. The goodwill is reciprocated by most of the institution’s investors (one notably played by Ellen “Can I have $17.50” Corby, later Grandma Walton).
From there George goes on to help an immigrant family buy their own house and in fact helps build an entire subdivision for the town’s respectable working class, all the while standing up to the local bully: the cartoonishly sinister plutocratic omnipresence and dark Manichaean counterweight to everything good and decent in town, Mr. Potter (Lionel Barrymore). Potter is the lingering unregulated nineteenth century, a caricature of the predatory robber baron, a dinosaur that in modified form cooked the economy during the 1920s, resulting in the Great Depression. Even Potter comes to recognize George’s quality and, with an approach distantly related to charm, unsuccessfully attempts to buy him off (after presenting a brutally accurate assessment/summary of George’s life to date).
During the war, George’s bad ear keeps him out of the fighting (unlike the real Jimmy Stewart, who flew combat missions in a B-24), and he makes himself useful with such patriotic extracurriculars as serving as an air raid warden and organizing paper, rubber, and scrap metal drives. And yet he seems to have adapted to and even accepted his fate of being tethered to the small financial institution he inherited from his father, and therefore the role of the town’s protector. He seems more or less happily resigned to his fate as a thoroughbred pulling a scrap metal wagon.
Were George Bailey just another guy in Bedford Falls or most towns in the United States (or, in Old Man Potter’s words, “if this young man was a common, ordinary yokel”), this would indeed be a wonderful life, and for most of us it would be. Even with all of his disappointments, his life is a satisfactory reply to the unanswerable Buddhist question, “How good would you have it?”
Taken at face value, George seems to be a great success, in case this is not abundantly clear from the boisterous but benevolent 1940s Christmastime riot of unabashed exuberance—a reverse bank run or bottom-up version of a New Deal program or a spontaneous neighborhood Marshall Plan—at the movie’s end. His life’s investment in common decency pays dividends he did not imagine because it was all too close and familiar. Indeed, George’s bailout upstages his brother—now a Medal of Honor recipient—who proclaims, “To George Bailey, the richest man in town.” This is confirmed in the homey wisdom inscribed in a copy of Tom Sawyer by George’s guardian angel Clarence (a silly device and comic relief in a story about attempted suicide), that “No man is a failure who has friends.”
Of course Clarence is introduced into an already minimally realistic story to provide George with the exquisite but equally silly luxury—“a great gift”—of seeing what would have become of the town and its people without him (although to a lover of jazz, the counterfactual business district of Pottersville—an alternate reality to the overly precious Norman Rockwellesque Bedford Falls—is not completely lacking in appeal, with its hot jazz lounges, jitterbugging swing clubs, a billiards parlor, a (God forbid) burlesque hall, stride piano, and what appears to be a fleeting cheap shot at Fats Waller).
In this Hugh Everett-like alternate narrative device and dark parallel universe, he sees that his wife Mary is an unhappy mouse-like spinster working in a (God forbid) library; that Harry drowned as a child and was therefore not alive in 1944 to save a fully-loaded troop transport in the South Pacific. Likewise, everybody else in the town is an embittered, antisocial, outright bad or tragic version of themselves relative to the personally frustrating yet generally wonderful G-rated version of George’s wonderful life and town.
The problem is that George is not ordinary. He is no mere careerist, conventionalist, or money-chasing credentialist—he is a quick-thinking, from-the-gut maverick problem-solver with a heart of gold. He is exactly the kind of person we need now, but whom the establishment of our own time despises. Although harder to spot on sight in our own time, the charming and attractive Mr. Potters of the world have won.
In literary terms, George is not a typical beaten-down loser-protagonist of the modernist canon; he is not Bartleby the Scrivener, Leopold Bloom, J. Alfred Prufrock, Willy Loman, William Stoner, or the clueless victims of Kafka, but then neither is his stolid father. George is more akin to Thomas Hardy’s talented but frustrated Jude Fawley or a better version of James Hilton’s Mr. Chips—characters who might have amounted to more had they not been limited or constrained by internal and external circumstances.
Even more so, George is a descendant or modern cousin to the tragic-heroic protagonists of the Greeks and Shakespeare (i.e. a person who could have pushed the limits of human possibility). If only he could have gotten up to bat. He might have done genuinely great things, had his plans gotten off the ground, had the unforeseen chaos of life and social circumstances not intervened. We have seen what things would have been like without George, but we can only wonder what might have been if he had been allowed to succeed. Let’s see Clarence pull that trick out of his hat.
Just after breaking his father’s heart by revealing his ambitions, George confides to the older man that he thinks he is a “great guy.” True enough. But the conspicuous fact is that the older Bailey is much more on the scale of a local hero, a “pillar of the community”—a necessary type for any town, one who extinguishes the day-to-day brush fires, and therefore perhaps more fully actualized and resigned to his modest role (even though it kills him mere hours later, or was it George’s announcement of his ambitions and desire to leave?). But George has bigger plans and presumably the abilities to match.
In a perfect world, someone like Mr. Bailey, Sr. would be (and in fact is) better cast in the role to which his son is relegated, even though his ongoing David versus Goliath battles with Potter likely contributed to his early death. George might have found an even more wonderful life if he had gone to college and law school and then gone to Washington to work for Tommy Corcoran and Ben Cohen drafting legislation, or as a project manager of a large New Deal program, or managing war production against the Nazis and Imperial Japanese. Instead he admonishes people to turn off their lights during air raid drills. In a better world, a lesser man could have handled all the relative evils of Bedford Falls. It’s a Wonderful Life is a tale of squandered talent.
Of course an alternative reading is that George is delusional throughout the movie, that he is not as great as we are led to believe, that—like most of us—he is not as good as his biggest dreams would suggest. Desire ain’t talent. But there is nothing in the film to suggest that this is the case. And the film’s ending suggests the opposite (to say nothing of its place in the Capra canon—compare and contrast the ensemble and the feel of this film with those of Capra’s 1938 version of George S. Kaufman’s You Can’t Take It with You, which also features Jimmy the Raven and a completely lovable Lionel Barrymore).
The moral for our own time is that we need both kinds of Mr. Baileys—father and son—and it is clear that in spite of numerous local victories, George could have done far more in the broader world (his shorter, less interesting younger brother, Harry, seems to have unintentionally hijacked George’s plans and makes a good go of them: he goes off to college, lands a plum research position in Buffalo as part-and-parcel of marrying a rich and beautiful wife, disproportionately helps to win a world war, and returns after flying through a snowstorm—amazingly, as the same happy-go-lucky prewar kid brother—complete with our nation’s highest military honor after lunching with Harry and Bess Truman at the White House). George is the Rooseveltian top-down planner and social democrat while Mr. Bailey, Sr., is the organic, Jane Jacobs localist. Harry provides a glimpse at what George might have accomplished.
Even if we accept Capra’s questionable premise that George’s life is the most wonderful of possible alternatives (or at least a pretty darned good one), the ending is not an entirely satisfactory Hollywood ending. George’s likable, but absent-minded, Uncle Billy (Thomas Mitchell) inadvertently places $8,000 (perhaps ten or twenty-fold that amount in 2018 dollars) into Mr. Potter’s hands (a crime witnessed and abetted by Mr. Potter’s wheelchair-pushing flunky, who, without uttering a single word, is arguably the most despicable person in the film—an equal and opposite silent counterpart to the recurring photograph of the late Mr. Bailey, Sr.), and his honest mistake is never revealed nor presumably is the money ever recovered.
Mr. Potter’s crime does not come to light, and George is nearly framed by the incident and driven to despair. Instead of a watery, self-inflicted death in the surprisingly deep Bedford River, he is happily bailed out (Bailey is bailed out after bailing out the town so many times), first by a homely angel and then by the now prosperous town of the immediate postwar years.
The fact that his rich boyhood chum, the affable, frat-boyish Sam Wainwright, is willing to advance $25,000 out of his company’s petty cash puts the crisis into broader perspective and makes us realize that George was never really in that much trouble, at least not financially (although the Feds might have found such a large transfer to a close friend with a mysterious $8,000 deficit to be suspicious). Wainwright’s telegram is a comforting wink from Capra himself. Had he not been so distracted by an accumulation of trying circumstances—the daily slings and arrows of being a big fish in the plunge basin of Bedford Falls (to mix metaphors)—this kindness of Sam’s and the whole town is something that George might have intuited himself, thus averting his breakdown in the first place. The bank examiner (district attorney?), in light of the crowd’s vouching for George’s reputation with a cash flow cornucopia, tears up the summons, and lustily joins in singing “Hark! The Herald Angels Sing.” We know that the townspeople will be paid back with interest greater than a ten-year war bond.
Still, the loss of $8,000 in Bedford Falls in 1945 is a crisis that drove George to the brink of suicide. This is a movie about hitting one’s limit. The seriousness of the crisis is another manifestation of the scale of events to which George has been consigned. If he had been a manager of wartime industrial production or a 1940s industrialist, like Sam Wainwright, a similar deficit would have been a rounding error on a government contract that nobody would have noticed. On a side note, it would have been more appropriate for Heaven to have dispatched its resources to war-ravaged Europe in late 1945, rather than to a single person in a prosperous American town (or was the Marshall Plan really the Clarence Plan?).
At the movie’s end, George is safe and obviously touched by the outpouring of his community and appreciates just how good things really are (and you just know that a scene beginning with Donna Reed rushing in and clearing off an entire tabletop of Christmas-wrapping paraphernalia to make room for the charitable deluge to follow is going to be ridiculously heart-warming). His life may not have been on a grand scale, but the historical course of events that includes him is clearly better than the alternative.
At the film’s end, George is just as local as he was at the beginning. He has been powerfully instructed to be happy with the way things have turned out (why not, it’s almost 1946 in the United States, after all, and the bigger events in the world appear to have turned out just fine, right?). His wonderful life has produced a wonderful effort to meet a (still unsolved) crisis. But the thought lingers: could Clarence have shown him an alternative life’s course in which he was able to pursue his dreams? Just imagine what he could have done with 1940s federal funding and thousands of similarly well-intended people to manage—like those who engineered the New Deal, the WWII military and industrial mobilization, and the Marshall Plan. Would his name have ranked along with the likes of Harry Hopkins, Harold Ickes, Rex Tugwell, Adolf Berle, Raymond Moley, Frances Perkins, Thomas Corcoran, Benjamin Cohen, Averell Harriman, George Marshall, and Franklin and Eleanor themselves?
It is impossible to resist the warmth and decency of this film’s ending (I have watched it in June and July), and I know that this essay has been minute and dissecting in its analysis. But what lessons might we take? I think the moral to those of us in 2018 is that below the surface of this wonderful movie is the cautionary tale that if we are to face the emerging crises of our own time, we will need a whole Brains Trust’s worth of George Baileys in the right places and legions of local people like his father throughout the nation. There is a danger in shutting out the George Baileys of our time or consigning them to the wrong role. And yet our system as it exists today seems designed to do just that. We must also come to recognize the Mr. Potters of big business, big finance and their minions in the halls of political power who have dominated American public life for the past half-century. I suspect that they look nothing like Lionel Barrymore.
The Last Realist: George Herbert Walker Bush
By Michael F. Duggan
There was a time not long ago when American foreign policy was based on the pursuit of national interests. During the period 1989-1992 the United States was led by a man who was perhaps the most well-qualified candidate for the office in its history—a man who had known combat, who knew diplomacy, intelligence, legislation and the legislative branch, party politics, the practicalities of business and organizational administration, and how the executive and its departments functioned. For those of us in middle life, it seems like only yesterday, and yet in light of what has happened since in politics and policy, it might as well be a lifetime and a world away. The question is whether his administration was a genuine realist anomaly or merely a preface to what the nation has become.
Regardless, here’s to Old Man Bush: a good one-term statesman and public servant who was both preceded and followed by two-term mediocrities and mere politicians. A Commander-in-Chief who oversaw what was arguably the most well-executed large-scale military campaign in United States history (followed by poll numbers that might have been the highest in modern times) only to lose the next election. A moderate in politics and a good man personally who famously broke with the NRA, gave the nation a necessary income tax hike on the rich (for which his own party never forgave him), but against his better instincts adopted the knee-to-groin campaign tactics of party torpedoes and handlers in what became one of the dirtiest presidential campaigns in US history (1988) and ushered in the modern period of “gotcha” politics.
Some critics at the time observed that Bush arose on the coattails of others, a loyal subordinate, a second-place careerist and credentialist who silver-medaled his way to the top, a New England blue blood carpetbagger who (along with his sons) ran for office in states far from Connecticut and Maine. Such interpretations do violence to the dignity, nuance, diversity, and sheer volume of the man’s life. Bush was the real thing: a public servant—an aristocrat who dedicated most of his life to serving the country. Prior to becoming President of the United States, Bush served in such diverse roles as torpedo bomber pilot, wildcat oilman, Member of the House of Representatives, Liaison to a newly-reopened China, U.S. Ambassador to the United Nations, Chairman of the RNC, Director of the CIA, and Vice President of the United States. He was not, however, a spotless hero.
Foreign Affairs
The presidency of George Herbert Walker Bush (or just plain “George Bush” prior to the late 1990s) was a brief moment, in some respects an echo of the realism that served the nation so well in the years immediately following WWII.
A foreign policy realist in the best sense of the term, Bush was the perfect man to preside over the end of the Cold War, and my sense is that the most notable foreign policy achievements of the Reagan presidency probably belong even more to his more knowledgeable vice president with whom he consulted over Thursday lunches. As president in his own right, it was Bush who, with the help of a first team of pros that included the likes of Brent Scowcroft, James Baker, and Colin Powell, let Russia up gently after the implosion of the USSR (he knew that great nations do not take victory laps), only to be followed by amateurs and zealots who arrogantly pushed NATO right up to Russia’s western border and ushered in what looks increasingly like a dangerous new Cold War. If a great statesman/woman is one who has successfully managed at least one momentous world event, then his handling of the end of the Cold War alone puts him into this category.
Desert Storm
Interpreted as a singular U.S. and international coalition response to a violation of the territorial sovereignty of one nation by another—and in spite of later unintended consequences—Desert Shield/Storm was a strategic, operational, and tactical work of art: President Bush gave fair warning (admittedly risky) to allow the aggressor a chance to pull back and reverse course, masterfully sought and got an international mandate and then congressional approval, built a coalition, amassed his forces, went in with overwhelming force and firepower, achieved the goals of the mandate, got the hell out. But the success or failure of the “Hundred-Hour War” depends on whether it is weighed as a geopolitical “police action” or as just another episode of U.S. adventurism in the Near East (that led to much greater involvement and disasters), or as some kind of hybrid.
As a stand-alone event then, the campaign was “textbook,” but then in history there is no such thing as a completely discrete event. Can the operational success of Desert Storm be separated from what others see as a more checkered geopolitical legacy? Can a success born of the felt necessities of the times in a theater of combat be tarnished by later, unforeseen developments? Was the “overwhelming force” and overkill of the Powell Doctrine (which could equally be called the Napoleon, Grant, MacArthur, or LeMay Doctrine) a preface to the “Shock and Awe” of his son’s war in the region? Was his calculated restriction of press access in a war zone a precursor to later and even more propagandistic wars with even less independent press coverage?
Just as history never happens for a single reason, no victory is truly singular, pure, and unalloyed. Twenty-six years on, I realize that my rosy construction of what has since become known as the First Gulf War (or the Second Iraq War in the sequence proffered by Andrew Bacevich) is not shared by all historians. Questions remain: was Saddam able to invade Kuwait because Bush and his team were distracted by momentous events in Europe? Was the Iraqi invasion merely a temporary punitive expedition that could have been prevented if Kuwait hadn’t aggressively undercut Iraqi oil profits? Would Hussein have withdrawn his forces on his own after sufficiently making his point? Was April Glaspie speaking directly for President Bush or Secretary Baker when she met with the Iraqi leader on July 25, 1990? War is a failure of policy, and could the events leading up to the invasion (including public comments made by Baker’s spokesperson, Margaret Tutwiler) have been seen by the Iraqis as a green light, much as the North Koreans may have construed Acheson’s “Defensive Perimeter” speech to the National Press Club in early 1950? (See Bartholomew Sparrow, The Strategist: Brent Scowcroft and the Call of National Security, 420-421.)
Some historians have been more critical in their “big picture” assessments of Desert Storm, claiming that when placed in the broader context of an almost four-decade long American war for the greater Middle East, this was just another chapter in a series of misguided escalations (See generally Bacevich, America’s War for the Greater Middle East: A Military History). In this construction too, the war planners had not decapitated the serpent and had left Hussein’s most valuable asset—the Republican Guard—mostly intact to fight another day against an unsupported American ally whom Mr. Bush had encouraged to rise up, the Iraqi Kurds (as well as Shiites).
While some of these points are still open questions, the mandate of the U.N. Security Council resolution did not include taking out Hussein. In light of what happened after 2003, when the U.S. military did topple the regime, Bush I and his planners seem all the more sensible. Moreover the “Highway of Death” was beginning to look like just that—a traffic jam of gratuitous murder—laser-guided target practice, a precision-guided turkey shoot against a foe unable to defend himself, much less fight back. With the Korean War as a historical example, Scowcroft was cognizant of the dangers implicit in changing or exceeding the purely military goals of a limited mandate in the face of apparent easy victory. Having met the stated war aims, Powell and Scowcroft both advocated ceasing the attack, as did Dick Cheney. (See Sparrow, The Strategist: Brent Scowcroft and the Call of National Security, 414-415.)
When second-guessed about why the U.S. did not “finish the job,” advisors to the elder Bush answered with now haunting and even prophetic rhetorical questions about the wisdom of putting U.S. servicemen between Sunnis and Shiites (James Baker’s later observation about the war in the Balkans that “We don’t have a dog in that fight” seems to have applied equally to internal Iraqi affairs). Besides, it would have made no sense to remove a powerful secular counterbalance to Iran, thus making them the de facto regional hegemon. Did the U.S. “abandon” Iraq while on the verge of “saving” it? Should the U.S. have “stayed” (whatever that means) in Iraq in 1991? My takeaway from the history of outsiders in the Middle East is that the only thing more perilous than “abandoning” a fight in the region once apparent victory is secured, is to continue fighting, and that once in, there is no better time to get out than the soonest possible moment. The history of U.S. adventures in Iraq since 2003—the Neocon legacy of occupation and nation-building—speaks for itself.
Bush’s humanitarian commitment of American forces to the chaos of Somalia in the waning days of his administration still baffles realists and seems to have honored Bush’s own principles in the breach. It makes no sense. One can claim that it was purely a temporary measure that grew under the new administrations, but it is still hard to square with the rest of Bush’s foreign policy.
Of course there were other successes and failures of a lesser nature: high-handedness in Central America that included a justified but excessive invasion of Panama. The careful realist must also weigh his masterful handling of the demise of the Soviet Union against what looks like a modest and principled kind of economic globalization and what appears to be a kind of self-consciously benevolent imperialism: the United States as the good cop on the world beat. The subsequent catastrophic history of neoliberal globalization and interventionism has cast these budding tendencies in a more sobering light.
Domestic Policy and Politics
Domestically, Bush’s generous instincts came to the fore early on and were foreshadowed in the Emersonian “Thousand Points of Light” of his nomination acceptance address, and he did more at home than most people realize. He gave us the Americans with Disabilities Act (ADA)—one of the most successful pieces of social legislation of recent decades—the modest Civil Rights Act of 1991, the 1990 amendments to the Clean Air Act, a semiautomatic rifle ban, successfully handled the consequences of the Savings and Loan Crisis, and of course he put David Souter on the High Court. Perhaps he did not know how to deal with the recession of 1991. My reading is that the recession was an ominous initial rumbling of things to come, as American workers increasingly became victims of economic globalization. Some historians believe that the good years of the 1990s owe a fair amount to Bush’s economic policies, including the budget agreement of 1990, which reduced the deficit. Bush fatefully underestimated the rise of the far right in his own party, making his plea for a “kinder, gentler” nation and political milieu a tragic nonstarter. His catch phrase from the 1980 campaign characterizing the absurdity of supply-side economics as “voodoo economics” was spot-on, but was another apostasy that true-believers in his own party were unlikely to forget or forgive. Certainly he did not do enough to address the AIDS crisis.
It is shocking that a man of Bush’s sensibilities and personal qualities conducted the presidential campaign of 1988 the way he (or rather his handlers, like Lee Atwater) did. Against a second-rater like Michael Dukakis, the “go low” approach now seems like gross overkill—a kind of political “Highway of Death”—that was beneath the dignity of an honorable man. On a similar note, it is hard to understand his occasional hardball tactics, like the bogus fight he picked with Dan Rather on live television at the urging of handlers. Perhaps it was to counter the charges of his being a “wimp.”
Again, this approach seems to have been completely unnecessary—overreaction urged by politicos and consultants from the darker reaches of the campaign arts. How is it even possible that a playground epithet like wimp would even find traction against a man of Bush’s demonstrated courage and service to country? All anybody had to do was remind people that he was the youngest navy pilot in the Second World War, that he had enlisted on the first day he legally could, and that he was fished out of the Pacific after being shot down in a Grumman TBF Avenger torpedo bomber (but then Bush embodied an ethos of aristocratic modesty and the idea that one did not talk about oneself, much less brag). By comparison, the rugged Ronald Reagan never went anywhere near a combat zone (as a documentary on The American Experience observed, “Bush was everything Reagan pretended to be”: a war hero, college athlete, and a family man whose children loved him unconditionally). I’m not sure if Clinton ever made any pretense of fortitude.
We ask our presidents to succeed in two antithetical roles: that of politician and that of statesman, and in recent years, the former has triumphed seemingly at the expense of the latter. Style has mostly trumped substance, something that underscores a flaw in our system and what it has become. As casualties of reelection campaigns against charismatic opponents, Gerald Ford and “Bush 41” might be metaphors for this flaw and of our time, and a lesson emphasizing the fine distinction that a single-term statesman is generally superior and preferable to a more popular two-term politician. Reagan, Clinton, Bush 43, and Obama were all truly great politicians and unless you were specifically against them or their policies, there was a reasonable chance that they could win you over on one point or another with style, communication skills, and magnetic charm. That said, and unlike the senior Bush, I would contend that there is not a genuine statesman in that group.
It is difficult for any president to achieve greatness in either foreign or domestic affairs, much less in both (as a latter day New Dealer, I would say that FDR may have been the last to master both). George Herbert Walker Bush was a good foreign policy president and not bad overall—a leader at the heart of a competent administration. By all accounts, he was a good man overall, and the people who knew him are heaping adjectives on his memory: dignity, humility, honor, courage, class—a good president and a notable American public servant. But ultimately personal goodness has little to do with the benevolence or harm of policy; to paraphrase Forrest Gump, good is what good does (some policy monsters are personally charming and even decent while some insufferable leaders produce great and high-minded policy), and as aging news transforms with greater circumspection into history, the jury is still out on much of the complex legacy of Bush I.
Subsequent events have cast doubt on what seemed at the time to be spotless successes, and realistic gestures now seem more like a preface to less restrained economic internationalism and military adventurism. Still, I am willing to give the first President Bush the benefit of the doubt on interpretations of events still in flux. Just in writing this, and given what has happened in American politics and policy ever since, I have the sinking feeling that we will not see his like again for a long time, if ever.
Geoffrey Parker’s Global Crisis
Book Review
Geoffrey Parker, Global Crisis: War, Climate Change and Catastrophe in the Seventeenth Century, Yale University Press, 2014, 904 pages.
Reviewed by Michael F. Duggan
This book is about a time of climate disasters, never-ending wars, economic globalism complete with mass human migration, imbalances, and subsequent social strife—a period characterized by unprecedented scientific advances and backward superstition. In other words, it is a world survey of the web of events known as the 17th century. Although I bought it in paperback a number of years ago, I recently found a mint condition hardback copy of this magisterial tome by the master historian Geoffrey Parker (Cambridge, St. Andrews, Yale, etc.), and felt compelled to write about it, however briefly. I am drawn to this century because of its contrasts, as the one that straddles the transition from the Early Modern period to the Age of Reason and Enlightenment and more broadly marks the final shift from Medieval to Modern (even before Salem colonists hanged their neighbors suspected of witchcraft, Leibniz and Newton had independently begun to formulate the calculus).
In 1959, British historian H. R. Trevor-Roper presented the macro-historical thesis of the “General Crisis,” the interpretive premise characterizing the 17th century as an overarching series of crises, from horrible regional wars (e.g. the Eighty Years War, the Thirty Years War, the English Civil War and its roots and spillover into Scotland and Ireland) and rebellions, to widespread human migration and the subsequent spread of disease, any number of epidemics, global climate change, and a long litany of some of the most extreme weather events in recorded history (e.g. the Little Ice Age). When I was in graduate school, I had intuited this premise on my own (perhaps after reading Barbara Tuchman’s A Distant Mirror, about the “Calamitous 14th Century”), but was hardly surprised to discover that Trevor-Roper had scooped me by 40 years.
Parker has taken this thesis and generalized it in detail beyond Europe to encompass the entire world, including catastrophic events and change throughout the Far East, Russia, China, India, Persia, the greater Middle East, Africa, and the Americas. Others, including Trevor-Roper himself, also saw these in terms of global trends and scope, but, to my knowledge, Parker’s book is the fullest and most fleshed-out treatment. His work “seeks to link the climatologists’ Little Ice Age with the historians’ General crisis—and to do so without ‘painting bull’s eyes around bullet holes.’” It is academic history, but is well-written and readable for a general audience. It is well-researched history on a grand scale. For Western historians such as myself, the broader perspective is eye-opening and suggestive of human commonality rather than divergence. We are all a part of an invasive plague species and we are all victims of events, nature, and our own nature.
Although I am generally skeptical of macro interpretive premises that try to explain or unify everything that happened during a period under a single premise—i.e. the more a theory or thesis tries to explain, the more interesting and important it is, but the weaker it usually is as a theory and therefore the less it explains (call it a Heisenberg principle of historiography)—this one is on to something, at least as description. The question(s), I suppose, is the degree to which the events of this century, overlapping or sequential in both geography and time, are interconnected or emerge from common causes, or if they were a convergence of factors both related and discrete, or rather is the century a crisis, a sum of crises, or both? Correlation famously does not establish causation. To those who see human history in the broadest of terms—in terms of the environment, of humankind as a singular prong of biology, and therefore of human history as an endlessly interesting and increasingly tragic chapter of natural history—this book will be of special interest.
In college I was ambivalent about the 17th century. More than most centuries, it was an “in between times” period, neither one thing nor the other. All periods are artificial and intermediary, but the 17th century seemed especially artificial given the fundamental advances and shifts in intellectual history that occurred in Europe between 1601 and 1700. In the West, the 18th century seemed like a coherent, unified world, the Newtonian paradigm. But the 17th century was a demarcation, a caldron from which the world of the Enlightenment and the Age of Reason emerged. The 18th century was the sum and creation of the previous century, a world unified under Bacon, Descartes, Spinoza, Leibniz, Locke, Newton, and many others, and today I find the earlier period to be the more interesting of the two. This book only feeds this belief.
As someone who thinks that one of the most important and productive uses of history is to inform policy and politics, it is apparent that the author intends this book to be topical—a wide-angle yet detailed survey of another time, for our time. In general the 17th century is a good tonic for those who believe that history is all sunshine and light or that human progress (such as it is) is all a rising road. It also serves as a cautionary example of what may be coming in our own time, and a reminder that humanity is a subset of the planet and its physical systems. A magnum opus of breathtaking breadth and ambition, this book is certainly worth looking at (don’t be put off by its thickness; you can pick it up at any point and read a chapter here or there).
Fat Man and Little Boy
I wrote this for the 70th Anniversary of the atomic bombings of Japan. It appeared in an anthology at Georgetown University. This is taken from a late draft, but the editing is still a bit rough.
Roads Taken and not Taken: Thoughts on “Little Boy” and “Fat Man” Plus-70
By Michael F. Duggan
We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad Gita… “I am become Death, the destroyer of worlds.”
-Robert Oppenheimer
When I was in graduate school, I came to characterize perspectives on the decision to drop the atomic bombs on Japan into three categories.
The first was the “Veterans Argument”—that the dropping of the bombs was an affirmative good. As this name implies, it was a position embraced by some World War Two veterans and others who had lived through the war years, and it seems to have been based on lingering sensibilities of the period. It was also based on the view that the rapid end of the war had saved many lives—including their own, in many cases—and that victory had ended an aggressive and pernicious regime. It also seemed tinged with an unapologetic sense of vengeance and righteousness cloaked as simple justice. They had attacked us, after all—Remember Pearl Harbor, the great sneak attack? More positively, supporters of this position would sometimes cite the fact of Japan’s subsequent success as a kind of moral justification for dropping the bombs.
Although some of the implications of this perspective cannot be discounted, I tended to reject it; no matter what one thinks of Imperial Japan, the killing of more than 150 thousand civilians can never be an intrinsic good. Besides, there is something suspect about the moral justification of horrible deeds by citing all of the good that came after them, even if true.1
I had begun my doctorate in history a couple of years after the fiftieth anniversary of the dropping of the Hiroshima and Nagasaki bombs, and by then there had been a wave of “revisionist” history condemning the bombings as intrinsically bad, as inhumane and unnecessary—as “technological band-aids” to end a hard and bitter conflict. The argument was that by the summer of 1945, Japan was on the ropes—finished—and would have capitulated within days or weeks even without the bombs. Although I had friends who subscribed to this position, I thought that it was unrealistic in that it interjected idealistic sensibilities and considerations that seemed unhistorical to the period and the “felt necessities of the times.” Was it realistic to project 1990s moral observations onto people a half-century earlier in the midst of the most destructive war in human history?
This view was also associated with a well-publicized incident of vandalism against the actual Enola Gay at a Smithsonian exhibit, which ignited a controversy that forced the museum to change its interpretive text to tepid factual neutrality.
And then there was a kind of middle-way argument—a watered-down version of the first—asserting that the dropping of the bombs, although not intrinsically good, was the best of possible options. The other primary option was a two-phased air-sea-land invasion of the main islands of Japan: Operation Olympic, scheduled to begin on November 1, 1945, and Operation Coronet, scheduled for early March 1946 (the two operations were subsumed under the name Operation Downfall). I knew people whose fathers and grandfathers, still living, had been in WWII and believed with good reason that they would have been killed fighting in Japan. It was argued that the American casualties for the war—approximately 294,000 combat deaths—would have been multiplied two- or threefold if we had invaded, to say nothing of the additional millions of Japanese civilians who would likely have died resisting. The Okinawa campaign of April-June 1945, with the viciousness and intensity of the combat there and the appalling casualties on both sides, was regarded as a kind of microcosm, a prequel of what an invasion of Japan would be like.2
The idea behind this perspective was one of realism: in a modern total war against a fanatical enemy, one took off the gloves in order to end it as soon as possible. General Curtis LeMay asserted that it was the moral responsibility of all involved to end the war as soon as possible, and that if the bombs ended it by even a single day, then using them was worth the cost.3 One also heard statements like “what would have happened to an American president who had a tool that could have ended the war, but chose not to use it, and by doing so doubled our casualties for the war?” It was simple, if ghastly, math: the bombs would cost less in terms of human life than an invasion. With an instinct toward the moderate and sensible middle, this was the line I took.
In graduate school, I devoured biographies and histories of the Wise Men of the World War Two/Cold War era foreign policy establishment—Bohlen, Harriman, Hopkins, Lovett, Marshall, McCloy, Stimson, and of course, George Kennan. When I read Kai Bird’s biography, The Chairman: John J. McCloy and the Making of the American Establishment, I was surprised by some of the back stories and the wrangling among the policy makers behind the decision to drop the bombs.4 It also came as a surprise that John McCloy (among others) had in fact vigorously opposed the dropping of the atomic bombs, perhaps with very good reason.
Assistant Secretary of War John McCloy was nobody’s idea of a dove or a pushover. Along with his legendary policy successes during and after WWII, he was controversial for ordering the internment of Japanese Americans and for not bombing the death camps in occupied Europe, because doing so would divert resources from the war effort and victory. He was also the American High Commissioner for occupied Germany after the war and had kept fairly prominent Nazis in their jobs and kept out of prison German industrialists who had played ball with the Nazi regime. Notably, in the 1960s, he was one of the only people on record who flatly stood up to President Lyndon Johnson after getting the strong-armed “Johnson treatment” and was not ruined by it. And yet this tough-guy hawk was dovish on the issue of dropping the atomic bombs.
The story goes like this: In April and May, 1945, there were indications that the Japanese were seeking a settled end to the war via diplomatic channels in Switzerland and through communications with the Soviets—something that was corroborated by U.S. intelligence.5 Armed with this knowledge, McCloy approached his boss, Secretary of War and arguably father of the modern U.S. foreign policy establishment, “Colonel” Henry L. Stimson. McCloy told Stimson that the new and more moderate Japanese Prime Minister, Kantaro Suzuki, and his cabinet were looking for a face-saving way to end the war. The United States was demanding an unconditional surrender, and Suzuki indicated that if this language was modified and the Emperor was allowed to remain as a figurehead under a constitutional democracy, Japan would surrender.
Among American officials, the debates on options for ending the war included many of the prominent players, policy makers and military men like General George C. Marshall, Admiral Leahy and the Chiefs of Staff, former American ambassador to Japan Joseph Grew, Robert Oppenheimer (the principal creator of the bomb), and his Scientific Advisory Panel, to name but a few. It also included President Harry Truman. Among the options discussed was whether or not to give the Japanese “fair warning” and whether the yet untested bomb should be demonstrated in plain view of the enemy. There were also considerations of deterring the Soviets, who had agreed at Yalta to enter the war against Japan, from additional East Asian territorial ambitions. Although it was apparent to Grew and McCloy that Japan was looking for a way out, therefore making an invasion unnecessary, the general assumption was that if the atomic bombs were functional, they should be used without warning.
This was the recommendation of the Interim Committee, which included soon-to-be Secretary of State James Byrnes, and which was presented to Truman by Stimson on June 6.6 McCloy disagreed with these recommendations and cornered Stimson in his own house on June 17th. Truman would be meeting with the Chiefs of Staff the following day on the question of invasion, and McCloy implored Stimson to make the case that the end of the war was days or weeks away and that an invasion would be unnecessary. If the United States merely modified the language of unconditional surrender and allowed the Emperor to remain, the Japanese would surrender under de facto unconditional terms. If the Japanese did not capitulate after the changes were made and fair warning was given, the option of dropping the bombs would still be available. “We should have our heads examined if we don’t consider a political solution,” McCloy said. As it turned out, he would accompany Stimson to the meeting with Truman and the Chiefs.
Bird notes that the meeting with Truman and the Chiefs was dominated by Marshall and focused almost exclusively on military considerations.7 As Bird writes, “[e]ven Stimson seemed resigned now to the invasion plans, despite the concession he had made the previous evening to McCloy’s views. The most he could muster was a vague comment on the possible existence of a peace faction among the Japanese populace.” The meeting was breaking up when Truman said, “No one is leaving this meeting without committing himself. McCloy, you haven’t said anything. What is your view?” McCloy shot a quick glance at Stimson, who said to him, “[s]ay what you feel about it.” McCloy had the opening he needed.8
McCloy essentially repeated the argument he had made to Stimson the night before. He also noted that a negotiated peace with Japan would preclude the need for Soviet assistance, therefore depriving them of any excuse of an East Asian land grab. He also committed a faux pas by actually mentioning the bomb by name and suggesting that it be demonstrated to the Japanese. Truman responded favorably, saying “That’s exactly what I’ve been wanting to explore… You go down to Jimmy Byrnes and talk to him about it.”9 As Bird points out,
[b]y speaking the unspoken, McCloy had dramatically altered the terms of the debate. Now it was no longer a question of invasion. What had been a dormant but implicit option now became explicit. The soon-to-be tested bomb would end the war, with or without warning. And the war might end before the bomb was ready.” But increasingly the dominant point of view was that the idea of an invasion had been scrapped and that, in the absence of a Japanese surrender, the bombs would be dropped.10
After another meeting with what was called the Committee of Three, most of the main players agreed “that a modest change in the surrender terms might soon end the war” and that “Japan [would be] susceptible to reason.”11 Stimson put McCloy to work at changing the terms of surrender, specifically the language of Paragraph 12, which referenced the terms that the Japanese had found unacceptable. McCloy did not mention the atomic bomb by name. By now, however, Truman was gravitating toward Byrnes’s position of using the bombs.
After meeting with the president on July 3, Stimson and McCloy “solicited a reluctant invitation” to attend the Potsdam Conference, but instead of traveling with the president’s entourage aboard the USS Augusta, they secured their own travel arrangements to Germany. Newly sworn-in Secretary of State James Byrnes would sail with the president and was a part of his onboard poker group.12 The rest, as they say, is history.
At Potsdam, Truman was told by the Soviets that Japan was once again sending out feelers for a political resolution. Truman told Stalin to stall them for time, while reasserting the demand for unconditional surrender in a speech in which he buried the existence of the bombs in language so vague that it is likely the Japanese leaders did not pick up on the implications.13 Japan backed away. Truman’s actions seem to suggest that, under Byrnes’s influence (and perhaps independent of it), he had made up his mind to drop the bombs and wanted to sabotage any possibility of a political settlement. As Bird notes, “Byrnes and Truman were isolated in their position; they were rejecting a plan to end the war that had been endorsed by virtually all of their advisors.”14 Byrnes’s position had been adopted by the president over the political option of McCloy. As Truman sailed for home on August 6, 1945, he received word that the uranium bomb nicknamed “Little Boy” had been dropped on Hiroshima with the message “Big bomb dropped on Hiroshima August 5 at 7:15 P.M. Washington time. First reports indicate complete success which was even more conspicuous than earlier test.” Truman characterized the attack as “The greatest thing in history.”15 Three days later the plutonium bomb “Fat Man” fell on Nagasaki. The Soviets entered the fighting against Japan on August 8. The war was over.
Given Byrnes’s reputation as a political operative of rigid temperament and often questionable judgment, one can only wonder if the dropping of the bombs was purely gratuitous. Did he and the president believe that the American people wanted and deserved their pound of flesh almost four years after Pearl Harbor and some of the hardest combat ever fought by U.S. servicemen?16 Of course there were also the inevitable questions of “what would Roosevelt have done?”
With events safely fixed in the past, historians tend to dislike messy and problematic counterfactuals, and one can only wonder if McCloy’s plan for a negotiated peace would have worked. One of the most constructive uses of history is to inform present-day policy decisions through the examination of what has worked and what has not worked in the past, and why. Even so, the vexing—haunting—queries about the necessity of dropping the atomic bombs remain open questions. The possibility of a political resolution to the war seems at the very least to have been plausible. The Japanese probably would have surrendered by November, perhaps considerably earlier, as the result of negotiations, but there is no way to tell for certain.17 As it was, in August 1945, Truman decided to allow the Emperor to stay on anyway, and our generous reconstruction policies turned Japan (and Germany) into a miracle of representative liberal democracy and enlightened capitalism.
Even if moderate elements in the Japanese government had been able to arrange an effective surrender, there is no telling whether the Japanese military, and especially the army, would have gone along with it; as it was—and after two atomic bombs had leveled two entire cities—some members of the Japanese army still preferred self-destruction over capitulation, and a few even attempted a coup against the Emperor to preempt his surrender speech to the Japanese People.
This much is certain: our enemies in the most costly war in human history have now been close allies for seven decades (as the old joke goes, if the United States had lost WWII, we would now be driving Japanese and German cars). Likewise our Cold War enemy, the Russians, in spite of much Western tampering within their sphere of influence, now pose no real threat to us. But the bomb remains.
Knowledge may be lost, but an idea cannot be un-invented; as soon as a human being put arrow to bow, the world was forever changed. The bomb remains. It remains in great numbers in at least nine nations and counting, in vastly more powerful forms (the hydrogen bomb) with vastly more sophisticated means of delivery. It is impossible to say whether the development and use of the atomic bomb was and is categorically bad, but it remains for us a permanent Sword of Damocles and the nuclear “secret” is the knowledge of Prometheus. It is now a fairly old technology, the same vintage as a ’46 Buick.
The bombings of Hiroshima and Nagasaki broke the ice about the use of these weapons in combat and will forever live as a precedent for anyone else who may use them. The United States is frequently judgmental of the actions and motives of other nations, and yet the U.S. remains the only nation to have used nuclear weapons in war. As with so many people in 1945 and ever since, Stimson and Oppenheimer both recognized that the atomic bomb had changed everything. More than any temporal regime, living or dead, it and its progeny remain a permanent enemy of mankind.
Notes
- For a discussion of the moral justification in regard to dropping the atomic bombs, see John Gray, Black Mass, New York: Farrar, Straus and Giroux, 2007, pp. 190-191.
- For an account of the fighting on Okinawa, see Eugene Sledge, With the Old Breed, New York: Random House, 1981.
- LeMay expresses this sentiment in an interview he gave for the 1973 documentary series, The World at War.
- See generally Chapter 12, “Hiroshima.” Kai Bird, The Chairman: John J. McCloy and the Making of the American Establishment, New York: Simon and Schuster, 1992, pp. 240-268.
- Bird, p. 242.
- Bird, p. 244.
- Bird, p. 245.
- Bird, p. 245.
- Bird, p. 246.
- Bird, p. 250.
- Bird, pp. 247-248.
- Bird, pp. 249-250. Averell Harriman and Elie Abel, Special Envoy to Churchill and Stalin, 1941-1946, New York: Random House, 1975, 493. Bird, p. 251. It should be noted that most of the top American military commanders opposed dropping the atomic bombs on Japan. As Daniel Ellsberg observes: “The judgment that the bomb had not been necessary for victory—without invasion—was later expressed by Generals Eisenhower, MacArthur, and Arnold, as well as Admirals Leahy, King, Nimitz, and Halsey. (Eisenhower and Halsey also shared Leahy’s view that it was morally reprehensible.) In other words, seven out of eight officers of five-star rank in the U.S. Armed Forces in 1945 believed that the bomb was not necessary to avert invasion (that is, all but General Marshall, Chief of Staff of the Army, who alone believed that an invasion might have been necessary).” [Emphasis added by Ellsberg]. See Daniel Ellsberg, The Doomsday Machine, New York: Bloomsbury, 2017, pp. 262-263. As it happened, Eisenhower was having dinner with Stimson when the Secretary of War received the cable saying that the Hiroshima bomb had been dropped and that it had been successful. “Stimson asked the General his opinion and Eisenhower replied that he was against it on two counts. First, the Japanese were ready to surrender and it wasn’t necessary to hit them with that awful thing. Second, I hate to see our country be the first to use such a weapon. Well… the old gentleman got furious. I can see how he would. After all, it had been his responsibility to push for all of the expenditures to develop the bomb, which of course he had the right to do, and was right to do.” See John Newhouse, War and Peace in the Nuclear Age, New York: Alfred A. Knopf, 1989, p. 47. Newhouse also points out that there were numerous political and budgetary considerations related to the opinions of the various players involved in developing and dropping the bombs. One can only hope that budgetary responsibility/culpability did not (or does not) drive events.
- Harriman, p. 293.
- For his own published account of this period, see James F. Byrnes, Speaking Frankly, New York: Harper & Brothers, 1947.
- See Robert Dallek, The Lost Peace, New York: Harper Collins, 2010, p. 128. Dallek makes this point, basing it on the Strategic Bombing Survey as well as the reports of Truman’s own special envoy to Japan after the war in October 1945.
Daniel Ellsberg
Book Review
Daniel Ellsberg, The Doomsday Machine: Confessions of a Nuclear War Planner, New York: Bloomsbury, 2017, 420 pages, $30.00 (hardcover).
In the Shadow of the Mushroom Cloud (or: Bigger than the Pentagon Papers)
Reviewed by Michael F. Duggan
Before many centuries more… science may have the existence of mankind in its power, and the human race commit suicide by blowing up the world.
-Henry Adams
As it turns out, Stanley Kubrick got it mostly right.
“We came out into the afternoon sunlight, dazed by the light and the film [Dr. Strangelove] both agreeing that what we had just seen was essentially a documentary. (We didn’t yet know—nor did SAC—that existing strategic operational plans, whether for first strike or retaliation, constituted a literal Doomsday Machine, as in the film.)” Daniel Ellsberg, The Doomsday Machine, p. 65.
You should read this book, but not at bedtime.
This was the story Daniel Ellsberg, a nuclear strategist in the late 1950s and 1960s, wanted to tell, but the fact that “Vietnam is where the bombs are falling right now [1969]” forced his hand and diverted his attention elsewhere. The overarching theme of his recent book—the overwhelming feeling one comes away with—is that it is a miracle or a fortuitous aberration of probability that the United States and Soviet Union did not blow up the world during the Cold War. What is more, the risk is still in place and the threat of a nuclear war is greater than ever. A moral of the book is that wholesale war against civilians, characterized by strategic terror bombing and reaching its apex in the omnicidal possibilities of nuclear war, is not only immoral but a dubious means of winning wars. It is likely the grandest expression of the irrationality of war and of our aggressiveness as an animal.
In a sense, Ellsberg is a latter-day Siegfried Sassoon—the true believer-turned-apostate in the name of humanity, the patriot with a greater commitment to the truth, the man who saw insanity and folly and chose sense and sanity. Of course his name will always be associated with the Pentagon Papers that exposed the true motives of the war in Vietnam—a rivulet that contributed to the deluge that eventually forced President Nixon from office. He is arguably the prototype of the modern whistle-blower. The present book tells an even bigger story, and one that its author has waited a half-century to tell.
Ellsberg came close to telling this story at the time, but the thousands of pages he copied on nuclear strategy were lost in an almost comical sequence of events, including the intervention of a tropical storm, which by his own admission likely spared him decades of hard prison time. He can now rely on his own memory, corroborated by material declassified over the years, without fear of breaking the law. Although much of the material here was previously known by historians of the Cold War, it is still likely to shock when presented so starkly by a person so intimately connected with the topic.
Ellsberg begins by recalling that as a thirteen-year-old, he and his ninth-grade friends immediately latched on to the inherently problematic, unavoidable, and insurmountable implications of the mere existence of super-weapons that could destroy entire cities in a single blow, and of nations armed with such technology. These high school freshmen hit upon conclusions usually associated with physicists working on the Manhattan Project and epitomized by Robert Oppenheimer’s chilling paraphrase of the Bhagavad Gita: “I have become Death, the destroyer of worlds.”
Ellsberg’s social studies instructor, Bradley Patterson, was teaching the concept of “cultural lag,” the idea that technology runs ahead of the cultural, social, and political ability to handle it—i.e., “to control it wisely, ethically, prudently.” In the fall of 1944, the teacher had his students consider the idea of nuclear weapons (articles on the possibility of a uranium-235 bomb had already appeared in the Saturday Evening Post and other magazines) as a kind of ultimate or paragon example of this concept. The students were given a week to write an essay on the implications of such a weapon.
“As I remember, everybody in the class had arrived at much the same judgment. It seemed pretty obvious: the existence of such a bomb would be bad for humanity. Mankind could not handle such a destructive force. It could not be safely controlled. The power would be “abused”—that is, used dangerously, with terrible consequences… A bomb like that was just too powerful.”
The first part of this book, “The Bomb and I,” deals with the ins and outs, the subtleties, caveats, conundrums, hypotheticals and counter-hypotheticals of the game-theory logic imposed by nuclear weapons on strategists during the Cold War. It is a personal history of the implementation of nuclear strategy, unsettling breaches in the system, near accidents, and the potential for global thermonuclear catastrophe in the Manichean world of U.S.-Soviet relations. It is Ellsberg’s own story as a whiz kid, a consultant for the Air Force’s RAND (Research ANd Development) Corporation—its in-house think tank. As with the Pentagon Papers, Ellsberg’s purpose is to present what he saw versus the official line.
Summarizing in his introduction, Ellsberg states eight realities of American nuclear strategy that set the theme of the book. These are:
- “The basic elements of American readiness for nuclear war remain today what they were almost sixty years ago: Thousands of nuclear weapons remain on hair-trigger alert, aimed mainly at Russian targets,” and the declared official rationale of deterring “an aggressive Russian first strike” is a “deliberate deception.” According to Ellsberg, “[d]eterring a surprise nuclear attack has never been the only or even the primary purpose of our plans and preparations.” Rather, “[t]he nature, scale, and posture of our strategic nuclear forces has always been shaped around the requirements of quite different purposes: to attempt to limit the damage to the United States from Soviet or Russian retaliation to a U.S. first strike against the USSR or Russia. This capability is, in particular, intended to strengthen the credibility of U.S. threats to initiate limited nuclear attacks, or escalate them—U.S. threats of ‘first use’—to prevail in regional, initially non-nuclear conflicts involving Soviet or Russian forces or their allies.”
- “The required U.S. strategic capabilities have always been for a first-strike force,” neither a surprise attack nor one “with an aim of striking ‘second’ under any circumstances, if that could be avoided by preemption.” In other words, “[t]hough officially denied, preemptive ‘launch on warning’ (LOW)—either on tactical warning of an incoming attack or a strategic warning that nuclear escalation is probably impending—has always been at the heart of our strategic alert.”
- Contrary to popular belief, nuclear weapons have been used “dozens of times in ‘crises’” since their actual combat use over Hiroshima and Nagasaki. This has been done “mostly in secret from the American people (though not from adversaries).” They have been used in the precise way that a gun is used when it is pointed at someone in a confrontation, whether or not the trigger is pulled. To get one’s way without pulling the trigger is a major purpose for owning a gun.
- “Posing as it does the threat of nuclear attack by the United States to every state that might potentially be in conflict with us (like North Korea), this persistent rejection by the United States of a no-first-use commitment has always precluded an effective nonproliferation campaign.”
- “With respect to deliberate, authorized U.S. strategic attacks, the system has always been designed to be triggered by a far wider range of events than the public has ever imagined. Moreover, the hand authorized to pull the trigger on nuclear forces has never been exclusively limited to the president, nor even his highest military officials.” “Dead hand” systems of delegation of nuclear launch authority probably exist in the systems of all nuclear powers, most likely including North Korea.
- During the Cuban Missile Crisis, “events spiraled out of control, coming within a handbreadth of triggering our plans for general nuclear war” (and we should bear in mind that this was a crisis presided over by two rational leaders looking for a way out of the standoff).
- “The strategic nuclear system is more prone to false alarms, accidents, and unauthorized launches than the public (and even most high officials) has ever been aware.” Ellsberg notes that false alarms did in fact occur in 1970, 1980, 1983, and 1995.
- “Potentially catastrophic dangers such as these have been systematically concealed from the public.” Not even the Joint Chiefs of Staff realized until 1983 that the nuclear winter that would follow a general nuclear war between the U.S. and the U.S.S.R. would probably kill every person on the planet.
He concludes the introduction by observing that “[i]n sum, most aspects of U.S. nuclear planning and force readiness that became known to me half a century ago still exist today as prone to catastrophe as ever but on a scale, as known to environmental scientists, looming vastly larger than was understood then” and more economically, “[t]ragically, I believe that nothing has fundamentally changed.”
It is hard to know where to begin with this book (the eight points above should give the reader a fair, generalized sample to chew on). It is fascinating history, and, like a hero of fiction, the young Ellsberg always seems to be in the center of things. Following Harvard and a three-year hitch as a Marine Corps infantry officer, he is thrown in as a consultant with a brilliant generation of whiz kids at RAND. From there he recounts episodes including an eye-opening interview with a squadron leader of nuclear-armed aircraft on the front lines of the Cold War, hearing a confession of alleged pre-appointed nuclear authority by an Air Force theater commander, and discussions with other high-level generals, including the cigar-chomping Curtis LeMay himself. He writes a speech intended for President Kennedy that meets with McNamara’s approval but which is given by Deputy Secretary of Defense Roswell Gilpatric instead. He warns the haughty incoming National Security Adviser, McGeorge Bundy, about the numerous lapses in the system, including the usurpation of the chain of command and undermining of civilian control.
With academic and military credentials, Ellsberg had a Zelig-like knack for being at the right place at the right time. He was well qualified to be both a detective of chinks in the system and the deliverer of often shocking messages, but to no avail. The lesson seems to be that even the planners of nuclear strategy were just as much captive to the self-directed logic of what was seen as a bipolar world as the unsuspecting rest of the nation, and just as helpless to do anything about it. Although nuclear war is averted by human agency during the Cuban Missile Crisis, the larger game continues and seems mostly immune from the efforts of people who see the madness.
Although the book is well structured—and it is better to read it for oneself rather than have a reviewer recount it chapter by chapter—one comes away with a myriad of troubling facts and imagery, of things generally unknown at the time (and still unknown by most Americans): drummed-up fictions like the missile gap and bogus theatrical props like the nuclear “football”. One is initially shocked and then overwhelmed and eventually numbed by a sequence of revelations like the inevitability of pre-approved delegation of nuclear launch authority, the daily breakdown of communications between Washington and bases in the Pacific, how commanders and even pilots circumvented launch codes, how the Chiefs of Staff got around civilian control authority, and how civilian authorities were kept in the dark about nuclear war plans.
One is taken aback at the lack of clarity in the minds of the men who would actually be flying nuclear-armed aircraft about the circumstances under which they might launch an unauthorized attack (e.g., if the last plane in a squadron crashed on takeoff, thus detonating a thermonuclear weapon on its own base, would the pilot of an aircraft that had already taken off assume that the base had been attacked by the Soviets or Chinese and proceed with an attack in what was intended only to be a drill?). It is all a stark reminder of how close we came to blowing up everything and how a Guns of August sequence of events with greater-than-Missiles of October technology is still a very real possibility (his retelling of the now well-known story of how close a Soviet submarine under depth-charge attack from a U.S. ship on the blockade line came to launching a nuclear weapon during the Cuban Missile Crisis is particularly harrowing).
Having grown up in a military family during the Cold War, I learned of the nuclear standoff of superpowers at the tender age of eight or nine. I was of a generation, the more sensitive members of whom could imagine the contrails of ICBMs imposed on clear nighttime skies. While I was working on my doctorate in history, I had read John Lewis Gaddis’s masterful Strategies of Containment, and had come away thinking that both sides had unnecessarily ratcheted up tensions (first with Nitze’s NSC-68 and later with the “New Look” of the Eisenhower years), and that the Cold War was an unnecessarily dangerous and “costly political rivalry.”1 I did not know that, just in surviving the period, the world had in fact won a lengthy sequence of lotteries.
On the one hand, American triumphalists and boosters of their nation’s “victory” in the Cold War (now completely squandered) point to the zero-sum, game-theory logic of deterrence, of Mutual Assured Destruction, and how it apparently worked. The idea, seemingly oxymoronic, is generally attributed to Bernard Brodie and the view that in order to prevent nuclear war, a nation must “be prepared to resort to atomic war” and make it too terrible to be a viable option.2 Making nuclear war mutually suicidal seems to have accomplished this to date. But on the other hand, being in a Mexican standoff with the most destructive weapons ever conceived is hardly an admirable position in which to find oneself, and it is a state of affairs that only has to fail once. Add to this the fact that human beings are naturally aggressive animals, that unhinged leaders come to power from time to time, the role of accidents in history, hair-trigger strategies of first strike, and an ever-increasing nuclear club, and the rational reader of Ellsberg’s book can be excused for wanting to get off the planet.3
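To make the observation that deterrence “only has to fail once” concrete, a rough back-of-the-envelope calculation may help (the annual probability used here is purely illustrative, my own assumption rather than a figure from Ellsberg or anyone else): if the chance of a deterrence failure in any given year is p, the chance of getting through N years without catastrophe is

$$P(\text{no failure in } N \text{ years}) = (1-p)^N, \qquad \text{e.g., } p = 0.01,\; N = 60: \; (0.99)^{60} \approx 0.55.$$

On that hypothetical assumption, even a one-percent annual risk compounded over six decades leaves the odds of survival closer to a coin flip than to certainty, which is roughly what “a lengthy sequence of lotteries” means in plain numbers.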
Part II. History of Bombing Civilians
The second part of the book, “The Road to Doomsday,” is a history of strategic bombing as the natural predecessor to nuclear war. This part is obviously less personal but gives an impressive outline of how we got to where we are in terms of not batting an eye at accepting civilian deaths as “collateral damage” and seeing non-combatants as legitimate targets in war. In some respects, this topic is a later chapter, a continuation of the more general history of the growth of modern total warfare since Napoleon and certainly since the American Civil War. Even so, it is remarkable to compare the unconcealed disgust of commentators like Theodore Roosevelt at the intentional targeting of a (mostly) civilian liner like the Lusitania in 1915 with the casual acceptance of the bombing of entire cities in the Second World War by American political leaders and their constituents.
Indeed, as a child in the 1970s reading of the air campaigns of the Second World War, there was no greater symbol of heroism for me than the gorgeous lines and the all-business armament configuration of the B-17 Flying Fortress (the far more effective and severely aerodynamic B-29 never achieved the same appeal), and the brave men who flew them. To this day, the sight of a B-17 arouses the child in me, although I am certain that Germans who were children in 1943 or 1944 in Hamburg, Munich, or Dresden do not share my affection for this plane.
As a practical matter, it is not clear that wholesale strategic bombing is an effective basis for strategy. Theorists and planners between the wars, like Giulio Douhet, believed that if total war could be brought to the cities and heartlands of an enemy nation, wars could be brought to a quick and decisive conclusion.
As regards Germany, this does not appear to have been the case, as aircraft and tank production continued to increase until the final month or two of the war. In fact, strategic bombing may only have been successful in Europe against oil production and transportation. In Japan, bombing had turned most of the major cities to ashes, and yet American war planners still feared such fierce resistance by the civilian population that they felt justified in dropping two atomic bombs. Even here it is not clear whether the bombs were the decisive factor in ending the war in the Pacific or whether it was the simultaneous intervention of the USSR in that theater, or both.4 It would seem that Japan was mostly defeated on the great island-dotted battlefield of the South Pacific.
Douhet’s dream of aerial war breaking the will of an enemy people does not have the record of decisiveness that he sought. One of the most severe bombing campaigns in history did not break the will of the North Vietnamese, nor did a similarly impressive campaign over North Korea force a surrender. Bombing does change people, however, and the behavior of the North Koreans since 1953 and the genocide in Cambodia during the 1970s are likely attributable in large measure to the strategic bombing campaigns launched against them.
In 1946, George Kennan suggested that the world revert to limited Jominian wars5—the “cabinet wars” of the eighteenth century that followed in the wake of the total wars of seventeenth-century Europe. His idea was that the purpose of war should be to minimize and not maximize casualties, that “[v]iolence… could not be an objective.”6 Nuclear weapons, and Bernard Brodie’s logic of making war too horrible to be tolerated, render war obsolete as a practical matter, and the possibility of a war launched by accident or miscalculation makes it additionally intolerable. And yet as the Flexible Response alternative to Mutual Assured Destruction has demonstrated time and time again in the many regional wars since the early 1960s, limited military options to keep war alive only make it more likely, if less suicidal.
It would seem that at best humans may be forever damned to a condition in which they either face the possibility of complete destruction by total Clausewitzian war with nuclear weapons, and the subsequent fallout and nuclear winter, or else embrace an updated version of Flexible Response—limited war that would “keep the game (and the human race) alive” but which makes conflict so easy that it becomes all but inevitable.7 The result of this return to limited war seems to be a never-ending, mostly unnecessary series of the “semi-war” that James Forrestal, and more recently, Andrew Bacevich, warned of.8
It seems that the latter is already well upon us and will be until it becomes financially unsustainable. As with total warfare, limited war has also reached new technical heights with drone technology, allowing for campaigns of remote, video game-like strikes of a character arguably intermediate between war and assassination, while the great majority of our people are as oblivious to it as they were to the fact that they were nearly incinerated on a number of occasions during the Cold War and might still be. In other words, we now have the worst of both worlds: an ongoing state of never-ending limited wars while the nuclear omnipresence remains and could conceivably be triggered by a limited war, a misunderstanding, an accident, or deteriorating relations with our old Cold War foes.9
Conclusion
Regimes come and go, but The Bomb remains. The club of nuclear states continues to grow (South Africa being the only nation to have relinquished its nuclear weapons), and it now includes nations that dislike and distrust each other perhaps even more than the U.S. and U.S.S.R. did during the Cold War. If cautious, rational, and realistic leaders like John Kennedy and Nikita Khrushchev came within a wild card of blowing up the world in October 1962, what are the odds of intentional or accidental nuclear launches in an age with more fingers on more buttons, the virtually unlimited potential of computer hacking, and leaders of widely varying degrees of stability?
It is an open question whether an accidental or intentional nuclear war is a greater threat to the world than global climate change and the intimately tied issues of human overpopulation and loss of habitat and biodiversity. The latter is already unfolding, and potentially catastrophic climatological changes are already literally in the air and locked in place. How fast and how severe these changes will manifest is the great unknowable. The possibilities, ranging from a gradual societal collapse due to environmental catastrophe to a sudden nuclear war, give a full potential range of apocalypse, from T.S. Eliot’s “bang” to his “whimper,” and Robert Frost’s “fire” to “ice.”10
Regardless, and as with Vietnam in the 1960s, climate change is actually happening, while nuclear war remains only a possibility contingent on human folly, stupidity, and irrationality. As the smartest man who ever lived observed, “[t]he unleashed power of the atom has changed everything save our modes of thinking, and we thus drift toward unparalleled catastrophe,” or in more picturesque terms, “I don’t know how World War III will be fought, but World War IV will be fought with sticks and stones.”11
Technology may be lost, at least for a time, but an idea cannot be intentionally destroyed or un-conceived. A weapon may not be un-invented. If you live long enough, you will see rival nations and even existential enemies become close allies (a 1970s wisecrack observed that if the U.S. had lost WWII, we would now all be driving German and Japanese cars). It is clear that The Bomb is a truer and more permanent enemy than any temporal regime. No conflict is worth destroying the planet over. Heavy-handed nuclear strategies in a time of declining U.S. economic and military power, an increasing number of nations with nuclear weapons, and the rise of China as Eurasian hegemon will likely make the future even more dangerous than the past. Another negative effect of the end of the Cold War is a sense of complacency that the threat of nuclear war is over.12 Nothing could be farther from the truth.
It is a singular coincidence that the great physicist Hugh Everett III was a contemporary of Ellsberg’s and was also a nuclear planner (although he did not work for RAND and is not mentioned in the book). Everett’s “many worlds” interpretation of quantum mechanics suggests the possibility of many parallel universes, each one splitting off as the result of probabilistic events. If his model is correct, one can only wonder how many parallel tracks include worlds that were destroyed by nuclear war. To date this one has been lucky, but my experience in life has been that luck does not hold out in human events, not over the long run. Cue Vera Lynn?
In my opinion, this is a book that Americans should read, including young people when they are able to handle the gravity of the subject. Ellsberg writes in a strong, unpretentious style, but his book is best read closely and carefully from beginning to end. It does not skim well.
One should consider reading this book in conjunction with Andrew Bacevich’s history of the Cold War and the rise of the national security deep state, Washington Rules, Stephen F. Cohen’s Soviet Fates, and John Lewis Gaddis’s more conventional history of Cold War strategy, Strategies of Containment.
Notes
- Kennan, “Republicans won the Cold War?”, At a Century’s Ending: Reflections, 1982-1995, New York: W.W. Norton & Company, 1996, 186.
- John Lewis Gaddis, George F. Kennan: An American Life, 233-234, 614. See also Bernard Brodie, ed., The Absolute Weapon: Atomic Power and World Order, 1946, as well as his later Strategy in the Missile Age, Princeton University Press, 1959.
- See Edward O. Wilson, “Aggression,” On Human Nature, Cambridge: Harvard University Press, 99-120, 1978.
- As regards the origins of modern total warfare, see Stig Forster and Jorg Nagler’s On the Road to Total War, and David Bell’s The First Total War. As with the Japanese 80 years later, it has been argued that many Southerners would have willingly continued to fight even after the “hard war” campaigns of Grant, Sheridan, and Sherman that prefigure the total wars of the twentieth century. See generally Jay Winik, April 1865: The Month that Saved America, New York: HarperCollins, Inc., 2001.
- Gaddis, George F. Kennan, 234-235.
- Gaddis, George F. Kennan, 235.
- In a sense, nuclear war—although obviously a form of total warfare—is actually antithetical to Clausewitz. War is policy “by other means” in Clausewitz’s formulation, but the complete mutual destruction of nuclear war would preclude the achievement of all policy goals. See John Keegan, A History of Warfare, New York: Alfred A. Knopf, 1993, 381.
- Andrew J. Bacevich, Washington Rules, New York: Henry Holt and Company, 27-28, 57-58, 2010.
- On the reviving of Cold War tensions with Russia, see Stephen F. Cohen, Soviet Fates and Lost Alternatives: From Stalinism to the New Cold War, New York: Columbia University Press, 2009, 2011. On the rise of China and the decline of the United States, see Alfred W. McCoy, In the Shadows of the American Century, Chicago, IL: Haymarket Books, 2017.
- T.S. Eliot, “The Hollow Men, V,” Collected Poems 1909-1962, 92. Robert Frost, “Fire and Ice,” The Poems of Robert Frost, 232.
- Ralph E. Lapp, “The Einstein Letter that Started it All,” The New York Times Magazine, August 2, 1984, 54.
- Such major players of the Cold War as George Kennan and Robert McNamara became supporters of the antinuclear movement during the 1980s. The end of the Cold War took much of the wind out of the sails of this effort. See generally, George F. Kennan, The Nuclear Delusion, New York: Random House, 1983. See also Robert S. McNamara, “The Nuclear Risks of the 1960s and their Lesson for the Twenty-first Century,” In Retrospect, New York: Random House, 337-346, 1995.
New Article: The Open Hand: Moderate Realism and the Rule of Law
My new article “The Open Hand: Moderate Realism and the Rule of Law” just came out in the Howard Law Journal (Vol. 61 Issue 2). The hard copy is out, but I am not sure it is available online yet.
The overarching thesis is that if other nations wish to emulate the American legal and judicial systems, the United States should help them, but that we should not aggressively proselytize or foist our system on others. I also discuss the fact that although rule of law initiatives are seen by some to be idealistic ventures, they are often neoliberal policies used to leverage economic or strategic advantage in the developing world.
Mike Duggan
Alfred W. McCoy
Book Review
Alfred W. McCoy, In the Shadows of the American Century: The Rise and Decline of U.S. Global Power, Chicago: Haymarket Books, 2017, 359 pages. $18.00 (quality paper)
Successful imperialism wins wealth. Yet, historically, successful empires such as Persia, Rome, Byzantium, Turkey, Spain, Portugal, France, Britain, have not remained rich. Indeed, it seems to be the fate of empires to become too poor to sustain the very cost of empires. The longer an empire holds together, the poorer and more economically backward it tends to become.1
-Jane Jacobs
The curtain is now falling on the American Century.
-Andrew J. Bacevich
All empires come to an end.
-Napoleon
The Tides of Empire
Reviewed by Michael F. Duggan
Not every historian can make a career by speaking inconvenient truths to power, but Alfred W. McCoy has done it for nearly five decades. McCoy, the J.R.W. Short Professor in History at the University of Wisconsin at Madison, made his reputation as a young scholar by shining light on the politics and policy of heroin in Southeast Asia during the late 1960s and early ’70s. With a strong claim to the title of American Dean of Southeast Asian History, McCoy turns in his latest book to the decline of the American Empire and the rise of China.2
The book is a warning that traces the rise of the United States from an emerging world power, to a superpower following WWII (the term “American Century” was coined by Henry Luce in 1941) and through the end of the Cold War, to its current role as the “sole remaining superpower,” the hitherto unchallenged world economic and military hegemon. As such, the American Empire is the liberal, English-speaking heir to the British Empire that dominated the nineteenth century.
If McCoy is correct, the days of United States military and economic hegemony are numbered and likely to end sometime between 2020 and 2040—and the question is whether its decline will be controlled and managed, or if the denial of or resistance to changing geopolitical realities will lead to an uncontrolled collapse; will the American empire end with a sensible post-globalist grand strategy of consolidation, or will it end with a bang or fizzle? Denial and rationalization are the twin pillars of human psychology, and the willful or unconscious ignoring of hard facts that are now coming into sharp focus could lead to a catastrophic collapse or else a dismal but more gradual decline and end to the American Century a few years shy of an even hundred.
Do the facts support McCoy’s premise? With a rapidly mounting national debt and a shrinking tax base, it is increasingly likely that by 2030 the United States will experience economic crisis and paralysis. It is becoming more and more apparent that, with a $22 trillion debt, a collapse is all but inevitable: the U.S. government increasingly looks like a giant scheme that exhales more than it inhales and that will never be able to pay off its debt. A Cold War-size military to police the world—military predominance generally—and the de facto imperialism of neoliberal globalization will soon become as unsustainable as they are already undesirable. If current trends continue, the Chinese will likely, sometime during the same period, be in a position to supplant the Dollar with the Yuan as the basis for the world reserve currency.
Unchecked power brings with it the potential for corruption, hubris, and an un-self-critical sense of entitlement in terms of intervening in the domestic affairs of other nations. The role of the world’s policeman in furtherance of an activist neoliberal worldview by neoconservative means, and the misleading designation of humanitarian interventions, have sullied rather than strengthened the reputation of the United States as a force for good in the world, a reputation increasingly seen by others as honored in the breach.
Similarly, an entire generation of Americans has grown up to see no anomaly, no abnormality, in their nation bombing, invading, and occupying other nations, killing tens and even hundreds of thousands of people in the process. Several generations have witnessed their nation increasingly use undeclared wars as a basis for policy. The unintended consequence of this is an inversion of Clausewitz’s “war is an extension of policy” to a situation where policy has become a justification for military budgets and a seemingly limitless gravy train for the burgeoning defense industries. Budgets may inadvertently become a driver of policy. Undeclared foreign wars and a never-ending state of semi-war can be used not only to justify new weapons systems but also, some commentators have suggested, to provide convenient venues to test them under battlefield conditions.3 As the demise of the Soviet Union well illustrates, economies typified by little growth and which rely on a manufacturing sector based on military production are both artificial and symptomatic of decline.
MacKinder’s “World Island” and the Rise of China
In the Shadows of the American Century makes a compelling case that China is poised to become the dominant Eurasian hegemon. The argument goes like this: China has emerged beyond the parameters of a rising regional power and is embarking on a massive infrastructure program that will link it throughout Eurasia.4 This, combined with technological advances, improved manufactured goods, and a rapidly expanding military, will secure its predominance on the world’s largest continent in the near future. McCoy believes that China’s strategy is analogous to the 1904 “World Island” model of Halford MacKinder, which asserts that the power that controls Eurasia will effectively control the world.5 Part and parcel of this view is the observation that the United States is likely entering a state of permanent and irreversible decline not unlike that of the British Empire a century before, and perhaps worse.
The dominant historical/geopolitical outlook of MacKinder’s time was navalism, and it included such theorists as Alfred Thayer Mahan and Julian Corbett, and imperialist acolytes like John Hay, Henry Cabot Lodge, Theodore Roosevelt, and Elihu Root. The navalists believed that it was possible to manage the World Island through naval power along the Eurasian littoral regions, the maritime periphery. Indeed, the Royal Navy maintained Britain’s massive maritime empire and exerted influence over portions of the World Island with small constabularies and friendly local regimes. On this point, McCoy understands the mechanics of empire as well as any American historian.
But MacKinder was not a navalist. He believed in control of landmass over sea lanes and that the heartland of Eurasia was “nothing less than the Archimedean fulcrum for world power. ‘Who rules the Heartland commands the World Island… Who rules the World Island rules the world.’” McCoy continues,
“Beyond the vast mass of that island which made up nearly 60 percent of the Earth’s landmass, lay a less consequential hemisphere covered with broad ocean and a few ‘smaller islands’. He meant, of course, Australia, Greenland, and the Americas.” [McCoy, p. 29]
This is a bold geopolitical vision or assertion, but how does it apply to the modern world? In the tradition of the British Empire, the United States has maintained its ability to project power through an elaborate system of far-flung bases in various regions designated as strategic commands. The aircraft carrier is the new capital ship, and the carrier group is the regional squadron. But “[w]hile the U.S. military was mired in the Middle East, Beijing began to unify that vast ‘middle space’ of Eurasia and preparing to neutralize America’s ‘offshore bases.’” McCoy believes that China’s rising land-based MacKinderism will likely trump American navalism.
But the power dynamics do not stop there, and ultimately military power rests on sustainable economic strength. China’s economic rise since 1989, and especially in the twenty-first century, has been spectacular and perhaps unprecedented. As McCoy observes,
“From 1820 to 1870, Britain increased its share of global gross domestic product by 1% per decade; the United States raised its share by 2% during its half-century ascent, 1900 to 1950; at a parallel pace, Japan’s grew about 1.5% during its postwar resurgence, from 1950 to 1980. China, however, raised its slice of the world pie by an extraordinary 5% from 2000 to 2010 and is on course to do so again in the decade ending in 2020, with India not far behind. Even if China’s growth slows, by the 2020s, U.S. economic leadership is expected to be decisively ‘overtaken by China.’” [McCoy, p. 193]
One possible outcome of this trend is the emergence of an authoritarian, state-based capitalism as an alternative to, and perhaps the supplanter of, the democratic, if increasingly plutocratic-republican, form of capitalism embodied by the United States. In a more value-neutral sense, this would be a local, land-based empire replacing a declining and remote maritime empire.
China’s next (i.e., current) phase of economic development, a great infrastructure project designed to link greater Eurasia through massive capital projects, will dwarf the United States Interstate Highway System. This, and declining economic prospects for the U.S., will likely hasten the transfer in global standing and economic status. Even so, how realistic is the World Island as a sustainable economic model?
It is notable that nobody—not even the Mongols, who briefly dominated the massive territory from the Sea of Japan to Hungary and Poland—ever completely controlled the entire Eurasian landmass (and they, having little if any culture, were quickly assimilated in the areas they conquered, something that should give all military imperialists pause). Similarly, from the west, neither the adventures of Napoleon nor those of Hitler made it beyond Moscow, their armies reduced to inglorious defeat and retreat, their regimes doomed. Of course China would not attempt to dominate Eurasia by force of arms, but rather through the soft power of economics and massive capital projects that would integrate land transportation throughout Eurasia.
On this point, regardless of whether the application of power is soft or hard, the dictum of the British navalist Julian Corbett that it is impossible to conquer the ocean would seem to apply in modified form to Eurasia. This is not because of its affinity to the unique characteristics of water as territory, but rather in the geographical sense that Eurasia/Africa is the only physical feature on the surface of the Earth that compares with the Atlantic or Pacific oceans in terms of pure scale. Indeed, invaders of the past sometimes compared the seemingly endless undulating steppes and shallow valleys of Russia to the open expanses of the rolling sea.
In addition to the massive, ocean-like expanses of the Eurasian continent, there are also cultural-historical facts that would work against such a project. It seems unlikely that a greater Eurasian prosperity sphere including China, Iran, perhaps Pakistan, perhaps India, and Russia would hold together for any great period of time (although the baffling renewal of U.S. tensions with Russia is likely to drive the Russians closer to China for a while, thus extending the prospects for a united Eurasia longer than they might otherwise have been).
Although economic prosperity can make up for a multitude of historical grievances and perhaps even smooth over national pride and interest-based tensions for short periods, over the long term this would be an economic sphere in which the constituent parts would act like mutually repelling magnets. By way of another metaphor, trying to dominate Eurasia is like trying to stabilize an inverted pyramid: it is inherently unstable, as any number of other regional powers jostle for influence and in doing so throw it off balance.
But even if such a top-down plan, built on the scale necessary to integrate all or most of Eurasia, is bound to fail over the long run, it is likely to succeed long enough to displace U.S. primacy in the world along with its currency, especially if the United States foolishly—insanely—chooses to actively challenge or confront China as the result of a misconceived “pivot to the Pacific” policy. Given that all military issues are ultimately economic issues, the United States could conceivably collapse virtually overnight, like the USSR in the late 1980s and early 1990s. Otherwise, it might follow a path of more protracted decline into second- or third-tier status, like Britain during the last century, or Spain a couple of centuries before that.
Although the rise of China as characterized by McCoy is alarming, he does not think it likely that the Chinese will be able to fully replace the United States or come to occupy its role as global hegemon. He rightly notes that China has less to offer the world than the liberal West:
“Every sustainable modern empire has had some source of universal appeal: Britain had free markets and fair play, and the United States democracy, human rights, and the rule of law. Searching for successors, both China and Russia have inward-looking, self-referential cultures, recondite non-Roman scripts, nondemocratic political structures, and underdeveloped legal systems that will deny them key instruments for global domination.” [pp. 232-233]
Indeed, China has little in the way of transplantable cultural advantages to offer other nations relative to those of the British and American empires that preceded it. The spoken and written languages of China are difficult to learn and difficult to export, and so present an obstacle; they seem unlikely to become global languages of business or diplomacy. Overall, China’s advantages and disadvantages seem fairly evenly split, and this is to say nothing of China’s considerable internal problems.
What China lacks in terms of cultural and political offerings it makes up for in economic realism. China has a lighter touch than either the United States or Britain, relying on the practical appeal of infrastructure, funds, and political non-interference. On this score, the Chinese want to do business and tend to avoid the meddlesome judgment and moralizing of the liberal West.
China also has a strong geographical advantage. The “One Belt, One Road” initiative, or “New Silk Road,” is a series of infrastructure projects, overland routes amounting to the interior lines of an economic struggle, and will likely work better than the far-flung empires of the Americans and British, which required lengthy supply lines across the oceans of the world. Even given the ocean-like distances of the Eurasian continent, it would be more effective to manage a landmass over land routes than along its maritime periphery: people live on land, and, as a military matter, the air and maritime domains, even when they are decisive, are necessarily adjuncts. Geography may or may not be destiny, but even with strong geographical advantages the Chinese will likely experience trouble.
McCoy’s Scenarios
In my experience, many, perhaps most, historians despise counterfactuals and the game of “what if?” That said, a primary purpose of history is to inform policy decisions, and these necessarily depend on hypothetical scenarios analyzed with knowledge of the past. McCoy concludes his book with his “Five Scenarios for the End of the American Century”—that is, five possible courses that United States–Chinese relations might take, none of which play out happily for the United States.
Here his fluency in the relative strengths and shifting advantages of the two powers shines, and he writes with the powerful insight and analysis of a master historian (he observes that the fastest, most powerful computers in the world are now Chinese and are made of Chinese components, something that should be keeping policymakers in the West up at night). Although counterfactual historical fiction raises my blood pressure as much as it does most historians’, this intelligent and insightful use of possible historical tracks is constructive and very useful in showing how events might devolve. At the very least, these scenarios powerfully underscore the inadequacies of current American policies.
Given the fundamental unpredictability of geopolitics, it would not be surprising if China failed to live up to McCoy’s predictions over the next 10-20 years. The combination of lopsided economies, an increasing drag from climate change, and potentially devastating human migrations could easily upend the best-laid plans and the optimistic forecasts (from the Chinese perspective) of even the most sensitive observers.
Of course the backdrop to this great drama is an even greater one: the unfolding environmental crises. Even if China’s plans have the potential to succeed and to sustain themselves for a few decades, such a massive infrastructure project would likely be coming to fruition at the same time as the increasingly severe and fearful feedbacks of the emerging global environmental crises. It is all beginning to read like a great Shakespearean tragedy in which the machinations of human intrigue are about to be permanently upset by a far greater external tragedy. It is like watching a powerful play in a theater that is on fire.
A Proposed Solution: The “Other Island”
When reading McCoy’s book, I tried to devise a realistic U.S. policy in response to the scenario(s) he was depicting. On page 235, at the end of one of his scenarios, he writes “While [United States] global power would diminish, Washington would still have considerable influence as a regional hegemon for North America and an arbiter of the residual international order.”
Immediately before this, however, McCoy decries the possibility of the world order reverting to a new Westphalian paradigm with regional hegemons, spheres of influence, a range of lesser sovereign states, and the balancing of power (I would argue that the original Westphalian paradigm is still mostly intact and that its lingering death has been greatly exaggerated by advocates of neoliberal globalization). While this would not be an optimal state of affairs, it would seem preferable to global empires, super-powerful world hegemons, and the increased possibility of war between great nations. We must accept the fact that as long as there are powerful nations, there will be spheres of influence, and that if any country would benefit from gracefully embracing the role of a regional world power, it is the United States.6
In December 2015, I presented a paper at a conference on land force strategy at the Army War College in Carlisle outlining an American regional policy for the Near East, situated within a grand strategy calling for a more limited form of internationalism. This is based on the idea that the United States should be involved in the world only so far as necessary, with a well-defined sphere of interest and demonstrable vital interests, as part of a sustainable status as a world and regional power. Although this idea was geared toward addressing the eschatology that underlies terrorism, it dovetails seamlessly with McCoy’s characterization of a Chinese-dominated World Island of Eurasia.
What, then, is the best course of action for the United States to follow in an age of an ascending China? The problem with the Great Game is the game itself: it is a rotten, egotistical, and ultimately self-destructive game, and the United States should frankly and willingly relinquish its status of dominance—leave the Great Game insofar as possible—as a matter of mature and considered policy. Quite simply, the role of the superpower is fundamentally undesirable. It defies rational understanding why military, foreign, or economic policymakers would want to sustain it, given its costly liabilities and diminishing returns. A nation’s military should reflect its size and resources, rather than its pride, its ambitions, and the realities of the past.
The desirability of consolidation into the more manageable status of a regional world power is quite simple and based on a singular, self-evident fact: the United States occupies the best real estate on the planet; it is large enough to be self-sustaining and has relatively unproblematic neighbors. Defending North America would be like defending an actual island, rather than attempting to diminish China’s power by managing the littoral and maritime regions of the World Island in a provocative posture of forward presence in somebody else’s neighborhood.
Rather than continue to embrace the problematic role of the world’s military and naval hegemon, the United States should adopt an outlook under which it could operate more effectively as a robust regional world power with capable land, air, and sea forces to match. Such a status would allow the U.S. to protect its vital interests and meet its treaty obligations while still acting as a world leader in international coalitions to preserve peace and order and to restore the status quo in instances where the territorial sovereignty of one nation has been violated by another. Such a role would also be an effective means of assuring the international cooperation necessary to address the unfolding world environmental crises. When the United States helps itself, it helps the world by serving as an example of benevolence; and if it cannot lead by example, it has no basis for telling others how to live or act.
One of the intrinsic problems with attempts to “control” Eurasia, other than its sheer size, is the fact that there are numerous old and proud nations and civilizations in addition to China that are constantly jostling for local and regional dominance. The United States does not have to trouble itself with this dangerous and distracting jockeying for power and control. A return to the Westphalian paradigm may not be a perfect solution, but it is better than what we have and likely to be far more desirable than anything that would follow an all-out American collapse.
The strategy would be as follows: the United States would unofficially cede local influence over Asia to China while continuing to trade and do business with East Asia—frankly, there is little other choice in the matter, and it in no way furthers American interests to aggressively oppose China, further bankrupting ourselves and risking catastrophic war in the process. As George Kennan and others have observed, no single regime will ever control the entire world, and the “World Island” geopolitical model—the control of Eurasia by a single power—is only slightly more modest than grandiose schemes of actual world domination. My reading of Mackinder’s World Island is that it is a model for exerting de facto influence over much of the globe, not for actually dictating the local or proximate administration of all parts of the planet. It would seem that the planet is big enough for more than one island, even an archipelago.
As for the Far East and Western Pacific, the U.S. should play the situation by ear and continue to do business rather than risk conflict as the result of its own insecurity and subsequent overreaction, and we must become acclimated to certain stark realities: the U.S. will no longer be the dominant power in the South China Sea—a quick glance at its proximity to China on a map or a globe instantly forces the question: by what geographical logic should the U.S. be dominant there any more than the Chinese should control the Gulf of Mexico? If you will excuse the mixed metaphor, it is unlikely that China will threaten the Golden Geese that are the Asian Tigers, or the commerce related to them traversing international waters. Nor will China likely risk invading Taiwan, given the costs of invading relative to the benefits of not doing so. Even with all the wealth of the Orient, there is nothing there so valuable that it would justify a conflict between the United States and China that would likely end with nuclear weapons.
Within our own sphere we would likely be the regional hegemon, unless unforeseen or underestimated domestic realities resulted in the breakup of the United States into smaller regions. If the U.S. does hold together, it would serve our interests to act as a “good neighbor” rather than return to the regional imperialism of the past. Under this plan the United States would chart a middle course between imperial overreach and “Fortress America” autarky and isolation. It would allow the U.S. to pursue vital interests in its own sphere without relinquishing vital relations with Western Europe and Australia.
In an age when conventional, cyber, and nuclear weapons can deter any conventional attack on the American homeland by another great power, our goal should no longer be predominance, but rather the sustenance and protection of an impressive mean standard of living in a multi-polar world—the pursuit of the optimal rather than the maximal. If the U.S. preserves its strength through consolidation, it can take care of an increasing host of domestic and economic problems in an increasingly chaotic world.
Conclusion
Alfred McCoy may be a controversial figure in some circles, but he need not be. Those in the halls of power may or may not like McCoy or his works for whatever reason. He upsets in the way that all frank honesty is likely to upset. But this is a courageous man who has “walked the walk” for no apparent reason other than to tell the truth, often unpleasant truths, and the facing of unpleasant truths is a cornerstone of realism and necessary to sustain the health of a liberal republic. He is a historian of the front rank, and his book is both a serious academic work and readable for a general audience. There are things in this book with which I do not agree (I do not see President Obama as a “Grandmaster of the Great Game” and am skeptical of the idea that the TPP was a masterstroke designed primarily to poach potential regional customers away from the expanding Chinese economic sphere), but even these interpretations made me think, and if history shows McCoy to be correct, I will gladly concede the points. I rate this book very highly and believe it should be read by every American with an interest in foreign affairs, economic or military/naval policy, current events, or history. It is a book that should be read by all people who care about the future of this country.
Finally, for those who are interested, I would also recommend that this book be read back-to-back or simultaneously with Stephen Cohen’s 2008/2012 masterwork on the end of the Cold War and the dangerous and unnecessary rekindling of tensions with Russia, Soviet Fates and Lost Alternatives. This is because, as Alfred McCoy masterfully demonstrates, China is likely a far greater potential threat than Putin’s Russia.
Notes
- Jane Jacobs, Cities and the Wealth of Nations, New York: Random House, 1984, p. 182.
- See generally Alfred W. McCoy, In the Shadows of the American Century, Chicago: Haymarket Books, 2017.
- Regarding the term “semi-war”—originally coined by James Forrestal—see Andrew Bacevich, Washington Rules, New York: Henry Holt and Company, pp. 27-28, 57-58. See also Bacevich’s article “Ending Endless War: A Pragmatic Military Strategy,” Foreign Affairs, September/October 2016, pp. 36-44.
- See, for example, Gal Luft, “China’s Infrastructure Play,” Foreign Affairs, September/October 2016, pp. 68-75. See also Pepe Escobar, “The New Silk Road Will Go Through Syria,” Asia Times, July 13, 2017.
- See generally in McCoy. Mackinder is often regarded not only as the father of modern geopolitics, but as the land-power analog to turn-of-the-twentieth-century theorists of naval power like Alfred Thayer Mahan and Julian Corbett, or to theorists of strategic airpower and nuclear weapons like Giulio Douhet and (perhaps) Bernard Brodie, respectively.
- Andrew Bacevich appears to advocate a similar view of regional power status for the United States. See America’s War for the Greater Middle East, New York: Random House, 2016, p. 367. Although there may be no ideal economic/geopolitical world order, Jane Jacobs has suggested that one possibility would be an economic order based on small nations, in turn based on naturalistic production regions. It is difficult to argue with such a perspective, but it is even more difficult to imagine how such an order might be put in place. See generally Jane Jacobs, Cities and the Wealth of Nations.