
Ken Burns’ Hemingway

Reviewed by Michael F. Duggan

Everybody has his or her own Hemingway, and someone else’s Papa will never be entirely yours.

Part of the problem is that I waited a year for this film.  Although it would be an overstatement to say that my anticipation for this documentary got me through the first year of the pandemic, I did look forward to it and sought out and watched the new trailers as they were issued. Biography, as a subset of history, is selective, and readers all have their own Hemingway. No account will appeal to everybody.

Full Disclosure: I like Hemingway’s better novels and stories and I consider his blend of stoicism and Epicureanism to be one of the most satisfactory replies to the modern void that followed in Darwin’s wake.  I find him to be a commentator on life of the first order.  I also like the films of Ken Burns, who, like Hemingway, has produced a handful of classics and a larger number of good efforts (of course Hemingway also had some notable failures).

Let me start with what I liked about Hemingway.  Unlike Burns’ treatments of topics with fewer contemporary images (e.g. Lewis and Clark), this one is not all just sunrises and scenery.  The collection of photos, film footage, and audio recordings is impressive.  I was bowled over by a short clip of footage showing a young Hemingway with other convalescent soldiers and a nurse (who is not Agnes von Kurowsky) in Italy in 1918.  As someone who has read a lot by and about Papa, I had no idea that this even existed. There are also some insightful observations about influences on Hemingway’s art, both less obvious ones, like the music of Bach, and better-known ones, like Cezanne. It also emphasizes the importance of rhythms in his writing.

The declarative narration of Peter Coyote is spot on (thank goodness writer Geoffrey Ward did not lapse into something like “Hemingwayese”).  Jeff Daniels is more than adequate as the voice of Hemingway (who did not have a good speaking voice), and Meryl Streep, as Martha Gellhorn, once again proves to be the paragon of accents. The film also gives a good chronological outline of the events of the writer’s life for those who are not familiar with it.  His various books, injuries, marriages, and wars stand like mileposts in a high-intensity life that burned out before he was 62.

I also like what I took to be the film’s tagline: “the man is more interesting than the myth.”  The movie effectively shows Hemingway’s complicated relationship with the truth: the lifelike truth of his fiction and the omissions of his journalism, and of course, the myth, based on lies and distortions to impress those expecting to meet the legend. Herein too is a most interesting aspect of the film, the not entirely original thesis that the persona Hemingway created took over and smothered what was good in him (see also: Samuel Clemens/Mark Twain, Dylan Thomas/”Instant Dylan”).  Some biographers mark the arrival of the Hemingway avatar with the swaggering narrator of Death in the Afternoon.  Gertrude Stein placed it much earlier, believing that he had been ruined by the age of 25, or about the time his first book of short stories, In Our Time, appeared.1

What the film does well, it does extremely well. Its ambiance is wonderful. The soundtrack is pitch-perfect and nails the atmosphere of the time and places depicted. The film captures the writer’s personality through illustrative quotes and excerpts from Hemingway’s own works and from those who knew him. For a subject of this complexity, three two-hour segments feels about right, and the film is nicely paced.  It is a well-crafted documentary.  And yet by focusing predominantly on Hemingway’s personality and relationships, there are important omissions in other areas.

For instance, although the film mentions the influence of the writing guidelines of the Kansas City Star on the budding reporter, it skips over the 1922 Genoa Conference where Lincoln Steffens and George Seldes taught the young journalist how to write “cablese”—a method of writing for the wire services using a bare minimum of words.  Hemingway himself references the cable style of writing, which he regarded as “a new language.”2  Nor does the film mention his “iceberg” theory of writing—his most important structural/stylistic contribution to literature and the reason why he won the Nobel Prize for Literature. It does quote period critics and clips of scholars commenting on his minimalist style.

The film does mention Hemingway’s admiration for Theodore Roosevelt and Jack London, but not Rudyard Kipling or Stephen Crane (another young war correspondent, whose cut-down journalistic style in “The Open Boat” reads like Hemingway, and who is mentioned by name in Green Hills of Africa),3 nor contemporary influences such as Ring Lardner and other sports writers of the period.4  It mentions important friends in passing, but does not investigate why those who were not driven away defended him and his friendship to the end of their lives.  These accounts, from people now mostly dead, in books like Denis Brian’s classic The True Gen, shed more light on Hemingway’s frequently contradictory character than guesses at cryptic sexual references in letters and in novels unpublished in his lifetime (The Garden of Eden, Islands in the Stream, and True at First Light are merely a suggestive apocrypha).

Even more important than his pared-down style is the Hemingway Code, its hero, and the Hemingway philosophy (there is a brief discussion of his outlook in the segment on The Old Man and the Sea).  Where most protagonists of the modern canon are helpless victims (e.g. the Kafka protagonist) or else are outright pathetic (e.g. Leopold Bloom, Willy Loman, and J. Alfred Prufrock), the Hemingway hero stands in defiance, a modern cousin of the Byronic hero.  He may not push the limits of human potential like the heroes of Greek or Shakespearean tragedy, but he does push back and stands up to the modern void on his own terms and without illusion.  He is a beer-drinking version of the Nietzschean aristocrat of merit. To be fair, by using so many excerpts from Hemingway’s novels and stories, Burns may be cleverly employing an “iceberg” approach of his own, letting the philosophy emerge through inference.

As much as his lean style, it is Hemingway’s outlook that explains his continuing relevance. Traditional stoicism can be construed as morally austere and life-denying. It is only valid until it crushes joy. Hemingway’s importance, therefore, lies not only in placing stoicism in a modern context, but in showing how life may still be enjoyable without illusions. He affirms the pleasure of the physical, the appetites, and the idea that even in a world without intrinsic meaning, life may still be heroic. And he does this in a singular voice and with clipped, impressionistic description.

With all of the film’s preoccupation with gender and the Hemingway women, it is striking that there is no mention of Pilar, his most interesting female character and a pillar that allows For Whom the Bell Tolls to stand (the film does mention more conventional, if idealized, female love interests like Catherine Barkley and Maria as well as the thoroughly modern expatriate bad girl, Lady Brett Ashley, the thinly veiled Lady Duff Twysden).

Above all, I wish the filmmakers had spent more time focusing on his philosophy of life rather than trying to psychoanalyze him and speculate about his sexuality, as has been so often done in various decades-old debates.  This is the kind of head-shrinking and psychologism that Hemingway loathed: the reduction of an artist or philosopher by a critic or psychiatrist to his constituent parts and why he might have done the things he did. Beyond biographical trivia, why is a person’s sexuality even that interesting?  Are we allowed to like him more if there are hints in his oeuvre of a view of sexuality that is in keeping with those of a more enlightened time? Are we really that insecure and intolerant?

I don’t believe that an ability to write a plausible character of the opposite sex (e.g. Flaubert/Madame Bovary, Mary Shelley/Victor Frankenstein) necessarily makes you androgynous. It does mean that you are a perceptive and empathetic writer. Also, the wont of the filmmakers to reference so many of his comments about death and suicide makes the course of his life and its outcome seem as inevitable as a Greek tragedy or Spanish bullfight.5

On a side note, I found it curious that Jeffrey Meyers, one of the most notable Hemingway biographers, did not appear in the movie. It is also odd that no mention is made of Lillian Ross’s hatchet job, “How Do You Like It Now, Gentlemen?” which appeared in the May 6, 1950 number of The New Yorker, and which portrays a Hemingway already in steep decline. In a similar vein, Burns uses a long quotation of the worst lines from what is widely regarded as Hemingway’s worst novel, Across the River and Into the Trees—lines that are read aloud by Daniels and appear on the screen as text as they are spoken to underscore their rambling, cringeworthy inferiority. Like Ross, or a clever prosecutor, Burns thus allows Hemingway to hang himself with his own words. This struck me as unnecessary and mean. It inspired me to reread the book, which I found to be better than I had remembered and far better than the way it was characterized by contemporary reviews.

It is best that the discussion of his suicide was brief.  Was it purely psychological?  Was it hereditary or physiological—the byproduct of hemochromatosis, a blood disease?6  Was it an obsession with death and his father’s suicide that he could no longer suppress?  Was it the result of his many concussions?  Was it alcoholism?  Bipolar disorder?  Was it an expression that seamlessly flowed from his philosophy of life, given his appalling physical and psychological state near the end (if a brave man like Frederic Henry could run away from a war that meant nothing to him, why couldn’t a man leave an intolerable life?)?  Was it some or all of the above?

As a realist with some romantic inclinations, I prefer to believe that when life no longer allows for any of the things that give it meaning, the Hemingway Code allows, perhaps requires, that you leave it on your own terms.  Hemingway loved hunting, fishing, traveling, drinking, storytelling and reflecting on his life, sex, and above all, writing.  None of these things were possible for him by the summer of 1961.

Finally, to those viewers who might be turned off by Hemingway’s shortcomings as a human being (“seldom has a man written so well, yet lived so poorly”), I think it is important to separate the man from the art and its prescriptions.  The fact that he could be a horrible person who often failed to live up to his own Code does not mean that the Code is bad, nor does it detract from his work or its commentary any more than the behavior of Mozart, Beethoven, Byron, and van Gogh spoils their art. To be great is to be abnormal, and oftentimes the art of highly flawed people is important and pure.  Show me a well-adjusted person, and I’ll show you a mediocrity.

My reading of the Hemingway Code is that in a world without intrinsic meaning, what matters is the courage and dignity with which you face it.  Then, when you are beaten, you will not be defeated. And that is something.

So, Hemingway really wanted to be a girl, eh?  My God, who cares?  I give the film an A-.

Notes

  1. See Paul Hendrickson, Hemingway’s Boat, 277-278.
  2. Hemingway talks about the function of cablese in his Esquire article “Old Newsman Writes: A Letter from Cuba,” Esquire, December 1934.  Reprinted in By-Line: Ernest Hemingway, 177.  George Seldes tells the story of Hemingway at the Genoa Conference in his memoir, Witness to a Century, 312-313. Seldes also writes: “[l]ater he spoke of this as the time he discovered a new language.” Denis Brian, The True Gen, 37.  See also Jeffrey Meyers, Hemingway: A Biography, 94.
  3. Hemingway, Green Hills of Africa, 17.
  4. Brian, The True Gen, 17.
  5. This may be justified. As Norman Mailer writes, “The story [of Hemingway’s boxing match with Morley Callaghan] offers a fine clue to the logic of Hemingway’s mind, and tempts the prediction that there will be no definitive biography of Hemingway until the nature of his personal torture is better comprehended. It is possible Hemingway lived every day of his life in the style of suicide. What a great dread is that. It is the dread that sits in the silences of his short declarative sentences. At any instant, by any failure in magic, by a mean defeat, or by a moment of cowardice, Hemingway could be thrust back into the agonizing demands of his courage.” See “Punching Papa,” in Mind of an Outlaw, 169.
  6. Hemingway’s younger brother, Leicester, references this condition in an interview with Denis Brian. It is unclear what he attributed to the imbalances resulting from this disease as opposed to the shock treatment that Hemingway was receiving near the end of his life. The True Gen, 252.

The Great War on COVID-19

By Michael F. Duggan

After more than two months in steep decline and the introduction of four highly effective vaccines, COVID-19 numbers in the United States are plateauing again. This of course is after three spikes that made the U.S. first worldwide in total COVID deaths. In some respects, the way that much of the country has dealt with the pandemic is reminiscent of how the Allies prosecuted the First World War on the Western Front.

The Great War in the west was a war of position. After an initial campaign of maneuver during the summer of 1914, the front quickly bogged down into a 450-mile-long line characterized by trenches and deadlock—”trenchlock.” The lines would barely move in either direction for four years.

The problem was not one of parity between the belligerents, but rather a disparity between modes of warfare at that point in military and technological history. The defensive, the inherently stronger mode, was given an exponential advantage by modern weapons wrought by the Industrial Revolution (repeating rifles, smokeless powder/flat-trajectory bullets, barbed wire, machine guns, modern artillery, etc.). The technologies of the modern offensive—light automatic weapons, flamethrowers, attack aircraft, tanks—were more complex and sophisticated, and were in their infancy or were actually developed during the war. Even if these weapons had existed in numbers, tactical, operational, and strategic doctrine was not sufficiently developed to employ them effectively until 1918. Even then, they were not definitive in securing victory.

As recent historians have observed, the First World War was characterized by a learning process: new weapons were developed (poison gas, flamethrowers, and tanks were all spawn of the Great War). New and innovative tactical and operational approaches were also formulated: the British and Germans both experimented with modern small-unit infiltration tactics, and the combined-arms attack that would win fame in the Second World War as Blitzkrieg was born during the First. But these measures were too nascent, too weak to overcome the entrenched power of the defensive mode. If this period marked the birth of the offensive revolution, it was more notable as the apex of the defensive revolution.

For most of the period from late 1914 until well into 1918, the war was characterized by unimaginative “pushes”—the “classic” World War One infantry assault supported by artillery, in hopes of punching through the enemy lines and returning to a war of sweeping mobility and victory. Sure, the generals tinkered with the formula: creeping barrages, “hurricane barrages,” gas barrages, variations in unit density, tanks used here and there, limited “bite and hold” attacks, etc., but most of the major attacks from Loos in 1915 to the great German Spring Offensives in 1918 were fairly similar. Both sides kept trying the same thing in the face of failure. So it is with so much of the American approach to COVID-19.

There was a learning process during the COVID-19 pandemic too, but it was mostly technical. Unlike the technical developments during the First World War, the development of a vaccine was quick and effective—the companies working on them got the solution (several solutions) right the first time. One genetics company mapped the genetic sequence of the virus in a matter of hours. Vaccines were produced within weeks, and were being administered to the public by December. It is one of the great success stories of medical history. But again, policy interfered.

I am not a physician, much less an epidemiologist. I am not a medical professional of any type. I do not understand the mathematics of contagion, of vectors and trajectories of infection. And yet I do understand that if you put people infected with a highly contagious disease in close proximity to uninfected, unvaccinated people, the disease will likely spread. This is what happened: first in April, then in July, and then massively in the fall and early winter. Increases in new cases followed public celebrations of holidays and the partial reopening of businesses (fortunately the nation was spared a post-Christmas/New Year’s spike). Major universities opened in the fall of 2020 and then quickly shut down again after outbreaks among the student population.

Americans—a large percentage of whom seem incapable of any kind of shared national sacrifice—have let down their guard time and time again during this crisis. Some never had their guard up. Rather than bite the bullet and shut things down in earnest, the authority to shut and open business fell to the states and local governments. The result was a checkerboard approach of half measures and temporary half-results. Premature partial re-openings kept infection rates high until early January, when the numbers began coming down. Now a new wave of relaxed state and local restrictions appears to be causing a plateau in the number of cases in spite of the impressive vaccination effort by the new administration. Since many Americans apparently no longer possess the kind of determination that gave us the magnificent industrial mobilization and war effort that led to victory in WWII, perhaps technology will save us in spite of ourselves.

By contrast, China, New Zealand, Taiwan, and Vietnam did bite the bullet—came up with strict national policies that effectively shut down the virus. Not only are all of these nations open for business today, but their losses relative to ours speak volumes: Taiwan lost 10 people to the disease (Florida, with a population smaller than Taiwan’s, has lost 32,712 to date). New Zealand lost 26 people. Vietnam, 35. If Chinese numbers are to be believed, their nation, which has a population more than four times larger than that of the U.S., has lost 4,636. As of this morning, the United States has lost about 541,000 people to COVID-19. When necessary policies are rendered impossible or ineffective by the system, then it is the system, and not the policies, that has failed.

The offensive revolution in arms, technology, and doctrine arrived in earnest on the Western Front during the summer of 1918. But its success was mostly local, in actions like the battle of Hamel on July 4, 1918, and on a larger scale at Amiens a month later. By then the Germans, unable to capitalize on their gains from the Spring Offensive and faced with the prospect of 1,390,000 freshly-arrived Americans, succumbed to exhaustion and mostly traditional Allied attacks all along the front, rather than to the decisive arrival of the modern combined-forces offensive.

It is possible, perhaps even likely, that the rapidly-increasing number of vaccinations will eventually outpace contagion. With more than 80 million doses already administered, hopefully the plateau will not become a spike. Of course, to defeat the virus, we will have to reach a vaccination rate possibly in the 80s or 90s in percentage terms, and recent polls suggest that a quarter or more of Americans say they will not take the shots. The alternative is to reach herd immunity through a combination of injection and infection. In this case some of the non-vaccinated will continue to die (if there is no price for stupidity, then what is the benefit of not being stupid?). Let us just hope that the virus does not mutate sufficiently to produce a vaccine-resistant variant before we reach population immunity.

And so, like the chateau-bound generals of the Allied high command in WWI, governors of some states are pursuing a policy of more of the same. By starting to reopen businesses, the hope, presumably, is that a strategy that is largely responsible for the deaths of more than half a million Americans will yield fruit this time. Infection rates are increasing again in 16 states. The Great Abdication continues.

Realism: a Distillation

By Michael F. Duggan

The foreign affairs Blob is back.  In spite of appearances, it never really went away, and the past four years were not a significant deviation from the foreign policy course of the previous three decades.

In light of the return of a more overt interventionist foreign policy, I am posting this outline of aphorisms or tenets on what moderate realism means in a foreign policy context.  I put this sequence together a decade or more ago as a part of a much longer list of foreign policy prescriptions.  Many of these ideas also find expression in various articles of mine.  This is a short, partial list, and I may add to it from time to time.   

  • Never underestimate the imperfection of the world and its complexity.
  • History has a will of its own and its course cannot be guided, reined in, or shaped by simple rationalistic, ideological, theistic, or utopian programs. A great leader (e.g. FDR) may for a time guide a nation via a general program, if that nation is willing. Programs may not be successfully imposed from the outside, unless a nation is willing to accept them.
  • Even when a leader or a nation is acting altruistically, power is the underlying currency and subtext of human interactions.
  • Power must always be tempered.
  • Policy and governing are about the wielding of power, even when policies, governance, and laws are altruistic, egalitarian, and generous.  There is no contradiction in this observation.
  • Although policy is fundamentally about vital interests, in diplomacy, personal relationships and connections are everything in terms of making it work. A real diplomat can and should ameliorate and minimize—ideally eliminate—personal animosities and grievances that may interfere with policy and relations.
  • In matters of diplomacy, never cause the other side to lose face in public.
  • In spite of appearances and proximate causes, never underestimate contempt, hatred, and revenge as the real causes of war. Consult history and apply empathy in order to understand these things. To better understand human nature, also study sociobiology.
  • The foreign policy of a nation should be concerned with implementing or forwarding long-term, enlightened national interests.
  • A nation’s foreign policy should be rational, moderate, and non-ideological.  It should not be based on morality.
  • It is not in a nation’s interest to act abroad in an immoderate way, even in furtherance of the highest motives.
  • A nation should recognize and respect the legitimate interests of other nations.
  • Do not dismiss the national security claims of other nations, even when they run contrary to the security concerns of your own country. These are points that require focused attention and diplomatic maturity.
  • When a nation puts moral, ideological, or theoretical considerations above considerations of vital interest, it puts its own long term prospects in jeopardy.
  • Historically speaking, moderate realism has produced better moral results than policies specifically designed toward moral or ideological ends.  Measured, moderate or progressive realism (but not realpolitik) are preferable to fashions and bubbles like Neoconservatism, Neoliberalism, and economic globalization.
  • Realism is a range of outlooks. It can be circumspect, high-minded, moderate, progressive, sensible, vision-based, and non-ideological—its defining characteristic—or it can be cynical, mean, ill-considered, intolerant, reactive, short-sighted, and unbalanced (e.g. “crackpot realism,” realpolitik, etc.).  One can imagine a scale with these as poles and innumerable degrees or shades between them.
  • If a nation is a force for good in the world (i.e. a regular provider of relief programs generally not subsumed under foreign affairs), then when it helps itself, it also helps the world.
  • A nation that cannot lead by moral example has no business telling other nations how to run their domestic affairs.
  • Leading by example is a better basis for instruction than preaching, brow-beating, threats, sanctions, invasions, and occupations.
  • Sometimes the realistic thing to do and the “moral” thing to do overlap or are identical (e.g. the Marshall Plan, the Peace Corps, etc.). 
  • In every geopolitical situation, history cannot be discounted. History may or may not be destiny in every case, but you ignore it at your peril.
  • A broad and deep understanding of history is a better basis for policies and decision-making than economic theories and ideology. There are no guarantees of success in approaching policy, but a diplomat or policy planner with historical understanding is better off than those without it.
  • An intimate understanding of a region is a better basis than formal education alone (i.e. an insightful person who has lived among the people of a region may have a more nuanced understanding than an expert with a Ph.D. in policy or area studies but no intimate experience of the place).
  • As with most other areas of human endeavor, in policy, there is nothing more dangerous than a true believer. More comprehensively, the most dangerous people are those who feel too much or too little (or not at all). The next strata are the sycophants and enablers, and the opportunists of the chaos they sow.
  • Power, interests, and irrationality are the underlying currencies of human interaction, and are drivers of conflict.  Altruism is another basis for human behavior, but it is not predominant.
  • Diplomacy should avoid the language of arrogance and stridency, moralism and self-righteousness.
  • Never underestimate the importance of the personal in foreign affairs. Snubs of national leaders in public have set back relations between countries by years. That said, George Kennan believed that “governments should deal with other governments as such, and should avoid unnecessary involvement, particularly personal involvement with their leaders.” (NYRB, 8/12/99)
  • The purpose of negotiation is not to dominate, but rather for both sides to achieve their respective goals as nearly as possible.
  • Without equality between parties, there can be no justice in negotiations, only dictation and charity (see: Thucydides, Hume).
  • Great national leaders set a sensible course and then bring the electorate around to this perspective.  This is what Roosevelt did during the run-up to the Second World War.  Lesser leaders determine which way public opinion is leaning and then get out in front of it.  The most pernicious presidents choose a bad path and then get others to follow.
  • The greatest American presidents have been generous at home and tough but cautious abroad.  This suggests that morality is a partial basis for domestic policy but not foreign affairs.  
  • Presidents and other officials making public statements about foreign policy should avoid manipulative euphemisms, especially when they are transparent and reveal cynicism.  “Private military/security contractors,” for example, are likely seen by people in a war zone as mercenaries (the way Americans see “Hessians” from our War of Independence).  Inaccurate or grossly overstated comparisons of other national leaders to Hitler should be avoided.  Such comparisons drive wrong-headed policies (the subtext being that Hitler cannot be allowed to remain in power).
  • As George Kennan often suggested, United States foreign policy should be insulated from domestic politics, domestic and foreign lobbying efforts, and parochial considerations.  Kennan suggested an independent State Department, perhaps along the lines of the Judicial Branch or the Federal Reserve.  This is because every four to eight years, U.S. foreign policy risks ideological swings (although this has not been the case since the perspective of the Washington Consensus—the Blob—took hold after the Cold War).  To further this idea, the U.S. should develop, articulate, and implement a singular long-term policy vision.  It should define the interests for which it will fight.  In order to be sustainable, it should probably be the vision of a robust regional world power, and not “the world’s sole remaining superpower.”
  • Policy planners must try to see the United States the way others see it and without illusions.
  • The death of the Westphalian paradigm has been greatly exaggerated.  The nation state is still the basis for the prevailing world order.
  • As George Kennan and other realists have noted, powerful and non-powerful nations should be afforded the same diplomatic respect.
  • The problem with the Great Game is the game itself: it is a rotten, egotistical, and ultimately self-destructive contest.  This is even truer today at the dawn of the Crises of the Environment. The United States should therefore willingly relinquish its status of predominance—leave the Great Game insofar as possible—as a matter of mature, measured policy.  Simply put, the role of superpower is intrinsically undesirable, and the Great Game is a set of infantile distractions that the world can no longer afford.  It defies rational understanding why military, foreign, or economic policymakers would want to sustain hegemonic status, given its significant liabilities and diminishing returns.  The most powerful nation on earth will always be regarded as a force of oppression if it exerts its power abroad.  A nation’s military should reflect its size and resources, rather than pride, ambition, and the realities of the past.
  • The desirability of the consolidation of the U.S. to a more manageable and sustainable status of a regional world power is self-evident and based on the singular fact that the United States occupies the best real estate on the planet; it is large enough to be self-sustaining and has peaceful neighbors.  It is thus exempt from the endless local contests of the World Island.  If Afro-Eurasia is the World Island (see: Halford MacKinder, Alfred W. McCoy), the Americas are the “other” islands.
  • The United States should not involve itself in regions that do not want its help, do not need it, or where its very presence is a destabilizing factor.
  • Hostile ideologies that cannot be defeated outright should be contained.  Islamism, like Marxism and Puritanism, is an example of revolutionary eschatology.  It is nearly impossible to sustain revolutionary fervor over time, and sometimes leaving a holistic ideology alone is the best way to defeat it. Without external fuel, such movements will burn themselves out within a few generations.  They may periodically wax and wane, but they can be waited out without engagement.
  • As much as possible, the United States should disengage from the Middle East.
  • In the Middle East, Americans will always be seen as outsiders, meddlesome interlopers, occupiers, and infidels. As long as we are in the region, there is nothing we can do to change these perceptions, and the more we try to change them, the more obvious they become and the more cynical we appear.
  • In the Middle East, as elsewhere, what people say is unimportant relative to what they do.  Always watch what people do and take their public statements with a fair degree of skepticism.
  • When the United States plays the role of the even-handed referee, and then favors one side over another, it does not fool anybody.
  • Unless it falls victim to a major internal crisis, China will be the regional hegemon of the Far East. U.S. naval dominance in the South China Sea makes about as much geographical sense as Chinese dominance in the Gulf of Mexico. Its dominance will be undermined by the crises of the environment.
  • As long as there are large powerful nations, there will be spheres of interest in which their interests trump those of outsiders.
  • Denying the existence of spheres of influence is the geopolitical equivalent of denying the existence of gravity around planets and stars in astrophysics. As with gravity, you ignore the existence of spheres of influence at your own risk.
  • Consistency is too high a standard to expect of people.

On War and Insurrection

  • Most of the world’s problems are not amenable to military solutions.
  • War must always be the policy choice of last resort.
  • In spite of the rational and quasi-rational reasons for war, conflict is a part of the human condition. Its ultimate causes are irrationality and the aggressive pursuit of perceived interests.
  • War is the manifestation of the behavior of an intrinsically aggressive animal.
  • In proximate terms, war is the result of policy failure.  Bad leaders precipitate crises; good leaders resolve them.
  • War is generally an indicator of one or more of the following: failed policy, bad leadership, an interventionist foreign policy, or a system that inhibits or precludes more effective policies.  The United States has labored under all of these for some or all of the past 60 years.
  • The stated reasons for war and policy are often not the real ones, and are never the only ones.  As George Kennan observed, reasons are misstated to make war and policy more palatable for public consumption, a practice that puts democracies at a serious disadvantage.
  • As von Clausewitz famously observes, war and policy are incarnations of the same overarching enterprise of power and interest, one that also includes economics (finance, trade, etc.) and the law.
  • War and policy can each be defined in terms of the other: war is the achieving of policy goals via hostile means; policy and diplomacy are the achieving of the same goals—are war—via civil means.
  • War and bad policy are like a disease, a syndrome: once initiated it must run its course. 
  • Before waging armed conflict, policy planners should be able to answer two questions in the affirmative: 1) Can the U.S. do anything (i.e., is the problem amenable to a military solution)?  2) Should the U.S. intervene (i.e., is this a necessary war from the perspective of vital national interests)?
  • In conflicts between the insane and the insane or the insane and the idiotic, it is best not to take sides.
  • In war, the goal must be specific, well-defined, and achievable.  Both the goal and the means of achieving it must be integrated and realistic.
  • A common mistake: the purpose of war should not be to kill as many people as possible, but to achieve a goal with as little destruction as possible.  
  • Asymmetrical wars should be avoided at all costs, especially wars against guerillas.
  • Never engage in fighting a foreign insurrection unless national survival is at stake.  It is difficult to imagine such a scenario.  
  • You cannot “save” people who do not want to be “saved.”
  • Invasions and occupations tend to destabilize regions they were intended to save or stabilize.   
  • Civil wars are by their very nature among the most bitter and destructive.  The best an outside nation can do (especially in a war with vicious ethnic factors, such as the Balkan wars of the 1990s) is to contain the conflict, preventing it from becoming a regional war, and to aid in negotiations.
  • The United States should never involve itself in a conflict that is likely to degenerate into a guerilla war unless: 1) doing so is of such overwhelming national importance that not fighting will be significantly worse than fighting, even with the chance of defeat; or 2) the insurrection is unpopular with a large majority of the people of the nation (as in Malaya in the 1950s and Bolivia in the 1960s).  Insurrections on islands, peninsulas, or other geographical features where supplies can be cut off are sometimes vulnerable to counterinsurgency measures.  If these criteria are not met, then the only alternative in fighting such wars is to kill everyone in the nation in which the insurgency is occurring.  This is obviously not how a great republic should act and would run contrary to the goal of “saving” the nation (a stated purpose for U.S. involvement in both Vietnam and Iraq).
  • If a plurality or majority of the people in a nation support an insurrection, there is no way to reasonably defeat it (unless the nation can be easily divided along ethnic-geographical lines).
  • If a guerilla flees into a crowd, and no one in the crowd turns him over, then they either support him, are afraid of him, or both. This is a most ominous sign. Likewise, if a soldier does not know whether to kill a person or help him, then the counterinsurgency (COIN) policy is hopeless.
  • Budgets should never be allowed to drive policy.  Exaggerating the danger of “adversaries” should not be allowed to drive policy or budgets.
  • If you are a liberal, what are the chances of a conservative talking you into being a conservative?  If you are a conservative, what are the chances of a liberal talking you into being a liberal?  The chances of talking a Sunni or a Shiite into adopting the worldview of an outsider are smaller still, by an order of magnitude.
  • Another mistaken lesson: transformational warfare is not more or less successful against popular guerillas than older conventional warfare doctrines.  Just as the United States did not scrap its conventional forces after the War in Vietnam, it should not abandon all of the ideas of transformational warfare. 

On Democracy and Foreign Policy

  • There is a tendency, common among Americans, to confuse or conflate democracy with liberal values and rights.  Democracy is a form of government; liberalism is a sensibility.  One is structural, the other ideological.  Although they go hand-in-hand in the nations of northwestern Europe and North America, where they originated, they are not synonyms.
  • Democracy embodies a self-evident legitimacy, but it is not a panacea against conflict.  At times it is a driver of conflict and domestic violence.
  • Mutual economic success in a region has more to do with preventing war than does form of government.
  • Policymakers in the United States should never become seduced by their own secular holy words.  The idea of “freedom” in an Islamic nation might mean the liberty to practice their religion and customs without outside interference. To a Russian, “freedom” might mean the chaos of the 1990s.
  • Governments should be judged on what they do and not on what they profess or on their structure.  All else being equal, democratic form is preferable to authoritarian regimes or governments based on holistic ideology or tribalism.  However, as many people (including George Kennan) have noted, a moderate and/or rational and stable despotic government may be preferable to a brutal, fanatical, or excessively corrupt republic.
  • Democracy and liberalism exist within a tradition of intellectual history.  They are not fungible things to be cut out and laid down when and where they are needed, like carpet.
  • The advice of dissidents and expatriates with only an ethnic connection to a region must always be treated with skepticism. Always beware of easy toadies eager to serve for money.
  • The greatest mistaken lesson the U.S. took away from the end of the Cold War was that its own ideology was the correct one, the chosen one. Had we learned the true lesson, we would have emerged with a suspicion of all ideologies, especially those preaching an “end of history” narrative.
  • A binary, either/or choice between isolationism and interventionist internationalism is a false one.  Between these extremes lies a wide and fruitful middle ground of limited internationalism, and it is within this middle ground that U.S. policy should reside.
  • An effective way to destroy a vigorous nation is to make it into an imperial power.
  • Most empires do not last for more than a century in good health (the Roman Empire lasted longer by invading rich neighbors).
  • An old saying: “When you Romanize the provinces (the world), you provincialize Rome.”
  • Economic globalization is imperialism in modern garb.
  • Areas that lend themselves to global and/or large-scale regional approaches include international law enforcement, military coalitions to oppose violations of territorial sovereignty, international health issues, and the global environment crises.  Neo-liberal globalization has proved to be a world-historical debacle resulting in greater disparities both at home and abroad.

Marx and Malthus: of the Moment and for all Time

By Michael F. Duggan

A few months ago, a friend of mine observed that as a historian and sociologist, Marx got a lot of the generalities of history as class struggle right, while Marx the optimist, the revolutionary (to the extent that he is one at all), fails utterly.1  He fares even worse in providing a basis for practical politics.

My friend’s point was that power concentrates, and when it does, the newly powerful—whether it is a feudal nobility, capitalist oligarchy, or communist nomenklatura—favor their own and distance themselves from the less powerful.  It’s what people in power do.  And if an existing system is violently overthrown, the new landlords will eventually act as badly as the old ones.  You can replace regimes, but you cannot perfect human nature, with or without holistic ideology. Bertrand Russell makes a similar argument in Why I am not a Communist.2

Between the wars, Ernest Hemingway wrote that the world at the end of the Great War was ripe for revolt and that military debacle was a prerequisite for revolution.  Thoroughly defeated countries like Russia dissolved into revolution. Partially defeated nations like Austria-Hungary, France, Germany, and Italy were not too far gone to ward off utopian revolutions from the left.3  If Marx was the man of the moment in 1917-1923, then the man of the present moment and for all time is Malthus.

I knew Marxian professors and students in grad school, but I never warmed up to Uncle Karl and his righteous heirs.4  Their brand of moral rationalism struck me as overly selective, rigid, uncompromising, and reductive, and I saw people as being neither predominantly good nor rational.  History makes no moral or rational assumptions, and, if the course of events was deterministic, it had more to do with biology or physics than with historical “laws.”5 The chaos of history was not amenable to the imposition of rational order.

When believers would speak to me about Marx and his interpreters, it was as if a key part of their understanding of history and humans—of complexity and nuance—was missing or else dismissed as marginal details.  They saw some things a little too clearly and other things not at all. They had latched on to a single current and made it dominant, even monolithic.  It was all so simple: the reduction of history and people to categories of economic class could explain everything and Marxism was the basis for an understanding by which all could be fixed.  The proof was all around us and apparent to anyone sufficiently evolved or moral enough to notice.

Indeed, their outlook was based on a noble human inclination to set things aright, but there was a real-world disconnect between the scholars and the subject. I came to realize that a lot of American intellectuals know a lot about Marx, but virtually nothing about how working people actually think; your average bartender or Madison Avenue adman/adwoman knows more about how the “masses” think than do most radicals with a Ph.D. As Hemingway observed, you shouldn’t write about the proletariat “if you don’t come from the proletariat.”

To the faithful, the underlying narrative of history was real and singular and Marx had figured it out.  Marxism was more than an interpretive frame for them, and, standing analysis on its head, history became a sequence of ideological confirmations.6  For me history was and is about irrationality and power in the pursuit of perceived interests, with occasional periods of enlightenment. Economics may be one of the most important avenues of power, but it is not the only one.  History is comprised of numerous currents—dominant, complementary, independent, countervailing, and ambiguous—and should not be used as a basis to justify a singular program limited to one or another of these.

To the Marxians, history was a great morality play of good and evil, of haves and have-nots, and, like a heroic tragedy, its outcome was inevitable. By contrast I preferred (and still prefer) sociobiological explanations of human nature that dealt with evolutionary trends tens of thousands of years older than capitalist economics and subsequent commentary.  Marxism, like theistic religions, gives itself moral authority by placing blame—something Marxists seem to relish—along with causes; sociobiology attempts to describe causation as well as the basis for why we blame and why we enjoy it.

To be fair, Marx had hit on some real human propensities (we do order ourselves in terms of class, nobody likes to be a “have-not,” and oppressed people will eventually push back), which he misconstrues as historical laws.  But they allowed him to extrapolate trends that now seem to be playing out in the globalized economy—something that closely resembles “late stage capitalism” (a term Marx never used).  Again, these are just tendencies, but Marx saw them as the material unfolding of the inevitable, as do some, but not all, of his latter-day acolytes (not all Marxians and Marxists believe in historical determinism).  I don’t like true believers, packages of beliefs, and intellectual herds left or right; every true believer, intellectual ideologue, and partisan is the bitch of an idea and usually a faction.  In spite of their radicalism, Marxists embody an orthodoxy—they are members of a tribe of the saved frequently at odds with itself. In the real world, discord is the path of Marxism as diverging branches come into conflict with each other (see: Lenin, Stalin, and Trotsky).

It is impossible to approach the world without assumptions—ideas. All analyses and observations are theory-laden. Even the most rational of us brings preexisting ideas to analysis, whether it be to the power of the ideas themselves (including rationality and skepticism) or the human authority behind the ideas. Marxists are especially wed to their ideology.

There are rackets and there is orthodoxy, and we must learn to see through the trappings of both.  Which does the greater harm is an open question. My sense is that it is the ideologues of the latter category because they are defined and driven by dogma and righteousness, and there is nothing more dangerous than a true believer (and I know that Hemingway also writes about this somewhere).

Rackets are characterized by corruption and sustained by cynicism.  A cynic can be won over or bought off; a true believer cannot.  But true eschatological zeal is difficult to sustain, much less pass down intact.  Kids roll their eyes at their parents’ earnest faith. Last year’s radicalism is this year’s cornball.  In the real world, Marxism starts off as dogma and becomes a racket.  Over generations, Marxists become just like everybody else in every other gamed-out system. The Soviet Union died of an internal collapse, a crisis of faith that had begun decades earlier. By the late 1970s there were few true believers and virtually no young believers in Marxist ideology. Thus Soviet Marxism-Leninism followed a course similar to American Puritanism.

In some respects, Malthus is even more difficult to warm up to than Marx, but he cannot be dismissed.  We must take account of him with the realization that the desire for something to not be true has no bearing on its truth.  His projections of population growth have a character that strike us as inevitable and inexorable, unfeeling and unthinking—removed from both the passions and reason.  They undermine notions of social progress and the perfectibility of human nature as effectively as Darwin’s natural selection undermines the idea of eternal values and objective morality.7

History has no underlying narrative, but the playing out of natural trends represented by numbers does.  Like Marx, Malthus latches on to a human proclivity, but unlike Marx’s, it is not a rational tendency but rather a biological impulse—reproduction—and the math of reproduction outpaces the math of production.  When An Essay on the Principle of Population came out in 1798, the human species looked like it was heading lemming-like toward a cliff.

And then we caught a break, or seemed to, from another historical current.  The Industrial Revolution and its applications in agriculture put off the doom implicit in the geometrical tables of human reproduction.  By the time Malthus’s third book, Principles of Political Economy, was issued in 1820, the world population was already more than one billion, but perhaps it did not matter.  The history of technological innovation for the next 200 years would be characterized by a litany of breakthroughs allowing more and more people to live without hunger.  For the time being, production outpaced population.

But it was a temporary reprieve, a fool’s paradise born out of a stay of execution and faith in the promise of technology.  Technological progress appeared to be the savior of Enlightenment and Victorian notions of social progress.  Still, dark realities loomed.  Modern capitalism aided by technology is based on the demonstrably false belief in endless growth on a small planet with limited resources.  Eventually population would outpace production or production would desolate the biosphere, or both. Malthus, broadly construed, was right.  He just got the timetable wrong.  Technology merely delays the inevitable.  

When I bring up Malthus in conversation, people sometimes push back with numbers showing that with current agricultural methods and resources we could feed the entire world and that getting food to those who need it would be a mere logistical detail, if the political will was there. 

This argument does nothing to undermine the Malthusian position.  It is also an academic point.  Surely the dead and malnourished don’t care about such distinctions.  It is irrelevant whether they starve for logistical reasons or a lack of political will rather than from a lack of sufficient production (“political will”—what eternal hope must reside in this term, as if it were a thing to be switched on or off like an electric light).  We must also realize the broader implications of Malthus: that the production of food (to say nothing of extractive activities) at current levels by current techniques is destroying the planet.  As the British philosopher John Gray observes, with 7.8 billion people in the world “[w]e cannot tread the world so lightly” as to not trample it.  Wherever you have human beings in numbers, you will have ecological degradation.8

The sociobiologist Edward O. Wilson writes that we surpassed the Earth’s sustainable capacity around 1978, and that as of 2002 about 1.4 planet Earths would be required to sustain its human population.9  We are well beyond that now.

At present there are about 100 million refugees worldwide from all causes.  As this number swells into the multiple hundreds of millions to a billion or more, successive waves (and then a continual tide) of desperate people will break over the rich West.  At that point, life for most people will resemble life in a Mad Max film, the way it already does in some of the poorest countries.

Some commentators younger than me have suggested that because the United States and other rich nations have disproportionately benefited from what we have extracted from the planet, and abused people in poorer regions in a variety of ways in the process, we should therefore let in people from these places to share in our fleeting wealth as the planet dies. 

If you believe that the world is too far gone to save, then this argument seems to make sense.  If you believe that the U.S. could still be a force for good in the world by standing apart from the population bomb, then it makes no sense at all.  To let in millions of people with no cultural connection to a country with high unemployment makes no economic sense and it will solve none of the big problems.  It also plays directly into the hands of neoliberal wage slavers who would love for the United States to become another low cost production zone.

Young people, too, are more open to ideas like socialism. I have heard some speak of a kind of spontaneous, democratic socialism as a structural panacea for the world’s problems: give people access to power, and they will vote for their clear-sighted interests.  In doing so, they will save the world.  And yet when you ask advocates of this position how to do it, they are short on answers beyond the necessity of doing so and a blind faith in something close to direct, borderless democracy.  There are no historical examples of open-border socialism, and examples of functioning democratic socialism involve small, homogeneous, highly-educated countries (e.g. the Scandinavian nations).  It is a harsh fact that past a certain healthy and desirable point, the more diverse and populous a nation becomes, the less governable, and certainly the less democratic, it becomes.  An enlightened social democracy is a non-starter in a large, overpopulated, polyglot nation.10

In the end, all of the existential problems that face our species other than the possibility of nuclear war are the result of human overpopulation or else are severely aggravated by it: carbon generation/climate change, the loss of habitat and biodiversity, depletion of vital resources, the plastics crisis, water crises, etc.  If the world population were 500 million instead of our current 7.8 billion, most of these issues would be manageable.

This is where we stand.  Is there a global Marxist revolution in the offing for the near future?   In the West, there is a lot of dissent in the air, but much of it is on the populist far right.  Capitalism appears to be succumbing to its own excesses, as Marx predicted, but his solutions remain as impractical as ever, and even with modern communications, there is no way to coordinate a global revolution a la Trotsky.  Even if there were, what would its supporters hope to accomplish? The violent overthrow of the existing order?  Good luck with that (the dispossessed are more likely to overwhelm it via migration), and even if they succeed, they will inherit a dying planet and not a blank canvas for a workers’ Utopia.  No, in spite of his keen historical insights, it is not the vision of Marx that will come to pass, but the apocalypse of Malthus.

The advantage of the study of history in our time is that of a superior vantage point: we see the bigger picture more clearly and fully than any previous, more optimistic time in a similar way that an older person has a fuller idea about the meaning of his or her own life than does a younger person.  We know more about the plot at the novel’s end. 

As regards global history, this broader perspective is not an attractive one. The meaning of the human story will likely not be that of the triumph of Enlightenment reason or a nineteenth-century belief in social progress.  It is not the unfolding of Hegel’s vitalism or the moral rationalism of Marx’s dialectical materialism, the pseudo-scientific monstrosity of National Socialism or the ultra-humanism of the New Left.  It is neither the self-created meaning of the existentialists nor the “end of history” dreams of globalization, but rather the nightmare of the uncontrolled spread of our species across the planet.

The larger meaning of the human experience is one of biological imbalance, an extension of the viewpoint of Malthus and fundamentally linked to his idea of population increase and the subsequent depletion of resources.  Combined with the Gaia hypothesis, we have a powerful interpretive frame for history that is more accurate than all previous models.  Increasingly, human history appears to be a catastrophic prong of natural history, a runaway project of nature and our own nature.  As our population growth continues unabated toward eight billion and beyond, and with a biomass well over 100 times larger than any other large animal that walked the planet, the human project has taken on the appearance of a natural-historical plague species responsible for the Earth’s sixth great extinction.11  We are both the asteroid and its victims.

Please do not misunderstand me, I am no misanthrope—all of my favorite people are human beings.  I love innumerable members of our kind, and our best examples are my greatest sources of inspiration.  It’s the species en masse—including myself—and what we are doing to the planet that I loathe, but cannot blame.  What is a thoughtful cancer cell or locust to do?  Sartre was wrong: Hell is not “other people,” it is the teeming swarm.

I have long hoped that humans could rise above biological determinism, rise above our own unthinking biology via cooperation, moderation, and reason.  As Hume reminds us, reason is a marginal junior partner to the passions. But there was always hope that we might curb the worst excesses of our nature and in doing so, save ourselves from becoming just another casualty of our own success.  Take a look at the world around us.  How are cooperation, moderation, and reason doing these days?

Notes

  1. An even more accurate account of class struggle is that of Brooks Adams, brother of Henry Adams.  He held that the law served the powerful but that concessions were made by them to keep those without power more or less content.  Over time, however, he believed that democracies tended to tear themselves apart.  For a description of Brooks Adams’s pessimistic view on class struggle as manifested in the law, see James Herget, American Jurisprudence, 1870-1970 (Houston, TX: Rice University Press, 1990), 131-134.
  2. Bertrand Russell, Why I am not a Communist, 133-134.
  3. See Hemingway’s December 1934 article “Old Newsman Writes: A Letter from Cuba,” reprinted in By-line Ernest Hemingway (New York: Charles Scribner’s Sons, 1967) 178-185. 
  4. Traditionally a “Marxian” is someone whose economic outlook is influenced by Marx, while a “Marxist” embraces his politics.  As originally explained to me, the former tended to be academics while the latter were actual revolutionaries.
  5. See generally, Karl Popper, The Poverty of Historicism, (London, 1957).
  6. Karl Popper relates a similar experience of Marxists finding confirmations of their interpretation of history on every page of a newspaper.  See Karl Popper, Conjectures and Refutations, 2nd ed. (New York: Basic Books, 1965), 35.
  7. T.R. Malthus, An Essay on the Principle of Population (Oxford World Classics, 1993 [1798]), 72. The ideas of Malthus gave rise to a stern and unsympathetic kind of economic and political conservatism embodied by Charles Dickens’s character of Ebenezer Scrooge. Given that better living standards and social factors like women’s rights actually decrease population, I would argue that it is possible to be a Malthusian progressive and social democrat.
  8. John Gray, Straw Dogs (New York: Farrar, Straus and Giroux, 2002, 2007), 7.
  9. Edward O. Wilson, The Future of Life (New York: Alfred A. Knopf, 2002), 27.
  10. Tony Judt, Ill Fares the Land (New York: Penguin, 2010), 69-71.
  11. On the human biomass, see Wilson, 29.

After the Riot: Green Zones within Green Zones

By Michael F. Duggan

The first thing you noticed was the desolation of the place.  No more than 20 people got off of the train at the long platform at Union Station where a year ago hundreds would have detrained.  It was before 7:00 and mostly dark.

The station was well-lit and more alive than I expected, so strange, so unchanged since last March.  The virus had shut down routine life, and for ten months I had driven to work.  But The Hill was now buttoned down. Major arteries were blocked off, making driving difficult, and I was back on the commuter train.  A half-dozen people crossed the great airless concourse of the station at 7:04.  Outside under the arches facing south to the Capitol were the perennial homeless, restless and unmasked in a pandemic.

There were clouds on the eastern horizon and the sunrise was more vivid than usual, and the light of dawn reflected in the windows of office buildings along North Capitol Street to the west of the station’s plaza.  There were no cars for blocks in either direction on Massachusetts Avenue at Columbus Circle.  The crosswalk too was deserted and terminated at the base of nine-foot-high fence-barricades, and I entered the green zone at First Street, NE. The barricades were topped with razor wire.


The checkpoint was manned by police flanked by soldiers of the Guard. I recognized the insignia of the Twenty-eighth “Keystone” Division and the yin-yang shoulder patches of the Twenty-ninth “Blue and Gray” Division—heirs to the men who went in with the first wave on Omaha Beach.  There were other patches I did not recognize.  You just presented your ID and named your building, and the police officer let you in. Two MTVs were parked back-to-back but unaligned across First Street just below Massachusetts, blocking all but a single lane of traffic between them.  But few cars came, and in spite of the sunrise it did not feel like rush hour.

Walking up The Hill, there were many more Guardsmen and women than pedestrians: young people with M4s, some of whom nodded and called me “sir” as I passed.  Constitution Avenue was blocked off from traffic below the Hill and there were no other people as I crossed adjacent to the Capitol.  As I passed the Supreme Court I could see that both it and the Capitol grounds were enclosed behind the high fence-barricades and concertina wire—green zones within green zones.  I was alone outside of this internal line of barricades and kept walking.

All was quiet.  And I went into my building, turned on my computer and drank my coffee and went to work.

January 15, 2021

The Confederate Bikini

By Michael F. Duggan

I wrote the account below a few years ago during one of the occasional dustups involving Confederate flags and related symbolism. Given the social and political rifts that remain in this country, I think it is as topical as ever.  It has nothing to do with realism or policy, and I hope that the satirical tone will not distract from the usually serious timbre of this blog. All of the events depicted are true.  

Warning: Mildly indelicate/regressive (frankly childish) humor to follow.

The Confederates have finally taken Gettysburg. 

With the week off and far too many errands to reasonably accomplish, I took the day and went to the small Pennsylvania town. It was a glorious afternoon, and the battlefield was awash in the full splendor of August flora—blood red cardinal flowers between Plum Run and Houck’s Ridge, asters throughout.  Then I went into the town.

Without getting into constitutional or legal issues, let me just say that the tourist area and the road leading into it were a bit of a jolt.  It ranged from the disappointing (e.g. Pennsylvania farmhouses flying the Battle Flag of the Army of Northern Virginia) to the brazenly in-your-face (e.g. a sign on a shop door reading “Don’t criticize what you can’t understand.” “Really?” I thought, “unreconstructed, pro-Confederate types quoting Bob Dylan?  How about a few lines from ‘Oxford Town’?”).  In an attempt to please everybody, Gettysburg has long catered to both “sides.”  But now the northern town and site of the Union’s greatest victory seems to have taken on a distinctly Southern twang.

In one sutler store, I interloped (feigning interest in various historical reproduction items) on a fifteen-minute lecture as a man, who I assume was the store owner, enlightened two earnest out-of-towners on the history of the Battle Flag of the Army of Northern Virginia (and then on every other banner he believed gave liberals moral discomfort) in excruciating detail.

He explained: “First they [meddling liberals] came for the Confederate flag.  Now they are coming for the MIA and Tea Party flags.” He then went into the eighteenth-century origins of the latter with equally stultifying minutiae.

“Why would they do that?” asked one of his audience members with innocent disappointment.     

“One word,” he replied, “Political. Correctness.”    

“One word?” I thought, looking up with a grimace bordering on an audible scoff (causing me to fumble the Made-in-India knockoff of a Model 1853 Enfield socket bayonet I had been examining). This blew my cover as a disinterested bystander. The store owner was on to me like gray on Robert E. Lee, and, like Marse Robert heading for the Potomac after Pickett’s Charge, I beat a hasty retreat.

But the apex of this anachronistic adventure into the absurd was the “Confederate Bikini” (TM?) prominently displayed on a mannequin in another storefront window.  Suffice it to say that this item—perhaps evoking the likes of Daisy Duke in a defiant “Hell no!” mood—was not in the drab gray or butternut of a Confederate uniform, but rather was composed of Confederate flags strategically positioned to hold the high ground and bottomland alike.  Somehow it just didn’t seem like a solemn celebration of “heritage,” to say nothing of its limited usefulness for reenactors; it is well documented that comparatively few Confederate soldiers actually wore bikinis on campaign, much less in combat.

Driving home I thought to myself: what would be a good sales pitch or catchphrase to market such an unusual and patently inoffensive piece of apparel?  (Regressive humor about to start)  Here are a few I came up with (feel free to come up with your own):

1). Now you can cover your “Southern regions” with your “heritage”!

2). This thong has nothing to do with slavery (but it is suggestive of a cleft in the Union).

3). How do you like these “Little Round Tops”?

4). Hell no, I’ll never forget… the sunscreen.

5). Betcha can’t “look away, look away, look away…”

I’m sure that one could come up with others about “waxing” nostalgic for The Lost Cause, “Mason-Dixon” tan lines, and how “the South will rise again,” but I will leave these to the imaginations of others.  

Apologies for any bruised sensibilities, North or South, left or right (and bottom to top).

2020, Losses

By Michael F. Duggan

On December 10, 2020, the number of Americans who had died of COVID-19 surpassed the number of United States combat deaths in the Second World War (given as 291,557). Sometime in January we will surpass U.S. deaths from all causes in World War II (405,399). In spite of news stories about overrun ICU wards and people who have lost relatives, there is not an overwhelming sense of loss among many of our people. Some defiant Americans still think that the pandemic is a hoax or else real but greatly exaggerated, a bad flu season. Even with social media memes taking shots at 2020—as if a year were a person to be insulted or shamed—it just doesn’t feel like we are living in a nation that has lost more than one-third of a million people in ten months. We are still met with happy, reassuring commercials when we turn on the television. At worst we see sympathetic pitchmen and pitchwomen referencing “these difficult times” and the “new normal.”

And then there is the official response. Even with the impressive development of several effective vaccines, and now the massive logistical efforts to distribute them, there is still no universal national mobilization like that of the war years. Lacking sufficient common sense and a national will to defeat the virus through minor sacrifices and altered behavior (we are being asked to wear a mask in public, after all, and not to die on the beaches of an island in the South Pacific), we must now rely on medical technology to save us. New Zealand lost 25 people to the virus without a vaccine. Taiwan lost 7. With almost 20 million infections and more than 340,000 deaths in this country, the Great Abdication continues.

Many of us feel the pandemic as a menacing omnipresence lingering unseen in the air we breathe. We know its scale and scope as abstractions and from news stories with nurses weeping for people who died in their arms that day, and the day before, and the day before that. But empathizing with someone who has been punched in the stomach is not the same as being punched yourself. Unless you are a front-line medical professional or know someone who died of COVID-19, it is hard to personalize the all-pervasive sense of loss felt by an increasing number of Americans. The PBS News Hour has made an admirable effort to spotlight ordinary people who have died and their families.

When all COVID and non-COVID deaths are tallied, 2020 turns out to be a dour year for the Mass Culture. With the deaths of Olivia de Havilland and Kirk Douglas, a final door seems to have closed on the Golden Age of Hollywood. Sports and entertainment took heavy hits with the deaths of Wilford Brimley, Lou Brock, Kobe Bryant, Pierre Cardin, Sean Connery, Robert Conrad, Charlie Daniels, Brian Dennehy, Whitey Ford, Buck Henry, Ian Holm, James Lipton, Rebecca Luker, Vera Lynn, Ellis Marsalis, Jr., Johnny Nash, Curly Neal, Geoffrey Palmer, Charley Pride, John Prine, Helen Reddy, Carl Reiner, Ann Reinking, Little Richard, Diana Rigg, Kenny Rogers, Tom Seaver, Jerry Stiller, Alex Trebek, Max von Sydow, Fred Willard, Bill Withers, and now Dawn Wells, to name a few. We also lost the man who broke the sound barrier.

The purview of this blog is policy and comment, and there were notable losses in government too. It seems a little odd to single out the passing of a noble-minded few during a pandemic, but Ruth Bader Ginsburg and John Lewis, both of whom comfortably bear the appositive “the great,” are gone. Gone too is the foreign policy advisor Brent Scowcroft, a great realist, public servant, and gentleman.

It was an especially bad year for independent voices. Stephen F. Cohen, Robert Fisk, and Pete Hamill are gone. You will find tributes to all three of these writers (and Brent Scowcroft) on this blog. Take note of benevolent leaders and fearless independent voices while they live. It will make you appreciate what is good, what still works, and what courage still exists in the world, while it exists in the world.

As human beings continue to encroach violently into hitherto undisturbed wild areas and come into direct contact with animals that are hosts to wide ranges of viruses, one can only wonder whether 2020 marks an opening shot of an environmental apocalypse.

I apologize for the tone of this somber posting. Indeed there are things to be optimistic about, especially the vaccines that promise to ease and perhaps defeat the pandemic, and the majority of people who do take the crisis seriously. But with an even more contagious variant of the virus abroad in the world (and now in the United States), one wonders if the future of our species will be a series of desperate efforts to stamp out pathogenic brush fires—outbreaks—as they crop up and before they become pandemics. It is possible that variants of COVID-19 will become seasonal, like the cold and flu. Vaccines with 95% effectiveness against the virus in its current form are now in the offing. The time may come when we have to face pathogens that are more problematic than the current one. It is therefore in our interest to leave alone the remaining unspoiled habitats of the world and the creatures therein.

The War Correspondent (Robert Fisk)

By Michael F. Duggan

There [are] no good guys in war… War is primarily about the total failure of the human spirit. It is about death and the inflation of death. And if you don’t realize that, you will die in a war.
-Robert Fisk

You see this terrible suffering, these monumental crimes against humanity—let’s speak frankly, that’s what we are talking about—we have all committed them, not just al-Qaeda.  We [have] all committed crimes against humanity, and if you don’t report it, people won’t know.  I always say… we can tell you what’s happening, don’t ever say no one told you.  Don’t say you didn’t know.
-Robert Fisk

It seems to me that [the role of war correspondents] at the moment is to be out there on the street, in the battlefield with soldiers, with civilians in hospitals particularly and record the suffering of ordinary people and talk to them.
-Robert Fisk1

Another important independent voice is gone, another casualty of 2020.  As a foreign correspondent, Robert Fisk was the best at what he did: covering wars from the front lines and in front of the front lines when the lines existed at all.  Except perhaps for the job of combat photographer, it is, when done right, the most dangerous calling in journalism. In a career that spanned almost 50 years, he covered conflicts in the massive expanse between Afghanistan and the Balkans as well as in Northern Ireland and North Africa.  He appears to have died of natural causes.

There are other reporters who have matched Fisk’s doggedness and physical courage.2 But few if any equal his depth and breadth of understanding.  He had an intimate knowledge of the regions he covered—he saw the big picture, saw through the stated reasons of those who waged wars, and was able to adumbrate their likely outcomes.  With a doctorate from Dublin’s Trinity College, he also had academic credentials, and he was a journalist with a greater depth of historical understanding of war than most scholars and area specialists. When it came to writing about the Middle East, Fisk wrote with passion and presented the “hot” analysis relative to Patrick Cockburn’s “cool.” At 1,109 pages, his The Great War for Civilisation: The Conquest of the Middle East is magisterial, readable, and endlessly rich in its insights.

Fisk presented the bottom-up view of conflicts, a perspective missing or glossed over in much of the reporting by the corporate media (we can hardly be surprised if most Americans are unaware that millions of people have died in the Middle East since the beginning of the U.S. war in Afghanistan in 2001).  He always named the names of perpetrators regardless of what side they were on. His frequent criticism of American and Israeli policies made him enemies both powerful and ordinary.  In 2002 he was famously threatened by the actor John Malkovich.3

Fisk was one of those legendary on-the-ground correspondents—“a historian of the present”4—who immersed himself in the Middle East and seemed to be everywhere in that troubled region from the 1970s until 2020.  He was fluent in Arabic and lived in Beirut.  He was equally conversant with the region and its cultures as well as its conflicts.  He wrote well and often with good humor (it was from him that I learned the slangy “codswallop”).

He covered the Soviet invasion of Afghanistan in 1980, the Iranian Revolution, the Iran-Iraq War, the Balkan wars of the 1990s, the Iraqi invasion of Kuwait, and both U.S. campaigns in Iraq, to name but a few.  He covered the civil wars in Lebanon, Algeria, Syria, and Libya.  He was at the massacre of Sabra and Chatila while the killing was still in progress.5 He knew that one must leave the pack in order to get to the real story, ignoring official dog-and-pony shows and prejudicial stunts like embedding.  He loathed and lamented “grad school journalism” and the “safe” “fifty-fifty journalism” of “obedient reporters,” whom he saw as willing and uncritical spokespeople for governmental agencies. He despised the “parasitic, osmotic relationship between journalists and power.”6 He covered five Israeli invasions and interviewed Bin Laden three times (the resulting fame he called his “albatross”). It is striking that in spite of his experiences he never lost his humanity—his belief in human potential—and his capacity to be shocked by the degradation of war.

Given the dangerous places and situations Fisk was frequently in, it is noteworthy that he carried a weapon only once, when a Kalashnikov was thrust into his hands before an expected ambush in the early days of the Soviet war in Afghanistan. Of this experience he wrote: “I have never since held a weapon in wartime and I hope that I shall never again. I have always cursed the journalists who wear military costumes and don helmets and play soldiers with a gun on their hip, greying over the line between reporter and combatant, making our lives ever more dangerous as armies and militias come to regard us as an extension of their enemies, a potential combatant, a military target. But I had not volunteered to travel with the Soviet army. I was not—as that repulsive expression would have it in later wars—’embedded.’ I was as much their prisoner as guest.”7

The swaggering foreign correspondents of the Big Three and cable television can at times match Fisk’s physical courage, often while dramatically inserting themselves into stories.  But there has to be more to being what Hemingway dramatically called a journalistic “carnivore” than just being on the ground and in the shit.  As with the historian, the job of the journalist is to get the story right, to tell the truth.  While we cannot doubt the conspicuous courage of big network reporters, their broader perspectives are conventional and homogenized—uninteresting—and more often than not identical to the official line. The striking footage brought to you in living color by network valor and careerism cannot touch Fisk’s insight, depth of knowledge, and moral courage to tell the truth.

It is ironic that the correspondent who saw more of war than most soldiers would die of natural causes far from the battlefields he covered. He apparently died of a stroke at St. Vincent’s Hospital in Dublin on October 30.8 He was 74.

This blurb in no way does justice to Robert Fisk and only hints at his accomplishments and integrity.  I did not know him. I only knew of him.  I hope that this minimalist treatment of a great journalist will inspire others to read the remembrances and tributes by those who did know him.  Above all, they should investigate his articles and books.9 We need his brand of driven honesty now more than ever.

Notes 

  1. All prefatory quotes are from Harry Kreisler’s interview with Fisk on University of California TV’s Conversations with History: Robert Fisk, February 2007.
  2. For example, other reporters interviewed Bin Laden on his own territory, and the number of journalists and media support workers killed in Afghanistan and Iraq since 2001 is more than three times the number killed in the Second World War.  The tally of reporters killed in Afghanistan between 2001 and the middle of 2014 is given at 28, while the number of journalists killed in Iraq between March 2003 and June 2012 is 150, along with 54 media support workers.  By contrast, 68 journalists were killed during World War Two.  Sixty-six were killed in Vietnam between 1955 and 1975.  One reason given for the large number of journalists lost in recent wars is that they were the victims of targeted killings rather than combat casualties. https://cpj.org/2013/03/iraq-war-and-news-media-a-look-inside-the-death-to/ and https://globaljournalist.org/2014/06/timeline-press-casualties-afghanistan/
  3. https://www.independent.co.uk/voices/commentators/fisk/robert-fisk-why-does-john-malkovich-want-kill-me-9204117.html       
  4. https://www.independent.co.uk/voices/robert-fisk-iraq-2003-patrick-cockburn-the-troubles-b1539514.html
  5. https://www.independent.co.uk/news/world/middle-east/forgotten-massacre-8139930.html
  6. Address to the Georgetown University Center for International and Regional Studies, State of Denial: Western Journalism and the Middle East, April 10, 2010. https://www.youtube.com/watch?v=l6ASJA7fbcE&t=1646s
  7. The Great War for Civilisation, 68.
  8. Reading that Fisk died of a stroke made me think of the head injuries he sustained on December 12, 2001, in a small Afghan village when his jeep broke down. A crowd gathered and quickly turned hostile; enraged over the Mazar-i-Sharif massacres and by B-52 strikes, they severely beat and stoned Fisk and fellow journalist Justin Huggler. Fisk was lucky to have escaped at all (with the help of an elderly Muslim cleric, who took his arm and walked him away from the mob). See The Great War for Civilisation, 871-876. https://www.independent.co.uk/news/world/asia/my-beating-refugees-symbol-hatred-and-fury-filthy-war-9179496.html
  9. https://www.counterpunch.org/2020/11/17/robert-fisk-had-true-independence-of-mind-which-is-why-he-angered-governments-and-parts-of-the-media/ and https://www.counterpunch.org/2020/11/09/the-life-of-robert-fisk/

Stephen F. Cohen

By Michael F. Duggan

During an ominous election season, it is understandable that the nation would be distracted by the death of a Supreme Court justice, especially one with the mass culture stature of Ruth Bader Ginsburg. But it is worth noting in these troubled times that one of the leading American experts on Russia and Russian history, Professor Stephen F. Cohen of NYU, died on the same day. He was an important scholar and commentator who frequently and bravely went against the grain and appears to have been ostracized—perhaps even black-listed—for it in recent years.

I corresponded with him a few times via email. Seemed like a nice guy.

For a review of his last book, War with Russia?, see the posting of July 21, 2019 on this blog.

Pete Hamill, “Those Times”

By Michael F. Duggan

And you that shall cross from shore to shore years hence are more to me, and more in my meditations, than you may suppose. -Walt Whitman

East Side, West Side, all around the town
The tots sang “ring-around-rosie,” “London Bridge is falling down”
Boys and girls together, me and Mamie O’Rourke
Tripped the light fantastic on the sidewalks of New York

Things have changed since those times, some are up in “G”
Others they are wand’rers, but they all feel just like me
They would part with all they’ve got, if could they once more walk
With their best girl and have a twirl on the sidewalks of New York. -James W. Blake

Nostalgia isn’t what it used to be. -Sam Phillips

Pete Hamill is gone.  He was as New York as stickball, egg creams, Coney Island, and the Brooklyn Bridge. Like the Dodgers, only the memory of him remains.  How do you write about a writer like Hamill? With clipped, declarative sentences, of course.  Beyond that it is hard to know where to begin and what to include and what to leave out from such a rich life.

He was born in 1935, the eldest of seven children of Northern Irish Catholic immigrants from Belfast. His father lost a leg in 1927 after a severe soccer injury turned gangrenous.  His first home was Brooklyn, that innermost of outer boroughs relative to Downtown and the only one to have had an independent identity as Manhattan’s twin city prior to the consolidation of Greater New York in 1898.  When he was a young child, his mother walked him across the Brooklyn Bridge for the first time after seeing The Wizard of Oz, and for the rest of his larger-than-life life, he regarded the towers of Lower Manhattan as the real Emerald City.  In a recent interview he characterized his upbringing as “poor” but not “impoverished” because he had a library card.

Hamill came of age during Brooklyn’s Golden Age of the 1940s and 1950s (see: Woody Allen’s Radio Days, Doris Kearns Goodwin’s Wait Till Next Year: A Memoir, the Great #42, most of the writing about “Dem Bums” during this period, and the movie Brooklyn).  As a kid I caught a fleeting glimpse of this world when we lived near my mother’s parents in Middle Village in Queens during the 1960s.  Elements of the American mid-century and before lingered there as late as the 1969 World Series or shortly thereafter.  It was part “the center of the world” and part small town manifested as neighborhoods (New York cannot be taken whole and so your neighborhood becomes your world).  My mom’s upbringing in Queens and my girlfriend’s family in Bedford-Stuyvesant and Maspeth were all a part of this mostly lost world.  But things really have changed since those times, and if there was ever a place and period in recent American history that legitimizes nostalgia as an ennobling emotion, it is this.1

Perhaps because he grew up in a plausible Halcyon Age that seems to have concentrated what was good about the United States, Hamill believed that much of what is justified as progress is actually the destruction of good things, things that worked, things that still resonate. To him, New York exemplifies the tendency toward change-for-the-sake-of-change more than any other American city and, in doing so, tramples on much of what is, or was, good about it. In this sense, he is a plain-talking conceptual cousin of Jane Jacobs, and his views are sympathetic to the ideas she presents in The Death and Life of Great American Cities.

But Hamill’s life was not all sunshine and roses. A sensitive tough guy familiar with dysfunction, the low life, and the streets, he drank too much as a young man in a culture of drinking and fought in bars. He briefly saw the inside of a jail in Mexico City that included both solitary confinement and a large, crowded room where two men fought over a young woman with bricks. His first publication was a beat-inspired poem after meeting Jack Kerouac in 1957 (in a bar, of course). He quit drinking by sheer force of will and the easily spoken rationale that “I only had to give up one drink: the next one. If I didn’t have that one drink, I’d never have another.” It worked. All of this is recounted in his unflinching 1994 memoir, A Drinking Life.

Coming out of this world of “sunshine and shadow” (and after a hitch in the navy and his stint in Mexico on the GI Bill), Hamill seems to have crossed paths with every writer and musician from the great American Mid-Century.  If A Drinking Life is a confession, then his later memoir, Downtown (2004), is a love letter to his city. His reminiscences of lower Manhattan in this book are fascinating in their insights and an education in themselves.  As a historian, Hamill is so compelling because he is non-theoretical and because he lived so much of what he describes.  He saw it with his own eyes. A streetwise realist, he “hated abstractions” and believed that ideology is “not thinking [but] a substitute for thinking” leading to snares. His knowledge is intimate and he knew most of the people about whom he writes. Like many New York writers, this local intimacy also creates cosmopolitanism out of the urban provincial.

He and his friend, competitor, and fellow “deadline artist,” Jimmy Breslin (1928-2017), were at the forefront of the New Journalism of the 1960s and ’70s and would become the newsprint voices for ordinary New Yorkers. By Breslin’s own reverse-snobbish account, they were not “journalists” (“That’s a college word”) but “reporters.” As public intellectuals for the regular Joe, they also became celebrities. With the good looks of a rugged leading man, Hamill, the poor kid from Brooklyn, dated Shirley MacLaine, Jacqueline Onassis, and Linda Ronstadt.  At one point he edited two newspapers.  He was a shoe-leather autodidact with enough grounding in the outer-borough ethos to know that fame was all bullshit, and a distraction.  “Fame was never the goal. [You] can’t write while trying to be famous,” and he never forgot where he came from. But fame followed him, and even during his lifetime, stories—legends—abounded.

In 1968 Hamill wrote a letter to Robert Kennedy, spelling out the reasons why he had to run for the presidency.  RFK launched his campaign shortly thereafter and carried Hamill’s letter with him.  Hamill was with Kennedy at the Ambassador Hotel when Kennedy was shot. The letter would haunt him for the rest of his life.

Once, in a London bar called The Ad Lib Club, John Lennon, apparently not wanting an American to sit at his table, said to Hamill, “Why don’t you get the hell out of here.”  The tough kid from Prospect Park replied, “Why don’t you make me?”  Lennon said, “What?” “I said, why don’t you try to make me leave?” Hamill answered. Lennon looked down at his drink and smiled.  Hamill sat down. Later he would call Lennon “one of the bravest human beings I know.”

He was a war correspondent who filed dispatches from Vietnam, Nicaragua, Lebanon, and Northern Ireland.  He covered civil rights in the South. He predicted what he called “the revolt of the white lower middle class” 47 years before the 2016 election.

When a powerful New York City real estate developer, who would go on to become president, took out a full-page ad calling for the death penalty for the suspects in the Central Park Jogger case, Hamill punched back, writing:

“Snarling and heartless and fraudulently tough, insisting on the virtues of stupidity, it is the epitome of blind negation. Hate was just another luxury. And Trump stood naked, revealed as the spokesman for that tiny minority of Americans who lead well-defended lives. Forget poverty and its causes, forget the collapse of the manufacturing economy, forget the degradation and squalor of millions; fry them into passivity.” The Central Park Five were later exonerated.

But what about Hamill as a writer? 

I have long had an interest in novelists who were also reporters—Defoe, Twain, Crane, London, Hemingway, Camus, Mailer.  I also admire reporters with the courage to tell the truth as they see it, even if I don’t always agree with their politics or outlook.  Some of these are Andrew and Patrick Cockburn, Robert Fisk, Chris Hedges, Diana Johnstone, and historically, George Seldes, Lincoln Steffens, Jack Reed, and Martha Gelhorn. Hamill checks both boxes.

A couple of years ago, with Philip Roth gone, I asked some friends if there were any great American writers of the old school left.  Heller, Mailer, Updike, Vidal, and Breslin were all gone.  What about Pete Hamill?  What I heard back was that Hamill was a great, two-fisted journalistic stylist with little of his own to say.  He was an impressive observer but, unlike Camus or Hemingway, there was no unique worldview or original take on things.  The consensus, more or less, was that he was an earthy writer in the tradition of other New Yorkers like Liebling, Mailer, and Miller, but not a standalone literary philosopher and commentator on life.  So I picked up a copy of his memoir about lower Manhattan, Downtown (essentially a long essay incorporating history with personal memory, an extended sonnet in prose to a city he spent his life trying to know), and found it to be rich with plenty to say about life and loss.

Hamill reads like a cross between Hemingway and a harder-edged version of Whitman.  He is more poetic than Breslin. For many people, he was New York City personified.  He combines Hemingway’s impressionistic realism with what one critic called the magical realism of Gabriel García Márquez (although otherworldly crossovers and the overlapping “in between times” are also distinctly, but not uniquely, Celtic).  He has Whitman’s love for the cacophony of the city—the urban hive—and is an observer of the first order with a reporter’s critical eye. He has Twain’s ability to see through shams, usually.

Some critics accused him of being too sentimental.  Hamill counters this in Downtown by saying that the inevitable loss and change that NYC inflicts upon its people makes them embrace a deep and profound sense of “nostalgia”—the longing for important things lost or taken.  By contrast, “sentimentality” is a superficial, often dishonest, emotion.  As he puts it:

“The New York version of nostalgia is not simply about lost buildings or their presence in the youth of the individuals who lived with them.  It involves an almost fatalistic acceptance of the permanent presence of loss.  Nothing will ever stay the same.  Tuesday turns into Wednesday and something valuable is behind you forever.  An ‘is’ becomes a ‘was.’  Whatever you have lost, you will not get it back: not that much-loved brother, not that ball club, not that splendid bar, not that place where you once went dancing with the person you later married.  Irreversible change happens so often in New York that the experience affects the character itself.  New York toughens its people against sentimentality by allowing the truer emotion of nostalgia.  Sentimentality is always about a lie.  Nostalgia is about real things gone.  Nobody mourns a lie.”

Not bad (and those of us who call Bethesda our home can certainly relate to the idea of a place being ruined by destruction in the name of “progress”).  And yet I am not sure what to make of it and of the misty-eyed Heraclitus of the Bowery who wrote it.  This gorgeous paragraph expresses the universal particularized in the cauldron and intensifier of change that is New York City. He raises love of place to a high faith of loss without the possibility of resurrection, outside of memory. But is his idea of nostalgia fresh and new in a profound way?  We all know loss.  We all mourn at the “shallow graves” of the recent past now and forever just beyond reach. Just as all matter is really stable energy, all people and things are verbs posing as nouns—physical processes destined to play themselves out, destined to succumb to the second law of thermodynamics. Heraclitus writes that reality is change, and we have all experienced the tyranny of the arrow of time and the capriciousness of life. But so what?   Why seems it so particular with thee? Perhaps it is as simple as the realization that nostalgia is a sensibility of someone who has lived a rich and memorable life, and that change is always dicey, even though, as Parmenides holds, it is inevitable. A future that is different from the past already exists and we are merely walking a path to it and through it in Einstein’s “stubbornly persistent illusion” of the present moment.

Is there a bigger idea here, like a view that human history, change, and progress are nothing more than the progress—the metastasis—of a Malthusian plague species? I don’t think so. It is all personal and proximate, didactic, and elegiac. After all, Hamill believes in and defends something like a Golden Age—the great window of opportunity and the parabolic curve called the postwar United States (again, particularized to New York), which was perfectly coterminous with Hamill’s career. And so we are left taking or leaving him on his own terms.

Hamill also metes out glimpses of his worldview in Ric Burns’s 1997 documentary series, New York.  At one point he observes there is no definitive novel about the city in the same way that any number of Dickens’ novels capture London at a certain point of its history.  He attributes this to New York’s ever-changing “daily-ness” and “a sense of surprise” and concludes that the closest thing to The Great New York City Novel is a local daily newspaper.  Perhaps this is why so much of his work is observational or descriptive rather than prescriptive, although there always seems to be a moral.  He sees the city as embodying an ineffable, kinetic chaos and the reality that “something is going to happen between here and 57th Street and you’d better be ready for it.” He adds that “although [the city] insists on routine from a lot of its people, [it] knows that routine is a utopian goal.” Thus the experience of loss and change instills the expectation of new unexpected change. This is clearly realism, and he knows as well as anybody that in order to be vital, cities must also be dynamic.

Hamill’s worldview is too deeply rooted in the 20th century, outer borough, son-of-an-immigrant ethos to be fully original.  He is an interpreter-as-exemplar of a code rather than its inventor, and yet his experience, instincts, and observations amount to flashes of insight and instances of originality (like the idea that New York is an “alloy” rather than the more traditional “melting pot” or “mixing bowl”). And for pure writing, nobody today can touch his lean style.  It is understandable that Hamill would embrace this ethos, a working-class chivalric code.  As the grandson of outer borough immigrants myself, I know that this code can be overpowering in its simple virtues of duty and decency.  What is the code?  As Hamill puts it:

“Where I came from, the rules were relatively simple.  Work. Put food on the table.  Always pay your debts.  Never cross a picket line.  Don’t look for trouble, because in New York you can always find it.  But don’t back off either.  Make certain that the old and weak are never in danger.  Vote the straight ticket.” 

Okay, so it isn’t as original as the Hemingway Code’s blend of stoicism and Epicureanism, but it’s pretty damned good and in some ways better (the drinking life he abandoned notwithstanding). And nobody ever distilled the code better than Hamill.2

But as Hamill reminds us, nothing lasts forever. The Outer Borough Code is quickly passing into history, and is probably as dead in the outer boroughs as Henry Miller’s Yorkville accent is in Yorkville. It was killed off by a cynicism born of moral complexity and material success. One of the changes in Greater New York since “those times” was the white flight of the upwardly mobile from the city during the 1970s and ’80s.  People born in Brooklyn or Queens during the 1930s, ’40s, and ’50s are now senior citizens who likely spent recent decades in places like Westchester, Nassau County, and innumerable other places across the country.  My family did.  If you “made it,” you got out of the five boroughs. Many who left now “vote the straight ticket” for the other party and embrace their own selective kind of nostalgia. The people who remained in the boroughs likely work gig jobs or else are the children of more recent arrivals with their own outlooks.  In Brooklyn, the children of suburbanites and rich professionals have rediscovered the neighborhoods and have come back to play at being consciously (and fashionably) urban, with no real connection to the area or its history or code.

I am not sure that Hamill realized that his staunch ethos may be gone for good.  For although he refers to the “sustained purgatory” of New York after the 1960s, and loathed the predatory plutocratic wealth embodied by the supertall skyscrapers of the 21st century, he also wrote that “Somehow our luck held.  We have lived long enough to see the city gather its will and energy and rise again, and its people playing by the old rules.”  Hmm.

Of course nobody is perfect, and Hamill might have been too accepting of then-Mayor Giuliani’s law-and-order crackdowns in the 1990s and the “Disneyfication” of Times Square that helped make the city safe for tourists and the highest bidders.  By contrast, as Jimmy Breslin observed in a 60 Minutes segment in 1997, “The city dies unless it’s got some dirt and a little raciness.”  Given a choice between the financial/corporate overlords and the prostitutes who used to occupy the area, he told Lesley Stahl, “I’ll take the hookers.” Hamill was well aware of his own contradictions and those of his fellow New Yorkers.

There is something fresh about the no-nonsense, see-through-the-bullshit reporting of Hamill and Breslin, something both painfully and numbingly missing in the homogenized reporting of today’s corporate media. This is to say nothing of the vile propaganda and political entertainment that so many Americans mistake for news. Division is good for business, and when the purpose of the news is profit and manipulation, then the free press is more or less dead.

I came to Hamill’s work relatively late in life.  I never met him. All history and biography is selective, and my observations are selective, haphazard, secondhand, and incomplete. But to me Hamill was somehow both a universalist and a tribalist: the city was his territory, but it included all the races therein as a single category, human beings with all of our differences. He was a progressive in the old sense of the term, denoting the tough guy looking out for the little guy, who fought for simple fairness. At times one senses an undercurrent of first-generation, working-class conservatism bleeding through his observations, in much the same way that old money matrons, new money couples, and prep school boys are seldom the good guys on Law and Order: SVU.

When I heard that Hamill was in failing health back in late January, I sent an email to the members of a book club I belong to, suggesting that we read Downtown sooner rather than later. Sometimes we are able to get authors to call in to our meetings, and the possibility of speaking with Pete Hamill would have been a coup. It now seems that I had drafted, and was about to send, a follow-up email when I heard the news that he had died. I was also rereading Downtown at the time.

With Hamill’s death, “something valuable is behind [us] forever.”  But as with other things lost, the memory of him remains. I suspect that his journalism and reminiscences will be remembered more than his fiction; if there is no Great New York Novel—if a local newspaper is its closest approximation—then it follows that a reporter must be its greatest writer.  As with the Brooklyn Dodgers and all things loved and lost, we must let him go, although we may speak his name with nostalgia. Without devolving into bathos (the kind of word Breslin hated), perhaps the lesson is that brave new generations should not be too quick to dismiss the tears of “weepy old men” when they speak of a better past and provide plausible reasons for it.

His 2003 novel, Forever, is about an Irishman who comes to Manhattan in 1740 and is granted immortality by a dead African priestess on the condition that he never leave the island.  Perhaps that Irishman was Hamill himself, because in 2016 he left his beloved Emerald City and moved back to Brooklyn, where he now rests a few feet from Boss Tweed, another larger-than-life New Yorker.

Notes

  1. Of course there is a danger in romanticizing or living in the past. Poor Richard chides that “The Golden Age was never the present age,” and Jay Gatsby could not repeat even the superficial details of his own past, much less improve upon them. All borough hagiographies should be weighed against darker and grittier accounts. Hubert Selby Jr.’s 1964 Last Exit to Brooklyn and even the artful blend of nostalgia and realism in Chazz Palminteri’s Manichaean A Bronx Tale are good tonics against uncritical outer borough myth-making (so is Hamill’s own A Drinking Life). After all, the primary purpose of history is to embrace what was good about the past and what worked, while learning what did not work and why.
  2. There is overlap between the Hemingway Code and the Outer Borough Code. Hamill states “The only unforgivable sin was self-pity” in both A Drinking Life (p. 184) and Downtown (p. 8).