View from the Center

How Things End


The historian Appian of Alexandria, in his second-century A.D. work Roman History, tells of the death of Cicero, which came about as a result of the statesman’s opposition to Mark Antony specifically and, more broadly, from his efforts to preserve the Roman Republic amid its slide toward dictatorship. Appian writes:

“It is well established that [Cicero’s] slaves were ready to fight for him bravely and faithfully, but that he ordered them to put down the litter and endure with patience whatever an adverse fate should compel. As he leaned out of the litter and offered his neck unmoved, his head was cut off. Nor did this satisfy the senseless cruelty of the soldiers. They cut off his hands, also, for the offence of having written something against Antony. Thus the head was brought to Antony and placed by his order between the two hands on the rostra, where, often as consul, often as a consular, and, that very year against Antony, he had been heard with admiration of his eloquence, the like of which no other human voice ever uttered. The people, raising their eyes bedimmed with tears, could scarcely bear the sight of his dismembered parts.”

As if that were not enough, the later historian Dio Cassius adds a further detail: Fulvia, the wife of Mark Antony, taking the decapitated head of Cicero, pulling out its tongue, and piercing it with the pins “she used in dressing her hair, all the time heaping disgusting epithets upon it.”

It was surely a most gruesome and unbecoming end for one of history’s great statesmen and philosophers.

In a March 2007 piece in Time originally entitled “The Fine Art of Dying Well” (later amended to “The Double Tragedy of a Stolen Death” for his 2013 anthology Things That Matter), Charles Krauthammer describes the “particularly unwelcome death that not just ends a life but also undoes it, indeed steals it.” After asserting that “We should all hope to die well. By that, I don’t mean in the classic Greek sense of dying heroically, as in battle. I’m suggesting a much lower standard: just not dying badly,” Krauthammer draws attention to a few cases where the circumstances of individuals’ deaths came to overshadow the content of their lives. He first mentions Kitty Genovese, suggesting that the details of this young woman’s life were eclipsed by her grisly murder and The New York Times story—now very much disputed—that 38 people were aware of the attack but declined to offer her aid. In essence, she was not to be remembered as a daughter, girlfriend, or aspiring restaurant owner; rather, her name was to become synonymous with an unfortunate aspect of human nature: the diffusion of responsibility.

Krauthammer then turns to Lou Gehrig, sensing that for the generations alive today—and for those to follow—his achievements in sport, as well as the fortitude with which he faced his diagnosis of amyotrophic lateral sclerosis (ALS), have come to be obscured by the disease that now bears his name. And, finally, Krauthammer makes mention of a particularly tragic case of a life that ended far too soon: that of Megan Kanka. Murdered at only seven years old in New Jersey in 1994, Kanka, both the person she was and the hints of who she might have become, is now perpetually obscured by her being the namesake of “Megan’s Law,” which was passed in New Jersey and subsequently in states across the nation in response to her death. Although the law, the result of her parents’ advocacy, has likely saved the lives of many other children, in the public imagination it is the circumstances of Kanka’s death that live on, rather than who this seven-year-old was.

Cicero’s death, thankfully, has not come to define who he was as a man, to overshadow the elements of his life, or to veil his manifold accomplishments. One perhaps cannot shake the sense, however, that a particularly violent or unfortunate end lessens a life in some way, especially when that life is viewed in its entirety.

Unfortunately, though, one does not get to choose the manner of his death, as Krauthammer laments. But should one be subjected to such an unenviable end, one hopes that he would—in that moment—face it with as much gracefulness as possible, such as when Cicero “offered his neck unmoved.” This sense of resolve, which characterized Cicero’s final moments, has a number of modern parallels. One involves the 2015 case of the extraordinary defiance of the 21 Christians murdered on a Libyan beach by ISIS. Of them, 20 were Egyptian Coptic Christians, and the 21st, Matthew Ayariga, was from West Africa. According to some reports, Ayariga was not himself a Christian but, upon seeing the bravery of his fellow captives, said, “Their God is my God,” knowing full well that an affirmation to this effect would result in his own beheading. The forbearance with which these Christians accepted their ends is admirable. But though we might herald their courage, we cannot help but reflect on how much better it would have been had they died peacefully decades later—or, in the case of Ayariga, had he lived much longer, humbly practicing his newfound faith. Yet when it comes to both Cicero and Ayariga, the circumstances of their demise were not haphazard or random; rather, their respective ends came about as the direct result of their living courageously and well.

This idea of how things end, to be clear, is not limited to the lives of individual human beings; it applies similarly to any number of things, from human creations to the extinction of, say, a certain animal species. It can apply to entire societies or nations, as President Richard Nixon famously sensed when reflecting on the still-standing columns of Greece and Rome, forlorn reminders of once-great civilizations brought low. And he wondered aloud in 1971—pessimism flowing freely—whether the United States would soon follow.

One iconic symbol of the United States and what it has represented for the past century or so of world history has already met its end: the World Trade Center, which, at its inception, was conceived to be a tangible, living testament to the possibility of peace attained through trade and international cooperation. Although this idea of free trade as a guarantor of non-aggression also undergirded the creation of the European Coal and Steel Community (the precursor to the modern European Union), this optimism has been proven by time to be ultimately quixotic, both by the manner in which the towers would later fall and by the dark underbelly of unfettered globalization. But none of that subtracts from the noteworthiness of the towers: what they represented architecturally in their blend of modern functionality and homage to Gothic roots (even if some critics chose to dismiss their aesthetic as amounting to little more than two massive “glass and metal filing cabinets”), the commerce that took place within them, or the additional purpose they served in making good on long-standing promises to revitalize Lower Manhattan, which had been losing ground to Midtown.

In the time since September 11, 2001, however, most media and literary attention has focused on the acts of terrorism that destroyed the towers, interspersed at times with welcome stories of the heroism displayed both on that day and in the weeks and months that followed. But the story of the towers and of the entire World Trade Center complex is, of course, more than that of a single Tuesday morning. The towers appeared in close to 1,000 films, from The French Connection, where, it being 1971, they are still shown under construction, to the more light-hearted Home Alone 2. Television shows that were still running when the towers fell and had featured them in their opening credits, such as The Sopranos, found it necessary to edit them out, believing that to continue featuring the towers would be “inappropriate.” Beyond what they represented in popular culture, embedded within their broader story were other, smaller component stories. Some of them were tragedies, catastrophes nested within the larger catastrophe (if one does, indeed, choose to see the whole World Trade Center chapter as ultimately a tragic one), such as the 60 construction workers who died during the erection of the complex and the 1993 bombing of the World Trade Center, which claimed six further lives. Then there were the component stories of the 70,000 tourists and commuters who flowed in and out of the complex each day, the equivalent of the population of Wilmington, Delaware. And the towers did fall on a Tuesday—right in the middle of the week, right at the height of their usefulness.

Indeed, as commentators, museum placards, and political officeholders alike have all sensed, they were targeted for what they represented: American financial influence and a global order—from trade on down—upheld by the stability provided by the United States of America. So, like some of the individuals already discussed, their end came about as a direct result of what they were. Given the inescapability of entropy, the unrelenting slide toward disorder, they would have fallen one way or another; over the next century or perhaps two, they would have gradually slipped into disrepair, to be replaced eventually by something else, albeit without the all too real loss of life. But their uniquely violent end somehow makes their demise more tragic than if it had come about through the unyielding forces of natural decay.

So the essence of the story of the World Trade Center, as we look upon it today, lies in the struggle to reconcile what it was and what it represented with the circumstances of its annihilation. This is perhaps told nowhere better than in Mordicai Gerstein’s 2003 children’s book The Man Who Walked Between the Towers, which deservedly won the 2004 Caldecott Medal. The book, which tells the story of Philippe Petit’s 1974 tightrope walk between the towers, culminates in that haunting penultimate page, which reminds us—as if we could possibly have forgotten—“Now the towers are gone.” This is followed by an urging to remember them “as if imprinted on the sky.”

I used to pose a question to friends of mine: Does the fact that David Foster Wallace committed suicide barely three years after his exemplary May 2005 speech at Kenyon College take away from the content of his remarks? The speech, which contains many memorable gems, has been hailed as among the greatest commencement speeches of all time. (1) Much of Wallace’s speech, however, focused on offering practical advice about how to live well; he even suggests that following some of his favored tidbits of wisdom might prevent one from becoming one of “the adults who commit[s] suicide with firearms.” So does the fact that Wallace himself, a short while later, committed suicide lessen or cheapen the value of his advice, or even subtract from his body of work? On the former point, I would argue that surely it does; less so on the latter. But to reflect on the final years and ultimate death of Wallace is to disabuse oneself forever of any naive, adolescent romanticization of the artistic type lost too soon, from the 27 Club to Leopardi.

The endings of things—and the declines that precede them—are not always pleasant. There is a certain type of decline, however, that is particularly unbecoming, wherein one becomes something less than he was at the height of his faculties, when the approach path is discordant with the more esteemed hours. This applies far more to Wallace than to Cicero—or to the latter’s inanimate counterpart, the World Trade Center. The final months of Wallace’s life, though one wonders where the man ends and his mental ailments begin, were certainly far from ideal. His trips for psychiatric treatment, though a mainstay of much of his life, became more frequent, and the treatments more invasive, while his artistic output was much diminished. In the minds of some observers, just as the World Trade Center might be the inorganic analog to Cicero, the United States’ ongoing ingestion of society-wide solvent—punctuated by ultimately half-hearted attempts at remedy—might accord with Wallace’s closing days.

One thinks also of Howard Hughes, the man once considered the embodiment of glamor and daring in pre-war America. Yet his later years were as ignominious as his early days were triumphant. An itinerant of sorts, he roamed from one tourist city to the next; his digs may have been more luxurious than, say, the “one-night cheap hotels/And sawdust restaurants with oyster-shells” of which Eliot spoke, but this does not obscure his lack of that sense of rootedness that characterizes a healthy life or, even more so, a healthy retirement. His appearance similarly suffered, grisly and unbecoming details very much included: his acquired disinterest in trimming his fingernails, or a diet consisting of two chief but seemingly opposite foods, chocolate bars and chicken. Although to recall and list these particulars might seem unnecessary, even bordering on the voyeuristic, they do capture how extensive a fall from grace it must have been: for Hughes himself; for those who knew him, including the aides who watched him decay before their eyes; and for all of us who might mourn what can become of a once-incomparable man in so brief a period of time. Hughes’ descent, in fact, must have been so complete that even an oblivious Willy Loman would have struggled to misinterpret Hughes’ hotel-bound final years the way he did those of Dave Singleman, the man he so unwisely chose as a role model.

So do Hughes’ final years invalidate in some way his flying records, his films, and what he must have represented to so many? Or, for a more pedestrian example, does a particularly explosive and acrimonious end to a romantic relationship nullify its finer moments; does a caustic parting of ways render valueless all the good once shared?

As Joshua Foa Dienstag reminds us, the essential tragedy of being human (or being anything, for that matter) stems from the passage of time. Implicit in this is that time leads all things—whether individuals, states, stars and planets themselves, or any form of order, really—inexorably toward their destruction. (2) Jerry Seinfeld, one might remember, offered his own more light-hearted articulation of the same idea, consoling George Costanza about the latter’s dating ordeals: “Well, you’ve only got another 50 years or so to go, and it’ll all be over.” So how does one square himself with such a reality? Does one take up the Wittgensteinian notion that “If we take eternity to mean not infinite temporal duration but timelessness, then eternal life belongs to those who live in the present”? Or perhaps one might prefer to look to Bertrand Russell, who writes in The Conquest of Happiness, “There can be no value in the whole unless there is value in the parts,” and interpret this to suggest that it is a matter of narrowing one’s gaze, focusing on the component parts rather than assuming a frame that captures the entire duration. For the rest of us, there is always God.

These days, decline is apparently on many minds. Perhaps President Nixon will be proven correct, and America’s apex is already behind us, as we face an overdetermined descent, with degrowth, foreign competition, and cultural degradation all contributing. Perhaps continued mass migration will make America unrecognizable, including by supplanting the country’s long-standing culture with a mosaic of imported and often contradictory commitments from each corner of the globe. Perhaps nature itself will reclaim us. If any of these scenarios should come to pass, or if they are already very much afoot, would the manner of American decline render less impressive the nation’s one-time, unparalleled achievements?

As one recalls that the once-favored steady-state model of the universe failed, in the end, to describe it accurately (flux being the norm, and lasting stasis elusive), he begins to let go of that urge—as comforting as it might temporarily be—to see things in just one particular frame. And yet, unfailingly, the sensation wells up that it is all but impossible not to view things as a whole.



  1. As I have discussed previously, I consider the commencement speech to be its own literary micro-genre and one well worthy of attention. 
  2. I think of NASA engineer and author Homer Hickam recounting a conversation he had had with his father: “‘What’s the hardest thing you ever learned, Dad?’ I asked abruptly. He leaned on the rail of the stoop. ‘Entropy,’ he said finally. I didn’t understand the word and he knew it. ‘Entropy is the tendency of everything to move toward confusion and disorder as time passes,’ he explained. ‘It’s part of the first law of thermodynamics.’ I must have looked blank. ‘No matter how perfect the thing,’ he continued patiently, ‘the moment it’s created it begins to be destroyed.’ ‘Why was that so hard to learn?’ He smiled. ‘Because even though I know it to be true, I don’t want it to be true. I hate that it’s true. I just can’t imagine,’ he concluded heading back inside his office, ‘what God was thinking.'”

Erich J. Prince is the editor-in-chief at Merion West. With a background in journalism and media criticism, he has contributed to newspapers such as The Philadelphia Inquirer and The News & Observer, as well as online outlets including Quillette and The Hill. Erich has also spoken at conferences and events on issues related to gangs, crime, and policing. He studied political science at Yale University.
