I am sure most of you reading this post are already fully aware of the dire situation at the University of Florida this spring in terms of the coerced face-to-face teaching demanded by the university’s higher administration. This has become a national scandal and there have been numerous major media reports, of which the following list represents only a sample:
As these stories all make clear, the response of our university to the pandemic differs not only from that of universities in other states—in the last week, the University of Michigan has suspended its sports programs and then issued a stay-at-home order for its entire campus—but also from that of other campuses in our state system. Indeed, as the Orlando Sentinel reports, “The start of the semester has gone smoother at Florida State University, where in-person classes resumed on Jan. 19. Just over half of the school’s course sections are taught face-to-face this spring, though many of them are small, said Matthew Lata, a professor in the College of Music who serves as the president of the faculty union there. ‘So far, I’ve had almost no complaints,’ Lata said, adding his administration’s approach had been more conciliatory than UF’s. ‘It’s a much happier shop here in Tallahassee than it is in Gainesville.’”
Despite its claims to the contrary, our university is pursuing these policies in disregard of the recommendations of national and international medical experts and in the face of skyrocketing numbers of infections in Florida and in Alachua County, the county where UF is located, currently, according to The New York Times, at an “extremely high risk” level. As a result of these policies, the university is endangering the health and even the lives of many, including the most vulnerable populations on campus (untenured faculty, graduate students, staff, and campus support workers) as well as elderly community residents and those who have few options but to serve in the entertainment and service sectors of the local economy (and this, keep in mind, as the January 23 story in the Gainesville Sun bears out, is occurring in a state where Governor DeSantis has mandated 100% bar capacity and local municipalities are barred from enforcing compliance with CDC guidelines). One person who has already paid the price is the university’s basketball star, Keyontae Johnson, whose NBA career may very well be in jeopardy because of his infection and subsequent collapse in the opening moments of a game at Florida State on December 12. (This recent story underscores the fact that the university continues to stonewall any information on the cause of his collapse, hiding behind “student privacy” concerns.)
However, one important point that all these excellent reports tend to pass over is the fact that these decisions, policies, and actions have their roots in larger contexts, both in recent developments and in transformations of higher education that have been underway for decades. A half century ago, Fredric Jameson observed that ideology is never simply a matter of content—of the ideas, beliefs, and values we hold in our heads—but rather of the very forms of thought we use to understand the world: “The dominant ideology of the Western countries is clearly that Anglo-American empirical realism for which all dialectical thinking represents a threat, and whose mission is essentially to serve as a check on social consciousness . . . The method of such thinking, in its various forms and guises, consists in separating reality into airtight compartments, carefully distinguishing the political from the economic, the legal from the political, the sociological from the historical, so that the full implications of any given problem can never come into view; and in limiting all statements to the discrete and the immediately verifiable, in order to rule out any speculative and totalizing thought which might lead to a vision of social life as a whole.” Thus, in order to more fully understand the current crisis, and even more significantly not to repeat the errors of the past, we need to make these connections explicit.
UF’s Provost Joseph Glover has stated that the reason for pursuing such a plan of action is that “Our [his] first commitment is to our students: to provide instruction in the format they requested, whether in person or online.” Whether or not this McDonald’s customer-first rationale is the true one—and if so, it says a great deal about the attitudes the university currently has toward its faculty, workers, and their families—the situation is far more complex. First of all, such declarations on the part of the university administration conveniently provide ample cover for Florida’s Trump-acolyte governor and Scott Atlas-endorsed “herd immunity” advocate, Ron DeSantis. (I touched on DeSantis’s response to the Covid crisis and its consequences for the university in a blog post last October.) Moreover, as Theodor Adorno would put it, “it is hardly an accident” that the Covid pandemic has led to the suspension of hiring in many departments and programs, already cut to the bone by more than a decade of prior cuts and scanty hiring; to the Board of Trustees’ approval last fall of expansive new furlough powers for the administration; and to likely significant budget cuts later this spring. These changes parallel those in many other Republican-dominated states, including the recent decision in Kansas (what’s the matter with Kansas?) to use the crisis as an excuse to suspend tenure protections.
However, the University of Florida has made amply clear that the one exception to any contemplated cutbacks is its vaunted AI initiative, launched in earnest this past fall. The “intent” of the initiative, the university’s website proclaims, is to “use this initiative to create a model for AI workforce development that can serve as a template for other colleges and universities in Florida and across the U.S.” No surprise then, in the midst of the Covid emergency, the university recently sent its faculty the cheery news that a new “Data Science and Information Technology” building was underway at the center of our “historic campus”—and this after other already approved projects, such as a long-planned and desperately needed new music building, have been suspended. The building will be named after the alumnus-donor who is also the owner of the company whose proprietary technology underwrites this entire initiative. Moreover, last summer, the university announced the opening of a new luxury “boutique hotel,” one that just happened to be funded by “UF reserve funds, or money left over from previous budgets”: that is, money not spent on hires and other needed infrastructural developments.
As my colleague and our former union president Kim Emery argues in a brilliant essay from a decade ago, these current policies and actions are very much in line with a longer and ongoing program by university administrators, in conjunction with state Republicans from Jeb Bush to the present, to strategically use what Naomi Klein famously terms the “shock doctrine” and Emery calls “management by crisis”: the deployment of “events”—manufactured ones, as was the case in 2006 in the College of Liberal Arts and Sciences, wherein, as Emery notes, the university first “‘uncovered’ a significant fiscal deficit” and then used it to announce a “Five-Year Plan,” which included “combining some departments and reducing others” as well as removing the “Department Chairs of Math and English”; and actual ones, as in the economic collapse of 2008 and the Covid pandemic in the present—as the means of at once eroding faculty governance, destroying unions and other collective organizations statewide, and reorganizing higher education in Florida. This is in turn part and parcel of a larger neoconservative push that extends back to the early 1980s, which, as the historian Alan Taylor argues in an outstanding essay published on the eve of our unrelenting Trump “error,” worked to undermine the post-World War II project of education for democratic citizenship, or what Taylor refers to as “political goods,” and to replace it with an education system focused exclusively on “economic goods,” or individual and institutional financial rewards. (I talk about Taylor’s essay and the consequences of these changes in more detail in the first chapter of my book Invoking Hope.)
The cynical use of the Covid crisis to justify such a neoliberal remaking of the American university is laid bare in an opinion piece published last summer in Inside Higher Ed where the authors argue for the need to use “this moment to move toward needed transformation” instead of “the industry” (!!) reverting “to an old playbook, where cuts are neither strategic nor grounded in important goals of creating a sustainable business model that is aligned with the institution’s strategic vision.” Not surprisingly, their recommendations include the shift from “Budget balancing (‘Do I have enough?’) to return on investment (‘What do I get from what I have?’)” and establishing “clear lines of authority. People need to know who will make final decisions about strategic priorities and how resources will be aligned with these priorities. Good leaders will establish who the deciders are and how stakeholders [sic] can provide input.”
In years past, the university’s success in foisting these changes was in large part based on the tried-and-true strategy of divide and conquer. In 2006, for example, the university succeeded in pitting against each other departments in the college and individual faculty members within the same departments, by appealing to individual grievances, resentments, and narrow self-interest. The same strategy worked well in a number of other situations in the decade to follow. In the current emergency, the aim is clearly to reproduce this successful strategy by setting undergraduate students against the faculty, graduate students, and university support staff.
In the current moment, such an approach happily thus far seems to be failing. No one among the targeted communities sees any benefits from the situation; undergraduates have made their displeasure clear; and a number of departments have expressed an unwillingness to play along. Moreover, as had been the case in past challenges, UF’s faculty and graduate student unions have marshalled an effective challenge to these policies. A number of other new actions have also been started. Recently, a MoveOn boycott initiative was launched after a viral thread exploded. The university administration remains deeply invested in institutional rankings and its “Preeminence Initiative”: what such preeminence means to Governor DeSantis is made evident in this 2019 story about UF’s rise to #7 in The U.S. News and World Report’s ranking of public universities, complete with a photo-op of him receiving a football jersey from thankful administrators. Making them aware that their current actions will do long-term damage to the university’s reputation and may even lower UF’s rankings will surely have an impact, and international letter-writing campaigns are underway. (The email addresses of the president and provost are readily available on the university’s website.)
All of this makes evident a key lesson of this challenging moment. Whatever the outcome of the current crisis, it is vitally important that all of us in the university community—as well as in university communities across the nation and our planet—understand that what has happened over the last few months is not a set of isolated occurrences, ill-conceived and hasty reactions to an unexpected series of events, but rather the opportunistic use of one more in what now seems to be an endless series of crises. As such, the next crisis will produce a similar response, and the only possible hope we have lies in a shared, collective refusal of such a deeply anti-democratic vision of the university and higher education.
There is a scene early in James Joyce’s A Portrait of the Artist as a Young Man (1916) that resonates in perhaps unexpected ways with the contemporary nightmare from which we are trying to awake. The scene concerns the early schooling of Joyce’s semi-autobiographical would-be artist Stephen Dedalus, in the moment of his short-lived attendance at the elite Irish Catholic boarding school, Clongowes Woods College in County Kildare. (Joyce, born in 1882, attended the school from ages 6-9 but was forced to leave due to the rapidly declining fortunes of his once middle-class Irish Catholic family.) Joyce writes:
It was the hour for sums. Father Arnall wrote a hard sum on the board and then said:
—Now then, who will win? Go ahead, York! Go ahead, Lancaster!
Stephen tried his best but the sum was too hard and he felt confused. The little silk badge with the white rose on it that was pinned on the breast of his jacket began to flutter. He was no good at sums but he tried his best so that York might not lose. Father Arnall’s face looked very black but he was not in a wax: he was laughing. Then Jack Lawton cracked his fingers and Father Arnall looked at his copybook and said:
—Right. Bravo Lancaster! The red rose wins. Come on now, York! Forge ahead!
This leads Stephen to meditate on the color of roses:
White roses and red roses: those were beautiful colours to think of. And the cards for first place and third place were beautiful colours too: pink and cream and lavender. Lavender and cream and pink roses were beautiful to think of. Perhaps a wild rose might be like those colours and he remembered the song about the wild rose blossoms on the little green place. But you could not have a green rose. But perhaps somewhere in the world you could.
The two parties Father Arnall refers to here, the Yorks and the Lancasters, were competing factions of the fifteenth-century English Plantagenet royal house who engaged in a 32-year-long conflict that would come to be known as the Wars of the Roses, after the white and red rose heraldic emblems of the two sides. As in Stephen’s classroom “game,” the Lancasters proved victorious. Their leader, Henry Tudor, assumed the English throne as Henry VII and established a dynasty that would rule the country for more than a century. During the Wars, Ireland sided with the Yorks, and as a result Henry VII and even more so his two successors, Henry VIII and Elizabeth I, would increasingly consolidate and formalize English imperial rule over Ireland, which would continue until 1922.
Although neither young Stephen nor the reader yet realizes it, this short scene brilliantly encapsulates two dimensions of what it means to live in an occupied land. First, Joyce shows the ways that even small gestures such as the choice of names for a children’s game remind the Irish people of both their past defeats and the impossibility of anything like the autonomous Irish history figured by the green rose of which the young Stephen dreams. Second, Joyce subtly demonstrates the myriad ways that in any situation of occupation, all aspects of everyday life are politicized. This becomes more and more evident as the novel progresses: in the next section of the first chapter, the family’s Christmas dinner explodes into passionate argument over the sad fate of the recently deceased Irish leader, Charles Stewart Parnell; and the chapter comes to its climax in an Irish landscape whose smells, sights, and sounds—the fields surrounding the elite Anglo-Irish Barton family’s Straffan House and “the sound of the cricket bats: pick, pack, pock, puck: like drops of water in a fountain falling softly in the brimming bowl”—all “mythologize,” as Roland Barthes puts it, the British imperial presence. The “very principle of myth,” Barthes points out, is to “transform history into nature,” in this case making British rule appear to its subjects as an immutable fact of the world, a reign upon which the sun will never set.
Even in the novel’s final chapter, the few careers available to Stephen’s classmates at Dublin’s Jesuit-administered University College—“Did you hear the results of the exams? he asked. Griffin was plucked, Halpin and O’Flynn are through the home civil. Moonan got fifth place in the Indian. O’Shaughnessy got fourteenth”—involve participating in British imperial rule both at home and abroad. It is this situation that Stephen finds intolerable—“My ancestors threw off their language and took another, Stephen said. They allowed a handful of foreigners to subject them. Do you fancy I am going to pay in my own life and person debts they made?”—and in the end he decides to leave Ireland to develop his art (although, as we learn in the novel’s “sequel” Ulysses (1922), Stephen’s initial self-imposed exile proves to be short-lived).
I have taught Joyce’s A Portrait numerous times over the course of the last 26 years and every time I do so I learn new things from this magnificent novel. What struck me especially forcefully this fall is how much the situation of late-nineteenth and early-twentieth century Ireland resembles that of the United States in 2020. Although we are not ruled by another nation—or at least, not explicitly— in both situations nearly every aspect of daily life has been politicized to an extraordinary degree.
Even in the most equitable of democratic societies, all our actions are ultimately political in that they involve the reproduction of or challenge to a certain way of life. However, in a situation of occupation and direct domination, all the mediations, nuances, and complexities dissolve away, so that every act, every gesture—the games we play, the entertainments we enjoy, the careers we choose—becomes understood as immediately involved in supporting or challenging the ruling powers. And there are only two choices available. These leaders endlessly remind the ruled: you are either with us or you are our enemy.
Those who deny this reality and pretend there still exists some distance between seemingly inconsequential everyday activities and political life often prove themselves to be among the most complicit. This is very much the case for Joyce’s earlier great character, from his story “The Dead,” Gabriel Conroy (who is also something of a glimpse of an alternate future for Joyce himself had he decided to forgo his vocation as a writer and remained in Ireland). After Gabriel is confronted for producing a literary column for a conservative British-supporting newspaper, he responds in this way: “He continued blinking his eyes and trying to smile and murmured lamely that he saw nothing political in writing reviews of books.”
Such a politicization of everyday life has become more and more the case in the four years of the undemocratic and increasingly explicitly authoritarian regime of Donald Trump. Numerous commentators have pointed out the ways that Trump, along with his supporters on the national and state levels, castigates everyone from career government officials to research scientists—from those in the state department to agriculture, from climate scientists to epidemiologists—as always already politically partisan. The result is that any statement they make or position they take is understood as either supporting or, as is far more often the case, challenging Trump’s rule. Others note that even gestures such as holiday greetings or displays of the nation’s flag have been transformed into statements of support for the Trump regime. And of course, over the past eight months we have all witnessed how practices recommended by the overwhelming majority of the medical community, such as mask wearing and adequate social distancing, similarly have been transformed into political statements. In such an untenable situation, today as much as in Joyce’s Ireland, the politicization of everyday life both increases stress and anxiety and raises the likelihood of violent clashes. (An even more explicit, quantitatively but not qualitatively different, example of the politicization of everyday life is on display in Glenway Wescott’s superb novel set in Nazi-occupied Greece, Apartment in Athens.)
This fall, this has also become true in the case of one of my favorite pastimes, NCAA football (or “hand-egg,” as my soccer-playing son calls it). While I have been since my teenage years a dedicated follower of the game and was even a decent football player at my small California parochial high school, the University of Florida is the first institution I have been involved with that houses a major football program. Cal State Northridge was a Division II program while I was an undergraduate there, and while Duke was fun to watch during my first three years in graduate school—when the head coach was the Florida Gators’ former quarterback and first Heisman Trophy winner, Steve Spurrier—football always remained a very distant second to men’s basketball. (Duke played in the Final Four each of the five years I resided in Durham and won National Championships in my final two years there.)
I arrived in Gainesville in the fall of 1994, four years after Spurrier left Duke to assume head coaching duties at his alma mater and launch what was until that time the most successful run in Florida football history, culminating in a victory over state rival Florida State in the 1996 National Championship game. This was the first of what would turn out to be three national championships in my years here (the Gators would win the National Championship again in 2006 and 2008 under Coach Urban Meyer). I quickly became a passionate follower of the team, something that spread to my brothers residing in the Midwest and California: they would come down to Florida for big games and would even participate in UF alumni gatherings in their home towns.
I have always recognized the contradictions involved in being a fan of big-time college football. The transformation of intercollegiate athletics into an extraordinarily profitable big business enterprise has been central to the neoliberal restructuring of our nation’s once great public educational institutions—whose mission in the years following the Second World War, as the historian Alan Taylor observes, was to produce both economic goods and a capable democratic citizenry—into privatized entertainment and patent-generating research complexes, which are reliant on the exploitation of the cheap labor of student-athletes, graduate students, support staff, and a faculty increasingly made up of what Herb Childress terms “the adjunct underclass.”
This contradiction came home to me in the 2006 national championship season. That fall, I had the extraordinary opportunity to work as part of a sideline camera crew for home games. This gave me the chance to take on-the-field photos of the action (a handful of which I have posted here) and I even momentarily appear on camera during CBS’s post-game interview with defensive end Jarvis Moss after he had blocked the game-winning field goal attempt by an upstart South Carolina team then being coached by Spurrier. Moss’s heroic efforts propelled the team to subsequent victories in the Southeastern Conference (SEC) Championship and ultimately the National Championship games.
However, my exuberance over the football team’s success that season was dampened by the fact that in the same fall semester our university’s upper administration used the pretext of an economic emergency to launch a devastating assault on the humanities at the University of Florida, beginning a series of cuts and restructurings that in the subsequent decade would halve the number of faculty members in our department and greatly reduce graduate student admissions.
Nevertheless, in 2006 the links between the fortunes of the football program, both on the field and off, and the political interests behind the restructuring of higher education in our state and nation were indirect and not evident to most people. (For a powerful study of these changes, I recommend this essay by my colleague Kim Emery. In another early sign of the changing realities, UF’s President at that time, Bernie Machen, would in early 2008 take the “unusual” step of endorsing Senator John McCain in his unsuccessful bid for the presidency.) Moreover, there have been and continue to be real rewards for my fandom, as my support of the Gators has enabled me to establish deep and long-lasting friendships with some extraordinary people in our community that I otherwise may have never met.
During the 2008 presidential elections and shortly before UF played Alabama for the SEC Championship, my then five-year-old son asked me, “Why don’t we hate Alabama fans like we hate Republicans?” His question took me aback and I replied that we didn’t hate either group—indeed, even some members of his extended family supported Republican candidates. But I also pointed out that I thought sports and political partisanship differed substantially, in that the former was for fun and should be forgotten as soon as the game was over, whereas the latter has significant and long-lasting consequences for the lives of many people.
In 2020, thanks to the Trump administration and its disastrous response to the coronavirus pandemic, among its many other failures, this has changed. Early in the summer, when the major college football conferences were weighing whether or not to hold the season, the Trump administration, along with other political leaders and Trump-supporting governors, especially in southern states like Florida, put pressure on universities to continue on despite the recommendations of medical experts and the significant risks to players, fans, and the surrounding communities. These political leaders felt that big-time college sports were popular among their supporters and that by demanding that things go on as usual, they could distract from the realities of the pandemic, help turn around the economic collapse their failed efforts had produced, and thereby increase their chances of remaining in power.
A recent study by the Brookings Institution concludes that “Partisan affiliation is often the strongest single predictor of behavior and attitudes about COVID-19, even more powerful than local infection rates or demographic characteristics, such as age and health status.” The same is now the case for college football, as support for playing the game as usual has become a sign of support for Trump. Florida’s current governor Ron DeSantis is in his own words a “Pitbull Trump Defender,” and even ran a commercial during his 2018 campaign showing him prompting his young son to build Trump’s wall and glory in Trump’s reality TV slogan, “You’re fired!” (Ironically, last Friday, Trump told a crowd of his Florida supporters that if DeSantis fails to deliver the state to him in the upcoming election, “I’ll fire him somehow. I’m going to fire him. I will find a way.”) In the days leading up to the Gators’ season opener, DeSantis thus not unexpectedly began to rail against what he termed the state universities’ “draconian” public health policies and threatened to force through a student bill of rights “that would preclude state universities from taking actions against students who are enjoying themselves.” He followed up on this threat of a “Bill of Rights to Party” by issuing an executive order allowing bars and restaurants to open up at 100% capacity and limiting local municipalities’ ability to do anything to curb them. Furthermore, in an act of “executive grace,” he suspended all outstanding fines and penalties that had been applied against individuals who had violated pandemic-related mandates such as mask and social distancing requirements. DeSantis opined, “I think we need to get away from trying to penalize people for social distancing. All these fines we’re going to hold in abeyance and hope that we can move forward in a way that’s more collaborative.”
The consequences of these decisions were amply on display during UF’s October 3 home opener. Although there was enforcement of social distancing and mask-wearing rules on UF’s campus and in its stadium, in the neighborhoods surrounding the campus it was pretty much business as usual, with street-side tailgating, open-container drinking (something not even allowed in normal seasons), and inadequately socially distanced game-watching parties. There were reports of packed local bars, even though the kick-off was at noon. There was also a rash of thefts of yard signs supporting Joe Biden’s campaign. When I contacted local authorities to report violations in our own neighborhood—including an adult tailgater urinating on a neighbor’s fence—the official with whom I spoke said that while they were deeply sympathetic and supportive, the governor had effectively tied their hands when it came to efforts to protect our community.
The following week, DeSantis upped the ante by declaring that all sports stadiums statewide had the right to operate at full capacity. The consequences of this new directive came to a head in the aftermath of the Gators’ October 10 upset loss at Texas A&M. Following the game, head coach Dan Mullen declared that he hoped that for the next game, “the UF administration decides to let us pack the Swamp against LSU — 100%.” He went on, “The governor has passed a rule that we’re allowed to pack the Swamp and have 90,000 in the Swamp to give us the home-field advantage Texas A&M had today.” The Swamp is the nickname established by Spurrier in the early 1990s for Florida’s Ben Hill Griffin Stadium, the twelfth largest college football stadium by capacity in the nation and the eighteenth largest in the world. When it is full, as I can personally attest from my on-field game experiences, it is also one of the loudest, a definite advantage for the home team.
I am confident that like Joyce’s Gabriel Conroy, Mullen thought there was “nothing political” in his call for a full stadium. His job is after all to create the best opportunity for his team to win and I am sure he would never have suggested such a course of action if he thought it would endanger the health of his players or coaching staff. However, as in the situation of the occupied Ireland so effectively portrayed by Joyce, there is no way that such statements could not be understood as political given our current realities. This fact was quickly acknowledged by a number of commentators: one of the first stated that whatever his reasons for so doing—including the patently evident fact that Texas A&M failed to adequately enforce safety precautions, allowing unmasked fans to sit close together in the lower levels of the stadium—it was “totally inexplicable and inexcusable that Mullen chose . . . to go all-in on his insistence that an institution of higher learning should follow the lead of politicians instead of public health and UF’s own medical experts when making decisions about safety precautions during a deadly pandemic.”
Fortunately, university officials quickly made it clear that at least in terms of practices on campus during home games they would not be following the lead of Trump and DeSantis. (However, sadly, this is not the case in terms of the university’s academic practices, as UF’s administration continues to hold firm in its commitment to meet the governor’s blackmail demand for dramatically increased face-to-face classroom instruction in the spring semester or face severe budget cuts. This has placed university leaders on a collision course with its own faculty.) Thankfully too, we in the community were spared the conflicts that very well might have erupted this past weekend, as in the week following Mullen’s statement the team announced that 21 players had tested positive for Covid and the next two home games would be rescheduled. Then over the weekend, we learned that Mullen himself had tested positive for Covid. (I hope that every person affected has a speedy and easy recovery.) All of this was seized upon by The Lincoln Project this past weekend, which released a short video that concludes, “So many voted for Trump because he promised to drain the swamp. And because of his failure, the Swamp is drained of not just fans, but of football itself.” Football is now definitely politicized.
As Joyce insistently reminds his readers, the troubling politicization of everyday life will continue as long as the nation is controlled by those who put narrow interests above the well-being of the communities they command. To paraphrase Stephen Dedalus in the opening pages of Ulysses, we thus need to ask: how long are they going to stay in our towers, their positions of expansive executive authority? And how will we respond if they refuse to leave?
“Although there may be nothing useful for you in my words, perhaps this example of ready obedience will not be wholly unprofitable to you.”
Gregory of Nyssa, The Life of Moses
I began this post as a note of thanks to a dear friend who on Sunday had sent me a moving and very much needed letter of support. As I was writing, I thought it might be of value to others out there and decided to take the risk of sharing. I first posted it on my personal Facebook page; however, given the increasing sense of hopelessness and even despair I am encountering in myself and so many others, I decided to share it on this blog as well.
As too many conversations I have had in the last week or so bear out, and as a glance at the posts on Facebook confirms, there is an immense amount of suffering among those for whom I care most deeply. And every day we are confronted with more stories—personal, familial, local, institutional, national, global, and environmental—that add to this burden of pain. All of it brings to my mind again on this final day of Rosh Hashanah the words of one of the greatest thinkers of the last century, the German Jewish intellectual Walter Benjamin—written only months before the catastrophe of his moment became too much for him to bear (the 80th anniversary of his death at the age of 48 is next Saturday): “The only historian capable of fanning the spark of hope in the past is the one who is firmly convinced that even the dead will not be safe from the enemy if he is victorious. And this enemy has never ceased to be victorious.” His desperate words seem to be reconfirmed every day now.
Looking for something to latch onto, I encountered two things Sunday morning. The first was an email message from a dear friend and mentor, Tom Moylan, who wrote:
“It all may sound crazy, but it’s not. It’s the conditions within which we’re living now. It’s the clash of structures of feeling, carried out in our very bodies and emotions. Doing critical work, doing deep reading, teaching and writing and organizing is what we must do to keep the space open to move to the horizon. For that yes we need hope, and faith and love.
And that brings us to fidelity and solidarity to and with each other. We can’t/won’t do this alone. You my friend are not alone, as you well know.
My love to you this sunny Sunday morning.”
These moving words reminded me how much, now more than ever, we need to reach out to those who love and support us, and I encourage everyone to do so, and to do so often: for it is only by making us feel alone, as if we are the only ones engaged in the Sisyphean struggle to push against the nightmarish tide of hate and fear crashing down on us, that the enemy will be truly victorious.
Secondly, I flipped open, as I often do in such moments, a book of Thich Nhat Hanh’s meditations, and on the page I read the following:
“Life is filled with suffering, but it is also filled with many wonders, such as the blue sky, the sunshine, and the eyes of a baby. To suffer is not enough. We must also be in touch with the wonders of life. They are within us and all around us, everywhere, anytime.”
“Each day many thousands of children die of hunger. The superpowers have enough nuclear warheads to destroy our planet many times. Yet the sunrise is beautiful, and the rose that bloomed this morning along the wall is a miracle. Life is both dreadful and wonderful.”
“If a child smiles, if an adult smiles, that is very important. If in our daily life we can smile, if we can be peaceful and happy, not only we, but everyone will profit from it. This is the most basic kind of peace work.”
Reading these words reminded me to do what I have often done throughout the past year: take a brief moment in my beloved partner Susan’s garden. The summer swelter has finally broken in Florida, and while the sun is not out yet, the light wind is beautiful and refreshing. And although the first signs of the ‘season of mists and mellow fruitfulness’ are evident, there is still much life and beauty in the garden. All of it makes me smile.
A number of people have told me they find a bit of happiness in my photos of the garden, so I decided to share with you a little of what I encountered today.
It is my deepest prayer this morning, at the dawn of the new year, that all of you can find some peace and happiness and something to make you smile every day. I was reminded today that there are many, many people who love and care for me, and I know that I feel the same for each of you. It would be naïve to guarantee that we will get through the dark times still to come; however, if there is any hope of so doing, it will only be together, each of us supporting the others. So take care of yourself, smile, and please let others know what they can do to help.
“Hope” is the thing with feathers -
That perches in the soul -
And sings the tune without the words -
And never stops - at all -
And sweetest - in the Gale - is heard -
And sore must be the storm -
That could abash the little Bird
That kept so many warm -
I’ve heard it in the chillest land -
And on the strangest Sea -
Yet - never - in Extremity,
It asked a crumb - of me.
It is a very good time to be a writer (and a reader) of SF, fantasy, weird, and other fantastic fictions—indeed, we might even say it is a golden age, if that term had not already been taken. This has been reconfirmed for me just this past month with the publication of Jo Walton’s beautiful and affirmative novel, Or What You Will.
In Invoking Hope: Theory and Utopia in Dark Times, I touch on the fact that the older post-World War II MFA “Program Era” proscriptions against genre or “paraliterary” fiction have begun to loosen (although, of course, not without resistance from a few critics who might very well be operating under the slogan “Make Literature Great Again” [MaLGA?]). This welcome change is borne out, first, by the fact that many significant works of “literary” fiction published in the first two decades of our millennium are examples of or draw deeply on SF and fantasy—among others, Margaret Atwood’s Oryx and Crake (2003) and its two sequels (2009 and 2012), David Mitchell’s Cloud Atlas (2004) and The Bone Clocks (2014), Kazuo Ishiguro’s Never Let Me Go (2005), Cormac McCarthy’s The Road (2006), Junot Díaz’s The Brief Wondrous Life of Oscar Wao (2007), Colson Whitehead’s Zone One (2011), Charles Yu’s How to Live Safely in a Science Fictional Universe (2011), Mohsin Hamid’s Exit West (2017), Louise Erdrich’s Future Home of the Living God (2017), and Salman Rushdie’s Quichotte (2019).
Secondly, writers long identified as practitioners of genre fiction—Kim Stanley Robinson, William Gibson, China Miéville, Ted Chiang, Alan Moore, N.K. Jemisin, and Cixin Liu—are being treated with new seriousness and respect by an ever more expansive and sometimes unexpected group of readers and critics. For example, a 2013 article in The New Yorker advances the claim that Robinson is “one of the most important political writers working in America today.” I would agree heartily with this assessment and would suggest something similar for many (all?) of the writers and books listed above. (On a related note, for an insightful review of Gibson’s and Yu’s most recent novels, Agency and Interior Chinatown, check out UF’s own Mitch Murray’s “The Worst of All Possible Worlds?”)
Prominent in any such list should be the name of Jo Walton. Walton is among the most interesting writers working today. Such a statement on my part is, as Sianne Ngai teaches us, a performative utterance “disguised as a constative”: it is a “demand,” or commandment as to how we should spend our limited time, masquerading as a truth claim or even an observable (for those who have the right eyes to see) fact (i.e., her work is “objectively” good). If past experience has led you to consider me a trustworthy enough reader (and if so, I thank you), then you don’t need to proceed any further and can go out and use your valuable time to encounter for yourself some of Walton’s extraordinary novels. If, however, this is not the case, or if you would like some inkling of where you might begin, then allow me to explain some of the reasons I find her work so significant and rewarding.
In his meditation on postmodernism and utopia, The Seeds of Time (1992), Fredric Jameson observes, “It seems to be easier for us today to imagine the thoroughgoing deterioration of the earth and of nature than the breakdown of late capitalism; perhaps that is due to some weakness in our imaginations.” A little over a decade later, Jameson again and even more concisely claims, “it is easier to imagine the end of the world than to imagine the end of capitalism.” This has become among Jameson’s most repeated observations, one that has not surprisingly come to renewed prominence in the last few years. One place where Jameson’s insight is most immediately borne out is in the realm of contemporary storytelling: for as even the most cursory survey of contemporary print, television, and film will confirm, it seems today that it is far easier to imagine dystopia, stories of those bad places we make by our own actions, than utopia, a more equitable and just world beyond our own (it being understood that any such world can only be one where the cruel logics of both labor exploitation—the only freedom most people possess being the freedom to work or starve—and oppression are exceptions rather than norms).
There are of course very good reasons today for the fear that the world we know may be coming to an end and an even worse one is on the horizon: the now almost daily assaults on democratic institutions and the rise of neo-populist authoritarianism (what Federico Finchelstein describes in his recent insightful and chilling book, A Brief History of Fascist Lies, as the post-World War II reformulation of “fascism in a democratic key”); the growing economic inequalities in what William Davies names a post-2008 “punitive neoliberalism”; and the seemingly inexorable slow-motion apocalypse of climate change and environmental destruction. Moreover, many of the most interesting recent works of fiction—including a number of those listed above—are dystopias, albeit in the form of what Tom Moylan identified at the very beginning of this millennium as critical dystopias, works that “negotiate the necessary pessimism of the generic dystopia with an open, militant, utopian stance.” One of the best examples of this practice can be found in the great novel with which I conclude Invoking Hope, Mitchell’s Cloud Atlas, a work whose Gramscian “pessimism of the intellect” and “optimism of the will” I argue harkens back to H.G. Wells’ The Time Machine (1895), the “critical dystopia” that helped found modern science fiction.
In such a situation, Walton’s novels stand out because of their unabashed commitment to imagining utopia. This is especially the case with the five novels that she has published since her award-winning and moving novels, Among Others (2011) and My Real Children (2014). However, none of Walton’s novels are utopian in the clichéd, if still commonplace, sense of offering a fully realized destination, an end of history, or so-called “perfect” and perfectly static world at which we must arrive (another performative utterance, a demand, masking itself as a constative). Rather, Walton’s genius in these books—and again this is something she shares with a number of the other writers listed above, albeit in her own inimitable fashion—lies in her portrayal of both the fidelity and hard work, but also the joy, in what Moylan names in his forthcoming collection of essays and interviews, becoming utopian; or what Walton, in her Plato-inflected language, refers to as increasing our excellence. Walton’s work is so important because it strives to educate our desire to take up the labor of making worlds other to a present that is far too easy to see only in terms of a catastrophe that unrelentingly piles wreckage upon wreckage at our feet.
Walton’s novels The Just City (2015), The Philosopher Kings (2015), and Necessity (2016) make up her “Thessaly trilogy.” These novels are more immediately recognizable as part of the “cognitive estranging” (Darko Suvin’s path-blazing and still invaluable term) genre of the utopia. However, even this trilogy begins in the very different non-cognitive but still estranging practice of fantasy. The Just City opens with the decision by the Greek gods Athena and Apollo to enact, rigorously and to the letter, the vision of the Just City modeled in Plato’s Republic. Their original city is populated by three groups: the Masters, “three hundred fanatical Platonists from times ranging from the fourth century B.C. to the late twenty-first century A.D.”, and including such real world historical figures as Crito, Plotinus, Cicero, Boethius, Marsilio Ficino, Giovanni Pico della Mirandola, Lucrezia Borgia, Benjamin Jowett, and Ellen Francis Mason (while Plato remains absent, Sokrates is soon brought to the community as well); ten thousand former slave children, to be educated by the Masters to become the founding citizens of the Just City; and programmable mechanical workers, whose labor in building and maintaining the city’s infrastructure frees the first two groups to enact Plato’s vision. All this along with great “lost artworks from all of time,” including “the head of the Winged Victory of Samothrace,” and “all the books from the Great Library of Alexandria.” The entire community is then isolated on the island of Atlantis at some indeterminate point before both the Trojan War and the island’s destruction. On this seemingly improbable foundation, the trilogy proceeds to develop an extraordinarily profound meditation on the relationship in any utopian imaginary between contingency and necessity. Along the way, the trilogy also posits its own fascinating theory of the activity of reading. 
(As this preview might suggest, I am working on a longer essay on the trilogy that I hope to finish up before too long.)
Following on the heels of the Thessaly trilogy, Walton turns in a very different direction in her next novel, Lent (2019). Lent begins as an exemplary historical novel (the genre Jameson suggests precedes and gave rise to modern SF), developing a carefully researched and provocative revisionist account of the city-state of Florence in the years between 1492 and 1498, encompassing the period of the republic established by the controversial Dominican friar Girolamo Savonarola (1452-1498)—who also happened to have been a real-world friend of the great Christian humanists and Platonists also featured in the Thessaly trilogy, Ficino and Mirandola. In this section of her novel, Walton brushes against the grain of conventional histories and presents the Republic, its warts and limitations there for all to see, as a realized utopia.
Crucially, this is not some idealized past to which Walton nostalgically wants to return (Make Florence Great Again!): rather, the aim is to “repeat” Florence. To paraphrase Slavoj Žižek, “To repeat Florence is to accept that Florence is dead, that its particular solution failed, even failed monstrously, but that there was a utopian spark in it worth saving. To repeat Florence means that one has to distinguish between what Florence actually did and the field of possibilities that it opened up, the tension in Florence between what it effectively did and another dimension one might call what was ‘in Florence more than Florence itself.’” Early in Or What You Will, one of the characters describes 15th century Florence’s governing structure in this way: “Eight men of the merchant class rule for two months at a time, the highest honor the city affords. It was a real, if time-bounded, power, and if it led to inconsistent policies, well, it’s better than tyranny, and how is your democracy doing at that this fine day? (Don’t answer that. Don’t even think about that.)” In this regard, Walton proved to be ahead of the curve, as a number of essays have appeared recently that look to 15th century Italy in general and Florence in particular for the lessons they might hold for efforts to build a new and better world after the conclusion of our plague and other not-unrelated disasters.
Then, in the middle part of the book, Lent unexpectedly takes a dramatic swerve into fantasy (how it does so I will not reveal here), and transforms itself into a profound exploration of, among other concerns, the nature of freedom—“Girolamo thinks about the mutability and weight of history”—the importance of hope and change in the good life—“He wants to shout and jump for joy. At the very least there will be food and beauty and friendship and hope”—and most of all, the necessity of community to bring about meaningful change: “You can’t do this alone, Asbiel, and I can’t either. You have to trust me. . . . The house divided against itself cannot stand, that means Hell cannot stand.”
I read Lent shortly after its release—along with the 2003 English translation of the Italian Wu Ming collective’s equally extraordinary debut, Q (1999), its sequel, Altai (2009), and Alix Christie’s enjoyable Gutenberg’s Apprentice (2014)—and this experience led me to change my planned graduate seminar in spring 2020 to the Contemporary Historical Novel. The course began in the past, with readings of works by Walter Scott, Georg Lukács, James Joyce, and Virginia Woolf, before turning to late 20th and early 21st century examples of the historical novel. We concluded the seminar (by then many weeks into an online-mediated mode of conversation) with two novels released in summer 2019, Walton’s Lent and Whitehead’s The Nickel Boys (the latter won the Pulitzer Prize for Fiction only weeks after the end of the term). I hope to write something very soon about both of these equally extraordinary novels as well.
While Or What You Will is very different from these prior works, it continues to develop Walton’s deep meditation on the questions of becoming utopian and striving for excellence. The novel is especially hard to characterize as it does so many different things and does them so well. There are three intertwined stories in play throughout. First, the novel tells the life story of a successful Montreal-based writer, Sylvia Katherine Harrison—“Thirty books, she’s written, in forty years”—who a few years prior had lost her beloved second husband, Idris, and who has herself recently been diagnosed with terminal cancer. Secondly, we are told the story of the intimations of change occurring in the fantasy community of Illyria, a world modelled on both 15th century Florence and Shakespeare’s imaginings of an Italy he never visited, but where magic is real and people now live on as long as they desire (or are not killed by another). Illyria, it turns out, was created by Sylvia many years earlier in her first trilogy of novels (like Walton, who began her career with a fantasy trilogy, the Sulien series composed of The King’s Peace, 2000, The King’s Name, 2001, and The Prize in the Game, 2002). Linking these two stories together is a narration offered by Sylvia’s unnamed (at least until the novel’s final page) imaginary friend and the basis of many of her characters. Realizing Sylvia is dying, her creation and longtime comrade determines to find a way that they might continue on. This inaugurates a dialogue that spans the course of the novel.
In many regards, the novel returns to the blend of autobiography and fantasy that Walton so effectively deployed in Among Others and My Real Children. At the same time, Or What You Will continues the reflections begun in Lent on the special qualities of Renaissance Florence—and like its predecessors, in this novel Ficino and Mirandola once again play prominent roles. Or What You Will also resembles Joyce’s A Portrait of the Artist as a Young Man in that it is a Künstlerroman, or artist novel, with a fictional artist whose story parallels Walton’s own (even passages from Walton’s own earlier works appear in the final chapter).
Along the way, the novel offers its readers poignant reflections on death and mourning; a sometimes grimly realistic portrayal of the experience of working-class women in the post-World War II period and the psychology of abuse; an investigation into the creative process; a philosophical meditation on the real of fiction and the fictional nature of reality (even Jacques Lacan is alluded to in passing); and a meta-fictional commentary concerning both the potential limits and possibilities of generic fantasy. At the conclusion of her acknowledgements, Walton writes, “And thanks to you, my readers, for bearing with me through so many odd edges of genres and different kinds of stories.” I can say that no such thanks are needed, as the experience of this novel, as with its predecessors, moved me deeply and inspired many new thoughts (some of which I am sharing with you today).
In addition to being a gifted writer and storyteller, Walton is a voracious reader: her February 2020 reading list includes 25 books encompassing a wide range of fiction and non-fiction. One of the great pleasures in all of her books, for me as well as many of her other readers, lies in their generous invitations to dive into the rabbit holes of other reading. (In this invitation, her work is not unlike that of my teacher and the subject of one of my earlier books, Fredric Jameson.) A number of fantasy readers have written about the ways Among Others opened up for them new horizons in SF. In a 2013 interview concerning just this fact (the interviewer expresses their gratitude for Walton’s introducing them to the work of James Tiptree, Jr.), Walton notes that a question she really doesn’t like to answer is which book would you most recommend, as Among Others “is not about reading one book! It’s about indiscriminately reading a lot of stuff.”
My encounter with Lent resulted in a personal and very valuable plunge into not only books and essays about Florence, Renaissance art, and early modern European history, but also the essays and letters of Italian Renaissance humanists—Ficino and Mirandola, among others—and early Christian mysticism, especially the writings of Origen and Gregory of Nyssa. This then led me to Thomas Merton’s recently published lectures, A Course in Christian Mysticism (2017) and A Course in Desert Spirituality (2019) and more recently, a really fascinating trilogy of books by Thomas A. Carlson. A Course in Desert Spirituality contains a short Foreword by Paul Quenon that points out that a chief concern of these great thinkers, women and men, was “’the discernment of spirits’; how do you know what inspirations come from God and what comes from the devil?” Quenon then concludes by noting that people today “feel an urgent need for ‘discernment of spirits’ on many fronts, personal, ecclesiastical, and political. How can we detect what is motivating people—myself, others, and those big faces on the TV screen?”
A similar generous and enthusiastic invitation to read continues in her latest novel. Early on, we learn that the final novel read by Idris is Sofia Samatar’s A Stranger in Olondria (2013). We are told that it was sent to Sylvia “to blurb, and [she] loved it, and passed it on to him. She had been looking forward to his perspective on it, the wonderfully complex world, the characters poised between cultures.” Idris’s final text message to Sylvia reads, “I’ve finished Olondria. Can’t wait to talk to you about it!” This scene recalled to me one of my favorite lines from The Philosopher Kings, which I use as an epigraph for the second chapter of Invoking Hope: “I knew what death meant now. It was conversations cut off.”
Later on, the couple discusses Kim Stanley Robinson’s monumental alternate history, The Years of Rice and Salt (2002) (a few years after its publication, Walton would also produce her own alternate history, the Small Change Trilogy of Farthing, 2006, Ha’penny, 2007, and Half a Crown, 2008; My Real Children also offers an alternate history, albeit this time on the scale of an individual life): “They both loved the book, which struck them as a truly innovative thing to attempt, though he was unsure about some of what Robinson chose to do with Islam. The hard-back book sits on Idris’s shelves now, back in Montreal, haunted by a new, deeper loss.” I have written on The Years of Rice and Salt, and re-read and talked about it with my seminar participants this past spring, and I too would recommend it to anyone who has never read it (a book about an alternate history growing out of the Black Death is especially timely today); and thanks to Walton, I have just ordered A Stranger in Olondria and look forward to reading it as well.
In Or What You Will, Walton expands her generous and enthusiastic celebration of books to include places she has visited (although already in My Real Children the encounter with new places plays a significant role). As with Lent, Florence is the primary setting of her latest novel. However, differing from its predecessor, the Florence of Or What You Will is divided between the fantasy world of Illyria and the real-world Florence, presented briefly in 1847 and the 1970s and then more extensively in 2018. The novel is filled with beautiful evocations not only of well-known sites in the city but also of some that are clearly Walton’s personal favorites. When Sylvia visits Florence’s Teatro del Sale, Walton writes of it, “Talking about Teatro del Sale it’s easy to use words like ‘incredible’ and ‘unbelievable,’ but when you’re there it feels instead like the ultimate reality, the way things ought to be. Famous chefs ought to want to give things like this back to the community. Food ought to taste this way. It’s so like a wish-fulfillment narrative that it’s hard to suspend disbelief, but yet, here she is again, eating mouthwatering food, familiar but never taken for granted.” Walton’s beautiful description of the experience of Teatro del Sale has encouraged me to visit Florence as soon as I am able (in the coming after-time, of course) and to make a visit or multiple visits to Teatro del Sale among my top priorities. (In the meantime, I will try to recreate as best I can something of the dishes she mentions. The picture below is not one of those, but a recent dinner we enjoyed here in Florida: Sicilian grilled tuna steaks with roasted Brussels sprouts, hand-rolled couscous, and Caprese salad.)
Walton’s enthusiastic evocation of books read and places visited resonates so deeply with me because it suggests experiences parallel to my own. (Walton discusses her initial encounters with Florence in a recent interview.) I was born in the same year as Walton, but in West Germany, where my father was serving as a nuclear missile technician and my mother had joined him from their home in Chicago. I then spent the first 10 months of my life in Europe. However, I would not again travel outside of the United States (except for a few brief academic meeting trips to Toronto in the early 1990s) until 1996 (and my first trips to Florence and Rome in 1998), three years after completing my Ph.D. and two years after I had begun teaching at UF. Until then, my only encounter with these places was limited to books, movies, and my parents’ photographs. However, the advantage of this delay is that all my subsequent encounters with the new places I have been so fortunate to have had the opportunity to visit in the last 25 years (most recently, in late 2019, an extraordinary week in South Korea) remain very much like that of Idris when he first visits Teatro del Sale and proclaims, “How can this exist?”
I also, thankfully, continue to have the same response to books, movies, paintings, and other storytelling media: encountering something fantastic and new—like Walton’s novels—I really do often wonder, how can this exist? Even more immediately, this same experience occurs at home, where I have had, through no merit of my own, the impossible experience of living with someone who transforms every day into a conversation, a duet, an opportunity to learn by glimpsing the world through the eyes of a talented, joyful, caring, and brilliant other. Even the university, at its very best moments—especially those when it succeeds in resisting the neoliberal urge to transform all our interactions and experiences into money-making opportunities (we are not, first and foremost, “a business,” nor should we be at all)—still surprises me with its ability to evoke this response: how can such a way of life amongst such a vibrant, passionate, open-minded, and creative people exist in this world of ours?
I dwell so much on these seemingly secondary aspects of Walton’s books because I want to suggest that her invitations to read, travel, converse, and in general expand one’s experiences are at the core of her understanding of what it means to strive after excellence or become utopian. First, as with all great utopian visionaries, Walton, in passages like the one concerning Teatro del Sale cited above, underscores utopia’s inherent unrepresentability: as Alain Badiou suggests, these are truths that you can only ever know by encountering them for yourself. The best a storyteller can do—and let me underscore, even this accomplishment is a rare one that should be treasured—is to present such impossible places and experiences in such a way that a community of readers will feel inspired to do the things necessary to call them into being in their own and others’ lives.
Secondly, Walton’s portraits of places like 15th century Florence or Teatro del Sale in 2018 give the lie to the assumption that utopia cannot exist: it has, fleetingly, in the past, and continues to do so in scattered places in the present. This brings us to Idris’s second question on his inaugural experience of Teatro del Sale: “And given that it does [exist], how can there only be one like it?” What Idris is really querying here is why more people (everyone) don’t have the necessary resources—money, but even more so the time, education, community, and emotional support—not only to give full expression according to their abilities to their most creative energies, but also regularly to encounter according to their needs “impossible” things. The point is never just about creating opportunities for me to read, travel, write, converse, and live and work in a university community, but rather to create similar opportunities for excellence in as many people as desire them. This inevitably means a change not only of one’s individual life but of our shared world. (I also take up these issues in my chapter in Invoking Hope on Isak Dinesen’s magnificent utopian short story, “Babette’s Feast”).
Finally, Walton’s Teatro del Sale undermines the notion that utopia involves anything like a final destination, a perfect world where change is no longer possible: this would be to fall into the form of evil Badiou names “disaster,” something very different, mind you, from the Trumpian “terror” of Make America Great Again (but for whom?) or the “betrayal” that says no meaningful change is possible. In the midst of her final experience of Teatro del Sale, Sylvia comes to an important realization concerning the world she has built: “We were wrong about Progress. . . . We wanted it to be the Renaissance forever in Illyria, and so we said it was so. . . . But if it is forever, then it stops being the Renaissance.” This continues on for a number of very interesting paragraphs, before Sylvia concludes, “But really, a lot of what was important about the Renaissance was that golden-age feeling, people making things and being excited. Stopping that dead and removing the possibility of progress by fiat kills half of what matters about it. And the other half is really just decoration. Very beautiful decoration, admittedly, but still just decoration.” A world where the best of the past is celebrated but also where people are encouraged to make new things and be excited, a world of change and growth as unforeseen and unexpected developments occur—this is the vision of utopia that is developed throughout all of Walton’s great fictions.
I’ll conclude here. I hope that these glimpses will encourage a few others to encounter Walton’s work for themselves (and this is what I think any reviewer striving for excellence ought to do). Walton’s novels invoke hope and increase the desire to become utopian in ways that far too few others do today. And for these reasons, her works need to be shared, encountered, and talked about by as many people and as often as possible.
He was running. Absolutely running, with nowhere to go. And he was not yet four-and-twenty.
Joseph Conrad, Lord Jim (1900)
One of the real pleasures of the last few months has been having more opportunities (and motivation) to watch and re-watch a variety of movies. This has included viewings of a number of classics of Hollywood and global cinema alongside my teenage son Owen. Seeing them anew through his eyes has opened up so many unexpected insights and for this I thank him.
Most recently on the bill was Sergio Leone’s final film, Once Upon a Time in America (C’era una volta in America, 1984). Earlier in the spring, we screened Francis Ford Coppola’s magnificent Godfather trilogy (1972, 1974, 1990); and Leone’s brilliant “Dollars trilogy” (Trilogia del dollaro) (1964, 1965, 1966) has long been a family favorite. I am also beginning work on a book on cultural production in the year 1984, making Leone’s film a must-see. So after a fortifying dinner of Reubens, knishes, and matzo ball soup from Katz’s Deli (alas, here in north Florida, such a tempting meal remains a social-isolation-fueled dream), we decided it was time to invest the nearly four hours needed to watch in its entirety the “European cut” of Once Upon a Time in America. At the end, Owen agreed it is an extraordinary experience, and we strongly recommend that everyone see or re-see it in this version.
What especially struck me on this viewing is how much the film both responds to the mythos of organized crime and especially the “family business” of the Mafia—a mythos in large part, of course, fueled by the success of The Godfather films—and develops a rich meditation on an individual’s responsibility for their choices. In these regards, Once Upon a Time in America speaks as much to us today as it did to its original audience in the mid-1980s.
The story goes that Leone’s interest in making the film began in the 1960s after he first read The Hoods (1952), a memoir by the Russian Jewish immigrant Harry “Noodles” Grey (the pseudonym of Herschel Goldberg [1901-1980]) of his days as a gangster and bootlegger in the 1920s and early 30s. (Leone’s interest may have been sparked by the fact that in the opening chapter, one of Grey’s friends is berated by his Lower East Side high school teacher for reading “Western thrillers,” or what she refers to as “filthy literature,” romanticizing the exploits of Jesse James. A few pages later, a teen-aged Grey reflects, “I visualized all of us on horses, six-shooters in our hands, banging away at a pursuing posse. That would be fun, I thought.”) Leone’s plan for his treatment took further shape after an interview with Goldberg, who, now in his 60s, was reportedly still in hiding from his former associates.
Given the tremendous popularity of Leone’s “spaghetti westerns,” the director was among the first to be approached to work on the film adaptation of Mario Puzo’s best-seller The Godfather (1969). However, as Leone disliked Puzo’s novel, he politely declined and continued to work on his own adaptation. It would be another twelve years before Leone’s film came to realization, and it cost him a good deal in terms of his own health. Leone and his crew reportedly generated between 8 and 10 hours of usable footage, and Leone initially decided he would release his masterpiece—in a fashion not unlike a number of recent blockbusters—in two three-hour segments. These would stand as the climax of his revisionist “Once Upon a Time” trilogy—preceded by Once Upon a Time in the West (1968) and Duck, You Sucker! (also known as A Fistful of Dynamite or Once Upon a Time…The Revolution, 1971). The studio balked—in part because of the box-office failure of the two-part release of Bernardo Bertolucci’s 1900 (1976)—and so Leone further reworked the material until he had a 229-minute version, which premiered at the Cannes Film Festival in May 1984 and was subsequently released throughout Europe. (A “restored” 269-minute version has been produced but never released due to legal wrangling over copyright. In 2014, a 251-minute “extended director’s cut” was made available on DVD, although which version Leone himself would have preferred is a matter of debate.)
However, for the U.S. release and without Leone’s permission the film was further cut to 139 minutes. Even more disastrously, Leone’s complex plot structure was completely revised, and the resulting film proved to be a critical and commercial flop. The popular television reviewer Gene Siskel was not alone in proclaiming the U.S. theatrical release to be the worst movie of the year—however, Siskel then selected Leone’s original cut as his best movie of the year! Meanwhile, the violence done to his film and its reception in the U.S. proved a further blow to Leone and he would not complete another film before his death in 1989 at the age of 60.
Once Upon a Time in America opens in December 1933 at the end of Prohibition, with a scene of a young woman (Darlanne Fluegel) entering a darkened hotel room. There, she encounters a group of thugs who are searching for her lover, the bootlegger David “Noodles” Aaronson (Robert De Niro). When she says she does not know where he is, they murder her. We soon learn that Noodles—after coming upon the bodies of his three fellow gangsters, who apparently have been killed in a shoot-out with the police—has holed up in a Chinatown opium den (really more of a cultural center, as it also contains a restaurant and shadow puppet theater) and fallen into a deep hallucinogenic slumber. The film closes with a brief scene of Noodles as he first enters the den. He takes off his jacket, inhales his first drags of opium, and lies back, smiling into the camera as the credits begin to roll.
Within this frame, Once Upon a Time in America moves back and forth between episodes drawn from three moments in the life of Noodles and his compatriots. First, we are introduced to a teen-aged Noodles (Scott Tiler) and follow him as he establishes his gang, encounters the person who will turn out to be his partner and rival, Maximillian “Max” Bercovicz (Rusty Jacobs), and romances the love of his youth, Deborah Gelly (played in her film debut by Jennifer Connelly, most recently known for her key role in TNT’s entertaining Snowpiercer series). This set of vignettes culminates in Noodles’ imprisonment for the murder of a rival gangster.
The second sequence leaps forward to the late 1920s with Noodles’ release from prison. He rejoins the gang, who are now powerful and wealthy bootleggers, and attempts to persuade Deborah (Elizabeth McGovern) to marry him. These episodes conclude with the murder of Max (James Woods) along with the two other gang leaders and Noodles’ retreat to the opium den.
The third sequence concerns Noodles’ return to New York City 35 years later, after he receives a note informing him of the relocation of the bodies of his partners from their original Lower East Side grave site. Crucially, this third set (along with the tale of Noodles’ escape from New York) differs from the first two in that it occurs after the events of the frame.
Leone’s framing has raised a long-running debate over how to read the events we witness on screen. (Spoiler alert: I need to give away some major, and surprising, plot twists. If like me you believe in the pleasures of storytelling and if, unlike me, you haven’t yet seen the film, then I suggest you go and watch it before reading the rest of this post). The frame presents two options to the viewer: either, 1) everything we have witnessed has occurred in “reality”; or, 2) everything we see on screen after Noodles first imbibes opium (or perhaps, after he is momentarily awoken by a ringing telephone, which Slavoj Žižek significantly reads as introducing into the narrative a split between “symbolized reality and the surplus of the Real”) comprises his memories of the past and a fantasy of an alternate future. In this future, not only does Max remain alive—it turns out, Max sacrificed their partners and faked his own death to steal the million dollars the gang had hidden away—he has, under the new name of Christopher Bailey, become a wealthy west coast businessman, an influential, if corrupt, politician, and the husband of the now celebrated actress Deborah. In short, do the “real” events of the film end in 1968, as does the linear “story” (or what the Russian Formalist critics call its фабула [fabula]); or do they end in 1933, as is literally the case in the film’s non-linear “plot” (сюжет [syuzhet])? Nothing in the film gives us a definitive answer; and neither would Leone. His intent was to leave the matter unsettled, and it is this indeterminacy that makes Once Upon a Time in America one of the great realist and filmic examples of the rare genre Tzvetan Todorov names the fantastic: “The fantastic occupies the duration of this uncertainty. Once we choose one answer or the other, we leave the fantastic for a neighboring genre.”
The question, if we accept the latter possibility, is thus: what would lead Noodles to retreat into such a fantasy? The answer is revealed in his actions leading up to the death of his partners. On the night they plan to commit a petty crime, Noodles calls the police and informs on them. He does so with the aim of getting all four of them incarcerated for a short time—“probably one year,” he speculates. He hopes the time behind bars will break Max’s obsession with a suicidal plan to rob the Federal Reserve bank. However, just before their departure, Max, in an apparent fit of rage, knocks Noodles unconscious. When he wakes up and discovers the disastrous turn of events, Noodles’ guilt over what he has done overwhelms him and he flees to the opium den. (As a friend also reminds me, the guilty flashback on the part of the criminal has been a key aspect of Leone’s films since For a Few Dollars More (1965).)
Crucially, in the 1960s segment Noodles is given a chance at redemption. Late in the film Noodles encounters Max at a party held at the latter’s mansion. Max tells Noodles that he is the one who has enticed Noodles to return to New York. Max then reveals that he has become ensnared in a federal corruption investigation and the information he possesses will be enough to destroy his well-known partners-in-crime. He is sure that before he can testify, he will be murdered—after all, it is that nightmarish year of assassinations, 1968. For these reasons, Max has decided to give Noodles the opportunity to do away with him, which Max feels will be a just payback for his betrayal of Noodles 35 years earlier.
Noodles not only refuses, he persists in referring to Max as “Mr. Bailey” and declares, in a Bartleby-like affirmation of a nonpredicate, that he prefers to not-believe Bailey’s story. Max asks, “Is this your idea of revenge?” and Noodles replies, “No. It’s just the way I see things.” When he arrives on the street in front of Bailey’s house, Noodles sees a garbage truck start its engine and then spies in the distance what appears to be Max approaching the vehicle. After the truck passes by, Max is nowhere to be seen, leaving open the possibility that either, 1) Max has abandoned his planned murder of Noodles and returned to his house (the “Crime does not pay” lesson of the murder of both Max and Noodles by each other’s hands would probably have been the film’s ending had it been released before the repeal of the Hays Code); or, 2) Max has committed suicide by diving into the vehicle’s trash grinder. (Leone further heightened the ambiguity by using a stand-in for Woods and refusing to tell the actor what transpired in the scene. Reportedly, there is no ambiguity in the original U.S. theatrical version, as after his departure Noodles hears a gunshot from inside Max’s home.) Right before the film cuts to the 1933 closing scene, Noodles “sees” revelers driving past in pre-war vehicles and dressed in old-fashioned clothing: are they real or part of Noodles’ fantasy? Although all these things could have occurred in reality, the incongruities work to reinforce the open-endedness of the conclusion.
Moreover, on even further reflection, the viewer recalls that the film has introduced a whole series of possible endings, many of which are accompanied by figures of doors. For example, in the opening 1930s sequence, we see Noodles at the bus terminal hesitate as he is about to step through a doorway into the void of an indeterminate future. Perhaps this is where the film “ends,” and his return 35 years later—to which the film immediately segues as Noodles re-enters New York through the same doorway—is his fantasy, dreamt up along his road to “anywhere” (named by a ticket seller as “Buffalo”).
Or again, shortly after his return to the city, Noodles learns from Deborah’s brother, Fat Moe (Larry Rapp)—the owner of a run-down Lower East Side bar that had once been the bootleggers’ jazz-age club and before that his father’s Jewish restaurant—that he too had received a similar relocation notice from the cemetery. Moreover, he informs Noodles that while his sister has become a successful actress, he has not spoken with her in many years. He then invites the frustrated Noodles to spend the night in his apartment. After dark, Noodles climbs on the toilet and pulls open a small hidden door. As he peers through, the film flashes back to an image of a teen-aged Deborah as she dances in the storage room before she spies a youthful Noodles peering through the same door. All of this again hints that perhaps the real narrative action ends here, Noodles’ belated and confused return to the city prompting both his memories of the past and his fantasy of an alternate and more satisfying conclusion to his dealings with Max.
Finally, upon Noodles’ release from prison, Max arrives to pick him up—Max driving a hearse no less—and Noodles re-enters the criminal life by passing through the vehicle’s door (and into the arms of the voluptuous prostitute waiting inside). Is Noodles’ imprisonment then, which also begins with him walking through another set of imposing doors, to be understood as the conclusion of his story, making his reunion and subsequent successes and failures no more than a “crime thriller” fantasy constructed by the still incarcerated young man (as was Goldberg when he began writing his memoir)? As Red (Morgan Freeman) notes in the wonderful utopian film, The Shawshank Redemption (1994), “Prison time is slow time. So you do what you can to keep going.”
Even more significant are the film’s possible endings that follow on the heels of acts of deadly violence performed by Noodles. (The graphic portrayals of torture, rape, and murder were another reason for the drastic editing of the film for its U.S. theatrical release.) First, Noodles is sent to prison because he has stabbed to death a competitor, albeit in retribution for the shooting of one of his gang members—but he then also stabs a police officer who tries to intervene. Secondly, shortly after his release from prison, Noodles commits another murder when he tracks down and shoots a mobster that his gang has betrayed. Noodles is furious that he was not told of the planned double-cross, and in response he drives their car off the end of a city pier. The possibility that Noodles drowns in the crash is reinforced in a scene deleted by Leone from his cut (it now can be viewed online) where the other three gang members fall into a panic when Noodles fails to resurface (although it is equally possible that this is payback for a similar practical joke played on Noodles by Max years earlier).
Third, and most significant of all, later in the film Noodles takes Deborah on a lavish and costly dinner date, where he expresses his love for her and asks her to marry him. Although she tells him that she cares deeply for him, she refuses and informs him that she will depart the next day for Hollywood. During their ride home in the back of a chauffeured car, Deborah tenderly kisses Noodles. Perhaps this is meant as a farewell; or it may be a sign that she has reconsidered. We never learn the truth, because Noodles responds by brutally raping her. After an excruciating amount of time has passed, the driver halts the car. Noodles disembarks and the vehicle speeds away with the distraught young woman. We then see Deborah’s departure and learn that Noodles spends the next few weeks mourning in the opium den. This revelation raises the possibility that the concluding scene is of Noodles entering the den even earlier, on the heels of his rape of Deborah, and that it is her departure that instigates his flight from the real.
These multiple possible endings and our inability to decide between them offer important clues to the film’s central message. Whatever the narrative’s conclusion may be “in reality,” what becomes clear is that Noodles’ “real” flight is from what Žižek refers to as the “properly traumatic” experience of freedom. Žižek further points out, “it is easy to accept that we are just a speck of dust in the infinite universe; what is much more difficult to accept is that we really are immortal beings who, as such, cannot escape the terrible responsibility of their freedom.” Žižek’s insight very much applies to Noodles. All the terrible events in his life—his imprisonment, his break with Deborah, and the deaths of his friends and lover—are consequences of choices made by Noodles, and it is this truth from which he tries to run. This makes Noodles a lot like Joseph Conrad’s Lord Jim. Both young men try to dodge responsibility for their actions: “You had to listen to him as you would to a small boy in trouble. He didn’t know. It had happened somehow. It would never happen again.” Also like Jim, Noodles expresses the desire to begin again with a “clean slate,” where his past acts and guilt have been wiped away. Finally, both men find redemption only in fantasy, whether such fantasy be reality or a dream.
The indeterminacy of what is reality and what is fantasy also transforms the film into a mirror held up to every viewer: it all depends on the way you “see things.” Those who conclude that the 1968 events are “in reality” are also often those who want to hold on to both dreams of heroic redemption—“He saw himself saving people from sinking ships, cutting away masts in a hurricane”—and the notion of a more authentic inner self distinct from our choices. However, Leone’s film repeatedly reminds us that even if Noodles makes a better decision in the future this does not erase what he has done in the past or lessen his responsibility: we can learn from our past, but we cannot change it. Once Upon a Time in America, Conrad’s Lord Jim, and other works like them teach us that we are what we choose to do, and it is our actions more than our internal fantasies that define our true selves.
This complex open-ended structure also serves an important role in Leone’s rejoinder to The Godfather. The link between the first two Godfather films and Once Upon a Time in America is further suggested by Leone’s choice of De Niro to portray Noodles. Al Pacino plays Michael in all three Godfather films, as he moves from being a young decorated veteran in the days immediately following the Second World War to someone who is by the late 1970s a wealthy and immensely powerful head of a global corporation. An aged Vito is memorably brought to life in the first film by Marlon Brando. In the second film, however, Brando never appears on screen and instead a young Vito is equally memorably played by . . . De Niro. Moreover, The Godfather II also deploys an interweaving narrative structure, shifting back and forth between the story of Michael’s efforts in the months preceding the Cuban Revolution to transform his family’s business and flashbacks to Vito’s emigration in the early years of the twentieth century from Sicily to New York’s Little Italy up through his rise to power in the 1920s. (Revealingly, while alluded to, the violence of the gang wars of the 1930s and Vito’s actions in that period never appear on screen).
These similarities aside, the narrative structure and ultimate vision of Coppola’s and Leone’s films diverge dramatically. Despite its elements of fantasy, Once Upon a Time in America formally offers a realist portrayal of the lives of Noodles and his compatriots. Fredric Jameson once pointed out that in any authentic realism, the author “does not really know what he will find beforehand.” This means that every event in the story should feel as if it opens up onto a number of possible resolutions, each of which is contingent on the free choices made by the protagonists.
Coppola’s mode, on the other hand, is the very different one of tragedy. And as tragic figures whose destiny is determined by fate, Vito and Michael cannot be judged according to the criteria of good and evil. In the spirit of classical tragedy, the pair are portrayed as noble men who, while they may do bad things, are not bad in and of themselves. They come to their tragic ends precisely because of a fatal conflict between their unbending wills (their hamartia, “a fatal flaw leading to the downfall of a tragic hero or heroine”) and forces external to them. Indeed, the films show that many of their worst deeds are “forced” upon them by the evil of other men.
Take the case of Michael. His intent at the beginning of the first film is to stay far away from his family’s criminal enterprise. He is drawn in only after an assassination attempt on his father, who has been targeted by other gangsters because of his principled refusal, whatever the lost profits, to support the trafficking in heroin. Michael realizes that the only way to truly protect his father is to murder those who want to do him harm. He is similarly forced to become the head of the family after his older, impulsive and violent brother Sonny (James Caan) is brutally murdered. And finally, it is the plots against him that force Michael to eliminate both the leaders of the other factions and one of his father’s closest associates—as well as his brother-in-law, who has confessed to setting up Sonny.
In the second film, Michael struggles to make legitimate his family’s various business interests. However, again he is forced into violence by the combination of the racism of a corrupt Nevada senator, the desire for vengeance on the part of his business partner, and, worst of all, betrayal by his remaining brother Fredo. (Fredo is played by the extraordinarily talented John Cazale. Cazale appeared in only five films between 1972 and his untimely death in 1978, but all five—the first two Godfathers, The Conversation (1974), Dog Day Afternoon (1975), and The Deer Hunter (1978)—were nominated for the best picture Oscar, two of them in the same year and three of them winning.) Even his last efforts at redemption in the third film—protecting his son and daughter and supporting the good man John Paul I’s efforts to reform the Catholic Church—are thwarted. In the last film, Michael famously cries out, “Just when I thought I was out, they pull me back in.” We the viewers realize this has always been the case, and it is the casting of Michael as a victim of fate (them) that makes him such a prototypically tragic figure.
It is this reimagining of the criminal as tragic—again, someone who is not really free and hence beyond judgment as good or evil—that Leone thoroughly demolishes in Once Upon a Time in America. Imagine the response in the U.S., his story seems to suggest, had he made a film that cast the leaders of Italian fascism as similarly tragic heroes, noble men forced to act in the ways they did by the nations around them. The only ones who bear the responsibility for their actions, Leone’s film maintains, are these men, be they Noodles, Max, Michael, Vito, or Mussolini and his partners-in-crime.
And here we arrive at the continued timeliness of Once Upon a Time in America. Such a tragic mythos extends beyond the figure of the immigrant gangster and is applied to many aspects of twentieth- and twenty-first-century American life. First, it is no coincidence that Leone sets the action of the final sequence in 1968, for among other reasons, this is the year of the election of Richard Nixon as President of the United States and the beginning of the long historical wave that has led to the present. Revealingly, the title of a 2015 study of Nixon’s presidency is One Man Against the World: The Tragedy of Richard Nixon. Furthermore, recall that when Leone’s film was first released, there was a concerted effort underway to re-narrate the actions of the U.S. in Vietnam and other hot spots of the Cold War not as ethical and political failures, freely initiated crimes for which the nation collectively bore responsibility, but as tragedies.
Even in Coppola’s own monumental treatment of the Vietnam War, Apocalypse Now (1979), the question arises as to whether the character of Kurtz (Brando)—adapted as is much of the film’s narrative structure from Conrad’s Heart of Darkness (1899)—is evil or tragic. We are left to wonder whether Kurtz is brought down by his “unsound methods” or by the conflict between his core values—the film’s version is no hollow man—and the cowardly pragmatism of his superiors: “these were not monsters. These were men… trained cadres. These men who fought with their hearts, who had families, who had children, who were filled with love… but they had the strength… the strength… to do that. If I had ten divisions of those men our troubles here would be over very quickly. You have to have men who are moral… and at the same time who are able to utilize their primordial instincts to kill without feeling… without passion… without judgment… without judgment. Because it’s judgment that defeats us.” Could it also have been some obscure recognition of how out of step Leone’s vision was with Reagan’s “Morning in America” (a moment too when the older heroic figure of the cowboy so effectively undercut by Leone’s 1960s westerns was also being revived) that contributed to the decision to undertake the extreme and disastrous editorial efforts to make the film more “palatable” to a U.S. audience?
Such an ethos continues unabated into our millennium. For example, as the sad case of Ward Churchill reminds us, any attempt to locate some responsibility for the 9/11 attacks as “blowback” for U.S. policy decisions and the actions of the military around the world was met with swift and brutal reprisals. Even more recently, while the book itself is quite critical, the title of Glenn Greenwald’s 2007 best-seller, A Tragic Legacy: How a Good vs. Evil Mentality Destroyed the Bush Presidency, suggests that the catastrophe of the second Bush administration’s actions in the Middle East is less a free choice than a tragic failure, instigated by the hamartia of too simplistic a moral worldview. Even in terms of Trump’s presidency, some already seek to explain the crimes that he and those around him have committed through recourse to the formal logic of tragedy: “Yet there is an undeniably tragic quality to the Trump presidency . . . . Why? Because Trump did have some valid and important insights into America’s current problems and he had a chance to do something about them when he got elected. That opportunity has been wasted, however, and Trump’s flaws as a politician, strategist, and human being are the main reason why.”
This encapsulates in its most fundamental form the work of ideology or what Roland Barthes calls mythology: “myth hides nothing: its function is to distort, not to make disappear. . . . it transforms history into nature.” Leone’s great film cracks open such myth making and teaches us that until we refuse the comforts of tragic modes of national storytelling, “Once upon a time,” and face up fully to our responsibility for our actions, we will remain trapped in a repetitious structure where the encounter with the truth about ourselves never comes and we retreat further and further into fantasy—into our dreams. But the dream of a better world is not a better world. The latter will only come into being when we wake up, attend to the truth of what has come before, and begin to act in new ways.
For anyone who is interested, I did an interview of more than an hour on the book with University of Florida alum Camelia Raghinaru for her extraordinary and timely podcast, ‘Theory to No End.’
The book’s main title comes from an etching by Sebastiano Ricci (1659-1734), Allegory with Figures of Hope, Time, and Death, which the press generously reproduced as a frontispiece and which I post here in color.
The protests following the May 25, 2020 murder of George Floyd in Minneapolis have brought back into prominence the phrase “Black Lives Matter.” This has resulted in such admittedly symbolic but still necessary gestures as the Mayor of Washington, D.C., renaming a portion of 16th Street leading up to the White House Black Lives Matter Plaza.
The phrase Black Lives Matter was first used in the summer of 2013 by the activists Alicia Garza, Patrisse Cullors, and Opal Tometi in response to the acquittal of George Zimmerman for the murder on February 26, 2012 of 17-year-old Trayvon Martin. The following summer it came to even greater national and international prominence in the protests that erupted after the fatal shooting by a police officer in Ferguson, Missouri of 18-year-old Michael Brown, Jr.
Not surprisingly, shortly on the heels of the initial success of Black Lives Matter in mobilizing a diverse coalition, there emerged the competing slogan “all lives matter,” which was quickly and cynically seized upon by a range of neo-conservatives, including Ben Carson and Donald Trump, who claimed that Black Lives Matter privileged one group of lives over others.
In recent weeks, I have seen a number of well-meaning people seize upon a counter-defense nicely summarized in a cartoon from a few years ago that I found online:
The problem with this gesture is that while it is true that a house that is burning has priority over those that are not, how do we prioritize one house when multiple houses are all aflame at once? In my last blog post, I cited Greta Thunberg deploying, rightly, the same phrase in reference to climate change: “I want you to act as if our house is on fire. Because it is.” Moreover, is there a more burning issue for First Nations peoples than the accelerated dispossession of their lands and the destruction of their communities by the policies of the Trump administration? Or for those from the global south seeking asylum and economic opportunity for themselves and their loved ones than radical changes in immigration policies? Or for children working in sweatshops or undocumented farm workers or even adjunct faculty members than a fundamental restructuring of the global economy? As the Floyd case and so many others that have been (thankfully) recently caught on video bear out, just because we don’t, or can’t, see a particular fire doesn’t mean it is not ravaging many lives.
Then why the privileging of Black Lives Matter as the rallying cry today? What both responses, the conservative (all lives matter) and the liberal (these lives matter more now), miss is the universal dimension of the clarion call that is Black Lives Matter. This is because, as Slavoj Žižek among others has pointed out, the phrase Black Lives Matter, far more than the abstract “all lives matter,” expresses a drive toward what Žižek terms concrete universality.
How does this work? We need to begin by making explicit the unstated qualifying presupposition of each phrase as it is deployed today by opposed communities:
All lives matter . . . except those lives—black lives, poor and working-class lives, immigrant lives, differently abled lives, queer lives, etc.—that are outside the charmed circle of lives that matter.
Black Lives Matter . . . as much as the more privileged lives inside that circle.
The former exemplifies what Žižek terms the universal exception: “The ‘universal exception’, according to Lacan, is the fundamental feature of the symbolic order (the ‘big Other’) as the order of universality: each universality is grounded in its constitutive exception.”
In its original formulation, the preamble to the Declaration of Independence functioned in the same way: "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness." As anyone who knows anything about American history recognizes, these words were penned by white property-owning men whose intent (and this is also the problem of so-called constitutional originalists) was that women, people of color, and even white men without property be excluded from the group named "all men." It was only through the later coordinated political action of these excluded people—taking up as their mantras that all property-less men are created equal to propertied men, all men of color are created equal to white men, all women are created equal to men, all differently abled people are created equal to normatively abled people, all queer people are created equal to heteronormative people (all of them in common having nothing to lose but their chains)—that the notion was gradually expanded into the more universal understanding we hold today. Moreover, this is a struggle that very much continues into the present, as this week's Supreme Court rulings extending civil rights protections to LGBTQ+ people and protecting (at least for now) DACA recipients bear out.
A similar logic is at work in what many conservatives disingenuously refer to as the apparently universal position of "pro-life." The phrase pro-life far too often serves as a mask for what is simply anti-abortion, which is grounded on the deeply patriarchal false universal that all lives matter . . . until the moment they leave the womb. Then some lives—those whose parents cannot afford health care or adequate nutrition, who happen to be born to parents outside the United States, who are of a different faith, who are part of a population deemed by political leaders to be our enemies, and so forth—become of very little concern to many self-proclaimed pro-lifers. The counter-claim that every pre-natal life is a universal exception and hence less valued in our society has a kinship with the notion of "blue lives matter": both arise from the presupposition that these lives matter less in the eyes of our permissive liberal institutions than the lives of women, or black, poor, immigrant, or queer people. Both claims are simply ideology and can readily be demonstrated to be false. For example, a crucial proof that black lives are undervalued is the treatment of black people who otherwise reside within the charmed circle. In recent years, men in positions of power and prestige—members of Congress, medical doctors, Harvard professors—have all been documented victims of police harassment simply because they happened to be black men in the wrong place at the wrong time.
Conversely, in the Christian New Testament, the second of two commandments issued by Jesus ("There is no commandment greater than these") is not "love all men as yourself" (i.e. all men matter) but rather "Love your neighbor as yourself" (NIV: Mark 12: 31). This is because it is only by way of the universal exception—the neighbor, the one excluded by Margaret Thatcher's infamous 1987 assertion "there's no such thing as society. There are individual men and women and there are families"—that the concrete universality of the first commandment can be realized: "Love the Lord your God with all your heart and with all your soul and with all your mind and with all your strength" (NIV: Mark 12: 30).
The only path to a concrete universal—a true All Lives Matter, all are created equal, pro-life position—is by first embracing an actual universal exception as the basis of political engagement. Such a starting point by no means guarantees the destination—as Nancy Fraser and others point out, liberal feminism too often pulls up short of becoming a concrete universal; and Jodi Dean, in her timely recent book Comrade: An Essay on Political Belonging (2019), argues that the notion of allyship, which has also risen to new prominence in the current moment, at times "reflects the shrinking or decline of the political. . . the term ally appears more to designate a limit, suggesting you will never be one of us, than it does to enable solidarity."
Nevertheless, only a concrete universal exception can inaugurate the chain that will continuously expand to include all others. In this way, Black Lives Matter comes to include immigrant lives, differently abled lives, queer lives, poor and underemployed lives, and every other under-valued life. This is what Angela Davis so powerfully expresses when, in response to Hillary Clinton's use of the phrase "all lives matter" at a talk in Florissant, Missouri, only days after the June 17, 2015 murder of nine black people (Clementa C. Pinckney, Cynthia Marie Graham Hurd, Susie Jackson, Ethel Lee Lance, Depayne Middleton-Doctor, Tywanza Sanders, Daniel L. Simmons, Sharonda Coleman-Singleton, and Myra Thompson) at the Emanuel African Methodist Episcopal Church in Charleston, South Carolina, she notes: "If indeed all lives mattered, we would not need to emphatically proclaim that 'Black Lives Matter.' Or, as we discover on the BLM website: Black Women Matter, Black Girls Matter, Black Gay Lives Matter, Black Bi Lives Matter, Black Boys Matter, Black Queer Lives Matter, Black Men Matter, Black Lesbians Matter, Black Trans Lives Matter, Black Immigrants Matter, Black Incarcerated Lives Matter, Black Differently Abled Lives Matter. Yes, Black Lives Matter, Latino/Asian American/Native American/Muslim/Poor and Working-Class White Peoples Lives matter. There are many more specific instances we would have to name before we can ethically and comfortably claim that All Lives Matter." For similar reasons, Žižek finds the "+" in the slogan LGBTQ+ the sign of its potential insurgent universality; and a pro-life position that was a true concrete universal would not only be against abortion but would be actively for universal health care, guaranteed universal income and housing, equal schooling, and open borders; and against police violence, war, economic exploitation, and so much more.
All of this bears out the fact that, in our present context, neither the universal goal—Black Lives Matter as much as all others—nor the more particular one of systemic police reform will occur without a fundamental remaking of society. Our call should not be simply to defund the police, but to defund an inequitable, unjust, oppressive, and exploitative reality itself. To those who say such a concrete universality and the admittedly arduous labor of building solidarity across different communities and interests are unrealistic, impossible, and utopian, I would suggest we respond with one of the great slogans of 1968, a year that a number of commentators have pointed out resonates with our own (and if this is the case, may the second half of 2020 come to very different conclusions): Soyez réalistes, demandez l'impossible (Be realistic, demand the impossible).
Today, Time magazine released a prose poem by Titus Kaphar entitled "I Cannot Sell You This Painting," intended to accompany his painting that appears on the cover of the June 15 issue of the magazine. It is a deeply moving and powerful short work, and I would encourage everyone to read and meditate on it. (I have posted links at the end of this essay to the poem and a few other pieces I touch on.) Well into the poem, Kaphar cries out, "Do/ not/ ask/ me/ to/ be/ hopeful;" and then again a little further on, he observes, "And so those without hope…/ Burn./ This Black mother understands the fire./ Black mothers/ understand despair./ I can change NOTHING in this world,/ but in paint,/ I can realize her…./ This brings me solace…/ not hope,/ but solace."
In January 2019, the extraordinary young climate change activist Greta Thunberg concluded her stirring address to the World Economic Forum in Davos in a similar fashion:
Adults keep saying: “We owe it to the young people to give them hope.” But I don’t want your hope. I don’t want you to be hopeful. I want you to panic. I want you to feel the fear I feel every day. And then I want you to act.
I want you to act as you would in a crisis. I want you to act as if our house is on fire. Because it is.
As someone who has just completed a book with the title Invoking Hope: Theory and Utopia in Dark Times, I find that both Kaphar and Thunberg raise questions that have been on my mind for a good while now. There is a long tradition of deeply committed artists and intellectuals decrying the false comforts of shallow hope or easy optimism. Such a critique is at the heart of Joseph Conrad's summary statement on European civilization, issued at the dawn of the twentieth century and in response to atrocities he witnessed first-hand: "The horror! The horror!" It is also at the center of Theodor Adorno's formulation of negative dialectics; and in his book of that title, Adorno notes, "People to whom despair is not a technical term may ask whether it would be better for nothing at all to be than something."
In the preface to his first collection, Na Han (Call to Arms) (1922), Lu Xun—in a passage I often refer to in my teaching—recalls a parable he had presented a few years earlier to a close friend:
Imagine an iron house without windows, absolutely indestructible, with many people fast asleep inside who will soon die of suffocation. But you know since they will die in their sleep, they will not feel the pain of death. Now if you cry aloud to wake a few of the lighter sleepers, making those unfortunate few suffer the agony of irrevocable death, do you think you are doing them a good turn?
Lu Xun tells us that his friend replied, “But if a few awake, you can’t say there is no hope of destroying the iron house.” Lu Xun then concludes his anecdote by noting, “True, in spite of my own conviction, I could not blot out hope, for hope lies in the future. I could not use my own evidence to refute his assertion that it might exist. So I agreed to write, and the result was my first story, ‘A Madman’s Diary.’ From that time onwards, I could not stop writing, and would write some sort of short story from time to time at the request of friends, until I had more than a dozen of them.”
I am convinced that Lu Xun's words offer an effective means to understand the import of what Kaphar, Thunberg, Adorno, and Conrad each touch on in their own unique way. I would suggest that all of these great figures are in fact not referring to authentic hope, but rather to its shallow cousin, optimism—what we might call, inverting Antonio Gramsci's famous formula, an "optimism of the intellect," especially when, as Thunberg so effectively underscores, such an optimism is accompanied by a "pessimism of the will." And in this I fully concur with Gramsci in relation to his terrible moment, and with Kaphar and Thunberg in relation to our own: there is nothing more dangerous than the false promise of this shallow form of hope.
However, the true negation of hope is despair, and despair—a pessimism of both the intellect and the will—is expressed in passivity and silence; and it is precisely such passivity and silence that both Kaphar and Thunberg refuse in the very act of writing. A figure of such despair is to be found in the novel that is perhaps Conrad's masterpiece, Lord Jim (1900): Captain Brierly, who, we learn, "committed suicide very soon after" coming to a fundamental realization about himself ("A man may go pretty near through his whole sea-life without any call to show a stiff upper lip. But when the call comes . . . Aha! . . . If I . . ."). The opposite of Brierly, and a figure for Conrad of authentic hope, is the narrator of both "Heart of Darkness" and Lord Jim, Charles Marlow, who tells the truth of what he has experienced to his sole remaining audience member on board the Nellie and to the "privileged man" who receives his final letter. For Marlow, as for Conrad, it is lies—and that most insidious form of the lie which is silence—that are "too dark – too dark altogether . . ." Similarly, in the case of Lu Xun, Adorno, Thunberg, and Kaphar, the depth of their hope becomes manifest in the fact that even though each of them "knows" things are hopeless (pessimism of the intellect) they nevertheless continue to act (optimism of the will), painting their pictures, telling their stories, teaching their students, and working to change the world. (Need I add that such a stance is very different from either romanticism—an enthusiastic but quickly spent optimism of the intellect and will—or cynicism—a pessimism of the intellect coupled with a merely performed optimism, the willingness to act as if everything is fine even though we know the house is on fire or the air is running out in the iron house.)
They all offer their message, as Adorno and his co-author Max Horkheimer write near the climax of Dialectic of Enlightenment, “not to the ‘masses’ and not to the individual (who is powerless), but to an imaginary witness—lest it perish with us.”
We need this form of hope now more than ever in these increasingly dark times, and I truly thank people like Thunberg and Kaphar for giving such a gift to us. It is such a hope that Pope Francis also had in mind when he recently observed, "Optimism disappoints, but hope does not."
* * *
Here are the links to some of the essays I discuss. I take up Gramsci’s message, and present the Adorno and Pope Francis quotes, in the Conclusion to Invoking Hope. The Conrad citations are from Heart of Darkness and Lord Jim.