Thursday, September 29, 2016
Saturday, September 17, 2016
Give credit where it’s due.
For better or worse, Kim Kardashian has a major platform for airing her views. In publishing a full page advertisement in today’s New York Times directing attention to ongoing Turkish efforts to deny the Armenian genocide (most recently with the publication of the advertisement in the Wall Street Journal by a group called “Turkic Platform”), she is using her pop culture pulpit for good. She might have been somewhat more eloquent in describing her objections (she uses the generic terms “crap” and “crappy” a bit too much for my taste), but the thrust of her argument is clear: it is important to “honor the TRUTH IN OUR HISTORY.” And yes, “Education Matters.”
That said, I was surprised to see KK employ a questionable counterfactual claim to rhetorically enhance the urgency of her appeal. In the last paragraph, she writes: “Many historians believe that if Turkey had been held responsible for the Armenian genocide, and reprimanded for what they did, the Holocaust may not have happened. In 1939, a week before the Nazi invasion of Poland, Hitler said, ‘Who, after all, speaks today of the annihilation of the Armenians?’”
Having written extensively about counterfactual claims on the Holocaust (see my book, Hi Hitler!), I would be interested to learn which scholars KK is thinking of. I don’t know of any historians who have employed this particular “what if” claim. To be sure, many historians have used the famous (albeit highly disputed) Hitler quotation to suggest that the world’s ostensible indifference to the Armenian genocide emboldened Hitler to pursue his Final Solution of the Jewish question in radical form. In other words, this claim by historians includes an IMPLIED counterfactual. But to my knowledge, it’s rarely, if ever, been explicitly expressed (certainly not among scholars of German or Jewish history). KK is thus overreaching.
I did a little digging, however, and found some claims in some texts produced by writers of Armenian descent. To cite one example, Marian MacCurdy writes that “If the Armenian genocide had been recognized, it is possible that the Holocaust would not have occurred.” (The Mind’s Eye: Image and Memory in Writing About Trauma, p. 164). Amos Elon also quotes an Armenian official in Jerusalem observing, “The Armenian holocaust was forgotten or ignored. If it had not been ignored, perhaps Auschwitz would not have happened.” (Elon, Jerusalem: Battlegrounds of Memory, p. 226). Taking a more skeptical stance, Stefan Ihrig’s study, Justifying Genocide: Germany and the Armenians from Bismarck to Hitler declares, “the argument that without the Armenian Genocide there would not have been a Nazi Holocaust is unnecessary and to some extent folly” (Ihrig, p. 357).
Indeed, KK hardly needs to employ her counterfactual to defend her appeal, which remains fundamentally legitimate. Too many Turkish officials, academics, and others have denied the truth of what happened to the Armenians in the First World War. (By the same token, plenty of Turkish academics, journalists, and writers have taken a more critical and honest approach to the highly politicized subject).
The Armenian question aside, what does KK’s claim say about the current state of counterfactuals?
On the one hand, her advertisement reveals and reinforces the appeal of counterfactuals, which retain considerable rhetorical power. Her ad further enhances the value of speculative thinking, as she employs it in defense of historical truth. This is important for a variety of reasons. In our increasingly “post-fact” and “post-truth” world, it is critical that we do, in fact, “honor the TRUTH IN OUR HISTORY.” For the record, this is the message that stands at the core of the forthcoming film, Denial, starring Rachel Weisz as historian Deborah Lipstadt. (See my forthcoming review in The Jewish Review of Books).
Yet, despite KK’s embrace of counterfactual reasoning to promote the cause of truth, many skeptical observers continue to see the former as antithetical to the latter. Counterfactual history is often accused of contributing to the increasingly blurred boundaries between fact and fiction, between historical truth and outright denial. These trends are often blamed on the rise of postmodern culture, which has allegedly nurtured them with its relativistic spirit. There is a good deal of validity to this claim. But it would be entirely misguided to throw the counterfactual baby out with the dirty bathwater of historical denialism. Like any historical methodology, counterfactual speculation can be used for a wide range of trivial, mischievous, and even nefarious ends. It can also be used – as this blog has long maintained – to pursue the goal of enlightenment.
I wonder whether we are at a crossroads with respect to counterfactual history. On the one hand, we are clearly in the midst of a new “golden age” for the discipline, as seen in the proliferation of alternate history novels, web series, and television shows (coming soon: a blog post of mine on this theme for the Organization of American Historians). But the growing disaffection with the relativistic reality of western intellectual and cultural life may lead to a backlash.
I have been wondering if we are in store for a paradigm shift within western historical consciousness. Peter Novick’s celebrated study, That Noble Dream, convincingly shows how the American historical profession has vacillated between waves of support for the belief in objective truth and the belief in relativism. If, as I suspect, the free-wheeling relativism of our present-day world (on the World Wide Web, in our political discourse, and beyond) is going to stoke popular demand for a return to standards of objectivity, what will be the consequences for counterfactual thinking? Will it become re-stigmatized all over again as culpable for our excesses of relativism? Will it become the Socrates of historical methodologies, blamed for corrupting the minds of the young and impressionable?
All of these questions deserve more thought than I am able to provide at this juncture. But I hope to revisit them going forward as I continue researching the history of counterfactual history.
Saturday, August 27, 2016
David Means’s new novel, Hystopia, has been on my must-read list for some time now, so I was eager to plunge into it while out of town last week. I’m happy to report that it met many – though not all – of my expectations. I particularly liked its metafictional elements (it’s a novel within a novel, replete with fake introductory author’s and editor’s notes, as well as other lit crit marginalia). The novel also features powerfully drawn characters (mostly traumatized Vietnam veterans who struggle with various forms of PTSD). That said, the novel was less interesting from a counterfactual perspective.
In a sense, Hystopia can be seen as a quasi-alternate history. To be sure, its premise is firmly rooted in a key point of divergence: President John F. Kennedy escaping assassination by Lee Harvey Oswald in November of 1963. Means furthermore adds other counterfactual events pertaining to the President’s life, imagining him avoiding death five more times until finally succumbing to a seventh successful assassination attempt during his third term in the fall of 1970.
Means does not exploit the full potential of this point of divergence, however. JFK’s survival hovers mostly in the background of the novel, whose plot is mostly taken up with the psychological struggles of its central characters. Ostensibly, Kennedy’s policy decisions worsen the characters’ difficulties coping with their war experiences. The novel imagines JFK continuing the U. S.’s involvement in Vietnam, despite its many obvious costs to the nation; in the process, his policies worsen domestic social tensions, especially among ex-soldiers, white blue-collar workers, and urban African Americans. Worse still, JFK’s poor leadership contributes to the eruption of violent race riots in the late 1960s and the early 1970s -- in metropolises like New York, L. A., and Detroit, as well as in smaller cities like Flint, Michigan. Indeed, entire parts of the state of Michigan (from which Means hails) seem to be cordoned off into Mad-Max-like wastelands.
Means never really explains the causal links between Kennedy’s continuation of the war and America’s dystopian turn, however. He does not show, for instance, why the continuation of the war in alternate history ends up having worse consequences than President Lyndon Johnson’s continuation of the war in real history. Nor do we learn much concrete about why Oswald’s assassination attempt failed, why so many subsequent attempts on Kennedy’s life also failed, or why the final one succeeds.
Literary works of alternate history, to be sure, can present their points of divergence in subtle, allusive form and do not have to spell every detail out for readers. In fact, the primary flaw of most alternate history novels is clunky and excessive exposition. On this count, Means avoids a major mistake made by other novelists. At the same time, however, he fails to connect his novel’s allohistorical context to its broader plot. He thus misses an opportunity to create a more integrated work of counterfactual fiction.
On this count, I thought of a few literary “what ifs.” Couldn’t Means arguably have set his story in the real historical context of LBJ’s America and explored the psychological turmoil of his novel’s characters in the same way? Alternatively, couldn’t he have set the tale in a dystopian future (switching the Vietnam War for some other conflict) and pretty much kept his plot as is (at least in terms of his characters’ dysfunctional relationships)? I would hazard to say the answer to both questions is yes. The fact that Means sets his story in a counterfactual historical context but fails to extract more from its dramatic possibilities ends up being something of a let-down – at least to this reader.
These quibbles notwithstanding, Means’s embrace of alternate history – however faint -- can nevertheless be welcomed as another sign of the genre’s increasing popularity and legitimacy.
Thursday, August 11, 2016
As I continue to work on my new book on the history of the Fourth Reich (a project that, incidentally, employs abundant counterfactual reasoning), I’ve tried to stay up to date about the latest goings-on in the world of “what if?”
Two recent essays struck me as simultaneously discouraging and heartening with regard to the future of counterfactual history.
On the one hand, a recent short essay in The Guardian by Nicholas Lezard entitled, “Altered Pasts Review: Counterfactual Histories Should Be Fun,” gets no objection from me in declaring that “We love a good counterfactual, don’t we? They are a bit of fun, in which we tweak history’s nose by imagining what might have been.” However, Lezard then proceeds to lose me entirely by endorsing many of Richard Evans’s ill-grounded objections to “what if” thinking and by ultimately concluding – in overly sweeping fashion – that “counterfactuality is not a respectable historical tool, so don’t treat it like one.”
I don’t know what Lezard’s definition of “respectable” is, but I would like to think we are past the point where such baseless accusations need to be recycled – past the point where we must rehash all the reasons why counterfactual reasoning is not only essential to historical analysis, but has always been a tool (however unacknowledged) used by the leading figures in the western historical profession. (For what it’s worth, I plan on meticulously documenting this fact in a future study of the field).
On the more positive side of the ledger: Niall Ferguson and Graham Allison have recently received a decent bit of attention for their Applied History manifesto, “Establish a White House Council of Historical Advisers Now,” which was recently published in abridged form in The Atlantic.
Anyone interested in counterfactual history will be thrilled to see it endorsed by Ferguson and Allison as one of the key ways in which historians can contribute to policy making decisions.
They write as follows:
“A fifth type of assignment where applied historians could be helpful in the current policymaking process: by posing and answering “What if?” questions designed to analyze past decision-making. Addressing such questions requires disciplined counterfactual reasoning. While many mainstream historians have voiced reservations about counterfactual analysis, this method lies at the heart of every historical account. As one of us argued in Virtual History, “it is a logical necessity when asking questions about causation to pose ‘but for’ questions, and to try to imagine what would have happened if our supposed cause had been absent.”
“When assessing the relative importance of various possible causes of WWI, historians make judgments about what would have happened in the absence of these factors. Methods developed for doing this systematically can be employed by applied historians in considering current policy choices. Thus, President Obama’s successor could ask his Council of Historical Advisers to replay 2013. What if Obama had opted to enforce the “red line” in Syria against the Assad regime, rather than delegating the removal of chemical weapons from Syria to the Russian government? And what if, in January 2014, the EU had not offered Ukraine an economic association agreement that was clearly designed to pull Kiev westwards? Would President Putin have intervened militarily in Ukraine?”
People may differ on the value of historians diving into political work, as Jeremy Adelman has recently written in The Chronicle of Higher Education. But whatever one’s views on the subject, it is significant that the manifesto elevates historical “what ifs?” to such prominence.
At the very least, it refutes Lezard’s erroneous claim that they are anything but “respectable.”
Wednesday, August 3, 2016
It's the summer doldrums and I'm taking a bit of a break from blogging about historical counterfactuals.
But I thought I'd post a link to a new review that I just published in The Forward on Simone Zelitch's novel, Judenstaat, about a Jewish state being established in Saxony after World War II.
The novel was briefly profiled in this past weekend's New York Times, along with Underground Airlines. I'd like to think I engage with the book's absorbing narrative in greater depth.
Tuesday, July 5, 2016
My summer alternate history readings list just got longer.
While I still have to crack open David Means’s Hystopia, I just saw today’s New York Times article about the publication of yet another prominent alternate history novel, Ben Winters’ Underground Airlines.
Not surprisingly, the two novels deal with “what if?” scenarios involving two of America’s enduring traumas: the Vietnam War and slavery. I look forward to posting my thoughts on them later this summer.
But several things struck me in reading the Times piece on Underground Airlines.
First, I was disappointed by the failure to mention that the novel is a work of alternate history. Winters probably conceived the novel self-consciously as belonging to the genre (he is quoted as having read Philip K. Dick’s and Philip Roth’s classic works). Regardless, the failure of the article’s author, Alexandra Alter, to even mention the genre’s existence speaks, to my mind, to alternate history’s ongoing struggle for acceptance and legitimacy in the mainstream press and reading public.
One would think that after all the prominent contributions that have appeared in recent years, it would be de rigueur to note that any novel based on a historical counterfactual belongs to an established literary tradition. Apparently, we’re not yet there…
Second, and this is a point having less to do with “what ifs?” than identity politics, I was struck by the multiple comments involving the “controversial” dimensions of Winters (a white male Jewish author) writing about a black protagonist.
Winters is quoted in the article saying: “No one tried to talk me out of it, but my wife at one point said, ‘Boy, it would be better if you were black.’” … “My agent might have said something similar, that the reception of the book would go down easier if it was an African-American author.”
The article goes on to add:
“At least one publisher passed on the book, arguing that it might be too controversial in the context of the Black Lives Matter movement, Mr. Winters said.”
The article further notes:
“The African-American writer Attica Locke, a mystery novelist and a writer for the television show ‘Empire,’ said she was taken aback at first when she picked up the book and saw the author photo. ‘The premise was just like, “Wait, what now?”’ Ms. Locke said. ‘For me, as a black writer, I have to be like, “What’s Ben trying to do here?”’ Then she got sucked into the story and was ‘blown away,’ she said. ‘There’s always this chatter about who gets to tell which stories, and I’m so grateful that he did not let his choice to have a black protagonist scare him away from the project, because this is everybody’s history,’ she said.”
All the above quotations obviously speak to the ongoing difficulty of speaking about race in the U. S. And there are plenty of reasons why this should be so.
The very idea, however, that – especially in fiction, the art of the imagination – some people may be un- or under- or dis-qualified from speaking about subjects because of their identity is woefully ill-considered.
Was (white writer) Terry Bisson wrong for featuring black protagonists in his alternate history novel, Fire on the Mountain? Was the non-Jewish writer Martin Amis ill-equipped for writing about the Holocaust in Time’s Arrow? The list goes on and on and on….
Works of fiction (like film, music, theater, etc.) can be judged on all kinds of aesthetic and ethical criteria, but to prejudge works of fiction based on superficial assumptions about the identity of the author strikes me as, well, superficial.
Human beings have the capacity to show empathy for other human beings. This is the job of literature at its core. And it’s our obligation to one another as members of the human species. As Winters himself notes:
“The whole art form is about empathy,” Mr. Winters said. “No, I will never know what it’s really like to be black, but I can, through as much imagination as I can bring to it, create this individual. That’s my job.”
Thankfully, everyone who touched on the issue in today’s Times article ultimately came off as reasonable, but I’m afraid this trend will lead to some less praiseworthy episodes in the future if it continues.
Monday, July 4, 2016
A July 4th Counterfactual: Jefferson's Deleted Condemnation of Slavery from the Declaration of Independence
Today’s New York Times contains a sobering op-ed that counterfactually reminds us of the missed opportunities associated with our otherwise celebratory July 4th holiday.
According to historian Robert Parkinson,
“The Declaration’s beautiful preamble distracts us from the heart of the document, the 27 accusations against King George III over which its authors wrangled and debated, trying to get the wording just right. The very last one — the ultimate deal-breaker — was the most important for them, and it is for us: “He has excited domestic insurrections amongst us, and has endeavored to bring on the inhabitants of our frontiers, the merciless Indian savages, whose known rule of warfare is an undistinguished destruction of all ages, sexes and conditions.” In the context of the 18th century, “domestic insurrections” refers to rebellious slaves. “Merciless Indian savages” doesn’t need much explanation.”
“In fact, Jefferson had originally included an extended attack on the king for forcing slavery upon unwitting colonists. Had it stood, it would have been the patriots’ most powerful critique of slavery. The Continental Congress cut out all references to slavery as “piratical warfare” and an “assemblage of horrors,” and left only the sentiment that King George was “now exciting those very people to rise in arms among us.” The Declaration could have been what we yearn for it to be, a statement of universal rights, but it wasn’t. What became the official version was one marked by division.”
To understand the profound regret that characteristically informs this “missed opportunity counterfactual,” it helps to re-read the original draft of the Declaration penned by Jefferson.
As is made clear on the Library of Congress website, Jefferson originally included among King George III’s “long train of abuses & usurpations” the following complaint:
“he has waged cruel war against human nature itself, violating it's most sacred rights of life & liberty in the persons of a distant people who never offended him, captivating & carrying them into slavery in another hemisphere, or to incur miserable death in their transportation thither. this piratical warfare, the opprobrium of infidel powers, is the warfare of the CHRISTIAN king of Great Britain. determined to keep open a market where MEN should be bought & sold, he has prostituted his negative for suppressing every legislative attempt to prohibit or to restrain this execrable commerce: and that this assemblage of horrors might want no fact of distinguished die, he is now exciting those very people to rise in arms among us, and to purchase that liberty of which he has deprived them, & murdering the people upon whom he also obtruded them; thus paying off former crimes committed against the liberties of one people, with crimes which he urges them to commit against the lives of another.”
Jefferson later explained the deletion of this anti-slavery passage in his Autobiography, where he wrote:
"The pusillanimous idea that we had friends in England worth keeping terms with still haunted the minds of many. For this reason, those passages which conveyed censures on the people of England were struck out, lest they should give them offense. The clause, too, reprobating the enslaving the inhabitants of Africa was struck out in complaisance to South Carolina and Georgia, who had never attempted to restrain the importation of slaves, and who, on the contrary, still wished to continue it. Our Northern brethren also, I believe, felt a little tender under these censures, for though their people had very few slaves themselves, yet they had been pretty considerable carriers of them to others."
Predictably, many observers have wondered how American history would have unfolded if the paragraph had been included.
The British Library has opined: “Now, we're not going to enter here into the debate about Thomas Jefferson's attitude to slavery. He expressed opposition to the slave trade throughout his career and in 1807 he signed a bill that prohibited slave importation into the United States; that said, Jefferson was also the owner of hundreds of slaves. However, it does strike us that this passage, with its forthright language ('this piratical warfare', 'this execrable commerce'), could easily have changed the course of history if adopted in America as early as 1776.”
Harry Jaffa has argued in A New Birth of Freedom (p. 478): “It remains a matter of profound regret that [Jefferson’s original words]…did not remain in the text. They would have made impossible the perversity of [Supreme Court justice Roger B. Taney, who handed down the Dred Scott decision in 1857] and [Stephen] Douglas’s misrepresentation of the Declaration of Independence. (Douglas had agreed with Taney that the signers of the Declaration had not meant to include Negroes in their equalitarian pronouncement).”
It would be interesting to see how other scholars have wrestled with the counterfactual implications of Jefferson’s deleted words. Reflecting on these questions about the origins of American independence lends deeper meaning to a holiday otherwise devoted to consuming mass quantities of charred meat.