Filmmaking, Reality and Fact: How Documentaries Shape Americans’ Ideas of Truth

 

Documentaries—screened, broadcast, and streamed—are more appreciated and influential today than ever before. It’s been said we live in a Golden Age of nonfiction filmmaking. At the same time, media phenomena like “alternative facts” and “fake news” lead some to suggest we live in a “post-truth” era. Together, these trends pose a major challenge for viewers trying to gain an understanding of contemporary politics, and for those who study America’s past. 

How did we get here? Given that a sizable percentage of Americans get their sense of history through television documentaries, the history of nonfiction filmmaking offers insights. How documentary filmmakers have struggled to discover and convey a sense of the real world—succeeding, failing, and compromising along the way—provides a basis for understanding and appreciating the difficulty and importance of representing and determining evidential truth in times when facts are discounted, distorted, or ignored. 

Since the first movies of the 1890s, nonfiction filmmakers have been driven by curiosity about the real world and changing times—and shaped by interactions with their audiences. Three documentaries focused on families mark changes in both filmmaking techniques and philosophical arguments about how film and television have represented reality during the past one hundred years.

By 1922, the first Dream Factories of Hollywood dominated international movie production, but a less-than-glamorous story about an Inuit family of seal hunters in the Canadian Arctic was a surprise box office hit. Shot on location without trained actors, Robert Flaherty’s Nanook of the North is considered the first documentary film narrative. With moments of drama, suspense, humor, and poignancy—unlike previous travelogues—Flaherty mostly portrays Inuit people as complete human beings, not primitive sources of detached curiosity for his largely white audience, many with attitudes of racial superiority.  

Yet, as “realistic” as it appeared (and sought to appear), the truth of Nanook was affected, at least partly, by the filming process. Since the heavy hand-cranked silent cameras of the day limited Flaherty’s ability to respond spontaneously to changing action, he staged much of what he shot. Without the capability to record sound, narrative information and context are provided by title cards. 

At a time when distinctions between fiction and nonfiction films were yet to be defined, Flaherty still “cast” his characters: his “star” is a respected seal hunter who was not actually known as Nanook (Bear); his real name, Allakariallak, would have been unpronounceable to English speakers. The woman who plays Nanook’s wife, “Nyla” (Smiling One), isn’t his actual spouse, but was chosen for her beauty. 

Notwithstanding these creative licenses, Flaherty intended to act as a cultural preservationist. He aimed to convey a sense of the Inuit as they lived before the influence of the white man. Even though modern academic anthropologists disdain this as “salvage anthropology”—an approach that ignores native people as they are in order to convey an idealized vision of the past—Flaherty wasn’t making a deliberate attempt to deceive. Despite creative liberties, Inuit life as portrayed in Nanook represents the reality of a forbidding arctic environment where starvation was a true threat to survival. Facing criticism, Flaherty openly admitted, “sometimes you have to lie to tell the truth.” 

The idea that small facts can be sacrificed in service to larger truth remained a cornerstone of the craft of documentary film even as modern technology made it possible to record events without more obtrusive staging.

During the 1960s, the invention of light, portable cameras, more sensitive film stock, and easily synchronized sound lifted many of the restraints that had hindered Flaherty. The result was cinéma vérité, a new style and philosophy of nonfiction moviemaking that transformed portrayals of reality on film.  

Vérité pioneers like Bob Drew, Ricky Leacock, Al and David Maysles, D.A. Pennebaker, and Fred Wiseman sought to capture experience and environment, to give viewers a sense of “being there,” not necessarily to communicate factual information. They vehemently objected to authoritative narration, interviews, and the emotional guidance provided by a separate musical soundtrack. They wanted the audience to be actively involved in evaluating and determining the truth of what was shown, not to be told how to interpret what they were seeing. 

Building on the cinéma vérité tradition, the 1973 PBS/WNET series An American Family closely observed the lives of the upper middle-class Loud family of Santa Barbara, California. While previous vérité films, like Drew and Leacock’s Primary, which focused on the 1960 Democratic presidential race, suggested a new kind of journalism, An American Family producer Craig Gilbert saw his project in terms of a sociological study. The series was recorded by a husband-and-wife camera and sound team, Alan and Susan Raymond, who virtually lived with the Louds during filming. Despite those early sociological intentions, during the editing process the twelve-part series morphed into a kind of real-life soap opera, with portents of reality TV to come.  

As it turned out, along with familiar upper-middle-class activities, life with the Louds unexpectedly included adultery, frequent shots of vodka, and the uninhibited lifestyle of a son, Lance, who was proudly gay at a time when openly gay characters were missing from television and movies. He attracted critical mockery, but later proved to be an example of changing attitudes in the decades to come. As film scholar Jeffrey Ruoff wrote: “Lance Loud didn’t come out on American TV, American television came out of the closet through An American Family.”

As An American Family unfolded, the normally sedate public television audience was divided between fascination and disdain. For many viewers, although the Louds weren’t scripted actors, it was easy to see them less as real people and more like performers playing themselves. The result was a new kind of celebrity and questions that added to doubts about the veracity of screened reality. Did the filmmaking process affect the truth it purported to capture? 

Even with these questions, An American Family offers valuable, if filtered, historical evidence, not only about family life in the 1970s, but also the evolution of portrayals of the real world on film and television. Sadly, the outtakes of a popular TV show aren’t often considered serious source material for future study. Most of the three hundred hours of original footage shot by Alan and Susan Raymond was destroyed during a WNET housekeeping session.

While cinéma vérité purists scorn traditional documentary approaches like narration and interviews, staples of television since the 1950s, Ken Burns—perhaps the best-known and most successful contemporary American historical documentarian—is proudly old-fashioned. Burns’s 2014 seven-part, fourteen-hour PBS family portrait The Roosevelts: An Intimate History engagingly traces the intertwined lives of Theodore, Franklin, and Eleanor, drawing on deep and careful archival research, informative narration written by longtime Burns collaborator Geoffrey Ward, and the on-camera guidance of interviews with authorities and eyewitnesses. 

As an evidential historian, Burns benefits from the evolution of nonfiction filmmaking genres between 1920 and 1950, especially newsreels. Teddy Roosevelt wasn’t the first president to appear on film. That was William McKinley in 1896. But TR’s physically flamboyant speaking style made him a compelling “picture man” on silent movie screens. 

Sound movies and radio were boons for FDR and Eleanor, providing a vivid sense of who they were in “real life” as well as the world they inhabited—impressions that are harder for print chroniclers to convey. Yet even though contemporary documentarians may access a wealth of facts and engaging archival imagery, questions about how documentaries are made, dating from the days of Nanook, continue, perhaps inevitably, to generate questions about truth on film.

Like his fiction filmmaking counterparts, Ken Burns considers himself a storyteller, “an historian of emotions,” emotions being “the glue of history.” Most viewers found The Roosevelts moving as well as informative. However, fairly or not, the sense shared by filmmakers and audiences that movies are an emotional medium has long been a burden for nonfiction filmmakers with informative intentions. Despite, or perhaps because of, Burns’s popular success, many academic historians are suspicious of film, considering it a source of impressions and feelings rather than of the purported intellectual depth and rigor of the printed word.

Notably, Burns has been criticized for a tendency to view historical stories like the lives of the Roosevelts, and his most lauded effort, The Civil War, in terms of a series of trials that ultimately lead to an uplifting triumph over adversity—a narrative arc found in fiction. Although Burns doesn’t ignore social and political injustices, especially concerning race, as a skilled popular historian he doesn’t dwell on unfinished business that could disturb or challenge viewers who look to historical documentaries to provide a sense of closure.

Compensating for this, “open wounds” from the more recent past are often the subject of less reassuring documentaries like The Central Park Five. An uncharacteristic Burns film, made in collaboration with his daughter Sarah and her husband David McMahon, The Central Park Five investigated the unjust conviction of a group of African American and Latino teenagers accused of a brutal (and sensationally reported) rape. 

The nonfiction film history I’ve sketched reveals the persistent limitations and compromises of documentary film as well as the medium’s expressive strengths. Just as academic historians bicker over differences of interpretation and the importance of archival minutiae without dismissing the idea of truth itself, serious documentarians remain committed to reflecting an honest sense of the real world.  

This is especially vital at a moment when people may toggle between believing what is staged and edited, cynically rejecting everything as fake, or selectively embracing evidence that affirms their beliefs. Polarized audiences can be tempted to dismiss documentaries as just another kind of entertainment—no more true than fiction films. 

Such attitudes are encouraged by the reality TV presidency of Donald Trump, whose administration is known for a dramatized attitude toward the real world, an entertainer’s ability to appeal to emotions, tweets that offer diverting plot twists, and attention-grabbing behavior that delights fans and outrages the less enamored. 

If this weren’t enough, there could be more uncertainty ahead. Technology has transformed the possibilities and supported new forms of nonfiction storytelling and will affect traditional historians as well. The emerging realm of “deepfake” imagery threatens to manipulate real images beyond detection, and artificial intelligence can target viewers with the stories they prefer to see, true or not.

Like all nonfiction, documentary films have always depended on credibility, even when the building blocks of moving images have been staged, selectively edited, or chosen above others for reasons of narrative. Of course, print historians face similar challenges in constructing narratives that marry facts with interpretations, rather than simply assembling encyclopedic lists of dates and names. Unlike a documentarian, a fiction filmmaker can have a flop and move on. For a creator of nonfiction films, losing audience trust is catastrophic.

Viewers can lose out, too. If they make choices about what they believe based on entertainment appeal or personal preference rather than thoughtful evaluation, even a Golden Age of documentaries can become counterfeit. American democracy—and an accurate appreciation of our past—will be the worse for it.

"Rogue" Manufacturing in China: Past and Present

 

China’s rapid ascent as an economic superpower at the turn of the twenty-first century has fueled considerable global anxiety. The Trump administration’s recent decision to adopt an aggressive, no-holds-barred tariff war with China is one such expression of concern. At the core of the unease is the sense that China does not play by the rules and engages in rogue manufacturing and industrial espionage. While copying and trade theft occur worldwide, China has become identified as the most egregious offender in the production of fakes and knockoffs, portrayed as unable to innovate and add productively and originally to the global economy. 

 

In China itself, however, copying is not necessarily unethical, nor is it mutually exclusive of innovation. Manufacturing products by copying preexisting brand-name items is often associated with the term shanzhai. Originally literary in nature, referring to the mountain strongholds in which rogue heroes pursued extralegal justice, shanzhai is now translated as “knockoff” or “local imitation,” and is linked to the culture of underground factories located in manufacturing hubs such as Shenzhen that deliver local adaptations of brand products. Despite critics in the West raising alarm bells over Chinese counterfeits, Chinese leaders have come to celebrate shanzhai practices and promote the adaptation of imported technologies as crucial to “indigenous innovation.” Some Western journalists have even started to praise the culture of shanzhai in Shenzhen by likening the city to Silicon Valley and celebrating the area as a tech “nirvana.”

 

If shanzhai practices characterize China’s economic growth today, manufacturing that resembles shanzhai has a longer history. In the early twentieth century, unconventional and homegrown ways to build industry, which included drawing from global knowledge and engaging in strategic emulation, were avidly pursued. One particularly colorful individual engaged in such practices was Chen Diexian (1879-1940), a novelist and pen-for-hire, a professional editor/translator and dabbler in chemistry, and, eventually, a patriotic captain of industry. A new-style entrepreneur, Chen Diexian deftly navigated China’s early twentieth-century transition to industrial modernity. The fledgling republican state was weak and distracted, busy staving off internecine warfare and threats from abroad. Economic imperialism loomed large and Chinese entrepreneurs faced daunting challenges in developing native industry. Yet, while political chaos reigned and economic imperialism seemed insurmountable, the rise of vibrant treaty-port economies and modern print and light manufacturing industries also allowed unexpected opportunities to emerge. The decline of orthodoxy and tradition that had followed the fall of the empire meant that enterprising individuals could engage in new regimes of knowledge and pursue new endeavors. 

 

Taking advantage of these unprecedented opportunities, Chen Diexian was one such enterprising individual. He dabbled in scientific activity and developed commercial enterprises, both lettered and material. He translated texts on and explored regimes of chemical and legal knowledge, adapted foreign technologies, and openly pursued profit—activities once deemed unthinkable for respectable men in late-imperial China. Productive in the making and selling of words and things, Chen turned his writer’s studio into a chemistry lab, shared brand-name manufacturing formulas as “common knowledge” in newspapers, and used proceeds from his romance novels to manufacture the incredibly popular “Butterfly Brand Toothpowder,” unique in its ability to double as face powder. Pursued at a moment when China was experiencing penetrative economic imperialism, Chen’s industrious activities constituted a form of “vernacular industrialism” that was local and “homegrown” (as opposed to foreign), informal and part of China’s consumer culture (rather than state-sponsored or academic-oriented), and artisanal and family-run, if eventually located in factories. 

 

A key element of Chen's unconventional route to entrepreneurship and vernacular industry building was a "Do-It-Yourself" maverick approach to manufacturing. As a self-proclaimed nativist not able to speak a single foreign language, Chen tapped into global networks of knowledge by employing practices of collaborative translation where fidelity was second to adapting texts to local concerns. He would then tinker with the translated technologies, improve foreign recipes, and present such adaptations as virtuous “emulation” crucial to the building of native Chinese industry. Chen’s iconic product, the Butterfly Brand Toothpowder, was ingeniously manufactured when Chen improvised a foreign manufacturing recipe by experimenting with local cuttlefish bones to source local calcium carbonate, a crucial ingredient. While an advocate of emulating foreign manufacturing formulas and technologies, he sought patents for his own recipes and gadgets, basing his claims of ownership not on original invention, but on improvement (gailiang), an approach that came to inform the National Products Movement, a “buy and manufacture Chinese goods” campaign. These practices were not examples of ignorance or deviousness, but instead demonstrate how copying, improvement and innovation were not always at odds. They also reveal the strategic agency of Chen, who despite being highly “local,” drew from far reaching circuits of law and science. 

 

Chen’s early twentieth-century vernacular industrialism can serve to remind contemporary observers that shanzhai practices of strategic emulation, hands-on tinkering, “open-source” know-how, assembling and incremental improvement to remake technology, all have a long history in China. By considering the two periods side by side, we can see how and why practices of DIY tinkering, copying, improving, and reassembling have come to be so closely associated with modern China and why they have come to be seen as “rogue” in global discourses. Republican-era vernacular industrialism and contemporary shanzhai manufacturing overlap insofar as they emerged during moments of China’s entry or reentry into global capitalism. 

 

Yet real differences between the two moments exist, reflecting not just the variant political implications of the manufacturing cultures but also China’s divergent place vis-à-vis global capitalism and the differences between capitalism in the two eras. In the earlier moment, the nascent Republic of China was struggling with unremitting economic and political imperialist pressures while being ripped apart by internal warfare and political fragmentation. Despite this inhospitable context, vernacular industrialists such as Chen were able to adapt new forms of industrial manufacturing that emerged with global developments in chemistry and physics to generate a patriotic and anti-imperialist National Products Movement. In this context, “rogue” practices for the purposes of import substitution were deemed necessary. The contemporary moment offers us a different iteration of the way China engages with global capitalism. Today’s shanzhai and counterfeit manufacturing emerges during a period when a strong post-socialist authoritarian state has been eager to reenter global capitalism and has adopted muscular policies that have enabled China to do so extremely successfully. Practices of shanzhai copying and adapting electronic and digital technologies of the current postindustrial global economy have been part of this success, helping fuel China’s ascent, even while generating anxiety among global competitors.

 

Both Chen’s vernacular industrialism in the early twentieth century and today’s shanzhai force us to rethink conventional narratives and normative ways of understanding ownership, innovation, and what constitutes industrial work and industrial development. For a Trump administration eager to demonize Chinese industry in pursuit of its trade war, the legal conceit that emulation precludes innovation is a convenient basis for castigating China for its purportedly flagrant disregard of intellectual property. Yet the history of shanzhai-style manufacturing reveals a more complex picture, one in which innovation and emulation have often been inextricably bound. This history, moreover, converges with contemporary discourses in maker movements and start-up cultures worldwide that embrace open-sourcing and shared access to knowledge, similarly threatening the easy distinction between imitation and innovation. All of this suggests that imposing any one set of norms on intellectual property in industry is likely to be a futile effort. Rather, as China’s economic power grows, intellectual property might end up looking more and more like shanzhai in the future.

DC Comics and the American Dilemma of Race

 

The sudden firing this past February of Dan DiDio as co-publisher of DC Comics continued a tumultuous era for the company. During DiDio’s decade-long tenure, DC’s comic universe frantically churned through four separate reboots/rebrandings–“The New 52,” “DC You,” “DC Rebirth,” and “DC Universe”–with a fifth allegedly in the offing for later this year. A consistent feature of this hyperactive course correcting has been an explicitly stated intent of building stories and heroes that more fully reflect the diversity of their real-world readers. Such an emphasis appears likely to continue. Speaking in the aftermath of the corporate turnover, now sole Publisher Jim Lee promised an even brighter future grounded in superheroes who are “inclusive and diverse.” 

It might be easy to think of this impetus toward inclusion within comics as a fairly recent phenomenon. But understanding diversity as only a 21st century preoccupation shortchanges what comic books have (and have not) been doing since Superman first flew onto the scene in 1938. Superhero popular culture, in fact, has always been embedded within American racial attitudes, reflecting and even contributing to them in ways that have set the stage for how we continue to grapple with these matters in 2020, especially in revealing that goodwill is not sufficient, in and of itself, to fix our problems.

Self-conscious explorations of racial issues in American comics date most fully from the late 60s and early 70s, when creators turned in earnest to the idea that they might use their medium to help build a more egalitarian world. No creators better embody this turn towards what would be termed “relevancy” in comics than Dennis O’Neil and Neal Adams, the creative team that produced the much-lauded “Green Lantern/Green Arrow” series. Wanting, as he later wrote, to do his part in the movement for civil rights, O’Neil used his four-color pulpit starting in 1970 to explore a range of issues, including urban and rural poverty, industrial exploitation, environmental degradation, overpopulation, and teen drug addiction. Alongside these, O’Neil and Adams also addressed race as the heroes encountered not only African Americans and Native Americans, but also the discrimination and marginalization such persons of color confronted on a daily basis. 

However, O’Neil and Adams’s work – despite its undoubtedly good intentions – reflected the limits of the liberal imagination of its time. Postwar liberalism offered grand visions of racial equality and harmony, but too often imagined the obstacles to these goals as discrete, misguided individuals rather than systemic inequities within U.S. society. Such an understanding readily translated to the good guy/bad guy duality in comics, and so Green Lantern and Green Arrow regularly dealt with corrupt slumlords and businessmen while leaving intact – when not leaving entirely unacknowledged – the structural problems fomenting race-based discrimination and impoverishment. Moreover, the series often put the onus for change on nonwhites themselves, chiding them to, in essence, get their act together and/or accept benevolent white assistance, implicitly casting them as part of the problem rather than its victims.  

If 60s and 70s comics were hemmed in by the liberal ideology prevailing within U.S. society, 80s and 90s comics found themselves trapped by the problematic understandings inherent within what would become known as “multiculturalism.” Nowhere is this seen more fully than in the super teams that developed during this era and would seem ready-made to promote inclusion. It turns out, however, that they fell prey to the limits of multiculturalism itself, which too often traded in superficial forms of inclusion as well as a flattening of nonwhite persons into racialized caricatures.

On the comic book page, inclusion often meant adding one nonwhite – and most often, black – member to an otherwise all-white lineup. The lauded relaunch of DC’s “New Teen Titans” by writer Marv Wolfman and artist George Pérez, for example, included the African American Cyborg as its only nonwhite member (setting aside, of course, the orange-skinned alien Starfire and the green-skinned Changeling, neither of whom represents any real-world form of racial difference). Other teams with wider-ranging diversity in their memberships traded in reductive stereotypes. The “Detroit-era” Justice League, for instance, added the Latino hero Vibe, who spoke in a stilted dialect and came from “the street,” as well as the African Vixen, who was as much defined by her sexuality as by her extranormal abilities. Even worse was DC’s “New Guardians,” an even greater conglomeration of stereotypical associations: an emotionless Japanese hero who was half human and half computer, a Chinese heroine with mystical abilities and an unrelenting sex drive, and a Latin American magician who likely could not have embodied more degrading stereotypes of homosexual men if his creators had tried.

If looking back reveals signposts marking the ways in which comics have long evidenced Americans’ limited success in addressing race, we might then wonder what DC’s hyperactive – if not hyperreactive – rebooting of its heroes suggests about U.S. society, particularly as these reboots have been inextricably linked to inclusion. Certainly, it reveals a well-intentioned will to build something better. But such will, as in the past, does not guarantee a better world, and Americans, not unlike DC, still struggle to enact real reform. As DC struggles to find solutions that are anything more than feel-good bromides, Americans remain contradictorily caught between the pleasant fiction of what we claim – and have always claimed – that this country represents regarding diversity, and the unpleasant realities of a president who brags about building a wall along our southern border, a government agency that separates immigrant families, and police officers who brutally slay young men simply because they are black. The result is a kind of paralysis: We remain hemmed in by the disjunction between our lofty ideals and the disturbing realities that goodwill and talk are insufficient to resolve. Until such realities are acknowledged, real change is no more likely than the fairy-tale happy endings that comics so often promise.

An Interview with "Most Wanted" Author Sarah Jane Marsh

 

 

Chelsea Connolly: While reading Most Wanted, it occurred to me that I couldn’t remember the last time I had read a children’s book, and your book helped remind me of how unique of a medium they are. Could you talk about what you enjoy most about being a children’s author?

Sarah Jane Marsh: I love the design challenge of crafting compelling nonfiction for children. I start with a vision of a story I’m fired up to share with kids. In Most Wanted, I wanted readers to understand why John Hancock and Samuel Adams were called out by name as troublemakers by the British government (reminiscent of our current U.S. President name-calling on Twitter). In the process, I’m whisking readers through ten years of American Revolution history...all in less than 2500 words. It’s challenging to channel all this inspiration and research into a compelling story within the format of a picture book biography. I’m writing to engage my young reader, but I must also consider the needs of the illustrator, publisher, teacher, and librarian. And the magic of narrative nonfiction is that it reads like fiction yet is historically accurate. So one immediate challenge is employing fiction techniques, such as narrative arc, character, pacing, and dialogue, without straying from the facts. And word count is my toughest constraint. An ideal nonfiction picture book is less than 2000 words. Each word has to hold its weight.

Chelsea Connolly: Children's books are just as much a visual medium as they are textual. This puts a great deal of responsibility on the illustrator as well as the author. What is the relationship between an author and an illustrator when crafting a children's book? How do you help and/or influence each other?

Sarah Jane Marsh: Interestingly, the author and illustrator don’t usually have direct contact. Our work passes through our editor and art director who help orchestrate the story. But it is important for the illustrator to first create their vision based upon the completed manuscript. This begins with sketches that my editor sends to me for comment. It’s my job to weigh in if something doesn’t work historically. I do compile resources for our illustrator from my research: images and primary source descriptions of buildings, objects, clothing, etc. My publisher also employs a historian to fact check both the manuscript and illustrations.

I also share my thoughts about overall theme and context. In both Most Wanted and Thomas Paine and the Dangerous Word, it was important for me to have a visual sense of “the people.” Although I use individuals such as Adams, Hancock, and Paine as a vehicle for the story, I want readers to understand the American Revolution was ultimately a mass movement of the people. My editor shared this insight with our illustrator of both books, Edwin Fotheringham, who crafted some amazing scenes such as the 5,000 Bostonians who gathered at Old South Meeting House on the evening before the protest now known as the Boston Tea Party. We also had fun with the final illustration by adding some famous folks of the American Revolution. 

Visual literacy is also an important skill for young readers. With picture books, the illustrations enhance the story in many ways, such as by adding context, explaining vocabulary, or expressing emotion. Ed brilliantly depicted the oppressive nature of the Stamp Act through a visual metaphor of a super-sized stamp falling from the sky as John Hancock obliviously sips his beloved Madeira wine. Ed added an incredible amount of historical detail in Most Wanted, especially impressive considering he’s from Australia! 

Chelsea Connolly: The causes of the American Revolution are a very dense subject. How did you choose to tell this specific story?

Sarah Jane Marsh: My picture book biography on Thomas Paine focused on the personal story behind the famous pamphlet Common Sense. As a result, readers gain an understanding of how and why we declared independence. (Insider secret: underneath the jacket cover is a replica cover of Common Sense!) For Most Wanted, I was curious about General Thomas Gage’s 1775 proclamation pardoning all militia gathered outside Boston -- except for Samuel Adams and John Hancock. What trouble had these two men caused to be called out by name? In the process of writing this story, I realized I was creating a prequel, history-wise, to Thomas Paine.

At first, I approached this story by focusing on the legendary hectic night inside the Hancock-Clarke house before the battles in Lexington and Concord. I spent a year writing this story. And my literary agent wasn’t wild about the resulting manuscript. But she was intrigued by the relationship between Hancock and Adams. So using that as my new lens, I expanded the story across ten years of revered (pun intended) history, taking readers from the Stamp Act through the protests in Boston and the fighting in Lexington and Concord, to Hancock and Adams’s triumphant entrance into Philadelphia for the Second Continental Congress, and ending with General Gage’s proclamation. This gave me the opportunity to compare and contrast these two leaders, and show the cause and effect of the events that led up to the fighting in Lexington and Concord.

When the final book arrived on my doorstep, I was thrilled to discover that our editorial team had surprised me by recreating Gage’s proclamation underneath the jacket cover -- similar to our Paine book. A total delight!

Chelsea Connolly: The author’s note at the end of your book acknowledges that traditional narratives of the American Revolution often ignore and silence the populations who suffered greatly at the hands of the colonists. How did you incorporate this complexity in the book itself? How do you strike a balance between addressing nuance and keeping the story accessible to children?

Sarah Jane Marsh: My author’s note, written closer to publication, reflects an awareness I didn’t have when I began writing Most Wanted in 2015. My focus was navigating the complexities of the confusion prior to the fighting in Lexington and Concord. As engaging as the story is, my ten-year history sticks to the traditional narrative that we are beginning to see as limited, one that omits the experiences of those who did not hold power at the time, such as women, African Americans, and Native Americans. In Thomas Paine we address the issue of slavery directly, as Thomas Paine did in his writings. In Most Wanted, the topic is alluded to in the illustrations, but not addressed in the text other than a hypocritical quote by Hancock declaring he “will not be a slave” in the presence of his enslaved servant.

By listening to the deeply knowledgeable voices challenging our traditional narratives, I began to see how we have distorted our understanding of our history and of ourselves. My author’s note is my attempt to correct course, prompt discussion of the limitations of the text, and encourage critical thinking about these narratives, our history, and ourselves.

There is an awakening happening in American history. Books like Hidden Figures, Never Caught, Stamped From the Beginning, and the New York Times “1619 Project” are examples of how a new age of historians is delving into our past to share silenced voices, experiences, and accomplishments. Howard Zinn was an early voice in this realm. Traditionally, our history has been told primarily by white males through their perspective. That is changing, and we are seeing the effects and gaining a better understanding of our shared history.

Children’s publishing is also experiencing an awakening, thanks to the efforts of those pushing to address the inequality of representation in children’s books. Only 23% of children’s books depict characters of diverse backgrounds. And there are good discussions and groundbreaking books that address tough topics with kids. In nonfiction, nuance can be discussed more fully in the backmatter. Every child, especially those dealing with tough situations, deserves to be seen in a book. In many ways, we explain the world to ourselves and our children through our books. Storytelling is our most powerful medium for transmitting values.

Chelsea Connolly: Much of the work you do as an author can be translated to the work a teacher does in the classroom. As a former elementary and middle school teacher, and having taught in the classroom yourself, what advice would you give to educators who want to talk about these complex and often upsetting issues with their young students?

Sarah Jane Marsh: I think it’s important for educators to first develop their own cultural competency. Like learning itself, it’s a lifelong journey and one that I’ve recently begun through books, discussion, online resources, and trainings available to guide this growth.

Cultural competency starts with a better understanding of self: an awareness of our own identity and how we’ve been socialized within our own culture to hold certain norms. For example, it was eye-opening to realize that the books I read as a child universally featured white main characters. This inherently reinforces a bias that the white experience is the norm. When you become aware of your biases, you can work against them -- for example, by reading and sharing authors outside your race, gender, culture, sexual identity, etc. As Steven Pinker wrote, “reading is a technology for perspective-taking.” It’s important to recognize these biases in our classroom and curriculum, understand how they negatively affect our students, and actively seek and share broader viewpoints. Doing this with your students can help develop their cultural competency as well. And culturally responsive teaching builds on that competency by seeking to understand the diverse cultures and identities of our students and incorporating them into our classroom. All students should see themselves represented and reflected in their learning at school.

Most importantly, use available tools. Teaching Tolerance is a project of the Southern Poverty Law Center and has a wealth of resources for teaching hard history and how to sit with that discomfort. Welcoming Schools has tools to create LGBTQ and gender inclusive schools and to prevent bias-based bullying. Books like An Indigenous Peoples’ History of the United States and Stamped provide an eye-opening understanding of our history and have versions for teens, a powerful tool for the classroom.

These resources also provide important frameworks. For example, how and why we teach about complex and upsetting issues are important. Students need to know about the violence and oppression in our U.S. history of enslaving other humans, but also the many acts of resistance by enslaved people and the beautiful cultures that survived this brutality. (Kwame Alexander’s picture book The Undefeated does this well with African American history and won the Caldecott Medal, a Newbery Honor, and the Coretta Scott King Award.) Similarly, Mexican and Native Americans experienced many of the same injustices and violence as African Americans, but this is rarely taught in the classroom. Celebrating these resilient, thriving cultures is an important part of the education we need to impart.

Chelsea Connolly: What do you hope that children take away after reading Most Wanted?

Sarah Jane Marsh: I hope that Most Wanted inspires readers to want to read and learn more about our nation’s history. The American Revolution is not always taught in elementary school, and I hope this book sparks a curiosity to learn more. My own fascination with the American Revolution was inspired by reading Laurie Halse Anderson’s picture book Independent Dames, about the courageous women of that era. One book can be a doorway to further exploration.

Also, I hope that my author’s note prompts my young readers and the adults in their lives to engage in critical discussion about the book, this era, and our history as a nation. And to seek out expanded viewpoints and fill the gaps in their own knowledge with more learning. Our history is fascinatingly complex and surrounds us still today.

Chelsea Connolly: Do you have any ideas in the works for future projects?

Sarah Jane Marsh: I am re-evaluating my role as a historian and storyteller for children. I want to use the agency I have in the publishing world to broaden our children’s understanding of our nation’s history and to think critically, engage in democratic discourse and, ultimately, to paraphrase Martin Luther King Jr., help bend the long moral arc of the universe toward justice. I’m working on another picture book that tackles the issues that I’ve been wrestling with as an author. It looks at America through a social justice lens to prompt conversation. I can’t tell if it’s brilliant or terrible, but I’m enjoying the creative challenge!

 

A Word to My Fellow Progressives: Lessons of the 2020 Democratic Primary

 

 

This year feels to many progressives like a lost opportunity. With Donald Trump in the White House and anger among disaffected groups running high, it should have been possible to nominate a very progressive candidate to be the Democratic standard bearer this November. In the face of such Republican overreach, latitude existed to move the center of American politics leftward, compelling the electorate to accept a slightly more liberal candidate than would usually be tolerated, in deference to the catastrophic dearth of competence and probity in the Oval Office.  

 

As the nominating contest heated up, moderates invoked the precedent of the McGovern campaign of 1972 in warning against the dangers of a progressive candidacy. Drifting too far to the left would alienate the middle class voters that are foundational to any winning coalition. But the similarities between 2020 and 1972 were always tenuous. Few people had heard of the Watergate office complex as voters went to the polls in the fall of 1972. The nation was polarized by the Vietnam War, and for all that McGovern floated many progressive proposals, his was effectively a single-issue campaign seeking to channel anger over the war. Since the war had been started by successive Democratic administrations, and Nixon himself was promising a policy of “Vietnamization,” the election was less a referendum on progressive politics in general than it was over “gradualist” versus “subitist” models of military disengagement.

 

The closer parallel to 2020 was arguably the election of 1932. In that year, four years of perceived mismanagement of Depression fiscal policy by Herbert Hoover created widespread disaffection, not merely with the Republican party, but with the policy “status quo” more generally, paving the way for the nomination of Franklin Roosevelt to head the Democratic ticket over the more moderate Al Smith. Though we had not, before the onset of the Covid-19 pandemic (which hit after the Democratic nomination had largely been decided), suffered an acute crisis comparable to the Great Depression since 2008, Donald Trump and his administration had intensively alienated large swaths of the electorate as the Democratic primary got underway. In numerous polls, voters registered their fears over growing wealth inequality, climate change, the costs and availability of education and health care, and a growing sense of insecurity in a rapidly changing world riddled with unpredictable bad actors. Many African-American, Latinx, female, and Muslim voters felt that the US government was manifesting hostility not seen from federal authorities since the 1950s or 1960s. The spreading disenchantment, the sense that government is failing to provide needed leadership in urgent times, was reminiscent of the national mood at the end of Hoover’s first term. It was a moment in which progressives might have “turned the ship of state” and opened a course toward robust reforms like those of the New Deal era.

 

Now that Joseph Biden, the embodiment of the Democratic Party’s moderate wing, is all but sure to win the nomination, that chance has passed. What, then, can progressives learn and apply to the future from the experience of 2020? I would propose the following items:

 

1) The Primary’s the Thing. The American electoral system works along very particular lines. One of its idiosyncrasies is that, given the reality of the two-party system, each election has two phases which are strategically alien to one another. Winning the Democratic primary requires fundamentally different tactics, and the building of a fundamentally different coalition than does winning the general election. If progressives want to get onto the ballot, they must strategize intensively to compete in the primary contest. In other words, they must mobilize to defeat Democratic moderates in a contest among Democrats, state by state. 

 

This may seem like a truism, but it is a principle that is roundly ignored by members of all parties in all election cycles; it is one of the blind spots that helped propel Donald Trump to the Republican nomination in 2016, and that has helped him maintain control over the GOP ever since. As an example of things progressives might do differently if they were following such advice: campaign as Democrats. While it is true that Trump came in as an outsider and effected a kind of “hostile takeover” of the GOP, he did not do so while disavowing membership in the Republican party altogether. If you want to lead the party, you have to be willing to join the party.

 

2) Consolidate Early. The failure of a crowded Republican field to back a single insider helped Trump win the nomination in 2016. Democratic moderates eventually took this hint and consolidated behind Joe Biden in March 2020. If Pete Buttigieg, Amy Klobuchar, and Mike Bloomberg had not made timely withdrawals from the race, progressives might have captured the nomination. Conversely, if progressives had rallied behind either Elizabeth Warren or Bernie Sanders early on, outcomes might likewise have been different. Books like The Party Decides have raised awareness of the ways in which party power is institutionalized, such that systemic processes will generally favor moderates, whose donors consistently fund robust infrastructure. But Trump’s campaign demonstrated that the party “machine” cannot fully contain the energies of a consolidated movement. Progressives on the Democratic side only need to be able to achieve purposefully what Donald Trump did by accident in 2016. They do not need an operation as elaborate as the DNC, only a forum analogous to CPAC in which progressive opinion and strategy can be deliberated. The Center for American Progress tried to launch such a forum in this cycle with its “Ideas Conference.” Progressives should treat that event in the future as an opportunity to achieve “movement discipline.”

 

3) Fight Astroturf Aggressively. Corporate interests will inevitably use their disproportionate power in the media to demonize progressive candidates and policy initiatives. We saw this early on with regard to Elizabeth Warren. A well-funded media blitz fabricated the self-fulfilling narrative that her “Medicare for All” plan was unpopular with voters and made her “unelectable.” Progressives must be ready for that kind of assault, and move to aggressively counter-message in defense. Obviously there are limits to how effectively such resistance may be mounted, but the leaders of progressive campaigns should not be above such tactics as circulating “talking points” to surrogates and allies, by way of cultivating “message discipline.” Some part of campaign war chests likewise should be earmarked for “anti-astroturf” use. Here Democrats might take a page from Donald Trump. Voters respond to a candidate who sticks to her guns, and Warren’s move to “moderate” her position may have contributed to the effectiveness of the campaign to discredit her, costing her support among those on the fence between her and Bernie Sanders.

 

4) Engage Voters of Color. Failure in this regard is perhaps the key to the defeat of the progressive wing of the party in 2020. Bernie Sanders won the trust of many Latinx voters, but neither he nor Elizabeth Warren was able to garner robust support in the African-American community. The fundamental lesson here is that the campaign season is too late to forge relationships in communities of color. Any progressive Democrat who is planning to seek higher office must begin now, in whatever office they occupy or position they hold, to communicate with Latinx and African-American leaders (this includes candidates who are themselves African-American or Latinx, who may not presume upon the support of voters of color) and to partner with them on issues of urgent concern. 

 

This is a dynamic that works at the levels of both policy and politics. Latinx and African-American voters reserve their support for candidates who both address key problems (i.e., fighting against discrimination in education or credit markets, defending communities against racist violence) and demonstrate, in their public activities and communications, that they are comfortable with and respect communities of color. The success of Doug Jones in Alabama is perhaps the best object lesson in what this looks like. The fact that he took some risks as a federal prosecutor to bring white-supremacist terrorists to justice convinced African-American voters that he might uphold his promises to them as a U.S. Senator. 

 

5) Build a Winning Coalition – Among Democrats. This is a corollary of principle #1, “the primary’s the thing.” Progressives have a distinct advantage in the Democratic primaries, in that turnout in those contests is higher among progressives than among moderates. But 2020 shows that progressives still cannot win on their own: a higher percentage of self-identified “progressives” may turn out to vote, but progressives as a whole are still outnumbered by enough to overwhelm their advantage in turnout. A progressive candidate must be able to win over some moderates in order to claim the nomination. Not all, just some: enough to make up the difference between the progressive plurality and a winning majority. Moreover, this imperative works hand-in-hand with #4, above: moderates are disproportionately represented among voters of color.

 

This does not mean that the left must settle for “progressives-in-name-only,” only that a degree of compromise will need to be tolerated. A Sanders candidacy seems to have been an exercise in progressive “overreach.” Powerful systemic forces draw the electorate rightward in the American political process; progressives will always need to thread the needle between fighting that tide and riding that wave.

 

As a progressive, I would like to see “our wing” of the party succeed in capturing the White House. A great deal might thus be achieved. Some of the disappointments of the “neoliberal” moderation of the Clinton and Obama years might be redressed: a more robust approach to global warming; meaningful reform in health care and education; redress of wealth inequality; and much more. But for any of this progress to be possible, the riddle of the American ballot box must be confronted. If we can assimilate the lessons of 2020, perhaps in the future lost opportunities can be redeemed. 

The Ruthless Litigant in Chief: James Zirin Paints a Portrait of Trump Through 3,500 Lawsuits

 

American presidents before Donald Trump had some record of public achievement in politics, government, or the military when they were elected. Donald Trump lacked any of those credentials, but brought his astounding history of involvement in thousands of lawsuits to the nation’s highest office. This trove of cases spanning more than 45 years reflects Trump’s contempt for ethical standards, for the US Constitution, and for the rule of law, the foundation of American democracy.

As a perennial litigant, Trump weaponized the law to devastate perceived enemies, to consolidate power, and to frustrate opposing parties, as former federal prosecutor and acclaimed author James D. Zirin illuminates in his compelling and disturbing history of Trump’s use and abuse of the law, Plaintiff in Chief: A Portrait of Trump in 3,500 Lawsuits (All Points Books).

Mr. Zirin is a distinguished veteran attorney who spent decades handling complex litigation. He is also a self-described “middle of the road Republican.” Plaintiff in Chief stands as his response to Trump’s disrespect for law and our legal system. He stresses that the book is a legal study, not a partisan takedown. 

In his book, Mr. Zirin scrupulously documents Trump’s life in courts of law. Based on more than three years of extensive research, the book examines illustrative cases and how they reflect on the character and moral perspective of the current president. The details are grounded in more than 3,500 lawsuits filed by Trump and against Trump. Litigation usually involves sworn affidavits attesting to accuracy and testimony given under oath if a trial occurs, so Mr. Zirin is able to reference page after page of irrefutable evidence of Trump's legal maneuvering, misstatements, hyperbole, and outright lies. 

As Mr. Zirin points out, Trump learned how to use the law from his mentor, the notoriously unprincipled lawyer and fixer Roy Cohn whose motto was “Fuck the law.” Trump took Cohn’s scorched earth strategy to heart and used the law to attack others, to never accept blame or responsibility, and to always claim victory no matter how badly he lost.

“Trump saw litigation as being only about winning,” Mr. Zirin writes. “He sued at the drop of a hat. He sued for sport; he sued to achieve control; and he sued to make a point. He sued as a means of destroying or silencing those who crossed him. He became a plaintiff in chief.”

And Trump also has been a defendant in hundreds of legal actions, as Mr. Zirin details. In 2016, there were 160 federal lawsuits pending in which he was a named defendant, as well as numerous other investigations and proceedings. Mr. Zirin observes that Trump “has been sued for race and sex discrimination, sexual harassment, fraud, breach of trust, money laundering, defamation, stiffing his creditors, defaulting on loans, and . . . he [has] been investigated for deep ties to the Mob, which he enjoyed over the years.” 

And Trump’s pattern of disrespect and contempt for the law persists. As Mr. Zirin writes, "All this aberrant behavior would be problematic in a businessman. . . But the implications of such conduct in a man who is the president of the United States are nothing less than terrifying."

Mr. Zirin is a leading litigator who has appeared in federal and state courts around the nation. He is a former Assistant US Attorney for the Southern District of New York under the legendary Robert M. Morgenthau. His other books include Supremely Partisan: How Raw Politics Tips the Scales in the United States Supreme Court and The Mother Court, on great trials from the Southern District of New York in the mid-twentieth century. His articles have appeared in an array of publications including Time, Forbes, Barron’s, The Los Angeles Times, The London Times, and others. 

 

Mr. Zirin also hosts the critically acclaimed television talk show Conversations with Jim Zirin Digital Age, which airs weekly throughout the New York metropolitan area. In August 2003, Mayor Michael R. Bloomberg appointed him to the New York City Commission to Combat Police Corruption. He is a Fellow of the American College of Trial Lawyers, and is a member of the Council on Foreign Relations. A graduate of Princeton University with honors, he received his law degree from the University of Michigan Law School where he was an editor of the Michigan Law Review and a member of the Order of the Coif.

 

Mr. Zirin graciously responded to questions on his study of Donald Trump by telephone from his office in New York.

 

Robin Lindley: Congratulations on your new book Mr. Zirin, and on your distinguished legal career. In your book, you chronicle Donald Trump’s life as a ruthless litigator for almost a half century. How did you come to write Plaintiff in Chief on Trump’s life through more than 3,500 lawsuits? 

James D. Zirin: About three years ago, a friend suggested that I write a biography of Roy Cohn. I knew Roy Cohn. He was an unscrupulous lawyer. He was disbarred in 1986, about three years before he died. And he was Trump's lawyer and confidant, and their relationship was very close, very intimate. He boasted to a journalist that he and Trump spoke about five or six times a day. This was before Trump had any notion of seeking political office. 

Cohn really taught Trump everything he knows about waging what I call asymmetrical warfare, weaponizing the law and using litigation as a means to attain the various objectives that he had. They met in a bar in 1973, just after Trump had been named as a defendant along with his father in a race discrimination in housing suit brought by the Justice Department. Trump had a number of lawyers, and normally a suit like that ends quickly with a consent decree, with the defendant agreeing that he or she won’t discriminate anymore without admitting or denying the allegations in the complaint. 

Cohn had a different recipe for going forward. He liked to beat the system. He'd been indicted three times by the legendary prosecutor Robert M. Morgenthau, and he'd been acquitted three times. Cohn’s recipe was fight, and he taught Trump the tools he used. Number one is, if you're charged with anything, counterattack. Rule number two is, if you're charged with anything, try to undermine your adversary. Rule number three is work the press. Rule number four is lie. It doesn't matter how tall a tale it is, but repeat it again and again. Rule number five is settle the case, claim victory, and go home. And that's exactly what happened in the race discrimination case. 

So anyway, I created a book proposal, which I sent to my agent and my publisher, St. Martin's Press. In its wisdom, the Press said I should try to write a larger book about the influence of litigation on Donald Trump because that's the way he had conducted himself in the 40 years before he achieved office, and I should use Cohn perhaps as a springboard but the book should center on Trump and what experience he had had in litigation. So, I did that and that's how I came to create the book. 

Robin Lindley: It's a remarkable and chilling account of Trump's life through the prism of his legal affairs. You stress in the book that the two most powerful influences in his life were Roy Cohn and his father, Fred Trump. 

James D. Zirin: Yes. His father, of course, was a defendant as well in the race discrimination cases. His father was also a real estate operator and he came up against the government in the arena of FHA loans. He was accused of profiteering. He testified before a Senate committee and was interrogated by Senator Lehman. He made a lot of money by mortgaging out with FHA loans in ways that they were never intended to be used. Then when he was asked about the profits, he said he had never withdrawn the money from the bank, so therefore there were no profits. That was ridiculous. But here is an example of saying something that’s totally ridiculous for public consumption that somehow or other some people will believe. And that’s the approach Trump has used professionally and that’s the approach he continues to use in office. 

Robin Lindley: Were you ever involved in litigation with Donald Trump? 

James D. Zirin: No. I never was. I met him several times, and I met him with Roy Cohn several times. 

When I first met Cohn, I was an Assistant US Attorney and Cohn was being investigated by a federal grand jury. I worked for Robert M. Morgenthau then and that investigation resulted in indictments. Cohn was in the anteroom of the grand jury chamber, where witnesses were waiting to testify, and he raised his open hand in what might be interpreted as a high five. I naively thought it was a high five to encourage the witnesses since they were facing the daunting experience of testifying before a grand jury. And it wasn’t a high five at all. He was telling them to take the Fifth. 

That’s how Cohn operated and that's the way Trump operates. It's saying something that's highly incriminating and doing something that's highly incriminating, but doing it in a way so that you have total deniability if anyone calls you out for it. 

Robin Lindley: What was your impression of Trump when you met him decades ago?

James D. Zirin: I really met him only to shake hands. I never met him to talk with him, but I knew of his reputation. I knew he didn't pay his bills. I knew he didn't pay his lawyers. I knew he'd been in bankruptcy five times, and I knew about his Atlantic City casinos. I knew he'd been sued a number of times. And I knew that he had been a plaintiff an extraordinary number of times. He sued journalists. He sued small business people for using the Trump name. He sued women who he was involved with. He sued his wives even after a divorce, both Ivana and Marla Maples. And I knew that a lot of settlements he entered during litigation were kept under seal in the files of the court so the public would never know the terms of the settlements. 

In one major litigation effort, you had the so-called Polish brigade case, which involved the construction of Trump Tower, opened in 1983. Trump had undocumented Polish workers and he did not contribute to the union pension fund as he was required to do. There was lengthy litigation, including a trial in which the trial judge said Trump’s testimony was completely lacking in credibility, and the case was eventually settled. We never knew what the terms of settlement were until, about 20 years later, a judge unsealed the settlement papers and it turned out that Trump had settled for 100 cents on the dollar. 

Robin Lindley: Full disclosure: I'm a lifelong Democrat and I think most in my party would agree with your history and characterization of Trump.

James D. Zirin: I'm actually a lifelong Republican and I'm decidedly anti-Trump because I don't think he represents the values of the country or the Constitution of America. I think he's been a rogue president.  

Robin Lindley: I agree. A lot has happened since your book came out, with Trump’s reaction to the Mueller report, his impeachment, his weaponizing of the Department of Justice, and more suits against the media and others. And Trump continues to follow the Cohn rules. Trump famously said he needed a Roy Cohn. Does he have his Roy Cohn now in William Barr, the Attorney General? 

James D. Zirin: Many people have suggested that. I think Barr is more of an ideologue. He's not an unscrupulous lawyer as Roy Cohn was. 

Roy Cohn represented mobsters and he was a crook. He was eventually disbarred because he stole $100,000 from a client. He was disbarred because his yacht went up in flames and a crew member was killed. He collected the insurance. It was supposed to go to his creditors, but instead he pocketed the money. He was disbarred because he made false and misleading statements on an application to become a member of the DC bar, and there was a disbarment hearing. Trump was one of a number of his character witnesses and he testified to Cohn’s good reputation for honesty, integrity, truth and veracity. And of course, Cohn’s reputation for honesty, integrity, truth and veracity was very bad.

And after Cohn was disbarred in 1986, Trump distanced himself from Cohn, but that was not for long because Cohn died three weeks thereafter. There was a funeral and Trump stood in the back of the room and delivered no eulogy, and never said much more about him. 

What we do know about how close the relationship was is that 30 years later, in 2016, when Trump was elected president, he turned to gossip columnist Cindy Adams, a friend of his, and said, “Cindy, if Roy were here, he never would’ve believed it.” So we know that’s how close the relationship was. And in the White House in 2017, when counsel Donald McGahn was dragging his feet about firing Sessions, Trump made the famous statement, “Where’s my Roy Cohn?” 

Robin Lindley: It seems to many observers that William Barr is acting as the president's personal attorney and has an authoritarian attitude about the Constitution and the role of the president while scoffing at the separation of powers. 

James D. Zirin: That is easily said, but I don’t think it’s easily demonstrated because the Constitution does not say that the attorney general must be independent of the president. There is a tradition of independence in the Justice Department, particularly since Watergate, under which the Attorney General must serve the Constitution, and not the president, and if there’s a conflict the Attorney General should do something to resolve it. 

I think Barr has been quite cavalier about observing that tradition. He doesn’t believe in it. He is a contrarian and a libertarian. He believes in the unitary executive, so that Trump is free to do basically anything he wants to do because he’s the President of the United States. Barr has not been a check on Trump’s unbridled abuse of power, but it’s not really for the attorney general to do that. It’s for the Congress to do that through the impeachment power, so you can’t say that Barr has failed to ride herd on the president because he would take the view that that’s not his obligation.  

Robin Lindley: Thanks for explaining that view of the Attorney General’s role. Since your book came out the impeachment occurred. Senator Susan Collins thought the president would be chastened by that process. Of course, that hasn’t happened. How do you see Trump’s response to the impeachment and the unanimous Republican Congressional support of him, with the exception of Mitt Romney? 

James D. Zirin: I think Trump believes he’s above the law, and when I say the law, I mean the law including the Constitution. 

The Republicans in the Senate were willing to give him a pass for various reasons. I suppose they could rationalize it. They could say, number one, it was for the American people ultimately to decide on whether he should remain in office and we have an election coming up in a few months. And number two, what Trump did was bad perhaps, but it wasn’t so bad as to amount to an impeachable offense. Impeachable offenses are whatever two-thirds of the Senate says they are. 

I don’t think anyone ever thought that two-thirds of the United States Senate would vote to remove him from office. But the Constitution provides for a trial and it’s supposed to be presided over by the Chief Justice. And this was not a trial. It was a travesty, because who has ever had a trial where the prosecution can’t call witnesses to present the evidence? And that’s what Romney was extremely upset about. 

I think it was Senator Lamar Alexander who said we don’t need witnesses because, if five people say you left the scene of the accident, why call a sixth? And so, it was pretty much uncontested what the facts were, and what is to be made of those facts is up to the United States Senate under our Constitution. It shows that the hoary document we call the Constitution of the United States, which we put on a pedestal and supposedly has iconic significance, is an 18th century document that in the real world is pretty inefficient in curbing the powers of a tyrannical president. And I think that history will record that. 

Robin Lindley: And Trump responded that the impeachment was “a hoax” and said his call to Ukrainian President Zelensky was “perfect.” He actually asked a foreign government to interfere in an American election. It seems a high crime and misdemeanor under the Constitution. Elections are sacrosanct in a democracy. 

James D. Zirin: Well, that’s true. And a high crime and misdemeanor does not have to be a crime that’s in the United States Code, although this amounted to an invitation for a bribe and also to extortion, both of which are in the United States Code. 

But at the time of the enactment of the Constitution, there was no United States Code. The Constitution mentions bribery or other high crimes and misdemeanors. It was quite clear from the Federalist Papers and the ruminations of Hamilton and Madison and others that abuse of presidential power was an impeachable offense. And here you certainly have an abuse, where Trump was using the foreign policy of the United States and the leverage of withholding funds for military aid that were authorized by Congress in order to achieve a domestic political advantage and benefit himself. 

Robin Lindley: And Trump continues to bring lawsuits from the White House. In the last couple of weeks, he's sued the New York Times and CNN for defamation. Of course, he'll never appear to be deposed, so those lawsuits will probably go nowhere. He continues to use the law as a weapon. You chronicle that sort of abuse of the legal system for the last 45 years or so. 

James D. Zirin: That's right. He has sued a lot of writers. Before he took public office, he sued the journalist Tim O'Brien for daring to write that his net worth was overstated. They took his deposition, and he demonstrably lied at least 32 times under oath, and the case was eventually dismissed. The defense was able to show the truth that, in fact, he had overstated his net worth. That was one of Trump’s sore points and he sued whenever someone said he was worth less than he believed should be stated. But O'Brien won his case. 

And he brought other cases against journalists. He sued an architecture critic for the Chicago Tribune for suggesting that one of his buildings, which wasn’t even up but was planned, would be an eyesore on the horizon. The judge threw the case out because of the rule of opinions. To succeed in a libel action, you have to show that a statement of fact which is defamatory and false was made of and concerning the plaintiff. The critic stated an opinion that the building would be an eyesore on the horizon, and that’s not something that could ever be libelous. 

Robin Lindley: You use the term “truth decay” and Trump is probably responsible for either misstatements or outright lies on an average of at least 10 times a day. How does this pattern of lying fit into his attitude toward the law? 

 James D. Zirin: I think he enjoys lying. I think it's part of his DNA. I don't think he has any grasp of the facts at all, so he says whatever he thinks will help him and whatever comes into his head. 

It is expedient, I suppose, to lie in litigation if you crossed an intersection through a red light. You can lie and say it was a green light and that changes the legal outcome of your case. And that's the way Trump operates. But he would go beyond that because he would say the heck with you and the horse you rode in on as he did in the House impeachment inquiry. Then, he denounced Adam Schiff and denounced Jerry Nadler and denounced the witnesses. He tried to subvert the whole proceeding by denouncing the whistleblower and by showing that those people who lined up against him were of low character and were themselves liars.

All of this goes back to Joe McCarthy, because this is the way McCarthy operated. McCarthy’s adversaries accused him of engaging in a witch hunt and of generating hoaxes. And of course, Roy Cohn was McCarthy’s chief counsel, so Cohn learned how useful those charges can be and how devastating they can be in any kind of controversy. He taught all of that to Trump, and Trump uses it to his great advantage.

Robin Lindley: Yes. And Trump certainly follows the Roy Cohn rule about declaring victory no matter how badly you lose. 

James D. Zirin: Yes. Even that conversation with Zelensky he called “perfect,” and it was something other than perfect. I don’t know whether he’s going to say the coronavirus is a perfect hoax, but perfect is a word that recurs again and again in his lexicon. 

Robin Lindley: Pardon me for this psychological observation, but you write that power and dominance are even more important to Trump than money. That seems pathological. And he seems to take a sadistic joy in attacking and ruining anyone he perceives as a foe of some sort. 

James D. Zirin: Well, that's true. 

In one of the cases that I describe in the book, he got wind of the fact that a small business, a travel agency run by a father and daughter in Baldwin, Long Island, was using the name Trump Travel. Not Donald Trump Travel, but Trump Travel. They used Trump Travel because they were selling bridge tours for people who play bridge, and “trump” is a bridge term. Also, like “Ace Hardware,” they thought “Trump” connoted excellence. This was a little storefront travel agency in a small Long Island community. Trump had never been in the travel business and he never had any business involvement in Long Island, but he sued them. At the end of the day, they were allowed to continue to use the name Trump Travel, but they had to make the lettering a little smaller. And they’d exhausted their life savings in defending the case. So he was quite sadistic about the way he went about it. 

There was another similar case where an unrelated family named Trump from South Africa had a multibillion-dollar pharmaceutical business and Trump sued them. He'd never been in the pharmaceutical business and had never been in business at that point anywhere outside the United States. But this family had the wherewithal to defend the case and eventually the case was thrown out completely.

 Robin Lindley: What are a few things you learned about Trump's ties to the Mob or organized crime? 

James D. Zirin: In the first place, his father had ties to the mob. His partner was a man named Willie Tomasello who was a made man and they were partners in various real estate ventures. 

Through Roy Cohn, Trump met leaders of the five families in New York, principally Fat Tony Salerno, Paul Castellano, and others who controlled the poured concrete business in New York. He also met John Cody who was a labor racketeer and president of the Teamsters.

 At that time, particularly because of the mob involvement, poured concrete was a much more expensive way of constructing a building than structural steel. The poured concrete business was dominated by Castellano who was murdered, and Salerno who was eventually sent to jail for a term of about 99 years. 

Trump retained these mafia companies to construct buildings out of poured concrete even though it was more expensive. We don't really know why he did that, but his mob ties ran quite deep. They existed in Atlantic City where he had a number of casinos, principally the Taj Mahal, which went bankrupt six months after its opening. At times, on a Tim Russert program and under oath, he denied that he had any contact with the mob, but he was warned by the FBI when he went into Atlantic City that he shouldn't deal with mobsters because it would ruin his reputation. 

But Trump continued to have contact with mobsters. On at least two occasions, which I relate in the book, he admitted that he had ties with the mob. In the construction of Trump Tower, the Teamsters called a citywide strike. Construction trucks and concrete trucks didn't have access to construction sites, but mysteriously at Trump Tower poured-concrete trucks passed the picket lines and continued their work. Cody, the president of the Teamsters got not one, not two, but several condominium units at Trump Tower for a female friend of his who had no visible means of support. 

Robin Lindley: Did you learn anything about Trump’s ties to the Russian mob and Russian oligarchs?

James D. Zirin: Yes. A lot of it is revealed in the book [by Craig Unger] House of Trump, House of Putin. But in 1986, a Russian oligarch walked into Trump Tower and he bought five condominium units with monies that had been wired from London and laundered from Russia. He was a member of the so-called Russian mafia. Eventually the Attorney General cracked down on it and made a finding that these apartments were purchased with laundered funds. So Trump’s ties to Russia go back at least that far, maybe further. And he has continued to deal with Russian oligarchs throughout his business career. 

Robin Lindley: Money laundering is complicated to me. Can you say more about Trump and laundered funds?  

James D. Zirin: Money laundering is where money comes from some illegal source and the money can't be reported, so the origin of the money must be concealed, and that's why it's called laundered money. A good way to conceal the source of money, particularly with money from Russia that is obtained by fraud or theft or in violation of Russian laws, is to buy a condominium unit in New York and the condominium unit is there and there's no tracing of the funds. The funds went to Trump and he deposited them in his bank accounts and he used it to pay his loans, and the origin and tracing of the funds just disappeared. 

Robin Lindley: And you indicate that Trump has repeatedly used laundered money to hide illegal funds.

James D. Zirin: The interesting thing with Trump and the Russians is that Deutsche Bank was his principal lender. No other bank would touch him. He now owes about $365 million to Deutsche Bank. At one point in time, he was in default on a debt service payment to Deutsche Bank and they were about to sue him. Trump tried to stop them with the same technique, a suit against the bank: a counterattack, suing for fraud in lending. Somehow or other that got resolved, and the debt service obligation was extended and another department of Deutsche Bank continued to lend him large sums of money. Now that’s very suspicious because what bank lends money to a customer who’s been in default, number one, and number two, what bank lends the money to a customer who has sued them and charged them with fraud?

Deutsche Bank pleaded guilty to money laundering for Russian interests and there were definite ties between Deutsche Bank and the Russians, which have never been fully explored. Russian money in effect may have been used to guarantee Trump’s indebtedness.  I think Trump's son Eric said on a number of occasions, and Donald Jr. said at a certain point in time that they couldn't get financing until they got it from Russia. 

Robin Lindley: Yes, I recall that comment. Trump also has been able to keep his tax returns secret. How do you see his refusal to reveal his returns and its significance?

James D. Zirin: Trump’s five predecessors in office all had no trouble releasing their tax returns. Both Republicans and Democrats seem to regard this release as a tradition although there is no legal obligation imposed on a president to release his or her tax returns. 

Trump's tax returns remain shrouded in mystery. Now, the District Attorney of New York County obtained a grand jury subpoena covering eight taxable years, and five of the eight were before Trump became president. He didn't subpoena Trump for them, but subpoenaed Trump's outside accountants and the Trump organization. Right up the line the courts sustained the subpoenas and said that the accountants had to comply, which they were willing to do except Trump had instructed them not to. That matter is now before the Supreme Court and will be argued in March. Presumably it'll be decided in June before this term of the Court ends. 

In addition, committees of the House of Representatives have subpoenaed some tax returns and that matter is also before the Supreme Court. 

Now, this is absolutely appalling. In the Second Circuit, in the subpoena case brought by the Manhattan District Attorney, Cy Vance, Judge Chin questioned the lawyer for Trump: “Now your client said he could shoot someone on Fifth Avenue and no one would mind. None of his followers would mind and would still vote for him. I suppose if he shot someone on Fifth Avenue a district attorney would investigate the case. Couldn’t the police investigate the case? Couldn’t they seize the gun? Couldn’t they talk to witnesses?” Trump’s lawyer answered, “No, your honor. He’s protected by the fact that he’s president.” And if anyone buys that one, I think the rule of law is seriously compromised. 

Robin Lindley: Yes. That goes back to the wealth of evidence you present that Donald Trump has no regard for the rule of law or the Constitution. 

James D. Zirin: Yes, if it gets in his way. He said that impeachment was unconstitutional even though impeachment is provided for in the Constitution. So he doesn't know what he's talking about as a legal matter. 

There were also the instances of his undermining the judiciary, accusing judges who decide against him of being Obama judges or Mexican judges and taking them on individually as so-called judges. And he's undermined the judiciary in a way that's totally obnoxious to any lawyer who's dedicated to the rule of law.

Robin Lindley: He swore an oath to the Constitution as president and yet continues to attack the legal system and the rule of law. 

James D. Zirin: Chief Justice John Marshall said we’re a government of laws, not men. He would’ve said today, men and women. But Trump has attacked not only the legal system, but also the judges who administer it, like Justices Sonia Sotomayor and Ruth Bader Ginsburg, who he thinks should recuse themselves from all Trump-related cases. And again, he’s gone after the individual judges. 

Robin Lindley: You also point out his abusive treatment of women and his payments of hush money to his paramours.

James D. Zirin: He did that before he took office and that's why Vance wants to see Trump’s tax returns, to see how these payments were treated on his tax returns and perhaps to find payments to other women. 

Robin Lindley: Since the book came out, have you received any backlash or criticism from Trump supporters or Republicans?

James D. Zirin: No. I’m often asked if I expect Trump to sue me, and I say I wish he would because it would help the sales of the book. The book Fire and Fury would have sold about 5,000 copies, but when Trump tried to enjoin it, it was like “Banned in Boston” and it became a runaway bestseller and sold three million copies. 

I haven’t received any backlash because the book is very well documented and everything in it is true, but the [Trump supporters’] response is basically, so what. The stock market is up. We have less regulation. The government is less intrusive in our lives, except maybe when it comes to abortion. [To Trump supporters] this is all a good thing. And they spit out the names Pelosi and Biden and Bernie Sanders, and they ask, would you rather have someone who’s going to tax and spend like Bernie Sanders, who promises a chicken in every pot? Or would you rather have a Donald Trump whose policies are good even though he’s a little ridiculous? That’s the comment.

Robin Lindley: In the context of your study of Trump’s attitudes and perspective on the law, how do you see Trump’s response to the public health crisis now of coronavirus?

James D. Zirin: The coronavirus is quite likely to be the undoing of Donald Trump, the moment when his mendacity, ignorance, and shallowness come into full view as an empirical reality, as indisputable as the laws of science or a Euclidean equation.

I saw it all coming, and I cried aloud in my book, Plaintiff in Chief.

Here’s a partial list of Trump’s lies about the coronavirus: 

In President Trump’s first public comments about the coronavirus, on Jan. 22, he assured people that it would not become a pandemic: “No. Not at all,” he told viewers of CNBC. “It’s going to be just fine.”

In the weeks that followed, he offered a series of similar reassurances:

“We have it very well under control.”

“We pretty much shut it down coming in from China.”

“I think the numbers are going to get progressively better as we go along.”

“We’re going very substantially down, not up.”

“It’s going to disappear. One day — it’s like a miracle — it will disappear.”

None of it was true.

Robin Lindley: Do you have any words of wisdom now on the future of democracy and the rule of law? Trump has persisted in twisting the law to his interests or ignoring it in the lifelong pattern you portray vividly.

James D. Zirin: He has continued and I think he will continue. And I think the rule of law has been seriously undermined. 

Our democracy has been seriously compromised because the framers of the Constitution never thought the system would work this way. Republican senators deserve part of the blame: because of their need to retain power, or whatever it was, they did not respect the oath they took to be fair and impartial judges of the facts and the law, but instead voted along party lines to acquit him. 

Robin Lindley: Trump said sometime in the 90s, as you note in your book, that “I love to have enemies.” I think most of us wouldn't feel that way and it seems pathological to me. What did you think when you found that quote from him? 

James D. Zirin: Most politicians are controversial and have political enemies. I think Trump relishes that perhaps more than others and he loves to attack them personally. We know Biden is “Sleepy Joe” and Elizabeth Warren is “Pocahontas” and Bernie Sanders is “Crazy Bernie.” He has nicknames for everyone and he revels in trashing them rather than addressing the merits of anything they propose. 

Robin Lindley: The mentality of an eight-year-old bully, it seems. You write that Trump’s experience in lawsuits reflects “his inmost ulterior motivation.” This comes out in your responses, but could you sum up your sense of his character, motivation, and morality based on your extensive research?

James D. Zirin: His virulent combination of anti-science, anti-law, ignorance, irrational conspiracy theorizing, instability, narcissism and vindictiveness has led us to national catastrophe. If he is re-elected, I fear for the republic and the American people.

 

Robin Lindley: As you demonstrate, Trump is an anomaly in terms of the adversarial system. Do you have anything to add on his abuse of legal process? 

James D. Zirin: Look, the adversary system is the crown jewel of our legal system. We got it from the British. The idea is that you had lawyers on both sides who were partisan, who were like the Knights Templar who rode into battle on behalf of somebody or other in the Middle Ages. That has been the best way of getting at the truth. In contrast, in civil law countries like France or Germany, the judge conducts the inquiry. The judge might ask questions of the lawyers, but the lawyers don't develop the evidence on both sides. 

But adversary doesn’t mean enemy. What Trump has demonstrated is that we have a great legal system and we all have the benefit of it. But the law also has limitations, and those limitations can be exploited by someone determined to beat the system, and that’s what Trump has done.

Robin Lindley: Thank you, Mr. Zirin, and congratulations on your groundbreaking and compelling book on the life of Donald Trump through the perspective of his thousands of lawsuits. It’s been an honor to talk with you.

Roundup Top Ten for March 27, 2020

We’ve Never Been Here Before

by Adam Tooze

There are historical analogies to this kind of collective shutdown, but they are not attractive.

 

Women Also Know Washington

by Lindsay Chervinsky

The last decade has witnessed a noticeable uptick of works on Washington authored by women, with more to come in the pipeline.

 

 

The History of Asian American Discrimination in Public Health

by Stanley Thangaraj

The popularity of various pseudo-scientific, ad hoc religious, and other problematic discourses about the coronavirus is jeopardizing national and global health. 

 

 

What Our Contagion Fables Are Really About

by Jill Lepore

In the literature of pestilence, the greatest threat isn’t the loss of human life but the loss of what makes us human.

 

 

Dismantling the Federal Bureaucracy Has Left Us All Vulnerable to COVID-19

by Teal Arcadi and Casey Eilbert

Decades of deriding bureaucrats, together with decades of cuts, have undermined government capacity to serve us when we need it most.

 

 

Babe Ruth's New York @ 100

by Jonathan Goldman

When Babe Ruth started hitting home runs, the US started to change.

 

 

Hospice of the Creative Class

by Alex Sayf Cummings

No event has so starkly revealed the brutal inequalities of contemporary capitalism as the coronavirus pandemic.

 

 

It Doesn’t Have to Be a War

by Tim Barker

The Trump administration appears ready to invoke the Defense Production Act to speed manufacture of essential goods like face masks. What if we didn’t have to resort to the analog of war?

 

 

Joking in the Time of Pandemic: The 1889–92 Flu and 2020 COVID-19

by Kristin Brig

As we see with COVID-19, the darkest periods in history expose the best—and worst—of humanity.

 

 

Assimilationists of a Feather

by Elliot Friar and Travis LaCouter

If the history of gay liberation has taught us anything, it’s that assimilationism is one hell of a drug.

 

A New Lease on Life for the "Jewish Nose" Lie

 

Few people are shocked when a drunken buffoon or a white supremacist rants about Jews having distinctive noses. But when a historian at a respected institution perpetuates that dangerous myth, it’s a cause for concern.

In a tweet on December 17, 2019, Dr. Rebecca Erbelding, a staff historian at the United States Holocaust Memorial Museum, who frequently represents it at events around the country, wrote: “At a talk today, asked about my personal background. I confessed that I’m not Jewish, but with a Hebrew first name, German last name, and my nose and hair, I ‘pass’.”

The same week that Erbelding made her remark, UNESCO announced that it was canceling its association with the annual Carnival of Aalst, in Belgium, because the carnival has included a float that mocked Jews by depicting them with huge “Jewish noses.” 

Perhaps Erbelding or the Belgian float designers believed they were acting in a humorous spirit. But that misses the point. Wrapping a racist stereotype in a joke does not make it any less racist. It does the same damage—it perpetuates the degradation of the Other. The idea that there is a distinctive “Jewish nose” is one of the oldest anti-Jewish myths. Medieval anti-Semites introduced it in the 12th century CE as a way of singling out Jews for contempt. The nose became “a physical symbol of otherness for Jews,” as Prof. Roy Goldblatt has put it.

Nazi Germany’s propaganda machine made ample use of the “Jewish nose” stereotype. A notorious Nazi film produced in 1940, called “The Eternal Jew,” purporting to expose the “real” Jew, focused on “Jewish faces,” zooming in on Jews’ noses to make them seem unattractive. 

Similar images appeared throughout the Nazis’ news media, cultural publications, and children’s books. Der Giftpilz, an anti-Jewish children’s book published by Julius Streicher (publisher of the Nazi newspaper Der Sturmer), included a section called “How To Tell A Jew.” It described a 7th grade boys’ class, in which “Karl Schulz, a small lad in the front row,” steps to the chalkboard and declares: “One can most easily tell a Jew by his nose. The Jewish nose is bent at its point. It looks like the number six. We call it the ‘Jewish six.’ ”

In recent years, the “Jewish nose” lie has been heard from time to time. In 1999, for example, Arizona state legislator Barbara Blewster told a colleague, “You can’t be Jewish. You don’t have a big hooked nose.” Blewster was compelled to publicly apologize, but she continued to insist, “I have no prejudice at all. I admire the Jews.” 

The recently resigned prime minister of Malaysia, Mahathir Mohamed, has repeatedly referred to “hook-nosed Jews.” The website of Belgium’s Ghent University until recently included a sign-language video showing a hooked nose as the translation for the word “Jewish.” 

A related episode occurred in Sweden last year. A Jewish woman reported that when she went to a Stockholm police station to have her photo taken for an ID card, an antisemitic officer digitally altered her image to drastically enlarge her nose.

Women in particular have suffered from the “Jewish nose” stereotype. Rachel Jacoby Rosenfield and Maital Friedman, of the Shalom Hartman Institute, have written about how the “Jewish nose” and similar stereotypes have been used to intimidate Jewish women into altering their appearances. “These negative stereotypes have impacted our Jewish psyche and spawned a self-consciousness and communal shame about ‘Jewish looks’,” they point out.   

Some scholars see a connection between the Jewish nose stereotype and violence against Jews. In a recent essay, Jonathan Kaplan of the University of Technology-Sydney noted that Pittsburgh synagogue gunman Robert Bowers invoked classic anti-Jewish stereotypes in his online ravings. “How we speak about and depict others in the media and social discourse perpetuates long-held stereotypes and ultimately emboldens hate-filled individuals,” Kaplan warned.

That, in my view, is what makes the Jewish nose lie so dangerous, and why it is so important for mainstream society to reject it. Stereotypes fuel hatred. Hatred fuels violence. So even when a stereotype is invoked in a supposedly humorous way, it has the same impact.

In fact, one could argue that a racist “joke” is even more insidious than the kind that is yelled at a passerby or scrawled on the side of a synagogue. Anti-Semitic humor helps bigotry infiltrate mainstream culture and discourse. Humor becomes the channel through which otherwise unacceptable sentiments are deemed acceptable.

Similarly, when a representative of a major Holocaust museum perpetuates the myth of the “Jewish nose,” it’s much more damaging than when some drunk at a local bar spouts off about Jewish noses. A scholar gives legitimacy to the stereotype in a way that the average gutter bigot never could.

Scholarly and cultural organizations, institutions, and museums need to respond swiftly when such stereotypes rear their ugly heads. UNESCO did the right thing in dissociating itself from the “Jewish nose” stereotypes of the Belgian carnival. Other institutions should do likewise.

Can We Build On Yesterday, Salvage Today—And Save Tomorrow?

There was life as it was lived in the United States before the Coronavirus; and as all know there is life after its arrival (from China—or wherever, it matters little).  Our nation has been torn apart and not put together again.  Lowly people, and the elevated ones as well, have been torn from “normalcy” and face uncertain futures.

       There is a temptation to mentally visit a large city and imagine what's been affected. Go down the list: schools, bars and restaurants, entertainment of all kinds, transportation, retirement homes—everything is disrupted. Jobs have been lost, with everybody hoping it’s temporary.

       In a great many cases an interviewer questioning a public authority is likely to find the answerer “uncertain” as to exactly what has happened. The superior who tried to tell him hardly knew the facts himself!  Predicting an individual’s future—short or long range—is mighty hard.  Let’s try with an old person, male or female.

         So many Americans these days live on pensions.  We anticipate their arrival on perhaps the first of the month, or thereabouts.  Are they going to come now?  Or, are they held up in court?  Will a judge suddenly proclaim:  “Let’s pack it in, boys; come back in a month.”

        The place where we worked is still there, thank goodness.  But, is that door locked in some special way? Is my key useless?  Nobody’s coming in; we all got a card saying “Don’t be reporting for work until we notify you that we’re active again!”

       I need to go to the bank now and then.  The time is now, but when I went to the familiar front door it had a notice pasted on it.  The words said, essentially, “We are meeting to see where we stand, and will be letting you know pretty soon just what is likely to happen.”

       My drugstore’s pharmacy was open, thank goodness, and I got the product I wanted, but the conversation was not of a kind I would like to get regularly.  Predictions about a “vaccine” that will combat this world virus situation and bring real relief are not being made—even on the internet, people are hesitant to predict improvement.

     I watch television a lot these days.  My favorite commentators are announcing and explaining.  Good.  But they just don’t seem to know as much as they (and I) would like!  I listened to the Governor of New York State and he was really lucid, detailed, and calm.  But the things he was saying were, well, horrible.  Schools closed; maybe curfews ahead; teaching school on television channels. Can we really do all that?

       There’s wild talk out there.  “Let’s just abolish the coming 2020 Elections.”  Let’s not!  We did abandon basketball and other events that draw crowds. Even the Masters golf!  Cultural events are suffering.  People are staying home.  In a way, that sounds terrible. I guess looking another way, there are worse places to be than one’s home—especially when the whole World’s exploding around you….

     I’m sitting quietly in my living room  as I write. I’m trying to imagine, to visualize, to predict:  yes, I’m trying to envision a life like the one I had from birth ‘til now being lived so very differently….  It is really hard.  I am trying to imagine life being lived in a place so very different from the United States I (we) had. Everything was working, it seemed.  Things were “on time.” Products were available.  Services could be gotten for the asking.

       Now: well, let’s just skip “now” for now. It won’t be like today in a couple of days, will it?  Change is the way things are going to be.  On the other hand:  we haven’t had a war.  Earthquake’s upheavals. Tornado or hurricane’s eradication of the whole landscape. Our cruise ship didn’t sink. (And we didn’t get locked up outside one of our finest ports, unable to swim the miles to an invisible shore.)

       Some in authority speak of living  two weeks down the line.  (Sadly, others talk of “next summer.”)  I have heard the phrase:  “Things will never be the same, will they?”  Then I glance at my investments and groan.  There seems to be nothing else I can do (except maybe pray). 

       Still and all:  I and mine are alive and well.  There’s been no fatal automobile accident.  All kinds of people in authority seem to have plans that stress “what we did before, remember? It worked pretty well, didn’t it?  We have our Army, and those military branches somebody seems to know about.  We can Build things, can’t we?”   So we have to slow down.  Well, I can do that if I must.  

       But Lord:  we will have to focus on those poor and unemployed persons we know and could know. It’s time to give, to share, to “think outside the box.”  Members of our own family have experience in doing that.  Let’s Unite.  Let’s lick this.  We’ll be patient, and innovative, and use our imaginations.  And, relax a little.  I just know that we can come out of this united; maybe not better; but not licked!

HNN Introduces Classroom Activity Kits

As part of a new initiative to merge the worlds of education and journalism, HNN is introducing a series of Classroom Activity Kits.

 

These kits, all crafted by undergraduate students at George Washington University, are designed to illustrate the relationship between current events and history. Each kit consists of a complete 45-60 minute lesson plan which requires little-to-no preparation on the part of the instructor. These lessons are designed for high school and college students, but they can be altered to suit other education levels.

 

During these lessons, students will use news articles as a tool for understanding a specific history. In doing so, students will engage in critical thinking, group work, text analysis, research and more. This first batch of activity kits covers the history of U.S. immigration, climate change, sports activism, and the U.S. prison system.

 

Aside from the educational benefits to students, these kits are also a great resource for educators. All you need to do is click one of the download links below and boom! – you have a fully formed unique activity.

 

HNN’s Classroom Activity Kits are part of a continuing effort to bring news into the classroom.

 

Click on each link below to learn more and download the activity kit. 

 

 

Classroom Activity Kit: The History of Climate Change

What do farmers from the 1950s, anti-smoking campaigns and climate change have in common? Download this Classroom Activity Kit to find out.

 

Classroom Activity Kit: The History of U.S. Immigration

This Classroom Activity Kit teaches students about U.S. immigration history while also highlighting their personal histories.

 

 

Classroom Activity Kit: The History of Sports Activism

Discussing athletes from Jackie Robinson to Colin Kaepernick, this Activity Kit teaches students about the history of political activism in sports.

 

 

Classroom Activity Kit: The History of Private Prisons in the U.S.

Download this Classroom Activity Kit to teach students about the history of the American prison system.

 

 

Classroom Activity Kit: The History of Climate Change and Activism

Download this Classroom Activity Kit to help students understand climate change activism in its historical context.

 

Our Stories Will Carry Us to the Future. Can They Save Us Now?

 

In his essay ‘The Storyteller,’ 20th century German-Jewish philosopher of history Walter Benjamin recounts an episode in the Histories of Herodotus, on the humbling of the Egyptian king Psammenitus. Captured by the Persian ruler Cambyses at Pelusium in 525 BC, Psammenitus was forced to witness his son and daughter join a parade of Egyptian prisoners. But he remained unmoved until he saw an old man, a former servant, stumbling along at the tail of the procession. Only at this final sight did Psammenitus beat his fists against his head in anguish.

According to Benjamin, the episode shows us the potential stories have to reach across time—not with declarative “truths” but with questions or mysteries. Successive generations have speculated on the reasons for Psammenitus’s reticence and sudden show of grief, but, Benjamin tells us, Herodotus “offers no explanation. His report is the driest.” This is why the tale can still provoke many thousands of years later: “It resembles the seeds of grain which have lain for centuries in the chambers of the pyramids, shut up air-tight, and have retained their germinative power to this day.” Psammenitus’s story should provoke us to consider how the distant future will find the seeds of our society and our stories about ourselves.

The changes we are making to the Earth run wide and deep. The environmental crisis is accelerating, absorbing lives on a daily basis; but to appreciate the scale of this, we need to also learn to see the deep time impacts of our actions. And yet, we struggle to really see this, in the parade of traces human activity daily leaves on ocean, air, and earth. 

Sea level rise is one of the most urgent crises facing humanity; how far the waters climb will decide the fate of hundreds of millions. 150 million people could have to leave their homes as early as 2050.  Cities like Bangkok and Mumbai could become lagoon cities, or even be lost altogether to the rising waters. War and want would surely follow. Whole regions could go up in flames, ignited by water.

If the worst predictions come to pass, by 2100, 630 million people could be homeless. The economies of entire nations will collapse, and already-unstable regions will face unprecedented pressures. Mumbai, India’s economic hub and home to one of its main nuclear weapons research facilities, will become again the archipelago of small islands that it was in the 1850s. Basra, the second largest city in Iraq, will be submerged beneath a vast inland lake.

Although so much will be lost, the inundation will also be an act of preservation, on a scale that has never been seen before. 24 million people live in Shanghai—by the end of the next decade, it will be closer to 30 million—but the city has already sunk by 2.6 metres in a hundred years. The financial district of Pudong boasts some of the tallest buildings in the world, erected impossibly on a soft bed of mud and sand that is also threaded by over 600 kilometres of metro lines. Seawater that floods its streets will claim even the tallest towers over the course of several hundred years. But the subterranean city—the underground shopping malls, the metro stations, and the steel and concrete roots of the skyscrapers—will be sealed against decay by a layer of thick marine sediment. In time, as it is pressed deeper into the rock, the city will become a vast trace fossil of twenty-first century life. 

Some miraculous transformations will occur underground: glass will devitrify, acquiring a milky sheen; iron will react with sulfides in the sediment to form pyrite, or fool’s gold. Everyday objects discarded in the rush to flee the drowning city will leave their impressions. One hundred million years from now, writes Jan Zalasiewicz in The Earth After Us, Shanghai will be a metre-thick layer in the strata, filled with the outlines of sim cards and bicycle wheels, as precise as fingerprints.

“We live in things,” wrote Virginia Woolf. Each molecule of carbon and sliver of plastic is a message to the future, locked up against decay and with the potential to flower into meaning millennia from now. 

Our future fossils will be our stories, but it is hard to predict how, exactly, they will be interpreted. What will the C-14 bomb spike, trace evidence of thousands of nuclear weapons tests since the middle of the twentieth century that will be legible for tens of thousands of years, say about us? Or the ‘reef gap,’ a crimson stain in the rock record that, millennia from now, might tell that a continuous coral ecosystem, 2,300 kilometres long, once thrived off the east coast of Australia? Just as we wonder how Psammenitus could stand impassively, perhaps anyone confronted with our stories in the deep future will wonder how we could simply stand by and watch the procession of disaster. 

But our future fossils are both our legacy and our opportunity. We can choose how we will be remembered. In learning to see the deep future flash upon the present, we will be better able to shape how our story will be told.

We’re confronted every day with stories of urgent and strange change. The evidence is quite literally all around us, but we are conditioned to look away, as if the climate crisis were also a crisis of the imagination. To remedy this, science is essential, but we also need stories. We know ourselves first and foremost in the tales we tell. This has been so since the beginnings of humanity; and since then, it is art and narrative that have borne our essence into the future. The question we must ask is, what stories will our future fossils tell?

Why Holocaust Fiction?

 

As a biographer I envy novelists, who can craft a captivating tale without needing to carefully document sources of information. Though they face definite challenges, fiction writers can, with whatever degree of knowledge and understanding they possess, invent composite characters and telling dialogue.

 

As a reader of Holocaust literature, I prefer non-fiction. Give me the facts straight up, please. I do not want to be left wondering whether some person existed or some action occurred. And though I love good stories, I see little need to manufacture them when the truth is powerful and strange and terrible enough. 

 

As a daughter of survivors, I think my grandparents—two of whom were in their early forties and two in their early sixties when they were killed in Auschwitz-Birkenau—would have wanted the full truth of what happened to them and their children to be wailed unto the heavens—lest they all vanish without a trace of having existed, as evidence of the human capacity for evil. 

I understand, however, that a case can be made for Holocaust fiction. In portraying virtuous, ignoble, or complex characters in extreme situations, novelists shed light on human behavior in normality. Vivid scenes facilitate our entry into foreign worlds. And in distilling events, the fiction writer can make the complicated comprehensible. Finally, readers who might not otherwise have known that Mengele experimented on twins, or that diplomat-rescuers saved some prominent Jews, or that tattooists engraved numbers on the arms of select Auschwitz inmates—or about any other of the innumerable dimensions of the maelstrom—might learn something. They may be spurred to further exploration. 

 

When fiction is “based on true events” we receive more than the author’s imagination. Of course, the degree to which such stories can be relied upon for historical accuracy varies. How deeply did the author research the subject? How can those of us who are not scholars evaluate whether we are getting a true picture? 

 

Some fiction writers use the Holocaust in the service of a good yarn—as if throwing perpetrators or victims or survivors into their narrative adds pathos or heft. Sometimes the most fantastical accounts (for example, the movies Life is Beautiful and Jo Jo Rabbit) dish up the absurd enmeshed in the plausible, like an SS officer barking orders at inmates in a language they did not understand. This happened. Nazi leaders training Hitlerjugend to defend the fatherland—this happened. While such works may be accused of trivializing the most serious of subjects, they make no claim to being other than farcical. 

 

But pretenders to truth (such as Binjamin Wilkomirski’s Fragments: Memories of a Wartime Childhood) indisputably cross a line. Passing off a false account as true provides fodder for Holocaust deniers and affronts us all. 

 

Sometimes true accounts are mistaken for fiction. Each semester that I taught a college course on the Holocaust, students would hand in papers that read, “In Elie Wiesel’s novel Night...” The Nobel laureate wrote several novels, but Night is a true account of teenaged Wiesel’s experience of the war. I wanted my students to know that. 

 

Survivor accounts are among our most trustworthy sources. Though it was near impossible for people in extreme situations to remember precise dates and times (not even decently-fed soldiers could recall such details), those who were there were (and are) experts on what they saw and felt on their own skins. 

 

Perhaps, then, the greatest good that ever came out of a work of Holocaust fiction was the Institute for Visual History and Education of the University of Southern California Shoah Foundation. In March of 1994, after accepting an Academy Award for Best Picture for Schindler’s List (based on Thomas Keneally’s historical novel), Steven Spielberg launched an ambitious and impractical project: knowing that survivors’ stories would soon be lost to history, he would capture on video as many of their testimonies as possible. Among the roughly 350,000 survivors then alive, most had been young adults during the war; they were now seniors; it would be a race-against-time. 

 

Moving quickly and efficiently, with the aid of historians, scholars, and production logistics experts, and with project directors, coordinators, and, eventually, 2,500 interviewers in 33 cities in 24 countries, Spielberg set about achieving his goal. Wanting viewers to “See the faces, to hear the voices,” he insisted that survivors be interviewed in their own homes wherever possible. They were to tell their complete life stories, but spend most of the interview recounting their Holocaust experiences. Spielberg’s team ultimately amassed 52,000 videotaped testimonies. 

 

What possessed these (mostly) ordinary citizens, including those who were shy or humble or who had never before spoken about their experiences, to dress neatly, invite interviewers and videographers into their living rooms, and open up about the darkest period of their lives? For one, most knew about Steven Spielberg’s Schindler’s List. Secondly, they learned about the project through multiple media sources; flyers and ads with the headline “So Generations Never Forget What So Few Lived to Tell” awakened their sense of moral responsibility. Ms. Miller, a child who had hid with her family in a crowded farmhouse in the hills of Italy, said, “We come forward because we are aware of our own mortality and how important it is to share what happened.” 

 

Fortuitously, during the six-year period (1994-2000) in which the interviewing took place, there was an explosion in the field of information technology, enabling Spielberg’s team to create a vast, searchable cyber-archive. The carefully catalogued and scientifically preserved videos were distributed to various organizations (including the U.S. Holocaust Memorial Museum and Yad Vashem). 

 

It would take twelve years (or more than 105,000 hours) to watch all of the interviews. I have only had time to view some, obtained online through the USC Shoah Foundation. Once I began listening to certain survivors, I could not tear myself away. Their stories are inherently harrowing, gripping, and educational. And authentic. 

 

For his noble work, all of humanity owes a debt of gratitude to Steven Spielberg (whose foundation has subsequently worked with the Kigali Genocide Memorial to capture the testimonies of survivors of the Rwandan genocide). Twenty-five years after the release of Schindler’s List, in December 2018, the filmmaker reflected on this “most important experience” of his career. Survivors who could bear to watch the film told him that it could not compare to what was. But they were glad he told the story—it should not be forgotten.

 

Owing to the singular circumstances, perhaps no author of Holocaust fiction can aspire to again produce a work as far-reaching as Schindler’s List. But writers who ignore or take liberties with the truth ought to reflect on their purposes. If in some measure they aim to edify, counter hate, and inspire empathy, they might be mindful of those who did not live to tell their stories—who, when they could, engraved their names and places of birth in the walls of barracks, or implored others to remember them. Had they had a choice, I believe Hitler’s victims would have wanted nothing about the mortal crimes against them falsified. 

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174674 https://historynewsnetwork.org/article/174674 0
Plots Against America?: Jim Crow was Homegrown Fascism

 

Note: this essay quotes a sign attached to the body of a man lynched in 1919, which uses a racial slur.

HBO’s recent release of a prestige adaptation of Philip Roth’s 2004 novel The Plot Against America makes it worthwhile to examine whether fascism is really so alien to the United States as many wish to believe. In Roth’s novel, a Jewish extended family in Newark experiences fascism’s arrival in America, with the 1940 election of Charles Lindbergh to the presidency, as an intrusion of European extremism against an American Way—and a short-lived one, at that. When many white thinkers ponder Roth’s Plot or the sardonic title of Sinclair Lewis’s 1935 novel It Can’t Happen Here, they often miss the many ways in which it has already happened here. After all, the public prominence of the Ku Klux Klan and the massive riots that rock the climax of Roth’s novel were already regular features of American life, depending upon where you lived. For example, in early May 1927, just a few weeks before the historical Lindbergh took off from Roosevelt Field in the Spirit of St. Louis, some 5,000 whites rioted in the black business district of Little Rock, Arkansas, where they burned the body of a man named John Carter. Local police officers, many rumored to be Klan members, did nothing to stop the violence—and may have even taken part.

Black observers of current trends, however, tend to be a little more astute. For example, in his February 21, 2020, New York Times column, Jamelle Bouie argues that the expansive authoritarianism of Donald Trump has its analogue in the Jim Crow South. Yet we should not treat the Venn diagram of fascism and Jim Crow as a single circle. The reality is a little more complicated.

The word “fascism” has also long been employed as a political Rorschach blot. As early as 1946, just one year after the end of World War II, George Orwell was complaining of this fact in his essay “Politics and the English Language,” writing: “The word Fascism has now no meaning except in so far as it signifies ‘something not desirable.’” But let us take this as a working definition:

Fascism is the attempt, birthed in reactionary politics, to resolve the contradictions of democracy for purposes of preserving elite power against the demands of the masses.

This will need some explanation. Although we today associate democracy with high ideals, its origins are a bit grubbier. For example, opposition to the Angevin kings of England (which led to the Magna Carta) included, according to historian Robert Bartlett, such charges as “heavy taxation, elevation of low-born officials, slow and venal justice, disregard for the property rights and dignities of the aristocracy.” Much of the Magna Carta focuses upon preserving the property rights and prestige of the aristocracy while limiting the king’s ability to levy certain taxes without “the common counsel of the kingdom.” The eventual emergence of a mercantile class in the late Middle Ages and early Renaissance sparked another expansion of “democracy,” as the bourgeoisie sought similar privileges in order to protect their own wealth. With industrialization, and the eventual concentration of the lower classes into cities, the emerging proletariat began to press for access to the franchise itself on both sides of the Atlantic, as exemplified in the UK by the Reform Act of 1832 and in the US by Jacksonian democracy and the removal of property qualifications for the vote. 

At each step in the expansion of democracy, those who already possessed the franchise feared the loss of their power and wealth by allowing any “lower” classes the privilege of voting. The United States has been much more a racial society than a class society along the lines of the UK, and so here it was easier to get elite buy-in to the idea of universal male suffrage, so long as those males were exclusively white. The abolition of slavery and the expansion of suffrage to those former slaves and their eventual descendants provoked the rage of the south’s idle landlords, who initiated a campaign of violence in the immediate aftermath of the Civil War in order to return to the status quo ante of black servitude and submissiveness. What historians call the first Ku Klux Klan was an elite project to scuttle the political empowerment of African Americans.

The “contradictions of democracy” can be seen in this struggle between those who believe that republican government should preserve elite power and the democratic desire that all citizens be given a voice in how they are governed. Fascism is an attempt to short-circuit this tension through the advancement of a purely corporate figure who is cast as the savior of “the people,” not by empowering them but rather by emphasizing his own unique attributes to act on their behalf. As the Israeli scholar Ishay Landa points out in The Apprentice’s Sorcerer: Liberal Tradition and Fascism, while fascism regularly employs the rhetoric of collectivism, it centralizes such collective and democratic yearnings upon the individual strongman leader, so that he becomes democracy personified, the one true spokesman for “the people,” who no longer need engage in self-governance. 

But there is more to it. As historian Aristotle Kallis observes in Genocide and Fascism: The Eliminationist Drive in Fascist Europe, fascist ideology was born with the specific aim of seeking redemption from recent “humiliations” by latching onto the glories of the past to drive a new utopian future. This “redemption” manifested itself externally, through expansionist policies of conquest, and internally, through a “cleansing” of the population aimed at eliminating those figures responsible for recent humiliations: socialists, communists, Jews and other minority groups. The drive to “cleanse” the state, Kallis writes, “helped shape a redemptive licence to hate directed at particular ‘others’ and render the prospect of their elimination more desirable, more intelligible, and less morally troubling.” 

This has been just a brief overview, but it allows us to draw some parallels between fascism and Jim Crow. Both fascism and Jim Crow were means of limiting democratic participation and thus the political and economic emancipation of certain “others.” And both fostered a “license to hate” that resulted in massive violence against the enemies of the elite. But there are more parallels. As Landa writes in Fascism and the Masses: The Revolt against the Last Humans, 1848–1945, “Rhetoric of honoring labor aside, the Nazis strove to achieve the exact opposite: keeping wages low and increasing working hours, which was precisely what German business was insisting should be done throughout the years of the Weimar Republic.” Much the same held true in the Jim Crow South, where particular ire was reserved for those who resisted the southern tradition of racialized economic exploitation. In June 1919, after Clyde Ellison refused to work for Lincoln County, Arkansas, planter David Bennett for a mere 85 cents a day, he was hanged from a bridge, with a sign attached to his body reading, “This is how we treat lazy niggers.” Later that year, and not too far away, white mobs and soldiers would slaughter untold numbers of African Americans, in what has become known as the Elaine Massacre, for daring to organize a farmers’ union.

Although both fascism and Jim Crow constituted violent means of securing elite power, there are important distinctions to note. While southern states had their share of demagogues, Jim Crow was a multi-generational project of the Democratic Party, one not centered upon any particular individual. Too, while both fostered a “license to hate” against racial and ideological others, the Jim Crow project made a distinction between “good negroes” who “knew their place” and “bad negroes” who sought the privileges reserved to whites. The latter may have to be killed, and the region “cleansed” of those “outsider” whites who spread dreams of equality, but black people who were dutifully submissive could be tolerated in so far as their unpaid or underpaid labor created the region’s wealth.

Back in 2004, The Plot Against America was widely regarded as a commentary about the administration of George W. Bush. No doubt, the 2020 television adaptation will be viewed in the light of Donald Trump, whose rhetoric and policies have been compared by critics to both fascism and Jim Crow. Perhaps the television series will exhibit a more sophisticated understanding of fascism and America than did the book. Perhaps not. Either way, the series should provide a good opportunity for historians to educate the public on who, exactly, lay behind the centuries-old plot against all Americans.

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174673 https://historynewsnetwork.org/article/174673 0
The Three Daring Women Who Traversed the Himalayas

 

 

Antonia Deacock, Anne Davies and Eve Sims were three rather extraordinary women who, then in their mid-twenties and thirties, set off overland from England to Tibet in 1958.  Their aim was to climb one of the Himalayas’ unexplored high peaks. They made the 16,000-mile drive to India and back, adding a 300-mile trek on foot to Zanskar, a remote province of Ladakh (part of Kashmir), and became the first European women to set foot there.

 

They called their adventure the Women’s Overland Himalayan Expedition, and were inspired and encouraged by their husbands, explorers and climbers themselves. Not wanting to be left behind "chained to the kitchen sink" when their husbands went on a trek, the women spent six months planning their own adventure. "It was just something we wanted to do," said Anne matter-of-factly. "And it was a good idea."

 

They had no vehicle, little money and no equipment, and didn’t even know each other very well at the start, but by working together they mustered enough support to fund their expedition, gaining sponsorship from, among other companies, Brooke Bond Tea (they mistakenly ordered enough tea "to keep a family going for 150 years," according to Eve), Illustrated magazine, John Player & Sons, and even the cosmetics company Max Factor. They persuaded Land Rover to sell them a demonstration model of a modified long-wheelbase all-terrain vehicle at a significant discount, and the British Ministry of Agriculture and Fisheries donated steak, vegetables, and berries preserved with the new technology of freeze-drying. A publicity campaign before they left captured the public’s attention, and headlines such as "Four Fed-up Wives" (one member had to pull out at the last minute, due to an unexpected pregnancy) helped raise the required funds.

 

They weren’t completely naïve, however. Anne was fluent in Urdu and Hindi and had previously trekked in Kashmir with her husband, their baby strapped to the back of a mule. Eve had spent two years motorbiking around Australia and New Zealand, and before that had learned to climb in Wales. Antonia was an experienced rock climber. Nor were they the kind of women to be deterred by the fact that two of them couldn’t even drive when they started planning the trip.

 

Eve and Antonia had not yet had children, but Anne left behind her three sons, aged 15, 14 and five. "I didn't feel guilty," she said. "Because Lester had gone off on expeditions, and this was my turn."

 

Their five-month journey saw them battle illness, delays, inhospitable terrain, the effects of altitude, and terrible roads. They occasionally had to fend off unwelcome advances and negotiate with recalcitrant porters. Rest days were spent scrubbing their laundry on rocks in a nearby river, catching up on correspondence–communication was slow and difficult, and they could only occasionally get word to their husbands and families that they were safe–and checking their supplies. The expedition was a considerable feat of organisation – after the women estimated what they would need to last the whole journey, several crates were shipped ahead to Bombay and had to be retrieved from the docks after bureaucratic delays.

 

Due to the heat in Iran, they were often forced to drive at night, and were faced with constant enquiries as to where the ‘sahibs’ were and incredulity that the women were travelling on their own. They camped almost everywhere, and were welcomed by Land Rover’s agents in the European cities they travelled through, who praised the women for their maintenance of the vehicle, though the agents often expressed surprise that the women completed the trip successfully. 

 

Following an audience with the Indian Prime Minister, Nehru, they were granted the rare privilege of travelling beyond the "inner line", a boundary across India and Tibet that had been drawn up in the 1800s and beyond which no British subject might rely on government protection or rescue. No non-Indian had been granted permission to cross the inner line since before the Second World War, and the land beyond it was one of the last regions of the planet untraveled by Europeans. They were also thought to be the first European women to cross Afghanistan unescorted. 

 

Fording fast-flowing meltwater streams, sometimes up to their waists in water, they trekked through snow and ice, climbing an 18,700-foot peak and naming it Biri Giri ("Wives Peak"). As they travelled, they passed through villages untouched by European contact, meeting locals who had never seen things such as zippers or nylon climbing ropes. One elderly woman was terrified by the sight of her own face in one of their mirrors. "How privileged we were to witness and partake in societies that were virtually strangers to the modern world that we know," said Antonia afterwards.

 

The women were bound by a common desire to prove themselves. "I’d been my father’s daughter, my husband’s wife," said Eve. "But this time I was somebody on my own." Despite living in close proximity and enduring many hardships, surviving dust and discomfort, the freezing cold and the intense heat, they claim never to have argued, preferring to thoroughly discuss issues and abide by a majority rule on decisions.

 

After their expedition, they carried on the spirit of adventure in their lives. Antonia wrote a book about their exploits, then eventually moved to Australia and established an Outward Bound school and the world’s first dedicated adventure travel company with her husband, also establishing close links with Nepal. Eve had three children and went on to run an Outward Bound centre with her husband, and Anne helped her husband to run an Outward Bound school in the Lake District.

 

The 1950s were a time when relations between Britain and other Western countries and the newly independent nations of South Asia were in their infancy, and the women’s fearless endeavour is all the more remarkable for it. Although trekking in remote lands is now far more common, it is unlikely that they would be able to make such a journey–particularly through Afghanistan and Iran–today.

 

A film about their expedition, by Pulse Films and Britain’s Film4, is in development. Antonia Deacock’s book, No Purdah in Padam, is now out of print, but available from antiquarian booksellers.

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174675 https://historynewsnetwork.org/article/174675 0
As We Zoom into Online Learning….

 

 

“Zoom” – this playful kid-word, which once referred to fast cars, now signals a fast-approaching sea-change in higher education, unfolding before our eyes thanks to COVID-19 and the need to move live university classes online. Zoom, for those who do not know, is the video meeting platform by which we faculty are all migrating our classes to an online format.

 

We all might have larger things to worry about in the next few weeks and months, like our loved ones, our colleagues, and our students becoming terribly ill. If this happens, then the nature of online education will hardly be our biggest problem. 

 

But for the moment at least, as an academic who has been teaching at state universities for nearly thirty years, I am torn concerning the issue at hand. On the one hand, the students who signed up for my History of the Holocaust class this semester at the University of Florida did so because they were interested in the topic, some intensely so. As I try and move my lectures and discussion sessions from a classroom to a Zoom format, I want to provide something as close to the classroom experience as I can. On the other hand, I suspect, as do many of my colleagues, that university administrators and state legislators throughout the US will study this crash experiment in online education very closely one day. Are we academics showing them how they might replace us in the name of heightened efficiency? 

 

We can agree that some of the efficiencies are indeed desirable. Those of us who remember putting books on reserve in the library for twenty-five students at a time will attest to this. A certain number of online classes, moreover, have existed for the last couple of decades, helping place-bound students and those students, younger and older, who work full time. But what happens when everything goes online, and all at once? If we discover that all classes can be delivered online from a remote location, then what is the point of having lecture halls, classrooms, or for that matter a diverse faculty of broad expertise and talents? The arguments that have been percolating in universities for the past decade will intensify overnight. Yes, there are faculty who have put in an immense amount of time in order to develop fine online experiences. But I have also seen half-baked efforts over the years that are rather disastrous, even within the oft-cited rationalizing context of the apocryphal 1970s professor (I never actually had one of these guys) who mumbled through his yellowing lecture notes. 

 

My colleagues are proceeding cautiously. One colleague warned me not to record my lectures into the Zoom cloud, but to provide them live through Zoom. Everyone, I hear, is giving synchronous (“live” in Zoomspeak) as opposed to asynchronous (“recorded” in Zoomspeak) lectures. Anyone who has seen the infatuation with online learning in higher education administration over the past twenty years knows that this is hardly a paranoid reaction. The university would own the recorded content, as the work is done for the university in return for compensation. I actually recorded the first few lectures for my class. I needed to crawl with this technology before I could walk, and the students, I thought, would need time to adjust to the new reality of a full course load online as they simultaneously move from Gainesville back to their homes in Florida and elsewhere in the US.  Nonetheless, like some of my colleagues, I am uneasy even with synchronous content. If I have learned anything from other people’s travails with Facebook and Twitter over the years, it is that nothing put online, even briefly, is truly protected or truly deleted. 

 

And if I know what faculty will say once the experiment is over, I am less sure about the students.  I hope that they give a contextual yet roundly negative assessment -- something like, “I understand the situation, but I can’t wait to get back to live classes.” But today’s students are online-surveyed to death, starting with online evaluations of faculty each semester that most do not complete. Worse, twenty-somethings believe that they can effectively multitask–what others of us would call diffusing one’s focus. The professor telling them at the start of class to turn off their cell phones and laptops is now the professor who depends on these devices as we try to lecture or hold discussion sections from remote locations. Zoom actually has a feature that discloses a student’s level of attention—have they left their screen? Are they messaging? Are they watching Netflix? But I really don’t want to check that feature, and the fact that Zoom has it at all reveals the nature of the problem. Live classes promote a level of decorum that everyone in the classroom understands and from which everyone in the room benefits. But taking a class alone in one’s kitchen or bedroom? There is a reason that different rooms have different names, and in every language.

 

Finally, there is the quality of our own work, our pedagogical preparation. Like most of my colleagues in certain disciplines, I believe that each lecture and each discussion is the result of having worked at our craft over a period of years. What material shall we present to make a particular point about, say, Jewish resistance in the Warsaw ghetto, or about Reconstruction after the Civil War, or about Robespierre’s dictatorship? How shall we present it? What wording will we use? What visuals will we use? When will we leave the lectern for a stroll up the aisle? When will we pause and urge the students to think rather than just take notes? What questions will we pose to them when they discuss? How can we encourage them to interact and even debate with one another face-to-face-to-face, complete with expressions and gestures? How will we get them to understand that there are no black and white answers but only arguments, some thoughtful, some needing intensive development? 

 

These questions and many others form the very stuff that makes live higher education on a university campus an experience for faculty and students that cannot be replicated online, at least through the Zoom technology with which I have become familiar. Even if all of the technology “works,” won’t our broader efforts, having been squeezed through the portal between a faculty computer and those of the students, come out distorted on either side in ways that we cannot yet fully recognize? Zoom is fine technology—for conference calls. It enables business executives to talk to one another over long distances while presenting flowcharts and such. Academics can even have faculty meetings via Zoom, so that we ourselves can “multitask” to our heart’s content while discussing the minutiae of departmental by-laws.

 

But the real interaction that results in true learning? I am not sure at all. Zoom at its bandwidth-driven heart allows us to see one another and hear one another only to the point where we can talk to one another’s images (with cheesy optional backgrounds of the tropics or of outer space no less) and not speak to one another as human beings. We can see, but our vision is circumscribed. We can listen, but our hearing is muffled. We can connect, but our interaction is impeded.

 

For this, we all need to be, once again, in the same room. 

 

Let’s hope it is soon.

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174676 https://historynewsnetwork.org/article/174676 0
Use What You Know: Online Teaching Tools

If you teach history at any sort of educational institution, whether K–12 or higher ed, chances are your institution has “pivoted” to remote (online) instruction, if not closed campus entirely, as a response to the rapid spread of COVID-19 and the imperative for social distancing. Over the past week, a wave of these pivots and closures has left many of us scrambling for alternative means of engaging our students in an online and likely asynchronous setting. This is not the optimal way to teach and learn history, given that good online courses take more time and planning to develop than the handful of days most of us have been given.

So what do we do? How do we keep as many of the essential elements of our course as possible, even if those look different online? We want to help our students stay engaged with history; that is, to still feel present in the course and to work actively with the course material. We want our students to do things like discuss, analyze, work with primary sources, and be able to communicate their interpretations to others…

Many historians have already been doing this kind of teaching online, so there might not be the need for you to re-discover fire. Check out the #twitterstorians hashtag on Twitter to read and participate in ongoing conversations and resource sharing. Waitman Beorn, a senior lecturer in history at Northumbria University, has generously created a spreadsheet of teaching tools, digital history sites you can direct students to, digital tools for historical scholarship, digital humanities projects, and digital archives (use the tabs at the bottom of the spreadsheet to navigate between categories). H-Net has put together a repository for resources on teaching history online, which should be a thriving community soon. One of the most exciting cross-disciplinary products has been the “Keep Teaching” online community set up by the staff of Kansas State University’s Global Campus, an excellent virtual gathering spot where faculty, staff, and designers are sharing tips, tricks, techniques, and—most importantly—solidarity as we navigate this rapidly changing landscape together. 

As historians and teachers, we pride ourselves on being able to engage students with the complexity and wonders of the past. Though our current circumstances are far different than we anticipated, we have the research skills and critical faculties to help solve this new set of problems. Being analytical and discerning about the tools we use is a necessary part of that process, but so, too, is our discipline’s remarkable willingness to collaborate and share expertise. If you’re one of the thousands of us “moving online,” good luck, and see you on the internet!

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174651 https://historynewsnetwork.org/article/174651 0
The Cultural Constants of Contagion

 

Sometime during the year 541, a few rats found their way into Byzantium. Soon more would arrive in the city. Whether they came from ships unloading cargo in Constantinople’s bay, or overland in carts bringing goods from points further east, the rodents carried fleas harboring the Yersinia pestis bacterium. Byzantium’s citizens, ruled over by the increasingly erratic Justinian, had heard accounts of the plague as it struck down other nations first. The historian Procopius, who was witness to the ensuing pestilence, writes in his History of the Wars that the resultant disease “seemed to move by fixed arrangement, and to tarry for a specified time in each country, casting its blight… spreading in either direction out to the ends of this world, as if fearing some corner of the earth might escape it.” 

Known as Justinian’s Plague, the pandemic marked the first instance of the bubonic plague in Europe, almost a millennium before its more famous manifestation as the medieval Black Death. William Rosen writes in Justinian’s Flea that while contagions had long been a feature of human life, before the sixth century “none of them [had] ever swept across what amounted to the entire known world, ending tens of millions of lives, and stopping tens of millions more from ever being born.” By the pandemic’s end, epidemiologists estimate that perhaps a quarter of the known world’s population had perished.

There is something deep within our collective unconscious that dimly apprehends these past traumas. The accounts of a historian like Procopius sound familiar to us; echoes of such horror linger in the skeleton masks of Halloween, in the morbid aesthetic of the gothic, and in the horror movies of pandemic–from the clinical Contagion and Outbreak to the fantastical The Walking Dead and 28 Days Later. We thrill so fearfully to narratives of apocalypse that, when faced with our own pandemic, there is something almost uncanny about it. For the past several months, a few Western observers have nervously read reports about the coronavirus outbreak in Wuhan, China. More people paid attention as cases emerged in Italy. Shades of Procopius, who reflected that “at first the deaths were a little more than the normal, then the mortality rose still higher, and afterwards the tale of dead reached five thousand each day, and again it even came to ten thousand and still more than that.” Today almost every nation in the European Union, a majority of U.S. states, and every continent save Antarctica are affected by the coronavirus. Physicians have yet to fully ascertain the disease’s mortality rate, but thousands have already died, and contrary to the obstinate denials from the president of the United States, Covid-19 clearly seems to be more than “just the flu.” 

Those of us who work in the humanities have, for more than a generation, been loath to compare radically different time periods and cultures–and for good reason. There can be a flattening to human experience when we read the past as simply a mirror of our own lives. Yet plague is, in some ways, a type of cultural absolute zero, a shared experience of extremity that does seem to offer certain perennial themes that approach universality. For the first time in more than a century, since the Spanish Influenza outbreak of 1918, we collectively and globally face a pandemic that threatens to radically alter the lives of virtually everybody on the planet. It behooves us to converse with the dead. Much is alien about our forebears, but when we read accounts of pestilence raging through ancient cities, it’s hard not to hear echoes of our own increasingly frenzied push notifications. 

There is a collection of morbid motifs which recur with epidemics–the initial disbelief, the governmental incompetence, hoarding of goods, desperate attempts at protection (including the embrace of superstition), social stigma and bigotry, social distancing and quarantine, and of course panic. In the pilfered N95 surgical masks there are echoes of the avian masks worn by Renaissance plague doctors; in the rosemary, sage, and thyme clutched by medieval peasants there are precursors of the Purell which we’re all desperately using (albeit the latter is certainly more effective). As different as those who lived before us may be, the null void which is the pandemic does display a certain universality in how people react to their circumstances, born necessarily from biology itself. For example, gallows humor is an inextricable and required aspect of human endurance, though it can also mark a dangerous denialism. Catharine Arnold writes in Pandemic 1918 that in its earliest days, the Spanish flu was either denied or joked about. Arnold writes that “At the [Cape Town, South Africa] Opera House, a cough in the audience provided the actor on stage with an excellent opportunity to ad-lib. ‘Ha, the Spanish flu, I presume?’ The remark brought the house down.” However, she adds that “within days, that joke wasn’t funny anymore.”

Few emotions are as all-encompassing, understandable, and universal as panic. In the queasy feeling which a pandemic inculcates throughout a society there is a unity of experience across disparate time periods. Consider the language of The New York Evening Post in July of 1832 as it traced the inevitable arrival of a cholera epidemic into the city, where inhabitants of the metropolis fled “from the city, as we may suppose the inhabitants of Pompeii or Reggio fled from those devoted places.” Naturally, hoarding is another predictable behavior, if often a dangerous and self-defeating one, as anyone who has tried to buy hand sanitizer or toilet paper in the last week can attest, or as the squirreling away of face masks currently needed by public health officials shows. Procopius again notes that in Byzantium, “it seemed a difficult and very notable thing to have sufficiency of bread or of anything else; so that with some of the sick it appeared that the end of life came about sooner than it should have come by reason of the lack of the necessities of life.”

This panic can metastasize into rank irrationalism, as pandemics often lead to the embrace of quack cures or, more disturbingly, the promulgation of noxious hatreds. Philip Ziegler writes in The Black Death that during the pandemic of 1348, in France a “fashionable course of study” regarding the plague included “an ointment called ‘Black de Razes’” that was “on sale at apothecaries as a cure recommended for virtually any ailment.” Its effectiveness might be compared to that of the colloidal silver solutions hawked by televangelist Jim Bakker as a prophylactic against coronavirus. During the Black Death, fear was a convenient catalyst for old hatreds, as the plague was used to justify antisemitic pogroms. By dint of both their isolation and the religious strictures that encouraged rigorous hygiene, medieval Jewish communities were sometimes spared the worst of the plague. This was interpreted by Christians as complicity in the pandemic, and so the Jews were made to suffer for their previous good fortune. John Kelly writes in The Great Mortality that antisemitism “bubbled up from the medieval Teutonic psyche,” where in Germany and other nations the religious cult of the Flagellants “believed the curse of the mortality could be lifted through self-abuse of the flesh and slaying Jews.” 

Xenophobia similarly marked more modern pandemics. In the first four years of the twentieth century, San Francisco saw America’s only prolonged outbreak of bubonic plague. Brought by ship into the city, where today a coronavirus-contaminated cruise ship lingers off the bay, the plague broke out in Chinatown, leading to hundreds of deaths. Marilyn Chase explains in The Barbary Plague that “public health efforts… were handicapped by limited scientific knowledge and bedeviled by the twin demons of denial and discrimination.” Rather than offering treatment, Mayor James D. Phelan spread fear about the Chinese residents of the city, anticipating today’s talk of a “foreign virus.” He called them “a constant menace to the public health,” while ironically exacerbating the outbreak by dint of his policy. Meanwhile, California governor Henry Gage simply denied the existence of the plague at all, fearing more that the economy would be harmed than that his fellow citizens were dying. 

If there is a commonality to the recurring themes that mark pandemics throughout history, it’s because of the physical reality of the ailment. Covid-19 can’t be explained away, can’t be ignored, can’t be obscured, can’t be denied. It can’t be tweeted out of existence. It turns out that internet memes aren’t the same thing as viruses. We may yet discover that in a “post-reality” world, reality has a way of coming to collect its debts. 

An important reminder, however: for all of the uncertainty, panic, and horror which pandemics spread, there is also the acknowledgment, at least among some, of our shared affliction, our collective ailment, our common humanity. Within a plague there are fears both rational and irrational, there are prejudices and panics, there is selfishness, cruelty, and hatred. But there is also kindness, and the opportunity for kindness. The story of pandemics contains flagellants, but also selfless physicians and nurses; it includes the shunning of whole groups of people but the relief of treatment as well. We’ve never faced something like the coronavirus in the contemporary Western world, but we’d do well to remember something strangely hopeful that Daniel Defoe observed in his quasi-fictional A Journal of the Plague Year, set in 1665, when the last of the major bubonic outbreaks killed a fifth of London’s population: “a close conversing with Death, or the Diseases that threaten Death, would scum off the Gall from our Tempers, remove the Animosities among us, and bring us to see with differing Eyes, than those which we look’d on Things with before.”   

 

 

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174671 https://historynewsnetwork.org/article/174671 0
Bancroft Prize Goes to Books on Emancipation and Urban Renewal

A sweeping reconsideration of the complexities of Emancipation and a biography of the nearly forgotten mid-20th-century urban planner who reshaped Boston and other cities have won this year’s Bancroft Prize, which is considered one of the most prestigious honors in the field of American history.

Lizabeth Cohen’s “Saving America’s Cities: Ed Logue and the Struggle to Renew Urban America in the Suburban Age,” published by Farrar, Straus and Giroux, was cited for offering “a nuanced view of federally-funded urban redevelopment and of one of its major practitioners that goes beyond the simplicity of good and bad, heroes and villains.”

Reviewing the book last year in The New York Times Book Review, Alan Ehrenhalt praised Dr. Cohen, a professor of American Studies at Harvard, for her “incisive treatment of the entire urban-planning world in America in the last half of the 20th century,” and fair-mindedness in addressing what has become, he writes, “a highly polarized subject.”

The second winner, Joseph P. Reidy’s “Illusions of Emancipation: The Pursuit of Freedom and Equality in the Twilight of Slavery,” published by University of North Carolina Press, was cited by the prize committee for the way it builds on and departs from the huge existing literature on the subject to “deepen our understanding of the vagaries of Emancipation in the United States.”

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174669 https://historynewsnetwork.org/article/174669 0
Updated 4/3: What Historians Are Saying About COVID-19 and Trump's Response

 

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174574 https://historynewsnetwork.org/article/174574 0
Updated 4/3: Historians Discuss the Media's Coverage of COVID-19

 

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174678 https://historynewsnetwork.org/article/174678 0
Roundup Top Ten for March 20, 2020

The Shortages May Be Worse Than the Disease

by Elise A. Mitchell

Societies further their own destruction whenever they fail to provide anyone health care, housing, or dispensation from work because of their employment, socioeconomic, or immigration status.

 

We Need Social Solidarity, Not Just Social Distancing

by Eric Klinenberg

To combat the coronavirus, Americans need to do more than secure their own safety.

 

 

Hurricane Katrina Provides Lessons about Closing Campuses

by Andre M. Perry

Students in New Orleans needed resources to return to normalcy. But when racial wealth gaps are the norm, a stumble can become a fall.

 

 

Work Requirements are Catastrophic in a Pandemic

by Elisa Minoff

Instead, we should be implementing policies that support people’s work in the wage labor force and make it possible for working families to make ends meet.

 

 

Counting Everyone—Citizens and Non-Citizens—In the 2020 Census is Crucial

by Brendan A. Shanahan

Even without a citizenship question, the Trump administration wants to shape how states reapportion their legislatures.

 

 

Why Sanders Isn’t Winning Over Black Voters

by Keeanga-Yamahtta Taylor

For millions, even when government “works” it is not working for them.

 

 

An Epicenter of the Pandemic Will Be Jails and Prisons, if Inaction Continues

by Amanda Klonsky

How will we prevent incarcerated people and those who work in these institutions from becoming ill and spreading the virus?

 

 

Democracy: How 1860 Connects to 2020

by Daniel W. Crofts

In the years before the Civil War, just as today, minority rule was the norm. White Southerners dominated the Democratic Party, and the Democratic Party dominated the federal government.

 

 

College Worth Fighting For

by Ryan Boyd

Professors are in a class struggle, a real fight that cannot be won with critique alone.

 

We Can’t Forget Women as We Tell The Story of COVID-19

by Jennifer Brier

Women who have been medical (and political) subjects of HIV/AIDS also have much to teach us during our current pandemic.

Fri, 03 Apr 2020 17:42:38 +0000 https://historynewsnetwork.org/article/174668 https://historynewsnetwork.org/article/174668 0
An Interview with Mary V. Thompson on the Lives of the Enslaved Residents of Mount Vernon

 

Mount Vernon Historian Mary V. Thompson is the author of “The Only Unavoidable Subject of Regret”: George Washington, Slavery, and the Enslaved Community at Mount Vernon (University of Virginia Press, 2019).

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, Crosscut, Documentary, NW Lawyer, Real Change, Huffington Post, Bill Moyers.com, Salon.com, and more. He has a special interest in the history of human rights and conflict. He can be reached by email: robinlindley@gmail.com.

 

Drawing on years of extensive research and a wide variety of sources from financial and property records to letters and diaries, Ms. Thompson recounts the back-breaking work and everyday activities of those held in bondage. Without sentimentality she describes oppressive working conditions; the confinement; the diet and food shortages; the illness; the drafty housing; the ragged clothes; the spasms of cruel punishment; the solace in religion and customs; and the episodic resistance. 

Ms. Thompson also illuminates the lives of George and Martha Washington through their relationships with black slaves. Washington was a strict disciplinarian with high expectations of himself and his slaves. As a young man, he callously bought and sold slaves like cattle. However, as Ms. Thompson explores, his attitudes toward slavery and race changed with the American Revolution when he saw black men fight valiantly beside white troops. Although he was never a vocal abolitionist, his postwar statements reveal that he found slavery hypocritical and incompatible with the ideals of democracy and freedom for which he had fought. He was the only Founding Father who freed his slaves in his will.

Ms. Thompson brings to life this complicated history of enslaved people and their legendary owner. Her careful explication of the many aspects of life at Mount Vernon offers a vivid microcosm for readers to better understand the institution of slavery and its human consequences during the colonial period and the early decades of the republic.

Since 1980, Mary V. Thompson has worked at George Washington's Mount Vernon in several capacities, and currently serves as Research Historian who supports programs in all departments at Mount Vernon, with a primary focus on everyday life on the estate, including domestic routines, foodways, religious practices, slavery, and the slave community. She has lectured on many subjects, ranging from family life and private enterprise among the slaves, to slave resistance, to religious practices and funerary customs in George Washington's family. Her other books include “In the Hands of a Good Providence”: Religion in the Life of George Washington and A Short Biography of Martha Washington. Ms. Thompson also has written chapters for several books, entries in encyclopedias, and numerous articles. She earned an M.A. in History from the University of Virginia.

Ms. Thompson generously responded by email to a series of questions on her work and her new book on the slave community at Mount Vernon.

 

Robin Lindley: Congratulations Ms. Thompson on your recent book on George Washington and enslavement at Mount Vernon. Before getting to your book, I wanted to ask about your background. How did you decide on a career as a historian?

Mary V. Thompson: My father was a major influence on that.  He served for 32 years as an Army Chaplain and, through quite a few moves, would drag us to nearby museums and historic sites and encourage us to read about the next place we were going and all the exciting things that happened there, so we were pretty psyched by the time we got there. He was also the first curator of the Army Chaplains Museum, when it was in Brooklyn, during the Bicentennial of the Revolutionary War.  As part of that job, he also edited a 5-volume history of the Chaplains Corps, while writing the first volume, which covered the American Revolution.  So, as I went through high school, I helped in the museum with some of the exhibits, helped with acquisitions, and with research. I loved all of it.

Robin Lindley: I understand that you’ve spent most of your professional career as a historian at Mount Vernon. How did you come to work at this historic plantation and what is your role?

Mary V. Thompson: This was definitely a result of serendipity---or providence, depending on your world view.  I was getting ready to finish a master’s degree at the University of Virginia, while working as a volunteer for the Army Ordnance Museum at Aberdeen Proving Ground in Maryland, and sending out what felt like bazillions of resumes for jobs all over the country.  I started out part-time [at Mount Vernon] as an historic interpreter (giving tours to about 8,000 visitors per day).  From there, I moved on to doing special projects for the Curator, then to assisting full-time in the Curatorial Department.  I moved up to being the Registrar in the Curatorial Department, which involved cataloguing new objects as they came into the collection, keeping track of where everything was, doing inventories, working with insurance companies, etc.  

To keep me from going nuts, they gave me one day per week to do research on a specific, agreed-upon topic, the first of which dealt with foodways.  After a few years, my boss asked me to switch to studying slavery and slave life at Mount Vernon.  In the late 1990s, as the 200th anniversary of George Washington’s death was rapidly approaching, I worked on three major projects: a travelling exhibition entitled, “Treasures from Mount Vernon:  George Washington Revealed,” which opened in late 1998 and travelled to five cities around the country; redoing the furnishings in the mansion, with special exhibitions to make the house look as though the Washingtons had just walked out of the room; and the recreation/reenactment of George Washington’s funeral, a three-hour event on C-Span.  

I was then moved to the Library, where I worked as the Research Specialist and then as Research Historian.  This involved dealing with questions from people all over the country, generally dealing with domestic life here at Mount Vernon; helping authors, illustrators, and publishers by vetting publications; helping pretty much every department on the estate with helpful quotes and deciding whether we had enough information on a particular subject to do a special exhibit or program built around it.  Best of all was the opportunity to give talks on and publish my own research.  

Robin Lindley: What sparked your recent book on enslavement at Mount Vernon? 

Mary V. Thompson: I actually started working on the topic in the late 1980s, because Mount Vernon really needed to be able to teach its staff and visitors about this issue, but it was probably about seven or eight years after that before I knew it wanted to be a book.  It was in the early 1960s that I first learned about slavery, as a result of the Civil War centennial, which was going on when I was an elementary school student, at the same time that the Civil Rights movement was playing out on the news every night during dinner.  Then in graduate school at the University of Virginia in the late 1970s, slavery was the subject of much of our reading and classroom discussions.

Robin Lindley: Your book has been praised for its impressive detail and extensive research. What was your research process?

Mary V. Thompson: Thankfully, I was able to start with some of the sources compiled by prior members of our Library staff.  One of the Librarians had put together a bound volume of statements by George Washington on the topic of slavery, which she’d typed up back in the 1940s.  I went through that, page by page, listing the topics covered on each and then photocopied the pages and put them into loose-leaf binders for each of those topics.

I also went through bound volumes of photostats of the Weekly Work Reports that Washington required from his overseers, as well as photostats of his financial records.  The Weekly Reports provided detailed information on the work being done on each of the five farms that made up the Mount Vernon estate, as well as information on the food being delivered to each farm, the weather on each day, the number of people working on each farm, and explanations for why certain people were not working each week.  This last category was really interesting, because it provides information on illnesses, injuries, childbirth, and how long women were out of work because they were recovering from giving birth.

Another great source was correspondence by family members other than George Washington, as well as descriptions of Mount Vernon by visitors to the plantation, that often mention those enslaved there.  In order to understand where Mount Vernon fit in the overall picture of plantations in Virginia, it was also necessary to learn about life at Monticello, Montpelier, Sabine Hall, and elsewhere in the colony/state.   

Robin Lindley: You reconstruct and put a human face on the lives of slaves at Mount Vernon—despite the virtual lack of any contemporary documents by slaves from that period. How did you deal with that challenge?

Mary V. Thompson:  Getting at the enslaved community was one of my favorite parts of this project.  I started by taking the two fullest slave lists, from 1786 and 1799, and used them to try to reconstruct families. Thankfully, these two lists enumerated the people on each of the five farms and what their work was, with the 1786 list linking mothers and their children who were too young to work, and the ages of those children.  The 1799 list did the same, but also linked women and their husbands and told where those husbands lived (whether they were on the same farm with their wives and children, lived on another of Washington’s farms, or belonged to another owner altogether, or were free men).

Comparing the two lists made it possible to start reconstructing extended, multigenerational families.  I put together a document for each of the farms, organized by family, and then, as people would be named in the work reports, the financial records, or correspondence, I would put those references in the individual records, if I was as sure as I could be that I’d found the right person.  

For most of the people, I was keeping track of such things as information about what work they were doing; references to their health; children; ways they might have made extra money; rations of food and clothing; instances of resistance; etc.

Robin Lindley: I was impressed by your description of the massive size of Mount Vernon and the number of slaves who worked there. How would you briefly describe the Mount Vernon plantation in Washington’s era in terms of area, farming, crops, forests, and number of slaves? 

Mary V. Thompson:  Mount Vernon reached an ultimate size of 8,000 acres during Washington’s lifetime. While Washington, like many plantation owners prior to the American Revolution, started out as a tobacco grower, by the late 1760s, he was making the switch from tobacco to grain and from markets in Europe to American and West Indian markets.  Much of the land was still forested after the switch in crops and markets. As I understand it, in order to keep fireplaces running on a daily basis for heating, cooking, and washing, it takes ten acres of forest to get enough trees and branches dying naturally to do those things, without the need to cut any more trees.  The largest number of enslaved people on the plantation was 317 in 1799, the last year of George Washington’s life. 

Robin Lindley: What are a few salient things you learned about Washington’s treatment of slaves? 

Mary V. Thompson: Washington was a stickler for detail and a strict disciplinarian.  He was also approachable when his enslaved workers had problems with their overseers, needed to borrow something, or someone was interested in moving from one plantation job to another that required more responsibility.  They even talked to him to clarify things, when he didn’t understand a particular problem.  

Robin Lindley: How did Washington’s military background affect his treatment of slaves and other workers?

Mary V. Thompson: Washington used the same methods to keep an eye on his army as he did on the plantation with his slaves.  He directed that both officers and overseers spend time with his soldiers and slaves, respectively; he expected regular reports from them so that he had a very good idea about how things were going and would also travel daily through his military camps and farms to catch problems before they became major issues.  He also insisted on proper medical care for both soldiers and slaves and was a strict disciplinarian in both situations.

Robin Lindley: How did Martha Washington see and treat slaves? It seems she was more dismissive and derogatory than her husband concerning black people.

Mary V. Thompson:  Like her husband, Martha Washington tended to doubt the trustworthiness of the enslaved people at Mount Vernon.  Upon learning of the death of an enslaved child with whom her niece was close, she wrote that the younger woman should “not find in him much loss,” because “the Blacks are so bad in th[e]ir nature that they have not the least grat[i]tude for the kindness that may be sh[o]wed them.”  

The Washingtons never seemed to realize that they only knew Africans and African-Americans as people who were enslaved, which meant that they were not interacting as equals and any ideas they may have had about innate qualities of this different culture were tainted by the institution of slavery.

Robin Lindley: I realize that direct evidence from slaves is limited, but what did you learn about how slaves viewed George Washington? 

Mary V. Thompson:  Because Washington was so admired by his contemporaries, many of whom came to Mount Vernon to see his home—and especially his tomb—those visitors often talked with the slaves and formerly enslaved people on the plantation in order to learn snippets about what the private George Washington was like. 

Members of the extended Washington family, former neighbors, official guests, and journalists often wrote about their experiences at Mount Vernon and what they learned about Washington from those enslaved by him. Some people were still angry about how they were treated, while others were grateful for having been freed by him.

Robin Lindley: In his early years as a plantation owner, Washington—like most slave owners—saw his slaves as his property and he bought and sold slaves with seeming indifference to the cruelty and unfairness of this institution. He broke up slave marriages and families, and he considered black people indolent and intellectually inferior. However, as you detail, his views evolved. How do you see the arc of Washington’s life in terms of how he viewed his slaves and slavery?

Mary V. Thompson: That change primarily happened during the American Revolution.  Washington took command of the American Army in mid-1775.  Within three years, he was confiding to a cousin, who was managing Mount Vernon for him, that he no longer wanted to be a slave owner.  In those years, Washington was spending long periods of time in parts of the country where agriculture was successfully practiced without slave labor and he saw black soldiers fighting alongside white ones. He also could see the hypocrisy of fighting for liberty and freedom, while keeping others enslaved.  There were even younger officers on his staff who supported abolition.  

While he came to believe that slavery was something he wanted nothing more to do with, it was one thing to think that slavery was wrong, and something else again to figure out what to do to remedy the situation.  For example, it was not until 1782 that Virginia made it possible for individual slave owners to manumit their slaves without going through the state legislature.  After an 8-year absence from home, during which he took no salary, Washington also faced legal and financial issues that would hamper his ability to free the Mount Vernon slaves.

Robin Lindley: Many readers are familiar with the story of Thomas Jefferson and Sally Hemings. Did you find any evidence that George Washington had intimate relationships with any of his slaves or any free blacks?  

Mary V. Thompson:  Not really. As a young officer on the frontier during the French and Indian War, one of his brother officers wrote a letter, teasing him about his relationship with a woman described as “M’s Nel.”  The wording suggests several possibilities: she might have been a barmaid working for a tavern owner or pimp, whose first initial was M; another possibility is that she was the mistress of a brother officer; or perhaps that she was enslaved to another person.  With the minimal evidence that survives, there are many unanswered questions about this mystery woman.

The oral history of an enslaved family at Bushfield, the home of Washington’s younger brother, John Augustine Washington, alleges that George Washington was the father of a young male slave named West Ford, who was born in Westmoreland County, Virginia, roughly 95 miles from Mount Vernon, about a year or two after the American Revolution.  Here, the surviving documentary evidence contradicts the oral history, indicating that Ford’s father was someone in the Bushfield branch of the family.

Robin Lindley: What struck you particularly about the working conditions for slaves at Mount Vernon and how did they compare to conditions at other plantations?

Mary V. Thompson: As was true on other Virginia plantations in the eighteenth century, the enslaved labor force at Mount Vernon worked from dawn to dusk six days per week, with the exception of four days off for Christmas, two days each off for Easter and Pentecost, and every Sunday throughout the year. Because Easter and Pentecost took place on Sunday, which was already a day off, the slaves were given an additional day off on the Monday following the religious holiday. If they were required to work on a holiday, there is considerable evidence that they were paid for their time on those days.

Robin Lindley: What are a few things you’d like readers to know about the living conditions of slaves at Mount Vernon?

Mary V. Thompson:  Most of the enslaved residents at Mount Vernon lived in wooden cabins—the smaller ones served as homes for one family, while the larger “duplexes” housed two families, separated by a fireplace wall.  

The majority of Americans at this period, free and enslaved, lived in very small quarters. A comparison of the cabins used by enslaved overseers and their families at two of the Mount Vernon farms with the overseer's cabin on a plantation in Richmond County shows that the two at Mount Vernon had a total living space of 640 square feet, while the other had 480 square feet.

The homes of 75% of middle-class white farmers in the southwestern part of Virginia in 1785 were wooden cabins ranging from 394 to 640 square feet. Our visitors tend to be very surprised to learn that the entire average Virginia home for middle-class or poor families in the eighteenth century would fit easily into just "the New Room," the first room they enter in the Mount Vernon mansion. In other words, pretty much everyone was on the poor end of the scale, unless they were like the Washingtons, the Custises, or the Carters.  

Robin Lindley: I was surprised that some of the Mount Vernon slaves were literate. I had thought that education of slaves was illegal then. 

Mary V. Thompson: There were no restrictions on teaching slaves to read in eighteenth-century Virginia, and, in fact, it might have been a useful skill, especially for slaves working in more of a business capacity than in agricultural labor. It was not until after a slave revolt known as Gabriel's Rebellion (1800) that the state passed a law forbidding enslaved people to gather together in order to learn to read. At least one historian has suggested that between 15 and 20 percent of slaves could read in the 18th century.

Robin Lindley: You found evidence that many slaves were aware of African lore and practices—at times from stories passed down through generations and at times from black people more recently arrived from Africa. What are some things you learned about African influences?

Mary V. Thompson:  African influence can be seen in everything from naming practices within families, to family lore and folk tales told to children, the languages spoken in the quarters, religious beliefs and practices, and even some of the food and cooking traditions.

Robin Lindley: You note that slaves were punished physically at Mount Vernon and that even Washington at times applied the lash. What did you find about forms of punishment at the plantation?

Mary V. Thompson: One of the changes on the plantation after the war, recorded by Washington's secretary Tobias Lear, was that his employer was trying to put limits on the physical punishment doled out to the enslaved. According to Lear, Washington wrote that no one was to be punished unless there was an investigation into the case and "the defendant found guilty of some bad deed." After the war, Washington also tried to use more positive reinforcement, instead of punishment, in order to get the sort of behavior he wanted. Those positive reinforcements included such things as the chance to get a better job, monetary rewards, or even better-quality clothing.

Robin Lindley: What happened to slaves at Mount Vernon who escaped and were recaptured? 

Mary V. Thompson: It would depend on the circumstances and how difficult it was to get them back. Some people might run away briefly because of a conflict with someone else in the quarters, or with an overseer, and needed a breather to let the situation cool off. Others might have left to visit relatives on another plantation. If they were not gone long and came back on their own, there might be little punishment. In other cases, if someone continually ran away or was involved in petty crimes, they might be punished physically or even sold away.  

We know of at least one slave who was sold to another plantation in Virginia after running away four times in five years; three occasions when George Washington sold a person to the West Indies, something many people today consider akin to a death sentence; and one case where a young man at Mount Vernon—and his parents—were told that he would be sold there as well if he didn't start exhibiting better behavior.

Robin Lindley: Did you find examples of slave resistance?

Mary V. Thompson: Yes, many. When people today think of resistance, most probably think of things like running away, physically fighting back against an overseer, stealing something to eat, or poisoning someone in the big house. Not everyone was brave enough or desperate enough to do something so easily detectable. They might well have tried something less obvious, like slowing down the pace of work, procrastinating on finishing a particular job, or even pretending to be sick or pregnant.  

Robin Lindley: Oney Judge Staines was a Mount Vernon slave who escaped to New Hampshire a few years before Washington died. Washington was angry and vigorously sought her return, but was unsuccessful. Did you find new information on this fascinating case?

Mary V. Thompson: It wasn't exactly new information, but the fact that this young woman was one of the "dower slaves" from the estate of Martha Washington's first husband meant that Martha did not own her or any of the others, but only had the use of them (and any offspring they had) until her death. George Washington would lose access to those slaves upon Martha's death, when the dower slaves would be divided among the heirs of her first husband, in this case her four Custis grandchildren.

According to a Virginia law at the time, if any dower slave from that state was taken to another state, without the permission of the heirs—or presumably the guardian of those heirs if they were minors—then the heirs or the guardian acting on their behalf would be entitled to take the entire estate immediately, without having to wait for the death of either the husband or wife. Oney’s escape may well have threatened the entire Custis estate.

Robin Lindley: You note that Washington was the only slave-owning Founder who freed all of his slaves in his will. You also note that he seemed circumspect and perhaps ashamed about owning slaves later in his life. Did he ever speak out publicly for the abolition of slavery in his lifetime?

Mary V. Thompson: It depends on what a person means by "publicly." Washington corresponded with quite a few abolitionists, both British and American, after the Revolution. In response to those people who were pushing him to emancipate those he held in bondage, Washington typically responded that he thought the only legitimate way to do that was through a gradual process of manumission, much like the northern states were setting up. He noted that he would always vote to forward such a plan; however, he never stood in front of a legislative body as a proponent of a plan like that.  

Robin Lindley: What do you hope readers take from your groundbreaking book? 

Mary V. Thompson:  I would like people to understand that slavery in eighteenth-century Virginia differed from the same institution in both the seventeenth and nineteenth centuries, and that it was a complex institution.  For example, there were people at Mount Vernon who were free, hired, indentured, and enslaved.  They came from many countries and cultures on two continents, represented a variety of both European and African religious traditions, and began their relationships speaking many different languages.  

Robin Lindley: It's a complicated story. Thank you very much for your thoughtful comments, Ms. Thompson, and congratulations on your illuminating book on the Father of the Country and enslavement on his plantation. 

 

When the Western Wall Was A Battleground For Jewish Rights

 

Moshe Phillips reviews this book as part of Herut North America's Zionist History Book of the Month series.

 

The book The Western Wall Wars (Whirlwind Press, 2019) details the stories of the young men who, from 1930 to 1947, violated British regulations banning the sounding of the shofar at the Western Wall at the conclusion of Yom Kippur services each year.

 

Moshe Zvi Segal was the first of these young men, and he was arrested for sounding the shofar. He blazed a path for young Zionist revolutionaries to follow in what became the longest-running operation in the history of the Zionist underground.

 

Rabbi Moshe Segal (1904-1985) was the quintessential Zionist rebel: a key figure in the histories of Betar, Brit HaBiryonim, Irgun, LEHI (Stern Group), and Haganah, and the founder of the Brit HaShmonaim religious youth movement. All of these organizations were part of the movement initiated by Zev Jabotinsky (1880-1940), the greatest pre-World War Two Zionist leader after Theodor Herzl. Segal himself was a close comrade of Yitzhak Shamir when the future prime minister was a 1940s commander of LEHI.

 

Author Zev Golan knew Rabbi Segal personally and interviewed him many times in addition to attending his lectures and translating his writings. Golan is one of only a handful of Americans who made it their business to seek out the aging heroes of the Irgun and LEHI and to get to know them, their stories, and the ideas that animated their deeds while they were still alive.

 

The Western Wall Wars is subtitled How the Wailing Wall Became the Heroic Wall and is a direct result of Golan's relationship with Segal. In 1930, Segal was the first individual to violate the British regulations against the sounding of the shofar at the Western Wall at the conclusion of the Yom Kippur service. Until 1947, a volunteer from the Irgun, Betar, or the Brit HaShmonaim sounded the shofar every year, often after receiving personal training from Segal in both the mitzvah of the shofar and how to elude the British police. The British authorities went to great lengths to stop the shofar from being sounded. British efforts to stop Jews from performing a mitzvah will probably seem impossible for today's readers to fathom, and that is just one of the reasons this book is so important.

 

The book explains how Segal and the others who followed in his footsteps transformed the Western Wall from a site of wailing to one of national pride. The book reveals the details of the actual operations at the Western Wall and the full stories of the volunteers who were arrested, escaped from prison, and/or deported to prisons in Africa. Some were involved in the 1946 Irgun attack on the King David Hotel and other Irgun or LEHI operations. Many later fought in Israel's wars. The Western Wall Wars also covers Arab attempts during the 1920s to drive the Jews from the Western Wall and the Jewish response to the Arab effort. Segal was a leader of the opposition in this area as well.

 

The pre-1940 period, when the Jewish underground was emerging, was a time when the Jabotinsky movement suffered the slings and arrows of the leftist establishment and bravely soldiered on. The light of history has shown that the stances of the Jabotinsky Zionists were correct. If Jabotinsky had been more successful, perhaps the tragedies of the Holocaust and the loss of life in the 1948 war could have been lessened. Progressive historians have always downplayed, and often completely erased, the role of Jabotinsky's movement in their histories of Zionism. This book helps to preserve authentic history, and that is a highly praiseworthy thing.

 

The story of the Zionist underground in the pre-state period told here also helps the reader to understand the ideology that guided these warriors as they fought for Jewish rights and rebelled against the British Empire.

 

And this is no small thing. The ideology of Jabotinsky, Rabbi Segal and their comrades is just as instructive and relevant now as it was many decades ago--probably more so.

 

Now that the Jewish People possess a sovereign Jewish State, the concept of just what a Jewish State should rightly be is of vital importance. Avrum Burg, a former Speaker of the Knesset who was also a former chairman of both the World Zionist Organization and Jewish Agency, said "To define the state of Israel as a Jewish state is the key to its end," in a June 2007 interview with Israel's Haaretz newspaper. Now, we live in a time when many radical Jewish organizations in the U.S. struggle to redefine Israel as something other than a Jewish State.

  

For today's Zionists to be truly successful in a way that transcends politics and elections, in a nation-transforming way, we must reevaluate the philosophy of the heroes who fought for Israel's freedom and for Jewish rights in Jerusalem. These heroes were not only the ideological heirs of Jabotinsky but the champions who brought Jabotinsky's deepest hopes into reality.

The Cost of Loyalty: A Crisis of Ethics in the Military

 

At a time when faith in our institutions is precarious, and our old dependence on norms has proven to be misguided at best, Americans take comfort in believing that the backbone of our country's security—the U.S. military—is, well, secure. 

It's necessary to think this way because the alternative is just too disturbing to consider. Yet, from the U.S. Military Academy, known as West Point, which its military administrators assure us, with no hint of humility or data, is "the world's preeminent leadership institution," I'd like to sound the alarm. Our faith in military leaders and their progeny is dangerously misplaced. 

The Afghanistan Papers, more than 600 recorded interviews of military and civilian leaders obtained by the Washington Post after years of litigation against the government, showed that, according to former lieutenant general Douglas Lute, "we didn't have the foggiest notion of what we were undertaking" in a war that started more than 18 years ago.

Now 75 years since America’s last victory in a major war, one led by a general, Eisenhower, who was born in 1890, Americans are numb to the military’s strategic failures: bombing to pulp the cities of North Korea, the war of attrition in Vietnam, the wayward invasion and dismal occupation of Iraq. When President Trump and Congress increase the military budget, the near worship expressed by many Americans for the military prevents discussion about what is necessary for self-defense and what is wasted on the self-aggrandizement of generals and admirals and military contractors. 

Recent policies announced by the U.S. military regarding a redeployment of landmines at the discretion of commanders and the placement of low-yield nuclear warheads on submarines should make us shudder. From outside, the military may appear to be an undifferentiated mass of strong-jawed soldiers cut from a sturdier fabric than the rest of us, people who could be trusted with such massive responsibilities. It's a myth. As I've described recently, the military does not have the capacity necessary to achieve success in a modern world. These are average men and women trained within an inch of their nature at the U.S. military academies. And I've seen these people up close. I've answered to them. I've been blackballed by them, retaliated against by them, and will likely be attacked by them again simply for speaking up. At West Point, the headwaters that feed the armed forces, the values of loyalty and conformity have blasted away any sense of ethics. When fealty to anyone of higher rank becomes the norm—and the essential component of careerism—truth will dissipate quickly. 

The military is in trouble because the schools feeding it have long been breeding grounds for its worst tendencies. One 2017 study by researchers at West Point shows that the intellectual core of the Army and West Point has been eroding for the past 75 years.

Most Americans still consider the military institution to be an invincible force. Even though, led mainly by generals who graduated from West Point, it snatched a stalemate from the jaws of victory in Korea, got 58,000 young American men killed in Vietnam, and has run Iraq and Afghanistan into the ground, we continue to worship at the altar of military infallibility. Its leaders are driven by self-regard, its soldiers by self-preservation, and all are conditioned to protect the institution, not us.

From the first day of Beast Barracks, a summer training and indoctrination ritual, new cadets at West Point learn that loyalty to each other is the preeminent value, high above all the others, including truth. Of course, loyalty is important—we all understand the need for it, and how war requires unity forged in this trust. But in reality, loyalty doesn't look like it does in the movies—an exhausted soldier dodging enemy fire to save his comrade. It takes the form of hiding a breach from a sanctioned "outsider," protecting a superior from the suspicion of an inspector general, lying on a casualty report, or falsifying records to evade criticism. The value being cultivated by the military ethos at the academies and in theaters across the world goes by a different word for the rest of us: deceit.

And the military knows it. Leonard Wong and Stephen Gerras, former colonels now at the Army War College, wrote in "Lying to Ourselves: Dishonesty in the Army Profession," a 2015 study describing a culture devoid of integrity, that "'white' lies and 'innocent' mistruths have become so commonplace in the U.S. Army that there is often no ethical angst, no deep soul-searching, and no righteous outrage when examples of routine dishonesty are encountered." Rather, "mutually agreed deception exists in the Army because many decisions to lie, cheat, or steal are simply no longer viewed as ethical choices." 

The Afghanistan Papers illustrate a military command and civilian bureaucracy in almost unanimous agreement that the strategies employed in the war in Afghanistan have been futile. But not a single official in a position to know ever spoke up. This is not a coincidence. This kind of deception is not a bug of the military hierarchy. It is a feature.

The problems coming from inside the U.S. military will be difficult to solve for two reasons. First, Americans’ trust in the military is akin to religious worship, so we don’t engage in any real oversight. Second, the military itself has no incentives for admitting faults in order to rectify problems, despite a continuing cascade of losses and harm to soldiers at home. “We’re creating an environment where everything is too rosy because everyone is afraid to paint the true picture,” an army officer said in the Wong-Gerras study. “You just wonder where it will break, when it will fall apart.”

The regal military "chain of command," a relic in 2020, has become, in the hands of military officers, a device to coerce blind, unquestioning loyalty at the expense of truth. This has spawned silence among generals when their speaking up could prevent America from engaging in almost perpetual warfare. 

They could best fulfill their constitutional obligations by telling us the truth. But, alas, they weren’t trained for that. 

Though evidence amasses—through scandals, leaks, facts on the ground, and decades of failure—Americans stubbornly hold faith in the U.S. military. A staggering 80% of Americans believe our military will always act in our country's best interest. Is it because, were we to consider the evidence, it would be hard to sleep at night? 

"Faster" Author Neal Bascomb on the Jewish Auto Racer who Defied Hitler

 

Neal Bascomb has written many works of nonfiction, digging into historical episodes to find stories of perseverance and triumph over adversity. The research for his latest book involved unearthing a forgotten challenge to Nazi doctrines of racial supremacy by a Jewish driver pushed out of racing by European antisemitism, and his unlikely allies. It also involved high-speed laps in a rare vintage Delahaye race car. He shares insight into the process of research and writing here. 

 

How did you come to write about Rene Dreyfus and his once-famous Delahaye?

Book ideas originate from many places. Sometimes you're out fishing for ideas; other times you come across a vignette in a history that you believe could play out on a much larger scale. And on the rare occasion, one drops in your lap like a gift from heaven. This was the latter, courtesy of my good friend and talented Wall Street Journal columnist Sam Walker. Four years ago, while in New York visiting his family, he passed along to me a small news article about classic car collector Peter Mullin, who had just premiered his latest gem, called the Million Franc Delahaye. There was an intriguing story behind its genesis. According to the piece, the French-made car had been produced to take on the fearsome German Silver Arrows before the outbreak of World War II; its creation was financed by an American heiress named Lucy Schell; and, as if this were not epic enough, the Delahaye was piloted by Jewish driver Rene Dreyfus. It did not take a genius to know this was a remarkable David-beats-Goliath sports story, perhaps even better than Jesse Owens at the Berlin Olympics. Instead of running, we had race cars. Even better, Hitler sought to destroy the Delahaye when he invaded France, and the car was disassembled and hidden to avoid discovery. After years of restoration efforts, Mullin had brought the car back to its former glory. Even after focusing on this story for three years, I still get excited thinking about it.

 

Beyond what sounds like a compelling narrative, why does this history of a race car matter to today’s reader?

In building the Silver Arrow race cars, Hitler wanted to prove the superiority of the German nation—yes, in motor sport, but also in terms of its engineering, technological, and economic prowess. These bullets on wheels were nationalistic symbols (in some ways of capitalism versus fascism). The same game continues to play out today. One sees it most pervasively in China, through its high-speed trains, its skyscrapers, its leaps in artificial intelligence and supercomputers, and even its "Belt and Road" initiative spanning the globe. Not only must they dominate, but they need to be seen to dominate above all others, past and present. The battle on the Grand Prix circuit was an early predecessor of this, and the attempt by Hitler to erase the history of Germany's defeat at the hands of Dreyfus, Schell, and Delahaye (by destroying the records of the 1938 season and seeking out the car itself for the same) has echoes today—and throughout time—as well. It is not much different when you think of ISIS trying to eliminate any vestige of Shiite mosques, tombs, and shrines in Iraq. Or the mortaring of the famed Mostar bridge during the Bosnian war. It is rewriting history by attempting to erase it. That is why the resurrection of the Delahaye to its former glory today resonates so deeply. 

 

Have you always been a car fanatic?

Truth be told, no. As a kid, my grandmother owned a 1968 red Mustang convertible, which I was in absolute awe of, but my fascination waned, not least because when I was old enough to drive, my experience was limited to utilitarian snorers, including a Pontiac Sunbird, Isuzu Impulse, and Suzuki Grand Vitara. Cubic capacities, the differences between a turbo and supercharged engine, and the dynamics of suspension systems—all were lost on me. That said, when I first visited the Petersen Museum in Los Angeles to meet Peter Mullin, everything changed. There was a special exhibit on Bugattis, and I simply could not believe the beauty and refinement of these masterpieces of design. Then, when I first got a chance to ride in some of these classics, to feel their power and the thrum of their engines, I was lost to this world. Now I frequent classic car sites like bringatrailer.com and revel in the weekly “My Ride” dispatches from the Wall Street Journal’s A.J. Baime, to name just a few distractions. One day soon, I intend to get my hands on a ’68 convertible too!

 

Why have we never heard of the name Lucy Schell?

Over the past two years, the New York Times has been running a spectacular project called "Overlooked." Originated by Amisha Padnani, it features the obituaries of remarkable women the paper-of-record overlooked on their deaths, ranging from the first American woman to claim an Olympic championship to a literary star of the Harlem Renaissance. Surely the faded memory of Lucy Schell suffered from the same discrimination and prejudice. The fact is Lucy was a true path-breaker who lived an incredible life. A nurse in WWI, she went on to become one of the first speed queens. For almost half a decade, she was one of the best Monte Carlo rallyers, man or woman. She was surely the top-ranked American. Then she was the first woman to start her own Grand Prix race car team. She helped win the famed Million Franc prize for Delahaye, and it was her car—and her driver Rene Dreyfus—who beat the German Silver Arrows in an epic race before the war. One of my proudest accomplishments with Faster is restoring her rightful position in racing history!

 

What was the most challenging aspect of the research?

On previous books, there was often a world of publicly available primary research. Sometimes I had to dig for weeks or months among obscure files to discover what I needed, but a picture ID and a fair travel budget usually sufficed to obtain entry (except in Russia!). In the automotive world, private collectors often corner the market on archival material, including company documents, interviews, photographs, and personal papers. At first, I was rebuffed by these collectors, likely because I was seen as an interloper (and not French enough when it came to Delahayes). Charm offensives and a lot of follow-up finally gained me access to many treasures scattered about the globe, whether found in a sprawling French farmhouse, a cluttered Seattle garage, or a storybook English manor, among other places. Fortunately, the reward was never-before-heard interviews with Rene Dreyfus, personal histories of Lucy Schell, grainy video footage of 1930s Grand Prix racing, and rarely seen blueprints and production figures from Delahaye. The generosity of these collectors, not to mention the exquisite archives at the REVS Institute and Daimler-Benz, has allowed me to tell this remarkable story in what I hope is a visceral, edge-of-your-seat way.

 

What was most memorable about writing Faster?

The first I chronicle in the introduction: zooming through the orange groves of California in the 1938 Delahaye 145 at speeds that still make me tremble. Second to that would be my journey to France, Germany, and Monaco. It's one thing to watch a race on the Nurburgring or through Monte Carlo; it's another to drive these same stretches, then walk them on foot. I wanted to know every turn and dip in an attempt to get some sense of the challenges of racing these courses at hundreds of miles an hour, amid a crowded field of other cars that could vault off the road at any minute (or directly into you). Of special note was a private tour of Montlhery outside Paris. We raced around the oval autodrome, and I truly got the sense of what it was like to round the banked curves, feeling like a fly stuck on a wall but in danger of slipping off at any moment. It was deliciously frightening, and the experience of a lifetime. 

 

Golda Meir and Women's Wartime Leadership

 

A CNN poll at the time of the New Hampshire primary revealed that 30 percent of voters believed a female Democratic nominee would have a harder time beating Donald Trump than a male nominee would. As the number of women in the field dwindled from six, to two, to zero, in an election cycle that had been widely discussed as a possible referendum on women’s empowerment, a question emerged, generally implicit but still present:  will voters trust a woman to lead the nation in time of war?

 

Whether a woman could wield ultimate power in a field jealously guarded by men for centuries was answered, though not definitively, in 2016. Nearly 66 million citizens voted for Hillary Clinton to be their military commander-in-chief, three million more than voted to give that authority to Donald Trump. Matters of war are certainly not the only election issues. But the absence of a woman war leader is remarkable for a country that has stood on democracy's cutting edge since its founding and has been engaged for decades in nearly continuous armed conflicts abroad—while many other countries have looked to women for wartime leadership.

 

Among modern leaders, none played for higher stakes than a Jewish grandmother named Golda Meir. She helped coordinate Jerusalem's defense during Israel's war of independence in 1948, served as foreign minister during the 1956 Suez War, and informally advised the government during the 1967 Six-Day War. Two years later, she became Israel's first female prime minister. In a country in a constant state of war, she received many of the sort of middle-of-the-night calls that were portrayed prominently and ominously in American campaign commercials in 2008.

 

On October 6, 1973, one of those calls informed her that Syria and Egypt would be launching a full-scale attack that very day—Yom Kippur, the highest of holy days. Golda's military advisors recommended a strategy that had proved so successful in the past: a preemptive air strike against Syrian and Egyptian airfields. Years of war had hardened Golda to the reality of life surrounded by enemies. "Are we supposed to sit here with our hands folded, praying and murmuring, 'Let's hope that nothing happens?'" she asked one interviewer. "Praying doesn't help. What helps is to counterattack. With all possible means, including means that we don't necessarily like."

 

Yet as a statesman, Golda also had to focus beyond the war's opening salvos. Israel depended upon the United States for ammunition, aircraft, and replacement parts, and a first strike by Israel would jeopardize relations with its superpower patron. Haggard from stress, she forbade her commanders from launching a preemptive strike. For the war's first few days, Israel's survival hung in the balance. Its army lost a quarter of its tanks and an eighth of its fighter-bombers. Israel's defense line in the Sinai began cracking, and Syrian troops made inroads along the Golan Heights. Should either attacker have broken through Israel's thin green line, its heartland would have been wide open to attack. 

 

Golda spent several anxious days playing the parts of strategist, diplomat, and cheerleader. Chain-smoking and gulping a gallon of coffee each day, the 75-year-old woman stiffened the badly shaken spirits of her defense minister, the veteran general Moshe Dayan. When Dayan asked Golda for permission to assemble Israel's nuclear weapons, Golda refused. The battle would be fought in the Sinai and Golan Heights, but the war would be won—or lost—by the support of Israel's friends. There would be no nuclear option. Shadowing Golda as she reviewed military reports with Dayan, the defense editor of the Israeli newspaper Haaretz wrote, "It was strange to see a warrior of seven campaigns and brilliant past chief of staff of the IDF bringing clearly operational subjects to a Jewish grandmother for decision."

 

With U.S. supplies and Israel's reserves taking the field, the tide of battle turned. Israeli troops threw back the Syrians on the Golan Heights and surrounded an Egyptian army in the Sinai. On October 29, less than a month after the war began, Israeli commanders met their Egyptian counterparts under a tent stretched across the guns of four parked tanks. They negotiated a withdrawal of Egyptian forces on terms that permitted Egypt's president Anwar Sadat to save face and, in time, negotiate a lasting peace with Israel.

 

Golda's was just one chapter in a rich history of women leading their nations in wartime. In ancient times, Egypt's Cleopatra and Tomyris of the Massagetae led their nations against the superpowers of their day. In the Middle Ages, Queen Manduhai of the Mongols reclaimed a chunk of Genghis Khan's Empire about the time that Marguerite d'Anjou, wife of England's King Henry VI, rallied the Lancaster faction in the Wars of the Roses. Elizabeth Tudor turned back the Spanish Armada, Queen Njinga ravaged Portugal's slave-trading outposts in Angola, and Catherine the Great led her nation to victory against the Ottoman Empire, Sweden and Poland. In the Age of Democracy, Indira Gandhi resorted to war to solve a humanitarian crisis in Bangladesh, and Margaret Thatcher ejected Argentine invaders from the Falkland Islands.

 

These examples suggest sex is no bar to effective political leadership of a country at war, or to leadership over military forces. Indeed, comparatively few male American presidents have boasted high-level military expertise when confronted with a decision to go to war. Our historical bias in favor of male leaders in wartime will come to a close, sooner or later, and the next generation of conflicts may find the United States led, capably and competently, by a commandress-in-chief. 

 

Jonathan W. Jordan and Emily Anne Jordan are co-authors of the book, THE WAR QUEENS: Extraordinary Women Who Ruled the Battlefield (Diversion Books, March 10, 2020).

 

I Taught my Students to Create Hoaxes in Public. Here’s Why

 

Two hoaxes surfaced on the Internet in late November and early December 2019. The hoaxes drew interest and comment, prompted clicks and reposts, and fooled both the unsuspecting and the highly trained. The hoaxes did not spark widespread panic or promise essential health cures; they were not created by marketing bots or Russian spies. The hoaxes were designed, planned, and launched by my undergraduate students at the University of Utah as part of their preparation to become savvy citizens in the twenty-first century.

 

One hoax announced the “discovery” of a wedding ring in Frisco, Utah, bearing the inscription “Etta Place - Chubut, 1904.” A photograph of the ring was enough to excite Facebook users to fill in the gaps. “Etta Place was the girlfriend of Henry Longbaugh aka the Sundance Kid,” wrote one. “Chubut is the province in Argentina where the couple settled,” added another. “I thought I knew pretty much all of the legends of Butch and his companions,” observed a third, “I’m looking forward to the rest of the story.” This ruse met its match in a museum professional who pointed out that the dirt in the photo did not reflect the composition of soil from southern Utah and that a gold ring would not have fractured like the tungsten ring in the photo.

 

The second hoax "uncovered" a document in the university archive with implications for a collegiate rivalry. A first-year graduate student "found" an architectural sketch indicating that the statue of Brigham Young on his namesake campus in Provo was originally intended to fill a pedestal that remains empty to this day on the campus of the University of Utah. Then an ardent "fan" created an online petition calling for the return of the statue, complete with a Twitter hashtag: #BringBrighamBack! BYU fans seemed amused, while a Utah fan retorted, "Please keep the name and the statue!" One reader knowingly stated that he'd "already heard" this as an urban myth and was glad the answer was finally found in the archives. A professor from one of the schools read the petition, clicked through to the faked study, and then reposted it.

 

Why teach students to create hoaxes? First and foremost, we need citizens with the skills and perspective to survive in “the golden age of hoaxes.” Photoshopped images and deep faked videos go viral on social media, cable channels air documentaries about mermaids and monsters, celebrities become politicians, and politicians call the media “fake.” And yet, what circulates so quickly on the Internet today also circulated in other forms in the past—forged diaries and documents, a mermaid body washed up on shore, photographs of fairies or Lincoln’s ghost, and tales of life on the moon.

 

Teaching about hoaxes also represents good professional practice. I taught the course as a night class at a university, but by day I am the director of an archive/research library where I am responsible both to share information and preserve its integrity for perpetuity. My library joins the Smithsonian among the countless victims of the most notorious murderer-forger in history. We follow the best practices of the archival profession for providing secure access in our reading room. Our IT professionals constantly monitor for phishing, social engineering, or outright hacking. One way to protect the historical record is to understand how it can be stolen, manipulated, modified, or erased.

 

Teaching with hoaxes also presented an opportunity for engaging pedagogy. In designing the course, I took cues from psychologists who examine abnormal mental dysfunction to promote better mental health, from police officers who experience tasers and K-9 takedowns before being authorized to use them, and from athletes who run the plays of opponents in order to beat them. If we are to avoid being fooled by bad information, we must understand how it works. How better to understand how it works than by creating our own hoaxes? We began by exploring the history and methods of past hoaxers, frauds, and forgers. We sought usable lessons by developing skills in the historical method to identify and debunk false claims and by seeking to understand how digital misinformation and disinformation circulate today.

 

The students succeeded in creating hoaxes that fooled some in their target audiences, but our creations faced stiff competition on the worldwide web of misinformation. Both hoaxes ran for about two weeks before we came clean. Those weeks also witnessed presidential impeachment hearings by the House Judiciary Committee, a rumor about the height of Disney’s most famous animated snowman, and a Facebook hoax about women being abducted in white vans. One of the takeaways noted by my students was that it was hard to get people’s attention in an environment in which they are constantly bombarded by misinformation.  

The experience also prompted increased respect for actual professional expertise. The group working on the fake ring explicitly targeted baby boomers, thinking they’d be an easy target. That intergenerational spite dissipated after their best effort was so effortlessly debunked by someone with more experience.

 

Other lessons learned came out of the ongoing classroom discussions about the ethics of our activities. After all, it’s not every day that one gets assigned to forge a historical document, photoshop an image, create fake social media accounts, and spread deception on the Internet. We drew a line that prohibited the forgery of medical claims, soliciting money, public harm, or anything that would get any of us arrested, fired, or expelled. That left us free to experiment with information, history, and emotional appeal.

 

The discipline of history has much to offer our present age of misinformation in terms of subject matter and analytical methods. Here’s hoping for a future in which we recognize the flood of misinformation around us, acknowledge the value of expertise, and develop the thinking skills necessary to survive our century. 

Trump's Budget Proposal Reveals His Values

 

It is often said that government budgets are “an expression of values.”  Those values are clear in the Trump administration’s $4.8 trillion budget proposal for 2021, unveiled early this February.

The budget calls for deep cuts in major U.S. government programs, especially those protecting public health. The Department of Health and Human Services would be slashed by 10 percent, while the Centers for Disease Control and Prevention, which has already been proven to be underfunded and unprepared to deal with the coronavirus outbreak, would be cut by a further 9 percent. Spending on Medicaid, which currently insures healthcare for one out of five Americans, would plummet by roughly $900 billion, largely thanks to reductions in coverage for the poor and the disabled. Meanwhile, Medicare expenditures would drop by roughly $500 billion. The budget proposal’s reshuffling of agency responsibilities in connection with tobacco regulation also seems likely to contribute to a decline in public health.

 

Public education constitutes another low-priority item in the Trump administration's budget proposal. Calling for a funding cut of nearly 8 percent in the Department of Education, the proposal hits student assistance programs particularly hard. Despite the soaring costs of a college education, the budget would eliminate subsidized federal student loans and end the Public Service Loan Forgiveness program, which currently cancels federal student loan debt for teachers and other public servants after a decade of loan payments. The budget would also reduce student work-study funding and increase the percentage of discretionary income student borrowers must devote to repayment.

 

Some of the deepest cuts in the Trump budget relate to the environment. The Environmental Protection Agency would lose 26 percent of its funding, including a 10 percent reduction in the Superfund hazardous waste cleanup program, a nearly 50 percent reduction in research and development, and a $376 million decrease in efforts to improve air quality. EPA staffing would fall to its lowest levels in three decades, thereby hampering enforcement of existing environmental regulations. Moreover, there would be cuts to the National Park Service of $587 million and to the U.S. Fish and Wildlife Service of $80 million, including an $11 million reduction in funding for determining extinction risk under the Endangered Species Act. The administration's approach to the environment is also evident in the budget's call for a cut of half the funding for the ecosystem work of the U.S. Geological Survey and for a 74 percent reduction in funding for the Energy Efficiency and Renewable Energy program of the Department of Energy.

 

The budget targets other domestic programs for sharp cutbacks, as well. In the area of public transportation, Amtrak’s federal grants would be reduced from $2 billion to less than half that amount. The budget proposal also calls for ending all federal funding for the Corporation for Public Broadcasting, which supports PBS and National Public Radio stations. Although Trump has repeatedly promised not to cut Social Security, his budget would do just that, slashing it by $71 billion worth of benefits earmarked for disabled workers.

 

Programs aiding impoverished Americans come in for particularly harsh treatment. Although homelessness and securing affordable housing in urban areas are major problems in the United States, the budget calls for a 15 percent decrease in funding for the Department of Housing and Urban Development. Programs that help pay for rental assistance for low-income people would be slashed, while grants to neighborhoods with deteriorating public and federally assisted housing would be eliminated. 

 

Furthermore, the budget proposes cutting funding for the Supplemental Nutrition Assistance Program, the federal government’s primary effort to feed the hungry, by $180 billion between 2021 and 2030. Stricter work requirements would be implemented, and are expected to result in nearly 700,000 Americans being dropped from the program’s coverage. Among them are large numbers of children, who would also lose their enrollment in the free school lunch program. Explaining the cuts, the Trump budget message stated that “too many people are still missing the opportunity to move from dependence to self-sufficiency.”

 

It’s also noteworthy that, in a world plagued by wars, a massive refugee crisis, climate disasters, and disease epidemics, the Trump budget calls for slashing State Department and U.S. Agency for International Development funding by 22 percent. The biggest cuts target diplomatic engagement, food assistance, and international organizations such as the United Nations, which would receive $447 million less for UN peacekeeping efforts and $508 million less for U.S. dues to the world organization. The budget also calls for slashing more than $3 billion in funding from global health programs, including half of U.S. funding for the World Health Organization.

 

By contrast, the Trump budget proposes substantial increases for the president’s favorite programs. Additional spending would be devoted to restricting immigration, including another $2 billion for building Trump’s much-touted wall on the U.S.-Mexico border and another $544 million to hire 4,636 additional ICE enforcement officers and prosecuting attorneys. Moreover, as a New York Times analysis noted, “the budget promotes a fossil fuel ‘energy boom’ in the United States, including an increase in the production of natural gas and crude oil.”  

 

Furthermore, despite the fact that U.S. military spending already surpasses the combined expenditures of the next seven military powers throughout the world, the Trump budget would add billions of dollars to annual U.S. military appropriations, raising them to $741 billion. This military spending would focus on developing a new generation of weapons, especially nuclear weapons, with the Department of Defense receiving $29 billion (a 16 percent increase) and the Department of Energy $20 billion (a 19 percent increase) for this purpose. If one adds in proposed expenditures on “missile defense” and cleaning up nuclear weapons sites, the annual cost of U.S. nuclear war preparations would soar to $75 billion.

 

With presidential and congressional elections now looming, Americans will soon have the opportunity to show whether these priorities―and the values underlying them―accord with their own. As the coronavirus pandemic indicates, a government's priorities and values can be matters of life and death.

Churchill, Stalin and the Legacy of the Grand Alliance: An Interview with Professor Geoffrey Roberts

 

Geoffrey Roberts, Martin Folly and Oleg Rzheshevsky have recently published Churchill and Stalin: Comrades-in-Arms During the Second World War (Pen & Sword, South Yorkshire: UK, 2019). The book offers an overview—based on material from the Russian archives, including never-before-released documents—of the relationship between the two leaders in the period surrounding the Grand Alliance of the US, UK and USSR, which defeated Nazi Germany. Aaron Leonard recently exchanged emails with Professor Roberts about his research.

 

'Common knowledge'—in the West, anyway—holds that Joseph Stalin was among the most evil persons of the twentieth century, yet your book presents a picture of him as a key figure, working hand in hand with the Western icon Winston Churchill in the successful defeat of Hitler. How does one reconcile such a seeming paradox?

 

There is no paradox. They united to defeat a common foe who threatened the very existence of their states and societies. Churchill saw Stalin as by far the lesser of two evils compared to Hitler, while for Stalin a common interest in the defeat of fascism prevailed over his hostility to British capitalism and imperialism. While Churchill had been one of the main organisers of the capitalist coalition that tried to overthrow Bolshevik Soviet Russia after the First World War, in the 1930s he opposed Anglo-French appeasement of Nazi Germany and campaigned for a grand alliance of Britain, France and the Soviet Union to oppose Hitler. When negotiations for an Anglo-Soviet-French pact failed, Churchill did not like the ensuing Nazi-Soviet pact, but he understood Stalin's reasons for doing a deal with Hitler in August 1939 and believed that sooner or later Britain and the Soviet Union would be fighting alongside each other. During the period of the Nazi-Soviet pact, from August 1939 to June 1941, Stalin was wary of being prematurely dragged into the war on Britain's side but he admired Churchill's refusal to capitulate to Hitler after the fall of France in 1940.

 

You describe the relationship between Churchill and Stalin as one—in Churchill's view anyway—of 'warlord to warlord.' This struck me as a very particular word, and not one that immediately springs to mind. Could you explain why you think Churchill felt it captured their relationship?

 

By the time Churchill first met Stalin – in Moscow in August 1942 – the Soviet Union had survived the initial German onslaught, though not without the Red Army suffering millions of casualties. Like Churchill in 1940, Stalin had shown his mettle by remaining in Moscow in November 1941 when the Wehrmacht was at the gates of the Soviet capital. Stalin’s counterpart to Churchill’s ‘Finest Hour’ speech of June 1940 was his patriotic appeal to troops parading through Red Square on their way to the front on 7 November 1941. It was also clear from the two men’s correspondence with each other that they were both deeply involved in directing the British and Soviet war efforts. Both men had personal experience of war - Churchill in British imperial wars in Africa and World War One, Stalin during the Russian civil war. They were both steeped in military history, strategy and doctrine and had a penchant for military-style clothing. While Stalin described Roosevelt as a great man in both peace and war he called Churchill his comrade-in-arms.

 

What about Stalin's relationship with Roosevelt, the other prong in the troika? How did it compare with Churchill's? Put another way, between which pair was the relationship strongest?

 

You have to remember that Stalin viewed personal relationships through a political prism. To him Roosevelt was a representative of the American progressive bourgeoisie and a potential ally against Fascism, Nazism and war-mongering capitalists and imperialists. Roosevelt’s New Deal wasn’t socialist but it leaned left and FDR was seen by American communists as a part of an embryonic popular front – a perspective that became even more pronounced during the Second World War when the CPUSA dissolved into a Communist Political Association that sought affiliation with the Democratic Party. 

 

When Roosevelt became President in 1933 the Soviet Union and the United States established diplomatic relations and resolved financial disputes arising from the Bolshevik takeover in Russia in 1917. Roosevelt was a bit player in events leading to the outbreak of war in 1939 but he came to the fore as an exponent of the United States as the 'arsenal of democracy'. Stalin was impressed by Roosevelt's rapid decision to extend Lend-Lease to the USSR when the Germans attacked the Soviet Union – even though the United States had yet to enter the war – and he knew from his sources within the US government that Roosevelt battled to overcome bureaucratic and political obstacles to shipping as much aid as possible to the Soviets. Roosevelt was fortunate to have at his disposal two special envoys – Harry Hopkins and Averell Harriman (US ambassador in Moscow from 1943-1945) – who got on well with Stalin and had the Soviet leader's confidence.

 

Stalin was also a great respecter of the power that Roosevelt represented and he aspired to emulate US industrial and military might through a combination of American mass production methods and socialist economic planning. “Had I been born and brought up in America,” Stalin told the head of the American Chamber of Commerce in June 1944, “I would probably have been a businessman.” 

 

Stalin had a great deal of personal affection for FDR, which started at the Tehran summit in 1943 – a conference that entailed a long and arduous journey for the wheelchair-bound Roosevelt. The ailing US President undertook another gruelling journey when he met Churchill and Stalin at Yalta in 1945. Stalin was genuinely upset by Roosevelt’s unexpected death in April 1945 and apprehensive about the future of Soviet-American relations. Stalin was well aware of pressures in Washington for a more hard-line approach to relations with the USSR but was reassured by reports that Roosevelt’s successor, Harry S. Truman, was committed to continuing co-operation with the Soviets. In a piece for the History News Network a few years ago called Why Roosevelt was Right about Stalin I concluded that “Stalin was sincere in his commitment to collaborate with Roosevelt during the war and was sorely disappointed when that cooperation did not continue under Truman.”

 

It seems to me that while Churchill was emotionally closer to Stalin, his relationship with Roosevelt – for class, political and life-history reasons – was stronger. The same was true of Stalin’s relations with Churchill and Roosevelt: he was emotionally intimate and bonded with Churchill but trusted Roosevelt more. When the Republican politician Harold Stassen met him in April 1947, Stalin told him “I am not a propagandist, I am a man of business,” and pointed out that he and Roosevelt had never indulged in the name-calling game of “totalitarians” v. “monopoly capitalists.” Stalin was referencing not just Truman’s recent speech to Congress calling for a Free World struggle against totalitarianism but also Churchill’s “Iron Curtain” speech at Fulton, Missouri in March 1946, a clarion call for a harder line against the Soviet Union; the speech had disappointed but not surprised Stalin.

 

In reading your account, what comes through is that both Churchill and Stalin attempted to maneuver and leverage their respective positions in the alliance with charm and personality. How critical were such things when weighed against the larger historical forces shaping their decisions?

 

The answer to that question depends on how you view the role of individuals in history. Generally speaking, historians think that individuals matter, and the more important and powerful the individual, the more they matter. In The Hero in History (1943) Sidney Hook distinguished between what he called "eventful" individuals and "event-making" individuals. Eventful individuals are important because of their role in events, while event-making individuals shape and change the course of events. Hook’s key case-study of an event-making individual was Lenin’s role in 1917, when he transformed the character of the Russian Revolution and changed the course of world history. Had he published his book after World War Two, Hook could have added case-studies of Churchill and Stalin. Churchill’s role in keeping Britain fighting in 1940 was both eventful and event-making. Had he taken Britain out of the war, Hitler’s domination of Europe would have been secured, providing an even stronger springboard for his attack on the USSR in 1941, which, with Britain neutral, might well have succeeded.

 

Churchill’s immediate declaration of solidarity with the Soviet people when the Germans invaded the USSR in June 1941 was the first critical step in the formation of the grand alliance of Britain, the Soviet Union and the United States – one of the most effective war-fighting coalitions in history. Some would argue that British interests dictated an alliance with the Soviets against Hitler. That’s true, but clear only in retrospect. At the time there were those in Britain arguing for a more equivocal response to the Nazi invasion of the USSR, while others welcomed the prospect of Hitler crushing the Soviet communists. Large quantities of western military aid did not flow to the Soviet Union until 1943, but the Eastern Front was on a knife-edge in 1941-42 and every little bit helped. Important, too, in those early months of the Soviet-German war, was the positive impact on Soviet morale of the alliance with Britain, a psychological plus that was further strengthened by the US entry into the war in December 1941.

 

Another example in relation to Churchill is the impact of his personal opposition to opening a second front in France in order to relieve German pressure on the Red Army. The Soviets survived the absence of this second front in 1942 but it was a close-run thing. A second front in 1943 would have altered the military course of the war and might have had vast geopolitical consequences. Churchill had his reasons – in the book you can read for yourself the arguments between him and Stalin about this issue – and some people still think that delaying the second front until 1944 was necessary to avoid a costly failure that would have set back the allied cause. But there is no doubt about Churchill’s personal influence in being able to prevent an early second front.

 

Stalin as both an eventful and event-making individual during the war presents a paradox. On the one hand, he was a leader who brought his country to the brink of disaster by his handling of plans and preparations for war, notably by restraining mobilization in the face of the imminent German attack. While this was not Stalin’s sole responsibility, he alone had the power to change the course of events on the Soviet side. On the other hand, he was the leader who then saved his country by holding its war effort together, albeit by brutal and costly methods.

 

You describe how the preeminent capitalist powers of Britain and the US were able to collaborate with the preeminent socialist one, the Soviet Union — another paradox. What was the basis for this and to what degree did each entity need to compromise to make it work?

 

As a Marxist, Stalin believed that the material interests of Britain, the United States and the Soviet Union constituted a solid and durable basis for the anti-Hitler coalition. He also believed that the common interest in containing a resurgence of the German (and Japanese) threat meant the alliance could continue after the war. The ideological struggle between communism and capitalism would continue but in the context of peaceful coexistence and collaboration to maintain postwar security for all states. Stalin also thought that the growing popularity of communism in Europe and the global swing to the left would facilitate a peacetime grand alliance.

 

Churchill and Roosevelt had a more individualistic view of the foundations of the grand alliance – they had faith in Stalin as a moderate leader. At the same time, they perceived changes in the Soviet internal regime – a degree of convergence with western systems – which, they believed, meant that a cooperative USSR would become more open to western influence.

 

The perceptions and beliefs of all three leaders were reinforced by their experience of the grand alliance, which was a history of successful negotiation and compromise. So successful that by the end of the war the Big Three, as they came to be called, were convinced it would continue into the foreseeable future. That didn’t happen but it wasn’t for want of trying, at least on Stalin’s part. Churchill lost office in July 1945 but when he returned to power in Britain in 1951 he was an advocate of renewed negotiations with the USSR, notwithstanding his reputation as an early-adopter cold warrior. And who knows what would have happened if Roosevelt had lived a little longer. Maybe he would have been able to restrain the advocates of a tough line with Russia, as he had done during the war.

 

Looking 75 years on, what is the legacy of the ‘Grand Alliance’ today?

 

The immediate legacy of the Grand Alliance was mixed. On the one hand, it had defeated Hitler’s grab for world power and his attempt to establish a Nazi racist empire in Europe. A Europe of independent sovereign states was re-established after the war. The allied coalition, including the Soviet Union, had also fought the war under the banner of democracy, and the allied victory did indeed reverse the trend towards authoritarianism which had gathered momentum following the world economic crisis of the late 1920s and early 1930s. On the other hand, this war for democracy had largely been won by another authoritarian state – the Soviet Union – and, with the outbreak of the cold war, Stalin was quick to establish a tightly controlled communist bloc in central and eastern Europe. The postwar failure of the Grand Alliance resulted in the cold war and decades of crisis, confrontation and conflict, not just in Europe but across the globe. That cold war is over but we are still grappling with its consequences.

 

Historical memory of the Grand Alliance, and of the popular anti-fascist unity that underpinned it, remains strong, particularly in Russia, where there is a yearning for a return to great power politics based on negotiation, compromise, mutual respect and trust.

 

The Life and Times of Flamboyant Rock Music Impresario Bill Graham

Baron Wolman. Jimi Hendrix performs at Fillmore Auditorium, San Francisco, February 1, 1968.

Gelatin silver print. Iconic Images/Baron Wolman

 

You remember rock music promoter Bill Graham. Sometime in your life you read about him, saw his photo in a newspaper or magazine or went to one of the thousands of rock and roll shows he produced from coast to coast. Even if you never went to one of his rock shows, you were influenced somehow by his work. Regardless of your age, you were a witness to the legacy of Bill Graham. You may think you know all about him—friend of Mick Jagger, late night confidant of Bob Dylan, pal of Janis Joplin, discoverer of the Grateful Dead, the Jefferson Airplane and Carlos Santana.

Whether you think you know all the details of Graham’s life and career, or know little, you will be dazzled by a mammoth and wildly pleasing new exhibit about his rich life, The Rock & Roll World of Impresario Bill Graham at the New York Historical Society (running February 14-August 23).  

The fun starts when you enter and they give you a headset. Through the headset, for the one or two hours you spend at the exhibit, you listen to some of the greatest rock and roll music ever played. You go from room to room to see, chronologically, Graham’s life as a promoter.

The Grajonca Family, Berlin, ca. 1938 Gelatin silver print Collection of David and Alex Graham 

 

Did you know that he was not a Bronx native, but was born in Berlin and spirited out of the country as a child in the 1930s by his mother (who would die at Auschwitz) as the Nazis took over, enduring a dangerous train and boat ride from Berlin to New York? Did you know that when he got to Lisbon on that journey, they asked him where he wanted to go to start a new life? He had no answer. “The United States?” an official said. “Yeah, sure,” he said and shrugged his shoulders. Thus was the legend born.

Graham’s love of music and promotion intrigued NYHS CEO Louise Mirrer, but so did his childhood. “Few know about Graham’s immigrant background and New York roots. We are proud to collaborate with our colleagues at the Skirball Cultural Center to present this exhibition in New York – Graham’s first American hometown—and to highlight his local experience. His rock and roll life was a pop cultural version of the American Dream,” she said.

That dream got started when he was the young manager of a street mime show. The performers got arrested and Graham staged a concert to raise bail. That was the beginning of a 40-year music career that ended suddenly when he died in a freak helicopter crash at the age of 60.

Graham started relatively small; tickets to one of his theaters, the Fillmore East, went for just $3.50 in 1970. In the late ‘60s, you could buy a Sunday afternoon show ticket at the Fillmore West in San Francisco for $1 and get to see the Grateful Dead, AND the Jefferson Airplane AND Carlos Santana. The enterprise grew, however. Graham put up 5,000 of those now-famous wild psychedelic posters in each city where he promoted a show and told the storekeepers that displayed the posters to keep them. The walls of the exhibit are covered with these posters featuring performers and friends such as Janis Joplin, Grace Slick, Mick Jagger and the Rolling Stones, Etta James, Lenny Bruce, the Grateful Dead, Carlos Santana and Sam the Sham and the Pharaohs. Original copies of some of those posters are worth several thousand dollars today.

Sections of the exhibit tell the story of the Fillmore Auditoriums, East and West, and their short but eventful lives from 1968 to 1971, and of the Winterland Ballroom in San Francisco.

Visitors then pass through rooms telling the story of the huge, 50,000-seat arenas and upscale venues Graham booked for later shows. When Graham was looking for a theater to house a Grateful Dead concert in the 1970s, he turned down his own Fillmore, the Beacon, and Madison Square Garden to put the Dead onstage at Lincoln Center’s swanky, blue-blood Metropolitan Opera House (they sold out quickly).

Baron Wolman. B.B. King backstage at Winterland Auditorium, San Francisco, December 8, 1967. Gelatin silver print. Iconic Images/Baron Wolman

 

Graham’s social activism is also represented. Visitors can learn of a show he staged to raise money for poor children’s school lunches in California, which drew 50,000 music fans, including Willie Mays and Marlon Brando. Graham’s concerts also introduced black blues and R&B musicians like Etta James to young white fans of rock groups they had influenced. Once, B.B. King came to him and said he saw a “bunch of long haired white people” on the ticket line. “I think they booked us in the wrong place,” he said.

There is social relevance, such as Graham’s campaign to stop Ronald Reagan from visiting a cemetery in Germany where Nazi soldiers were buried, and there are touching stories, such as the hundreds of letters he was sent upon the closing of one of his venues. The exhibit also showcases Graham’s personal friendships with musicians. Sometimes these friendships were tested by business. Graham described his time managing the Grateful Dead and the Jefferson Airplane as “the longest year of my life.” There are delicious personal stories, such as his relationship with Bob Dylan. “We heard that Dylan wanted absolute quiet around him all day on the day of a show, so I ordered everybody in the theater not to talk to him. Later at night he comes to my office and says ‘why is nobody talking to me?’”

Note from Donovan to Bill Graham, San Francisco, November 1967. Offset print with inscribed ink. Collection of David and Alex Graham. Photo by Robert Wedemeyer

 

The exhibit contains a wealth of visual and physical artifacts, including stage costumes and a dozen guitars played by musicians in Graham’s orbit. And there are walls full of the famous wildly colored psychedelic posters with all the writing that nobody could understand. “The only thing anybody is going to understand in that poster is the asterisk,” Graham complained to one artist, who, of course, paid no attention to him.

Gibson Guitar Corporation. 1959 Cherry Sunburst Gibson Les Paul played by Duane Allman of the Allman Brothers Band during the live concert recording at Fillmore East, March 12–13, 1971. Collection of Galadrielle Allman. Photo by Robert Wedemeyer

 

Graham was a gregarious, flamboyant man. Actor Peter Coyote said he “was a cross between Mother Theresa and Al Capone.” The promoter always knew that in the rock music world he was working with oddballs. “I always felt that someone had to relate to reality. That someone was me.”

Ken Friedman. Bill Graham between takes during the filming of “A ’60s Reunion with Bill Graham: A Night at the Fillmore,” Fillmore Auditorium, San Francisco, 1986. Courtesy of Ken Friedman

 

And he created a marvelous reality for tens of millions of music fans.

 

 

Does Lincoln or Trump Represent the Conscience of the Republican Party?

 

Are we headed for another Civil War?

The idea might seem hard to imagine for a great many obvious reasons, and the prospect is easy to deride. And yet the years in which the “blues” and “greys” went to war can shed light upon America now. The comparison troubled me as I composed my new Lincoln biography, Summoned to Glory. It is hard to ponder Lincoln’s life and times without contemplating the condition of America today: divided, enraged, and almost coming apart at the seams.

And the comparison between Lincoln’s Republican Party and today’s Republican Party is especially troubling.

From the founding of the party through the end of the Civil War, the issue of slavery was paramount for Republicans, in spite of the diversity of motivations within their ranks. The party unified around the principle of outright emancipation by 1865 under Lincoln’s leadership. And Lincoln saw the issue as a commentary on the best and the worst of human nature--nothing less. “Slavery,” he proclaimed in 1854, “is founded in the selfishness of man’s nature, opposition to it in his love of justice. Those principles are an eternal antagonism.”

At the heart of Lincoln’s creed was the principle of egalitarian decency: the principle articulated by Jefferson in the Declaration of Independence. Lincoln urged Americans to rededicate themselves to that creed--the nation’s founding creed--and he said this again and again in the years leading up to the war. The war itself was a test to determine whether any nation so conceived and so dedicated could long endure.

He said this, of course, in the Gettysburg Address, and the words of that speech are quite famous. But he made the same point in much sharper language in the course of an earlier speech--one that he delivered on August 17, 1858 in his campaign against Stephen Douglas.

The Founding Fathers, said Lincoln, believed that nothing “stamped with the divine image and likeness” was meant to be “trodden on, and degraded, and imbruted by its fellows.” So they “erected a beacon to guide their children, and their children’s children.” They “knew the tendency of prosperity to breed tyrants, and so they established these great self-evident truths, that when in the distant future some man, some faction, some interest, should set up the doctrine that none but rich men, or none but white men, or none but Anglo-Saxon white men, were entitled to life, liberty, and the pursuit of happiness, their posterity might look up again to the Declaration of Independence and take courage to renew the battle… so that truth, and justice, and mercy… might not be extinguished from the land.”

That quotation makes poignant reading when today’s Republican Party is led by a man who takes pleasure in degrading and “imbruting” whole classes of people who, in his view, are human “scum.”

As for the proposition that only rich men, or white men, or Anglo-Saxon white men deserve life, liberty, and the pursuit of happiness, consider Trump’s actions toward Latin Americans who come here seeking those things. He wants to treat them like animals.

He vilifies undocumented immigrants at every opportunity and comes close to calling them subhuman. He even vilifies Americans who ask for humanitarian assistance--like the victims of the hurricane in Puerto Rico--if they happen to be people of color.

Lincoln was opposed to the “Know Nothings” who were trying to restrict immigration. He supported opportunity for all. Compare the position of Lincoln with the current Republican immigration policy, which actively frowns upon the pursuit of happiness by anyone who comes as a refugee to these shores unless they happen to be… whites. That was the gist of Trump’s reported remarks when he said that new immigrants ought to come from places like Norway instead of from . . . Africa or from “shithole” nations like Haiti.

Trump denies that he said this. But participants in the meeting in question insist that he did. And such sentiments are perfectly consistent with the public behavior that Trump displays all the time. His bigotry cannot be contained because it flows from the impulse that makes Donald Trump Donald Trump.

He likes to hurt and humiliate anyone he thinks is “beneath” him. Lincoln’s politics emphasized decency. Trump doesn’t know the meaning of the term.

Imagine the exercise of reciting Lincoln’s words in the presence of Trump:  nothing made in God’s image should be “trodden on, and degraded, and imbruted by its fellows.”  Would he pause long enough — in his tweeting of insults — to listen?  And even if he did pay attention to the words (presuming he understood them) what would he say?

We can easily imagine what Donald Trump would say.

And we can easily imagine the responses of Mitch McConnell, Lindsey Graham, and all the other Republican worshippers of power who avert their eyes from Trump’s deeds or else think up ways to excuse them.

Compare Lincoln’s politics to those of Donald Trump and you will see how far downhill the Republican Party has come. Abuse and degradation now flow from the party of Lincoln. The “better angels of our nature” are unknown in the party today. Republicans in Lincoln’s day insisted that the moral issue dividing the nation should be settled in favor of common decency before the country could unite. Common decency and American unity could not be farther away from the minds of Trump and his supporters.

Trump’s Republican Party pushes endless division and, rather than fostering a decent re-unification with an emphasis on mutual respect and common decency, the party sinks our country ever downward. Before very long, we may be mauling one another in the dark.

Scores of decent Republicans recoiled when Trump succeeded in his quest for the Republican nomination in 2016. “Anyone but Trump” was their slogan and some of them left the Republican Party. As the bestiality of Trump’s administration took shape, some Republicans like Jeff Flake and Bob Corker retired from Congress and indeed from politics in revulsion against what was happening.

But Trump’s steady quest for absolute power has proceeded, and only one Republican senator--Mitt Romney, the party’s 2012 presidential nominee--had the courage to take a stand in the Senate impeachment trial. All the others went along like sheep, either bleating out pathetic excuses or else remaining silent. They refused to hear the sworn testimony of witnesses. Several of them—Lamar Alexander and Pat Roberts, for instance—were retiring, so this was their now-or-never moment to consider their own place in history.

They took the coward’s way out.

Perhaps they’re just tired old men. But Trump is quite old and yet he never seems to get tired. Perhaps they simply haven’t the stomach any longer for continued… “unpleasantness.” But that’s a justification that anyone can use—it’s something that anyone can say to themselves—if they decide to give up and let a brutal thug have his way.

Roberts chairs the Dwight D. Eisenhower Memorial Commission, which seeks to honor another Republican exemplar of decency. The connection between Roberts and Eisenhower and Trump is fraught with irony—to put it mildly. Eisenhower worked behind the scenes to defeat a demagogue who was just as vile as Trump:  Joe McCarthy. It bears noting that Trump learned the arts of demagoguery from none other than McCarthy’s top henchman Roy Cohn.

It is sad to anticipate the cant that Pat Roberts will probably intone when the Eisenhower memorial is dedicated. How can he live with himself after failing to take a stand when he had the opportunity? Perhaps he said to himself, “what good would it do if I took a stand—no one would listen.” Well, if others had followed the lead of Mitt Romney, who can say what events might have followed?  It might have triggered a catharsis.

If someone in the future writes a book like John F. Kennedy’s Profiles in Courage, Mitt Romney will be in it and Pat Roberts will not.

Theodore Roosevelt had a pithy way of describing defeatists. In his language of robust and swashbuckling action, he would probably say that the equivocal Republicans of today have a “yellow streak.”  Lincoln had something more exalted to say to the Republican Congress in 1862:  “Fellow citizens, we cannot escape history.... The fiery trial through which we pass, will light us down, in honor or dishonor, to the latest generation.”

Republicans:  you have drifted away from noble origins that would be the envy of any political party. You have cut yourselves off from a heritage of altruism and benevolence to obey the commands of a brute—a leader who, like his idol Mr. Putin, believes in nothing whatsoever but the grubby acquisitions that plunder can deliver into his foul and selfish little hands.

Awaken, Republicans, and make some effort to reflect!  If you value any principle, prove it by your words and your deeds. Prove to yourselves and to the world that there is more to life than raw power.

Do We Want the Progressive Sanders or the Pragmatist Biden?

 

Since Michigan, my state, held its presidential primary on March 10, it was time to make a choice. The only realistic and sensible one was between Bernie Sanders and Joe Biden. They may be two old white guys, but no other candidates remained who had any chance of defeating another old white guy who happens to be one of our worst presidents ever.

There’s a lot to like about the progressive Sanders, including his realization of how serious our climate crisis is. But Biden is also “a solid contender on climate and environment.” As Robert Dallek has noted, Franklin Roosevelt (FDR) was both a progressive and a pragmatist. And the ideal 2020 Democratic presidential candidate would be both, but there is no denying that by historical standards Sanders is more progressive and Biden more pragmatic. So which should one choose?

 

I recommend using history and the need for political wisdom as a guide. First, let’s look at the changing meaning of the term progressivism. The Progressive Era (1890-1914) saw a diverse movement to assert and protect a public interest, both economic and social, against unfettered capitalism. It did not attempt to overthrow or replace capitalism, but to have government bodies and laws constrain and supplement it in order to ensure that it served the public good.

 

Progressive reforms included passing laws improving sanitation, education, food and drug safety, housing, and workers’ rights and conditions, especially for women and children. Progressive efforts also helped create the National Park Service and a graduated federal income tax (16th Amendment). In addition, they reduced corruption in city governments, limited trusts and monopolies, expanded public services, and worked to aid the poor and secure the vote for women, which was not achieved in presidential elections until 1920.

Importantly for our discussion, progressivism was a broad coalition that included both the bellicose Republican (and president from 1901 to early 1909) Theodore Roosevelt and the likes of Jane Addams, a radical reformer and pacifist. It was Addams’s pragmatism that led her to shun political labels and seek alliances to advance social progress.

The 1920 presidential election brought the Republican Warren Harding to office, and the presidency remained in Republican hands for the next twelve years, but then the progressive, pragmatic Democrat Franklin Roosevelt came to office and remained there until his death in 1945.

I have described progressive values as similar to the values of political wisdom. They included the proper mix of realism and idealism, love, compassion, empathy, self-discipline, passion, courage, persistence, peace, optimism, humility, tolerance, humor, creativity, and an appreciation for beauty and the need to compromise.

Like progressivism, pragmatism has a rich tradition in the USA. Walter Isaacson wrote that while popular opinion of the founders emphasizes high principle, deliberation on the Constitution in 1787 “showed that they were also something just as great and often more difficult to be: compromisers.” Isaacson also stated that for Franklin “compromise was not only a practical approach but a moral one. Tolerance, humility and a respect for others required it…. Compromisers may not make great heroes, but they do make great democracies.”

Recent Democratic presidents—Truman, JFK, Lyndon Johnson, Jimmy Carter, Bill Clinton, Barack Obama—have embodied this pragmatic impulse. As former President Bill Clinton said, “This is a practical country. We have ideals. We have philosophies. But the problem with any ideology is that it gives the answer before you look at the evidence.” And historian James Kloppenberg has emphasized the centrality of pragmatism to Obama’s politics. This type of pragmatism “challenges the claims of absolutists--whether their dogmas are rooted in science or religion--and instead embraces uncertainty, provisionality, and the continuous testing of hypotheses through experimentation," in order to see what works.

Thus, both progressivism and pragmatism can claim to be authentically American approaches, but where does that leave us regarding the Sanders/Biden choice? Has Sanders demonstrated enough pragmatism? Unfortunately, I do not think so. The political-wisdom qualities of compassion, empathy, passion, courage, persistence, and love of peace are certainly there. But the proper mix of realism and idealism, creativity, humility, tolerance, and a willingness to compromise--not so much.  

Today the USA is badly split between pro- and anti-Trumpers and is likely to remain split between politicians of the right and the left. We are a country that has traditionally favored the pragmatic over the ideological. The reality is that almost all the former Democratic contenders for the 2020 nomination now support Biden, and he, rather than Sanders, is more likely to attract independents and dissatisfied Republicans and be able to work with a new Congress to achieve results. We need a unifier, not an ideologue who will keep us divided.

By sticking too closely to an ideological democratic-socialist message, Sanders has failed to display the creativity that historian R. B. Bernstein (in his The Founding Fathers Reconsidered) wrote of regarding those men, as well as regarding FDR and his supporters, who “reinterpreted the Constitution’s origins, stressing the founding fathers’ creative experimentation, which they sought to foster in the new nation.”

 

Regarding humility, tolerance, and a willingness to compromise, they all go together. Humility should tell us not to be dogmatic, that we do not have all the answers. We should tolerate the views of others not only for reasons of love, compassion, and empathy, but also because we might learn something from different views. Compromise implies both humility and tolerance, plus the realization that to achieve the common good in a democratic society compromises will often be necessary. 

 

Sanders has stressed a we-versus-they approach. In a speech after Super Tuesday he made it clear--as he often has--who is on each of the two sides. The “we” includes those “who in many cases are working longer hours for lower wages . . . people who have not traditionally been involved in the political process.” It also includes many of those favoring his Medicare-for-All plan, those suffering from a “dysfunctional and cruel healthcare system in which we are spending twice as much per person on healthcare as are the people of any other country and yet we have 87 million Americans who are uninsured, underinsured.” Not specifically mentioned, but a big part of the “we,” are the millions of young people, many of whom would benefit from the free college tuition and cancellation of “all student loan debt” indicated on Bernie’s web site.

 

Among the “they” are the “corporate establishment” and the “political establishment,” as well as “Wall Street and the drug companies and the insurance companies and the fossil fuel industry and the military industrial complex and the prison industrial complex, and most of the 1%.” Not mentioned are all those who voted for Trump in 2016, including a majority of Protestant and Catholic voters, who still hold views contrary to Bernie’s.

 

Bernie’s approach is not a unifying one, not one that says, “Let’s reason together and see what we can achieve for the common good.” However enlightened his ideas might be in theory--and I think many of them are noble--a majority of U.S. citizens and their politically diverse legislators are not yet ready to back them. He advocates, in his own words, “a political revolution,” and revolutions are more divisive than unifying--think of the civil war that followed the Russian revolution. Biden insists that voters “aren't looking for revolution, they're looking for results." And the primary results heretofore (through Mar. 10) bear him out. 

 

In essays of 2012 and 2015 about Obama’s pragmatism vs. Republican dogmatism I criticized Republicans for not being open enough to policies that might further the common good. Bernie’s supporters in 2020 tend to be more ideological and less pragmatic than other Democratic primary voters, and the danger is real that if Bernie is not the Democratic nominee many of them might not vote for whoever is. But they should think twice. 

 

Biden is not yet that nominee. Things could still change. A poor Biden debate performance. Stunning upsets in some big yet-to-vote states. Unknown political effects of the coronavirus. Who knows? But one thing is certain. Our country needs a president who is more progressive and pragmatic than Trump, who is neither, but only an unprincipled opportunist. Either Biden or Sanders would be better than Trump, and Democrats need to rally around whoever it is. But because his pragmatic style better matches the American tradition, Biden has a better chance of unifying our nation and delivering positive long-range results. An enlightened vote is one that considers the common good, not just that of those who think like us.

 

Roundup Top Ten for March 13, 2020

Coronavirus School Closings: Don’t Wait Until It’s Too Late

by Howard Markel

In the history of medicine, we have never been more prepared to confront this virus than we are today. But this history also teaches us that when it comes to school closings, we must always be ready to act today — not tomorrow.

 

Joe Biden Personifies Democratic Party Failures Since the Cold War

by Michael Brenes

The Democratic Party needs to escape the shadow of anti-communism and embrace economic and racial justice.

 

 

Liberal Activists Have to Think Broadly and Unite Across Lines

by Matthew D. Lassiter

Fifty years before Greta Thunberg, students at the University of Michigan organized a Teach-In that paved the way for Earth Day demonstrations that mobilized 20 million people in 1970.

 

 

Coronavirus and the Great Online-Learning Experiment

by Jonathan Zimmerman

Let’s use this crisis to determine what our students actually learn when we teach them online.

 

 

The History of Slavery Remains With Us Today

by Ariela Gross and Alejandro de la Fuente

Two historians trace how law and institutions developed around anti-black ideology in the Americas.

 

 

We’ve Been Looking in the Wrong Places to Understand Sanders’s Socialism

by Richard White

Detractors like to equate Senator Bernie Sanders’s socialism with Soviet and Chinese Communism, but they’re swinging at the wrong century, the wrong country and the wrong socialism.

 

 

My Abortion Before Roe v. Wade

by Elizabeth Stone

Roe v. Wade is in peril, flinging me back to a terrifying time in my own life, one I never expected women today would have to face.

 

 

I Helped Fact-Check the 1619 Project. The Times Ignored Me.

by Leslie M. Harris

The paper’s series on slavery made avoidable mistakes. But the attacks from its critics are much more dangerous, argues historian Leslie M. Harris.

 

 

The Latest Battle over the Confederate Flag Isn’t Happening Where You’d Expect

by Megan Kate Nelson

Confederate actions in the Far Western theater of the war reveal the extent to which the Confederate flag became a symbol of white supremacy and conquest.

 

 

There’s a Complex History of Skin Lighteners in Africa and Beyond

by Lynn M. Thomas

The politics of skin colour in South Africa have been importantly shaped by the history of white supremacy and institutions of racial slavery, colonialism, and segregation. My book examines that history.

 

The Gross Clinic
