Our GOP Problem

New York Times, April 2, 1950

 

Stone Age Brain is the blog of Rick Shenkman, the founding editor of the History News Network. His newest book is Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books, 2016). You can follow him on Twitter @rickshenkman.

 

The Republican Party, unsurprisingly, has taken the position that President Trump should be defended. This is unsurprising because it is what parties in power do. If we want to explain what has happened to the Republican Party, which we all must try to do in this hour of crisis when democracy itself is on the line owing to Republican perfidy, it is essential to view events not from the perspective of the rational actor but from that of the party politician. Only then do the alarming events through which we are living become understandable.

 

We must begin with the basics. Three overriding causes may be said to account for the behavior of politicians holding national office. One is money, with which we need not concern ourselves too much; the role money plays in our politics is obvious. Members of Congress must always be thinking in the back of their minds how any vote may affect their chances of financing their next election. Every GOP member of Congress has to worry that a vote against Trump will mean being cut off from the various campaign funds available to Republicans in good standing with the party and its major donors, such as Sheldon Adelson and Charles Koch.

 

More interesting, though also obvious, is the second factor: pure partisanship. The social sciences tell us that partisanship is hard-wired in the human brain. It is the reason we cheer for our side in a ball game and hope for the opposition’s defeat. Once we identify with a group, we look for evidence that confirms the group’s status and dismiss evidence that detracts from it. Because partisanship is stronger among Republicans generally than among Democrats, perhaps owing to a default loyalty bias among people who identify as conservative, it is pretty easy to comprehend the ordinary Republican’s behavior in ordinary times.

 

Of course, these are not ordinary times. Presidents are rarely impeached. So Judiciary Committee Chairman Jerry Nadler, ahead of the committee’s vote on impeachment, issued a rare plea that his Republican colleagues consult their consciences before voting. As many have noted, Republicans during Watergate did just this, voting against Nixon when they similarly faced an impeachment vote. Why is no Republican doing that this time?

 

The explanation may be found in the third factor accounting for the behavior of politicians, and it is perhaps the most telling in the current situation. Politicians prefer winning over losing, and recent history suggests that the way to win, notwithstanding the losses the party suffered in 2018, is to stand with Donald Trump. By nature politicians are cautious. The only way to know what will succeed in winning votes is to follow the path of proven winners like Trump. As long as he appears to be retaining the support of the GOP base, it is prudent to assume that he has figured out the magic sauce in the recipe of political victory and to follow the recipe closely. Only a few dare to tamper with the ingredients.

 

Change is unlikely in the Republican Party short of a massive defeat. Only in defeat do politicians, facing years in the wilderness, risk experimenting with new approaches. Thus far there’s little sign that the party base is having second thoughts about Trump. He remains nearly as popular among Republicans today as he was when he was elected. Polls show his support among Republicans in states like California and Texas is north of 85 percent. Nixon’s support, by contrast, had begun to collapse by the time he faced impeachment. At the beginning of 1973, before Watergate shook the country, Nixon had the support of 91 percent of GOP voters. By the end of the year — a year in which John Dean testified about payoffs to the Watergate burglars and Special Prosecutor Archibald Cox was fired in the Saturday Night Massacre — Nixon’s support in the GOP had fallen to 54 percent.

 

So the real question isn’t why members of Congress are remaining staunch Trump supporters, but why the GOP base is.  Many reasons have been offered for this strange phenomenon (strange because Trump is so unlikely an avatar of Republican virtue). They include Fox News, Rush Limbaugh, and the other leading cogs in the propaganda machine that props up the Republican Party.

 

Whatever the cause of Trump's hold over the GOP base, it's a fact, and we as a country need to do something about it. We have to hope that the GOP evolves into a better version of itself because, as Arthur Schlesinger Jr. observed in an article in the New York Times in 1950, this country needs two intelligent parties. Right now we've got just one.  Only the Democrats are grappling with the real problems the United States faces, among them climate change and inequality.  This is untenable over the long term.

 

Through much of our history we have had a responsible conservative party, as Schlesinger noted in his piece in the Times. In the early republic the party of Jefferson was cross-checked by the party of Hamilton and Adams. In the next generation Jacksonians faced off against Whigs, and while the Whigs eventually disappeared, for decades they offered Americans like Lincoln an intelligent alternative. In the postbellum period the GOP espoused (for a time) a bold vision of racial equality and entrepreneurial zeal. Later it was captured by the plutocrats, but by the turn of the 20th century reform elements led by Teddy Roosevelt succeeded in refashioning the party as an engine of reform. In the 1920s the party once again became beholden to the rich until the Great Depression put an end to its control of the federal government. For a couple of decades it nearly ceased to exist at the national level. Then, as if in response to Schlesinger’s call, the party finally made peace with the New Deal under the leadership of Dwight Eisenhower. “Should any political party attempt to abolish social security, unemployment insurance and eliminate labor laws and farm programs,” Ike wrote, “you would not hear of that party again in our political history.”

Under both Richard Nixon and Ronald Reagan the GOP continued to deal with real-world problems, particularly in foreign affairs. But slowly, in the years following the end of the Cold War, Republicans gave themselves over increasingly to fake nostrums. They did this because they found they couldn’t win by running on their real agenda: tax cuts for the wealthy, which constituted nearly the whole of their domestic program once welfare had been reformed in the 1990s.

 

Trump in 2016 correctly identified several key issues that demand public attention, especially the decline and demoralization of much of rural America. But rather than offer a rational program to address this and other issues, he won election by dividing the country along racial and religious lines. Instead of appealing to the better angels of our nature, he played his voters for fools. He began his time in the national political spotlight by hinting that Barack Obama was born in a foreign country and might be a secret Muslim. Later he signed up as a card-carrying member of the anti-science brigade of climate change deniers. Throughout his presidency he’s spread rumors of conspiracies. And his biggest “accomplishment”? Giving the wealthy huge tax breaks.

 

What if the GOP doesn’t reinvent itself as a responsible party? Schlesinger worried seven decades ago that the GOP could collapse into pieces, leaving “its members prey for fascist-minded demagogues.” There was, it turns out, another possibility Schlesinger didn’t anticipate. It’s that the party would hold itself together by appealing to a trinity of fascist evils: xenophobia, racism, and authoritarianism. This should worry all of us.

 

 

 

Chief Justice John Roberts' Predecessors: The Supreme Court Chief Justices Who Presided Over Previous Impeachment Trials

 

As the Senate impeachment trial of Donald Trump looms, many aspects of the trial are still undetermined. Will the parties call witnesses? How long will it last? How seriously will Senate Majority Leader Mitch McConnell take it?

 

One aspect that is determined but often misunderstood is who presides over the trial. As Chief Justice John Roberts, appointed by President George W. Bush in 2005, readies himself for his historic role as the presiding judge over the trial, it is instructive to look back at the experiences of the two prior Chief Justices who presided over the trials of President Andrew Johnson in 1868 and of President Bill Clinton in 1999.

 

Salmon P. Chase, Chief Justice from 1864 to 1873, and William Rehnquist, Chief Justice from 1986 to 2005, both faced great pressure as presiding judges over highly partisan impeachment trials. Neither had an uncontroversial career, but both had the responsibility to uphold the Constitution at a time of great turmoil, and both did so, in Chase’s case after an early period of controversy.

 

Salmon P. Chase’s career reflected the realignment of political parties in the mid-nineteenth century. He was a member of the Whig Party in the 1830s, the Liberty Party in the 1840s, the Free Soil Party from 1848 to 1854, the Republican Party from its founding in 1854 to 1868, and finally, the Democratic Party in the last five years of his life, while still serving as Chief Justice by appointment of Abraham Lincoln.

 

Chase helped recruit former Democratic President Martin Van Buren to run as the Free Soil presidential candidate in 1848; helped found the Republican Party on the same principles of antislavery activism; sought the Republican nomination for President in 1860 before Lincoln was selected by the Republican National Convention; and sought the Presidency on the Democratic Party line in 1868 and the Liberal Republican line in 1872 while still serving as Chief Justice. He also had a varied career as Ohio Senator (1849-1855), Governor (1856-1860), and Secretary of the Treasury under Lincoln (1861-1864).

 

Chase attempted to establish the concept of unilateral rulings on procedural matters during the early days of the trial of Andrew Johnson, but he was overruled by the Senate majority, controlled by Radical Republicans, and quickly gave up trying to control the trial. After the early turmoil, he moved toward neutrality, simply presiding as the trial moved forward.

 

William H. Rehnquist could not have been more different from Salmon P. Chase in his political leanings. As far “left” as Chase was in his times, Rehnquist was far “right,” starting his political career as a legal advisor to Republican Senator Barry Goldwater of Arizona in his failed 1964 campaign for President. Rehnquist was appointed Assistant Attorney General of the Office of Legal Counsel in 1969 by President Richard Nixon.

 

Nixon nominated him for the Supreme Court in late 1971 and he was confirmed and sworn in during the first week of 1972. Rehnquist served nearly 34 years on the Court and was elevated to Chief Justice in 1986 by President Ronald Reagan. He was regarded as the most conservative member of the Warren Burger Court and was one of the most consistently conservative Justices in modern times. Rehnquist recused himself from US v. Nixon in 1974, the case in which the President was ordered to hand over the Watergate tapes to Special Prosecutor Leon Jaworski, leading to Nixon’s resignation on August 9, 1974.

 

Presiding over the Bill Clinton Impeachment Trial in early 1999, Rehnquist chose to limit any attempt to influence the trial promoted by the strongly conservative Republican leadership in the House of Representatives, led by Speaker of the House Newt Gingrich and House Judiciary Committee Chairman Henry Hyde. Despite his strong conservative credentials, Rehnquist always managed to get along well with his Supreme Court colleagues, and there were no controversies about his handling of the Clinton Impeachment Trial.

 

He was, despite his right-wing credentials and voting record on the Court, seen as fair-minded, approachable, and a far more unifying leader of the Court before and after the Clinton Impeachment Trial than Chase was before and after the Andrew Johnson Impeachment Trial.

 

Now, Chief Justice John Roberts, who clerked for Rehnquist in 1980-1981, is faced with the same challenge of presiding over a highly charged impeachment trial.

 

Roberts worked in the Ronald Reagan and George H. W. Bush administrations in the Justice Department and the Office of White House Counsel, then as Principal Deputy Solicitor General, followed by private law practice before his appointment to the DC Court of Appeals by George W. Bush in 2003. In 2005, he was nominated to replace the retiring Associate Justice Sandra Day O’Connor, but before hearings could begin on the nomination, Chief Justice Rehnquist died. Roberts was then nominated to replace Rehnquist.

 

Roberts has been very clear in his desire to run a Court that has the respect and regard of the American people. While he has shown a strong conservative judicial philosophy in his 14-plus years on the Court, he has also displayed a willingness to work with the Supreme Court’s liberal bloc, and he has been seen as the “swing” vote on the Court since Associate Justice Anthony Kennedy retired in 2018.

 

He has surprised many liberal commentators with some of his votes, including the preservation of “ObamaCare.” He is seen as comparatively moderate and conciliatory, and he has been somewhat critical of utterances by President Donald Trump regarding the supposed bias of judges appointed by Presidents Bill Clinton, George W. Bush, and Barack Obama.

 

It is clear that Roberts wants a good historical reputation as only the 17th person to head the Supreme Court. While he will work to avoid controversy in the upcoming Trump Impeachment Trial, he will want to preserve respect for the Constitution, democracy, and the rule of law, and he will be the center of attention in the coming weeks and months.

Roundup Top 10!  

The Job of the Academic Market

by Rebecca S. Wingo

Over three years, I dedicated 106.5 workdays to getting a job—while working another job. 

 

Prohibition Was a Failed Experiment in Moral Governance

by Annika Neklason

A repealed amendment and generations of Supreme Court rulings have left the constitutional regulation of private behavior in the past. Will it stay there?

 

 

History and the Opioid Crisis

by Jeremy Milloy

In the 1970s, just as now, people living with and recovering from substance use disorders faced prejudice and mistreatment at the hiring stage and in the workplace itself.

 

 

1619?

by Sasha Turner

What to the historian is 1619?

 

 

Boris Johnson Might Break Up the U.K. That’s a Good Thing.

by David Edgerton

It’s time to let the fantasy of the “British nation” die.

 

 

The problem with a year of celebrating the 19th Amendment

by Andrew Joseph Pegoda

Our entire understanding of the history of feminism is skewed.

 

 

Assassination as Cure: Disease Metaphors and Foreign Policy

by Sarah Swedberg

Kinzinger’s words fit within a long historical tradition of badly used disease metaphors that often accompany bad outcomes.

 

 

Another Disability Disaster in the Making

by Jonathan M. Stein

The Trump administration’s Social Security proposal would repeat one of Ronald Reagan’s most damaging mistakes.

 

 

How the President Became a Drone Operator

by Allegra Harpootlian

From Obama to Trump, the Escalation of Drone Warfare

 

 

 

What Australia’s Fires Should Teach the USA: Be Alarmist!

by Walter G. Moss

Most importantly in this 2020 election year, the Australian tragedy tells us we should vote out all the human-caused climate-change deniers and minimizers.

Stepping Back From the Brink of War

Trump’s order to kill General Soleimani is one of the most reckless acts taken by a president, who once again has put his personal political interest above the nation’s security. Certainly, Soleimani deserved to meet his bitter fate. He was behind the killing of hundreds of American soldiers in Iraq while threatening and acting against American allies. However, killing him without considering the potentially dire regional repercussions and without a strategy, under the guise of national security concerns, is hard to fathom. Republican members of Congress who praised the assassination of General Soleimani seem to be utterly blinded by their desire to see him eliminated. What will happen next, they seem to have no clue.

Trump, who is fighting for his political life, appeared to care less about the horrifying consequences as long as the strike distracted public attention from his political woes. He made the decision to assassinate Soleimani seven months ago, but he gave the order now to serve his own self-interest, especially in this election year, when he desperately needs a victory while awaiting an impeachment trial in the Senate.

During the Senate briefing on Iran led by Secretary of State Pompeo, Secretary of Defense Esper, and CIA Director Haspel, the officials produced no evidence that there was an imminent danger of an attack on four American embassies orchestrated by Soleimani, as Trump has claimed. In fact, Esper said openly in a January 12 interview that he saw no evidence. Republican Senator Mike Lee labeled it “probably the worst briefing I have seen, at least on a military issue…What I found so distressing about the briefing is one of the messages we received from the briefers was, ‘Do not debate, do not discuss the issue of the appropriateness of further military intervention against Iran,’ and that if you do ‘You will be emboldening Iran.’”

Now, having failed to produce evidence of imminent danger, the Trump administration claims that the killing of Soleimani was part of a long-term deterrence strategy. The assassination has certainly hardened Iran’s resolve to continue its nefarious activities throughout the region, and the measure Trump took to presumably make the US more secure has in fact done the complete opposite. It has created new mounting problems and multiple crises. Trump dangerously escalated the conflict with Iran; severely compromised the US’ geostrategic interest in the Middle East; intensified the Iranian threat against our allies, especially Israel; led Iran to double down on its support of terrorist and Jihadist groups; badly wounded the US’ relations with its European allies; made the US appear untrustworthy to friends and foes alike; and pushed Iran to annul much of the nuclear deal, all while impressively advancing its ballistic missile technology. And contrary to Trump’s claim that he made the right decision for the sake of American security, 55 percent of voters in a USA Today survey released on January 9 said he made the US less safe.

And now we are still at the brink of war. Although Iran has admitted to being behind the attack on the Asad air base in Iraq, it initiated the attack to save face in the eyes of its public and demonstrate its possession of precision missiles and willingness to stand up to the US.
This retaliation was expected, but since Iran wants to avoid an all-out war, it was strategic and carefully calculated to inflict the fewest American casualties, if any, to prevent a vicious cycle of retaliatory attacks that could get out of control and lead to a war. This, however, does not suggest that Iran will stop its clandestine proxy operations—employing its well-trained militias in Iraq, Yemen, and Syria to execute new attacks on American and allied targets in the region while maintaining deniability. At the same time, the clergy can pressure hawks in and outside the government to avoid any provocative acts against the US. Iran is patient and will carefully weigh its gains and losses before it takes the next step.

Following Iran’s attack on the Asad base, Trump has also shown restraint, because he too wants to prevent an all-out war, knowing that even though the US could win it handily, it would be the costliest of victories in blood and treasure, and certainly in political capital.

The whole mess began when Trump withdrew from the Iran deal. What did Trump think he could accomplish? Withdrawing from the deal without having any substitute, without consultation with the European signatories, and while re-imposing sanctions, especially when Iran was in full compliance with all the deal’s provisions, was dangerously reckless—undermining our national security interests and jeopardizing the security of our allies in the region. The Iran deal was not perfect, but the idea was to build on it, gradually normalize relations with Iran, and prevent it from acquiring nuclear weapons altogether as it worked to become a constructive member of the community of nations.

To resolve the crisis with Iran, the US must demonstrate a clear understanding of the Iranian mindset. Iran is a proud nation with a long and continuing rich history; it has huge natural and human resources, is the leader of the Shiite world, occupies one of the most geostrategic locations in the world, and wants to be respected. The Iranians are not impulsive; they think strategically and are patient, consistent, and determined. The revocation of the Iran deal simply reaffirmed Iran’s distrust of the US, from the time the CIA toppled the Mosaddeq government in 1953 to the continuing sanctions, adversarial attitude, and open calls for regime change.

Both Khamenei and Trump have their own domestic pressures to contend with, and both want to avoid war. The Iranian public is becoming increasingly restive; people are back in the streets demanding immediate economic relief. Conversely, Trump has calculated that further escalation of the violent conflict with Iran would erode rather than enhance his political prospects and would make defeat in November all but certain. West European countries are extremely sensitive to any major escalation of violence, as it would lead to mounting casualties and destruction on all sides. Iran can resort to a wide range of hostile measures, including disrupting oil supplies from Saudi Arabia and other Gulf states by mining the Strait of Hormuz, through which 21 million barrels per day (21% of global oil consumption) pass, resulting in massive economic dislocation in the Middle East and Europe in particular.

The pause in hostilities offers a golden opportunity to begin a new process of mitigation. Germany, France, and Britain have already engaged the Iranians in an effort to ease the tension between Iran and the US and create conditions conducive to direct US-Iran negotiations.
By now, Trump must realize that Iran cannot be bullied and that the only way to prevent it from pursuing nuclear weapons is through dialogue. However flawed Trump considers the Iran deal, it still provides the foundation for a new agreement, as many of its original provisions remain valid and can be built upon. Other conflicting issues between the two sides, especially Iran’s subversive activities, should be negotiated on a separate track. In some ways, both Iran and the US need to lick their wounds and begin a new chapter, however long and arduous it may be, because war is not and will never be an option.

Trump Checks all of the Impeachment Boxes: Will it Matter?

 

In my last article, I wrote that there is something wholly different about Donald Trump’s actions compared with those of other presidents who have exceeded their power. Why? Unlike other presidents, Trump’s actions meet each of the requirements that the Framers laid out to impeach a president. Ironically, just as impeachment is needed most, the partisan tenor of the times may make it impossible to accomplish.

 

The Framers of the Constitution had included the power of impeachment for instances of treason, bribery, or other high crimes and misdemeanors committed by executive branch officials and the president. They had rejected policy disputes as a basis for impeachment. When George Mason proposed adding maladministration as an impeachable offense, Madison responded that “so vague a term will be equivalent to tenure during pleasure of the Senate.” It was at this point that “high crimes and misdemeanors” was added. While the first two are clear, the third sounds vague to us today. Yet the Framers had a clear idea of what it meant. As I wrote previously for the History News Network, the Framers “thought that the power of impeachment should be reserved for abuses of power, especially those that involved elections, the role of foreign interference, and actions that place personal interest above the public good,” which fall within the definition of high crimes and misdemeanors. Professor Noah Feldman of Harvard Law School, in his testimony to the House Judiciary Committee, said that for the Framers “the essential definition of high crimes and misdemeanors is the abuse of office” by a president, of using “the power of his office to gain personal advantage.” Trump’s actions with Ukraine check each of these boxes.

 

Presidents and Congress have often found themselves in conflict, dating back to the earliest days of our Republic. Part of this is inevitable, built into a constitutional system of separation of powers with overlapping functions between the two branches. Only Congress can pass laws, but presidents can veto them. Presidents can negotiate treaties but must obtain the advice and consent of the Senate. As Arthur Schlesinger Jr. observed, checks and balances also make our political system subject to inertia. The system only really works “in response to vigorous presidential leadership,” Schlesinger wrote in The Imperial Presidency. But sometimes presidents grasp for powers that fall outside of the normal types of Constitutional disputes. This is where impeachment enters the picture.

 

In the early Republic, disputes between presidents and Congress revolved around the veto power. Andrew Jackson was censured by the Senate over his veto of bank legislation and his subsequent removal of federal deposits from the Second Bank of the United States. Jackson’s actions showed a willingness to grab power and to ignore the law and the system of checks and balances when it suited his purposes. While the censure motion passed, it did not have the force of law. It was also unlikely that impeachment would have been successful, since the dispute was over policy. The president had violated the law, but not all illegal acts are impeachable. As Laurence Tribe and Joshua Matz have noted, “nearly every president has used power in illegal ways.” Impeachment is meant to be limited to those actions, like treason and bribery, “that risk grave injury to the nation.”

 

Congress considered impeaching John Tyler in 1842 over policy disputes in which he too used the veto power. Tyler is sometimes referred to as the accidental president, since he assumed the presidency when William Henry Harrison died in office one month after being sworn in. Tyler had previously been a Democrat and states’-rights champion who had joined the Whig Party over disagreements with Jackson’s use of presidential power. Yet he proceeded to use the powers of the presidency to advance his own policy views, and not those of his newly adopted Whig Party, vetoing major bills favored by the Whigs, which led to an impeachment inquiry. A House committee led by John Quincy Adams issued a report that found the President had engaged in “offenses of the gravest nature” but did not recommend that Tyler be impeached.

 

Even Andrew Johnson’s impeachment largely revolved around policy disputes, albeit extremely important ones. Johnson was a pro-Union southerner who was selected by Lincoln to aid his reelection effort in 1864. Johnson was clearly a racist, a member of the lowest rung of southern white society, those who felt their social position was threatened by the advancement of blacks, a view that shaped Johnson’s policies on Reconstruction. While the Civil War ended slavery, it did not end the discussion of race and who can be an American. Johnson, much like Trump, represented those who believed that America was a typical nation, made up of one racial group. “I am for a white man’s government in America,” he said during the war. On the other hand, the Radical Republicans believed that America was essentially a nation dedicated to liberty and equality, and they set out to fulfill the promise of the American Creed for all American men, black as well as white. This was the underlying tension during the period of Reconstruction. “Johnson preferred what he called Restoration to Reconstruction, welcoming the white citizenry in the South back into the Union at the expense of the freed blacks,” Joseph Ellis writes.

 

But Johnson was ultimately impeached on false pretenses, not for his policy disputes with Congress but for his violation of the Tenure of Office Act, which some have characterized as an impeachment trap. The act denied the president the power to fire executive branch officials until a Senate-confirmed appointment had been made. While the House impeached Johnson, he escaped removal by one vote in the Senate. Johnson’s sin was egregious; he had violated one of the core tenets of the American Creed, that all are created equal. Yet this did not rise to the level of an impeachable offense that warranted the removal of a president. It was a policy dispute, one that went to the core of who we are as Americans, but it was not a high crime or misdemeanor.

 

It would be over one hundred years before impeachment would be considered again, this time in the case of Richard Nixon. Like Trump, Nixon abused the power of his office to advance his own reelection. Both men shared a sense that the president has unlimited power. Nixon famously told David Frost that “when the president does it, that means it’s not illegal,” while Trump has claimed that under Article II “I have the right to do whatever I want.” But Nixon, unlike Trump, did not solicit foreign interference in his reelection effort. The two men also shared certain personality traits that led to the problems they experienced in office. As the political scientist James David Barber wrote in his book The Presidential Character, Richard Nixon was an active-negative president, one who had “a persistent problem in managing his aggressive feelings” and who attempted to “achieve and hold power” at any cost. Trump too fits this pattern, sharing with Nixon a predilection toward “singlehanded decision making,” a leader who thrives on conflict.

 

Indeed, Nixon would likely have gotten away with Watergate except for the tapes, which documented in detail his role in covering up the “third-rate burglary” of the Democratic Party headquarters that occurred on June 17, 1972. Paranoid about possibly losing another election, Nixon had directed his staff to use “a dirty tricks campaign linked to his reelection bid in 1972,” presidential historian Timothy Naftali has written. When the break-in was discovered, Nixon engaged in a systematic cover-up, going so far as to tell the CIA to get the FBI to back off the investigation of the break-in on bogus national security grounds.

 

Much like Trump, Nixon stonewalled the various investigations into his actions, offering what Naftali calls “deceptive cooperation.” He had the Watergate Special Prosecutor, Archibald Cox, fired in October 1973 in order to conceal the tapes, knowing his presidency was over once they were revealed. In the aftermath of Cox’s firing in the so-called Saturday Night Massacre, Nixon refused to release the tapes to the House’s impeachment inquiry. Instead, he provided a transcript he had personally edited that was highly misleading. The final brick in Nixon’s wall of obstruction was removed when the Supreme Court unanimously ruled in July 1974 that Nixon had to release the tapes, and he complied. One wonders if Trump would do the same.

One difference from the Trump case is that there was a degree of bipartisanship during the Nixon impeachment process. By the early summer of 1974, cracks had begun to appear in Republican support for Nixon. Unlike today, there were still moderate Republicans who were appalled by Nixon’s actions and had become convinced that the president had engineered a cover-up. Peter Rodino, the Democratic chairman of the House Judiciary Committee, had bent over backwards to appeal to the moderate Republicans and to Southern Democrats from regions where Nixon was popular. Despite strong pressure from the leadership of the GOP in the House, it was this group that ultimately drew up the articles of impeachment.

 

Still, a large number of Republicans in the House continued to stick with the president until the tapes were finally released. It was at this point that even die-hard Nixon supporters deserted him, when it became apparent that Nixon had been lying all along and had committed a crime. Nixon’s case shows both the importance of bipartisanship in the impeachment process and how difficult it is for members of the president’s party to turn on him. In August of 1974, Nixon resigned when confronted by a group of Senators and House members, led by conservative Senator Barry Goldwater.

 

The impeachment of Bill Clinton is the anomaly, since it was about neither policy (as in Johnson’s case) nor the abuse of power (as in Nixon’s). Rather, it emerged in part from a character flaw: Clinton could not restrain himself when it came to women.

 

The facts of the case are well known. While president, Clinton had an illicit sexual encounter with Monica Lewinsky in the Oval Office. He then proceeded to lie about it, both to the country and during a deposition in the Paula Jones case, and attempted to cover up the affair. Kenneth Starr, who had been appointed as Independent Counsel to investigate the Whitewater matter, a failed land deal in which the Clintons lost money but did nothing wrong, then turned his investigation to the president’s actions with Lewinsky and recommended that the House of Representatives consider impeachment proceedings for perjury and obstruction of justice.

 

By this point, Clinton had admitted he had lied to the country and apologized for his actions. The House had the opportunity to censure Clinton, but Tom DeLay, one of the Republican leaders, buried that attempt, even in the aftermath of midterm elections in which Democrats gained seats, clearly pointing to public opposition to impeachment of a president whose approval ratings were going up. While the House voted for impeachment largely along partisan lines, the Senate easily acquitted Clinton on a bipartisan basis. Clinton’s actions, while “indefensible, outrageous, unforgivable, shameless,” as his own attorney described them, did not rise to the level the Framers had established for impeachment.

 

Clinton’s impeachment in the House was largely a product of partisan politics that were out of control. As Laurence Tribe and Joshua Matz have written, “starting in the mid-1990s and continuing through the present, we’ve seen the creeping emergence of a permanent impeachment campaign.” Both George W. Bush and Barack Obama faced impeachment movements during their terms in office over issues that in the past no one would have considered impeachable. During the 2016 election, both Hillary Clinton and Donald Trump lobbed accusations that the other would face impeachment if elected. The current toxic political environment raises the question of whether a bipartisan impeachment effort has any chance at all, especially when the two sides cannot even agree on basic facts. Nancy Pelosi was right to hold off the movement to impeach Trump prior to the Ukrainian matter, but now that we have such a clear abuse of power, what is the alternative? At this moment, when the tool of impeachment is most needed for a president who meets all of the criteria laid out by the Framers, the process itself has become captive to extreme partisanship by the Republicans.

 

The ball is now in the Senate’s court, where the trial will soon occur. While conviction and removal from office are highly unlikely short of additional corroborating evidence (which Mitch McConnell has been attempting to squelch), perhaps the Senate can find the will to issue a bipartisan censure motion that condemns the president and warns that another similar abuse of power will result in removal. Ultimately, the voters will likely decide Trump’s fate come November. We can only hope they choose correctly.

Annual Jewish Film Festival, Following New Wave of Anti-Semitism, Offers Hope and Inspiration

Early last month, four people were killed at a Jewish food store next to a synagogue in Jersey City, N.J. (two blocks from the building in which I work). A few days later, a man wielding a machete stabbed and badly injured five Jews praying in the home of a rabbi in Monsey, New York, about 40 miles from New York City. Since then, several swastikas have been painted on buildings in various cities. These incidents are all part of a growing wave of anti-Semitism in America. Anti-Semitic crimes in Chicago, New York and Los Angeles were higher in 2019 than in any of the previous 18 years. Hate crimes in Los Angeles, a category expanded by police to include swastikas on any religious property, doubled in 2019 over the previous year. New York City counted 229 anti-Semitic crimes in the past year, a new record, up significantly from the year before. The Anti-Defamation League said 2019 showed the third highest anti-Semitic crime total in the entire history of the organization.

 

On Wednesday, the 29th annual New York Jewish Film Festival, a two-week (January 15-28) cinematic celebration of Jewish life, kicks off at Lincoln Center’s Walter Reade Theater in New York, serving as a source of hope and inspiration not just to Jews, but to everybody.

 

Given the attacks on Jews all over the country, the Jewish Film Festival, one of the oldest in the United States, could not have come at a better time.

 

Aviva Weintraub, the executive director of the festival, which is sponsored by the New York Jewish Museum and the Film Society of Lincoln Center, said what has been happening to Jews in the nation over the last two months is “horrifying.” She said the goal of the festival each year is to “bring Jews together with each other and others,” and she is hopeful that will happen again this year.

    

Discrimination and persecution, of course, are no strangers to Jews, and the selection of films for the festival reflects that.

 

The film festival starts with the upbeat Aulcie, a sports film about how basketball star Aulcie Perry was spotted playing in a New York City playground tournament by a scout for the Maccabi Tel Aviv basketball team from Israel in 1976. He was signed and, despite personal problems, helped the Maccabi team win two European championships. He later converted to Judaism and became an Israeli citizen.

 

The centerpiece of the film festival is the screening of the award-winning 1970 film The Garden of the Finzi-Continis, director Vittorio De Sica’s movie about the struggles of the Jews in World War II-era Italy, now celebrating its 50th anniversary. That Holocaust-era film is joined by a new documentary, Four Winters: A Story of Jewish Partisan Resistance and Bravery in WW II, which tells the story of Jewish resistance to the Nazis throughout World War II in different countries.

 

“We chose The Garden of the Finzi-Continis because of its anniversary, but also because it is such a great film about the struggle of the Jews against the Nazis and because it is a beautiful and moving story,” said Ms. Weintraub. The film won 26 international awards in 1970, plus the Oscar for Best Foreign Language Film.

 

Executive director Weintraub is equally proud of Four Winters. “The strength of the film is not just its story, but the inspiring story of each of the men and women, seniors now, who survived the Holocaust and, in the movie, describe what happened. It is stirring,” said Ms. Weintraub. “You cannot see that documentary and not be moved by it.”

 

She said Four Winters is a factual and inspirational story of years of resistance against the Nazi regime. “It’s amazing to realize that the Jews and others resisted for that long,” she said.

 

The festival has always been popular. Weintraub chuckles when she thinks back on festivals past. “We have hordes of people who literally camp out at Lincoln Center to catch as many films in the festival as they can,” she said. “It’s not surprising for someone to see several films. Many people come to Lincoln Center on their way home from work, or between shopping trips on weekends.”

 

She and two others spend about a year winnowing the films down to 31 or 32 for each festival. “We look for films that represent history, politics and Jewish life. Each year the mix of movies is different,” she added.

 

Some movies in this year’s festival depict the Holocaust. There is Birch Tree Meadow, a 2003 film that tells the story of a concentration camp survivor who returns to the camp years later to confront memory and the descendant of a Nazi guard.

 

An Irrepressible Woman is the story of French Prime Minister Léon Blum, imprisoned at Buchenwald in the 1940s, and his love, Jeanne Reichenbach, who fell in love with him as a teenager and risked her life to find him again.

 

There are cultural tales, too. The 1919 silent film Broken Barriers was the first film to tell some of the old Sholom Aleichem stories that much later became the world-famous play and movie Fiddler on the Roof.

 

Incitement is the complicated story of the lead-up to the highly publicized assassination of Israeli Prime Minister Yitzhak Rabin in 1995. It tracks not only the murder, but the politics in the nation at the time.

 

God of the Piano is the story of a woman who is forced by her father to meet high expectations as a pianist. When she grows up, she places those same high expectations on her son, but he is deaf. The story is about the larger family conflict.

 

The festival closes on a high note with the unification film Crescendo, the true story of how music conductor Eduard Sporck took over a joint Israeli-Palestinian youth orchestra. At first, he saw his job as getting all of the musicians to produce beautiful music, but he soon realized the harder, and more rewarding, job was to get the children from the two opposing political sides to set aside personal differences and work together as a smoothly running musical group.

Painter of Disquiet: Félix Vallotton at the Metropolitan Museum of Art

La Chambre rouge (1898)

 

The Metropolitan Museum of Art is currently presenting the work of Félix Vallotton, an artist who has been largely neglected relative to his contemporaries, such as Pierre Bonnard and Édouard Vuillard. This makes the present exhibition all the more welcome, and fascinating. Vallotton’s work unquestionably merits the renewed attention — his paintings possess a mysterious quality, narrative appeal, and attention to detail, and they invoke a delicious sense of irony and wit.

Born in the Swiss town of Lausanne on the shores of Lake Geneva in 1865, Vallotton displayed early an ability to draw from life, and at sixteen he arrived in Paris to study painting at the Académie Julian. A self-portrait at the age of twenty reveals a virtuosic, confident brush and much more — Vallotton is not interested in merely demonstrating his facility as a realist painter: he has a penetrating eye, a psychological depth, and a naturalism that owes much to northern Renaissance masters, especially Albrecht Dürer and Hans Holbein.

It is not entirely clear when Vallotton’s friendship with Les Nabis began. Taking their name from the Hebrew word for prophet, the Nabis were an avant-garde group of Parisian artists that included Bonnard, Vuillard, Charles Cottet, and Ker-Xavier Roussel, among others. Vallotton’s nickname within the group may be revealing — he was the ‘foreigner Nabi.’ Perhaps this name reflected his Swiss origin, though the Nabis were an international group anyhow. It could also reflect in some measure that Vallotton was something of a loner; or it may allude to the fact that while for a time he adopted the Nabis’ dismissal of naturalism in favor of flat forms and the expressive use of color, Vallotton was and fundamentally remained a realist painter.

Recognition arrived early in Vallotton’s career for reviving the art of woodcut prints, which had largely been forgotten since the Renaissance. Indeed, by the age of 25, Vallotton had single-handedly brought about a kind of revolution in xylography. Inspired by Japanese woodcuts that were popular in Paris at the time, Vallotton produced images with sharp contrasts of jet black and pristine white. His woodcuts are remarkable for their commentary on the French bourgeoisie, their stinging rebuke of societal decadence in fin-de-siècle Paris, and their critical orientation toward the police. Vallotton has an eye for violence — be it in the form of murder or the sudden carriage accident, the political execution or the suicide. He combines a dark realism with sophisticated satire, wry humor, and a keen acerbic wit. He has an eye for the ambiguous and the enigmatic — a deft and subtle touch that defies finalization.

The Demonstration from 1893 renders the political chaos of the day with humor — from the old man who has lost hold of his hat to the sole figure in white, a woman racing along the left-hand side. Vallotton would return in both woodcuts and paintings to the scene of the bourgeoisie shopping for the latest luxury goods at the Bon Marché department store — the first such store of its kind in the world. The 1898 triptych is not without a certain irony, given that the format Vallotton chose was traditionally associated with altarpieces such as would be found in a church.
Which is just to underscore that Vallotton was a close observer of modernity, fascinated by the new world he saw emerging around him — with its rampant consumerism and its technological novelties (such as a moving conveyor belt along a footbridge, which he includes in his woodcut series devoted to the World’s Fair of 1900).

A series of ten woodcuts entitled Intimités is a sustained and biting critique, and among Vallotton’s greatest achievements as a graphic artist. An unsettling and disquieting series that lays bare the hypocrisies of bourgeois society, it deftly exposes a decadent class through scenes of adultery, deceit, romantic quarrels and indecent proposals.

In 1899, Vallotton married Gabrielle Rodrigues-Henriques, and the union was to have a significant effect on the remainder of his career. His wife was a wealthy widow and the daughter of a Paris art merchant, which meant that he now enjoyed a certain financial security and could turn exclusively to painting. While Vallotton’s work generally lost its satirical wit and subversive edge, there is also a certain psychological insight and a marked turn toward the inwardness of his subjects that constitutes much of the power of this later period.

The acquisition of a Kodak camera in 1899 led to changes in the way the artist worked. Now he would typically take snapshots of imagery that appealed to him and then use those photographs to craft his paintings in the studio. It appears that often he would retain the sharply contrasting patterns of light and shadow revealed in the small photograph. The painter, however, was by no means subservient to the photographic image, as The Red Room, Etretat (1899) demonstrates. It is a remarkable painting for its unity of composition and psychological structure. All lines in the painting essentially point to (and up to) the seated figure of Gabrielle Vallotton — even the viewer is made to feel they’re looking up to this woman, who meanwhile looks down at the small child in the foreground. This child, so crucial to the psychological depth of the painting, is entirely absent from the photograph.

Vallotton was also a master of ambiguity. There is always something more to the story he is telling that must ever remain just beyond our reach. Consider, for example, one of this exhibition’s finest offerings, The White and the Black (1913), a provocative work depicting a young white woman, nude and reclining, and a black woman, seated and coolly smoking a cigarette. The painting may be a response to Édouard Manet’s Olympia (1863), in which a white model, likely a prostitute, lies on a bed attended by a black servant bringing her flowers (probably from a client). But Vallotton may also be in dialogue with Jean-Auguste-Dominique Ingres’ Grande Odalisque (1814) and Odalisque with a Slave (1839). All of his friends attest to Vallotton’s love and admiration for Ingres, by whom he was “conquered without offering any resistance,” as one contemporary put it. Vallotton was clearly a close observer of Manet as well, and many of his paintings — for example, Misia at Her Dressing Table (1898) — emphasize the harsh light, large color surfaces and shallow depth that were characteristic of Manet’s work, including Olympia. But at the same time Vallotton subverts the traditional roles of mistress and servant by making the relationship between these two women utterly ambiguous.
The Provincial (1909) is notable for its exploration of one of Vallotton’s recurring themes — namely, the complex, uneven relationship between men and women. In this painting, and others such as Chaste Suzanne (1922), a powerful female figure holds sway over her subservient male counterpart. The white lace of her blouse protrudes in a witty but subtle reminder of her breasts, which only underscores her sexual dominance over the docile man who sits beside her with eyes deferentially lowered.

Moonlight (1895) is a standout work that reveals the influence of the Symbolists in Vallotton’s attention to emotional force over actual topographical representation — water, earth and sky have become interfused in what is almost an abstraction. The picture also anticipates the latter part of his career, when Vallotton increasingly turned to landscape painting — often beginning with a photographic image or an on-site sketch, which was then imaginatively reconstructed on the canvas. The painter referred to his later landscapes as paysages composés (composed landscapes) and remarked in 1906, “I dream of a painting free from any literal respect for nature.” Vallotton said he wanted to “be able to recreate landscapes only with the help of the emotion they have provoked in me.” His method allowed him to simplify the compositional structure, to intensify the mood and to emphasize the emotional impact of color — as, for example, in Last Rays (1911), where we find the silhouettes of umbrella pines as they receive the last light of the evening sun.

Félix Vallotton more than deserves the attention that this exhibition brings. His work — from the groundbreaking forays into xylography, to his portraits and scenes of bourgeois society, to his hauntingly mesmerizing landscapes — defies identification with any artistic school. Like all truly great artists, Vallotton is inimitable; while he experiments with various artistic programs, he ultimately remains aloof from them, determined to paint pictures purged of all sentimentality, beholden only to the emotional, psychological or social reality with which he is concerned. Such a body of work remains ever fresh and vital, and rewards close attention with a glimpse of truth.

Depression Era Tenor Hits All the High Notes

 

It is 1934, at the height of the Depression, and an opera company in Cleveland is trying to make enough money to stay in business. The answer, its officials believe, is to bring in fabled Italian tenor Tito Merelli for a one-night concert. Merelli’s appearance is highly publicized and the theater is sold out. The opera company has a stage set. It has an orchestra. It has an audience. But it has no Merelli. He is missing.

 

In late afternoon, just a few hours before the performance, the distraught opera company’s manager takes a bold step. He will have Max, his assistant and an amateur tenor, dress as the clown Pagliacci and do the show as Merelli. With all that makeup and a man in disguise with a tenor’s voice, and about the correct height and weight, who would know? What could possibly go wrong?

 

That’s where everything starts to collapse in Lend Me a Tenor, Ken Ludwig’s play set in 1934, which opened last weekend at the Westchester Broadway Theater in Elmsford, N.Y. It is a wacky, crazy, upside-down play that has the audience roaring with laughter. This tenor can hit all the high notes and makes everybody laugh, too. He is his own opera company – if he can be found.

 

The fraudulent singer plan is in place, Max is ready for his fifteen minutes of fame and the audience is waiting, unaware of the deception. Then, unannounced, Tito Merelli arrives, as flamboyant as flamboyant can be, with his overly emotional, arm-waving wife, Maria, who is always angry about something. He is ready to go on. Max is ready to go on. Who will go on?

 

And then… well, see the play.

 

Lend Me A Tenor is not only a hilarious show, well directed by Harry Bouvy, but a neat look back at how opera singers and other highly regarded performers toured the country in that era. Merelli lived in a time when performers left their homes in the U.S. or in Europe and went on tours of America, appearing in different cities at different times of the year. The play captures the physical details of the traveling star’s Depression life well.

 

Star tours were very common in the Depression, despite the sagging national economy that hurt theaters and opera houses as badly as it hurt everything else. Bringing in a star for a one-night performance usually worked not only to put money in the opera house bank, but to garner enormous attention for the opera company, which translated to ticket sales for the company’s regular operas. Marian Anderson, the great singer, toured the U.S. almost every year in the late 1930s, sometimes taking tours that lasted several months and included up to eighty concerts. The big bands of the era - Duke Ellington, Glenn Miller, the Dorseys - did the same thing, hitting the road early in the year and living out of suitcases for most of it. The casts of Broadway shows also traveled in that manner. There were always logistical problems - train breakdowns, snowstorms, ticket mix-ups - but, in general, the tours worked well.

 

Except for Tito Merelli.

 

He and his wife have enormous problems to combat when they finally do arrive in Cleveland (Tito’s main problem is his firecracker wife). How do you un-vanish? How does Max, bursting for a chance in the musical spotlight, cope with the fact that Tito is, in fact, in Cleveland? Or is he? Is anybody really in Cleveland?

 

In Lend Me A Tenor, the Cleveland Opera Company has rented a palatial two-room suite for Merelli and his wife, an adoring bellhop brings up their bags and the opera company’s manager treats him like visiting royalty. All of this is historically accurate. Performers arrived by plane or train, mostly, although some went by car. They ate well in the hotel’s restaurant, were given tours of the area and were introduced to dozens of artistic and municipal officials. Newspaper ads for Merelli would have been taken out for a week or more prior to the show. Hundreds of large cardboard broadsides would have been put in store windows or nailed to trees and telephone poles. There might even have been a Tito Merelli Day (that is, if they could find Tito).

 

Everything that show business people did to arrange an overnight stay and performance for a star like Merelli in 1934 was done year after year and became the backbone of the American Tour. It is much the same today, although social media, television, email and iPhones have modernized the tour. Tito Merelli would love the contemporary tour, if he could find it.

 

Director Bouvy, who pays such careful attention to the history of the play, has done a superb job with a talented cast of actors. He lets them all shine as the very group of eccentric, egomaniacal show biz people we all know and love. The two stars of the show are Tito, played wonderfully and with great style by Joey Sorge, and Max, played equally well by J.D. Daw. In addition to them, Bouvy gets fine work from Molly McCaskill as Maggie, Max's girlfriend; Philip Hoffman as the opera manager; Kathy Voytko as Tito's wife; Tregoney Sheperd as the doughty opera company board chair; Hannah Jane McMurray as a soprano; Sam Seferian as the bellhop; and John Savatore as an assistant stage manager.

 

PRODUCTION: The play is produced by Westchester Broadway Theater. Sets: Steve Loftus. Costumes: Keith Nielsen. Lighting: Andrew Gmoser. Sound: Mark Zuckerman. The play is directed by Harry Bouvy. It runs through January 26.

Can America Recapture Its Signature Exuberance?

 

As the news cycle churns on with impeachment coverage, pundits and politicians are quick to remind us that the Constitution is all that stands between us and the whims and dangers of authoritarian rule. It's a good point, but incomplete. America is a country of law and legend, and our founding document, essential as it is, won't save us if we don't also buy into a binding story about the point and purpose of our democracy.

 

That’s a tall order in today’s world. To author Lee Siegel, hacking America’s dour realities is like scaling a rock face. In his recent essay “Why Is America So Depressed?” he suggests we reach for the metaphorical pitons—after “the iron spikes mountain climbers drive into rock to ascend, sometimes hand over hand”—to get a humanistic grip amid our “bitter social antagonisms” and the problems of gun violence, climate crisis and social inequities that have contributed to an alarming rise in rates of anxiety, depression and suicide. 

 

“Work is a piton,” says Siegel. “The enjoyment of art is a piton. Showing kindness to another person is a piton.”

 

Thanks to Siegel, I now think of the 200-year anniversary of poet Walt Whitman’s birth in the year just ended as a piton for the uplift it gave me. As Ed Simon, a staff writer for The Millions and contributing editor at HNN, declares in another fine essay published New Year’s week, Whitman is “our greatest poet and prophet of democracy.” We’d do well to enlist the bard’s ideas in 2020, he argues, to steer ourselves through “warring ideologies” and to offset “the compelling, if nihilistic, story that authoritarians have told about nationality.”

 

I would simply add this: Whitman is also the godfather of our national exuberance, a champion of the country’s heaving potential and can-do spirit—traits that, until recently at least, could win grudging regard from some of America’s fiercest critics. Despite his intermittent bouts of depression, Whitman tapped into the energy and brashness of our democratic experiment like no other. His verse hovers drone-like above the landscape, sweeping the manifold particulars of American life into a rapturous, cosmic frame.

 

Whitman did some of his most inspiring work in the worst of times. He published the first edition of his collection “Leaves of Grass” in 1855 as the country drifted toward civil war. His Homeric imprint, inclusive for its time, invited individuals from varied walks of life (“Workmen and Workwomen!…. I will be even with you and you shall be even with me.”) to splice up with the democratic push to explore contours of the American collective.

 

My first brush with Whitman’s high spirits came when I was an undergraduate in the late 1960s. Like campuses across the country, the University of Washington was riven by clashing dogmas and muscular protests against the Vietnam War. Like not a few working-class kids brown-bagging it from their family homes, I rode the fence politically, not knowing how to jump. As a citizen, I was offended by club-wielding riot police entering campus in force; as a student of Asian history, I was repulsed watching radical activists trash an Asian studies library, flinging rare old texts out windows onto the grass. Cultural revolution was a tough business.

 

Lucky for me, a generous English professor, Richard Baldwin, supplied some useful ballast. Sensing my fascination with Emerson, Thoreau and Dickinson, he offered to help me sort out the Transcendentalists during office hours. But it was Whitman who bulled his way forward. His booming energy, delivered in his self-proclaimed “barbaric yawp,” turned me into a forever fanboy.

 

I loved how Whitman challenged the thou-shalt-nots of established order in “Song of Myself,” joining the role and responsibilities of the individual to humanity’s common lot:

 

Unscrew the locks from the doors!

Unscrew the doors themselves from their jambs!

 

Whoever degrades another degrades me,

And whatever is done or said returns at last to me.

 

I loved his fervent embrace of democracy in “Democratic Vistas,” his 1871 dispatch to a divided nation:

 

Did you, too, O friend, suppose democracy was only for elections, for politics, and for a party name? I say democracy is only of use there that it may pass on and come to its flower and fruits in manners, in the highest forms of interaction between men, and their beliefs - in religion, literature, colleges, and schools - democracy in all public and private life … .

 

I loved his zest for the many-parted American experience, for linking the natural environment to the soul of the country, as he proclaimed his outsized poetic ambition in “Starting from Paumanok”:

 

… After roaming many lands, lover of populous pavements,

Dweller in Mannahatta my city, or on southern savannas,

Or a soldier camp'd or carrying my knapsack and gun, or a miner 

in California,

Or rude in my home in Dakota's woods, my diet meat, my drink

from the spring,

Or withdrawn to muse and meditate in some deep recess,

Far from the clank of crowds intervals passing rapt and happy

Aware of the fresh free giver the flowing Missouri, aware of mighty

Niagara,

Aware of the buffalo herds grazing the plains, the hirsute and

strong-breasted bull,

Of earth, rocks, Fifth-month flowers experienced, stars, rain, snow,

my amaze….

Solitary, singing in the West, I strike up for a New World.

 

Whitman had his flaws, to be sure. Despite cheerleading for a big-tent America, he also shared the prejudices of his time, blotting his copybook, for example, with ugly remarks about African Americans. By 1969, when I hungered for a healing narrative, the bard’s prescriptions were undergoing reappraisal in light of righteous and long-overlooked storylines offered up by the civil rights and antiwar movements.

 

Still, Whitman has kept his place in the chain of title to America's resilience. He conveys a panoramic confidence that we are greater as a people than our contemporary realities; that if we lose the thread, we'll find it again and reweave our story into a new and improved version. As Whitman says in the preface to the 1855 edition of “Leaves of Grass,” people look to the poet “to indicate the path between reality and their souls.”

 

Thomas Jefferson saw utility in the cohesive national story. In her 2018 book “The Death of Truth,” Michiko Kakutani writes that Jefferson “spoke in his inaugural address of the young country uniting ‘in common efforts for the common good.’ A common purpose and a shared sense of reality mattered because they bound the disparate states and regions together, and they remain essential for conducting a national conversation.”

 

Today, Kakutani argues, we’re mired in “a kind of homegrown nihilism … partly a by-product of disillusion with a grossly dysfunctional political system that runs on partisan warfare; partly a sense of dislocation in a world reeling from technological change, globalization, and data overload; and partly a reflection of dwindling hopes among the middle class that the basic promises of the American Dream … were achievable … .” 

 

Whitman understood that transcending the national mood is an uphill climb. Periods of division and strife are baked into our democracy, part of the yin and yang, the theory goes, that sorts new realities into a renovated sense of purpose. Yet periods of upheaval must necessarily lead to a refitting, not an obliteration, of our common story, or democracy is toast.

 

Do Americans still have the chops to reimagine the song of ourselves? 

 

Lord knows, we've had the chops. In his 1995 book “The End of Education,” media critic Neil Postman writes: “Our genius lies in our capacity to make meaning through the creation of narratives that give point to our labors, exalt our history, elucidate the present, and give direction to our future.” But we do have to mobilize the genius.

 

That’s easier said than done in a time brimming with distraction, distress and division. It’s hard to say when or if we’ll devise a story with the oomph necessary to yank us up and over the walls of our gated communities of the mind—toward efforts that will make society more inclusive, the economy more equitable and life on the planet more sustainable. 

 

In the meantime, three cheers for the ebullient currents of American life Walt Whitman chronicled. On our best days, they can lift us above our brattish, self-defeating ways.

Cinema Paradiso: The Academy Museum of Motion Pictures Will Be The Home of Movies, Past and Present


 

Acclaimed film director Martin Scorsese was recently in the news for asserting that comic book movies are not “cinema.” In light of his upcoming historical film about union boss Jimmy Hoffa, Scorsese expanded on his claim that the massive push for superhero movies has affected young people's understanding of history, saying “they perceive even the concept of what history is supposed to be [differently].”

 

Whether it promotes a better understanding of history or not, film itself has remained a major cultural influence around the world for over a century now. Its larger historical impact is hard to measure. Soon, however, the first large-scale museum in the country solely dedicated to the history of motion pictures is set to open and attempt to do just that. 

 

The Academy Museum of Motion Pictures will be located at the corner of Wilshire Boulevard and Fairfax Avenue in Los Angeles. According to Jessica Niebel, the Exhibitions Curator for the museum, “the Academy Museum of Motion Pictures will be the world's premier institution devoted to the art and science of movies and moviemaking.” The museum is on track to open sometime in 2020 after several delays in construction.

 

The purpose of the museum is to encapsulate how film has changed over time. Moving pictures can be traced all the way back to the 19th century; in 1896 the Lumière brothers screened their roughly 50-second “Arrival of a Train,” one of the first films ever shown to the public. The film reportedly caused an uproar among audiences, who had never even conceived of what a moving picture could be. Since then film has grown as an art form, which is something that the Academy Museum aims to capture. Niebel hopes that the museum's programs and exhibitions will allow visitors to experience “how movies evolved and are made, and highlight artists, craftspeople, designers and technicians who make the movies possible.”

 

The 20th century was integral to the growth of film as an art form. Despite the fact that it was new, Niebel explains that film, “being the most democratic artform of the 20th century as it was available, affordable and attractive to the masses, had a very strong connect to cultural histories everywhere.” Hollywood’s growth as an industry was soon followed by the rapid growth of film industries in India with “Bollywood,” and later in Nigeria with “Nollywood.” The Academy Museum plans to showcase international film in its first major temporary exhibition, “an unprecedented retrospective on Hayao Miyazaki, which will be the first major exhibition in the United States of the work of the legendary Japanese filmmaker, organized in collaboration with Studio Ghibli.”

 

Visitors to the museum can expect to experience a wide variety of programs, including film programs, exhibitions, public programs, publications, and education programs. The six-story museum itself will be a resource for film education, featuring “more than 50,000 square feet of exhibition galleries, a state-of-the-art education studio, two film and performance theaters, a 34-foot high, double-height gallery for cutting-edge temporary installations, a restaurant and café, and public and special event spaces.” Some programming may involve hosting industry professionals and guest speakers to give insight into their experience with film and an insider look into how much of a collaborative process filmmaking really is. A recent New Yorker piece detailed how the American movie industry took off in the 20th century with the help of many groups that don’t get on-screen credit, especially women. Museum programming hopes to address that.

 

An official grand opening date has not been announced yet, despite the fact that it's been a long time coming. Plans for the construction of the museum were announced in 2012; there have been significant delays for the project in the seven years since. The Academy has chalked that up to the sheer feat involved in building it, including the renovation of a 1939 LA landmark (the May Company building), building a new spherical structure that includes a 1,500-panel glass dome, and joining them together. In a statement, the Academy said that “we have always chosen the path that would enhance the structure, even if that meant construction would take more time to complete,” and “we are weighing the overall schedule for major industry events in 2020, and on this basis will choose the optimal moment for our official opening.”

 

Once it finally opens, the museum will be the first of its kind in the US. As such, it has been very important for planners like Niebel to create an experience that is altogether unique with an eye to the future. That involves screening films in the formats in which they were intended to be seen, while also providing exhibitions that complement the screenings. Education will be an emphasis, as Niebel explained: “Film exhibitions cannot recreate the cinematic experience but translate it into another medium, that of the exhibition.” The Academy Museum has differentiated itself from other museums in that regard. History and art museums typically focus on one aspect of either education or visual display. Niebel maintains that the Academy Museum will be able to address both, because “film exhibitions are a ‘genre’ of their own in that they combine artifacts, film clips, design and immersive experiences to achieve not only educational aspects or aesthetic impressions, but wholistic experiences.”

 

The museum will have the advantage of access to the Academy of Motion Pictures Arts and Sciences Archive. The archive has been acquiring film since 1929 and currently holds over 190,000 materials, including many of the films nominated for Oscars in all categories and all of the award-winning films for Best Picture. Niebel confirmed that the museum will “draw on the unique intellectual and material resources of the Academy of Motion Picture Arts and Sciences.” It should be an interesting landmark for historians and movie buffs alike. Film has always been a kind of public history, a reflection of society and culture. The Academy Museum of Motion Pictures, once it opens, may capture that. Score one for Martin Scorsese. 

Why The West Is Losing The Fight For Democracy

 

Adapted from The Light That Failed by Ivan Krastev and Stephen Holmes, published by Pegasus Books. Reprinted with permission. All other rights reserved.

 

The future was better yesterday. We used to believe that the year 1989 divided ‘the past from the future almost as clearly as the Berlin wall divided the East from the West.’ We had ‘trouble imagining a world that is radically better than our own, or a future that is not essentially democratic and capitalist.’ That is not the way we think today. Most of us now have trouble imagining a future, even in the West, that remains securely democratic and liberal.

 

When the Cold War ended, hopes for liberal capitalist democracy spreading globally were high. The geopolitical stage seemed set for a performance not unlike George Bernard Shaw’s Pygmalion, an optimistic and didactic play in which a professor of phonetics, over a short period of time, succeeds in teaching a poor flower girl to speak like the Queen and feel at home in polite company.

 

Having prematurely celebrated the integration of the East into the West, interested observers eventually realized that the spectacle before them was not playing out as expected. It was as if, instead of watching a performance of Pygmalion, the world ended up with a theatrical adaptation of Mary Shelley’s Frankenstein, a pessimistic and didactic novel about a man who decided to play God by assembling replicas of human body parts into a humanoid creature. The defective monster felt doomed to loneliness, invisibility and rejection. And envying the unattainable happiness of its creator, it turned violently against the latter’s friends and family, laying their world to waste, leaving only remorse and heartbreak as legacies of a misguided experiment in human ​self-​duplication.

 

So, how did liberalism end up the victim of its heralded success in the Cold War? Superficially, the fault lay with a series of profoundly destabilizing political events: the 9/11 attack on the World Trade Center in New York, the second Iraq War, the 2008 financial crisis, Russia’s annexation of Crimea and intervention in Eastern Ukraine, the impotence of the West as Syria descended into a humanitarian nightmare, the 2015 migration crisis in Europe, the Brexit referendum, and the election of Donald Trump. Liberal democracy’s ​post-​Cold War afterglow has also been dimmed by the Chinese economic miracle, orchestrated by a political leadership that is unapologetically neither liberal nor democratic. Attempts to salvage the good name of liberal democracy by contrasting it favourably with ​non-​Western autocracy have been undercut by the feckless violation of liberal norms, as in the torture of prisoners, and the evident malfunctioning of democratic institutions inside the West itself. Tellingly, how democracies atrophy and perish has become the question that most preoccupies liberal scholars today.

 

The very ideal of ‘an open society,’ too, has lost its ​once​-fêted lustre. For many disillusioned citizens, openness to the world now suggests more grounds for anxiety than for hope. When the Berlin Wall was toppled, there were only sixteen border fences in the world. Now there are ​sixty​-five fortified perimeters either completed or under construction. According to Quebec University expert Elisabeth Vallet, almost a third of the world’s countries are rearing barriers along their borders. The three decades following 1989 turned out to be an ‘inter-​mural period’, a brief ​barricade-​free interval between the dramatic breaching of the Berlin Wall, exciting utopian fantasies of a borderless world, and a global craze of ​wall​-building, with cement and ​barbed-​wire barriers embodying existential (if sometimes imaginary) fears.

 

Most Europeans and Americans today also believe that the lives of their children will be less prosperous and fulfilling than their own. Public faith in democracy is plummeting and ​long-​established political parties are disintegrating or being crowded out by amorphous political movements and populist strongmen, putting into question the willingness of organized political forces to fight for democracy’s survival in times of crisis. Spooked by the phantom of ​large​-scale migration, electorates in parts of Europe and America are increasingly drawn to xenophobic rhetoric, authoritarian leaders and militarized borders. Rather than believing that the future will be uplifted by the liberal ideas radiating out of the West, they fear that ​21st-​century history will be afflicted by the millions of people streaming into it. Once extolled as a bulwark against tyranny, human rights are now routinely accused of limiting the ability of democracies to fight terrorism effectively. Fears for liberalism’s survival are so acute that references to William Butler Yeats’s ‘The Second Coming’, written in 1919 in the wake of one of the deadliest conflicts in human history, became an almost obligatory refrain for political commentators in 2016. A century after Yeats wrote them, these words are now the mantra of apprehensive defenders of liberal democracy worldwide: ‘Things fall apart; the centre cannot hold; / Mere anarchy is loosed upon the world.’

The History of Fire: An interview with Stephen Pyne

 

Stephen Pyne is an emeritus professor at Arizona State University. He has published 35 books, most of them dealing with fire, and others on Antarctica, the Grand Canyon, and the Voyager mission. With his oldest daughter, he wrote an inquiry into the Pleistocene. His fire histories include surveys of America, Australia, Canada, Europe (including Russia), and the Earth. To learn more about his work, please feel free to visit his website.

 

What made you want to be a historian?

I drifted into history. I enjoyed reading it, especially world history, as a kid, then the spark took when I realized that I understood things best through their history.  It was by reading Carl Boyer’s History of the Calculus, for example, that I finally appreciated what the numbers were all about.  The same has proven true for topics like fire, Antarctica, Grand Canyon, and the rest.  Then I realized that history could also be literature. That clinched it.

I saw that you used to be a wildland firefighter. Is this what made you want to study the history of fire?

A few days after graduating from high school, I was hired as a laborer at Grand Canyon National Park. While I was signing my papers, an opening appeared on the North Rim fire crew, and I was asked if I wanted to join. I'd never been to the North Rim, never worked around a flame bigger than a campfire, didn't even know the names of the basic tools, and of course said, Sure. It was a moment of biographical wind shear. I returned to the Longshots for 15 seasons, 12 as crew boss, then spent three summers writing fire plans for Rocky Mountain and Yellowstone National Parks. Then I went to Antarctica for a season. The North Rim fire crew and campus were separate lives: neither had much to do with the other. It took 10 years as a Longshot, and a Ph.D., before I finally brought the two worlds together. I decided to apply the scholarship I had been trained in to the subject that most animated me. Fire in America was the result. It's not a hard weld; the two lives have never fully fused. I have one line of books that continues the topics I studied in grad school, and another that deals with fire, particularly big-screen fire histories for America, Australia, Canada, Europe (including Russia), and the Earth. In some respects, I've been a demographic of one. But it's also like being on the American River in 1848 California. There are riverbeds of nuggets just waiting to be picked up.

 

How did personal experience influence your scholarship?

Obviously, as a topic.  I would never have thought of fire as a subject without those hopping seasons on the Rim.  They gave me woods credibility. More subtly, those years shaped how I think and speak about fire.  On a fire crew, you quickly appreciate how fires shape a season, and how fire seasons can shape a life.  It’s not a big step to wonder if the same might be true for humanity. After all, we are a uniquely fire creature on a uniquely fire planet.  Fire is what we do that no other species does. It makes a pretty good index of our environmental agency. You pick up a language, a familiarity, a sensibility toward fire – it’s a relationship, in some way.  Without anthropomorphizing fire, you learn to animate it – give it a presence. That’s what I can bring that someone else might not.  At the same time, I need to abstract the vernacular into more general concepts, which is what scholarship does. I’m constantly fluctuating between the two poles, a kind of alternating current. There are always trade-offs.  I begin with fire and construct a history.  I don’t begin with historiography – questions of interest to the history community – and use fire to illustrate them.  That makes my fire stuff different, and it means it can be hard to massage it into a more general history or a classroom.  I sometimes wonder if I invented a subject only to kill it.  

 

As I'm sure you are aware, wildfires recently ravaged the state of California, and bushfires continue to burn in Australia. How does your understanding of the history of fire shape how you think about these wildfires?

Ah, as long as California keeps burning it seems I’ll never be lonely.  I do a lot of interviews. Last year I finally wrote a fire primer for journalists and posted it on my website. California is built to burn and to burn explosively.  Against that, we have a society that is determined to live and work where and how it wants.  For a century California has buffered between hammer and anvil with a world-class firefighting apparatus.  But four fire busts in three years have – or should have – broken that strategy. It’s costly, it’s ineffective against the extreme events (which are the ones that matter), and it’s unfair to put crews at risk in this way.  Doing the same things at ever-higher intensities only worsens the conditions. Australia is California at a continental scale, only drier, and with winds that can turn the southeast into a veritable fire flume.  It has a long history of eruptive fires, but the 2009 Black Saturday fires and the Forever fires burning now feel different.  They are more savage, more frequent, and more disruptive.  Australia is also a firepower because it has a cultural connection to fire at a range and depth, from art to politics, that I haven’t found elsewhere.  Australia’s foresters were the first to adopt controlled burning as a strategy for protection – their experience makes a fascinating contrast to what happened in the U.S. Both places show the importance of framing the problem.  Fire is a creation of the living world, but we have defined it as a phenomenon for physics and chemistry, which leads us to seek physical solutions like dropping retardants and shoving hydrocarbons around.  We haven’t really thought about nuanced ecological engineering. We’ve mostly ignored the ideas and institutions that shape the social half of the equation. We’ve neglected how these scenes are historically constructed, how they carry a long evolution that doesn’t derive from first principles. For that matter, the intellectual history of fire is relevant because fire as an integral subject was a casualty of the Enlightenment.  We have no fire department at a university except the one that sends emergency vehicles when an alarm sounds. There is a lot here for historians to chew on.  But it isn’t enough to problematize. We have to show how our analysis can lead to problem-solving.

 

Is climate change alone enough to explain wildfires or do we need to understand more about history to understand why they are such a problem? 

There are many ways to get big fires.  Presently, climate change is serving as a performance enhancer.  In the 19th and early 20th centuries, megafires an order of magnitude greater than those of today were powered by logging and land clearing slash.  Climate integrates many factors, so does fire, and when you stir those two sloppy variables together, it’s tricky to attribute particular causes to the stew of effects. If you make fire an informing principle, a narrative axis, you find that the shift to burning fossil fuels, what I think of as the pyric transition, unhinged Earth’s fire regimes even without climate change.  The conversion has rolled over habitat after habitat. Land use and humanity’s fire practices, for example, are hugely important in interacting with climate to shape fires. But most of those changes also trace back to fossil fuels. Basically, we’re burning our combustion candle at both ends. How lithic landscapes and living landscapes interact has not been something fire ecology or physics has considered.  They dread dealing with humans because people muck up the models. You have to come at those notions sideways, you have to view the scene from outside the disciplinary prisms that we’re trained in.  Paradoxically perhaps, the humanities may be better positioned to make conceptual contributions than the natural sciences.

 

How do you think the field of environmental history will change as the climate crisis becomes a more and more pressing issue?

The crisis is spooky.  But I reject the notion that we are heading into a no-narrative, no-analog future.  With fire, I can offer a pretty substantial narrative – it’s one of the oldest humanity has.  In truth, I now regard climate history as a sub-narrative of fire history. And I’ve come to imagine our evolving fire age as something comparable to the ice ages of the Pleistocene.  That’s a crisp analog. Changing sea levels, mass extinction, wholesale upheavals of biotas, regions reconstructed with the fire-equivalent of ice sheets and pluvial lakes – it’s all there.  For me, the Anthropocene extends across the whole of the Holocene, and from a fire perspective, the Anthropocene could be usefully renamed the Pyrocene. There are other helpful analogs out there.  The problem of powerline-kindled wildfires is very similar to that of railroad fires in the past.  The problem of fringe communities burning (the fatuously named wildland-urban interface) replays the chronicle of settlement fires in the 19th century.  Then it was agricultural colonization; now, an urban reclamation of rural lands. We have pretty good examples of how we might cope. The WUI problem got defined by the wildland fire community which saw houses complicating their management of land, but it makes more sense to pick up the other end of the stick and define these places as urban settings with peculiar landscaping.  The aptest analogy is not to wildland fire but to urban fire. Do that, and it’s obvious what we need to do to reduce the havoc. History also holds lessons beyond data and techniques.  How do we live in a contingent world about which we have incomplete knowledge?  That’s a topic for stories and characters, not algorithms. Mostly, though, the sciences and fire folk don’t credit history with much analytical power.  Historians deal with anecdotes. Historians are good for yearbooks and court poetry. The critics are wrong, but it can be a tough slog, like mopping up in mixed conifer duff. Still, when I began, fire was a fringe topic.  Now it’s a global concern.  People want context, and that’s what history provides, which has created a context for what I do.  It’s been an interesting ride.     

 

I also saw that you specialized in the history of exploration. Have you been able to intertwine that interest with your knowledge of fire?

I went to grad school to study geology, western history, exploration – stuff relevant to my life on the Rim. All my applications were rejected. Then, serendipitously, it was suggested I apply to William Goetzmann in the American Civ program at UT-Austin. He accepted me and mostly left me alone. At the time he was playing with the idea of a second great age of discovery. I quickly added a third and have used it as an organizing principle – a kind of conceptual rebar – for a series of books, including The Ice, How the Canyon Became Grand, and Voyager. I've finally systematized the grand schema into The Great Ages of Discovery, now headed for publication. I see exploration as a cultural movement and, for the West, a kind of quest narrative. My exploration books do better critically and commercially than my fire books. Exploration has a literary tradition; fire, outside of disaster and battlefield accounts, doesn't. It's been helpful to have two themes – puts me back into the two-cycle rhythms I knew in my rim-campus days, keeps me from getting too stale in either one. If I were starting over, I'd write the two series under different names.

 

In 2012, you wrote a book with your daughter, The Last Lost World. What was it like working with her, and what kind of expertise did she bring to the book?

My oldest daughter, Lydia, was attracted to archaeology and paleoanthropology and went to graduate school to study them.  Then she decided that the history of the field was more appealing. For years we joked about writing a book together sometime.  She graduated in 2009, a horrible moment for a new Ph.D., so I thought that the time had come. I had a sabbatical semester, and we took her topic and wrote The Last Lost World.  I knew about a few of the subjects, and Lydia the others, and she directed our research.  I credit her with keeping us (me) on message. I have good memories of the project, particularly the weeks we spent at our mountain cabin revising. (She’s gone on to a successful career as a writer; her latest, Genuine Fakes: How Phony Things Teach Us About Real Stuff, is just out.)  And, yes, we’re still speaking to each other.

 

Is there any other information you want people to know about you or the work that you do?

Because I ended up in a school of life sciences, I didn’t do much with graduate students.  Historians didn’t want someone from biology on their committee, and biologists didn’t want a historian.  Eventually, I decided to offer a course on nonfiction writing, and then wrote a couple of books about writing books.  Call it my postmodern phase.

A New History of American Liberalism

 

Historians interested in the history of political philosophies would do well to read James Traub's new book What Was Liberalism? The Past, Present, and Promise of a Noble Idea. Traub's ambitious book documents how liberalism has evolved over time. For John Stuart Mill and Thomas Jefferson, it was mostly a check on government's power and the preservation of individuals' liberty. For Theodore Roosevelt and Franklin Roosevelt, liberalism meant using government's power to regulate business and promote social welfare. Recently it has been more about government policies for equality and inclusion.

 

But Traub frets that liberalism has foundered. People's faith in government has eroded. Working and middle class people have become disenchanted. Traub recounts that George Wallace, running for president in 1964, rebuked what he called liberal elites. "They have looked down their noses at the average man in the street too long," Wallace cried. That alienation grew and culminated in Donald Trump's 2016 campaign. Trump, as president, keeps hammering at liberal values. In addition, in Traub's view, Trump poses a threat to political civility, free speech, and the rule of law.

 

Traub's book begins with a chapter on "Why Liberalism Matters."  He concludes it with a chapter on "A Liberal Nationalism" that laments liberalism's eclipse but looks for signs of its potential resurgence. He finds faint hope in liberalism's history of adaptation and resurgence. Liberals need to identify with opportunity, national confidence, inclusion, civility. They need to be seen as champions of "the public good." 

 

In the book's last sentence, he says, "Liberalism will renew itself only if enough people believe that its principles are worth fighting for."

 

James Traub's sense of concern contrasts sharply with the buoyancy, optimism, self- confidence and determination that helped make liberalism a success in the past.

 

In his odyssey through liberalism's history, Traub brings in Hubert H. Humphrey (1911-1978), longtime Democratic senator from Minnesota, Vice President (1965-1969) and the party's unsuccessful presidential nominee in 1968. Traub draws extensively on Humphrey's autobiography, The Education of a Public Man: My Life in Politics, published in 1976. A better book, though, for insight into the liberal mind at the movement's high tide in the 1960s -- and a contrast to liberal disarray and doldrums today -- is Humphrey's 1964 book The Cause is Mankind: A Liberal Program for Modern America.

 

Humphrey exudes optimism and confidence: liberals know what the nation needs and are determined to secure it.  “The enduring strength of American liberalism," Humphrey wrote in the book, "is that it recognizes and welcomes change as an essential part of life, and moves to seize rather than evade the challenges and opportunities that change presents. It is, basically, an attitude toward life rather than a dogma—characterized by a warm heart, an open mind, and willing hands.”

 

To be sure, there were lots of challenges. "We are living in an age when America seems to be bursting with issues and problems. We must secure civil rights for all our citizens. We must end poverty. Our economy must grow in all parts of the country. Automation and technology must create new jobs, not more jobless...We must rebuild our cities, revitalize our rural areas... We must conserve our natural resources."

 

But in Humphrey's sunny view of things, challenges were little more than opportunities for reform. Nothing was too difficult for Americans, who needed only to endorse bold government initiatives engineered by the liberal spirit. Humphrey's book included a chapter on planning. He had proposals to streamline the work of Congress. He had proposals for reconciling big business and labor and preserving competition. "The chief economic role of government," he wrote, "must be the smoothing of the way for new men and new ideas."

 

He endorsed the "welfare state" and government's responsibility to ensure human dignity and a decent standard of living for everyone. He proposed a massive "war on poverty" that was bolder than what President Lyndon Johnson was sponsoring. Better agricultural policies would preserve family farms and at the same time provide food in abundance. Federal education aid would boost that sector.

 

Civil rights, something Humphrey had championed for years, would keep progressing.

 

Sometimes, it would take experimentation and improvisation. No matter, said Humphrey. Americans were good at that.

 

Humphrey did not foresee the war in Vietnam and the other events that would soon undercut the Johnson/Humphrey liberal domestic agenda. In his book, it was all about forward momentum into a beckoning, bracing future. American liberalism "sees free competitive enterprise as the mainspring of economic life and is dedicated to the maintenance of the traditional freedoms of speech, of the press, of assembly and the like." But it is behind "the use of power of the state to achieve both freedom and a reasonable measure of equality."

 

Liberals looking for an enlightening but sobering account of their movement's history should read James Traub. 

 

But liberals should also read Hubert Humphrey for some much-needed inspiration.

Capitalism Versus Socialism: Did Capitalism Really Win?

 

In a recent op-ed, “I Was Once a Socialist. Then I Saw How It Worked,” conservative columnist David Brooks wrote, “We ran that social experiment [between capitalism and socialism] for 100 years and capitalism won.” But did capitalism really win? As I indicated in a chapter (“Capitalism, Socialism, and Communism”) in my An Age of Progress? Clashing Twentieth-Century Global Forces, history tells a more complicated story.

 

As sociologist Max Weber and conservative economist Milton Friedman have told us, the primary purpose of capitalism is earning a profit. Friedman even believed it was business’s main “social responsibility.” Sociologist Daniel Bell wrote that capitalism has “no moral or transcendental ethic.” 

 

 

Nineteenth-century capitalism provided no adequate answers for how to deal with such problems as unsafe working conditions, unfair business practices, pollution, public health, slum housing, or the abuse of child labor. Nor did it deal with the distribution of income, leaving unanswered whether great poverty should exist alongside great wealth.

 

 

At that time socialism, especially Marxist socialism, arose as the major challenger of capitalism. By 1917, socialism had split into two principal types, the communist variety led by Vladimir Lenin--later formalized in the name Union of Soviet Socialist Republics (USSR)--and a revisionist type, more common in western Europe, that advocated democratic, reformist means of obtaining power. The goal of such socialism was primarily greater economic equality through such steps as government ownership of at least the chief means of production. In general, democratic socialists wished to control and regulate the economy in behalf of the entire population. In the German parliamentary election of 1912, the German Social Democratic Party received more votes than any other party. 

 

 

Marx’s ideas also influenced trade-unionism and other isms that challenged nineteenth-century capitalists. One such was Progressivism, a cross-fertilizing trans-Atlantic movement that was powerful enough in the USA to produce the Progressive Era from 1890 to 1914.  

 

 

But Progressivism attempted not to overthrow or replace capitalism but to constrain and supplement it in order to ensure that it served the public good. As Daniel Rodgers indicates in his Atlantic Crossings: Social Politics in a Progressive Age (2000), it was a diverse movement "to limit the socially destructive effects of morally unhindered capitalism, to extract from those [capitalist] markets the tasks they had demonstrably bungled, to counterbalance the markets' atomizing social effects with a countercalculus of the public weal [well-being]."

 

 

After World War I and a dozen years of Republican rule, progressivism returned with the election of Franklin Roosevelt (FDR) in 1932 and his subsequent implementation of the New Deal. 

 

From FDR's time to our own, the U.S. economy, as our State Department indicated in 2001, has been "mixed": while "the United States is often described as a 'capitalist' economy," it "is perhaps better described as a 'mixed' economy, with government playing an important role along with private enterprise."

 

Following World War II, many western European economies also became mixed as they introduced more welfare-state provisions--social programs assisting the poor, old, sick, and unemployed. Generally, compared to the USA, western Europe gave more assistance to the needy and the government controlled more of the economy. 

 

 

But in both the USA and western Europe the degree of government regulation has fluctuated considerably. Prominent European democratic socialists Chancellor Willy Brandt of West Germany and President Francois Mitterrand of France led their countries from 1969 to 1974 and 1981 to 1995, respectively. Conversely, in the 1980s Ronald Reagan in the USA and Margaret Thatcher in Britain cut back the government regulation of capitalism. While progressivism, the New Deal, and welfare-state policies inched liberals closer to democratic socialism, the Reagan-Thatcher opposition to "creeping socialism" and "big government" encouraged conservatives to place more trust in capitalism or "the free market."

 

Today, not only are there very different types of capitalism in different countries--Brooks, for example, distinguishes “between a version of democratic capitalism, found in the U.S., Canada and Denmark, and forms of authoritarian capitalism, found in China and Russia”--but U. S. capitalism today is not the same as it was before the Progressive Era. Nor is democratic socialism in Europe the same as it was in 1912, when the German Social Democratic Party demonstrated its popularity.

 

A fair and unbiased review, "Democratic Socialist Countries 2019," concludes that most prominent European countries contain a mix of capitalist and democratic-socialist elements. For example, it states, "Norway, like other Scandinavian countries, is . . . [not] fully socialist nor fully capitalist."

 

Thus, Brooks is too simplistic when he writes “capitalism won.” If one equates socialism with communism, as Brooks comes close to doing, then saying capitalism won over socialism is closer to the truth. For Soviet-style communism did collapse by 1991 with the disintegration of the USSR. Still, that’s not completely accurate because western economic-political systems are more mixed than purely democratic capitalist.

 

Brooks' narrow view of socialism and triumphant take on capitalism require a historical corrective because of three significant problems.

 

The first is the political use of simplistic capitalism-vs.-socialism propaganda. Socialism and socialist have long been scare words conservatives hurl at opponents. Social Security? Socialist! Medicare? Socialist! In the 1960s when Congress debated Medicare, the American Medical Association (AMA) hired Ronald Reagan to speak out against it on a record entitled “Ronald Reagan Speaks Out Against Socialized Medicine.”

 

More recently, Donald Trump stated that socialism “promises unity, but it delivers hatred and it delivers division. Socialism promises a better future, but it always returns to the darkest chapters of the past. That never fails. It always happens. Socialism is a sad and discredited ideology rooted in the total ignorance of history and human nature.” His campaign has claimed that “Bernie Sanders has already won the debate in the Democrat primary, because every candidate is embracing his brand of socialism.”

 

The second problem is that our political-economic development has made us more of a mixed system than a capitalistic one, thanks to such developments as Progressivism, the New Deal, Lyndon Johnson's "Great Society," and Medicare.

 

The third main problem is that capitalism’s main aim of profit-seeking still needs (in Bell’s words) a higher “moral or transcendental ethic.” Seeking the common good is one such ethic that has often been suggested, from the Progressive era to, more recently, Pope Francis and Nobel Prize–winning economist Joseph Stiglitz.

 

In previous articles such as "BP, Corporations, Capitalism, Progressivism, and Government" and "The Opioid Crisis and the Need for Progressivism," I dealt with the profit-at-all-cost operations of corporations such as British Petroleum (BP) and Purdue Pharma. I also addressed why governments need to regulate companies responsible for oil spills and opioid deaths to ensure they serve the common good.

 

 

In the last two years, two more notable cases of insufficient corporate concern for anything but profits emerged, again demonstrating the need for government oversight. 

 

 

The first involves Pacific Gas & Electric (PG&E). In early December 2019, it agreed to pay $13.5 billion for causing Northern California wildfires that destroyed thousands of homes and businesses and killed more than 100 people in just two of the fires. The LA Times noted that critics believe the company put "short-term profits before necessary safety measures." A Marxist publication's headline read "Profits before people: Why PG&E turned off the lights in California."

 

The second case involves aircraft manufacturer and defense contractor Boeing. In late 2018 and early 2019, two Boeing 737 Max 8 jets crashed, killing 346 people. Again, profits were placed before safety. One Daily Beast article declared that "top managers in the company's Chicago headquarters, more alert to Wall Street than airline safety, gave priority to stock buybacks and shareholder returns."

 

Again, government regulation of Boeing was insufficient. Pilot Captain Chesley "Sully" Sullenberger, depicted by Tom Hanks in the movie "Sully: Miracle on the Hudson," criticized the USA for not providing sufficient funds to the Federal Aviation Administration (FAA), leaving it ill-equipped to oversee aircraft safety. Instead, Sully charged, the FAA allowed Boeing itself to certify that it was meeting safety standards. "To make matters worse, there is too cozy a relationship between the industry and the regulators. And in too many cases, FAA employees who rightly called for stricter compliance with safety standards and more rigorous design choices have been overruled by FAA management, often under corporate or political pressure."

 

Thus, history demonstrates that the assertion that capitalism "won" is inaccurate. The irresponsibility of corporations like BP, Purdue Pharma, PG&E, and Boeing indicates the ongoing cost of unregulated capitalism. Today, President Trump continues to assail the type of enlightened regulation produced by the Progressive Era and FDR. The main task of voters in 2020 and thereafter will not be to choose whether we want capitalism or democratic socialism. Rather, it will be to decide the best pragmatic mix of the two (and best presidential candidate) to further the common good.

 

Have Americans Usually Supported Their Wars?

Editor's note January 7, 2020: After recent events in Iraq, many people across the globe are worried that the United States and Iran might go to war. Many attended anti-war protests over the weekend. Here is helpful historical context. 

Related Links

●  Lawrence F. Kaplan: What the Public Will Stomach

● Stanley Karnow: Did Public Opinion Really Shift After Tet? 

It seemed to come as a bit of a shock to the Bush administration that Americans turned against the war in Iraq. Just as administration officials weren't prepared for the fierce resistance that developed to the American occupation of Iraq, neither were they prepared for a change in public opinion at home as the war dragged on. Cindy Sheehan's galvanizing protests outside the president's vacation home caught officials by surprise.

But throughout American history there has always been significant opposition to war. New England states threatened to secede from the union during the War of 1812, which severely hampered the trade carried in New England ships. The Mexican-American War was opposed by leading Whigs including Congressman Abraham Lincoln. During the Civil War President Lincoln faced the opposition of the Copperheads, many of whom he had thrown in jail. The Spanish-American War triggered a robust anti-imperialist movement led by William Jennings Bryan.

In the twentieth century, as Hazel Erskine demonstrated in her widely cited 1970 article, "Was War a Mistake?" (Public Opinion Quarterly), "the American public has never been sold on the validity of any war but World War II." She noted that as of 1969--a year after the Tet Offensive and the brief invasion of the American embassy in Saigon--"in spite of the current anti-war fervor, dissent against Vietnam has not yet reached the peaks of dissatisfaction attained by either World War I or the Korean War."

Reviewing the wars of the 20th century, she noted: "In answer to quite comparable questions, in 1937 64 per cent called World War I a mistake, in 1951 62 per cent considered Korean involvement wrong, whereas in 1969 burgeoning anti-Vietnam dissent nationwide never topped 58 per cent."

Her charts demonstrated the extent of American opposition to each of these wars.

Winter Isn’t Coming. Prepare for the Pyrocene.

A massive fire burns the Amazon in Brazil

 

 

Millions of acres are burning in the Arctic, thousands of fires blaze in the Amazon, and with seemingly endless flareups in between, from California to Gran Canaria – fire seems everywhere, and everywhere dangerous and destabilizing. With a worsening climate, the fires dappling Earth from the tropics to the tundra appear as the pilot flames of an advancing apocalypse.  To some commentators, so dire, so unprecedented are the forecast changes that they argue we have no language or narrative to express them.  

 

Actually, the fire scene is worse than the headlines and breathless commentaries suggest because it is not just about bad burns that crash into towns and trash countrysides.  It’s equally about the good fires that have vanished because they are suppressed or no longer lit.  More of the world suffers from a famine of good fires than from a surfeit of bad ones; the bad ones are filling a void; they are not so much wild as feral.  

 

Underwriting both is that immense inflection in which humans turned from burning living landscapes to burning lithic ones in the form of fossil fuels.  That is the Big Burn of today, acting as a performance enhancer on all aspects of fire’s global presence.  So vast is the magnitude of these changes that we might rightly speak of a coming Fire Age equivalent in stature to the Ice Ages of the Pleistocene.  Call it the Pyrocene.

 

So there does exist a narrative, one of the oldest known to humanity, and one that has defined our distinctive ecological agency. It’s the story of fire.  Earth is a uniquely fire planet – it has been since life clambered onto the continents.  Equally, humans are a uniquely fire creature, not only the keystone species for fire but a species monopolist over its manipulation.  The fires in the Arctic testify to the planetary antiquity of fire.  Nearly all are kindled by lightning and burn biotas nicely adapted to fire; many could be suppressed, but extinguishing them will only put off, not put out, the flames. By contrast, the fires in the Amazon bear witness to a Faustian pact that hominins made with fire so long ago it is coded into our genome.  They are set by people in circumstances that people made, well outside ecological barriers and historical buffers.

 

This is a narrative so ancient it is prelapsarian. Our alliance with fire has become a veritable symbiosis.  We got small guts and big heads because we learned to cook food.  We went to the top of the food chain because we learned to cook landscapes.  Now we have become a geological force because we have begun to cook the planet.  We have taken fire to places and times it could never have reached on its own, and it has taken us everywhere, even off world. We have leveraged fire; fire has leveraged us.

 

How this happened is a largely hidden history – hidden in plain sight. Fire disappeared as an integral subject about the time we hid fire away in Franklin stoves and steam engines. (The only fire department at a university is the one that sends emergency vehicles when an alarm sounds.) It lost standing as a topic in its own right. As with the fires of today, its use in history has been to illustrate other themes, not to track a narrative of its own.

 

Yet how the present scene came to be is clear enough in its general contours. How, outfitted with firesticks, early humans could take over select biotas. How, with axes and plows and livestock as fire fulcrums, societies could recode the patches and pulses of vast swathes of land for agriculture. How, hungering for ever more firepower, we turned from burning living landscapes to burning lithic ones – once-living biomass converted over eons into oil, gas, lignite, and coal. Our firepower became unbounded.

 

That is literally true.  The old quest for sources has morphed into one for sinks.  The search for more stuff to burn has become a problem of where to put all the effluent.  Industrial combustion can burn without any of the old ecological checks-and-balances: it can burn day and night, winter and summer, through drought and deluge.  We are taking stuff out of the geologic past and unleashing it into the geologic future.  

 

It’s not only about changing climate, or acidifying oceans. It’s about how we live on the land. Land use is the other half of the modern dialectic of fire on Earth, and when societies shift to fossil fuels, they alter the way they inhabit landscapes.  They rely on industrial pyrotechnologies to organize agriculture, transportation, urban patterns, even nature reserves, all of which tend to aggravate the hazards from bad fire and complicate the reintroduction of good fire. The many conflagrations sparked by powerlines nicely capture the pyric collision between living and lithic landscapes. Still, even if fossil-fuel combustion were tamed, we would yet have to work through our deranged relationship to fires on living landscapes.

 

Because fire is a reaction, not a substance, the scale of our fire-induced transformations can be difficult to see.  But we are fashioning the fire-informed equivalents of ice sheets, mountain glaciers, pluvial lakes, outwash plains, and of course changing sea levels, not to mention sparking wholesale extinctions.  Too much bad fire, too little good, too much combustion overall: it’s an ice age for fire.  The Pyrocene is moving from metaphor to descriptor.

 

It’s all there: narrative, analogue, explication.  A couple of centuries ago we began hiding our fires in machines and off site, which can make it difficult for modern urbanites to appreciate how profoundly anthropogenic fire practices inform Earth today.  We use the rampaging flames to animate other agendas, not to understand what fire is telling us.  But fire, the great shape-shifter, is fast morphing beyond our grasp.  

 

What does a full-blown fire age look like?  We’re about to find out.

 

Roundup Top 10!  

 

Why Did the U.S. Kill Suleimani?

by Elizabeth Cobbs and Kimberly C. Field

The attack illustrates America’s lack of a clear grand strategy — and why we need one immediately.

 

War with Iran is not inevitable — but the U.S. must change course

by Kelly J. Shannon

The relationship between the countries, once friends and allies, has soured — because of U.S. aggression.

 

 

The Global War of Error

by Tom Engelhardt

Failure is the new success, and that applies as well to the “industrial” part of the military-industrial complex.

 

 

Putin’s Big Historical Lie

by Anne Applebaum

In a series of comments in late December, the Russian president appeared to blame Poland for the outbreak of the Second World War.

 

 

The Colonization of Puerto Rico and the Limits of Impeachment

by DJ Polite

It is misleading to call impeachment "justice" when it reflects the priorities of empire.

 

 

It’s 1856 All Over Again

by Steve Inskeep

Immigration. Race. Demographic change. Political demagogy. That year’s presidential race had it all. What can it tell us about 2020?

 

 

Before the ‘Final Solution’ There Was a ‘Test Killing’

by Kenny Fries

Too few know the history of the Nazis’ methodical mass murder of disabled people. That is why I write.

 

 

Yes, Bernie Sanders Could Be the Nominee—and It Would Be an Epic Nightmare for Democrats

by Ronald Radosh

Sanders is where he is today in part because no one has really attacked him. But just wait until Republicans spend a billion dollars painting him as an extremist.

 

 

Why policymakers must act to preserve information freedom at home and abroad

by Diana Lemberg

A Cold War dichotomy pitting capitalism and democracy against state control misreads history.

 

 

Evangelicals using religion for political gain is nothing new. It is a US tradition

by Reverend William Barber

No one who has read US history can be surprised by the hypocrisy of Evangelicals for Trump, but it also tells us how their undoing will inevitably come.

 

A History of Climate Change Science and Denialism

The girl got up to speak before a crowd of global leaders. “Coming here today, I have no hidden agenda. I am fighting for my future. Losing my future is not like losing an election or a few points on the stock market. I am here to speak for all generations to come.” She continued: “I have dreamt of seeing the great herds of wild animals, jungles and rainforests full of birds and butterflies, but now I wonder if they will even exist for my children to see. Did you have to worry about these little things when you were my age? All this is happening before our eyes.” She challenged the adults in the room: “parents should be able to comfort their children by saying ‘everything’s going to be alright’, ‘we’re doing the best we can’ and ‘it’s not the end of the world’. But I don’t think you can say that to us anymore.”

 

No, these were not Greta Thunberg’s words earlier this year. This appeal came from Severn Suzuki at the Rio Earth Summit back in 1992. In the 27 years since, we have produced more than half of all the greenhouse gas emissions in history.

 

Reading recent media reports, you could be forgiven for thinking that climate change is a sudden crisis. From the New York Times: “Climate Change Is Accelerating, Bringing World ‘Dangerously Close’ to Irreversible Change.” From the Financial Times: “Climate Change is Reaching a Tipping Point.” If the contents of these articles have surprised Americans, it reveals far more about the national discourse than any new climate science. Scientists have understood the greenhouse effect since the 19th century. They have understood the potential for human-caused (anthropogenic) global warming for decades. Only the fog of denialism has obscured the long-held scientific consensus from the general public.

 

Who knew what when?

Joseph Fourier was Napoleon’s science adviser. In the early 19th century, he studied the nature of heat transfer and concluded that given the Earth’s distance from the sun, our planet should be far colder than it was. In an 1824 work, Fourier explained that the atmosphere must retain some of Earth’s heat. He speculated that human activities might also impact Earth’s temperature. Just over a decade later, Claude Pouillet theorized that water vapor and carbon dioxide (CO2) in the atmosphere trap infrared heat and warm the Earth. In 1859, the Irish physicist John Tyndall demonstrated empirically that certain molecules such as CO2 and methane absorb infrared radiation. More of these molecules meant more warming. Building on Tyndall’s work, Sweden’s Svante Arrhenius investigated the connection between atmospheric CO2 and the Earth’s climate. Arrhenius devised mathematical rules for the relationship. In doing so, he produced the first climate model. He also recognized that humans had the potential to change Earth’s climate, writing “the enormous combustion of coal by our industrial establishments suffices to increase the percentage of carbon dioxide in the air to a perceptible degree.”
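
Arrhenius’s rule can be restated in modern shorthand. As a simplified sketch rather than his own equations, the relationship he identified is logarithmic, meaning each doubling of CO2 adds roughly the same increment of warming:

\[ \Delta T \approx S \cdot \log_2\left(\frac{C}{C_0}\right) \]

Here C0 is the starting CO2 concentration, C the new concentration, and S the warming produced by each doubling: the “climate sensitivity” that the Charney Report, discussed below, would later estimate at about 3°C.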

Later scientific work supported Arrhenius’ main conclusions and led to major advancements in climate science and forecasting. While Arrhenius’ findings were discussed and debated in the first half of the 20th century, global emissions rose. After WWII, emission growth accelerated and began to raise concerns in the scientific community. During the 1950s, American scientists made a series of troubling discoveries. Oceanographer Roger Revelle showed that the oceans had a limited capacity to absorb CO2. Furthermore, CO2 lingered in the atmosphere for far longer than expected, allowing it to accumulate over time. At the Mauna Loa Observatory, Charles David Keeling conclusively showed that atmospheric CO2 concentrations were rising. Before John F. Kennedy took office, many scientists were already warning that current emissions trends had the potential to drastically alter the climate within decades. Revelle described the global emissions trajectory as an uncontrolled and unprecedented “large-scale geophysical experiment.”

 

In 1965, President Johnson received a report from his science advisory committee on climate change. The report’s introduction explained that “pollutants have altered on a global scale the carbon dioxide content of the air.” The scientists explained that they “can conclude with fair assurance that at the present time, fossil fuels are the only source of CO2 being added to the ocean-atmosphere-biosphere system.” The report then discussed the hazards posed by climate change including melting ice caps, rising sea levels, and ocean acidity. The conclusion from the available data was that by the year 2000, atmospheric CO2 would be 25% higher than pre-industrial levels, at 350 parts per million. 

 

The report was accurate except for one detail. Humanity increased its emissions faster than expected and by 2000, CO2 concentrations were measured at 370 parts per million, nearly 33% above the pre-industrial baseline of roughly 280 parts per million.

 

Policymakers in the Nixon Administration also took notice of the mounting scientific evidence. Adviser Daniel Patrick Moynihan wrote to Nixon that it was “pretty clearly agreed” that CO2 levels would rise by 25% by 2000. The long-term implications of this could be dire, with rising temperatures and rising sea levels, “goodbye New York. Goodbye Washington, for that matter,” Moynihan wrote. Nixon himself pushed NATO to study the impacts of climate change. In 1969, NATO established the Committee on the Challenges of Modern Society (CCMS) partly to explore environmental threats.

 

The Clinching Evidence

By the 1970s, the scientific community had long understood the greenhouse effect. With increasing accuracy, they could model the relationship between atmospheric greenhouse gas concentrations and Earth’s temperature. They knew that CO2 concentrations were rising, and human activities were the likely cause. The only thing they lacked was conclusive empirical evidence that global temperature was rising. Some researchers had begun to notice an upward trend in temperature records, but global temperature is affected by many factors. The scientific method is an inherently conservative process. Scientists do not “confirm” their hypothesis, but instead rule out alternative and “null” hypotheses. Despite the strong evidence and logic for anthropogenic global warming, researchers needed to see the signal (warming) emerge clearly from the noise (natural variability). Given short-term temperature variability, that signal would take time to fully emerge. Meanwhile, as research continued, other alarming findings were published.

 

Scientists knew that CO2 was not the only greenhouse gas humans had put into the atmosphere. During the 1970s, research by James Lovelock revealed that levels of human-produced chlorofluorocarbons (CFCs) were rapidly rising. Used as refrigerants and propellants, CFCs were 10,000 times as effective as CO2 in trapping heat. Later, scientists discovered CFCs also destroy the ozone layer.

 

In 1979, at the behest of America’s National Academy of Sciences, MIT meteorologist Jule Charney convened a dozen leading climate scientists to study CO2 and climate. Using increasingly sophisticated climate models, the scientists refined estimates for the scale and speed of global warming. The Charney Report’s foreword stated, “we now have incontrovertible evidence that the atmosphere is indeed changing and that we ourselves contribute to that change.” The report “estimate[d] the most probable global warming for a doubling of CO2 to be near 3°C.” Forty years later, newer observations and more powerful models have supported that original estimate. The researchers also forecast that CO2 levels would double by the mid-21st century. The report’s expected rate of warming agreed with numbers posited by John Sawyer of the UK’s Meteorological Office in a 1972 article in Nature. Sawyer projected warming of 0.6°C by 2000, which also proved remarkably accurate.

 

Shortly after the release of the Charney Report, many American politicians began to oppose environmental action. The Reagan Administration worked to roll back environmental regulations. Driven by a radical free-market ideology, they gutted the Environmental Protection Agency and ignored scientific concerns about acid rain, ozone depletion, and climate change.

 

However, the Clean Air and Clean Water Acts had already meaningfully improved air and water quality. Other nations had followed suit with similar anti-pollution policies. Interestingly, the success of these regulations made it easier for researchers to observe global warming trends. Many of the aerosol pollutants had the unintended effect of blocking incoming solar radiation. As a result, they had masked some of the emissions-driven greenhouse effect. As concentrations of these pollutants fell, a clear warming trend emerged. Scientists also corroborated ground temperature observations with satellite measurements. In addition, historical ice cores provided independent evidence of the CO2-temperature relationship.

 

Sounding the Alarm

Despite his Midwestern reserve, James Hansen brought a stark message to Washington on a sweltering June day in 1988. “The evidence is pretty strong that the greenhouse effect is here.” Hansen led NASA’s Goddard Institute for Space Studies (GISS) and was one of the world’s foremost climate modelers. In his Congressional testimony, he explained that NASA was 99% certain that the observed temperature changes were not natural variation. The next day, the New York Times ran the headline “Global Warming Has Begun, Expert Tells Senate.” Hansen’s powerful testimony made it clear to politicians and the public where the scientists stood on climate change.

 

Also in 1988, the United Nations Environmental Programme (UNEP) and the World Meteorological Organization (WMO) created the Intergovernmental Panel on Climate Change (IPCC). Its mandate was to study both the physical science of climate change and the numerous effects of the changes. To do that, the IPCC evaluates global research on climate change, adaptation, mitigation, and impacts. Thousands of leading scientists contribute to IPCC assessment reports as authors and reviewers. IPCC reports represent one of the largest scientific endeavors in human history and showcase the scientific process at its very best. The work is rigorous, interdisciplinary, and cutting edge.

 

While the IPCC has contributed massively to our understanding of our changing world, its core message has remained largely unchanged for three decades. The First Assessment Report (FAR) in 1990 stated “emissions resulting from human activities are substantially increasing the atmospheric concentrations of the greenhouse gases.”  Since then, the dangers have only grown closer and clearer with each report. New reports not only forecast hazards but describe the present chaos too. As the 2018 Special Report (SR15)  explained: “we are already seeing the consequences of 1°C of global warming through more extreme weather, rising sea levels and diminishing Arctic sea ice, among other changes.”  

 

Wasted Time

As this story has shown, climate science is not a new discipline and the scientific consensus on climate change is far older than many people think. Ironically, the history of climate denialism is far shorter. Indeed, a 1968 Stanford Research Institute study that reported “significant temperature changes are almost certain to occur by the year 2000 and these could bring about climatic changes” was funded by the American Petroleum Institute. During the 1970s, fossil fuel companies conducted research demonstrating that CO2 emissions would likely increase global temperature. Only with political changes in the 1980s did climate denialism take off.

 

Not only is climate denialism relatively new, but it is uniquely American. No other Western nation has anywhere near America’s level of climate change skepticism. The epidemic of denialism has many causes. It is partly the result of a concerted effort by fossil fuel interests to confuse the American public on the science of climate change. It is partly due to free-market ideologues who refuse to accept a role for regulation. It is partly because of the media’s misguided notion of fairness and equal time for all views. It is partly due to the popular erosion of trust in experts. It is partly because the consequences of climate change are enormous and terrifying. Yet, you can no more reject anthropogenic climate change than you can reject gravity or magnetism. The laws of physics operate independently of human belief.

 

However, many who bear blame for our current predicament do not deny the science.  For decades, global leaders have greeted dire forecasts with rounds of empty promises. James Hansen has been frustrated by the lack of progress since his 1988 testimony. “All we’ve done is agree there’s a problem…we haven’t acknowledged what is required to solve it.” The costs of dealing with climate change are only increasing. Economic harms may run into the trillions. According to the IPCC’s SR15, to avoid some of climate change’s most devastating effects, global temperature rise should be kept to below 1.5°C above pre-industrial levels. That would likely require a reduction in emissions to half of 2010 levels by 2030, and to net-zero emissions by 2050. Had the world embarked on that path after Hansen spoke on Capitol Hill, it would have required annual emissions reductions of less than 2%. Now, according to the latest IPCC report, the same goal requires annual reductions of nearly 8%. 1.5°C appears to be slipping out of reach.
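
The arithmetic behind those two figures can be sketched roughly (an illustration, not the IPCC’s own modeling): a constant annual fractional cut r that halves emissions over n years must satisfy

\[ (1 - r)^n = \tfrac{1}{2} \quad\Longrightarrow\quad r = 1 - 2^{-1/n} \]

Spread over the four decades from 1990 to 2030, that works out to about 1.7% per year; compressed into the single decade from 2020 to 2030, it jumps to about 6.7% per year, even before accounting for the emissions growth of the intervening years.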

 

We have known about the causes of climate change for a long time. We have known about the impacts of climate change for a long time. And we have known about the solution to climate change for a long time. An academic review earlier this year demonstrated the impressive accuracy of climate models from the 1970s. This is no longer a scientific issue. While science can continue to forecast with greater geographic and temporal precision, the biggest unknown remains our action. What we choose today will shape the future.  

New PBS Documentary "McCarthy" Highlights a Tumultuous Time in Our History

 

The new American Experience documentary “McCarthy,” premiering January 6 on PBS, details the meteoric rise and fall of Wisconsin Senator Joseph McCarthy. Through interviews with historical scholars and first-hand witnesses, this impeccably sourced film tells the harrowing story of McCarthyism, the House Un-American Activities Committee, and the fear of a sinister communist threat—real or imagined—that gripped the American people during the Second Red Scare. By revisiting this tumultuous period in American history, viewers can reflect on our nation’s controversial past and draw clearly defined parallels between McCarthy’s time and our current political climate.

 

Ahead of the film’s release, I had the exciting opportunity to speak with its director, Sharon Grimberg, about the value of public television as a historical platform, the importance of reaching younger viewers, and what we can learn from the McCarthy era in modern-day America.

 

Q: What value, in your opinion, does public television have as a platform for teaching history?

 

A: Well, I worked for American Experience on staff for fifteen years. From my experience, I felt like we took on a lot of topics that are not taken on by other broadcasters, so what we really try to do is spend a lot of time really digging in and doing the research and telling stories in a very kind of in depth and thoughtful way and giving producers the time to think about the best way to tell a story. It’s harder for people to do in other broadcast networks—I don’t want to denigrate anybody else, but PBS is very committed to doing programming that’s thoughtful and also reaching people both in the home and in the classroom, reaching kids in all sorts of communities. In the past they’ve even taken pieces of the story and created a teaching module around them. So many people have said to me, “Sharon, my son just watched one of your films at school.” There’s definitely a commitment to catering to a broad range of viewers including kids in classrooms, and creating content that works for different communities. 

 

 

Q: In what ways do you think public broadcast documentaries like “McCarthy” positively impact the field of history?

 

A: I think the way that there’s a push and pull is that for a TV documentary to work, it has to have a narrative story, and not all historical scholarship is written that way, right? So I think there is a way in which there’s something to learn from telling stories narratively, because I think it is more accessible to people. But the thing about a documentary is that you can watch a two-hour documentary about McCarthy and you’re going to walk away knowing something, whereas there’s a lot of really great scholarship about this era and the Red Scare—for example David Oshinsky wrote a very, very fine book, really well-constructed and thoughtful—but it’s a much bigger commitment for the average person. On the other hand, maybe the film will encourage people to do more reading—pick up a book on the topic and make that bigger time commitment.

 

On this topic, there’s one story that always stands out to me. Years ago I was in Georgia, and my father got a speeding ticket when he was driving me there. We had to go to the police station, and when we got there, there was a police officer who was reading the collected works of Ibsen, and I asked her why. It was because she had just seen the play dramatized on PBS. She had gone to the library and gotten his collected works, because she had seen his play A Doll’s House on television. It just felt to me that there is this kind of synergy of what goes on television: someone who might not necessarily have picked up that particular book had gone to the library and gotten herself a copy, and there she was in this rural, one-room police house reading Ibsen. I do think, if you’re taken somewhere you haven’t been taken before, it opens new doors for you, and that’s what TV can do.

 

Q: When making historical content for public television, how do you appeal to younger audiences?

 

A: That’s such an interesting question. I have to say that when I was making this McCarthy film, younger people had no idea who Joseph McCarthy was—absolutely none. So I was very conscious when I was making this film about what people do and don’t know, and wanted to fill in the gaps around what people hadn’t already learned. It’s a very complicated story, and I wanted people to understand why there was a communist community in America during the 1950s: what appealed to people about communism, why they might have agreed with it at that particular moment, and why in the 1950s that might have seemed so scary. Why a woman who was a schoolteacher would be fired from her job, no questions asked, because she was a member of the communist party or even just a communist-dominated labor union. I wanted people to understand the context of those things—hoping that people who don’t come in with a lot of knowledge would understand the complexity of the time. You would need a certain level of interest to turn on the program, but I would hope that kids who see part of the film in the classroom would be intrigued.

 

Q: The documentary mentions how Joseph McCarthy actually started as a Democrat. Were his Republican views purely opportunistic, and if so, is this sort of flip-flopping a common behavioral pattern among demagogues?

 

A: It’s very hard to know exactly what he was thinking, but that’s certainly what David Oshinsky would say. He was looking at this whole landscape and he was very ambitious, and he was working as a judge for a while, but that wasn’t going to work out for him. His campaign aide even said to him once, “Why are you so glum? Even if you lose this campaign, you’re still going to be a judge.” He replied, “I don’t want to be a judge all my life—what are you thinking?” He saw that he had the chance to win the seat, and so that’s why he switched sides, I think. He was very ambitious, and I think he saw an opportunity.

 

Q: There are striking parallels between the McCarthy era and modern political discourse in terms of polarization and enmity between the two parties. Do you think this documentary will raise awareness of this repeated historical pattern?

 

A: I hope that it makes people think about the way in which democracy works, and the way in which we all have a part in that. Everybody comes with preconceptions, and it’s very easy to drown out people who oppose you instead of taking the time to figure out “why does this person see things so differently from the way I do?” That to me seems like a big take-home.

 

There was one story, one that didn’t make the final cut, about this guy called Harold Michaels who was a young Republican in Wisconsin. He had worked as a volunteer helping McCarthy get elected and worked on his reelection in 1952, but by 1954 he was completely disgusted with McCarthy. He had been a lifelong Republican, but he was one of the few to look around at what was happening and say, “This isn’t right.” So he ended up campaigning for McCarthy’s opponent in the next election. They didn’t get quite enough votes, but they made a good effort. So here’s somebody who was in the middle and shifted his perspective.

 

Q: The documentary mentions how the rest of the Republican party was wary of McCarthy, but continued to support him because “they had no other alternative.” Margaret Chase Smith was, in fact, the only Republican to stand up to him. In a political climate that is, once again, strikingly similar, what can we learn from Republicans’ reaction to McCarthy?

 

A: The thing that always sticks in my mind is the Edward Murrow broadcast, in which he says something like “this is no time for people to be quiet.” We cannot abdicate our responsibilities as citizens of a democracy; we have to protect freedom at home, and we are the beacon of freedom around the world. But we can’t defend freedom around the world if we’re not defending it here. What he’s saying is, democracy is fragile, and we are responsible for protecting other people’s freedoms and ensuring that America remains true to its founding ideals.

 

McCarthy did overstep bounds, and he became so reckless and thoughtless that people who hadn’t done anything were hurt. I think, if anything, we don’t do enough to explain what happened to some of the people called before his committee—one man even killed himself.  There were real, horrible consequences for people who, sometimes, had done nothing but be part of a communist-dominated union. They weren’t traitors to their country; they were just left-leaning. But he so overstepped his bounds that people began to have a distaste for that sort of ideology. Some of my historians in the film would say that he made anti-communism so distasteful that the government actually backed too far away from it, because of what had happened to ordinary people. It was counter-productive to be that reckless with ordinary people’s lives.

 

One story that really stuck out to me was when a graduate student named Leon was charged with contempt of Congress for refusing to answer McCarthy’s questions. He was brought to trial, but the case was thrown out on a technicality when McCarthy walked into the room and a group of Irish Catholic spectators began to cheer. The judge threw out the case because the jury had been tainted by cheering, and then held a bench hearing where he found that McCarthy didn’t have jurisdiction over the case. Even though the case was thrown out, Leon couldn’t find a job afterward, so he moved to Canada for over a decade. Leon’s experience really illustrates that most of those targeted by McCarthy were uninfluential people—they didn't have access to state secrets, they were just ordinary, left-leaning citizens.

 

I think that, overall, what the film says to people is that democracy is fragile, and we all have a responsibility to protect it. The ordinary person does that by voting and protesting and writing letters, and elected representatives by acting correctly.

 

Q: What did you find most interesting or surprising when you were creating this film?

 

A: What I found interesting is that America was very, very divided back then, and you could see that in the way that Republicans and Democrats treated each other in the Senate. It was a very divided time, venomous and acrimonious. You could see that in the archives, even—representatives got horrible letters from constituents, much as they do today, except that back then the vitriol wasn’t broadcast on social media.

 

Another thing is that even though McCarthy was chastised in the Army-McCarthy hearings and then censured by the Senate, if you look at the polling all through the fall of 1954, there was still a very solid 30% of the country that supported him. It didn’t waver.

 

There is a whole lot of baggage that comes with how we understand things: you can point to anyone on the political spectrum and see that people come to things with their preconceived notions and worldview intact. You trust your sources because they fit into your worldview, and that doesn’t allow you to shift your position much because that’s what you bring to the table.

 

I was very struck with that. Despite everything, there were a lot of people who still felt he was fighting a good fight. And that’s why I wanted the film to end the way it does—with Splits saying that he was a patriot—because a lot of people still felt that way.

 

Q: Finally, what do you hope your audience will take away from “McCarthy”?

 

A: I think that I would say what I said before—that democracy is fragile and we’re all responsible. That, to me, seems an enduring truth. We can’t count on always living in a democracy, but we need to work at it to make sure that the vulnerable aren’t exploited, that the powerful don’t usurp too much power, and that the justice system is working fairly. Those things don’t happen unless we keep our eyes open and we’re active. We as citizens have a responsibility to keep our eyes open, listen, and speak up when we see things that aren’t right.

 

A Historian Reflects on the Return of Fascism

Members of the Maquis, French Resistance Fighters in World War II

 

Back in 1941, the year of my birth, fascism stood on the brink of conquering the world. During the preceding decades, movements of the Radical Right―mobilized by demagogues into a cult of virulent nationalism, racial and religious hatred, and militarism―had made great strides in nations around the globe.  By the end of 1941, fascist Germany, Italy, and Japan, having launched massive military invasions of other lands, where they were assisted by local rightwing collaborators, had conquered much of Europe, Asia, and the Middle East.

 

It was a grim time.

 

Fortunately, though, an enormous movement arose to resist the fascist juggernaut.  Led by liberals and assorted leftists around the world and eventually bolstered by the alliance of Britain, the Soviet Union, and the United States, this resistance movement ultimately prevailed.

 

The antifascist struggle of World War II established the groundwork for a new and better international order.  In January 1941, U.S. President Franklin D. Roosevelt, in a major public address, outlined what became known as The Four Freedoms.  The people of all nations, he proclaimed, should enjoy freedom of speech and expression, freedom of worship, freedom from fear, and freedom from want.  That August, Roosevelt and British Prime Minister Winston Churchill unveiled the Atlantic Charter, declaring that people should have the right to choose their own form of government, that force should be abandoned in world affairs, and that international action should promote improved living and working conditions for all people.  

 

These public declarations―coupled with the widespread discrediting of rightwing parties, movements, and ideas―led directly to the establishment, in 1945, of the United Nations.  According to the UN Charter, the purpose of the new world organization was “to save succeeding generations from the scourge of war,” “to reaffirm faith in fundamental human rights,” and “to employ international machinery for the promotion of the economic and social advancement of all peoples.”

 

And, in fact, in the decades following World War II, there were significant strides forward along these lines.  Led by Eleanor Roosevelt, the United Nations issued a Universal Declaration of Human Rights, setting forth fundamental rights to be protected.  Furthermore, much of Europe, the cockpit of two terrible world wars, cast aside nationalism to establish a federal union.  Moreover, a wave of decolonization freed much of the world from foreign rule, UN forces engaged in numerous peacekeeping operations, and the United Nations and many national governments established economic aid programs for the world’s poorest countries.

 

Admittedly, national policies sometimes fell short of the new internationalist, antimilitarist, and egalitarian ideals and programs.  Governments―and particularly governments of the major powers―all too often ignored the United Nations and, instead, squandered their resources on military buildups and terrible wars.  Many governments also had a spotty record when it came to respecting human rights, promoting social and economic progress, and curbing the rising power of multinational corporations.

 

Even so, for decades, humane domestic policies―from banning racial discrimination to scrapping unfair immigration laws, from improving public health to promoting antipoverty efforts and workers’ rights―remained the norm in many nations, as did at least a token genuflection to peace and international law.  Political parties with a democratic socialist or liberal orientation, elected to public office, implemented programs emphasizing social justice and international cooperation.  On occasion, though far less consistently, centrist and communist governments fostered such programs, as well.  Only parties of the Radical Right attacked these policies across the board; but, swimming against the tide, they remained marginal.

 

Nevertheless, in the last decade or so, enormous headway has been made by movements and parties following the old fascist playbook, with rightwing demagogues trumpeting its key elements of virulent nationalism, racial and religious intolerance, and militarism.  Seizing, particularly, on mass migration and funded by avaricious economic élites, the Radical Right has made startling progress―undermining the European Union, contesting for power in Britain, France, Germany, the Netherlands, and Greece, and taking control of such countries as Russia, India, Italy, Hungary, Poland, Turkey, Brazil, the Philippines, Israel, Egypt, and, of course, the United States.

 

Long before the advent of Donald Trump, the Republican Party had been shifting rightward, pulled in that direction by its incorporation of Southern racists and Christian evangelicals.  This political reorientation sped up after the election of Barack Obama sent white supremacists into a frenzy of rage and self-pity.

 

Trump’s 2015-16 campaign for the presidency accelerated the GOP’s radicalization.  Drawing upon unusually hate-filled rhetoric, he viciously denounced his Republican and Democratic rivals.  Along the way, he engaged in his characteristic lying and mocked or incited violence against his critics, the disabled, immigrants, racial minorities, Muslims, women, and the press.  His racism, xenophobia, and militarism, combined with his thuggish style and manifest lack of qualifications for public office, should have doomed his campaign. But, instead, he emerged victorious―a clear sign that a substantial number of Americans found his approach appealing.

 

As president, Trump has not only displayed a remarkable contempt for truth, law, civil liberties, the poor, civil rights, and women’s rights, but catered to the wealthy, the corporations, white supremacists, and religious fanatics.  He has also proved adept at inciting hatreds among his rightwing followers through racist, xenophobic diatribes delivered at mass rallies and through propaganda messages. Meanwhile, he has forged close alliances with his authoritarian counterparts abroad.  Either out of fear or love, Republican officeholders cling ever more tenaciously to him as the nation’s Supreme Leader.  If the GOP is not yet a fascist party, it is well on its way to becoming one.

 

Having grown up at a time when ranting maniacs dispatched their fanatical followers to stamp out freedom and human decency, I am, unfortunately, quite familiar with the pattern.

 

Even so, the struggle to shape the future is far from over.  During my lifetime, I have seen powerful movements wage successful fights for racial justice, women’s rights, and economic equality.  I have seen massive campaigns successfully challenge wars and nuclear insanity.  I have seen the emergence of inspiring political leaders who have toppled dictatorships against incredible odds.  Perhaps most important, I have seen millions of people, in the United States and around the globe, turn the tide against fascism when, some eight decades ago, it threatened to engulf the world.  

 

Let’s hope they can do it again.   

The Tea Party Revisited

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

Ten years ago, the Tea Party was big news. The Tea Party announced itself just as I began writing political op-eds in 2009. I found them deeply disturbing. They proclaimed their allegiance to freedom as loudly as they threatened mine. I didn’t agree with their economic claims that the deficit was America’s biggest problem, and I suspected their pose as the best protectors of the Constitution was a front for less reasonable beliefs about race, gender, and religion.

 

Founded in 2009 as a reaction to the election of Barack Obama as President, the federal bailouts of banks and other institutions in the wake of the great recession of 2008, and, later, the passage of the Affordable Care Act in 2010, the Tea Party entered conservative politics with a splash in the 2010 elections. NBC identified 130 candidates for the House and 10 for the Senate, all Republicans, as having strong Tea Party support. Among them, 5 Senate candidates and 40 House candidates won election. Those success rates are remarkably high, because many Tea Party candidates defeated established politicians. Pat Toomey in Pennsylvania, Rand Paul in Kentucky, Marco Rubio in Florida, Ron Johnson in Wisconsin, and Mike Lee in Utah defeated more established politicians, including some incumbents, in both parties. They are all still Senators. Among the 5 Senate candidates who lost, Christine O’Donnell in Delaware, Sharron Angle in Nevada, and John Raese in West Virginia took extreme and sometimes laughable positions; Ken Buck in Colorado and Joe Miller in Alaska lost by tiny margins.

 

The Tea Party claimed to follow an ambitious agenda. One list on teaparty.org of “Non-negotiable Core Beliefs” included many economic items: “national budget must be balanced”; “deficit spending will end”; “reduce personal income taxes a must”; “reduce business taxes is mandatory”. A slightly different list called the “Contract from America” was also heavy with economic priorities: a constitutional amendment requiring a balanced budget; a single-rate tax system; “end runaway government spending”; “stop the pork”. The Contract included no social issues at all. The Core Beliefs began with “Illegal Aliens Are Here Illegally”, and included “Gun Ownership is Sacred”, “Traditional Family Values Are Encouraged”, and “English As Core Language Is Required”. Tea Partiers claimed complete allegiance to the Constitution as originally written.

 

Recently many commentators have asserted that the Tea Party was a failure and is dead. A NY Times article said “the ideas that animated the Tea Party movement have been largely abandoned by Republicans under President Trump”, because deficit spending has ballooned since he took office. Senator Rand Paul said “The Tea Party is no more.” A New Yorker article noted “the movement’s failure”, because they did not achieve a repeal of Obamacare. Jeff Jacoby, the conservative columnist for the Boston Globe, mourned its demise in February 2018 under the title “The Tea Party is dead and buried, and the GOP just danced on its grave”. He focused on the Tea Party’s inability to get Republicans to rein in spending.

 

Most of the successful Tea Party candidates from 2010 are no longer in Washington. Aside from the 5 successful Senators, only 16 of the 40 Tea Party House members are left. Justin Amash recently left the Republican Party after indicating support for impeachment. But those figures are not a surprise. The average tenure in office of a member of the House is just under 10 years, so about half should have left by now. Two moved up in the political world. Mick Mulvaney is now head of the Office of Management and Budget. Tim Scott won election as a Senator.

 

The whole narrative of Tea Party failure is wrong, in my opinion. While Tea Party organizations proclaimed high-minded principles of fiscal restraint, I don’t think that complex budgetary issues or particular readings of the Constitution motivate masses of voters. Today’s Republican Party is entirely in the hands of Trump, he completely ignores adherence to the Constitution and maintaining a balanced budget, and Tea Partiers are delirious with joy. The enthusiasts who scream at Trump rallies are the same people who signed on to the Contract from America in 2010. Trump embodies their real core beliefs: white supremacy; opposition to abortion rights, gay marriage, transgender people and anything that appears to deviate from their mythology of the “traditional family”; opposition to government regulation of private business, but support for government intrusion into private life; opposition to gender equality.

 

The social scientist Theda Skocpol, who studied Tea Party grassroots at the beginning, dismissed their economic policies as window dressing. She argued in 2011 that these white older conservative Americans “concentrated on resentment of perceived federal government ‘handouts’ to ‘undeserving’ groups, the definition of which seems heavily influenced by racial and ethnic stereotypes.” She noted that “the opposition between working and nonworking people is fundamental to Tea Party ideology”, and that “nonworking” was assumed to refer to non-white. In a recent interview, Skocpol identifies Tea Party advocates as Christian conservatives, not libertarians. Today the Christian right shouts its joy about Donald Trump from every pulpit.

 

I was right and wrong about the Tea Party in 2010. I recognized that “The Tea Partiers are wrong. The people they support will increase government intrusion into our private lives, under the guise of protecting us from enemies all around, and will help big business exploit our private resources.”

 

I also wrote, “They won’t change American politics. Despite putting pretty faces like Glenn Beck and Sarah Palin on their posters, they’re way too unattractive. Like the guy who strolls into Starbucks with his gun, they might get a lot of attention, but they’ll make no friends.” How wrong that was. Their disdain for the views of other Americans, their distorted understanding of the Constitution, their blindness to facts which do not support their ideology, their racism and sexism, are now in control of the White House. The Republicans they called RINOs are gone.

 

They only supported limited government when a black man was President. Now they shout for the arrest of anyone they don’t like. The Tea Party no longer needs to attack the Republican Party from the right. They are the Republican Party, and their desire to recreate our country in their image is non-negotiable.

Trump is no Hitler: His Enablers are the Greater Problem

 

Comparisons of Donald Trump and Adolf Hitler are becoming more relevant as the president responds to his impeachment and to further revelations of his priorities. Despite the protestations of some analysts who claim the comparison has no value, it is worth considering any historical antecedents that might give us insight into the current distressing political climate.

 

The two leaders’ tactics and personalities have been reported and analyzed over the last few years with fascinating similarities. For example, Trump’s first wife, Ivana, revealed he kept a collection of Hitler’s speeches at his bedside, and a respected Hitler biographer and scholar, Ron Rosenbaum, claims the president continues to use Mein Kampf as a playbook. Trump's inflammatory rhetoric and scapegoating certainly emulate the infamous German leader.

 

However, most of the Trump-Hitler comparisons have been made without consideration of how the facilitators of dangerous leaders are central to understanding any parallels.

 

As Washington’s climate becomes dramatically polarized, the behavior of Trump’s allies reflects an abhorrent historical pattern. Impending disaster looms as facts are dismissed and personal attacks are the only response of the president and his defenders. We relentlessly hear the details in daily news reports; outright lies and fabrications have become so prevalent there is a growing tolerance of the climate of dysfunction.

 

If we consider the larger picture, a blind and immoral political frenzy has seized the United States, similar in some ways to pre-war Nazi Germany. The rise of one of the most destructive dictators in world history was promoted and tolerated in a similar atmosphere. 

 

There are, of course, some clear differences. Though within his base he has a small army of domestic neo-Nazis and some supporters in the military, the president is incapable of exercising the kind of brutal authority that Hitler used to gain full control. He has succeeded in putting refugees in camps, but Trump has not been able to round up his perceived enemies and have them silenced. His hostility towards the press has yielded some violence, though much to his frustration, media continues to report on his worst offenses. Although he intimidates witnesses and detractors, Trump has not succeeded in creating enough animosity towards any enemy to deter investigators or distract a majority of the public from his ridiculous behavior. He imagines being all-powerful, but the president will not have the Capitol burned as Hitler torched the Reichstag. Trump has so far failed in his ability to control the country with the efficiency of the German führer. 

 

The degree of Trump's success has been achieved only because of the leverage he has on political cronies. Trump is no Hitler, and his acolytes are at a loss, lacking a truly powerful leader. In their empty defense of the president, whether repeating Trump’s angry excuses or remaining silent, many members of the party of Lincoln feign their innocence in allowing the unfolding constitutional crisis. How they manage to do so in the face of articles of impeachment and very specific constitutional violations will be telling. But the overt posturing of those compromised by self-serving loyalty and blatant hypocrisy is a symptom of political cowardice, thus making any change in position unlikely.

 

The collaborators who claim they are behind the president are watching the bizarre show, waiting to see which direction the wind blows with the public to best protect their own interests. They are not fools, thus like the president, they willingly put personal interests before the country’s. What degree of insanity and illegality will be needed before they admit the president deserves removal from office? 

 

Consider: if both the Senate and the House were now controlled by Republicans, would there be any remnant of a balance of power? And how much control would be granted to Mr. Trump? Perhaps Congress would tolerate a declaration of a national emergency that transferred all of its powers to the president’s cabinet. Could it happen here?

 

Hitler’s corporate and political supporters assisted in the collapse of Germany’s constitutional democracy in 1933, although a similar coup is very unlikely in the United States. Trump may have a narcissistic personality disorder and erupt with vitriolic diatribes like Hitler, but his minions have given their loyalty to a self-absorbed, self-incriminating president who is driving the country towards chaos. They have fallen for the tactics of a charlatan who commands the most powerful and menacing military force on the planet.

 

The president's continuing egomaniacal diplomatic decisions have caused distress among some in his party. However, Republican defenders have not dared to admit the obvious: Trump’s betrayal of US international interests and his illegal seizure of foreign affairs for personal gain are part of the same outrageous, subversive behavior, establishing without doubt that he is impeachable.

 

This bizarre and divisive epoch has exposed moral failings in government of the highest order. Investigations and judicial rulings must further unfold before there is any consensus on Trump’s legacy. Yet the empowerment granted by his protectors has already initiated a collapse of constitutional order and degradation of government.

 

We witness a repeating pattern that echoes through centuries: the enablers of ruinous tyrants light fires that eventually consume them. Some will recognize their catastrophic failures only in retrospect.

Reexamining the Mayflower 400 Years Later

 

Adapted from The Journey to the Mayflower by Stephen Tomkins, published by Pegasus Books. Reprinted with permission. All other rights reserved.

 

There were a number of considerations that made the Separatists look to America in 1617, but the most commonly cited motive for the sailing of the Mayflower – to escape persecution and worship freely – was not one of them. Persecution had driven them out of England to the Netherlands, but they had not suffered persecution in the Netherlands and the Leiden church had especially good relations with the Dutch. Persecution prevented them from going home, but not from staying where they were.

 

The key consideration, according to the Separatists themselves, was that life in Dutch cities seemed just too grim for their church to have any future. They were losing the older generation, Bradford said, who seemed to be dying prematurely after years of unskilled urban labor, or, having used up their savings, returned to England to lodge with family. And they were losing the younger generation, who were fed up with godly poverty and surrounded by worldliness they would never have encountered in an English village. ‘Getting the reins off their necks’, Bradford said, some young people joined the army, some went off to sea and others took ‘worse courses’. Those young people who remained in the faith, he said, joined in their parents’ labors and were incapacitated by them. The leaders feared that within years their church would collapse. America, they imagined, would allow them to create an English village 3,000 miles from the nearest bishop, returning to the agricultural life they yearned for, and escaping the fleshpots of Leiden and Amsterdam.

 

A sense of failure in their mission added to the Brownists’ discontent, according to Winslow, who lamented ‘how little good we did, or were like to do’. He had hoped that the purity of their church would inspire the Dutch to a new reformation, but the Dutch ignored them. The English paid more attention, and many were impressed by the Brownists’ example, but too few were willing to join them in the Netherlands, and fewer stayed. Admittedly, America would offer still fewer opportunities to convert the English, but perhaps, Bradford said, they would be God’s instrument for converting the Americans.

 

On top of all the difficulties of life in the Netherlands, the English feared that it was about to get worse. The Spanish truce expired in 1621, and it seemed likely that the war would resume. This was a serious consideration, though war with Spain had not stopped the church coming to the Netherlands in the first place. 

 

Another motive Winslow mentioned for going to America was, paradoxically, to stay English. The younger generation had gradually integrated with Dutch society, and it was clear, as Winslow put it, ‘how like we were to lose our language and our name of English’. This undermined their mission to model reform to the English churches. If the Brownists did survive in the Netherlands, becoming just another Dutch denomination would defeat their object in being there.

 

There is, however, something intriguingly inconsistent and incoherent about the reasons the colonists gave for their decision. Although Bradford’s main explanation was that they were driven to America by the grimness and poverty of life in Leiden, which lay behind all their other problems too, yet he says elsewhere: ‘At length they came to raise a competent and comfortable living’; and though it was not easy at first, he adds, ‘(after many difficulties) they continued many years, in a comfortable condition’. He describes their departure from ‘that goodly and pleasant city’. Above all, he says they looked to America as a ‘place of better advantage and less danger’ than Holland, but also says that they were aware of its ‘inconceivable perils’. ‘The miseries of the land ... would be too hard to be borne.’ It would be likely ‘to consume and utterly to ruinate them’. They feared famine and nakedness, ‘sore sicknesses, and grievous diseases’. They had heard terrifying accounts of the indigenous Americans torturing people to death. They were aware of how many settlements had failed and how many lives had been lost. Add to all this the perils of the journey itself, and the frailty of the travelers, and one has to wonder in what sense exactly Bradford was using the phrase ‘less danger’.

 

When members voiced their serious doubts about the expedition Robinson and Brewster proposed, the reply, as Bradford has it, is fascinating: ‘All great and honorable actions are accompanied with great difficulties; and must be both enterprised, and overcome with answerable courages.’ The reason they needed to leave Leiden, supposedly, was to escape its hardships; and when critics of the plan worried that America would have great hardships, they were told that this great enterprise was worth great hardships. So what was the enterprise, if not to escape hardships?

 

The colonists seemed to have difficulty articulating exactly what compelled them to leave the Netherlands for America. They had a sense of an important enterprise, which, though they might rationalize it in terms of escaping the difficulties of life in Holland, did not meet that objective terribly well. Their calculation of the risks and imperatives does not add up unless we remember how ingrained it was for the Separatists to see their story in biblical terms, as following biblical patterns. They saw their reflection in countless scriptural parallels, but above all in the exodus – God leading the new children of Israel out of the bloody and antichristian land of their birth, to a place he had prepared for them. From Browne’s first writing, to Helwys’s last, the Separatist equation was repeated (critically in Helwys’s case): ‘England was as Egypt’. They had been delivered and there could be no going back. The Netherlands, however, had not felt at all like the Promised Land, which meant that instead they were in the wilderness, the stretch of wandering and learning that occupied Israel before reaching Canaan. They were God’s ‘little church, fled into the wilderness’, in Ainsworth’s words, or, as Johnson put it, they had followed Moses’ path, rejecting the treasures of Egypt ‘to suffer adversity with the people of God’; they were ‘but strangers and pilgrims’.

 

The Separatists were prime examples of the fundamentally forward-looking nature of Protestantism – ‘a religion of progress’, as Alec Ryrie puts it, ‘of restless, relentless advance toward holiness, not of stagnation’. Each stream saw itself as the culmination in a century of progress towards truth and obedience, while still challenging its own orthodoxies, their church covenants explicitly committing them to embrace God’s future revelations, to be ready to take the next step. Individual members had once walked miles to acceptable churches, then moved to London maybe, then sailed for Amsterdam, then moved again to Leiden or Emden. If their dreams had failed to come true and no simple way forward presented itself, then their whole religious experience and outlook told them God would reveal a new path ahead. Satan called them back, the Lord called them on. And if the way forward involved a terrifying leap of faith, ‘yet might they have comfort in the same’. Whatever America might entail, it was not retreat; it was not going back to Egypt.

A Personal History of Vietnam War Refugee Policies

 

When the war ended in Vietnam in 1975, America was completely unprepared for the sudden influx of refugees who came to the United States from South Vietnam. There was no policy for letting them in. There was no policy to keep them out. The Vietnamese who came did so because they felt they had no other option. 

 

That first group to arrive was a diverse lot. Some worked for the American press. Others were in the employ of big American companies with construction and communication projects in South Vietnam. Some were professionals -- lawyers, teachers, doctors and nurses. Some worked for South Vietnamese non-profits. There were cooks and dishwashers, street cleaners and soldiers. Many were family members who feared reprisals from the North Vietnamese. Their backgrounds and occupations may have been different, but they all shared one overriding goal: they did not want to live under communism and the domination of Hanoi and the Viet Cong, who were equally hated and feared.

 

Throughout its history, America has seldom been kind or overly hospitable to refugees. Despite the millions who came in the late 19th and early 20th centuries when our doors were open without prejudice, there has always been an unfounded fear that the newcomers would change the American way of life. After World War I, nationalists, always a powerful factor in American politics, decided enough was enough. They wanted to stop the flow of Italians, Jews, Irish, Germans and Chinese into the U.S. To mollify the nativists, Congress enacted laws that created strict quotas and tough strictures on immigrants. Deportations of immigrants, mainly from Mexico and Central America, are also nothing new. For example, between 1931 and 1940 America deported more than 500,000 Mexican-Americans back to Mexico.

 

In the case of Vietnam, however, these strictures were relaxed, perhaps because of guilt over the way the war had ended. 

         

*****

Josephine Tu Ngoc Suong was born in Saigon in 1942 during the Japanese occupation. She had worked for many years for NBC News before I arrived in Saigon in 1966. We married in December 1968 in a simple ceremony at the Hong Kong city hall. Josephine's father, Tu Hong Phat, had lived through the long French occupation, the short Japanese takeover during World War II and then the return of French domination until the fall of Dien Bien Phu and the new dominance by America. Through hard work he carved out a comfortable life for himself and his family, who lived on a typical, unpaved street with a market and a Catholic church in central Saigon.

 

Before the war was over, Josephine and I were living in Rockville Centre, New York. She and her father wrote many letters to each other. I recently found a few surviving letters her father wrote about what he observed in Saigon as the war was ending. The letters give us a rare insight into what an average Vietnamese man thought about the war.

 

There were major battles on the days when he wrote. Fighting and chaos were everywhere. Most of Mr. Tu's letters were realistic, well informed and filled with information about the war, how Josephine's three brothers were faring and how the family was dealing with food shortages. 

 

He thanked Josephine for packages of food that she sent weekly. He told her how he sent his wife, her mother, Nguyen Thi Ba, to purchase 500 kilos of rice, dry shrimp, and dry Vietnamese sausage that could last as long as five months if the military situation grew worse and there was a shortage of food. 

 

He wrote "The military situation in the south of Vietnam is in an extremely serious stage. Numerous events have occurred recently, never seen in the history of this war." It was surprising to see how well informed Mr. Tu was about the war. I never trusted the local press in South Vietnam, but Mr. Tu obviously knew how to parse the information, real and rumored, from a myriad of sources. 

 

Mr. Tu wrote about the provinces in northern South Vietnam that had recently fallen to Hanoi: "The communists have taken this opportunity to change their guerrilla fighting over to tactical and strategic fighting using main force troops in strength. Our South Vietnamese forces often withdrew its troops before clashing with the enemy." He said, "according to the latest sources, the northern provinces are in a shambles with over 10,000 North Vietnamese troops fighting near Binh Dinh." He told Josephine "refugees keep flowing deeper south even as far as the southern resort of Vung Tao." 

 

At the time of this letter, the ARVN, South Vietnam’s army, was on its own. American troops had left the country almost two years prior. South Vietnam's ground forces were not doing well without American support, hardly a surprise to most observers.

 

Toward the end of one long letter Mr. Tu became political, voicing some of his thoughts about life in Vietnam as the war was ending. "In brief, the people who are living under freedom could have a happy life if they worked hard. Living in a free society, the people have all their rights, while the communists have none. People under communism must listen to the Communist party if they want to live. If the communists appear anywhere, the place is soon ruined and damaged. No one can forget the VC general offensive during the New Year of 1968. Thousands of innocent people have been buried alive at Hue, the former Imperial capital. That is why people are fleeing the communists. But the present regime in South Vietnam is not much better since there is much injustice and corruption in the military and administrative machinery. Evil."

 

As the war was ending, NBC decided it had a moral obligation to bring to America any Vietnamese employees and their families who wanted to come here. Josephine's family never hesitated. With the help of NBC News they were part of the first wave of 125,000 immigrants who came to the U.S. in 1975. The whole family, except for Josephine’s three brothers who were in the South Vietnamese military, left.

 

The Vietnamese who successfully fled Vietnam were housed in refugee camps in the United States, on Guam, or in the Philippines. Josephine’s family members were sent to California and Arkansas.

 

Meanwhile, as Saigon was falling in 1975, the brothers were at the Bien Hoa Air Base outside Saigon. Unsure of their next move, they decided to make their way to the family home.

 

Their friend Xuan was still serving in the navy as the situation in Saigon deteriorated. Xuan said that "patrols on the water were normal until the morning of April 28 when the VC opened a heavy offensive against the Thanh Giang Bridge," the biggest span on the highway into Saigon. A battle ensued with no definitive result. 

 

After that battle, Xuan drove with his commander to a meeting in the Rung Sat jungle riverine force zone. His commander had an order to evacuate all his navy ships and he asked Xuan if he would like to go with him. Xuan agreed to flee, but he had more to do. He took a chance and went to the Tu home, where he found Khiet, Khai and Quan waiting for something to happen. It did. The brothers agreed to come with Xuan on the navy boat. Unfortunately, Xuan’s parents didn’t want to leave Saigon. "My parents still would not leave. My father would not budge," Xuan recalled.

 

Saigon was about to fall and the Viet Cong were everywhere so the men could no longer delay their departure. It was time to get moving. After securing extra food for his family and saying goodbye to his mother and father, Xuan and the three brothers made their final scooter ride to Nha Be where the patrol boats were docked at the rendezvous point ready to go. 

 

As they fled, Xuan recalled, "Outside the main streets of central Saigon, the city seemed as a dead city. No houses or stores were open. People stayed inside unsure of what would come next." It was April 29, 1975, one day before Saigon fell and the war officially ended.

         

After briefly staying in a small house with about fifty other people, they boarded the ships. Worried they might have to fight the enemy to escape, the men mounted weapons on the ship’s deck. Luckily, the small fleet with its lights turned off, escaped undetected. Xuan remembered a moving, tearful ceremony as the crew lowered the Republic of Vietnam flag as the passengers said goodbye to their homeland. After seven days at sea they arrived in the Philippines. From there they transferred to a large commercial ship that took them to Guam. They spent two weeks there before being flown to Fort Indiantown Gap, Pennsylvania. 

 

While the family was still settling into our house on Long Island, I had to take up a new assignment for the Today Show in Washington, D.C. Josephine and I flew to D.C. to look for a place to live. While house hunting, we received a 4 A.M. phone call from Josephine’s 23-year-old brother Quan. He was calling from the National Guard camp at Fort Indiantown Gap. Quan, his brothers Khiet and Khai, and their friend Xuan told us they had ended up there after a long journey at sea.

 

Knowing Josephine’s brothers were safe gave added impetus to our house hunting. The day after the 4 A.M. phone call, Josephine and I rented a big home in suburban Maryland that would take care of the family at least temporarily. That Sunday, Josephine and I drove to Pennsylvania to find her brothers and their friend.

 

When we arrived at Indiantown Gap, I told the guard at the gate what we wanted and he unhesitatingly directed us to the administrative building. I entered the office, explained what I wanted to the desk sergeant, and handed him a piece of paper with the names of the men we wanted to take home with us. He grunted in assent, sent a corporal to find the men and soon they were in the office, hugging Josephine, laughing and crying as I signed them out into my custody. 

 

We walked back to the car and I drove to the first Burger King I saw. Everyone ordered burgers, fries and Cokes. Their first real meal in America and, of course, it had to be the fastest of fast food, the most iconic welcome they could have had. They ate, and kept eating. 

         

When we returned to Rockville Centre (we wouldn’t move to Maryland for a few more weeks), there was no negative reaction to the sudden influx of refugees in my Long Island town. Donations of clothing from the two Jewish temples and the Catholic diocese started arriving on our front porch. People cooked meals for us and volunteered to drive family members to stores and medical appointments. Others volunteered to help Josephine in any way she needed. Kindness prevailed. The community outdid itself welcoming the newcomers. 

 

Josephine's three brothers, their newly adopted cousin and her father were still wearing Saigon-style sandals, leather thongs sewn into soles made from discarded rubber tires. They needed something better for their feet, better suited to the colder weather soon to come. I piled everyone into the station wagon and drove to a Shoetown store in Baldwin, one town over, to buy shoes and socks. I told the startled clerk that I wanted sneakers and sweat socks for everyone. The men sat down, tried on various sneaker styles, made decisions on what they would wear, and walked out of the store in newly clad feet, happy with their selections and their deeper inclusion into American society.

 

Josephine’s brothers, their friend and her father started helping around the house washing the bathrooms, cleaning the floors, washing the windows, cutting the grass and upgrading the flowerbeds. Inside and out, the house and yard never looked better. 

 

Then the whole family was off to Maryland so I could start my new job. As we pulled up to our new home, one of our neighbors watched from behind his curtains. Later, he told me he thought he was witnessing a benign invasion by a delegation from an Asian country. The fact that Josephine’s female family members were wearing their traditional dress, the Vietnamese ao dai, only heightened his impression. 

 

Once settled, I started my new job, driving to work every morning at 5:30 A.M. Meanwhile Josephine helped her family adjust to their new home, finding work for everyone so they could be independent. 

 

The men had no definable skills that translated easily into work, which made it hard to find jobs for them. Through a personal contact of mine, we were able to get Josephine's three brothers, their friend, and her father jobs cutting grass, cleaning flowerbeds, and watering lawns. The pay was low, but it was in cash and in those warm weather months they could spend their time outdoors. Soon they started to bring home money and feel more secure in their new life. 

 

It took more doing to find work for the women, who also had no transferable skills. But they were not afraid of menial labor. Josephine made contact with a local Ramada Inn. They got jobs cleaning rooms, making beds, and washing bathrooms. These were not cash jobs, but the accumulated paychecks gave the women a sense of security and worth.

 

I must emphasize that strangers who helped my family did it with a smile, unlike today when many Americans regard immigrants with disdain and anger. I do not, and the family does not, recall having suffered because they were refugees. 

 

Once Josephine had secured jobs for everyone, her next step was housing. Her family had money coming in and collectively they could afford to move to a place of their own. However, that proved to be difficult. Buying was impossible. Renting was also difficult because none of them had a bank account. None of them had credit cards or Social Security numbers. They had never had a telephone, even in Saigon. Neither had they ever paid an electric bill. In other words, not one of the family had a record of having spent any money in their new country. That would eventually come, but not fast enough to get them their own home. However, Josephine persevered and eventually found a town house the family could rent. She and I vouched for each family member individually and we provided the money for the security deposit. 

 

In a matter of weeks, the family moved into their new home in a lower middle-class development in Gaithersburg, a few miles from where Josephine and I lived. The first night in my Potomac home without them was strange, eerily quiet, as if our basement had never been the home for those who had fled the communist takeover of South Vietnam leaving their old lives behind. 

 

Josephine’s family continued to work and save money. They got driver's licenses, bought cars and relieved Josephine of her duties as a chauffeur. As soon as they could, the family moved into a bigger house. Here their story takes a new turn. Josephine’s mother, Nguyen Thi Ba, had been a well-known cook in Saigon who ran a small breakfast and lunch restaurant in the front courtyard of their home.

 

Now she opened a clandestine, illegal restaurant in the front parlor of her house where she served home cooked Vietnamese food to the growing community of refugees starting to populate Maryland, Virginia, and Washington. Some neighbors complained, causing the local board of health to occasionally shut down the restaurant, but that never lasted too long. Even the health inspectors grabbed a quick meal of pho, the hearty signature soup of Vietnam, or a variety of noodles and chicken in plum sauce. The food was too good to pass up. 

 

Josephine’s family had always wanted to own a restaurant. They soon realized their dream. With the money they had been saving the family opened the first of its two restaurants, Taste of Saigon, in Maryland and then Virginia. Their aim, with Josephine as their mentor, was to stand on their own and not be a burden to anyone. In their life as restaurateurs, they were starting to reach their goal.  

A Civil War Heirloom, Ancestry.com, and the Importance of Tracing Our Family's Historical Roots

 

My grandfather, Leo “Butch” Armbruster, was a railroad engineer and a man of exceptional frugality.  Famously in our family circle, when feeding an infant grandchild, he would eat whatever was left in the Gerber jar when we were full.  The shanty behind his house bulged with corroded pipes and faucets, old eyeglasses and tobacco tins brimming with bits of hardware, and the sleds and ice skates of his now-grown children.

 

The real treasures were in his attic.  As the grandson who lived closest, who spent the most time with him, and who he knew loved history, I was favorably positioned to get occasional glimpses of, and sometimes to handle, those treasures.  And on one very special day in the mid 1960s, when I was a high school senior, I got to call his best treasure my own.

 

The treasure is a Spencer repeating rifle.  Known as the “Seven-Shot Wonder”, this invention of one Charles Spencer was the world’s first military, brass-cartridge, repeating rifle.  The Spencer Repeating Rifle Company, and the Burnside Rifle Company under license, together manufactured more than 200,000 of these “Wonders” between 1860 and 1869.  Mine is engraved, “Spencer Repeating Rifle Co., Boston, Mass., Pat’d  March 6, 1860.”

 

Boasting a 30-inch barrel, the Spencer was accurate to 500 yards.  Seven 13mm (approximately half-inch in diameter) cartridges were loaded into a spring-energized tube that slid into the buttstock. The Wonder could spray the enemy with up to 20 rounds per minute in the hands of a well-trained Union soldier.  This involved levering a cartridge into the chamber while simultaneously dropping the spent brass to the ground, then pulling back the hammer, aiming and firing. This was not much effort per shot when compared to the gymnastics needed to ram powder and ball down a musket's muzzle, fit a percussion cap, and get off a (less accurate) shot.

 

In an excellent example of bureaucratic bone-headedness, the U.S. Department of War’s Ordnance Department delayed putting the Spencer into the field for fear that soldiers would waste too much ammunition.  You needed more mules and wagons to move all that ammunition. And you could buy several trusty Springfield muskets for the price of just one Spencer. Consequently, not until the Battle of Gettysburg did the 13th Pennsylvania Reserves demonstrate the withering worth of the Wonder. (1)

 

The Spencer’s performance at Gettysburg caught the attention of President Lincoln, who gave Charles a chance to demonstrate his invention against the best musketeers.(2)  After that, the rest, as we say, was history. The Confederates were not only outnumbered; they were pathetically outgunned.

 

Time passed

In the fall of 1965, I went off to college.  To my lasting shame, I pretty much lost touch with my grandparents.  My Spencer gathered dust in my old room at home, while I majored in fraternity at Franklin & Marshall College.  My “Pappy” passed away at 86 during my college years.  The remaining treasures in his attic got divided up among his kids.  In 1969, joining the Coast Guard after graduation to steer clear of Vietnam, I moved away from home for good.

 

The Wonder went with me, first to Cleveland, Ohio, and later to Austin, Texas, where I taught business law after getting my doctorate and law degree. Sometimes it got displayed, sometimes shoved into a closet.  All I knew of its original owner from Pappy was that his name was Aaron Henry and he hailed from the Scotch-Irish side of my grandfather’s family.  Pap’s mother’s maiden name was Morrison.  Pappy claimed Aaron was his uncle, which made him my great-great-uncle.  The old veteran had retired to the Armbruster family farm.  Having been wounded in the knee, he remained in Pappy’s memory as a geezer who limped around the property until he finally passed away.

 

When my wife and I --- the family prodigals --- finally moved back to Pennsylvania in the 1980s, I made a few desultory attempts to track old Aaron down.   Pennsylvania’s Civil War military records weren’t known for their ease of access. Caught up in raising a family and pursuing a law practice, I wasn’t known for my persistence in such matters as genealogy.  

 

Enter Ancestry.com

 

As one ages, ancestry, like an old friend long taken for granted, often gains in importance.  My retirement from Rider University in June 2019 was coincident with my son’s Christmas gift --- a six-month subscription to Ancestry.com --- going “live.”  He was wise enough to know that with more time on my hands, this gift would be appreciated.

 

Thanks to the vast archival resources Ancestry has pulled into its tent, I was able to find seven “Aaron Henry” entries in the annals of Pennsylvania regimental records.  From there, it wasn’t hard to narrow the quest down to my great-great-uncle. Born in Mauch Chunk (now Jim Thorpe, my hometown) on November 12, 1837, he was baptized a couple of weeks later in the town’s First Presbyterian Church.  By 1850, 13-year-old Aaron was living with his mother, her second husband Alexander Craig, and a trio of half-sisters.

 

Aaron, it seems, never left Mauch Chunk until, at 23, he volunteered for a three-month enlistment with the Pennsylvania Sixth Regiment on April 22, 1861.  Note that the bombardment of Fort Sumter, the first battle of the Civil War, occurred on April 12th and 13th.  Young Aaron wasted no time in joining up.

 

In fact, Dr. Gregory J.W. Urwin, an esteemed military historian and Civil War reenactor on the faculty of Temple University, tells me, “You can consider Henry one of the most ardent of Pennsylvania’s patriots by his rushing off to fight as soon as war broke out.”(3)

The Battle of Bristoe Station

According to Ancestry.com, on October 14th of the war’s third year Uncle Aaron received a musket ball in the knee that he carried for the rest of his life.  Presumably, it was either more efficient or less dangerous to leave the ball in situ than to risk removing it and losing the lower leg in the process.

 

Urwin noted, “If he was wounded at Bristoe Station on October 14, 1863, that means he then joined a regiment raised to serve for three years.  The fact that he wanted to see the war through to its conclusion also says something about his character.”

 

The prelude to the Battle of Bristoe Station was the nadir of the Union army’s fortunes, a trough of incompetence from which the Yankees only began their painful climb at Gettysburg.

 

The depth of Uncle Aaron’s patriotism, as Dr. Urwin surmised, is punctuated by the fact that, during the winter and spring of 1863, the Army of the Potomac was enduring some 200 desertions a day.  On New Year’s Day, encamped on the northern banks of the frozen Rappahannock River, the troops hadn’t been paid for six months. Living conditions were deplorable, facilitating the spread of vermin and disease.(4) No wonder men lit out for home!

 

Worse was to come in the first half of 1863.  Following a desultory foray in the enemy’s general direction that ended in a spring-mud debacle, General Ambrose Burnside --- the previous year’s loser at Fredericksburg --- was replaced by Joseph Hooker.  Apparently a better quartermaster than tactician, Hooker cleaned up the camps, saw his army fed and paid, then marched them off to a crushing defeat at Chancellorsville.

 

While Hooker’s incompetence and timidity cost the Army of the Potomac some 17,000 men, Lee’s brilliant win cost the Confederacy more than 13,000.(5)  In a war of attrition, the Union could absorb the loss far more easily than the Confederacy, even allowing for the desertions and the Copperhead agitation that was stirring northern and especially mid-western anti-war sentiment.

 

Unable to sustain a war of attrition, Lee led his army north into Pennsylvania.  In the first days of July, he met George Gordon Meade, Hooker’s successor, at Gettysburg.  The three-day engagement is often called the “high water mark of the Confederacy.” Lee would never have the initiative again.  Still, nearly two more years of war lay ahead. 

 

In the grand sweep of all the war that still remained --- an ordeal extending from Gettysburg in July 1863 to Appomattox in April 1865 --- the Battle of Bristoe Station is a mere burp of a battle.  The October 14, 1863 encounter produced only 380 Yankee casualties and 1,360 Rebel dead and wounded.(6)  The facts are straightforward.  After Gettysburg, Meade and Lee played a chess game in northern Virginia.  In early October, Lee stole a march on the less agile Meade, forcing the Army of the Potomac to fall back from its southernmost advance to protect its flanks.

 

Major General Gouverneur K. Warren’s II Corps had been stung by Confederate General J.E.B. Stuart’s cavalry at Auburn (VA) on the 13th.  Warren was faced with pushing Stuart aside while maintaining an orderly retreat from the Rebel corps commanded by Richard Ewell. He chose Bristoe Station, a stop on the Orange and Alexandria Railroad line, to face the music.

 

Lt. General A.P. Hill’s Confederate III Corps reached Bristoe on the 14th. Warren deployed his troops behind a railroad embankment.  From that hidden vantage, he ambushed Hill’s troops as they rushed to catch up with the Yankee rear guard.  The encounter was a Union win.  But with Ewell’s Confederate troops advancing on his left, Warren had to rejoin the general retreat.(7)

 

Aftermath

 

Warren got a promotion out of the encounter.  He is remembered to the present day, at least by Civil War buffs, and is the subject of a relatively recent biography.  His official photograph survives.

 

First Sergeant Aaron Henry of the 81st Pennsylvania Regiment went home to Mauch Chunk. According to Ancestry.com, he married Sarah Johnson on January 10, 1867.  She died less than a year later, apparently in childbirth.  The daughter, Jennie Henry, survived and lived until 1937.

 

In 1870, Uncle Aaron took up blacksmithing in his hometown.  He remained single until 1884, when at 47 he married the 34-year-old Amelia Hahn, with whom he had a son, Garfield.  They lived for a while in Franklin Township, a Carbon County hamlet.  Exactly when he moved to the Armbruster farm is unclear.  In 1901, aged 64, Uncle Aaron decamped to a veterans’ home in Hampton, Virginia.  Nine or maybe ten years later, he returned once again to Carbon County, where the disabled vet died on January 10, 1912.

 

Unlike General Warren, Sergeant Henry left behind no portrait.  I don’t even know which knee bore the musket ball.  He was one of three million Americans who fought in the War Between the States, 600,000 of whom died.(8)  Thanks to intrepid photographers with their bulky and primitive equipment, a tiny fraction of these soldiers were captured on glass plates.  For all I know, Uncle Aaron is somewhere among those unidentified fighters.

 

That he brought home his rifle was probably not so unusual.  My father’s generation returned from World War II with all sorts of memorabilia.  As kids, my brother and I played with a disarmed Japanese mortar shell, among sundry other authentic souvenirs.  My Uncle Albert had his .45 pistol.  Pappy told me the Spencer had been used for deer hunting as recently as the 1930s and it still seems to be in good working order.

 

So what?

 

So… now, at last --- more than a half century after my grandfather gave me his best treasure --- I know the broad details of the life and labors of the man who carried it into battle.  What does that matter?

 

The reason I think it matters… the reason I have written about it… is the disruptive moment in which each of us, and our nation, find ourselves. Not since Uncle Aaron answered the call in April of 1861 has the nation been so starkly divided. Not since then have we heard each side shout across the divide that the “others” aren’t worthy of life itself. If we are going to raise our eyes from the abyss, gaze across it and acknowledge our fellow Americans, I believe we must first look to ourselves.  What are our personal stories that teach us that American democracy is greater than our transient differences?  For me it’s knowing that nearly 160 years ago, my great-great-uncle rallied to the Union cause.  The “Wonder” on my shelf reminds me daily of my own roots.

 

I am convinced that Churchill could not have faced the existential threat of the Third Reich and rallied his nation had he lacked his appreciation of British history and the place of his family tree in that history.

 

“If this long island story of ours is to end at last, let it end only when each one of us lies choking in his own blood upon the ground.”

 

That sentence illustrates for me how he masterfully combined the sweep of a collective historical experience with the immediate challenge facing each individual Briton, requiring each man and woman to look inside themselves.

 

I believe each of us is well advised to tear ourselves away from the ephemeral distractions of social media, presidential tweets and TV’s talking heads, and take a little time to recall those ancestors who contributed to making each of us an American.  No matter if, as with me, one can look back a century or two, or if one must look to the history of another nation that drove a decision to emigrate to the U.S.

 

As with my successful search for Uncle Aaron, a retracing of the tap root of what made each of us an American can be a profound reminder of why we must put our democratic republic ahead of transient sectarian differences and deal with tomorrow’s existential challenges as, collectively, the American constitutional democracy.

 

(1) Philip Leigh, Lee’s Lost Dispatch and Other Civil War Controversies (Yardley PA: Westholme Publishing 2015) at 25-36.

(2) John Walter, The Rifle Story (London: Greenhill Books 2006) at 69.

(3) Gregory J.W. Urwin, “Re: Union Pennsylvania Volunteers,” Message to James Ottavio Castagnera, August 8, 2019, via gmail.com

(4) Geoffrey C. Ward, The Civil War (New York: Alfred A. Knopf, Inc. 1990) at 184.

(5) Ibid. at 210.

(6) David M. Jordan, Happiness Is Not My Companion: The Life of General G.K. Warren (Bloomington: Indiana University Press 2001) at 108.

(7) Ibid. at 110.

(8) Ward, op. cit., at xix.

Little Women Are Bigger Than Ever

 

The Civil War was a difficult time for all American families, North and South. It was especially tough for the March family, of Concord, Massachusetts. A mom and four daughters ran the household and worked at various full and part-time jobs while their father was in the Union army. Their story was recorded as fiction in the autobiographical work Little Women, by Louisa May Alcott (it was her family). It is the story of Jo March, the second of the four sisters and a writer, her professional and emotional ups and downs, and the emotional struggles of her family. It is one of the country’s most beloved novels.

 

I believe I am the only living American who has never read the novel or seen any of the many movie adaptations of it. There was an elderly man who lived in the foothills of the Rockies who never read the book either, but I think he has passed away.

 

So when I walked into my packed neighborhood theater to see the film (passing the 334 million people on line to see the latest Star Wars), I did not know what to expect. It seemed from the pre-show chatter that everyone in the theater had read the book at least four times.

 

The movie unfolded, very slowly. The first third of the film, which opened nationwide last weekend, is very disjointed. I was not sure which woman was in love with which boy, where the dad was, who was getting sick, which neighbors were friends and what Meryl Streep was doing playing some aunt with a great deal of money. I was confused and just did not know why the book and previous movies had become, well, immortal.

 

Then, by the middle of the movie, the plot started to develop and the characters, the sisters, bloomed. From that moment on, they had me. I was enthralled with the story and the sisters. I felt like yelling at the screen to tell the sisters to do something about a situation because I so desperately wanted to help them. The new Little Women, marvelously directed by Greta Gerwig, is a tremendous movie, an all-engulfing emotional roller coaster that displays not only the March family, with its triumphs and tragedies, but also the lives of women in the middle of the nineteenth century and how they all fit into the story of the Civil War, which at the time was ripping the nation apart.

 

Little Women, a Sony Pictures film, succeeds on several levels.  First and foremost, it is a loving look at several very admirable young women and the problems they have to grapple with at that time. They fight with each other, fret with each other and most of all love each other. They march down the street together, sit in the theater together, shiver in the New England cold together and, in the end, in several ways, triumph together.

 

Director Gerwig, who also wrote the screenplay, has made this a very modern look at life in the 1860s. There are several well-crafted dialogues from the girls about how unfair a woman’s lot was in that era – unable to do just about anything because of gender discrimination. Why does their goal in life have to be finding a husband and not finding a career? Why do men look at them as cooks and servants and not as emotional partners?  Why should the best thing a woman could do be to bear several children and not to have several jobs? That dialogue is clear and reflects the plight of women today, too. Gerwig’s screenplay is as much 2019 in tone concerning women’s roles in the world as Alcott’s story was about them in the middle of the 19th century. The screening I attended was filled mostly with women; men should see this movie too, to learn something.

 

My complaint about the story and the movie is that while it is set during the Civil War and about a soldier’s family back home, there is not much about the war. There is a scene where Concord residents provide blankets and food for soldiers coming home, several descriptions of losing loved ones, some worry over the dad’s safety and, later, his sickness. Jo even cuts her hair for money to help get her dad back home. The film should have offered more about the war. It should have noted, too, that while life was hard on northern families, it was hard on southern families as well. For every March family in the Union, there was surely one in the Confederacy.

 

The roles of the women are clearly delineated.  Jo is the sister who writes plays and novels but doesn’t get anywhere. She guides all the others, who find husbands, despite declarations that they don’t want to be just wives and moms. Mom has nothing but problems with the girls, always dispensing advice about life and love that is just as valuable today as it was then. She is the rock the girls stand upon while their dad is in the army.

 

In the end, life turns out reasonably well for the girls and for the Union, too. You learn a tremendous amount of history in the film – church, economics, writing, farming, universities, modes of travel, weather, fashions and Christmas breakfasts and dinners for rich and poor. Oh, the piano, too. It is an enchanting look at life in the mid 19th century and it is full of worry and travail, too, just as life is today.

 

Director Gerwig gets superb performances from her village of actors. The star of the story, writer Jo, is played with wonder and awe by Saoirse Ronan, who keeps everyone on the edge of their seats as she barrels through life. Other fine performances are by Emma Watson as Meg, Eliza Scanlen as Beth and Florence Pugh as Amy.  Laura Dern is America’s mom as their mother.  Bob Odenkirk, musket over the shoulder, is the dad and Meryl Streep is the very rich and very eccentric aunt.

 

The film starts after the war with Jo trying to sell her work and then flashes back to the start of the war and the abandonment of the girls when dad marches off to war. The story is both Jo’s hard work as a writer and the life of the girls together with a few tragedies tossed in. The strength of it is the love of the girls for each other, even though, in anger,  they do awful things to each other.

 

These are big sisters for the little women of the story.

Somewhere Over Their Rainbows - Deanna Durbin and Judy Garland

 

Do you remember actress Deanna Durbin? If so, you are one of the few.

 

How about Judy Garland? Well, of course you do - Dorothy from Oz.

 

In the late 1930s and 1940s, Deanna and Judy, just teenagers, were two of the biggest stars in Hollywood. Deanna was not only a superb actress, but as a singer had the voice of an angel. Judy had, well, Judy had it all.

 

Judy stayed in Hollywood, led a tragic life and died of a drug overdose at 47. Deanna fled the bright lights and cameras at age 29, stunning the world, moved to a farmhouse in France, became a recluse and never appeared in another film. Over the next 62 years, she gave only a single media interview. Two careers, two lives and two distinctly different stories.

 

Actress/singer Melanie Gall has merged the two stories into one, Ingenue: Deanna Durbin and Judy Garland, and the Golden Age of Hollywood. The one-woman play just opened at the Soho Playhouse, on Van Dam Street, in New York. Gall, who also wrote the drama, has done a fine job. It is an eye-opener of a tale and an absolute treasure chest of show business history. Gall plays Deanna and brings in the story of Judy in an interview with an invisible New York Times reporter. It is Deanna’s story, not Judy’s, and she sings Deanna’s music and not Judy’s (except for Somewhere Over the Rainbow).

 

I knew a little bit about Durbin, the Canadian-born singer who rocketed to fame by the age of 15, but not much. Nobody knows very much about her. When Durbin fled Hollywood, she not only never appeared as an entertainer again but pulled the plug on most of her American movies, and you can hardly see them anymore (ironically, a Durbin movie and a Garland film were playing at the same time on television last week).

 

The legend was that the two, who starred in a movie together in 1936, were lifelong bitter rivals, but really, they were not. There may have been some jealousy between them, but I doubt they were enemies. Gall, in her story, suggests the latter version, and points out that Garland thought Durbin was shortsighted in leaving the movies and wished she had remained.

 

Gall tells a fascinating and colorful story. Durbin came to Hollywood as a kid, like so many others, but had a great voice and won a $100-a-week contract with MGM. There, she met Garland and the two became close friends. Movie mogul L.B. Mayer did not think he needed two child stars, so he fired Durbin (the play suggests that might have been accidental and he may have wanted to boot Garland). At her new studio, Universal, Durbin became famous right away, and her first few pictures were so successful that they saved the studio (Deanna starred in 21 movies in her storied career). Judy caught fire with The Wizard of Oz and became immortal. The rumor was that MGM wanted Deanna for the role of Dorothy in Oz and that she auditioned for it, but refused it because Judy wanted it.

 

Gall tells the audience that Durbin was probably a better singer, but Judy had more hits. However, film historians seem to agree that in that era Durbin was one of the most beloved actresses in the world. In 1947, she was not only the highest-paid actress in Hollywood but the highest-paid woman in America.  That year her fan club was the biggest on earth. American GIs in World War II even named a bomber after her. The Metropolitan Opera wanted her to join its company. Deanna was also Winston Churchill’s favorite actress.

 

So why did Durbin become a recluse? Gall says she was tired of Hollywood, found fame tedious and wanted to live a normal life. That can’t be all of it, though. Others say she hated the studio system’s dictatorial control of a performer’s life, thought her career was over at 29, as it was for many actresses then, and hated never being cast in very serious roles (the directors always had her singing something somewhere in the script).

 

Gall is quite good playing Durbin, and she is a superb singer. The problem is that it is a play about Durbin and Garland without Garland. It would be much better as a two-character play, and it should be a bit longer (it’s just a little over an hour). Gall carries the play well, but you really need a richer story and more nuance about Judy’s life.

 

Also, nowhere in the play is any reference to Ray Bradbury’s The Anthem Sprinters, a delightful short story about a Deanna Durbin movie screened in Ireland, followed by a race of moviegoers to a pub before the cinema plays the Irish national anthem at the conclusion of the film.

 

The story needs a far better explanation of why Durbin fled Hollywood. What did her friends say? Show biz buddies? Neighbors? Family? How did her rather wild personal life (three marriages and two out of wedlock pregnancies) affect her?

 

She is far better known today in the United Kingdom and Europe than in the U.S. because the actress never cut off her films there. In fact, there is still a ‘Deanna Devotees’ fan club in England.

 

The best part of the play, for me, was the question and answer session at the end. Gall is an authority on both women and researched their lives thoroughly. She really illuminated their lives by just answering audience questions. It was there, in that Q and A session, that she dropped her bombshell. It seems that back in the early 1950s, when Durbin had been retired for a few years, the writers of the still-untested My Fair Lady went to see her to convince her to play Eliza Doolittle. She flat out refused. That role, of course, would have made her famous all over again, an international superstar, a brilliant comet racing across the show business sky.

 

The strength of the play is its show biz history. You get a wonderful education in how the old movie studio system worked, how child actors were educated at special studio schools, and how stars had homes built for them right on the film sets. Impressive money could be made, too. The play should be subtitled Show Biz History 101.

 

If you ever notice that one of Deanna Durbin’s movies is on television, a rarity, catch it. This girl could sing!

   

PRODUCTION: The play, originally staged at the Fringe Festival, is produced by the Soho Playhouse. Piano and musical arrangements: Bennett Paster. Graphic design: Christache Ross. The play had a three-week run and will be staged again at some U.S. theater.

Profiles in Courage and Cowardice

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

The votes in the House and soon in the Senate about impeaching Trump are mostly seen as foregone conclusions. It was clear that he would be impeached on both articles in the House, and that two-thirds of Senators would not vote to convict. Hidden in these outcomes are many individual dramas for the few members of Congress who position themselves near the middle, who represent districts where elections are in doubt. Their votes represent more than partisan loyalty – they display courage or its absence.

 

Democrat Elissa Slotkin represents a House district in Michigan that had long been in Republican hands and was won by Trump in 2016 by 7 points. She beat the incumbent in 2018, winning just 50.6% of the vote. Like all the Democratic House members who won in districts that had gone for Trump, she worried about how her vote would affect her chances of re-election. She read our founding documents at the National Archives and spent a weekend at her family farm reading the hearing transcripts. When she appeared at a town hall meeting in her district last week, she was jeered by Trump supporters before she said a word. When she announced that she would vote for impeachment, she got a standing ovation mixed with shouted insults.

 

She explained herself in an op-ed in the Detroit Free Press:  “I have done what I was trained to do as a CIA officer who worked for both Republicans and Democrats: I took a step back, looked at the full body of available information, and tried to make an objective decision on my vote.” She also faced the consequences squarely: “I’ve been told more times than I can count that the vote I’ll be casting this week will mark the end of my short political career. That may be. . . . There are some decisions in life that have to be made based on what you know in your bones is right. And this is one of those times.”

 

Of the 31 House Democrats who won in districts that Trump carried in 2016, 29 voted to impeach him. Collin Peterson of Minnesota, from a district that Trump won by 31 points, voted against. And then there’s Jeff Van Drew, first-term House member from New Jersey, who won in a district that has flipped back and forth between parties, and was won by Trump by 5 points. His 2018 victory was aided by considerable financing from the Democratic Congressional Campaign Committee. In November, he said in a teleconference that he was against impeachment, but vowed to remain a Democrat, which he had been his whole life: “I am absolutely not changing.” Then he saw a poll of Democratic primary voters, in which 70% said they would be less likely to vote for him if he opposed impeachment. Meanwhile, he was meeting with White House Republicans, who promised him Trump’s support. So he voted against impeachment and became a Republican. The next day, Trump asked his supporters to donate to Van Drew’s campaign. Van Drew’s Congressional staff resigned en masse. He told Trump in a televised Oval Office meeting, “You have my undying support.”

 

Somewhere in the middle between political courage and its absence lies the case of Jared Golden of Maine, whose successful 2018 campaign to unseat a Republican incumbent I supported. His district went for Trump by 10 points in 2016, and Golden won only because the new Maine system of ranked-choice voting gave him enough second-place votes to overcome his rival’s lead. His re-election certainly qualifies as endangered.

 

Golden took a unique approach to impeachment, voting for the first article on abuse of power, but against the second on obstruction of Congress. He said that Trump’s obstruction of Congress “has not yet, in my view, reached the threshold of 'high crime or misdemeanor' that the Constitution demands.” Golden wrote a long statement explaining his actions, arguing that House Democrats had not yet tried hard enough to get the courts to force Trump’s aides to testify.

 

I cannot judge Golden’s motives. He said, “I voted my heart without fear about politics at all.” Perhaps his heart feared the end of his political career.

 

But it is worth considering how Trump has defied Congress since he was elected. When Congress refused to appropriate as much money as he wanted to build his Wall, Trump decided to spend it anyway by declaring a “national emergency”. According to the Constitution, only Congress has the authority to decide how to spend taxpayer funds. Federal courts then blocked Trump’s use of other funds. Trump’s lawyers argued that no entity has the authority to challenge in court Trump’s extension of his powers. In July, the Supreme Court sided with Trump and allowed spending for the Wall to proceed.

 

Trump’s defiance of Congressional oversight began long before the impeachment crisis. In February, the administration refused to send Congress a legally required report about the murder of Jamal Khashoggi by Saudi operatives. A Trump official said, “The President maintains his discretion to decline to act on congressional committee requests when appropriate.” In April, he told a former personnel security official not to appear before the House Oversight Committee, which was investigating White House security clearance practices. That month, the Justice Department defied a bipartisan subpoena from the Oversight Committee investigating the addition of a citizenship question to the 2020 Census.

 

Robert Mueller found many instances of Trump’s obstruction of justice in the Russia investigation. Mueller declined to conclude that Trump had committed a crime, only because of a Justice Department memo that claims temporary immunity of a sitting president from prosecution. He clearly pointed toward impeachment as a remedy, and the House impeachment committees considered putting those actions into an article of impeachment. They decided not to, in order to simplify the process.

 

There are many other examples. Jared Golden’s idea that the House should wait and pursue their requests through the courts ignores the unprecedented nature of Trump’s refusal to do anything that the Democratic House requests or demands. It makes no sense to treat each instance of obstruction as a separate judicial case, which makes it impossible for Congress to do its job. Jeffrey Toobin of the New Yorker wrote, “Trump will create a new constitutional norm—in which the executive can defy the legislature without consequence.”

 

When John Kennedy wrote (or just put his name on?) Profiles in Courage, he quoted a column from Walter Lippmann, who had despaired of any courage among elected politicians: “They advance politically only as they placate, appease, bribe, seduce, bamboozle, or otherwise manage to manipulate the demanding and threatening elements in their constituencies. The decisive consideration is not whether the proposition is good but whether it is popular.” Yet historian Jon Meacham and political science PhD candidate Michael E. Shepherd write that many Congresspeople who took unpopular votes survived.

 

The great majority of young House Democrats who face difficult re-election campaigns in Trump districts acted courageously. Elissa Slotkin explained what courage looks like: “Look, I want to get reelected. The greatest honor of my life is to represent this district. But if I’m not, at least I can look at myself in the mirror when I leave office.”

Confederate Monuments in National Perspective

Confederate Monument in Pittsboro, North Carolina

 

The recent removal of a Confederate memorial in Chatham County, North Carolina, indicates that communities continue to discuss the future of monuments to white supremacism despite the efforts of some state legislatures to suppress municipal governance. The displaced Pittsboro soldier statue, produced by the W. H. Mullins Company of Salem, Ohio, serves as a reminder that Confederate monuments took shape in commercial and cultural contexts that extended beyond the former slaveholding states. Northern commemoration introduced the martial metaphors of social regulation that white southerners adapted to local settings, though some Union monuments instead demonstrated the more inspiring potential of war memorials. 

 

My new book, Civil War Monuments and the Militarization of America, presents a nationwide examination of this important cultural form. Extremely rare before the Civil War, public monuments to common soldiers radiated primarily from the Northeast in the 1860s. Some early memorials marked places of interment, but more were community cenotaphs for local soldiers whose remains were collected in distant cemeteries. A new surge of production accompanied the invigoration of the veteran movement in the 1880s. In the North as well as the South, monuments proliferated dramatically until World War I and more slowly over the next quarter-century. The tributes increasingly honored all soldiers from a community--like the 1907 statue in Pittsboro--rather than singling out the dead. The North installed more monuments than the South, and considerably more of the memorials that most fully explored the possibilities of the practice.

 

Civil War monuments were the chief instrument by which the soldier replaced the farmer as the prototypical American citizen. The substitution of a military exemplar signaled a forcible insistence on the labor and racial orders hardening in this age of inequality. The factory worker was the true successor to the farmer as the characteristic laborer of the industrial era, but the soldier personified the conformity and vigor that monument sponsors expected from ordinary citizens. “The glory of the private soldier of the South was that, an intelligent unit, he permitted himself, for duty and love, to be made into the cog of a wheel," declared industrialist Julian S. Carr at the 1917 dedication of a memorial in the manufacturing town of Rocky Mount, North Carolina. The high-profile deployment of troops to break strikes across the North during the late nineteenth century underscored the tension between the paragon of obedience and workers’ challenges to industrial hierarchy.

 

Celebration of proslavery rebellion in a subsequent society defined by segregation, disfranchisement, racialized incarceration, and lynching certainly distinguished Confederate monuments from their Union counterparts, but exaltation of whiteness was also important to northern memorials. Protests against bronze or stone soldiers based on Irish, Italian, or German models illustrated ethnic prejudices common amid the mass immigration to the North. Some Union memorials incorporated pseudo-Darwinist ideology. The well-publicized model for the allegorical figure on the 1913 memorial in Adams County, Indiana, was Margaret McMasters van Slyke, whom Physical Culture editor Bernarr McFadden had judged Chicago's "most perfectly formed woman" as measured by the features that the eugenicist considered essential to sexual reproduction of the fittest national race.

 

Theo Alice Ruggles Kitson’s Volunteer, unveiled in Newburyport, Massachusetts, in 1902, was an especially elaborate example of this pattern. The acclaimed statue drew on studies of marching by Edward H. Bradford, professor of orthopedics and later dean at Harvard Medical School. In this conceptualization, experienced soldiers eschewed the straight-leg, heel-to-toe gait of urban pedestrians and instead leveraged gravity in a bent-leg, full-foot stride characteristic of “moccasined and semicivilized nations.” The veteran embodied the supposed power of civilized whites to absorb the strengths of “inferior” races. Like other innovations in the design of soldier monuments, The Volunteer prompted reproductions and imitations across the country. Kitson would go on to produce the most influential common-soldier statue of the Spanish-American War, The Hiker, first installed at the University of Minnesota in 1906. Building on the conjunction of martial training and whiteness, The Hiker explicitly honored full-time regulars of the U.S. Army, rather than the temporary volunteers who dominated Civil War monuments, and depicted the hero in the imperialist project of “hiking,” which, as Sarah Beetham has noted in an excellent dissertation on common-soldier monuments, referred specifically to hunting insurgents in the mountains of the Philippines. This monument, too, was reproduced across the North and South.

 

Scholarly discussion of northern contexts for Confederate commemoration has centered on the level of resistance or deference to the Lost Cause. Community memorials regarded Union soldiers as, in the oft-quoted words of James Garfield, “everlastingly right,” much as sponsors of Confederate monuments insisted that “we cannot praise and commend the martyr and at the same time condemn the cause for which he gave his life.” But over time, many northern memorials shifted from Union monuments to generic soldier monuments, emblems of general military readiness that were routinely supplemented to honor combatants in earlier or later wars. Extending this logic, a few northern sponsors eager to reach national audiences dismissed the moral gulf between blue and grey. Yale listed its Union and Confederate dead on the university memorial installed in 1915. Princeton did not even identify the sides on which its former students fought. Although unusual, these tributes illustrated a widespread attempt to depoliticize soldiering. Northern monuments established a foundation for the argument that Confederate military service deserves recognition even though it was rendered in a deplorable cause.

 

Union monuments contributed significantly to the disturbing ideology of Confederate monuments but also pointed toward higher potential in war memorials. Postwar cenotaphs, committed to preserving the individuality of lost neighbors, pioneered the interchangeability of names and bodies that Thomas Laqueur has called the hallmark of the age of necronominalism. Focused on the tensions between intimate realms and national mobilization, these works created the space for “personal reflection and private reckoning” that Maya Lin later sought in the Vietnam Veterans Memorial. Early memorial halls—designed to serve as educational buildings, public libraries, or centers of municipal government—envisioned an engaged, informed citizenry as the basis for military voluntarism in times of crisis, rather than the aggressive ideal of masculinity promoted by later monuments. The list of successes might be extended. Much as the large area of overlap between Union and Confederate monuments provides valuable context for current discussions, so do distinctive features of memorials on either side.

 

Barbarous Hun: The Sinking of the Lusitania and the Rise of Propaganda

 

On May 7, 1915, the British passenger ship Lusitania was torpedoed by a German U-boat off the coast of Ireland. Of the 1,960 passengers and crew on board, only 767 survived; 128 of the dead were American citizens. Europe was already in the midst of World War I, but the U.S. still hadn’t entered the fray, despite hundreds of American men and women volunteering to aid the Allied war effort and even enlisting in the Canadian military. However, the American public at large, including Congress, was deeply divided over whether the country should formally declare war. The sinking of the Lusitania and the propaganda campaign that resulted from the event quickly turned public opinion in favor of war.

 

Officially launched in 1906, the Lusitania was a British passenger liner operated by the Cunard Line, and at the time of her sinking she was one of the largest and fastest ships of her kind in the world. The ship’s construction was funded by the British Admiralty with the agreement that it could be used for military purposes in times of war. According to historian Willi Jasper in his book Lusitania: The Cultural History of a Catastrophe, on May 1, 1915, the Lusitania departed New York City’s harbor for Liverpool, England, with 1,951 people and military munitions on board. Some passengers must have known the journey held some risk: German submarines had recently sunk several British ships traversing the Atlantic, and the German military had repeatedly threatened to attack others.

 

As the Lusitania approached the southern coast of Ireland, heavy fog forced the ship to slow down. There, the German submarine U-20 fired a torpedo that hit the Lusitania’s hull just below the waterline. The blast triggered a second, internal explosion, and the inrushing water made the ship list so quickly that the crew could launch only six of the lifeboats. The loss of life was staggering: 1,195 people drowned or died of hypothermia in the frigid Atlantic water.

 

Reactions varied as news broke of the sinking of the Lusitania. The Allies quickly decried the attack as a murderous plot hatched by the barbarous Huns (a term used heavily during World War I and often seen on Allied war posters). British and U.S. war advocates quickly turned the tragedy into pro-war propaganda, creating everything from postcards to medals to posters. The shipwreck and the subsequent jingoistic campaigns damaged U.S.-German relations and helped tilt the neutral United States into open confrontation with Germany in World War I by 1917.

 

Germany’s response to the event was equally belligerent. Following the sinking of the Lusitania, Munich-based artist Karl Goetz created a medal commemorating the event. In satirical fashion, Goetz mocked British and U.S. business interests, the carrying of munitions aboard the ship, and the supposed neutrality of the United States. However, according to author Patrick O’Sullivan, Goetz made a fatal error that quickly and ironically backfired. Goetz’s medal gave the date of the sinking as May 5, 1915, a full two days before the actual event. The error led to charges that Germany had planned the attack in advance, which further angered a bewildered public and gave propagandists fodder to exploit the deaths even further. The British and pro-war Americans quickly capitalized on the mistake by mass-producing their own version of the medal, complete with accompanying propaganda literature. Captain Reginald Hall of Royal Naval Intelligence ordered 300,000 copies to be minted, which were widely distributed and only added to the growing anger among the British public.

 

Literature and posters also fed the propaganda bonanza. A U.S. poster, for example, advocated the purchase of additional Liberty Bonds with the text “Help Crush the Menace of the Seas” and an accompanying image of a blood-soaked arm holding a dagger emerging from the Atlantic. Fred Spear, of the Boston Committee on Public Safety, produced a powerful and emotionally stirring poster depicting a young mother clutching her infant as she sinks below the waters of the Atlantic Ocean, with only one accompanying word: “Enlist.”

 

British posters depicted the Germans as bloodthirsty devils, marauders, and murderers. A popular British poster, commonly known as “Freedom of the Seas,” mockingly portrayed the German version of that principle of maritime law as the Lusitania sank in the background. Numerous others simply carried the message “Remember the Lusitania” in an effort to swell the ranks of enlistees.

 

The sinking of the Lusitania was, at the very least, a public relations nightmare for Germany. Despite arguments that the attack on the vessel was fully warranted, the grave loss of civilian life on board provided fodder for propaganda machines in Great Britain and in the U.S. Although the event and the subsequent propaganda were not the final trigger sending the U.S. into open hostilities with Germany, they shifted American public opinion in favor of war and further galvanized a war-weary public in England.

Important Links for AHA 2020, New York, NY, January 3-6, 2020

Home Page

Registration Location and Hours

Online Program

Mobile App

Late Breaking Sessions

Why Holocaust Education Is So Important Today

 

One of the many difficult lessons the Holocaust has taught us is that Jews need not be influential or numerous in a country to give rise to anti-Semitism. According to the United States Holocaust Memorial Museum, Jews comprised only 1% of Germany’s population on the eve of the Nazi rise to power in 1933 (505,000 of 67 million people). Yet this small group of German citizens was singled out as an ideal scapegoat for all of the country’s woes following WWI: its defeat in the war, the staggering inflation and high unemployment rates, and the humiliations imposed on Germany by the Treaty of Versailles.

 

In difficult historical or economic conditions, people often stigmatize and dehumanize those they disdain and distrust. Many believe that without adequate education about the past and discussion of the dangers of anti-Semitism, history could repeat itself. This is why Holocaust education is crucial to dispelling the fear, anxiety, and ultimately hatred of Jews. Unfortunately, as the last Holocaust survivors pass away, we risk losing touch with the human-caused catastrophe that nearly wiped the Jewish people off the face of the Earth, increasing the risk of rampant anti-Semitism.

 

In fact, there seems to be an inverse relationship between knowledge of the Holocaust and anti-Semitism. In February 2019, Schoen Consulting conducted a survey at the behest of the Conference on Jewish Material Claims Against Germany. The survey indicated a serious deficiency in knowledge about the Holocaust among US adults: 70 percent of those polled believe that fewer people care about the Holocaust today than in the past, and nearly 60 percent believe that the Holocaust could happen again.

 

According to the Schoen findings, one-third of Americans, and a staggering 40 percent of Millennials, believe that substantially fewer than 6 million Jews were killed in the Holocaust (they mistakenly put the figure closer to 2 million). Half of the respondents could not name a single concentration camp or Jewish ghetto among the 40,000 camps and ghettos that existed across Europe. Stunningly, 41 percent of older adults and 66 percent of Millennials hadn’t heard of Auschwitz, the largest and most notorious concentration camp. Moreover, 80 percent of US adults had never visited a Holocaust museum. Despite these serious gaps in historical knowledge, the vast majority of the Claims Conference poll respondents (80 percent) believed that education about the Holocaust could help prevent such genocides in the future.

 

The Never Again Education Act is an effort by a bipartisan group of US legislators in both the House and the Senate to promote Holocaust education. Several senators introduced a bill that would help fund and encourage Holocaust education programs in American schools. Jacky Rosen, a Democratic senator from Nevada, spearheaded this bill in 2016. The bill stipulates combining private donations with federal and state funding for the Holocaust Education Assistance Program Fund. The program would help pay for training teachers and guest speakers on the Holocaust, cover the cost of textbooks, and fund transportation and housing for teachers attending conferences and seminars about the Holocaust.

 

This bill found bipartisan support in the Senate from Republican senators Ted Cruz (Texas), Marco Rubio (Florida), and Kevin Cramer (North Dakota) and Democratic senators Tim Kaine (Virginia) and Richard Blumenthal (Connecticut). The bill also has strong proponents in the House of Representatives. Sponsored by Representatives Carolyn B. Maloney (Democrat, New York) and Elise Stefanik (Republican, New York), the bill was introduced in the House on April 10, 2018, and has gained 209 bipartisan supporters.

 

The Never Again Education Act would fund and facilitate Holocaust education in every state in the US. Several states have already implemented local codes or guidelines requiring that information about the Holocaust and other genocides be taught in public schools, including California (1985), Illinois (1989), New Jersey (1991), Florida (1994), New York (1994) and, more recently, Maryland (2019). In the wake of an alarming rise in anti-Semitic domestic terrorism and attacks on Jewish centers and synagogues, I am glad to see that legislators across the country see the urgent need for a more in-depth, national program of Holocaust education.

Roundup Top 10!  

Why We Need Walt Whitman in 2020

by Ed Simon

With our democracy in crisis, the poet and prophet of the American ideal should be our guide.

 

How America went from Barack Obama to Donald Trump in one head-spinning political decade

by Kevin M. Kruse and Julian Zelizer

Obama’s America and Trump’s America feel like separate nations, but they are bound by Newton's law: For every action, an equal and opposite reaction.

 

 

The case for historians being more engaged in public affairs, not less

by Adam Laats

Americans need to not just study history, but to better understand why our knowledge of the past is ever-evolving.

 

 

Historians Should Stay Out of Politics

by Andrew Ferguson

The lasting question the petition raises is: “So what?” Put another way: “Should we care what these historians think about Trump’s impeachment?”

 

 

Making Room for Miscarriage

by Lara Freidenfelds

In my new book The Myth of the Perfect Pregnancy: A History of Miscarriage in America, I set out to write the history of our modern culture of childbearing in the US, and how it has produced our expectation of perfect pregnancies.

 

 

The Forgotten Story of Christmas 1918

by Mary Elisabeth Cox

We remember the 1914 Christmas Truce as a moment of humanity amid war. Four years later, a darker tale unfolded.

 

 

The Fight Over the 1619 Project Is Not About the Facts

by Adam Serwer

A dispute between a small group of scholars and the authors of The New York Times Magazine’s issue on slavery represents a fundamental disagreement over the trajectory of American society.

 

 

Why demagogues were the Founding Fathers’ greatest fear

by Eli Merritt

Washington’s greatest fear that summer of decision in Philadelphia was that unwise, self-seeking politicians — even if fairly elected to public office — would tear down the central government and its constitutional laws for the sake of their own advancement and glorification.

 

 

The Power of Chilean Women Denouncing Gendered and Sexual Violence Through Song

by Margaret Power

For the last two and a half years, women students and faculty have exposed and denounced the rampant sexual abuse and sexist education that take place in Chilean schools and universities.


 

Can remembering past atrocities prevent future ones?

by Jennifer Evans and Paula-Irene Villa Braslavsky

How a recent art installation in Berlin sparked controversy.

Great Britain’s Secret Role in Prodding a Reluctant U.S. to Superpower Status

 

“Unfair,” “obsolete” and “rip-off” are some of the terms President Trump has used to characterize our nation’s international alliances and defense pacts. 

 

The suspicion of foreign “entanglements” is nothing new; isolationism dominated American politics before World War II. In 1939, America had an army of just 185,000 men and a defense budget smaller than Italy’s.

 

America’s rapid emergence as a global superpower after 1945 is the subject of Grand Improvisation: America Confronts the British Superpower, 1945-1957. Author Derek Leebaert challenges the conventional wisdom that an exhausted Great Britain voluntarily “handed the baton” of world leadership to the U.S. after World War II.

 

According to Leebaert, it is a “myth” that in 1945 the British Empire “was too weak and too dispirited” to continue its role as global policeman. In Grand Improvisation, he carefully documents a different narrative, arguing persuasively that from 1945 to 1956, Great Britain desperately tried to maintain its role as head of the world’s greatest empire, despite massive war debts and an electorate focused on major domestic reforms.   

 

The author points out that Great Britain, contrary to a popular misconception, survived World War II in good shape. The island nation emerged victorious, retained every inch of its pre-war territory, and kept almost all of its industrial resources. In a world of 2.3 billion people in 1945, more than 600 million were subjects of the King of England. Leebaert notes that “A quarter of the map was colored in imperial red,” including Egypt, India, South Africa, Hong Kong and Singapore.

 

In the election of June 1945, Labour Party leader Clement Attlee swept Winston Churchill out of power with a stunning victory. While the new Labour government is remembered principally for its major reforms, including the creation of the National Health Service and the nationalization of the coal and steel industries, it was also dedicated to maintaining Britain as a major world power. The Labour government authorized the building of Britain’s own atomic bomb (with plutonium from an advanced type of atomic power plant) and invested in daring military aircraft designs.

 

While the U.S. reduced its armed forces by 80 percent in the year after August 1945, Britain still had 1.1 million men under arms in December 1946; the British leadership was worried that America would retreat into the isolationist policies of the 1930s. The British foreign policy leadership, headed by Prime Minister Attlee and Foreign Secretary Ernest Bevin and backed by the nation’s elite corps of Foreign Service officers, sought to keep the U.S. directly involved in Europe and Asia while maintaining the economic and political power of its own Commonwealth. 

 

According to Leebaert, Attlee and Bevin cunningly set out to play a “double game.” The British leaders sought to inveigle the Americans into providing substantial military assistance and economic support for Britain’s foreign policy objectives, while Britain retained the role of senior partner.

 

The first example he cites involved the 1947 crisis in Greece. Since the war, the British had pumped economic assistance into Greece and stationed 5,000 troops there, propping up a shaky monarchy fighting a communist-backed guerrilla army. Great Britain, struggling to pay off its debts and maintain the value of the British pound, was desperate to withdraw its troops and cut economic assistance, yet still remain a dominant power in the Mediterranean. 

 

Bevin, who had been a powerful union leader before the war, used what he privately admitted were “shock tactics” to scare the American leadership into committing massive military aid to Greece.

 

In a daring bluff, Bevin told U.S. leaders that Britain would imminently withdraw its troops and cut aid. This frightened President Harry Truman and Secretary of State George Marshall into action. Truman went before Congress on March 12, 1947, to announce a new policy whereby the U.S. would “support free peoples” around the world. This new responsibility was necessary, Truman said, because Great Britain was “liquidating” its role. In fact, Leebaert states, the U.S. president had been misled: Bevin would have kept troops in Greece even if the U.S. had not pledged support.

 

A year later, the U.S. initiated the Marshall Plan, which pumped some $100 billion (in current dollars) into Western Europe. In Leebaert’s view, this was another success by the crafty Brits in tapping Uncle Sam’s checkbook to support their sphere of influence.

 

After achieving his goals in Greece, Bevin sought additional American commitments, particularly in Europe, where the Russians were tightening control over the Soviet bloc and deploying strong forces in East Germany. Bevin’s concern was that the Americans were not formally committed to protecting Western Europe. He wanted the U.S. to “specify how, when and under what terms” it would send troops to oppose Russia. Thus, he quietly inserted two paragraphs into a draft treaty that would become NATO. This was the famous “Article 5,” which committed all signatories to respond together in case of an attack on any one of them (a commitment that Trump has said he may not honor).

 

The British maintained the illusion that they still controlled an empire and were, in effect, co-equal world leaders with the U.S. until the Suez Crisis of 1956.

 

After Egyptian dictator Gamal Abdel Nasser nationalized the Suez Canal in July 1956, France, Britain and Israel secretly conspired to invade the canal zone and return it to European control. The result was a military debacle. President Eisenhower, who had not been consulted, was furious and cut off U.S. oil supplies to the three invaders. The United Nations voted overwhelmingly to condemn the action. Under fire for initiating a humiliating military defeat, Prime Minister Anthony Eden (who had succeeded the elderly Churchill) resigned in early 1957.

 

The Eisenhower Administration, led by Secretary of State John Foster Dulles and Vice President Nixon, soon made a series of speeches that were hailed in the U.S. press as “Eisenhower’s Declaration of Independence” from British foreign policy. Nixon would later cite this moment as the point where the U.S. took over “the foreign policy leadership of the free world.”

 

Leebaert notes that the term “superpower” is frequently used today but often misunderstood. He reports that William T.R. Fox, a Columbia University professor of international relations, coined the term in 1944. Fox said it defined a nation that wielded “great power plus great mobility of power”: in other words, the ability to project military power around the world by land, air and sea. The U.S. achieved superpower status in the 1950s, and the Chinese are working hard to achieve it for their nation now.

 

But what about soft power, the power of culture and the power of ideals? For the last sixty years, the world has watched American films, listened to American music and dressed in American fashions. America’s struggle with racism and the successes of our nation’s civil rights movement have inspired millions of people on other continents to demonstrate for their own civil and human rights.

 

Leebaert makes a strong case for his thesis that British political leaders played a “double game,” but he ignores many other factors that shaped America’s developing world role.

 

He ignores Dr. King and the civil rights movement and the rapid global spread of American culture. The rise of the right-wing anti-communist movement certainly influenced Eisenhower and Nixon, yet the author dismisses Senator Joseph McCarthy in three paragraphs. 

 

The dictionary definition of improvisation is to “simultaneously compose and perform without any preparation.” Was America’s assumption of the British Empire’s global leadership role really an “improvisation?” Perhaps it was inevitable for a country that experienced a century of huge social and economic progress, capped by the emergence of Wall Street as the capital of international finance.  

 

Leebaert has shone new light on an important passage in modern American history, but had he taken a few steps back for a broader perspective, he could have illuminated so much more.

What a Line Deleted from the Declaration of Independence Teaches Us About Thomas Jefferson

 

In his first draft of the Declaration of Independence, Jefferson listed a “long train of abuses & usurpations” at the hand of King George III. Those, he added, were “begun at a distinguished period, & pursuing invariably the same object.” Such abuses are indicative of “arbitrary power,” and it is the right, even the duty, of those abused to throw off such discretionary abuse of authority and to establish a government, by consent of the people, in accordance with the will of the people.

 

One passage—by far the longest and intentionally placed, for effect, after all other complaints—was a gripe about George III’s role in the North American slave trade, and it was excised by members of Congress because slavery was a divisive issue and the time was not right for debate on it. Yet the passage remains important to scholars of Jefferson because it tells us much about his thinking on slavery at the time he composed the Declaration. It also contains an implicit, heretofore unnoticed argument concerning George III’s hypocrisy apropos of slavery, and that argument has implications for the hypocrisy of the colonists.

 

The abuses Jefferson limns in his draft of the Declaration are many—at least 25, since some complaints he lists are compound claims—and he lists last, and devotes the most ink to, the introduction of slavery into the colonies.

 

he has waged cruel war against human nature itself, violating it’s most sacred rights of life & liberty in the persons of a distant people who never offended him, captivating & carrying them into slavery in another hemisphere, or to incur miserable death in their transportation thither. this piratical warfare, the opprobrium of infidel powers, is the warfare of the Christian king of Great Britain, determined to keep open a market where MEN should be bought & sold, he has prostituted his negative for suppressing every legislative attempt to prohibit or to restrain this execrable commerce: and that this assemblage of horrors might want no fact of distinguished die, he is now exciting those very people to rise in arms among us, and to purchase that liberty of which he has deprived them, & murdering the people upon whom he also obtruded them; thus paying off former crimes committed against the liberties of one people, with crimes which he urges them to commit against the lives of another.

 

First, there is Jefferson’s use of capital letters for the word men. Nowhere else in his draft does he use capitals. That shows philosophically and unequivocally that Jefferson considered Blacks men, not chattel, and it argues decisively against the naïve view, articulated by many in the secondary literature, that the Declaration was not meant to include Blacks. Blacks, qua men, are deserving of the same axial rights as all other men.

 

Second, Jefferson accused the king of Tartuffery, or religious hypocrisy. George III is a “Christian king,” yet he is guilty of “piratical warfare”: taking people who have done nothing to offend him and conveying them like cattle to America. The king, of course, did not introduce slavery to America, and Jefferson is not accusing George of that. That occurred in 1619, when Dutch merchants brought 20 Africans, perhaps indentured servants, to Jamestown, Virginia. Those who settled in America eventually found Africans to be a cheaper and more abundant labor source than other indentured servants, mostly penurious Europeans, and so the practice continued. Yet the king, Jefferson asserts, has “prostituted his negative”—that is, he has availed himself of none of the presumably numerous legislative opportunities for nullifying or even minifying slave trading. George III could have put an end to the transplantation of Africans, but he did not.

 

Third, and most importantly for this essay, there is a layered argument in Jefferson’s draft of the Declaration of Independence that has hitherto gone unnoticed by scholars. By refusing to use his negative to stop the slave trade, George III has implicitly sanctioned the opprobrious institution, which strips men of their God-given rights and makes commercial gains of them without the sanction of their will. While colonists make slaves of the Blacks brought to the colonies, King George, through abuses and usurpations, makes slaves of his colonial subjects. Thus, there are two levels of slaves: colonists, treated as undeserving of the same rights as other British citizens, perhaps because of their transplantation, and transplanted Blacks, who are the property of the colonists—the slaves of the colonial “slaves.”

 

Is that itself significant?

 

It is difficult to say. Jefferson might have had in mind two notions. First, the introduction of slavery is a means of keeping colonists preoccupied with slaves, so that they will not see that George is making slaves of them. Second, getting colonists inured to the institution of slavery—to men of one kind treating men of another kind as inferiors—will make them somewhat less uncomfortable with being treated as slaves, viz., as men without rights. Such notions, however, are mere speculations.

 

Yet George III then encourages African slaves, Blacks who have been stripped of their humanity by being stripped of their rights, to rise up in revolt against their white masters by joining the British in the Revolutionary War. His inducement is freedom from their insufferably oppressive condition—a condition for which he, through his own refusal to act, is in large part responsible. Nonetheless, by the same argument, the colonists, stripped of their humanity by being stripped of their rights, are entitled to rise up against the king, for George III is implicitly endorsing a generic argument that any people stripped of their rights have a right to revolt. Thus, the king himself implicitly sanctions colonial revolution.

 

In giving birth to the layered argument in the passage and in underscoring the king’s Tartuffery, Jefferson must have often reflected on the hypocrisy of colonists, who had taken in the transported Blacks and accepted them as slaves. That the king might be responsible for the transplantation of slaves to the continent does not exculpate the colonists for continuing enslavement. A person who knows certain goods to be stolen and accepts them as a gift is as guilty, and as deserving of inculpation, as the thief.

 

Finally, the sheer length and the placement of the passage in Jefferson’s first draft are revelatory. There are 168 words in the passage; no other grievance comes near it in length. That argues for the strength of Jefferson’s conviction that slavery was opprobrious. Moreover, that Jefferson positions the lengthy grievance last indicates that he considered it his coup de grace.

 

Those things noted, there is something strained in the passage. Carl Becker in his The Declaration of Independence writes: “The passage is clear, precise, carefully balanced. It employs the most tremendous words—‘murder,’ ‘piratical warfare,’ ‘prostituted,’ and ‘miserable death.’ But in spite of every effort, the passage somehow leaves us cold.” It is “calm and quiescent,” lacking in warmth, and fails to move us. Readers get a sense of “labored effort”—that is, of “deliberate striving for an effect that does not come.”

 

Becker is right but fails to recognize the reason: the hypocrisy of the colonists, Jefferson included. Jefferson blames the king for sanctioning slavery by not stopping the exportation of slaves to America, but he nowise addresses the issue of the colonists, freed Blacks among them, putting transmigrated Blacks to work as slaves. The guilt here must be shared.

 

Jefferson’s anti-slavery passage, we know, was excised by Congress, and so it did not appear in the Declaration of Independence. The reason was certainly that slavery, widely practiced in the South, was a divisive issue, and the Declaration of Independence was to be a pronouncement with which all states would be in agreement. Inclusion of the lengthy grievance, Jefferson should have seen, would have caused great dissension among members of Congress and led to needless controversy. The moment was kairotic, and dissension needed to be avoided at all costs.

 

Jefferson expressed contempt for the excision of the passage from the final document. He said in notes on the Continental Congress: “the clause…, reprobating the enslaving the inhabitants of Africa, was struck out in complaisance to South Carolina & Georgia, who had never attempted to restrain the importation of slaves, and who on the contrary still wished to continue it. our Northern brethren also I believe felt a little tender under those censures; for tho’ their people have very few slaves themselves yet they had been pretty considerable carriers of them to others.”

 

His hypocrisy aside, Jefferson is to be lauded for articulating his anti-slavery views in his draft of the document, even if the paragraph was axed. By doing so, he was sticking out his neck, so to speak, by placing himself at odds with most others from the South, his own state especially, on slavery. The passage did reach the hands of others in the Congress and Jefferson’s opposition to slavery became widely known by members. In that regard, the excised passage was not then without effect and ought not now to be without effect. Yet today’s scholars often conveniently overlook the risk Jefferson was taking in crafting that passage.

"There is a historian in each one of us:" Historian Anna Müller on the Experiences that Have Shaped Her Scholarship

 

Anna Müller is Associate Professor of History and The Frank and Mary Padzieski Endowed Professor in Polish/Polish American/Eastern European Studies at the University of Michigan-Dearborn. She holds an M.A. from the University of Gdańsk, Poland, and a Ph.D. from Indiana University. Dr. Müller is the 2019 recipient of The Polish Institute of Arts and Sciences of America’s (PIASA) Oskar Halecki Polish History Award for her book If the Walls Could Speak: Inside a Women’s Prison in Communist Poland (New York: Oxford University Press, 2018). Dr. Müller is President of the Polish American Historical Association (PAHA).

 

What books are you reading now?

 

I have a habit of reading a few books at the same time. A while ago I had the idea of reading some of Ursula Le Guin’s stories to my 9-year-old son in Polish—my native language and the language that I am hoping my son maintains—so we began reading Earthsea. Needless to say, the language in this book turned out to be too difficult for him, but I continued reading it myself. I never read fantasy books, so this was certainly new for me, but something that I enjoyed tremendously. To my surprise, I easily managed to find myself in this world of wizards, who, despite or because of the powers they possess, have to face their own darkness while searching for some kind of version of truth that would help them figure out who they are. One of the things that struck me in the first story in Earthsea was the belief that one’s individual power is in one’s name. As a historian, I see numerous examples throughout history when taking away someone's name becomes a form of dehumanization, but the story of a power hidden in a name opened to me a new richness of this fact—defining who we are, naming ourselves, and protecting our name gives us strength. 

 

At the same time, I was reading two nonfiction books. One was a book about Pola Nireńska by Polish journalist Weronika Kostyrko. Nireńska was a Polish Jew born in pre-war Warsaw, a modern dancer who managed to survive the Holocaust and immigrate to the United States after the war. She married Jan Karski, a Polish courier of the Polish Underground State who traveled to England and the US during the war to inform the highest American and British authorities about the annihilation of the Jews taking place on the Polish territories. We know a lot about Karski and nothing about his wife. Nireńska struggled her entire life to find her place as a Jewish woman, a modern dancer, and bisexual. As if following the advice of the wizards of Earthsea, she engaged in a constant search for her own name and for her own strength. At various moments of her life, she had to face her demons and her own darkness, which eventually pushed her to take her own life. 

 

The other nonfiction book I was reading at the same time was a kaleidoscope of various writings by American prisoners sentenced to life. Life Sentences is a book manifesto and a testament to their daily struggles to find meaning in their lives (beyond the structural placement of a prisoner). This book is also about fighting one’s demons while figuring out one’s name—a result of the journey inside themselves that the authors of the book undertook while in prison, a journey that began and continued because they had enough strength within themselves to know who they are and insist on it. And here is a short excerpt from one of the poems included in the book:

 

“In this cell, 

I’m giving fair ones to all my demons, 

And every nightmare wakes up to worse dreaming, 

But that’s just life in a cell” (In a cell, Malakki)

 

In a sense, we all live in a different version of a cell. 

 

It always amazes me how these various worlds that open themselves up to me in the books I read are able to speak to each other, as if the historical figures that occupy my history texts are looking for some respite in the world that fiction conjures, or vice versa. That’s my imagined world, literary community, conversation of written words, a dialogue between multiple people and authors while I sit and learn from their wisdom. 

 

I should add that it is the men I met inside the prison walls who reintroduced me to poetry. Growing up in Poland I got used to memorizing poetry and expressing myself with poetry. Studying Eastern European history, you live and breathe poetry. Committing a poem to memory is like committing to the people who wrote it: to their fears, joys, and the trust that they express when writing a poem. But poetry is also a code that we communicate in: concise and fragmentary, but somehow also dialogic, opening space for readers to find their own place in it. Poetry is essential in life. I thought I had lost this in the US until recently, when I began hearing the poems that the incarcerated men create in prison. To quote one of them: “Good things come to people that wait.”

 

What is your favorite history book?

I am not sure I have a favorite history book. There are many books that I would call my favorite at one particular moment; specific books grab me at different points because wherever I am personally at that particular moment makes me especially attuned to the voice of the author. And then that moment passes and a different one emerges. I tend to think of authors that I read as coffee dates with good friends. They tell me their stories and I listen and learn. I have ended up having many good friends.

 

Recently, I enjoyed A History of the Grandparents I Never Had by Ivan Jablonka, partly because this book is so close to the research I am working on at the moment. Jablonka tried to reconstruct the story of his grandparents, Polish Jews and Communists, who barely left any sources behind. It is a touching story of people entangled in perhaps the most dramatic moments of history: Nazism, the Holocaust, Communism. The book carries a very personal touch; in the end, Jablonka is trying to uncover the history of his own grandparents. Especially touching is the part where he investigates their life in Paris, where they immigrated before the war and where he himself was born; he lived for a while almost next door to one of their Parisian addresses and learned about them from the shadows left by history. He owed so much to people he knew so little about. But more than being a personal story, it is also a story of the stubbornness of a historian determined to show the world that he can recreate the past and the fabric of his grandparents’ life despite a lack of concrete or direct sources.

 

Another one of my recent favorites is a book by Kate Brown, Dispatches from Dystopia: Histories of Places Not Yet Forgotten. This work is another historical journey, this time into forgotten places, but also into an investigation of how our bodies and minds and sites interact, how time and space matter, how bodies play a function in history—bodies that we tend to ignore mostly due to a lack of language to speak about them—and finally how significantly our research is embodied. 

 

I have the privilege to teach for the Inside-Out Prison Exchange Program at the Macomb Correctional Facility in Michigan. Inside-Out facilitates college classes in prison, where outside students can meet with a group of incarcerated men or women. We sit in a circle and discuss books, ideas, life, individual expectations, and social responsibility. A group of amazing and inspiring individuals made up of Inside-Out alumni, called the Theory Group, help with classes as teaching assistants and also form a group that meets to discuss books separately from faculty-led class periods. One of the books that we recently read was The Body Keeps the Score, about how much everything that happens around us affects our bodies. I was unable to participate in the discussion when they read it, but I was reading it along with them.

 

These two books (The Body Keeps the Score and Dispatches from Dystopia) made me realize how embodied our research, or perhaps our life experiences, are—an obvious statement, I know. But everything in prison reminds me of how much our physicality plays a role in our lives, or how much of our past is written in our bodies. The security checks that we go through to be allowed to enter the prison, for example, used to awaken some of my fears: my documentation being processed under the vigilant eye of an officer was somehow analogous to crossing the borders in Eastern Europe before the fall of Communism or, more recently, to going through security checks at American airports after flights back from Europe (where I am often taken to a back room for additional questioning). But at the same time, this prison space makes me more receptive to understanding the meaning and value of small gestures, such as a handshake. And of course, it brings to the forefront of my thoughts the entire complicated problem of race relations, of which I was really unaware prior to coming to the United States.

 

The concept of humanity gains a very different meaning in prison. So, I could say that as a historian I am trying to experience the present as much as possible in order to understand the past. But also vice versa: what I know about the past gives me insight into the present, while my varied experiences as a Polish immigrant (and a bit of an outsider), mother of two great kids, college teacher, and volunteer in prison keep me grounded in the present and do not allow me to lose sight of how rich and wonderful it is to discover people of the present, so I do not get lost in the past. To follow Kate Brown, “I am in a place to see, hear, smell, and touch, and there to take a stand.”

 

Why did you choose history as your career?

I am not sure I chose it, or at least I don’t recall the moment when I made the decision. I think it was history that chose me. I grew up in the amazing town of Gdańsk, Poland, in the 1980s, a time when history, written with a capital H, was happening around me—on the streets, in school, during conversations with my schoolmates. It was happening literally in my yard. Gdańsk was the birthplace of the Free Trade Union Solidarity, which in the long run contributed to the fall of Communism in Poland and the eastern bloc. Regardless of what Americans might think about life in the Eastern bloc, my childhood was full of unbridled joy, but also full of things I had trouble understanding—fires and water cannons in the streets, people running in different directions and hiding in staircases. 

 

With time came a desire to put it all into some kind of sequence, narrative, or story that would provide an explanation. And with that came a joy of participating in something important and curious. These were my formative moments. History was pulling me in with incredible force. All I wanted was to understand the changing reality, and history seemed to hold the keys to understanding. Hence, it was the time, but also the space. Gdańsk—I had the privilege to be born in a place where history speaks to us at every corner. And to fully recognize myself as Gdańszczanka, to gain that name, I had to learn history. For me, history was an existential necessity. And Gdańsk will forever be my home. Or at least one of them. 

 

What qualities do you need to be a historian?

There is a historian in each one of us. Once we begin asking questions about why the past unfolded in a given way, we become historians. I think a historian should be curious and open, but aren’t these qualities necessary for every discipline? We should be ready to complicate the answers we have while trying to go beyond what appears to be black and white. But again, isn’t that something that can be said about many disciplines as well? Maybe historians need to be more empathetic in order to see the world through the prism of somebody else’s eyes, but also perhaps very patient, so as not to get frustrated as layer after layer of the complex set of circumstances and reasons why individuals act a certain way unravels in front of us.

 

It is easier for me to say what qualities one can develop once one begins thinking of oneself as a historian and working on mastering the skills that historians, in my eyes, should have: imagination and empathy combined with distance, or at least an understanding of how much context matters in understanding the motivations, reasoning, desires, fears, and even emotions of the people we study. In many respects, I think the historian’s trade provides foundational skills in analyzing and understanding the world around us: the significance of changing context, the complex set of factors that define individuals and their positions in society.

 

Who was your favorite history teacher?

I am lucky to have met many people who are not only excellent historians but also wonderful people, who in different moments of my life pushed me to look at the reality around me from different perspectives. During my high school years, history and so-called education about society seemed to be one of the most important subjects, because it was teaching us to understand the changing reality around us and how to engage with it. I remember endless discussions about what democracy is and what its limits are … imagining the US as a role model. 

 

My history teachers in both high school and then colleges in Poland, Switzerland (where I briefly studied), and the US very quickly became role models for civic disagreement, engagement, and a way of life. I arrived in the USA to begin graduate school pregnant with my first child. For many reasons, it was a very scary moment. And one of the first female professors I met welcomed me and my pregnant and anxious self with incredible warmth and support, saying that she wore different hats, so she knew how to support me in this difficult transition that was happening to me on multiple levels. It was hard for me at first to understand what hats had to do with my situation, but eventually I understood the expression as well as the philosophy of life expressed through it. I think that became my pattern – I do wear many different hats in life. Some of my professors may read this conversation (and I don’t want to mention names because I will inevitably miss somebody), so they should know that I will never stop thanking them from the bottom of my heart for teaching me how to keep broadening my world and keep my mind open.

 

What is your most memorable or rewarding teaching experience?

There are plenty. Every class involves meeting new people and personalities but also new problems. I love the beginning of every semester; it is filled with expectation, anticipation, anxieties.  

 

But the experiences that I would describe as most life-changing are teaching for the Inside-Out Prison Exchange Program, which I already mentioned, and the Study Abroad program in Poland, which I have had a chance to lead three times so far. Both give me a chance for amazing interactions with people based on sharing the world we read about and the world we experience together. It is all about learning together, broadening horizons, encountering the unknown, and facing what may be uncomfortable. But it is also about the journey to create a community that can emerge as a result of this act of sharing. 

 

And now I need to mention the Polish writer Olga Tokarczuk, who received the Nobel Prize in Literature while I was drafting my responses to these questions. One of the books by her that affected me the most is Flights. I loved the Polish version and am in the middle of reading the English version, one chapter per day in order to savor it slowly. This book is about travels, journeys in and out, inside oneself alone and with others—reaching moments that feel liminal, as if standing on the edge of the world and looking down and not knowing if that means the end or the beginning. Teaching gives me those moments, especially teaching in prison or while taking students abroad, when our worlds join together to create a new one. And Tokarczuk knows best what words to use to reflect the silence around us when we realize the power of what is happening:

 

“Nothing happens—the march of darkness halts at the door to the house, and all the clamor of fading falls silent, makes a thick skin like on hot milk cooling. The contours of the buildings against the backdrop of the sky stretch out into infinity, slowly lose their sharp angles, corners, edges. The dimming light takes the air with it—there’s nothing left to breathe. Now the dark soaks into my skin. Sounds have curled up inside themselves, withdrawing their snail’s eyes; the orchestra of the world had departed, vanishing into the park. 

 

That evening is the limit of the world…”

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

No, I am not a collector. I love being surrounded by books, but not rare books. It’s like being surrounded by stories and people that make your life more interesting. Maybe I never had a chance to become a collector. I value old things tremendously, and I love living in old houses, where every single corner speaks to you about history, but I never wanted to possess anything old. I worked for some museums throughout my life and the amount of care and attention old things require always surprised me and made me appreciate people who have enough time and knowledge to take care of the old maps, books, and documents that they collect. I don’t think I am capable of that. 

 

What have you found most rewarding and most frustrating about your career? 

Students. Reading. Writing. Teaching. Interacting with students and seeing them grow is definitely the most rewarding part of my work. The same can be said about reading. There are so many beautiful and delightful books coming out: history books, novels, philosophical treatises. There is something beautiful in seeing a story unravel in front of your eyes, in the new intricate connections that authors make between certain facts. The same goes for writing; creating and looking for connections is endlessly satisfying. Having said that, all of this can be equally frustrating. I always struggle with finding enough time to read, write, or prepare classes. It physically hurts when students are simply uninterested in what I have to offer or when I cannot find a key to unlock their hearts and minds to history.

 

What are you doing next?

My next project is a biography of Tonia Lechtman, a Pole, Jew, and Communist. Her family left Poland when she was coming of age, at a time when anti-Semitism began rearing its ugly head in public life in Poland and its neighboring countries. Her departure from Poland in 1935 led to over a decade of wandering, from Palestine through France to Switzerland and Germany. After the war, when she decided to return to Poland, it was East European Communism, with its paranoia and anti-Semitism, that haunted her.

 

It is a fascinating story, but in many respects, she was an ordinary person. So, in a sense, it is a biography of ordinary life. From the very beginning of writing this biography, I struggled with the pressure imposed by the larger historical framework that clearly delineated her life but also with the urge to underscore her own agency. Historians working on biographies or those who try to reconstruct individual narratives from oral interviews or varied sources tend to think of life as composed according to an idea that helps an individual imbue it with meaning. Anthropologist Mary Catherine Bateson (Composing a Life) defines it beautifully when she states that work on individual life is like weaving a quilt, the ultimate shape of which depends on how we arrange various elements together. Even if we know the norms and rules, there is always space for an individual and a distinctive way of stitching the past together—creative makeshifts and improvisations. And Tonia indeed surprises me with her ability to weave her life together. My travel through her life is like learning how to read those quilts. 

 

Finally, there is an amazing array of sources left behind to tell her life—photos, letters, some interviews. I don’t think any historian could walk by this collection. The cache of photos and letters seems to be never-ending. Some of the documents are as distant as her grandfather Tobiasz Bialer’s school certificate or the medical certificate showing that her father, Aron, had chickenpox when he was nine (1886). The touch and smell of these old documents help us almost feel the presence of the family. Tomasz Kietliński, a political philosopher and cultural and social analyst, wrote beautifully when he reflected on archives in general: “Archives are the secrets of existence enchanted in the secrets of images and texts.... Archives are senses and sensuality of wardrobes, drawers—real and virtual, visual and written, thought and written, spoken and concealed.... The archive is a multi-voiced, open, process, infinity.” Tonia’s archives are indeed a treasure trove of images and texts that are leading me to present Tonia’s voice. And that almost intimate interaction with her, an attempt to recover her intentions and fears is intoxicating – that’s the essence of the thrill of being a historian for me. 

 

Thank you for giving me this opportunity to speak about the importance that the written has for me. 

 

Does truth matter less than it did in 1974?

The impeachment vote is broadcast at a D.C. bar amidst holiday decorations

 

“Don’t confuse me with the facts.”

 

At the moment of the impeachment and pending trial of another president, there are still lessons to be learned from Watergate.  One of these is the sad story of Earl Landgrebe.

 

Landgrebe, a third-term Congressman from Indiana, uttered the words above in August 1974.  A conservative Republican from Valparaiso, he remained a stalwart defender of President Richard Nixon, even as evidence mounted of his complicity in the Watergate coverup.  For example, Landgrebe voted against launching the impeachment hearings, one of only four Congressmen to do so.   He remained recalcitrant, even as Nixon’s guilt became hard for most to ignore.  Late in the process, even after the so-called “smoking gun” tape surfaced that proved Nixon’s culpability in his own words, Landgrebe refused to change his mind.  When asked about the tape, he blustered: “Don’t confuse me with the facts. I’ve got a closed mind.  I will not vote for impeachment.  I’m going to stick with my president even if he and I have to be taken out and shot.”

 

Facing almost certain impeachment, Nixon resigned from the Presidency the next day.

 

I was an undergraduate History major at Taylor University in Indiana at the time of the Watergate scandal. Like many of my peers, I followed news of the investigations closely, running back to my dormitory after classes to catch news of the latest break in the story.  Although a conservative Republican at the time, I reacted to Landgrebe’s willful ignorance of the evidence with complete disbelief and disgust.

 

Landgrebe’s constituents reacted with similar disbelief and disgust, and he faced substantial political backlash.  When he ran for reelection in November 1974, his Democratic challenger was an American history professor at Purdue University named Floyd Fithian.  Two years earlier, Landgrebe had soundly defeated Fithian 55% to 45%.  But the impeachment crisis, and Landgrebe’s words and actions, had altered the public mood. This time Fithian trounced Landgrebe 61% to 39%.  Landgrebe retired to private life, while Fithian went on to serve four terms in Congress.

 

I remember waking up the morning after the 1974 election and reading about Landgrebe’s defeat in the Indianapolis paper.  It seemed like a just outcome.  To a student in a major built around the construction of arguments based on evidence, Landgrebe’s obstinate dismissal of the evidence seemed particularly reprehensible. Plus, I took particular delight in the fact that he was defeated by a history professor.

 

In the present crisis, there seem to be a number of Earl Landgrebes on the Republican side in Congress. Some, like Senator Lindsey Graham of South Carolina, boast that they have no interest in looking at the evidence. Perhaps they are afraid of being confused by the facts.  Will any suffer Landgrebe’s fate at the hands of their constituents in the next election? Or does truth matter less than it did in 1974?

An Abolitionist and the Christmas Tree

 

Karl Follen came to America from Germany in 1825 as a political refugee. As a young law professor he had written that people are justified in rising up against despotic regimes. Lafayette, returning for the fiftieth anniversary of the American Revolution, assisted with introductions. 

 

Soon Follen was again a professor: at Harvard, teaching German literature, church history, and ethics. Popular with students, he introduced German gymnastics and opened America’s first public gym. He married Eliza Lee Cabot, from a famous Boston family. In 1831, after the birth of their son, Charlie, she wrote a hymn, “Remember the Slave,” that began:

                        

Mother! Whene’er around your child/ You clasp your arms in love,

           And when, with grateful joy, you raise/ Your eyes to God above,

           Think of the negro mother, when/ Her child is torn away,

           Sold as a little slave, -- O, then,/ For that poor mother pray!

 

Her husband clearly had her support when he joined the Anti-Slavery Society. He told her that it might mean losing his position. It did. In 1835, Harvard ended his professorship. Many donors were invested in textile mills using slave-picked cotton. Poorer whites, fearing that an end to slavery would cut their wages, led anti-abolitionist riots – in Boston, where William Lloyd Garrison was nearly lynched; in Philadelphia and New York; and, in 1837, even in Alton, Illinois, where another abolitionist editor, Elijah Lovejoy, was killed. 

 

At Christmas, 1835, the Follens still lived in a comfortable house in Cambridge, built with help from Eliza’s family. Charles, as he was now called, was preparing for a second career: as a minister, mentored by Eliza’s influential Unitarian pastor, William Ellery Channing. Channing would later summon a protest meeting in Boston after the Lovejoy murder, and had published a book, Slavery – condemning the institution while also saying that many now calling themselves “abolitionists” were guilty of condemning not just the sin but also all those tangled in its cotton threads, and of having no plan to help those freed, as they demanded, “immediately.” Suffering from the endemic disease of the time, tuberculosis, Channing made Charles his personal representative in anti-slavery circles. 

 

At the holidays, the Follens had a houseguest, Harriet Martineau, sister of a rising leader among British Unitarians, and a female pioneer in sociology and journalism.  She had supported Britain’s abolition of slavery in its Caribbean colonies, then in a six-year transition period. Plantation owners were to treat former slaves as “apprentices,” using public compensation to prepare them to enter the labor market. Channing hoped that the U.S. might fund such a transition by selling public lands in the West. 

 

Despite the danger, the Follens asked Harriet to go with them to an anti-slavery meeting. 

There she said only that she felt slavery “inconsistent with the law of God.” That was enough. Many houses closed their doors to her. Her reports on America became one part dry statistics, one part personal anecdote. She lost her chance to rival de Tocqueville in assessing the true nature of “democracy in America.” 

 

Christmas, in 1835 Boston, was not yet widely celebrated. New England’s Puritan heritage considered it “popish.” New Year’s, however, was. Follen, feeling nostalgic for the Christmas customs of his childhood in Germany, arranged a New Year’s party for his son, Charlie, his playmates, and their parents. Eliza greeted guests in the front parlor. Meanwhile, in a back parlor separated by sliding doors, Follen set up, in a tub, a spruce tree that he had cut. He and Martineau then decorated it – with small dolls, gilded eggshells, and paper cornucopias with candied fruit. They then placed candles in holders on branches where they would not catch fire. 

 

Once he had lit all the little candles – with a bucket of water standing by – Follen flung open the doors. Charlie and the guests, young and old, looked in, amazed. As Martineau wrote: 

 

It really looked beautiful; the room seemed in a blaze, and the ornaments were so well hung on that no accident happened, except that one doll’s petticoat caught fire. There was a sponge tied to the end of a stick to put out any supernumerary blaze, and no harm ensued. . . . The children poured in, but in a moment every voice was hushed. Their faces were upturned to the blaze, all eyes wide open, all lips parted, all steps arrested.  

 

Those words, published first in a popular women’s magazine and then in a pamphlet, gave the first vivid description in English of the German custom of the Christmas tree. The practice soon caught on, both in America and in England, where Queen Victoria’s 1839 engagement to a German prince made fashionable all things German.

 

Meanwhile, around Boston, younger Unitarian ministers, almost all influenced by Channing, some by Follen, in 1836 formed a group later called the Transcendentalist Circle. Though its best-remembered members – Emerson, Thoreau, and Alcott out in Concord – are known chiefly as writers and individualists, many other Transcendentalists remained Boston church-based social activists. Influenced by German idealists such as Kant, they challenged materialism as the basis of moral philosophy. They saw that my full spiritual and moral development depends upon promoting yours.  They then sparked the American struggle for racial, gender, and social justice. Through spiritual friendship, they promoted Margaret Fuller’s feminism, Horace Mann’s efforts for free public education, and the crusade by Dorothea Dix for decent treatment of the mentally ill. Like Follen, all were disciples of Channing.

 

Sadly, Follen’s Christmas tree has been largely forgotten, except perhaps at the Follen Community Church in Lexington, Massachusetts. Five years after lighting that 1835 Christmas tree, he was hurrying back from lecturing in New York to dedicate the building he had designed for the Lexington Unitarians who had called him to be their pastor. While he had carefully kept his tree from catching fire, he himself perished at sea, amid burning bales of cotton, as the steamer Lexington caught fire on Long Island Sound.  

 

That, however, is another story – one that I have tried to tell as touching on issues that should still concern us all this Christmas. 

 


 

Brexit and the End of the United Kingdom

Queen Elizabeth reads a speech outlining Boris Johnson's Brexit agenda

 

Queen Elizabeth I’s court astrologer, a wizened and bearded character named John Dee who looked like a stock wizard and claimed to be able to communicate with “Enochian” angels, invented a new political concept in 1576 after consulting various ancient prophecies and mythological sources. That year Dee wrote a treatise entitled General and Rare Memorials Pertaining to the Perfect Art of Navigation, part of a planned trilogy on the subject of Elizabethan colonization of the Americas. In another pamphlet a year later, Dee wrote that his “great hope was conceived… that her Majesty might, then, have become the Chief Commander, and in manner Imperial Governor of all Christian kings, princes, and states.” That wizard would conjure a novel phrase, displayed in a third pamphlet, written in 1577 and titled The Limits of the British Empire. Embryonic and inchoate as the idea of “Britain” may have been when Dee conceived it, it could be argued that 1577 is a credible birth-date for the project. And December 12, 2019 would be an appropriate date to mark when the idea of Britain finally died.  

Since the 2016 Brexit referendum, Britain has proved no more immune than the rest of the Western world to the forces of authoritarianism and nascent fascism encouraged by a pernicious xenophobia. With Boris Johnson’s stunning and overwhelming electoral victory on Thursday, and the decisive wins for the Tories across England (though not elsewhere), certain conclusions must be entertained. When Dee conceived of the idea of “Britain,” there was no United Kingdom. Because of Johnson’s ascendancy, it must be considered not only that the end of the United Kingdom is inevitable, but that it is necessary. 

Brexit, the career of slimy ghoul Nigel Farage, and now Johnson’s confirmation of continued residence at 10 Downing Street verify that the same storm of revanchism that is transforming America, Russia, Turkey, Hungary, Brazil, and others doesn’t stop at the English Channel. Johnson’s victory, and the failure in messaging from the Labour Party that resulted in their worst showing since 1935, should be disturbing to anyone who fears the rise of the new right around the globe. However, in analyzing what happened this week, there is also some cause for cautious optimism for those of us who support the decolonization movements of peoples struggling for their own autonomy. 

An electoral map of Great Britain offers some observations that should be conclusive beyond the simple fact of the collapse of Jeremy Corbyn’s Labour Party in industrial and working-class districts formerly amenable to its left-wing message. Tory blue dominates in wide swaths across England, with Labour red in tonier and more cosmopolitan urban areas (not dissimilar to an electoral map in the United States). That encroachment is limited to England, however, for the blue stops near the location of Hadrian’s former wall on the Scottish border. In Scotland (excepting only three districts), both Labour and the Conservatives were routed by the increasingly pro-independence Scottish National Party. Across the Irish Sea, results were similar, as pro-Republican parties took the majority for the first time in Northern Ireland. Sensing the political possibilities, First Minister of Scotland Nicola Sturgeon vowed in a speech the day following the election to hold an independence referendum, intoning that “This isn’t about asking Boris Johnson or any other Westminster politician for permission. This is instead an assertion of the democratic right of the people of Scotland to determine our own future.” Brexit, it must be said, will be an unmitigated social, cultural, and economic disaster from which people in England (the younger generation of whom was steadfastly opposed to it) will greatly suffer. Ironically, however, that referendum and Johnson’s subsequent victory could ensure a silver lining – an independent Scotland and a unified Ireland. 

By way of a forensic pathology report, it may be helpful to analyze what the idea of “Britain” has meant, and what risks and benefits attend its coming demise. In a manner confusing to outsiders, the word has never been a synonym for “England,” even while politicians of Johnson and Farage’s ilk have fundamentally treated it that way. The word was often used as a signifier to justify a specifically English colonialism (a brutal process that began with the countries closest to England). Irish political theorist Benedict Anderson, in his classic study Imagined Communities: Reflections on the Origin and Spread of Nationalism, supplied a helpful language to explain how nations – disparate groups of people across wide geographic areas who don’t know each other and often have little in common – were able to unify around a common identity. He argues that central to the notion of the “nation-state,” a political innovation that didn’t emerge until after the sectarian violence of the seventeenth century, was what he called the “imagined community,” that is, an invented and constructed sense of identity that was able to fuse together loosely connected but diverse polities. Organizing fictions were helpful in this endeavor, chiefly language, religion, and ethnicity. In the centuries-long process by which loose confederations of people were solidified into countries like “France,” “Germany,” and “Italy,” a certain reductionism and flattening has conflated an imagined “ethnicity” with the political reality of the nation. Anderson notes, however, that Great Britain had the “rare distinction of refusing nationality in its naming” (it’s an oversight of the scholar that he doesn’t recognize this is even truer of the American nationality).  

Readers will note that there is no “British” language as there is a “French” or “Italian” one (even if those latter examples were accomplished by the elimination and fusing of numerous disparate dialects into an imagined unity). For that matter, there is no “British” ethnicity, excluding the ancient Britons who spoke a Celtic language and were pushed to the Welsh and Cornish peripheries by subsequent waves of Germanic-speaking Angles, Saxons, Frisians, and Jutes during the fifth through seventh centuries. “Britain” as a modern concept can trace itself to men like Dee and his compatriots. If later attempts at building a nation-state, or an imperial extension of the nation-state, relied on ethnic or linguistic categories, then Dee’s initial conception of the British Empire drew from different well-springs. Historian Frances Yates wrote in her seminal The Occult Philosophy of the Elizabethan Age that Dee drew from the “Arthurian, mythical, and mystical side of the Elizabethan idea of the ‘British Empire.’” Inspired by the Tudors’ and his own Welsh background (at the same historical juncture when Elizabeth was oppressing Celtic-speaking peoples), Dee ironically conflated “Englishness” with the mythic “Britishness” that designated the original inhabitants of the island, a “basic political fact… draped in the mystique of ‘ancient British monarchy,’ with its Arthurian associations,” as Yates writes. Dee thus envisioned a British Empire that had yet to exist, defined not by “ethnicity” (itself a nebulous concept at the time), but rather, as Yates explains, as a “purified and reformed religion to be expressed and propagated through a reformed empire… by their Tudors in their capacity as an ancient British line, of supposed Arthurian descent.” 

At the time that Dee was writing (and conjuring), England was a northwestern backwater, a relatively poor realm on the edge of Europe sharing its island with Scotland. England’s colonial belatedness was a source of embarrassment to men like Dee; while the Spanish had already built a mighty New World civilization that filled the coffers of the Hapsburg Court with Aztec and Incan gold, English colonies in America were still decades away. With the Union of the Crowns in 1603, England and Scotland gained the same monarch, as the Scottish King James VI also became James I of England, the sovereign designating himself “King of Great Britain” even as Parliament rejected the term as an official designation of the two unified kingdoms. A century later, Edinburgh’s parliament was subsumed into London’s with the Acts of Union of 1707. Thus “British” moved from the realm of quasi-mythic history and the imagination of writers like Dee into political reality, as the word came to designate subjects in the modern world. Anderson writes that Britain’s status as a non-ethnic national identity “suggests that it is… the precursor of a twenty-first century internationalist order.” The political theorist would be the first to emphasize that that’s not necessarily a good thing, as British imperialism meant first the exploitation and oppression of its closest neighbors, particularly Ireland, and then the exportation of those cruelties throughout the Americas, Africa, Asia, and Oceania. 

There have been permutations and changes in what delineated British identity, effected by the collapse of the first British Empire with the American Revolution and then by the twentieth-century decolonization efforts that heralded the end of the second British Empire. Throughout its history as a term, “British” might not have necessarily meant “English,” but it rarely meant something that wasn’t supremacist. If we conceive of the British Empire, and Britain, as implying certain supranational values, they have always been conflicted ones. Dee may have envisioned a transcendent millennial empire, but by the time those dreams were secularized and enacted, the British Empire promised a certain vision whereby this small European island had the duty and right to govern the majority of the world’s peoples, and to extract whatever fortune it saw itself as entitled to, whatever the cruel costs may have been. Such was the perspective exemplified in that apologist for empire Prime Minister Winston Churchill’s A History of the English-Speaking Peoples, where he describes that mythic Briton Arthur and “his noble knights, guarding the Sacred Flame of Christianity and the theme of a world order, sustained by valour, physical strength, and good horses and armour… [who] slaughtered innumerable hosts of foul barbarians and set decent folk an example for all time.” 

Nations are of course invented things, though some are more invented than others. It’s worth asking what the use may be of a British nationality in this century, especially as it has become abundantly clear with Brexit and the Tory victory that the constituent non-English nations of the United Kingdom don’t particularly care for Westminster. British identity, in theoretically allowing for a cosmopolitan membership of people from a variety of backgrounds, could always have had a more redemptive potential than the foul chauvinism evidenced by the Churchill passage. Several decades ago, with the collapse of the literal concept of the British Empire, and the welcome arrival into Great Britain of the post-colonial immigrant populations known as the “Windrush Generation,” there began a half-century of trying to redefine Britishness as something inclusive, expansive, tolerant, and multicultural (despite the simmering presence of the British National Party and politicians like Enoch Powell). Now such a project seems in doubt, and it’s more than fair for Scotland and Northern Ireland (and Wales) to question whether there is a place for them in “British” identity. English voters of a certain age have shown themselves abundantly fine with circumscribing notions of citizenship into a smaller and smaller domain, as they’ve turned to the intoxicating lies of the ascendant global right. Perhaps freed British subjects on the edges of the Atlantic Archipelago would do better with independence, and new definitions.  With better prescience than Dee, Shakespeare would write a few decades later in his play Richard II something that read in 2019 as prophecy – “England, that was wont to conquer others, / Hath made a shameful conquest of itself.” 

Divisions Mark 30th Anniversary of the Opening of the Brandenburg Gate—Symbol of German Reunification

 

Thirty years ago, East German engineers worked through a winter night to open a crossing at Berlin’s Brandenburg Gate. Since 1961, the city’s most prominent monument had stood behind the wall in East Berlin, isolated in the middle of no-man’s land, accessible to no one. Germany, at the time, was divided into democratic (West) and communist (East) states, reflecting the geopolitical realities that followed the 1945 defeat of fascism and the murderous Nazi regime.   

 

On December 22, 1989, however, the gate once again became a crossing point in the center of the city. Helmut Kohl stepped through the portal to become the first West German leader to visit East Berlin. East German Prime Minister Hans Modrow, along with ecstatically cheering crowds, greeted Kohl, and the event was broadcast live to tens of millions across both Germanies. It was a time of hope and reunification; an exorcism of the last remnant of an intolerant past that had led to the destruction of one of the most socially advanced nations in Europe.  

 

Unfortunately, with growing anti-immigrant as well as anti-Semitic sentiment in both Germany and the U.S., the celebration of this anniversary is shadowed by the specter of that past.  The Brandenburg Gate already has a checkered history, and we seem poised at another critical juncture for the landmark, one where a gate’s dual capacity to divide or connect, depending on who operates it, seems as meaningful as ever.

 

From its very beginnings, the Brandenburg Gate stood at the center of momentous events. When Napoleon’s armies captured Berlin in 1806, they removed from the top of the gate the Quadriga, the statue of a goddess driving a chariot, and carried it off to Paris. Following Napoleon’s defeat in 1814, the statue was returned to Berlin, where it was newly adorned with a Prussian eagle and an Iron Cross in honor of the military victory.

 

In the centuries that followed, the Brandenburg Gate served as a backdrop to countless historic moments, including the 1987 speech in which Ronald Reagan called on Mikhail Gorbachev to “tear down this wall.” Just two years later, on November 9, 1989, throngs of East Germans approached the Brandenburg Gate and did what had been unthinkable only hours before: they climbed atop the Berlin wall while their neighbors to the west looked on in awe. 

 

Today, the Brandenburg Gate remains Berlin’s most recognizable symbol and provides a stage for all manner of German national celebrations, be it a World Cup victory or, most recently, the 30th anniversary of the fall of the Berlin wall. At the same time, Berlin enjoys an international reputation as a diverse, cosmopolitan, multilingual world capital, a hub of artistic and technical innovation. 

 

The Brandenburg Gate’s position at the very center of Berlin makes it easy to forget that it was once a gateway out of Berlin, an 18th-century exit through the city wall to the town of Brandenburg some 40 miles to the west. 

 

Brandenburg is also the name of the German state surrounding the capital of Berlin, one of the five new states carved out of the former East Germany after unification in 1990. By every measure, the two entities – the city of Berlin and the state of Brandenburg – could not be more different. 

 

While Berlin is young, dynamic, and diverse, Brandenburg and its neighboring states have struggled with economic stagnation and a constant exodus of young people to urban centers. Not surprisingly, the far-right, anti-immigration party Alternative for Germany (AfD) has capitalized on popular resentments to garner significant support in the region. In September 2019, the AfD received 24 percent of the vote in Brandenburg state parliamentary elections, a mere two points behind the left-center Social Democrats and ahead of Angela Merkel’s party, the Christian Democratic Union. 

 

Moreover, the German daily newspaper Der Tagesspiegel reported government data showing a steady increase over the last few years in crimes linked to hatred of Jews. On October 9, 2019, a far-right extremist in Halle, a city 100 miles southwest of Berlin, attacked a synagogue on Yom Kippur, leaving two dead.

 

Through its historic association with the surrounding region of Brandenburg, Berlin’s Brandenburg Gate serves as a reminder that in 2019, the German population remains politically and economically divided, even as Germany celebrates three decades as a united entity. 

 

This split is reminiscent of deep-seated divides in American society – between urban and rural populations, the coasts and the heartland, educated “elites” and blue-collar workers, millennials and boomers. These intractable divides contributed to the rise of President Donald Trump, and now threaten to undermine once unshakeable democratic values. 

 

The opening of the Brandenburg Gate that cold night in 1989 is a key moment in the emergence of the new Germany – a leader within Europe and a standard-bearer for political liberalism on the world stage. Three decades later, the opening reminds us that the democratic values so celebrated at the end of the Cold War should never be taken for granted.

Could Eisenhower have been impeached in 1960?

 

As the U.S. House of Representatives moves inexorably toward adopting articles of impeachment against President Donald Trump, lawmakers and media mavens make frequent mention of the two impeachment trials and the one near-impeachment in American history: Andrew Johnson and William Jefferson Clinton, and of course Richard M. Nixon.  Not once has the name Dwight D. Eisenhower been said.  And yet…

 

On May 1, 1960, a Soviet missile shot down the U-2 spy plane flown by Francis Gary Powers.  Against all the odds, Powers managed to squeeze himself out of the plummeting cockpit and parachute to capture, confession, and incarceration.  Soviet Premier Nikita Khrushchev, with a mixture of glee and outrage, released the news piecemeal over the following week: Soviet air defenses have shot down a U.S. spy plane; we have parts of the plane; we have the pilot and he’s still alive; the pilot has confessed.  

 

As Khrushchev spoon-fed the details during the early days of May, the U.S. State Department and intelligence agencies were forced to bend to the gathering storm, revising the official story from day to day.  Initially, the flight was described as a NASA-instigated study of high-altitude weather conditions that had strayed into Soviet airspace.  By May 9th the Eisenhower Administration was admitting to “extensive aerial surveillance by unarmed  civilian aircraft… normally of a peripheral character but on occasion by penetration.”

 

President Eisenhower’s one imperative is encapsulated in another portion of the same statement:  “Specific missions… have not been subject to presidential authorization.”

 

According to journalist James Bamford’s authoritative 2001 history of the National Security Agency, Body of Secrets, this was a bald-faced lie.  To the contrary, over-flights of the Soviet Union, beginning with bombers during Ike’s first term and shifting to U-2s in 1956, were always initiated with the direct knowledge and personal approval of President Eisenhower. 

 

As the administration’s prevarications were sequentially exposed, first by the Soviet premier, and then by the pilot himself, the Senate Foreign Relations Committee leaned closer and closer to launching an investigation.  Ike had to grin and bear it, pretending to embrace a Congressional probe.  Behind closed doors he seethed and shuddered at the thought.  

 

Bamford writes, “According to top secret documents obtained for Body of Secrets, Eisenhower was so fearful of the probe that he went so far as to order his Cabinet officers to hide his involvement in the scandal even while under oath.”

 

And did they obey his order?

 

Most witnesses gave the Senate committee “just gobbledy gook,” in the words of Under Secretary of State C. Douglas Dillon.  However, when his boss, Secretary of State Christian Herter, was asked by Chairman J. William Fulbright if there was “ever a time” when the president approved a U-2 flight, Herter answered, “It has never come up to the president.”

 

According to the U.S. Justice Department’s Criminal Resource Manual, “To establish a case of subornation of perjury, a prosecutor must demonstrate that perjury was committed; that the defendant procured the perjury corruptly, knowing, believing or having reason to believe it to be false testimony; and that the defendant knew, believed or had reason to believe that the perjurer had knowledge of the falsity of his or her testimony.”

 

If Bamford’s book got it right, all the elements of the crime, as described by the DOJ, are met.

 

Why did Ike do it?  Two possibilities present themselves.  First, like all lame-duck presidents, he planned on preserving his legacy as best he could in the face of a foreign policy debacle, in his case made more acute by the failed Paris summit meeting with his Russian counterpart that followed hot on the scandal’s heels.

 

Second, his anointed successor, Richard M. Nixon, was in the political fight of his life against Senator John F. Kennedy.  

 

The parallels to the pending impeachment of Donald Trump are compelling, leading me to wonder whether Eisenhower was as deserving of impeachment as Trump.

 

Eisenhower’s approval of the U-2 incursions was motivated by genuine national security interests, although his subornation of perjury was primarily politically motivated.  One could contend that national security and political interests more or less coincided. In Trump’s case, the two sharply diverged from the get-go.

 

Still, at bottom, this seems to me a comparison of the pot and the kettle, with the pot making a clean getaway while the kettle faces impeachment.

 

Jimmy Carter: Watergate’s Final Victim

Jimmy Carter debates Gerald Ford in a 1976 presidential debate

 

“Jimmy Who?” people asked in 1976 as the virtually unknown James Earl “Jimmy” Carter came out of nowhere to capture the Democratic nomination for president, eventually winning the presidential election. Carter was able to accomplish this unlikely victory for one key reason: he was the anti-Nixon in the midst of the Watergate era. His “I’ll never lie to you” pledge resonated with voters disgusted with the corruption of the Nixon administration, allowing Carter to become president more for what he was not than for what he was.

 

Today, at 95, Carter is the longest-lived president in history, as well as the nation’s longest-retired president. As he faces several health-related challenges, perhaps it is time to reflect on the rise, fall, and rise of James Earl “Jimmy” Carter.

 

Ironically, for a man who won the presidency because of Watergate, in the end it was the legacy of Watergate that, more than anything else, brought him down.

 

Jimmy Carter, 39th president of the United States, and the only president to take the oath of office using a nickname, was an unlikely president  who served in  difficult times. He had precious little political experience when he was elected president, having served two terms in the Georgia state legislature, and one term as Governor. Being an “outsider,” not part of the Washington D.C. political establishment, was a great asset in the everything-inside-the-beltway-is-corrupt estimation of the public. But what helped him get elected came back to haunt Carter as his inexperience with beltway politics was, in part, his undoing.

 

Carter was the victor in the first post-Watergate presidential election. In the ’76 election he barely beat the incumbent Gerald Ford (who had been appointed vice president by the disgraced Nixon), and had it not been for Ford’s pardon of Nixon, the result of the race would likely have gone the other way.

 

As president, Carter attempted to de-pomp the Imperial presidency that had blossomed under Richard Nixon. Downsizing the presidency seemed a good idea at the time, but world events conspired to demand a stronger, more in-charge president. Post-Watergate, the public was in a president-bashing mood, and Congress began to flex its muscles, leaving the presidency weaker and more vulnerable than at any time in the previous two generations. Governing in the best of times is difficult enough, but governing in an Age of Cynicism and declining trust was all but impossible.

 

As president, Carter faced a series of daunting challenges, beginning with the rise of OPEC. The oil producing states joined together and conspired to raise the price of oil, and the Western governments, so dependent on oil, were squeezed and blackmailed into paying exorbitant prices. All the economies of the West suffered as stagflation (economic stagnation and inflation occurring at the same time) hit hard, causing a recession. In response to this, Carter proposed a far-reaching energy bill that was gutted in Congress. In the post-Watergate period, even doing the smart and necessary thing seemed impossible.

 

Carter did have his domestic successes, including Civil Service Reform, the creation of two Cabinet-level departments (Energy and Education), and the shaping of the modern vice presidency. But it was in the field of foreign policy that Carter had his greatest success—and greatest disappointment.

 

As President, Carter placed “human rights” at the center of U.S. policy. He shifted the terms of debate from the uses of power to the advancement of human rights globally, putting our adversaries (and some of our allies) on the defensive. By focusing a light on the human rights abuses in the Soviet Union, China, and elsewhere, Carter played to what was then a great strength of the United States.

 

Carter also brokered the Camp David Accords between Israel and Egypt. It was a groundbreaking achievement that made Israel safer and advanced the cause of peace.  Carter also normalized relations with China, got Congress to pass the Panama Canal treaties, and signed the SALT II accords, though the Senate never ratified them. But it was also in foreign policy where the weakness and limits of presidential power would come back to haunt Carter.

 

In November of 1979, Iranian protesters seized the U.S. embassy and held Americans hostage. Negotiations to secure the release of the hostages failed, as did a 1980 rescue mission.  The U.S. and its president appeared weak, unable to exert the muscle Americans were accustomed to seeing. This would be the final nail in the political coffin of Jimmy Carter. While all the hostages were eventually released, the feeling of helplessness and weakness spawned by this incident ended up costing Carter his reelection.

 

Carter’s vice president Walter Mondale accurately summed up the Carter presidency: “We told the truth, obeyed the law, and we kept the peace.”

 

If Carter was not a highly rated president (his historical ranking fluctuates, placing him somewhere in the low 20s), his post-presidency tells a far different story. Carter is considered by many to be the most successful ex-president in U.S. history. After leaving office he continued to promote human rights, monitor elections across the globe, promote democracy, and engage in conflict resolution. He has written over 30 books and volunteered with Habitat for Humanity. In 2002 he was awarded the Nobel Peace Prize.

 

In attempting to come to grips with the presidency of Jimmy Carter, one might well ask: could anyone have succeeded in the immediate aftermath of Watergate?  We know that leadership is highly contextual, and Jimmy Carter became president at a time ripe for bashing presidents. Given that, perhaps Carter did about as well as might be expected. Enchained by thousands of Lilliputians, Carter did what he could to advance the interests of the United States while demanding that we live up to our highest ideals.

How Tolstoy's War and Peace Can Help Us Understand History's Complexity

David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington; his most recent book is Through a Glass Brightly: Using Science to See Our Species as We Really Are (Oxford University Press, 2018).

 

 

We have extraordinary access to ideas and data, and yet it has become a truism that getting to the bottom of things has become more difficult. Truth itself seems increasingly unclear and under assault. Call it Nietzschean perspectivism if you wish, but I submit it is because things are rarely linear. Even without the astounding barrage of half-truths, outright lies, “alternative facts” and demented tweets emanating from the Orange Presence in the White House, the Internet alone has muddied the waters of our comprehension. Our times are best understood not via relativism or subjectivity of “truth claims” but by recognizing that although events and other facts are indeed real, they mostly result from a complex, interlocking web of causation with no one factor necessarily determinative and no single interpretation necessarily correct. In the brief essay to follow, I summon in support of this proposition no less a presence than Leo Tolstoy. (If you resist such arguments from authority, feel free to skip the quotation that is about to come!)

 

War and Peace can, not unlike our increasingly muddled grasp of reality, be perceptually overwhelming in its recounting of detailed human stories. Yet one of the author’s notable asides in this masterpiece begins with a seemingly simple and mundane natural occurrence – “A bee settling on a flower has stung a child” – then proceeds to view it from various angles, eventually arriving at a grand generalization:

 

And the child is afraid of bees and declares that bees exist to sting people. A poet admires the bee sucking from the chalice of a flower and says it exists to suck the fragrance of flowers. A beekeeper, seeing the bee collect pollen from flowers and carry it to the hive, says that it exists to gather honey. Another beekeeper who has studied the life of the hive more closely says that the bee gathers pollen dust to feed the young bees and rear a queen, and that it exists to perpetuate its race. A botanist notices that the bee flying with the pollen of a male flower to a pistil fertilizes the latter, and sees in this the purpose of the bee's existence. Another, observing the migration of plants, notices that the bee helps in this work, and may say that in this lies the purpose of the bee. But the ultimate purpose of the bee is not exhausted by the first, the second, or any of the processes the human mind can discern. The higher the human intellect rises in the discovery of these purposes, the more obvious it becomes that the ultimate purpose is beyond our comprehension. All that is accessible to man is the relation of the life of the bee to other manifestations of life. And so it is with the purpose of historic characters and nations.

 

Despite the human yearning for simple, cause-and-effect explanations, people increasingly understand that nothing can be grasped in isolation, even as complexity makes things less graspable in their entirety. This is particularly true when it comes to the elaborate kaleidoscope that is human behavior, where no single factor explains everything – or indeed, anything.

 

Tolstoy himself was especially concerned with supporting his idea that history in general, war in particular, and the Napoleonic Wars most especially were due to processes that the human mind could not discern, and that contrary to Great Man notions, “historic characters and nations” are influenced by an array of factors such that every individual is no less influential than are deluded, self-proclaimed leaders.  And so, Napoleon comes across in War and Peace as more than a bit ridiculous with his insistence that he and he alone drives events. (This, in service of Tolstoy’s urging that if citizens refused to participate, there would be no wars; i.e., an early version of the bumper sticker, “What if they had a war and no one came?”)

 

Taking a hedgehoggy view of that brilliant old fox, today’s world is no less multi-causal and therefore confusing than Tolstoy proclaimed with his parable of the honeybee. Afflicted with an unending stream of presidential lies and gaslighting, constant yet shifting claims of “fake news” and “alternative facts,” a dizzying array of Internet information overload, and a blizzard of wild conspiracy theories, it is tempting to give up on any coherent interpretation … of anything.

 

Nor is this problem new. Social scientists have long disagreed about what is predominant when it comes to causation: language, socialization, learning, cultural tradition, historical circumstance, evolutionary inheritance, and so forth. Was World War I, for example, due to interlocking alliances, frustrated yearning for colonial empire on the part of Germany, Austria-Hungary’s anxiety about losing its empire, incompetent national leaders, a Europe bored with decades of more-or-less peace, the rigidity of war plans combined with strict mobilization timetables, the assassination of a certain Archduke, machinations by the “merchants of death,” a combination of these, or something else?

 

Was the 2003 invasion of Iraq due to George W. Bush’s yearning to outdo his father, W having been manipulated by Messrs Cheney, Rumsfeld, Wolfowitz et al, a genuine hope of bringing democracy to the Arab Middle East, greed for Iraqi oil, a real if misguided belief that Western “liberators” would be welcomed with chocolate and flowers, illusions about weapons of mass destruction, a combination of these, or something else? 

 

Is the climate crisis due to corporate greed, consumer indifference, technological over-reach, the cowardice and short-sightedness of politicians, human overpopulation in the face of limited energy resources of which fossil fuels are the most available and at the cheapest immediate cost, a collision between atmospheric physics and growth-oriented economics, the inexorable push of energy-gobbling civilizations, a combination of these, or something else?

 

Is the danger of nuclear annihilation due to the military-industrial complex, a human penchant for war, distrust of “the other,” excessive reliance on deterrence as a savior, a kind of psychic numbing due to the unimaginable consequences of thermonuclear holocaust, perceived helplessness on the part of ordinary citizens, widespread feelings of fatalism, a sense that if something really bad hasn’t happened yet it never will, a mesmerized delight in extreme power and potential violence whatever the consequences, a combination of these, or something else?

 

Although ultimate causes and to some extent even reality itself are often beyond our comprehension – possibly even beyond our ability to repair – it is nonetheless our duty to behave as though they aren’t. And it is certainly our duty to acknowledge such critical realities as war, climate change, and nuclear weapons. Just as “All that is accessible to man is the relation of the life of the bee to other manifestations of life,” we can conclude with the existentialists that just as Albert Camus’s Sisyphus was heroic because he persevered in pushing his rock, we are equally obliged – and privileged – to push ours, even though what is accessible to us turns out to be not one rock but many, going in different directions and with uncertain outcomes. Or as Rabbi Tarfon (70 CE – 135 CE) proclaimed, “It is not your responsibility to finish the work of perfecting the world, but neither are you free to desist from it.” Tolstoy would agree.

 

Pass the Protecting the Right to Organize Act

 

Discontent with Donald Trump in 2018 led voters to give Democrats a majority in the House of Representatives in the midterm elections. Although media attention has focused on the impeachment inquiry, the House has passed several pieces of progressive legislation this year.  One bill, the Violence Against Women Reauthorization Act, was supported by 33 Republicans. 

None of the measures has been taken up by the Republican-controlled Senate. Although the bills may not become law this year, they indicate that the Democrats are likely to enact positive reforms in several areas if the 2020 election produces a Democratic Congress and a Democratic President. 

 

Thus far this year the House has voted to improve voting rights, raise the minimum wage, provide protection against discrimination for LGBTQ individuals, advance gun control, provide a path to citizenship for Dreamers, promote climate action and paycheck fairness, lower prescription drug costs, and restore the ability of the Consumer Financial Protection Bureau to protect consumers. On foreign policy, the House passed a measure directing the removal of United States Armed Forces from hostilities in the Republic of Yemen that have not been authorized by Congress. 

 

For the Democrats to win the 2020 election by a wide margin, the single most important measure for the House of Representatives to adopt is the Protecting the Right to Organize Act of 2019. The bill has 218 co-sponsors including two Republicans. The bill would strengthen workers’ right to organize into unions, bargain collectively, and conduct strikes.

 

Since the enactment of the anti-union Taft-Hartley Act in 1947, unions have been attempting to improve labor laws to give workers a chance to speak up with some power to their employers. The labor movement and its allies sought to improve labor law with the Labor Law Reform Bill in 1978, the Workplace Fairness Bill in the early 1990s, and the Employee Free Choice Bill in 2009.

 

The failure to adopt any of these pro-union bills contributed to the decline of unions and the growth of inequality, the twin hallmarks of the past four decades in the United States. All working people have been hurt by the setbacks in union power. 

 

Workers are today conducting strikes and organizing in new grass-roots movements in many industries. Making passage of the Protecting the Right to Organize Act of 2019 the number one priority would both galvanize workers and expose President Trump as the ally of the rich and the enemy of working people. 

 

The House of Representatives has passed much good legislation this year, but it could do more to highlight working-class issues. Most importantly, Congress should immediately enact the Protecting the Right to Organize Act. If unions and their allies win passage of this legislation, they may begin to shift the country away from the glaring inequality that is at the core of the country’s discontent. 

Amidst the Rubble: Hope Harrison on the 30th Anniversary of the Fall of the Berlin Wall

Hope Harrison is an Associate Professor of History and International Affairs at The George Washington University. She specializes in the Berlin Wall, East and West Germany, and the international history of the Cold War, to name a few of her areas of expertise. Her latest book, After the Berlin Wall: Memory and the Making of the New Germany, 1989 to the Present, was published in November and delves into how the Berlin Wall has been remembered from its fall up to the present day, 30 years later.

 

What made you want to study the Berlin Wall and the way it is remembered?

The Berlin Wall was such a fundamental part of life in Germany and the Cold War for the 28 years that the wall stood. It's also iconic. I mean people around the world know about the Berlin Wall and have read about it and seen pictures. It represented the Cold War. It represented the repression of communism, and then when it suddenly fell, it suddenly became a symbol of freedom and of the end of the Cold War. Since it was such an important part of German history, and since Germany is really an expert in dealing with difficult parts of its past, particularly the Holocaust, I wanted to look at how Germany has dealt with this other difficult part of its past, the 28 years that the wall stood.

 

What is your personal connection to the Berlin Wall at the time of its fall? What was it like to be present at such a historic moment in time?

I was on a plane headed to Berlin when it fell and so I arrived a few hours later. It was incredible. It was so exciting. It was amazing to be there and watch them, actually with bulldozers, remove pieces of the wall so that they could create new crossing points in the wall for East Germans to enter into West Berlin and so I saw West Berliners and West Germans standing there on the western side of the wall welcoming these East Germans and everyone was crying and cheering and hugging each other. It was just profoundly moving.

 

Did you use your own personal experience at the site of the Berlin Wall to frame your research for your most recent book?

Not exactly directly, but there is no question that being there and watching the aftermath of the fall of the wall in some sense, in some deep part of me, that probably started the process that would ultimately lead me to write this book. My first book was about the building of the wall. My first book, which was a prize-winning book, was called Driving the Soviets Up the Wall and that told the story of why the wall was built in 1961, so this is my second book now that I just published on the Berlin Wall. Of course, this one looks at the 30 years since its fall.

 

How did you go about collecting information for the book, After the Berlin Wall?

Many, many, many trips to Germany and to Berlin especially over many years. I interviewed over 100 Germans who had something to do with the way in which the Berlin Wall is remembered. These are not just regular private individuals with their own private memories. What I was interested in was the history that is visible in public places that millions of people come to, such as the National Berlin Wall memorial that gets like a million visitors every year. I was interested in who gets to decide what people learn when they come there and in other public spaces and also at big political events commemorating the rise and fall of the wall. How I went about that was by talking to people who have had some impact on basically the story that gets told about the Berlin Wall in public in Germany. That's one thing I did, was I interviewed people. Another thing I did was I attended those major ceremonies, so I was there in Berlin for the 15th, 20th and 25th anniversaries of the fall of the wall, all of which were big events that I write about. I was also there for the 50th anniversary of the building of the wall. Another way I went about this was by consulting archives, all sorts of document collections, as well as following debates in public, particularly in the media, debates about what should be remembered about the wall and how it should be remembered.

 

What were some of the most interesting things that stuck out to you from your research?

One of the people I write a lot about was basically the founder of the National Berlin Wall Memorial and he was a protestant minister, Manfred Fischer, and he said to me, "Look, the Berlin Wall when it fell..." he felt like, "We need to preserve this just like people preserve a crime scene." He said, "This was a crime scene. People were killed here. We must remember that. We need to preserve some parts of the wall as a memorial." One of the interesting things for someone from the outside is the Germans, initially the vast majority of them, didn't want to preserve anything. They just wanted to get rid of it. Now, that's understandable. It had been a terrible thing, so you get it on the one hand, of course they wanted to remove it. On the other hand, they now regret that they didn't save a bit more of it for people to really experience what it was like.

 

What is one thing you hope readers will walk away with after reading your book?

An understanding of the people that I call "memory activists," like Manfred Fischer, people who believe it's important to remember history in public places, even difficult history, which is a big issue in our own country. We're not so good at that. We are only really at the very beginning of grappling in depth with racism, with slavery, with lynching. Germany has really done a lot to confront first the Holocaust, which I also write about in my book, and also the past of the Berlin Wall. Understanding the commitment it takes to fight in an environment where people don't want to remember, to say, "We really need to remember and should remember."

 

Why is this event so important for the public to engage with?

Well, right now in this country, there's much talk about our own wall with Mexico. Now, the Berlin Wall was very different. The Berlin Wall was to keep East Germans in, whereas our wall is to keep people out of the US. As I write about this, [I find that] all walls generally involve violence. The people who served at the Berlin Wall, the border soldiers, some of whom killed people trying to escape across the Berlin Wall, many of them are traumatized to this day 30 years later for the role they played in upholding that lethal border. I think we can learn from that, that walls have very long, complicated legacies even after they are taken down, and that it is not something one should engage with lightly.

 

What other ways, besides writing books, do you engage the public in the historical legacy of the Berlin Wall?

I have given many, many public talks at book stores, at museums. I'm speaking tomorrow at the Library of Congress. I have spoken many places in the US. I have also spoken in London and in Berlin. I have been interviewed on television and on radio and by newspapers really from all over the world, from the US to Germany to England to Russia to Portugal, BBC Spanish. People all over the world are still fascinated by the Berlin Wall and the 30th anniversary of the fall of the wall still seems to get a lot of attention, so I have been doing everything I can to respond to lots of media requests really from all over the world.

 

How do your roles in various historical organizations, like History and Public Policy Program, the Cold War National History Project and the Berlin Wall Memorial Association, allow you to engage with the public?

They all involve either me speaking in public or planning how the institution that I am a part of engages with the public. The Berlin Wall Memorial Association in Berlin, where I am a member of the board, does multiple events for the public throughout the year. When I was asked to join, I was the only non-German, so I am able to bring an outsider perspective to the German public. Also, the Allied Museum in Berlin tells the history of the US, British and French allies and the role they played after World War II, deployed for 40 years in West Berlin and West Germany. Hundreds of thousands of people visit that museum, so I am part of the decisions about what exhibits we are going to have and what events directly engage the public. Here in the US, with the History and Public Policy Program and the Cold War Project, both of which are at the Woodrow Wilson Center here in DC, all of their events are open to the public. Also, they have a fantastic online digital history archive where anybody from anywhere, the public at large, can read original documents from the Cold War.

Americans Are Ready for a Different Approach to Nuclear Weapons

 

Although today’s public protests against nuclear weapons can’t compare to the major antinuclear upheavals of past decades, there are clear indications that most Americans reject the Trump administration’s nuclear weapons policies.

 

Since entering office in 2017, the Trump administration has withdrawn the United States from the nuclear agreement with Iran, scrapped the Intermediate-Range Nuclear Forces (INF) Treaty with Russia, and apparently abandoned plans to renew the New START Treaty with Russia. After an overwhelming majority of the world’s nations agreed on a landmark UN Treaty on the Prohibition of Nuclear Weapons in July 2017, the Trump administration quickly announced that it would never sign the treaty. The only nuclear arms control measure that the Trump administration has pursued―an agreement by North Korea to abandon its nuclear weapons program―appears to have collapsed, at least in part because the Trump administration badly mishandled the negotiations.

 

Moreover, the Trump administration has not only failed to follow the nuclear arms control and disarmament policies of its Democratic and Republican predecessors, but has plunged into a renewed nuclear arms race with other nations by championing a $1.7 trillion program to refurbish the entire U.S. nuclear weapons complex.  Perhaps most alarming, it has again and again publicly threatened to initiate a nuclear war.

 

These policies are quite out of line with U.S. public opinion.

 

Polling Americans in July 2018 about Trump’s withdrawal of the United States from the Iran nuclear agreement, the Chicago Council on Global Affairs found that 66 percent of respondents preferred remaining within it.  In February 2019, when the Chicago Council surveyed Americans about U.S. withdrawal from the INF Treaty, 54 percent opposed the action.  Moreover, when Americans were presented with arguments for and against withdrawal, opposition to withdrawal rose to 66 percent. 

 

The Center for International & Security Studies at the University of Maryland also reported overwhelming public support for nuclear arms control and disarmament agreements.  Polling Americans in early 2019, the Center found that two-thirds of respondents (including a majority of Republicans) favored remaining within the INF Treaty, while eight out of ten respondents wanted the U.S. government to extend the New START Treaty. Indeed, more than eight out of ten U.S. respondents backed new nuclear arms control treaties with Russia―findings similar to those of the Chicago Council, which reported that 87 percent of American respondents to a poll in early 2019 wanted the United States and Russia to secure a further nuclear arms limitation agreement.

But just how much arms control and disarmament do Americans want?  It might come as a shock to the many pundits in the mass media who have never mentioned the 2017 UN Treaty on the Prohibition of Nuclear Weapons, but roughly half the U.S. population supports nuclear abolition along the lines of the treaty.  According to a YouGov opinion survey done in late September 2019, 49 percent of American respondents thought the United States should work with other nations to eliminate all nuclear weapons in the world. Only 32 percent disagreed, while 19 percent said they didn’t know.

 

When it comes to actual use of nuclear weapons, Americans are even clearer in their preferences. A YouGov/Huffington Post poll in August 2016 found that 67 percent of American respondents thought the U.S. government should never initiate a nuclear attack. In mid-2019, Zogby Analytics surveys of American respondents in key primary states also discovered very high levels of opposition to first use of nuclear weapons.

 

Not surprisingly, Donald Trump’s angry, impulsive behavior, coupled with his threats to launch nuclear attacks upon other nations, has left many Americans uneasy. This might help to explain why 68 percent of Americans surveyed in early 2019 by the Center for International & Security Studies backed congressional legislation requiring that a president, before ordering a nuclear attack upon another nation, consult with Congress and secure a congressional declaration of war. As the U.S. Congress has not passed a declaration of war since 1941, this opinion, too, poses a substantial challenge to current U.S. nuclear policy.

 

There are other indications, as well, that the American public wants a new approach. In July 2019, the U.S. Conference of Mayors, at its 87th annual meeting, unanimously passed a resolution calling on all U.S. presidential candidates “to pledge U.S. global leadership in preventing nuclear war, returning to diplomacy, and negotiating the elimination of nuclear weapons.” Calling for negotiations to replace the INF Treaty and to extend or replace the New START Treaty, the resolution demanded that candidates support the Treaty on the Prohibition of Nuclear Weapons and renounce the option of first use of nuclear weapons.

 

Yet another sign of public discontent is the emerging Back from the Brink campaign, supported by numerous peace, environmental, religious, health, and other organizations.  Endorsed by dozens of cities and towns across the country, it has also received the official backing of the state legislatures of California and Oregon, as well as of the New Jersey State Assembly and the Maine State Senate.  The campaign calls on the U.S. government to “lead a global effort to prevent nuclear war” by: “renouncing the option of using nuclear weapons first”; “ending the sole, unchecked authority of any U.S. president to launch a nuclear attack”; “taking U.S. nuclear weapons off hair-trigger alert”; “cancelling the plan to replace its entire nuclear arsenal with enhanced weapons”; and “actively pursuing a verifiable agreement among nuclear-armed states to eliminate their nuclear arsenals.”

 

Looked at from the standpoint of most Americans and, indeed, survival in the nuclear age, this departure from the dangerous direction of U.S. nuclear policy makes a lot of sense.  Looked at from the standpoint of candidates seeking election to national office, it would also make good politics.

 

The Real Alexander von Humboldt: A Scientist of the Romantic Age

 

Alexander von Humboldt was born 250 years ago this fall. As his legacy is celebrated across the globe, I continue to be struck by the grandiose claims that are made about him in the existing literature. The narrative that emerges is of an intrepid explorer, striding out to accomplish a range of spectacular, almost superhuman achievements, in a life dedicated to the single-minded pursuit of scientific endeavour.

 

Looked at more closely, though, things appear a lot less straightforward. Humboldt’s most celebrated achievement is his ascent of the Chimborazo – believed, at the time, to be the highest mountain on Earth. While the climb was an astonishing feat, the fact remains that the Chimborazo is not the tallest mountain in the world at all. Further, Humboldt had to turn back at around 5,600 meters and never reached the summit.

 

Another claim concerns Humboldt’s supposed discovery of the Casiquiare Canal, a previously unknown waterway linking the great water systems of the Orinoco and the Amazon. The Casiquiare, however, wasn’t undiscovered. Not only did local people know about it and use it for trade, but Europeans had known of it as early as 1745, when the explorer Charles Marie de la Condamine informed the Académie française of its existence.

 

Humboldt almost invented the electric battery – in fact, he came very close. Except that he didn’t actually invent it; that was Alessandro Volta. Humboldt had experimented with the elements of the Voltaic pile – two different metals in an electrolytic salt solution – but only inadvertently, and he realized only later how close he had been.

 

And then there is the most famous monument to his name, the Humboldt Current. Again, Humboldt didn’t discover it – it was well known to Peruvian fishermen at the time. He never took credit for its discovery, nor did he refer to it as the Humboldt Current. 

 

He did invent isotherms – the idea of linking points of equal temperature across the globe. They are still in use today – they are the lines we’re used to seeing on weather maps – but, somehow, this feels like a rather meagre fruit of a scientific approach based on measuring, and comparing, everything possible.

 

A closer look at Humboldt’s letters dismantles the idea of a superhuman figure, one whose life consists of a series of triumphs that we can only marvel at from a distance. Humboldt’s life was guided by a different underlying principle: that of German Romanticism.

 

One of the defining characteristics of early German Romanticism is the urge towards transcending boundaries. This translates into a seeking out of the open-ended and the unfinished, and, conversely, a rejection of all that is known and clearly demarcated. Humboldt’s motivation was personal at least as much as it was scientific, growing out of a Romantic instinct towards the undefined, the wide and the open.

 

So instead of setting out for a particular destination on a strictly planned course of scientific exploration, Humboldt’s journey was, to a remarkable extent, shaped by chance. First, Humboldt was hoping to go to Italy. When this was thwarted by Napoleon’s Italian campaigns, he aimed for the West Indies instead. This was followed by plans to travel to Egypt, to Morocco, and even to the South Pole. Even as he was on a ship bound for South America, he believed that his destination would be Cuba – it was an outbreak of typhus on board the Pizarro, the ship he was travelling in, that decided him to disembark at Cumaná. Later, there was a plan to sail to the Philippines. Humboldt would have gone anywhere: “I would have sailed to the remotest South Seas, even if it hadn’t fulfilled any scientific purpose whatever”.

 

Another clue to the strong Romantic impulse that propelled him is the type of thing Humboldt noticed on his travels, and thought worth noting down. For a scientist who had the methods of the Enlightenment at his disposal, and used them with great competence, even virtuosity, some of his observations are astonishing. In the beginning especially, he tended to see everything through eyes that were not only Eurocentric, but also schooled by German Romanticism.

 

The Rio Manzanares, just outside Cumaná, for example, was “very like the river Saale near Jena”. The mountain range in Venezuela that separated the coast from the interior reminded him of Switzerland and the Tyrol – he actually called it “the American Alps”. And looking down from it towards the coast, he thought the landscape was “just like the background of the Mona Lisa”. 

 

Sometimes, the landscape evoked is a very specific one, and one heavily associated with Romanticism. A stretch of landscape near the Orinoco reminded him of a particular area near Bayreuth – the setting of Jean Paul’s Romantic novel Siebenkäs. Lake Valencia, in Venezuela, struck him as just like Lake Geneva – with a rock face calling to mind the precise scene in Jean-Jacques Rousseau’s La Nouvelle Heloise where Saint-Preux and Julie almost come to grief.

 

So what was it that Humboldt was looking for? Humboldt imposed Romanticism on what he found, but this relationship was to be a reciprocal one. He wanted the New World to make an impression on him, too, and to be changed in turn.

 

It is that desire that was most succinctly expressed by Goethe, in his book Elective Affinities, written after Humboldt’s return from the Americas. Goethe makes one of the characters, Ottilie, refer to Humboldt before stating that “no one can walk beneath palm trees with impunity”. And Humboldt certainly didn’t want to. What he wanted is, again, perfectly expressed by Ottilie: “Doubtless, his tone of thinking becomes very different in a land where elephants and tigers are at home”. I think this is exactly what Humboldt was setting out to do.

 

It is almost with triumph that he writes home to his botanist friend Willdenow, who is moored in Berlin with wife and child: “How fortunate you are to not see these impenetrable forests on the Rio Negro, these worlds of palm trees – it would seem impossible after that to get used to a pine wood again”. He wanted the tropics to leave a mark on him – and so is at pains to emphasise that they had had a physical, measurable effect on him: he is, he reports, browner, fatter, happier. On the Orinoco, he persuaded some local men he encountered to paint his face to match theirs, only to find that the colour couldn’t be washed off – as he writes, with satisfaction, it seems to me, in a footnote. And he has been cured of the sickliness that had dogged him since his youth (there was a whole range of intriguing illnesses: a swollen cheek, a slime fever) – on his travels, he boasted, he never had “even as much as a headache”.

 

I share with Humboldt an aversion to building pedestals and putting people on them. In fact, when there was a plan to erect a bust in Humboldt’s honour, he instinctively recoiled: “dreadful news!”, he said, and protested that if it were to come to pass, it would distress him so much that it would stop him from working for months.

 

Putting people on pedestals, beyond the reach and understanding of lesser mortals, does not help us understand them better. If, perhaps, we lose a hero, we may gain, in Humboldt, an extraordinary scientist who was affected by the extraordinary times he lived in – and, most importantly, a fellow human being who allows us to look him in the eye.

What Carl Sandburg and Wendell Berry Can Teach Us about Relating to Trump Voters

 

With campaigning for the 2020 presidential election already underway, Democrats are asking themselves how to approach Trump supporters. Of special concern are the 9.2 percent of Obama voters who voted for Trump in 2016. What motivated their choice of Trump, and how to relate to his supporters now, are thus key questions.

 

Regarding motivation, George Saunders wrote in 2016 that many Trump supporters suffered from what he called “usurpation anxiety syndrome,” which he defined as “the feeling that one is, or is about to be, scooped, overrun, or taken advantage of by some Other with questionable intentions.” A mid-2017 article in The Nation proclaimed that “racial attitudes towards blacks and immigration are the key factors associated with support for Trump.” In Defense of Elitism (2019), by Joel Stein, argues, however, that “the main reason Trump won wasn’t economic anxiety. It wasn’t sexism. It wasn’t racism. It was that he was anti-elitist. Hillary Clinton represented Wall Street, academics, policy papers, Davos, international treaties, and people who think they’re better than you.”

 

Throwing more light on these three comments are various surveys of Trump voters. Two-thirds of non-college whites supported him, and he ran especially strong in rural areas and small towns among those who had never moved from their home state. About this time, “rural residents were nearly three times as likely (42 percent) as people in cities (16 percent) to say that immigrants are a burden on the country,” and rural whites were much more likely than urban whites to believe that “whites [are] losing out because of preferences for blacks and Hispanics.” 

 

Thus, “usurpation anxiety,” racism, anti-elitism, limited education, and rural/small-town existence all mattered, but historians are taught to avoid what David Hackett Fischer labels the reductive fallacy, which “reduces complexity to simplicity.” So the anti- and never-Trumpers must be careful about over-emphasizing any one factor.   

Worth pondering are the words of J. D. Vance, author of the best-selling Hillbilly Elegy. He believes that anti-Trump individuals should avoid labeling many Trump supporters as rednecks, hillbillies, or white trash. In an interview with The American Conservative’s editor Rod Dreher, he observed that, as Kris Kristofferson put it in “Jesus Was a Capricorn,” “everybody’s gotta have somebody to look down on.” And, as Vance says, “if you’re an elite white professional, working class whites are an easy target: you don’t have to feel guilty for being a racist or a xenophobe. By looking down on the hillbilly, you can get that high of self-righteousness and superiority without violating any of the moral norms of your own tribe. So your own prejudice is never revealed for what it is.”

To aid in the avoidance of such bias, of simplistic labeling and sweeping generalizations, two great American writers, Carl Sandburg (1878-1967) and Wendell Berry (b. 1934), can help guide us. The first was a poet, Lincoln biographer, and folk-song collector and singer. His friend Adlai Stevenson (the Democratic presidential nominee in 1952 and 1956) once said that he “is the one living man whose work and whose life epitomize the American dream.” Like Sandburg, Berry is a writer of both poetry and prose. In 2011, he received the National Humanities Medal from President Obama. Although Berry is a pacifist who often has written for The Progressive magazine, conservative editor Rod Dreher has stated, “I am a great admirer of Wendell Berry’s, and agree with much of his diagnosis.” (Like many conservative intellectuals, Dreher has also been a severe critic of President Trump.)

Both Sandburg and Berry have combined strong criticism of racism (see here and here) and anti-immigrant prejudice with great empathy for common citizens. In 1911 Sandburg wrote, “Woman, the common woman—the wife of the workingman—is the slave of a slave, cooking, sewing, washing, cleaning, nursing in sickness, and rendering a hundred personal services daily for a man who is himself not in power to dictate a constant job and living wage for himself.” (See here for the sources of Sandburg quotes.) In “I Am the People, the Mob,” in Chicago Poems (1916), he wrote, “I am the people—the mob—the crowd—the mass. / Do you know that all the great work of the world is done through me?”

In 1927, referring to The American Songbag, which he was then preparing for publication, he wrote "It is not so much my book as that of a thousand other people who have made its 260 colonial, pioneer, railroad, work-gang, hobo, Irish, Negro, Mexican, gutter, Gossamer songs, chants and ditties.” His poem The People, Yes (1936) is a book-length affirmation of the common people despite their shortcomings.  As his friend Harry Golden said of him in 1961, “His instincts are with the people. He believes they have an infinite capacity for good.”  

His praise for two of his favorite presidents, Lincoln and Franklin Roosevelt (FDR), flowed largely from their concern for ordinary citizens. Toward the end of his last volume on Lincoln and the “war years,” he wrote: “And to him [Lincoln] the great hero was The People. He could not say too often that he was merely their instrument.” 

On the eve of the 1940 presidential election, Sandburg spoke on a national radio broadcast on behalf of FDR’s reelection. In his speech, Sandburg quoted a man who in 1863 stated that the explanation for every act of Lincoln was that “he executes the will of the people. . . . His wisdom consists in carrying out the good sense of the nation.” Sandburg then added, “and for some of us, that goes in the main in the present hour of national fate, for Franklin Delano Roosevelt.”

Wendell Berry, about whom I have written many previous essays, shares Sandburg’s empathy for ordinary people. For decades he has lived and worked as a part-time farmer in a small Kentucky “rural community” that, like others “all over the country,” is “either dying or dead.” In such areas, as noted above, most people voted for Trump. Berry has strong sympathy for them, but not for Trump, whom he thinks “indulges his worst impulses and encourages the worst impulses of others.”

But in the Introduction to his latest collection of agrarian writings, he is also critical of “the venom, the contempt, and the stereotyping rhetoric” that some liberal intellectuals displayed against “rural America” and the “working class” people who voted for Mr. Trump. Such liberals, Berry believes, are ignorant of U.S. economic history and of “the long-term effects of unrestrained global capitalism” that hurt both small-scale farmers and many industrial workers.

In a late 2018 interview, he indicated that although suffering workers and farmers made a mistake in voting for Trump, “people who are hopeless will do irrational things.” Their vote at least got liberal critics of rural people to notice their pain.  Earlier many liberals apparently didn’t realize that “with their consent, urban America has been freely plundering rural America of agricultural products since about the middle of the last century—and of coal for half a century longer.”

 

In that same interview Berry criticized a U.S. mentality that insisted “that the best life is the freest life.” In some ways, Trump’s disregard of ethical norms and self-discipline reflects this desire to be unregulated. As opposed to such “freedom,” Berry emphasizes community, ethical limits, being a good neighbor, and taking good care of our environment. “We really have to turn against the selfishness of the individualism that sees everybody as a competitor of everybody else.”

 

In a 2008 essay, “Faustian Economics,” he wrote that “in our limitless selfishness, we have tried to define ‘freedom’ . . . as an escape from all restraint.” But “in neighborliness, stewardship, thrift, temperance, generosity, care, kindness, friendship, loyalty, and love,” self-restraints are implied. He also indicated a concern that has increased during the last decade--global warming, which he believed flowed from greed and waste. 

On occasion he has worked with Bill McKibben, one of the USA’s “most important environmental activists” and co-organizer of a massive September 2019 global climate strike involving 4 million people in 163 countries. In 2016, McKibben wrote that he would “love to have our leaders sit down with Wendell Berry’s novel Jayber Crow” because it illustrates well how a working community operates, adding that “Berry has long been the great novelist, poet, and essayist of community.”

 

In The Unsettling of America: Culture & Agriculture (1977), he contrasted the goals of U.S. exploiters with those of nurturers like good small farmers. While the first sought money and profit, the second were more concerned with the health of their land, family, and community. To Berry’s regret, the exploitive mentality, present for a long time in America, had become dominant in corporate America.

In his 2012 Jefferson Lecture, he spoke of corporate industrialism’s failure to concern itself with the common good. “No amount of fiddling with capitalism to regulate and humanize it, no pointless rhetoric on the virtues of capitalism or socialism, no billions or trillions spent on ‘defense’ of the ‘American dream,’ can for long disguise this failure.”

Both Sandburg and Berry have combined their empathy for common workers with strong criticism of capitalist flaws--Sandburg was a pre-World-War-I socialist.  Both also believed, as Berry said in his 2018 interview, “in the importance of conversation,” adding, “it’s either that or kill each other. . . . What we need to do is submit, for example, to the influence of actually talking to your enemy. Loving your enemy.”

 

In a similar spirit are my words of two years ago: “In our time of bitter political rancor, when the excesses of the Trump administration are so egregious and we are tempted to lash out at any Trump supporters in anger . . . we should temper our passion for justice and truth with kindness, love, empathy, humility, humor, and tolerance.” 

 

These virtues are also ones Sandburg and Berry have admired, encouraged, and tried to live. As the 2020 presidential race intensifies, we should follow their example.

Trump Is the Most Corrupt President in History

Ronald L. Feinman is the author of “Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama” (Rowman & Littlefield Publishers, August 2015). A paperback edition is now available.

 

With President Donald Trump’s impeachment on Wednesday, December 18th, journalists and historians are reexamining the history of presidential corruption. After carefully reviewing this history, I believe Trump’s presidency is the most corrupt in American history.

 

Before I get to that conclusion, it’s important to review the presidential scandals that preceded Trump. One might argue that every presidency has some episodes and personnel that might be considered corrupt, but seven presidencies particularly stand out for their scandals. For this analysis, I am not including accusations of sexual liaisons, as they did not affect government policies and enforcement. Thus, the dalliances of Warren G. Harding, Franklin D. Roosevelt, John F. Kennedy, Lyndon B. Johnson, and Bill Clinton are not considered in this discussion of presidential corruption. One Democrat and six Republicans make up this unfortunate list: Democrat Andrew Jackson and Republicans Ulysses S. Grant, Warren G. Harding, Richard Nixon, Ronald Reagan, George W. Bush, and Donald Trump.

 

Andrew Jackson infamously introduced the concept of the “spoils system” to American government. Jackson believed the mantra “to the victor belongs the spoils” and nearly 40 percent of all government employees were replaced by party loyalists. Many of these new appointees had minimal or nonexistent credentials for their jobs.  

 

Martin Van Buren, a “Kitchen Cabinet” advisor who served as Secretary of State and Vice President under Jackson, created  the Albany Regency political machine in New York and pushed Jackson to give jobs to political allies. Newspaper editors who favored Jackson were granted special favors and malfeasance of political appointees in handling government funds was common.  Jackson ushered in a fifty year period of widespread cynicism about the commitment of government workers to conduct the public business in an ethical manner.

 

The worst excesses of the Jacksonian spoils system occurred during Ulysses S. Grant’s presidency. Scandals emerged in the Navy, Justice, War, Treasury, Interior, and Post Office Departments, as well as the New York Customs House. Grant was very naïve about people’s motivations, and allowed himself to be manipulated by military associates and people who flattered him in order to gain access to lucrative financial deals at a time of great transformation and development of the industrial economy. Grant was never proven to be directly involved with the scandals, but his association with some people of questionable character, and his acceptance of personal gifts, undermined his reputation and presidential legacy.  

 

The 12 scandals under Grant led to the removal from office of four cabinet members and first-term Vice President Schuyler Colfax. This corruption is often labeled the Crédit Mobilier scandal, but that affair actually began before Grant was in office and continued through his administration. The Black Friday Gold Panic, New York Custom House Ring, and Whiskey Ring scandals also occurred during Grant’s presidency and reveal an endemic and disgraceful level of corruption. The Liberal Republican Movement of 1872 was a reaction against the Grant Administration scandals, and ultimately led to the civil service reform movement promoted by the Mugwump faction in the party led by Carl Schurz, Charles Francis Adams, Jr., Mark Twain, E. L. Godkin, and Thomas Nast, among others.

 

With the establishment of the Civil Service Commission in 1883 by the Pendleton Act under President Chester Alan Arthur, corruption did not plague the presidency again until Warren G. Harding in the early 1920s. Like Grant, Harding was naïve about the intentions of the “Ohio Gang,” the Ohio politicians he appointed to high political office. The Teapot Dome affair and related scandals embroiled Secretary of the Interior Albert Fall, Secretary of the Navy Edwin Denby, Attorney General Harry Daugherty, and Veterans’ Bureau head Charles Forbes. Investigations of these corrupt officials were in full swing when Harding suddenly died on August 2, 1923, just 2 years and 5 months into his term. Harding was aware of the moral and ethical collapse of his administration and was depressed about that reality.

 

Fifty years later, Richard Nixon came into office with a distrust of the news media and a desire to get revenge on his “enemies” in government and journalism. For Nixon, fighting his enemies meant using every tactic, including wiretapping, break-ins, bribes, and encouraging the Internal Revenue Service to audit his opponents. Nixon was so brazen he even kept tape recordings of everything occurring in the Oval Office, including discussions of illegal activities.

 

What ultimately brought Nixon to resign was the burgeoning Watergate scandal, the attempt by Nixon operatives to bug the headquarters of the Democratic National Committee to find out their tactics and strategies for the 1972 presidential campaign. The Washington Post sent Bob Woodward and Carl Bernstein to investigate the unsuccessful break-in of June 17, 1972. With the help of Deep Throat, Deputy Head of the FBI Mark Felt, their reporting helped spur a congressional investigation in 1973 and 1974, leading to an impeachment inquiry. After the Supreme Court decided in United States v. Nixon that the President must hand over the Watergate tapes to the Special Prosecutor and to Congress, Nixon resigned on August 9, 1974. A total of 76 government officials were charged with crimes in the Watergate scandal; 55 were convicted, and 15 served prison sentences. Nixon avoided prosecution when he was pardoned by his successor, Gerald Ford, on September 8, 1974.

 

When Ronald Reagan came into the presidency, corporate influence and malfeasance revived, reaching into the cabinet and other government agencies, including the Departments of Defense, Justice, Housing and Urban Development, and Interior, as well as the Environmental Protection Agency and the Central Intelligence Agency. Reagan appointees--including Attorney General Edwin Meese, Secretary of Defense Caspar Weinberger, National Security Advisers Robert McFarlane and John Poindexter, HUD Secretary Samuel Pierce, Secretary of the Interior James Watt, White House Press Secretary Lynn Nofziger, Deputy Chief of Staff Michael Deaver, EPA head Anne Gorsuch Burford, CIA head William Casey, and Oliver North--were embroiled in multiple scandals involving money and lawbreaking. Many of those involved were indicted (26), convicted (16), and sentenced (8). Several of the indicted officials were given clemency by incoming President George H. W. Bush, who denied any personal involvement in or knowledge of the scandals.

 

Although the Reagan Administration surpassed the Nixon presidency in the number of well-known figures who were embroiled in corruption, Reagan left office with strong public approval. His personality and public image helped him survive in office.  

 

Under President George W. Bush, great controversy developed over the roles of Vice President Dick Cheney, Secretary of Defense Donald Rumsfeld, Attorney General Alberto Gonzales, Advisor Karl Rove, White House Chief of Staff Andrew Card, Cheney Chief of Staff Lewis Libby, and other officials and agencies involved in the planning and execution of the Iraq War and Afghanistan War, the reaction to Hurricane Katrina, and the economic meltdown that led to the Great Recession of 2008-2009. Bush administration officials received 16 indictments, 16 convictions, and 8 prison sentences.

 

These failures led to a rapid drop in Bush’s public opinion ratings, and he left office as the most unpopular President since Richard Nixon. Bush hurt his party and undermined any possibility of the 2008 Republican presidential candidate, Senator John McCain, winning the election.

 

Now, as Donald Trump faces an impeachment trial in the US Senate, the level of corruption and scandal is the greatest since Nixon and Reagan. Just to recap: National Security Adviser Michael Flynn was forced to resign over contacts with Russian government officials and his lobbying activities during the presidential campaign. Attorney General Jeff Sessions had to recuse himself from any investigations of Russian hacking during the presidential campaign of 2016. We have seen convictions not only of Michael Flynn, but also of Michael Cohen, Rick Gates, Paul Manafort, George Papadopoulos, and Roger Stone, and many more are likely to come.

 

Many other cabinet members have come under fire for incompetence or conflicts of interest, including Rick Perry, Betsy DeVos, Mick Mulvaney, Wilbur Ross, William Barr, Mike Pompeo, and past appointees Ryan Zinke and Scott Pruitt. Trump himself has violated the Emoluments Clause of the Constitution, which prevents any President from profiting from his personal business ventures while in office. While all politicians can be accused of lying and deceit at some point in their careers, Donald Trump has set a record that has caused many observers to contend that he is the “Liar in Chief”: as recorded by the Washington Post, he has lied more than 15,000 times in less than three years.

 

While the presidency has often been embroiled in scandal, the corruption that led to Donald Trump’s impeachment stands out in history.

Impeachment Wounded the Republic

 

Why is the House of Representatives impeaching President Donald Trump for abuse of power and obstruction of Congress? I feel profound shock and sadness that the Congress will impeach the President not on the basis of a statute or a breach of the Constitution per se, but on the two flimsiest, most generic charges possible. The Democratic majority in the House is about to mortally wound the Republic as we know it. As a historian and political theorist, I believe there is no way the American system of government can recover from an impeachment on these grounds.

 

Many scholars argue that our system of government is modeled not on Athenian democracy but on Roman republicanism. In political theory, the idea behind republicanism is called the ethics of non-domination.

 

In layman’s terms this means that, ultimately, no one person or group of people can systematically deny the rule of law to another individual or group of people. Basically, by denying the rule of law to others we dominate them, in the political sense if not in others as well. Part of the rule of law is due process, and denying due process to anyone, even powerful people, is a violation of the rule of law. Republicanism goes back to Aristotle, who thought that while democracy had merits, the unqualified rule of the majority was deeply problematic.

 

Aristotle was denied citizenship in Athens his whole life just for being born in the wrong city-state. This left him craving political fairness, what we might call social justice. He argued that where the powerful and wealthy ruled, justice was denied to the poor and the politically and socially marginalized. However, he also believed that when the masses of people, led by demagogues amongst the wealthy, were given full rein, the wealthy and powerful were denied justice. He believed, at that time almost alone, that equal justice for all, regardless of power, status, or ability, was essential to justice, social or otherwise.

 

He rejected the rule of man, or the rule of people. Arguing that only the law should rule, Aristotle believed that no matter who ruled, everyone had certain basic rights and that institutions needed to balance each other out to ensure that no one person or interest group prevailed permanently and systematically over any other. Ultimately, equal protection under the law and due process, as well as the idea of res publica (the public thing), were incorporated into the idea of polity, or a republic.

 

Aristotle influenced many great thinkers in both the Northern European and Southern European traditions. He also deeply influenced the Founders, including James Madison, John Jay, and Alexander Hamilton, who wrote the Federalist Papers.

 

Aristotle and Rome, the two models that gave us the ethics of non-domination, checks and balances, and republicanism, were defeated by the same foe: political polarization. Political parties in ancient times were not formal, but they were what James Madison in Federalist No. 10 called “factions.”

 

Aristotle was forced to flee Athens because he was viewed as the “enemy” by the faction in control there, a group reminiscent of modern-day progressives. They argued against the rule of law in the Aristotelian sense and wished to punish the wealthy and powerful just because they were of the “other faction.” The poor and their demagogic leaders, as they had done before, demanded progress at any cost or political freedom for none. They received the latter.

 

By contrast, the Roman Republic collapsed for many reasons, but political polarization and the resulting breakdown in the rule of law were the main culprits. In 133 BC, a man named Gracchus, leading a progressive faction, promised to seize public land controlled by wealthy Romans and redistribute it without compensation. In order to prevail, Gracchus was prepared to violate the rule of law to win. The Senate ordered a consul, sort of a co-president, to kill Gracchus without due process. Instead, a senatorial mob murdered Gracchus and got away with it.

 

The Senate ordered Gracchus killed because of unproven rumors within its faction that he wanted to make himself an absolute monarch. Essentially, they accused him of abuse of power and contempt of the Senate. While Gracchus’ radical plans would have harmed the stability of the Republic, it was the Senate’s overreaction that truly devastated the Roman republic.

 

I believe this history is entirely relevant to this exact historical moment. Right now, one house of Congress, based on flimsy evidence, is about to charge the President with abuse of power and contempt of Congress, hoping that his power and his faction will be destroyed by his impeachment and eventual removal.

 

This scenario is the exact opposite of Gracchus’ in Rome, but the result will be the same. Here progressives are trying to destroy a conservative leader viewed as a prospective tyrant. The American republic, if it has not already done so, will enter inexorable decline, just as the Roman Republic did in 133 BC. Destroying or trying to destroy a legitimately elected, but distasteful, president in an unconstitutional manner will ultimately subvert the constitutional order, as the senators found out after they had Gracchus murdered. In other words, trying to support the Constitution by abusing power is like trying to snuff out a small fire by burning your house down.

 

The founders did not mean for abuse of power, absent a legal or constitutional breach, to be impeachable. The articles of impeachment that have come out do not cite any statute or part of the Constitution, except the prerogatives of Congress. In other words, Congress, the House in particular, seeks to abuse its power to correct what it views as abuses of power. Ironically, this time the Senate will impede the constitutional breach rather than exacerbate it, as the Roman Senate did. However, if impeachment is not defeated, a bad precedent will be set, one which will speed the decline of the republic.

China Needs to Change Strategy After Hong Kong Elections

 

There is no mistaking that the Hongkongers’ overwhelming vote on 24 November was a clear and unequivocal rejection of Beijing and its appointed government in Hong Kong. They voted for greater democracy, including free municipal elections, personal political liberties, and the rule of law. It was a rejection of political abuse and repression, and an irrefutable popular endorsement of the demands of the protesters, if not their violent tactics.

 

Of the 3 million votes cast in these municipal elections, pro-democracy candidates garnered 1.7 million. By any calculation, this is a clear majority. Beijing can try to dismiss or downplay these results. The Chinese Communist Party’s media mouthpieces can hyperventilate about ‘foreign interference’ behind the protests and mythical voting irregularities. And the humiliated pro-China parties here, seeing their ranks on the local councils decimated, can try to split mathematical hairs, arguing, for example, that the pro-democracy camp ‘only’ won 60% of the vote—as if a 71.2% turnout amounts to anything other than an electoral landslide.

 

But putting the spin aside, there is no mistaking that the pro-China bloc in Hong Kong took a pasting of incredible proportions. The only question remaining is how communist rulers and their minions running Hong Kong will respond to this expressed will of the true silent majority. The options are either reform or further repression.

 

For many who know China and its rulers, the clear answer is repression. They recall the brutal crackdown of June 4, 1989, when troops of the People’s Liberation Army crushed the last major pro-democracy uprising by massacring hundreds, if not thousands, of young protesters and other Chinese citizens who were supporting the occupiers of Tiananmen Square. The Tiananmen massacre, which Beijing has tried to erase from the collective consciousness ever since, stands as a testament to how far a tyrannical autocratic regime will go to stay in power. But another, more recent event in China presents a parallel option, indicating that when faced with a challenge to its authority, China’s rulers can also be adroit at accommodation and compromise to lessen a crisis. Instead of Tiananmen Square on June 4, 1989, there is the example of Wukan in 2011.

 

Wukan is a small fishing village in southern Guangdong Province. There, in late 2011, villagers fed up with the seizure of their land revolted. They forced communist party officials and police out of the village and took control of the local government.

 

Rather than move in with force to crush the village uprising, the central government eventually sent in the provincial party secretary of Guangdong, Wang Yang, who was seen at the time as a leading economic reformer. Wang made important concessions to resolve the weeks-long uprising. These included freezing some of the land deals at the heart of the dispute, releasing jailed villagers from custody, and firing some obstructive local officials.

 

This was a recurring pattern. For example, when faced with a series of labor strikes in 2010 and 2011, Chinese authorities responded first with the police, sparking violent clashes, but eventually urged local provinces to raise the minimum wage.

 

In 2009, a 19-year-old man named Sun Zhongjie, working as a driver for a construction company, was fined for operating an ‘illegal taxi’ after picking up a man who flagged him down on the road. The Shanghai traffic police were running a sting operation against motorists suspected of offering illegal rides. Sun was fined the equivalent of $2,000 and fired from his job. He made a dramatic public declaration of his innocence by chopping off his little finger. After Sun’s case sparked an uproar on social media, he won his case and didn’t have to pay the fine. Hundreds of other drivers who had been caught in this scheme received refunds. This was certainly a shift in strategy from that employed by the ‘Great Helmsman’ Mao Zedong.

 

Today, Chinese rulers have become more skillful at monitoring public opinion. They employ a vast network of monitoring centers at universities and state-run news agencies. They follow trending discussions online and direct local officials to defuse potential crises before they start. More recently, the rulers in Beijing have put in place ‘facial recognition’ and ‘social credit’ systems to monitor the Chinese people more extensively. There is now an ‘Orwellian tinge’ pervading Chinese society.

 

Of course, this situation comes from the top down. There are major differences between the situation in China now and even a few years ago. President Xi Jinping, who has eliminated term limits, effectively allowing him to rule for life, seems far less interested in compromise and concessions than his predecessor Hu Jintao. As seen by his actions in Xinjiang, where a million Uyghurs have been interned in concentration camps bearing the Orwellian name ‘education centers,’ Xi seems more interested in ruling through power and fear than negotiation. In some ways, he is a throwback to the suppression tactics and ‘personality cult’ of Mao Zedong.

 

Also, since the relatively open 2010-2013 period, Beijing’s stifling of the internet has been nearly complete. Weibo, once a freewheeling platform for debate, is a shell of its former self. Online dissent can be more quickly suppressed. The media is more strictly controlled than ever. Even virtual private networks used to bypass the censors are more difficult to use.

 

But that crackdown on the internet and the media breeds its own problems for Chinese leadership—like leaving them ignorant of the depth of popular discontent in Hong Kong.

 

There has been some informed speculation that China’s leaders, and particularly Xi, were unpleasantly surprised by the results of the Hong Kong vote and the overwhelming defeat of the pro-China supporters. One thing that has become certain over the years is that China’s leaders do not want to be surprised. They were surprised, for example, by the Falun Gong spiritual movement in 1999, when it staged a sit-in outside the Chinese Communist Party compound. To prevent its members from coming to Beijing to protest, city and provincial leaders resorted to brutal methods, including torture and murder, to crush Falun Gong at its grass roots.

 

With the depth of dissatisfaction in Hong Kong now made patently clear at the ballot box, China and Xi now face a difficult choice. Will it double down on repression or listen to the people and choose a new course that will require some compromise?

 

The outlines of this compromise have been known for months. Hapless Chief Executive Carrie Lam and her incompetent cabinet and advisors must resign. An independent commission must be empowered to investigate the violence and excessive force used by Hong Kong police against the protesters. A ‘truth commission’ must investigate the causes of the unrest and offer amnesty to any protesters who didn’t injure anyone or cause serious property damage—and that would mean releasing most of the thousands arrested on ‘rioting’ charges.

 

And finally, Beijing needs to respond to the people’s clear aspirations for autonomy by announcing a relaunch of the long-stalled political reform process, with a promise to eventually allow all Hongkongers to vote for their leaders.

 

Whether Beijing will opt for the reform path remains to be seen. As with the adoption of the reform and opening policy in 1978, China’s communist leaders have shown they can adroitly shift positions and adapt when they need to. And in Hong Kong right now, shifting and adapting is what they clearly need to do.

The Real Santa Who Fed the Hungry

 

The Santa Claus we know and love was based on St. Nicholas, a 4th century Christian bishop from what is now modern day Turkey. St. Nick's idea of Christmas today would be gifts of food for the millions of starving people in the world. 

 

George McKnight's 1917 book on St. Nicholas describes the legend of the saintly man's dedication to feeding the hungry: "It was so on a time that all the province of S. Nicolas suffered great famine, in such wise that victual failed. And then this holy man heard say that certain ships laden with wheat were arrived in the haven."  

 

Those ships were just making a stop and were actually en route to another destination to deliver the wheat. St. Nicholas went to the ships pleading for food donations to help his starving people. However, the ship captains said they could not spare any because the cargo had been measured. The Emperor in Alexandria, for whom they were making the delivery, would punish them if they gave away food before reaching their destination.

 

St. Nicholas promised them, in God's name, that any amount they gave would not lessen their cargo. The ship captains agreed and unloaded food for Nicholas to distribute to his people. 

 

It is written "Then this holy man distributed the wheat to every man after that he had need, in such wise that it sufficed for two years, not only for to sell, but also to sow."  The communities of St. Nicholas now had food again.  And when those ships continued on and arrived at their destination their cargo measured no less than before. It was another miracle they attributed to Nicholas, servant of God. 

 

The greatest gift one can give is food to a starving person. Doing so does not diminish what you have, but rather lifts up the hungry and yourself. This story of St. Nicholas is a great example for anyone to be a great negotiator in getting food to the hungry. Imagine if every leader in the world followed that example of St. Nicholas. 

 

Elizabeth Rundle Charles' book on saints tells of St. Nicholas and his passion for helping the poor and hungry calling him "the saint of the people, of the oppressed and of strangers." 

 

St. Nick's kindness and the poem "A visit from St. Nicholas", (known as "Twas the night before Christmas") led to the creation of Santa Claus and also the gift giving in today’s holiday.  We all know that sometimes this goes to the extreme. If you turn on the television in December you will definitely see commercials about sales and super expensive products. Christmas shopping can easily get wild and excessive.  

 

But if we are to be true to the legend of St. Nicholas, Christmas should include food for every hungry person in the world. For we know the real St. Nicholas, who lived the Gospel, would not allow someone to starve. 

 

As this Christmas arrives, though, many people are starving, including children. There are massive hunger emergencies, as in Southern Africa, where the UN warns of “more than 11 million people now experiencing ‘crisis’ or ‘emergency’ levels of food insecurity (IPC Phases 3 and 4)” in nine countries: Angola, Zimbabwe, Mozambique, Zambia, Madagascar, Malawi, Namibia, Eswatini and Lesotho.

 

In the Horn of Africa, likewise, drought has caused food shortages in Somalia, Ethiopia and Kenya. The impact of climate change is very real for the hungry in these nations.

 

South Sudan, Afghanistan and Syria are all suffering hunger because of conflict. Likewise, the Sahel region of Africa is experiencing severe hunger from conflict and drought. In Yemen, a civil war has left over 20 million people in need of food aid. Haiti and Central America also face hunger because of drought.

 

Relief agencies like the World Food Program, Save the Children, UNICEF, Catholic Relief Services and others don't have enough funds to keep up with the hunger crisis. We can help them feed the hungry. 

 

The non-profit Edesia, which makes life-saving Plumpy’nut to treat malnourished children, sent some overseas last year along with stuffed animals, so that the starving children would have food to eat and a friend, and never be alone. That is a gift in keeping with the true Christmas spirit.

 

To follow St. Nick's example is to feed the hungry. Donating to hunger relief in honor of someone would be a magical Christmas gift that you would never forget. 

 

Turns out, The Rockettes’ Annual Christmas Spectacular Hasn’t Always Been So Spectacular

 

Since 1932, the Rockettes have been gracing Radio City Music Hall with their complex dance numbers and signature high kicks. Their moves and costumes have become icons of New York City and the Christmas season, garnering worldwide recognition. Originally “The Roxyettes,” the group has performed in over seventy Christmas Spectaculars and sixty Macy’s Thanksgiving Day Parades. Yet, beneath all the sequins and the bows, The Radio City Rockettes have a history that is far from shiny.

 

The Rockettes’ first performance took place in St. Louis, Missouri, in 1932. All 16 of the original dancers were white. The ensemble remained all white until 1987, when the first woman of color joined the chorus line. One of the group’s earliest managers explained that the reason for the long-standing lack of diversity in the lineup was that varying skin tones would make the dancers look less uniform. “One or two black girls would definitely distract,” he said, from what is meant to be a seamless and spotless performance.

 

Of course, the Rockettes still imitated ethnic minority cultures while refusing to hire people from those cultures. Starting as early as the 1950s, the women performed dressed as stereotypes of Japanese geishas and “hula girls.” Yet traditional hula attire is not the faux grass skirts and party leis that the Rockettes wore on stage in 1958. Instead, the Rockettes donned the Americanized version of hula attire and style that became popular in the United States during the 1910s as Western influence in Polynesia grew.

 

The decision to cast all white dancers was criticized by civil rights leaders and activists in the 1960s and 70s. Furthermore, the Rockettes’ policy violated the Supreme Court’s 1976 ruling in McDonald v. Santa Fe Trail Transportation Co., which held that it is illegal to hire or fire an individual based on race. Since the Rockettes failed to comply with such civil rights law, the controversy surrounding the dancers grew. Despite increasing public pressure, the Radio City dancers remained all white into the late 1980s. Even when Jennifer Jones, the “First Black Rockette,” joined in 1987, many felt her inclusion was more of a marketing ploy than true integration.

 

Today, the Rockettes still represent a largely monolithic ideal of feminine beauty. Although the exact measurements have changed over time, as of 2019 Rockette dancers must be at least 5’6” and no taller than roughly 5’11”. The women are actually measured at auditions to ensure height requirements are met. Additionally, male Rockettes do not exist, at least not yet.

 

While the Rockettes’ current racial diversity hopefully indicates that women of all colors are now valued more than they were at the group’s beginnings, continued stringent regulations on body type and appearance show there is room to grow. As we celebrate tradition this holiday season, it is important to keep in mind the deeper messages about what, and who, we value as a society.

Is Donald Trump a Conservative?

 

In December of 2015, Senator Lindsey Graham asked rhetorically, “You know how to make America great again? Tell Donald Trump to go to hell.” Graham went on to say that Trump “doesn’t represent my party. He doesn’t represent the values that men and women who wear the uniform are fighting for.” Although Graham was among the most outspoken critics of then-candidate Trump, he was not alone in this criticism. Ted Cruz and other prominent Republicans warned against a Trump presidency and feared that his beliefs contradicted those of their party.

 

As Republicans are now nearly united in defending Trump from impeachment, it is hard to remember that just a few years ago Republicans routinely criticized the President. But it’s worth examining why many Republican leaders initially resisted Trump’s primary bid. Trump’s policy interests and approach to politics deviate from the modern conservative movement. 

 

It is impossible to separate conservatism from the Republican Party because conservatism has traditionally served as the philosophical inspiration of the party’s members. For intellectuals, pundits, and mainstream voters alike, conservatism has long been a concrete metric by which to judge a Republican candidate.

 

That is not to call conservatism a one-size-fits-all philosophy. As historian George Nash explains, conservatism is “a coalition with many points of origin and diverse tendencies.” In other words, conservatism is a broad movement that encompasses a wide range of intellectual beliefs that are related but not identical. Nevertheless, Republicans are generally expected to agree with a few aspects of the conservative coalition: free trade, small government, and individual freedom. So, does Trump – the often touted unique, unprecedented President – fit the mold of modern conservatism?

 

In large part due to George Nash’s book The Conservative Intellectual Movement in America Since 1945, historians are able to concisely define modern conservatism. Nash identified three coalitions that formed the conservative base after the Second World War. The first was made up of classical liberals – outraged at the New Deal – who feared that individual liberty was disappearing. The second coalition was comprised of traditionalists who were angered by the rise of totalitarianism and advocated an awakening of religious orthodoxy. This sect, according to Nash, consisted of militant evangelists who hoped to preserve the integrity of America and the Western world through a renaissance of traditional values. The final group was made up of former communists who had come to fear the power of the Communist Party. These so-called “ex-radicals” fought fervently against the spread of communism. Together, these three separate coalitions formed the modern conservative movement that influenced American politics throughout the Cold War and into the twenty-first century.

 

Of course, there were always natural tensions and divisions within the coalition. But the divide began to worsen as the Cold War ended and the threat of communism, a decisive factor that had kept the coalition together, faded. In addition, the spread of globalism and mass migration has had a profound impact on all political philosophy of the 21st century. According to Nash, these two developments led to the rise of populism internationally, as new conflicts filled the void left by the conclusion of the Cold War and globalism produced a counter-movement of nationalism. Populism, of course, can and has existed on both sides of the political spectrum. As a result, it is both unfair and incorrect to cast populism alone as the influence that changed conservatism.

 

So, does Trump fit the mold of conservatism that historians have defined? Does his brand of populism even fit the classical ideology of conservative populist thought?

 

Not entirely. While it can be argued that Trump stands for the three general beliefs of conservatism – individual freedom, the suppression of authoritarians, and a return to a “traditional” lifestyle – that is also the jargon of all political leaders, whatever the consequences or intentions of their actions. Moreover, conservatism is more than those buzzwords; it is a complex ideology that includes theory and policy. It is therefore unfair to classify Trump as a conservative on the basis of these three general beliefs alone. Most importantly, Trump differs greatly from conservatism in two main ways: his rejection of globalism and his incessant attacks on all intellectual elites.

 

Globalism and free trade have been a basis for conservatism since the philosophical movement gained popularity. Free trade has historically been so important that movements, organizations, and publications assembled to tackle its economic issues. Most famously, the traditionally conservative Economist was founded in 1843 to organize against the English Corn Laws, tariffs that harmed the English economy.

 

In the more recent past, conservatism has promoted internationalism to preserve and promote western institutions. The internationalist movement is well documented and is rooted in the philosophy that America must promote freedom by working with other nations regardless of the inherent cost.

 

In addition, recent conservatism has promoted free trade globally. Conservative presidents Ronald Reagan and George H.W. Bush each supported regional trade agreements with Mexico and Canada, and Bush 41 started the NAFTA negotiations. Free trade and globalism were a standard platform of modern conservatives.

 

But President Trump supports none of these ideals. Rather, he advocates tariffs not unlike the Corn Laws the Economist lobbied against. He argues that NAFTA is harmful to the United States and that we must rethink our trading relationships with our neighbors. And he withdrew the United States from a similar trade agreement, the Trans-Pacific Partnership, because of his opposition to free trade.

 

Even beyond his economic policies, President Trump is by no means an internationalist. Rather, he is most identifiable as a nativist – a fervent nationalist who supports only American interests. While traditional conservatives were concerned with the wellbeing of the Western world, the current president is concerned only with what happens within his own borders.

 

And finally, rather than coalesce against dictatorships – a stance for which Ronald Reagan was so widely praised – he is friendly and congenial with both Vladimir Putin and Kim Jong-un. Their governments plainly do not promote freedom, democracy, or individual liberty. But President Trump – unlike modern conservatives – does not care.

 

Donald Trump hardly fits the definition of a conservative when one analyzes his policy beliefs. Perhaps most striking, though, is that he doesn’t seem to care. In fact, he actively tries not to be a classical conservative. Modern conservatives banded together as a coalition of intellectuals who agreed on a philosophical basis. President Trump is not interested in doing so. Rather, he claims his movement “is about replacing a failed and corrupt political establishment” because it “has brought about the destruction of our factories and our jobs.” President Trump clearly has a political vendetta against any intellectual establishment.

 

Importantly, he claimed that nobody understands the American system better than he does, and that “I alone can fix it.” Trump’s claim of individual power is important to note because he establishes himself as a singular agent of change rather than the leader of a larger philosophical coalition. Conservatism has long been understood as an alliance of individuals who share a common interest. But Donald Trump wants to be an individual – separated from the intellectual movement – and seen as the singular source of knowledge and change.

 

Anti-elitism is not new in politics, and it exists across the aisle. Throughout American history, populists have attacked and blamed elites for their respective political qualms. According to Nash, though, they have usually targeted a select, identifiable group: a Republican populist would blame the liberal elite, and vice versa. But President Trump blames every elite, conservative and liberal alike. This again separates him from the traditional conservative movement.

 

Further, modern conservatives have been blatantly elitist. Mitt Romney, the previous Republican candidate for president, told wealthy donors that he would never win over the 47% of the country who pay no income taxes, a remark that perfectly highlights the blatant elitism with which conservatism has often aligned. Newt Gingrich, former Speaker of the House, ranted against Steve Bannon – a Trump ally and former Chief Strategist – for his attack on the GOP establishment, calling the idea of starting a political war against the Republican Party an “absurdity.” Two former symbols of conservatism have thus been blatantly elitist and highly critical of Trump’s anti-elitism. By breaking with the GOP’s established leaders, Trump places himself outside the conservative spectrum.

 

Perhaps what is most interesting is not Trump’s anti-elitism but the irony in his rhetoric. Trump claims to be anti-establishment yet is himself steeped in elitism. He often promotes himself as successful and knowledgeable because of his wealth and his luxurious lifestyle. Populists most often try to cast themselves as distinct from elites – Bernie Sanders, for example, portrays himself as outside the wealthy elite. Whether or not that is true is beside the point: the irony is that Trump casts himself as an elite while simultaneously branding elites as evil.

 

Trumpism promotes taking down the infrastructure of intellectual establishments, and these groups are set on opposing him as well, regardless of political affiliation. For example, National Review, Commentary, and The Weekly Standard have each opposed President Trump in obvious and outspoken ways. The Weekly Standard went so far as to issue a plea for an entirely new brand of conservatism. National Review, founded by the conservative intellectual William F. Buckley, even published an article titled “Against Trump.”

 

It is clear that the conservative establishment does not view the president as one of its own. These three main publications have regularly criticized the president; rather than endorsing a Republican president’s actions, traditional publications actively resist him. Conservative politicians once longed for endorsements from such intellectually respected publications, known for promoting their shared philosophical beliefs. No longer.

 

The mutual alienation between President Trump and classical conservative ideology shows that he does not support the movement, and the movement does not support him. This, combined with policies that outright defy what modern conservatism stands for, places him outside any reasonable understanding of modern conservatism.

 

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173911 https://historynewsnetwork.org/article/173911 0
Why We Should Make Go-Go DC’s Official Music

The Metro PCS store that launched the "Don't Mute DC" campaign

 

On November 17, 2019, District of Columbia Mayor Muriel Bowser declared November 17-23, 2019 “Go-Go Awareness Week.” The week-long celebration was intended to commemorate D.C.’s indigenous music. Many Washingtonians thought the designation was important because it helps preserve D.C.’s Go-Go music and culture. The week featured many events, including the Don’t Mute DC Call to Action Conference, the Go-Go Music Awards, and a series of workshops.

 

The sound of Go-Go originated with the guitarist and singer Chuck Brown (1936-2012). Brown introduced the D.C.-based sound during the early 1970s, when the District was known as “Chocolate City,” by blending funk, rhythm and blues, and old school hip-hop with percussion instruments. Brown found a way to bring together Latin rhythms, call and response, and a continuous drum beat to create an authentic sound called Go-Go. The genre takes its name from the slang term for a 1960s nightclub, first heard in the Motown song “Going to a Go-Go” by Smokey Robinson. Brown’s unique sound propelled his rise to fame as he performed with his band, Chuck Brown & the Soul Searchers.

 

In 1978, Brown released the song “Bustin’ Loose,” which peaked at number 34 on the Billboard Hot 100 singles chart. The song was a hit, and Brown came to be known as “The Godfather of Go-Go.” Brown explained that he came up with the new style because the disco DJs of the time were selling out more shows and he realized he was losing his audience. He saw the necessity of making his music more interactive so everyone could have a great time. This led Brown to cut out the pauses and breaks in his music so a set would never stop. Go-Go music’s beat never stops, and the songs transition into one another without ever pausing.

 

The scene began to flourish in the late ’70s when he released “Go Go Swing Medley,” which sparked an international following for Go-Go music. Brown’s Go-Go renditions of classic songs by Duke Ellington and Johnny Mercer carried the Go-Go movement to an even bigger scale, earning him a place in American music royalty. He received numerous accolades, including the National Endowment for the Arts Lifetime Heritage Fellowship Award (2005). In 2009, the D.C. Council named a street after the Godfather: Chuck Brown Way. Brown remains one of the most influential musicians the D.C. area has produced.

 

These days, Go-Go music can still be heard around the DMV, especially near U Street and Shaw-Howard University. This past April, a dispute broke out in the neighborhood when a resident of a luxury apartment complex complained that the Go-Go music played at the local Metro PCS store at Seventh St. and Florida Avenue was too loud. The store was forced to turn off its Go-Go music after the resident threatened to sue the company. The store’s owner, Donald Campbell, said many of his regular customers and nearby neighbors thought the store had “close[d]” because there wasn’t any Go-Go music playing. This sparked a grassroots movement called Don’t Mute DC. Community organizers protested, and 80,329 people signed a petition in support of the music played at the store. Eventually, the protestors won and the music played again. Go-Go is no longer just a genre of music; it is a movement to stop gentrification in D.C. According to Campbell, Don’t Mute D.C. means “don’t mute or erase black people in D.C.,” which is to say, “don’t let gentrification have the final say.”

 

Gentrification evokes strong emotions in D.C. as wealthier and whiter residents move in and long-term black residents feel displaced. To many Washingtonians like myself, Go-Go music, like mambo sauce, represents the core of Chocolate City and D.C. culture. Go-Go is more than a genre; it is an expression of art and culture. D.C. natives who have a strong connection to Go-Go want the music to stay alive because it plays a crucial role in storytelling.

 

Because of the Don’t Mute DC movement, Ward 5 Councilman Kenyan McDuffie introduced the Go-Go Official Music of the District of Columbia Designation Act of 2019 on June 4, 2019, to make Go-Go the official music of the District of Columbia. When introducing the bill, McDuffie said, “To me and so many other Washingtonians, go-go has become so much more than just a musical genre. It is the very fabric of the city’s cultural and artistic expression. Designating Go-Go the official music of the city signals to those who have been here, and to those who continue to move here, that this music represents the lived experiences of native Washingtonians. It codifies into law that Go-Go will never be muted in the District of Columbia.” Antonio Ramirez, a historian from the University of Michigan, said Go-Go can be understood as a “local manifestation of global black music whose roots can be traced to the historical African American experience.”

 

Preserving African-American culture is crucial to the future. The new bill is a major step towards protecting the history of go-go for future generations of black D.C. residents. As the city continues to transform its identity, it’s important for D.C. to remember its history.

 

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173841 https://historynewsnetwork.org/article/173841 0
We are living in the most practical period of music listening Earlier this month, Rolling Stone reported that Spotify made a $60.4 million profit in the third quarter of this year. This is only the third profitable quarter in the company’s 11 years of existence.

 

The music-streaming service attributes much of this quarter’s success to premium subscriptions. Premium users pay $9.99 per month for what amounts to ownership of every song on the service: they can play any song on demand and download any song to play offline. Are we currently living in the most economical and practical period of music listening yet?

 

In 1877, Thomas Edison invented the phonograph to record human sound on metal cylinders. Each cylinder recording lasted only one play, and the sound was very low-quality compared to today's digital recordings. Edison soon updated the phonograph to play music recorded on wax cylinders, and his company began to market popular music.

 

Next came the gramophone, invented by Emile Berliner in 1887. This device was the first to use flat discs to play music. Since electricity wasn't widely available, users wound up the device to hear the music. The United States Gramophone Company formed in 1894 to market the product, an event that has been called the “true beginning” of the record industry. The invention of discs made it possible to mass-produce music in a way that the structure of wax cylinders would not allow.

 

RCA Victor Records released the first full-length vinyl record in 1930, and this medium remained the most popular until around 1975, when compact cassette tapes emerged. Eight-track cartridges were also common in the United States between the ’60s and ’80s. Compact cassettes were smaller than vinyl records, lending themselves to portability in car stereos. The Sony Walkman cassette player debuted in 1979, transforming music listening into a portable activity you could take anywhere. In total, 200 million Walkmans were sold, according to Time. By 1983, compact cassettes outsold vinyl records. Time’s Meaghan Haire wrote, “Walkman's unprecedented combination of portability (it ran on two AA batteries) and privacy (it featured a headphone jack but no external speaker) made it the ideal product for thousands of consumers looking for a compact portable stereo that they could take with them anywhere.” The Walkman revolutionized music listening, paving the way for products like the iPod.

 

Compact discs (CDs) began to be sold commercially in Japan in 1982 and came to the US in 1983. The first CD players, including the Sony CDP-101, cost as much as $1,000. CDs outsold vinyl records by 1988 and cassettes by 1991. In a recent Fortune article, Andrew Nusca wrote that the music industry raked in a record $25.2 billion in global revenue based on sales of physical music like CDs, cassette tapes, and vinyl records.

 

Then the industry took a hit. Napster, a file-sharing service, was started in 1999 and took the world by storm. Fortune’s Richard Nieva wrote in 2013 that “Napster was rebellious of convention, threatening to established norms, and, well, really loud.” The software allowed users to send files to other computers freely over a shared network. Nieva wrote that Napster proved to the world that the web had the “power to create and obliterate value.” The music industry lost value as people shared music for free with friends and family, eliminating purchase costs. Napster was ultimately brought down by an intellectual property case filed by the Recording Industry Association of America (RIAA) and shut down its service in 2001.

 

The iPod came to the forefront of music listening almost instantly in 2001, and in 2003 Apple launched the iTunes Store, a music-purchasing platform where songs are sold individually. According to the International Federation of the Phonographic Industry, the music industry’s revenues had dropped by about $4 billion by this time. Music fans could buy songs for 99 cents, bringing in almost $70 million in the store’s first year, according to PBS. The price later rose to $1.29 for more popular songs.

 

Spotify entered the scene internationally in 2006, then finally negotiated its way into the U.S. in 2011. While major labels were reluctant to hand over their music, many felt the decision would ultimately benefit them in the long run. It was inescapable that the industry was changing: streaming had become commonplace thanks to personalized radio apps like Pandora and iHeartRadio, and there would be no returning to the days of exclusively vinyl or cassette tapes. The industry had to adapt. Errol Kolosine, the former head of the record label Astralwerks and a professor at New York University’s Clive Davis Institute of Recorded Music, told Fortune, “The major labels kind of sat on their hands at the advent of streaming. The music industry went through a period of relative uncertainty, driven by not understanding the permanence of streaming. That reality has set in now.”

 

Music listeners must understand the industry’s past to appreciate the convenience and affordability of today’s streaming giants. Spotify (as well as Apple Music, Amazon Music, Tidal, and more) gives subscribers access to virtually every song ever recorded at their fingertips. And for what? $9.99 a month is not a high fee compared to a $1,000 CD player or a $200 Walkman, and that is before factoring in the cost of the actual music. The low personal cost for each listener is unprecedented in the industry; it improves the music experience for everyday Americans and has proven itself an effective business model.
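To make the affordability comparison concrete, here is a minimal back-of-the-envelope sketch in Python. The $1.29 per-song and $9.99 per-month prices are the figures cited above; the 40-songs-per-month listening habit is purely an illustrative assumption, not a number from any source.

# Rough yearly cost: buying tracks a la carte vs. a streaming subscription.
# Prices come from the figures above; songs_per_month is a hypothetical habit.
ITUNES_PRICE = 1.29       # dollars per song (cited above)
SPOTIFY_FEE = 9.99        # dollars per month (cited above)
songs_per_month = 40      # illustrative assumption, not sourced

buying_per_year = ITUNES_PRICE * songs_per_month * 12   # 619.20
streaming_per_year = SPOTIFY_FEE * 12                   # 119.88

print(f"Buying:    ${buying_per_year:,.2f} per year")
print(f"Streaming: ${streaming_per_year:,.2f} per year")

Even a light listener who buys only ten new songs a month would spend $154.80 a year, still more than the $119.88 a premium subscription costs.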

 

Why, then, have some artists been so resistant to allowing their music to be streamed? Taylor Swift and Beyoncé, two of the biggest pop stars in the world, refused to have their music on Spotify for quite some time before eventually giving in. Swift lasted almost three years without her full catalog on Spotify. Kaitlyn Tiffany wrote about Swift's unique position as a performer and songwriter in The Verge: "She’s uniquely positioned to speak authoritatively on this issue as a public-facing brand and a behind-the-scenes creative." Swift values her creative abilities, referring to music as "important" and "rare" in her 2014 Wall Street Journal op-ed. Swift is right that music is valuable, but she could not see a future for music beyond buying CDs in a store, which makes her argument fall flat. Eventually she seemed to recognize the mistake: she returned her full catalog to Spotify in June 2017.

 

Spotify’s success this past quarter makes me optimistic about the company’s future, and record labels should be enthusiastic too. After all, artists and labels need Spotify just as much as Spotify needs their music.

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173616 https://historynewsnetwork.org/article/173616 0
Subsidies Can't Solve Farmers' Trade War Losses Last month, Chinese Vice Premier Liu He, U.S. Treasury Secretary Steven Mnuchin, and U.S. Trade Representative Robert Lighthizer discussed a phase-one trade deal that could potentially limit the ongoing US-China trade war. Many farmers and manufacturers hope such a deal could provide financial relief. History can help U.S. policymakers understand that subsidies are not a reliable form of long-term assistance for American farmers and the agricultural sector; often, government subsidies end up harming American farmers in the long run.

 

While President Trump has boasted that the United States is winning the trade war, it has cost American farmers millions in agricultural sales. The Trump Administration has approved $16 billion in farm subsidies since June to compensate for lost income. These subsidies, however, have not been enough to offset the losses. For example, Iowa farmers received $973 million in direct payments from the first portion of Trump’s relief subsidy, while Iowa State University estimated that the trade war cost farmers across the state $1.7 billion. This is not the first time the U.S. government has failed to provide adequate relief to American agriculture.
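A quick arithmetic sketch in Python, using only the two Iowa figures just cited, shows how far the payments fell short; the rounding is mine.

# How much of the estimated Iowa trade-war losses did the direct payments cover?
payments = 973_000_000        # direct payments to Iowa farmers (figure above)
losses = 1_700_000_000        # Iowa State University loss estimate (figure above)

shortfall = losses - payments
coverage = payments / losses

print(f"Uncovered losses: ${shortfall / 1e6:,.0f} million")   # about $727 million
print(f"Payments covered roughly {coverage:.0%} of losses")   # about 57%

On these numbers, roughly $727 million in Iowa losses went uncompensated.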

 

In 1933, newly inaugurated President Franklin Delano Roosevelt called a special session of Congress to introduce 15 significant pieces of legislation. One of these bills was Roosevelt’s Agricultural Adjustment Act (A.A.A.), which declared that Congress would “balance supply and demand for farm commodities so that prices would support a decent purchasing power for farmers,” otherwise referred to as “parity.” FDR gave an address in support of the Agricultural Adjustment Act in which he stated: “I want to pledge to you not only our wholehearted cooperation as you go forward, but our continued deep interest in a problem that is not just a farmer's problem because, as I have said before, your prosperity is felt in every city home, in every bank and in every industry in the land.”

 

The Agricultural Adjustment Act came to Congress as one part of Roosevelt's packaged New Deal initiatives. Jim Powell, a senior fellow at the Cato Institute, has discussed the struggle to finance all of these programs: the government had no choice but to more than triple federal taxes, from $1.6 billion in 1933 to $5.3 billion in 1940. Excise taxes, personal income taxes, inheritance taxes, corporate income taxes, and holding company taxes all increased. Everyday products like margarine and fruit juice faced high excise taxes in order to raise revenue for the New Deal. These taxes turned out to be damaging to jobs in the 1930s, as unemployment was prolonged and held steady at 17%.

 

According to American historian Gilbert C. Fite, past government attempts at relief for the agricultural sector merely left western and southern farmers dependent on federal assistance. Further, the A.A.A. primarily benefitted large producers. One farmer commented that the A.A.A. “will be of no benefit to the small landowner. Only the big landowner will benefit from it, as he can leave his percentage of land idle and farm the balance at a profit, besides drawing a large subsidy from the government as rents to buy more land to leave more land in idleness and to draw bigger subsidies.”

 

Similarly, Trump has led farmers to believe he is helping their situation. During a June visit to Iowa, Trump told guests, "Within a year and a half, I would say, you'll be in the best position you've been in 15 years as farmers. And you deserve it. Under my administration, we will always protect and defend our great American patriot farmer. Always."

 

Yet President Trump has increased U.S. farm debt – debt expected to reach almost $427 billion in 2019, the highest level since the 1980s. The outlook for American farmers and the heartland is grim, and many farms have had to close up shop. Between September 2018 and September 2019, every region of the United States saw farm bankruptcies rise, totaling 580 Chapter 12 filings – a 24% increase over the previous year.

 

In 1933, as now in 2019, the U.S. government has used subsidies as a mechanism to fix the woes of American farmers. This strategy has provided only temporary relief and has proved detrimental to the agricultural sector in the long run. Economists Justin Spittler, Robert Ross, and Walter Block have explained that U.S. agricultural subsidies have historically led to severe resource misallocation. High prices for food, clothing, and other normal goods only exacerbated the dark years of the Great Depression.

 

Very few would ever compare the actions and accomplishments of President Franklin Delano Roosevelt to those of President Donald Trump. However, both relied on farm subsidies to ease agricultural economic woes. While the 1933 Agricultural Adjustment Act is very different from 2019's trade war with China, the unfortunate economic situation of farmers in both cases shows a parallel failure by the federal government to help America’s heartland.

 

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173670 https://historynewsnetwork.org/article/173670 0
Rodney Reed, A Story That is All Too Familiar Rodney Reed, 51, was scheduled to be executed by the state of Texas on November 20th. On November 15th, the Texas Court of Criminal Appeals indefinitely stayed Reed's execution to review his case.

 

Many doubt Reed’s guilt, and the case has drawn attention from the media and celebrities. Rodney Reed was convicted of the 1996 rape and murder of 19-year-old Stacey Stites. Initially, Reed claimed he did not know Stites but eventually acknowledged that he and Stites were having an affair. For over two decades, Reed has adamantly maintained his innocence.

 

New evidence suggests that Stites’ fiancé, a police officer named Jimmy Fennell, actually committed the crime. Fennell was released from prison in 2018 after pleading guilty to kidnapping and raping a woman while he was on duty in 2008. In a sworn affidavit, Arthur J. Snow Jr., who served time with Fennell in prison, said Fennell bragged about killing his fiancée, stating, “I had to kill my n*****-loving fiancée.”

 

The rough outline of this case recalls America’s long history of violence toward black men. The criminalization of black men as sexually deviant and dangerous to white women was first sensationalized after Reconstruction, in the 1880s. Ida B. Wells, an African American journalist and activist, investigated cases in which black men accused of raping white women were lynched by mobs. Reflecting on the lynching of black men in her neighborhood of Memphis, Wells began to question the rape allegations. In fact, she found that in some cases black men accused of raping a white woman had actually been having consensual sex. At other times, she found accusations of rape to be completely false, used as a tactic to subjugate black men economically.

 

In Mia Bay’s To Tell the Truth Freely: The Life of Ida B. Wells, Bay notes that the criminalization of African American males was based on ideas about male sexuality and the evolution of the races. Bay also notes that Wells concluded that “white southerners routinely invoked the sanctity of white womanhood as a justification for violence against blacks.” This teaches us that, in the post-Reconstruction era, white men were particularly threatened by the presence of free black men and created narratives to tear them down. Combined with the ideology surrounding the myth of the black rapist, the criminalization of black men via legal and extralegal tactics became commonplace. This ideology set in motion a vicious and racist machine that has affected our judicial system throughout history.

 

In 1906, in Chattanooga, Tennessee, a black man named Ed Johnson was wrongfully convicted and sentenced to death for the rape of a white woman.

 

In 1931, in Paint Rock, Alabama, eight young black men were wrongfully convicted of raping two white women and sentenced to death. These men are better known as the Scottsboro Boys.

 

In 1977, in Monroe, Alabama, an African American man named Brian Baldwin was wrongfully convicted of murdering and raping a 16-year-old white girl and was later executed. He was coerced into confessing, and there was evidence that he could not have committed the crime: he was right-handed, and the murderer was left-handed. All evidence was lost or destroyed after Baldwin's execution.

 

In 1978, four black men were wrongfully convicted of the rape and murder of Carol Schmal, a white woman, and the murder of Lawrence Lionberg. This case is commonly referred to as the Ford Heights Four. Verneal Jimerson and Dennis Williams were sentenced to death, while Kenneth Adams and Willie Rainge were imprisoned. Eighteen years later, all four men were exonerated. The men had been convicted through false forensic testimony, coercion of witnesses, and police misconduct; three investigative journalists disproved the entire case.

 

The list of wrongful convictions of black men for raping or murdering white women is endless. The list of black men denied their right to due process is even longer.

 

These are cases in which there was either a lack of evidence or outright exonerating evidence. They reveal a dangerous ideology that criminalizes black men as sexually deviant while consecrating white womanhood. Rodney Reed fits this pattern of injustice because he was sentenced to death while potentially exonerating evidence was ignored.

 

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173545 https://historynewsnetwork.org/article/173545 0
Reform or Revolution: England’s Lessons of 1688 for Today and Tonight's Democratic Debate

 

Tonight’s debate will once again expose a tension between moderates and leftists in the Democratic Party. At the core of this tension is a debate over how to bring about political change: through reform, or through revolution?  

 

This debate goes back centuries, especially in England and France. In England, the Glorious Revolution of 1688 achieved a moderate reform of the English political system. Conversely, the French Revolution that began in 1789 seemed to usher in a new era of hard-edged revolutionary politics, with worldwide reverberations. In both cases, people struggled for genuine change in the face of undoubted injustice. Yet England managed to implement key reforms while still maintaining historic institutions, while the French struggled for equilibrium, enduring a succession of failed governments and horrific violence. The lesson: institutions can prove surprisingly flexible, which bodes well for reform impulses.

 

The English Glorious Revolution of 1688 occurred when the English overthrew the absolutist, tyrannical monarchy of James II (r. 1685-88), the last male Stuart ruler. Unwilling to subject themselves to James’ flagrant violations of core English liberties, parliamentary liberals invited James’ daughter Mary to rule England with her husband, William, from the Dutch House of Orange. Living in the Netherlands, William and Mary crossed the Channel, meeting surprisingly little resistance from James. Because it avoided violence and extremism, the English called this bloodless takeover of the throne “glorious.”

 

With William and Mary established as co-rulers, Parliament presented the monarchs with a Bill of Rights, which they promptly signed. Going forward, the English Bill of Rights provided a prototype for what would later become a hallmark of Enlightenment-era liberal idealism: a written constitution guaranteeing liberties. Classic freedoms instituted in the Bill of Rights included the freedom to debate and the right of subjects to petition the king. Overnight, the English Bill of Rights turned England from an absolutist state into a constitutional monarchy, one of history’s most successful parliamentary systems.

 

The Glorious Revolution deeply influenced subsequent politics, and not just in England. In its aftermath, the political philosopher John Locke re-articulated the Bill of Rights’ ideas about fundamental liberties existing beyond the reach of the state. American documents like the Federalist Papers and the Declaration of Independence often echoed Locke’s language about inalienable rights and the role of the state in protecting them. The Eighth Amendment’s ban on “cruel and unusual punishment” quotes the English Bill of Rights virtually verbatim.

 

Still today, the American structure of government reflects ideas from the Glorious Revolution, and however imperfectly realized, these ideals continue to shape American values. By law, Congress has the power of the purse. America does not use standing armies to police the citizenry. No ruler is above the law. People have a right to speak their minds. Such principles are at the heart of current American debates, ranging from the impeachment of President Trump to the Black Lives Matter movement.

 

While the English enumerated all these core political ideals and avoided social and cultural tumult, others were not so lucky. America achieved its freedom in relatively moderate fashion, but only after a military war of independence. The French Revolution of 1789, in contrast, provided a model for change that was far more violent.

 

During the French Revolution, the push for change pitted reformers against revolutionaries. Sitting on the proverbial “left” of the assembly hall, the Parisian faction usually pushed for radical change at a faster pace. Their goals were often laudable, including liberty, equality, and fraternity, the Revolution’s slogan. But as events veered out of control, the French revolutionaries grew increasingly impatient. Among their most famous efforts were taking France off the Gregorian calendar and starting a new religion, the Cult of the Supreme Being. They removed statues, and even entire buildings, that represented the values and ideals of the old days.

 

In contrast to the revolutionary Parisian faction, moderate French reformers pushed for more measured change. Initially, they wanted to keep the monarchy and a viable Roman Catholic Church, both of which had been central to French history and identity. By 1792, however, the French had abolished the monarchy and disestablished the Church, breaking with the Vatican. A few years earlier, few would have imagined such tumultuous change possible, let alone advocated it. Once events began to escalate, however, they proved hard to stop, pushed by a potent combination of ideological extremism and social unrest. A climate of emergency and scarcity gripped the nation, and at times France seemed to have lost control of itself.

 

The ranks of both the English reformers and the French revolutionaries included plenty of good people who sincerely wanted the best for their respective countries; they often simply disagreed on how to get there. In France, the revolutionaries sought to remake the nation in fundamental ways. But they still needed to preserve order in the streets and bread in the markets – the basic functions of any decent government. Meanwhile, an irresistible need for stability helped popularize the military authoritarianism of Napoleon, who took power in late 1799.

 

In contrast to France, the English experience of 1688 affirms that a society can achieve reforms without descending into chaos and violence. It reminds us that we can strive together for improvement without attacking the country’s history and people. Reform does not require holding earlier versions of ourselves in contempt; quite the opposite.

 

As we go forward into the holidays, and then a contentious election, the English example of 1688 continues to provide an important lesson: we can cherish our history and institutions while still being clear-eyed about necessary changes. Today, England is one of America’s oldest and closest allies. Hopefully, this English tradition of reform offers a vision that all Americans can share: Democrats, Republicans, and independents alike.

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173889 https://historynewsnetwork.org/article/173889 0
2020 Will Be More Turbulent Than 2019, Unless…

 

Unless some drastic measures are taken, the various conflicts in the Middle East will become ever more intractable and exact a horrifying toll in blood and massive economic dislocation. The continuing severity of these crises and their repercussions will depend on whether or not the combatants assume a realistic posture, or new leaders rise and commit to finding equitable solutions that can endure. We must keep in mind, though, that the turmoil we experienced in 2019 may further intensify in 2020 because of the continuing global crisis of leadership and the challenges posed to the global order established in the wake of World War II. The following brief review of seven Mideast conflicts reflects these developments and raises the question of what must be done to change the dynamics in the hope of solving some of these conflicts.

Betrayal: The Israeli-Palestinian conflict is the oldest and most intractable of them all; it has consumed both peoples for more than 70 years and has been further impaired by their leaders’ refusal to recognize each other’s right to the same land. The leaders betrayed their people by failing to appreciate each other’s psychological, religious, and historic attachment to the land and by being blind to the inter-dispersement of the populations, which makes coexistence inevitable. Israelis and Palestinians must now choose between endless violence and living in amity and peace. Given the crisis in leadership, the hour calls for new visionary and courageous leaders who recognize that their people’s future security and prosperity still rest on the only viable option—the two-state solution.

Grandiose delusion: Following the revolution in 1979, Iran sought to become the region’s hegemon, equipped with nuclear weapons. The turmoil sweeping the Middle East points to Iran’s complicity in most of the conflicts destabilizing the region, including Lebanon, Yemen, Iraq, and Syria, while it enlists, finances, and trains jihadist and terrorist groups and threatens Israel’s existence. Although the US’ withdrawal from the Iran deal was a mistake, Iran’s defiance led to crippling US sanctions. Seeking regime change and destroying Iran’s nuclear facilities is not the answer. The resumption of US-Iran talks offers the only way out, provided Iran plays a constructive regional role and abandons its grandiose delusion of becoming a nuclear power and the region’s hegemon.

Yearning for identity: Since Iraq was established in 1932, it has gone through frequent political turbulence, overshadowing its glorious history. After the revolution in 1958, the Ba’ath Party, a nationalist and socialist movement, eventually rose to power and was able to finance ambitious projects throughout the 1970s. In 1979, Saddam Hussein, a ruthless autocrat, assumed power and led the country into the disastrous Iran-Iraq and Gulf Wars. The 2003 war killed over 100,000 Iraqis, decimated the country, and invited Iran to exercise immense influence over all Iraqi affairs, while the people suffer profound economic hardship. The current massive demonstrations demanding the ouster of Iran will ultimately prevail and restore Iraq’s unique national identity, for which all Iraqis yearn.

Killing in God’s name: The Yemen war will be recalled as perhaps the most horrific humanitarian disaster in modern history. It is a proxy war pitting the leader of Sunni Muslims—Saudi Arabia, which supports the internationally recognized government and is determined to prevent Iran from establishing a foothold in the Arabian Peninsula—against the leader of Shiite Muslims—Iran, which backs the Houthi rebels. Yemen became the battleground, and the Yemenites are killed in God’s name. Tens of thousands have died, millions are starving, and over a million children are infected with cholera, all while the country lies in ruin. Five years on, the warring parties have finally realized the war is simply unwinnable. Ultimately, both sides must negotiate a solution.

The price of insatiable lust for power: Syria’s civil war, which started in 2011, is hard to fathom. What began as a peaceful demonstration became the most devastating war of the 21st century. Had President Assad responded to his fellow citizens’ demands by providing them with basic human rights, he might have averted a calamitous war that has killed nearly 700,000 people, rendered 11 million people refugees or internally displaced, and leveled half the country to the ground. Now Assad is at the mercy of Russia, Turkey, and Iran, which are determined to maintain a permanent foothold in Syria. Syria may well become the battleground between Israel and Iran, while scores of militia, jihadist, and terrorist groups roam the country with no end in sight.

Erdogan’s self-defeating dictatorship: Soon after Turkey’s President Erdogan came to power in 2002, it was believed that under his stewardship Turkey would become the first functioning Islamic democracy. He embarked on socio-political reforms and extensive economic development, and engaged the Kurds to end a decades-long conflict while improving Turkey’s prospects of integration into the EU. But then he reversed gears. For him, democracy was only a vehicle to promote his Islamic agenda and lead the Sunni Muslim world. He pursued his religious and ideological rivals with vengeance, imprisoning tens of thousands of Gülen followers and Kurds, along with nearly 200 journalists who still languish in jail. He will leave behind the legacy of a ruthless leader possessed by Ottoman revivalism, who squandered Turkey’s prospectively brilliant future for a self-defeating dictatorship.

A preordained defeat: The Afghanistan war, the longest in American history, should have ended one year after it began in 2001. It was clear that the Taliban’s initial defeat was temporary and that they would return to reclaim their inherent right to the millennium-old land of their ancestors. The US’ efforts to establish a democracy, coupled with a mounting build-up of American troops and escalating costs, bore little fruit. The Taliban relentlessly maintained their counter-offensive and, irrespective of their heavy losses, reestablished their central role. Under any negotiated agreement, the Taliban will eventually take over. All the US can do is require the Taliban to fully adhere to human rights, and punish any violations with crippling punitive sanctions.

There are certainly many other countries in the Middle East and North Africa suffering from political instability, daunting economic hardship, violence, uncertainty, and fear. Sadly, the efforts made by the UN, EU, and US to quell or resolve many of these conflicts – be they in Lebanon, Libya, South Sudan, or many other countries – have largely failed.

The year 2020 will most likely be as turbulent as 2019, if not more so, due mainly to the lack of American leadership and the rush of other powers, especially Russia and China, and to a lesser extent Turkey and Iran, to fill the vacuum the US is leaving behind. Beyond that, however, we are witnessing a global transformation in which nationalism, extremism, and xenophobia are on the rise, millions of refugees are on the move, and poverty and economic dislocation are rampant, all of which greatly contribute to instability and violence. Sadly, these developments, coupled with a worldwide crisis of leadership, may well worsen before a new generation of leaders can rise and try in earnest to resolve many of these conflicts humanely, passionately, and equitably, to ensure that the solutions endure.

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173890 https://historynewsnetwork.org/article/173890 0
Roundup Top 10!  

Why we all have the knowledge to decide whether Donald Trump should be impeached

by Karin Wulf

The crucial importance of keeping historical records public and easily accessible.

 

History is clear: Voters reward, rather than punish, political courage

by Jon Meacham and Michael E. Shepherd

Over the past six decades, courageous high-profile votes have tended not to cost the courageous their seats. From Southern Democrats who supported the Civil Rights Act of 1964 to Republicans who backed Medicare and Medicaid in 1965 through the Clinton impeachment in the 1990s, tough votes have been difficult but not necessarily fatal.

 

 

Women Have Always Had Abortions

by Lauren MacIvor Thompson

False histories of abortion dominate contemporary politics, selling Americans on a past that never existed and creating the possibility of a future that has no precedent.

 

 

Sacred Objects: Medieval History and Star Wars

by Stephenie McGucken

For European believers, relics allowed worshipers to encounter some aspect of an object of devotion—a holy person or place—when the object itself was physically unavailable or geographically inaccessible.

 

 

Confederate Christmas ornaments are smaller than statues – but they send the same racist message

by Nicole Maurantonio

Take a good look at those old Christmas ornaments before hanging them on the tree – you may find it's time to retire some family keepsakes.

 

 

The Secret Plan to Force Out Nixon

by Tim Naftali

A newly released diary shows House Republican leaders pledging to oust the president months before he resigned. Why did they back down?

 

 

How War Targets the Young

by Andrea Mazzarino

War on Terror, War on Education

 

 

The (Failed) War on Terror's Precursor

by Danny Sjursen

It Was “Progress” All the Way Then, Too

 

 

The purpose of history in the Age of Trump

by Daniel W. Drezner

The president’s “strongest and most powerful protest” has a familiar ring to it.


 

The apocalyptic worldview hidden in Trump’s letter to Pelosi

by Thomas Lecaque

Throughout history, the letter has been one of the literary genres most closely associated with apocalyptic texts.

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173912 https://historynewsnetwork.org/article/173912 0
Impeachment and the Electoral College

 

In 1956, a young Senator John F. Kennedy made his now famous speech defending the Electoral College. In the manner of Mr. Smith Goes to Washington, he valiantly declared before the U.S. Senate that abrogating the Electoral College would “break down the federal system under which most states entered the union, which provides a system of checks and balances to insure that no area or group shall obtain too much power.”

 

Even more important, the future president recognized that the key provisions of the Constitution represent a “whole solar system of government power. If it is proposed to change the balance of power of one of the elements of the solar system, it is necessary to consider the others.”

 

As Kennedy recognized, provisions such as those relating to impeachment cannot be considered in a vacuum, but must be considered in relation to all the other elements of the solar system of government power. For that reason, application of Article II’s grounds for presidential impeachment — high crimes and misdemeanors — must be considered in the context of all other constitutional provisions relating to government power.

 

The greatest fear of the Framers was the concentration of power in one branch of government that might overpower and abuse the others. It was for this reason that the Framers firmly rejected the unitary parliamentary model of Great Britain, since it gave one branch – the legislative – undue power over the executive by allowing parliament not only to select the executive leader but also to dismiss that leader virtually at will, thus making the executive branch ‘beholden’ to that legislature. (One has only to look at unstable ‘coalition’ governments around the world to appreciate the Framers’ rejection of a unitary parliament.)

 

It was the Framers’ great vision of separation of powers to create two separate parliaments: the first to enact legislation (Congress), and an entirely separate parliament, elected by the people, whose only duty was to elect the executive leader (the Electoral College). Only in such a way could the true goal of separation of powers be achieved.

 

It was this element of the ‘Grand Compromise’ which united in one country the small states, which demanded a Congress based on the principle of “one state, one vote” (the Senate); the large states, which demanded representation based on “one person, one vote” (the House); and those states which demanded an entirely separate parliament (the Electoral College), based on the number of representatives each state was entitled to in both the Senate and the House.

 

It remained only to reconcile those who demanded that Congress be given the same power over the executive as enjoyed by the British Parliament with those who believed in an executive with powers equal – though not superior – to those given to Congress. Though Alexander Hamilton might be grouped with the former, professing his opinion in Federalist 68 that a majority in Congress be given partisan power to remove the executive simply for “maladministration,” Madison is placed in the latter group, which believed that the principle of separation of powers required a much higher, non-partisan standard for removal. In the end, a compromise was reached, giving Congress the power to remove, but only on the much higher standard of actual “high Crimes and Misdemeanors.”

 

Although legal definitions and the vocabulary of the law have changed since 1789, the closest modern legal term for ‘high crime’ is a felony. For example, in most jurisdictions perjury is a felony, but ‘maladministration’ – or even incompetent or over-reaching abuse of power – is not: Truman’s unconstitutional seizure of steel mills without due process was deemed not impeachable, nor was President Roosevelt’s herding of innocent citizens into internment camps without due process of law. Likewise, it is not a criminal “obstruction of justice” or a “high crime” for a defense lawyer simply to assert the lawyer-client privilege on behalf of a client, at least until the judicial branch rules on that assertion.

 

In 1868, the country dodged a bullet when, by a narrow margin, the Senate declined to convict a palpably inept and incompetent president who had been impeached on the flimsy “abuse of power” pretext of firing a cabinet member without authority. Conviction would have set a future precedent for disrupting the Framers’ delicate balance and separation of powers along purely partisan lines.

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173887 https://historynewsnetwork.org/article/173887 0
Why FDR Turned Away Jewish Students

 

A recent New York Times article told the poignant story of an unlikely romance between two prisoners in Auschwitz.  

 

Buried deep within the full-page feature about the relationship between David Wisnia and Helen Spitzer, however, was a heartbreaking fact whose significance most readers may not have recognized. 

 

It turns out that all the years of starvation, beatings, and torture Wisnia endured at the hands of the Nazis could have been avoided, if not for the Roosevelt administration’s harsh policy towards Jewish refugee students seeking admission to the United States.

 

Wisnia was an exceptionally talented singer. “Before the war, he’d written a letter to President Franklin D. Roosevelt requesting a visa so he could study music in America,” the Times noted. “His mother’s two sisters had emigrated to the Bronx in the 1930s, and he’d memorized their address.”

 

American immigration policy during those years was governed by a strict quota system, based on national origins. But the law contained three major exceptions: clergy, professors, and students could be admitted outside the quota restrictions.

 

To qualify for a student visa, an applicant needed to show a letter of acceptance from an accredited American school. It may be that Wisnia had not yet been accepted, and therefore technically didn’t qualify for a student visa. But that was part of the problem: U.S. consular officials in Europe, acting in accordance with President Roosevelt’s policy, looked for every possible reason to keep refugees out. Had they been inclined toward kindness, instead of coldheartedness, they could have chosen to grant Wisnia a regular visa on the grounds that he had close family members—two aunts—already living in the United States.

 

Wisnia was a Polish citizen. From 1933 to 1945, the quota for Polish immigrants to the United States was never filled, and in most of those years it was more than 70% unfilled. Tens of thousands of unused visas for would-be Polish Jewish immigrants were thrown into the wastebasket, alongside David Wisnia’s letter pleading to be let in.

 

What makes this story even more tragic is that many Jewish refugee students who did have letters of acceptance from American schools were kept out of the country, anyway.

 

Prof. Bat-Ami Zucker (Bar-Ilan University) has found multiple instances in which European Jewish students were denied visas even though they had been admitted to Dropsie College, in Philadelphia, or the Jewish Theological Seminary of America, in New York City. 

 

An official at the U.S. consulate in Berlin said the visas were rejected because in each case the student was “a potential refugee from Germany” and therefore was “unable to submit proof that he will be in a position to leave the United States upon the completion of his schooling.”

 

Similarly, Prof. Stephen Norwood (University of Oklahoma) has described a case in which a European Jewish refugee student who was admitted to Bryn Mawr College was denied a visa “because she could not meet the requirement of identifying a permanent residence to which she could return after completing her studies.” Bryn Mawr’s president learned that the American Friends Service Committee, which assisted refugees, knew of another fifty instances in which European students who had been admitted to U.S. colleges—and offered scholarships—were denied visas for that reason.

 

It was a Catch-22. These Jewish students were seeking to study in American schools precisely because the Nazis prevented them from studying in German universities. Now the Roosevelt administration was telling them they could not enter the United States—and therefore would have to remain in the country where they were being persecuted—because they could not guarantee they would return to the country where they were being persecuted.

 

Prof. Zucker points out, however, that the immigration regulations “did not require that an applicant for a student visa prove that he would be able to return to Germany….The regulations required only an affidavit stating his intention to return to Germany.” 

 

This was, in other words, yet another example of the numerous extra requirements and obstacles imposed by Roosevelt administration officials in order to suppress Jewish refugee immigration below the levels permitted by law.

 

A handful of refugee students did manage to reach America’s shores during the Nazi era. Several dozen studied at Yeshiva University. Twenty were admitted to Harvard (although for some reason Harvard felt it necessary to announce that “a large number” of them were not Jews). Rutgers University, the New Jersey College for Women, and McPherson College, in Kansas, each took one.

 

But so many more could have been saved if President Roosevelt and his administration had simply allowed the existing student visa exemption to be fully utilized, instead of going out of their way to keep the Jews out.

 

Sun, 19 Jan 2020 17:43:56 +0000 https://historynewsnetwork.org/article/173888 https://historynewsnetwork.org/article/173888 0