Tuesday, July 12, 2016

Can Historians Contribute to Society?


A few weeks ago Patrick Johnston, vice-chancellor of Queen's University Belfast, expressed what I suspect is a view of history commonly held by Euro-American elites: “Society doesn't need a 21-year-old who is a sixth-century historian. It needs a 21-year-old who really understands how to analyze things, [and] understands...contributing to society.” He then announced the creation of an Office of Analyzing Things and Contributing to Society, and the appointment of a deputy vice-chancellor, two deans, and seven assistant deans to manage it. I'm kidding. (Sort of.)



In response to Johnston's remarks, Professor Jonathan Healey wrote an engaging defense of the formal study of history, which I commend to my readers' attention. Briefly, Healey noted that social leaders like to use historical examples to justify themselves, so society needs good history students to serve as fact-checkers; that the general public loves a good story, and historians can provide content to the museums and script-writers who furnish the public with its history; that the weirdness of the past obliges students to develop their analytical skills in order to study it; and that this very weirdness reminds us that societies do change over time, and that our modern values and hierarchies are contingent and mutable.



Healey's clear, elegant essay makes a nice complement to Timothy Burke's 2008 HNN piece on the purposes of historical analysis. Drawing on Burke's article, I would add that history also helps 21-year-olds more fully understand long-term processes, like human-induced environmental change, that have ongoing consequences in the twenty-first century. It lets us appreciate the importance of individuals in society, particularly obscure characters (like George Robert Twelves Hewes or Domenico “Menocchio” Scandella) whose lives and thoughts tell us more about the lived experience of an era than those of a Napoleon. Finally, history teaches humility, as students learn that they are not the only generation and that people in the deep past solved complicated problems without help from their descendants.



I suspect academic administrators, politicians, and other elites don't want ordinary college students to develop these skills. They want round pegs for round holes, not challengers of the status quo. But, as Healey points out – and as everyone over the age of 35 could attest – the status quo is fragile and as subject to change as any other human construct. We need people who appreciate this fact, who have studied change over time and who have the intellectual flexibility to respond to it. Historians' willingness to say so explains why they are almost never invited to speak at university commencements. It's probably just as well.


(Above image, "Clio, the Muse of History," by Giovanni Baglione [1620], is in the public domain.)

Monday, July 04, 2016

The Revolutionary Monarch


I think many Americans assume that autocratic states rest exclusively on fear, that subjects of a repressive dictator or oligarchy obey only because they and their families will otherwise suffer terrible punishment. Those familiar with the history of monarchies recognize, however, that dictatorships (hereditary or otherwise) also rest on a kind of popular magical thinking, a widespread belief that supreme rulers have powers superior to those of mere mortals. Only a few centuries have passed since Britons believed their sovereign's touch could cure scrofula; only a few decades since Japan was ruled by an actual deity; and only a few years since North Koreans paid their final tribute to Dear Leader Kim Jong Il, whose sacred birth was attended by supernatural omens.

In Domination and the Arts of Resistance (1992), James Scott observes that this tendency to ascribe superhuman traits or virtues to monarchs certainly applied to imperial Russia. He quotes Lenin's contempt for Russian peasants' superstitious monarchism, their tendency “naively and blindly to believe in the Tsar-batiushka [the 'little father' tsar]” and petition him for redress (p. 97). Accompanying this faith in the tsar as deliverer, however, was the peasantry's complementary belief that any evils done in the tsar's name were actually the work of corrupt officials. Peasants could resist those officials while retaining their loyalty to the tsar, confident that “if the tsar only knew of the crimes his faithless agents committed in his name, he would punish them and rectify matters.” The reactionary worship of a semi-divine monarch could thus lead to insurrectionary, even revolutionary action.

A similar dynamic drove the decade of colonial uprisings preceding the War for American Independence. Opponents of the Stamp Act, the Townshend Acts, the military occupation of Boston, the Tea Act, and the Coercive Acts assumed (or persuaded themselves) that these impositions came not from the king but from a corrupt Parliament. His Majesty was good and patriotic, but in the colonies, away from his watchful eye, his evil ministers tried to plant their boots on freeborn English colonists' backs. Thus Patrick Henry, denouncing the Stamp Act, simultaneously pledged to defend George III to his dying breath. Sons of Liberty settling in Pennsylvania's Wyoming Valley gave their new townships such patriotic names as Hanover (after Britain's ruling dynasty) and Kingston. New Yorkers erected an equestrian statue of the king in 1770, well into the imperial crisis. As late as 1775, the American rebels referred to the British forces fighting them in Boston as “the ministerial army,” not the king's army.

Brendan McConville pointed out (The King's Three Faces, 2006) that American monarchism had not come over on the Mayflower, but rather had been built by colonial and imperial elites. By putting royal images in their homes (on tea sets and objets d'art), celebrating royal birthdays, and burning the king's enemies in effigy on Pope's Day, the leaders of colonial society imbued their followers with affection for a distant and artificial* British monarchy. The colonists, however, viewed the king much as Russian peasants viewed the tsar: a benevolent father-figure who would right the wrongs perpetrated by aristocrats and officials. Rebellious slaves, for instance, invoked the king's aid against their masters, and rebellious white colonists considered their resistance to tax collectors and soldiers entirely consistent with loyalty to the king.

The big change, as Pauline Maier observed (American Scripture, 1997), came in early 1776, when colonial newspapers reported that George III had declared the colonies in rebellion and withdrawn his royal protection. The king had now publicly proclaimed himself the colonists' enemy. Common Sense, published at the same time, made it safe to discuss the superstitious absurdities that underlay devotion to a monarch, and the Declaration of Independence pointedly indicted the king (not the Parliament) for abuse of power. Later in the War of Independence, the new loyalty oaths that the rebels forced upon former royalists helped dissolve the bonds of duty that still bound many to the Hanoverians. Yet the desire to follow or at least show affection for a monarch persisted in the United States into at least the 1780s – for I agree with Forrest McDonald's argument (Novus Ordo Seclorum, 1985) that residual king-worship explains Americans' celebration of the French royal family and their naming of towns and counties for the Bourbons. Arguably, it took the “party war” of the 1790s, in which “monocrat” became a deadly epithet, and the rise of a post-Revolutionary generation to bury American monarchism for good. Until the early nineteenth century, monarchism was as American as corn cakes or witchcraft trials.

So, Happy Independence Day, and God Save the Queen.


* The Hanoverian dynasty was imposed on Britain by act of Parliament, and its first two rulers didn't speak English particularly well.