Objectivity and the classroom: ten historians respond.
There are no big surprises, but the similarity of views here might be of interest to non-academic readers.
The point is this: HfB [Historians for Britain] and its opponents share exactly the same, entirely conventional approach to the ‘relevance’ of history to current political debate; in other words, to ‘why history matters’. That relevance, as is clear from any close reading of almost all of the contributions to the exchange, consists entirely in the deployment of historical ‘fact’. I find this, to be blunt, more than a tad wearisome. [The involvement of historians in the referendum on Scottish separatism took exactly the same tedious form.] What seems largely to be at stake is who can assemble the biggest pile of facts. This is not going to make much of a difference, let alone decisively carry the day, either way. By way of demonstration, and if you can bear it, just read ‘below the line’ on the on-line version of the ‘Fog in the Channel…’ piece. To quote the (in my estimation) criminally underrated Andrew Roachford, ‘I don’t want to argue over who is wrong and who is right’.
Let me row back from the extreme position that might be inherent in that statement. First, it is very important to deploy historical fact to counter misleading public presentations of what ‘history shows us’. This was another reason I signed up to the ‘Fog in the Channel…’ letter. There were some seriously dubious elements in Abulafia’s piece ... However, as I have said too many times to count, history (as opposed to chronicling or antiquarianism) is the process of thinking, interpretation, explanation and critique carried out on the basis of those facts; it does not stop at their simple accumulation (most historical ‘facts’ are, to me, not especially interesting in and of themselves: this happened; that did not happen – factual accuracy is a duty, not a virtue).
More seriously, both sides essentially see the course of history (as established by these facts) as providing a set of tramlines governing the proper path we should take in future. This removes any kind of emancipatory potential from the study of history. Put another way, and to restate the counter-factual posited earlier, suppose you agreed with Abulafia (and after all he’s not wrong about everything) that Britain’s history was, fundamentally, profoundly different from that of mainland Europe and had run a quite separate course (there is a case that could be put to support that contention which would have to be taken seriously, even if it is not the one put forward by HfB), but that, unlike him, you thought this had been a terrible thing and that Britain needed to incorporate itself more fully into Europe. Or suppose that you thought the authors of ‘Fog in the Channel…’ were fundamentally right that Britain’s history was entirely entwined with that of the mainland, but that you found this wholly regrettable. Or suppose, finally, that – like me – you thought that the course of the past had no compelling force and provided no secure or reliable guide at all to what ought to be done in future. So, what would one be able to contribute to a debate on these (or other) issues if one held a superficially nihilistic view, like mine, of history as random, chaotic, ironic, and unpredictable, and of the past as having no ability in and of itself to compel anyone to do anything? What, so to speak, would be the point of history? Why would it matter?
In his excellent blog-post – in my view, the best intervention in this discussion by some way (impressive not least for its concision) – Martial Staub draws attention to the discontinuities of history that subvert any attempt at a unified narrative or quest for origins. This seems to me to point us towards a much more valuable and sophisticated means by which the study of history (rather than ‘History’ itself, that somehow mystic object, or objective force) can make a political contribution. Every dot that is later joined up to make a historical narrative represents a point of decision, of potential or, if you prefer, of freedom, where something quite different could have happened. To understand any of these decisions, as again I have said many times before, it is necessary to look at what people were trying to do, at what the options open to them were, or those they thought were open to them, and at what they knew – in short, at all the things that didn’t happen, which frequently include the intended outcomes. You cannot simply explain them as steps on a pre-ordained path towards a later result, or as the natural outcomes of the preceding events. Any present moment of political decision represents the same thing: a point of choice, of decision, which requires serious thought. It should not be closed down by the idea that some ‘burden of history’ or other compels us to go one way rather than another. Those decision points – the dots joined up to make a story – were, at their time, points of freedom when any number of things were possible. The unpicking of narrative constructions makes this very clear, and that – in my view – is the point that emerges from historical study.
This lesson, for want of a better word (I mistrust ‘lessons from history’), points to a string of possible subversions. Staub points out the subversion of the ‘national’ story, but the same move subverts any similar ‘European’ master narrative. Britain can be said to have had a history different from that of other European countries – true enough – but to no greater extent than any other region of Europe has a history different from the others. It may be true that at some points British history seemed to run on a course that bucked European trends, but exactly the same can be said of, for example, Italian or Spanish history at various points. What is Europe anyway? Is it any more natural a unit of analysis than any other? In the Roman period, the idea of thinking of the north of Africa as somehow a different area from the northern shore of the Mediterranean would have seemed very odd. Indeed the Mediterranean basin can be seen as a unified area of historical analysis (who, after all, knows that better than David Abulafia?), rather than as different continents divided by a sea – perhaps one with a history different from that of northern Europe or the North Sea cultural zone (which obviously includes Britain). All of these points also contribute to a historical critique and dismantling of the idea of the nation (any nation) itself, not simply the national story (see also here).
All historical narratives are constructs, so (unless one is based upon the misuse or fabrication of evidence, or does not stay true to the basic 'facts' of what did or did not happen) one cannot be claimed to be more accurate than another. No one can win an argument on that basis. The best that can happen (and it is important) is the demonstration that there is more than one story to be told.
Above all, what I find to be one of the most important contributions that historical study can make, in terms of social/political engagement, is the subversion of all reifications, of all attempts to render contingent categorisations as natural. And of course it similarly subverts attempts to represent contingent oppositions as eternal or natural. All these subversions arise from what I have repeatedly argued on this blog are historical study’s most important benefits: the critique of what one is presented with as evidence, and the simultaneous requirement to see similarity – shared human experience – in difference and diversity, to listen to and understand that evidence. So I would contend that the view of history sketched above is very much not a disabling, nihilistic one, but quite the opposite. The careful, sympathetic yet critical investigation of the traces of the past, the deconstruction of narrative, nation and so on, can and should free us from the burdens that people want to impose on us in the name of history. The appreciation of the once possible but now impossible potentials at the decision-points of the past can and should allow us to think twice about what people tell us are now impossibilities, and open our minds to the potentials and possibilities of the present. If you want a catchphrase, try this: think with history, act in the present.
Sure enough, some of those sessions (which I should point out were very good and interesting) included a lot of griping and grouching about the misuse and ambiguity of the word medieval. You would think that a bunch of scholars who, by the very nature of their discipline, are experts in the evolution of the meaning of words would by now have gotten over the fact that, though it doesn’t make a lot of sense to call “the Middle Ages” by that term, and though coming up with a really good chronological definition of those ages is impossible, we are stuck with the words medieval and Middle Ages anyway. But no, there is a lingering feeling that it should be possible to nail down these terms – Middle Ages, medieval – once and for all. Or ditch them. If all the experts agreed, everybody else would have to fall into line – right?
You know that’s not going to happen.
Scholars of the Middle Ages, like experts in any other field, feel they should be in control of the terminology that defines their work and gives them legitimacy. But the truth is that any important subject is contested between a whole bunch of different individuals and groups who have an interest in that field. A single word – medieval – is shattered into a variety of definitions, many of which are out of date – at least in the eyes of people working on the cutting edge of, say, “medieval studies.” Old assumptions and terms and generalizations which current practitioners have rejected hang on in popular and nonspecialist discussions.
This can be intensely irritating for people who know that certain phrases and analyses lost their cogency back in 1927 and want to talk about what their friends are doing in the field now. Nevertheless, people whose business is words should really accept the fact that words like “medieval” have a number of popular meanings, and when one of them shows up in current discussion (when, for instance, a Game of Thrones comes along and is widely labelled as medieval, even though the world of Game of Thrones is not our earth at all), the fact can be dealt with in a good-humored way. It certainly would reflect credit on any field where a good-humored approach was the norm.
David Parry made the most sensible remark of the entire week when he pointed out that an imprecise word like medieval has a lot of cultural value for people who make their living interpreting that era. Indeed there is a financial payoff to being associated with it. As he said, “the word makes students registering for courses press the button on the screen that says ‘enroll.’ The phrase ‘early modern’ doesn’t have that effect.”
The moral imperative driving this is what we can call the quest for authenticity. This is the search for meaning in a world that is alienating, spiritually disenchanted, socially flattened, technologically obsessed, and thoroughly commercialized. To that end, “authenticity” has become the go-to buzzword in our moral slang, underwriting everything from our condo purchases and vacation stops to our friendships and political allegiances.
There are two major problems with this.
The first is that authenticity turns out to be just another form of hyper-competitive status seeking, exacerbating many of the very problems it was designed to solve. Second, and even more worrisome, is that the legitimate fear of the negative effects of technological evolution has given way to a paranoid rejection of science and even reason itself.
Modernity, as a civilization, sits at the confluence of secularism, liberalism, and capitalism, and it is not everyone’s cup of coffee. The promise of the authentic is that it will help us carve out a space where true community can flourish outside of the cash nexus and in a way that treads lightly upon the Earth. More often than not, this manifests itself through nostalgia for a misremembered time when the air was cleaner, the water purer, and communities more nurturing.
It was never going to work out that way. From its very origins, the quest for the authentic was motivated by that most ancient and base of human urges, the desire for status. The authenticity craze of the past decade is simply the latest version of what the economist Thorstein Veblen, in his 1899 book The Theory of the Leisure Class, called “conspicuous consumption.” Veblen was mostly concerned with the pretensions of the failing aristocracy and their obsession with obsolete endeavours such as hunting, swordfighting, and learning useless languages. Yet his basic insight – that consumption is first and foremost about social distinction – remains the key to decoding our consumer-driven cultural shivers.
As recently as a decade and a half ago, organic food was the almost exclusive bastion of earnest former hippies and young nature lovers — the sort of people who like to make their own granola, don’t like to shave, and use rock crystals as a natural deodorant. But by the turn of the millennium, organic was making inroads into more mainstream precincts, driven by an increasing concern over globalization, the health effects of pesticide use, and the environmental impact of industrial farming. The shift to organic seemed the perfect alignment of private and public benefit.
It also became an essential element of any “authentic” lifestyle. Yet as it became more popular, the rumblings of discontent within the organic movement became harder to ignore. What was once a niche market had become mainstream, and with massification came the need for large-scale forms of production that, in many ways, are indistinguishable from the industrial farming techniques that organic was supposed to replace. Once Walmart started selling organic food, the terms of what counts as authentic shifted from a choice between organic and conventional food to a dispute between supporters of the organic movement and those who advocate a far more restrictive standard for authenticity, namely, locally grown food.
But when it comes to shopping locally, how local is local enough? If we want to live a low-impact, environmentally conscious lifestyle, how far do we need to go?
The short answer is, you need to go as far as necessary to maintain your position in the status hierarchy.
The problem is you can only be authentic as long as most of the people around you are not, which has its own built-in radicalizing dynamic. You start out getting an organic-vegetable delivery service once a month; then you try raising chickens in your urban backyard. Then the next thing you know, your friends have gone all-in on paleo, eschewing grains, starches, and processed sugar and learning how to bow-hunt wild boar on weekends.
There’s a deeper issue here, though: radicalization breeds extremism. It is one thing to play at being anti-modern by eating only wild game, becoming an expert in axe-throwing, or building a whisky still in your backyard. It is something else entirely to push that ethos into a thoroughgoing rejection of science, technology, and reason itself.
Yet this is where we have ended up. The neoprimitivist logic of authenticity has pushed its way into every corner of how we think, act and consume.
Even with these data, a more serious problem concerns the move from DNA to conclusions about ethnic or political identity. Ethnic identity is multi-layered. It is deployed (or not) in particular situations as the occasion demands, and it can be changed. DNA cannot give you a sense of all the layers of a person's ethnicity, or of which she thought the most important, or whether she generally used a completely different one, or of when and where such identities were stressed or concealed. A male Saxon immigrant into the Empire in, say, the fourth century would – one assumes – have DNA revealing the area where he grew up, but he would probably increasingly see himself, and act, as a Roman. Saxon origins would have little part in his social, cultural, or political life, and even less in that of his children, if they stayed in the Empire. If he returned home with the cachet of his Imperial service, it might have been his Roman identity that gave him local status. He might even have called himself a Roman. However, if a distant male relative moved to Britain 150 years later, his DNA might be very similar but, in complete distinction, he might make a very big deal of his Saxon origins. They would, or could, propel him to the upper echelons of society. DNA tells us nothing about any of this. What is pernicious about this use of genetic data is its essentialism. It views a person's identity as one-dimensional, unchanging, and entirely derived from that person's biological and geographical origins. In short, it reduces identity to something similar to nineteenth-century nationalist ideas of race. Everyone sane knows that people moved from northern Germany to Britain in the fifth and sixth centuries. In that sense, these expensive analyses tell us nothing we do not already know. In their implicit reduction of identity to a form of race, masking all the other contingent and interesting aspects of cultural interaction and identity-change, they risk setting back the understanding of this period by more than a century. Moreover, they provide pseudo-historical and pseudo-scientific ammunition for present-day nationalists, xenophobes, and racists.
If you teach history, wouldn't you want your students to be exposed to such a clear discussion of a historiographical problem? One with real relevance to the present?