Killing the text

We know texts are partial, incomplete and unsatisfying representations of historical phenomena, of past experience. We know this both because it is the logical conclusion arrived at during the course of our source criticism, the bread and butter of the historical method, and because a bunch of French philosophers and associated hangers-on told us so (or at least that is how it often appeared to undergraduate me), perhaps often with more ferocity and provocation than clarity, in the late-20th and early-21st centuries. So unless we adopted the ostrich method (head, sand), we know our texts are a problem.

Yet we historians rely on texts; often with little choice if we wish to get anything done. And that reliance on the textual record handed down to us has been amplified in much digitally-driven historical research. Using thousands of titles, millions of pages, billions of words, scholars have begun to examine at a distance various historical phenomena: the relationship between trial length and trial verdict and what that means (Hitchcock, 2011), the transmission of phrases around newspaper networks (Smith, Cordell, Maddock Dillon, 2013), what the changing use of words says about society and culture (Schmidt, 2013). This is great work. But there is also some problematic work out there, work that appears to use volume to smooth over the irritations of source criticism, work that claims to reveal macro trends in human history through no more than patterns of words in texts, work that – in a sense – appears too good to be true (Acerbi et al, 2013).

Explaining away the macro by recourse to vanilla historicism, an old favourite that for many proved its worth during the battles with macro socio-economic approaches in the mid- to late-twentieth century, won't, I expect, help us here. Neither will blunt refutations of 'scientific' work (whatever that means, for as scholars from Kuhn to Bod have observed, the scientific method is often much less cold and objective than we humanists assume). Though valid, the stock humanist defence of "well it is much more complex than that…" can start to sound unhelpful and territorial when overused, limiting its shelf-life, reducing it to caricature. Which is why I'm heartened by having come across in recent weeks projects that are beginning to ask questions of text as a source, and that are doing so by examining the contemporary experience of using those texts as a starting point for building a case for (and explaining the validity of) a nuanced approach to text. (21/02/14: In retrospect, I realise from a conversation with Helen Rogers that I came over a little negative, a little on the side of needing a defence against all this pesky macro text analysis. I'm not. Rather what I was trying to explore here is a fruitful, productive direction the quant 'vs' qual discussion could take…)

Experience of texts is at the heart of the Virtual Paul's Cross Project, an ambitious project that is attempting to model the aural, sensual and physical experience of John Donne's 1622 Gunpowder Day sermon. Thanks to this work (and the NEH) we are now able, from the comfort of our desks, to listen to a modern recreation of Donne's unamplified two-hour sermon from eight different positions in the space outside medieval St Paul's (a space that no longer exists) and change the experience by filtering for different crowd sizes. Here technology transforms how we might imagine such texts were delivered, introducing the effect of reverberation, crowd responses and the cathedral bell into the mix. And so the text of the sermon recedes from view, becoming – in the memorable words of John Wall, project lead – little more than "a memorial" of a lived human experience.

There are other examples of this category of work. Two reading and blogging projects run by folks at the University of Leicester and Dickens Journals Online (on A Tale of Two Cities and No Name respectively) alert us to the ways in which the physical wholeness of the novel obscures the often complex relationship between Victorian novels, especially those that began life as serialised narratives, and lived experience: they might have been read in bursts, only in part, or starting with something other than the beginning as we know it. In a similar fashion, techniques that examine traces of oil on the pages of texts, traces of human interaction with prose, knowledge, songs, sermons and the like, promise to approximate those passages of historical texts most favoured, most visited, most loved. [I appear to have entirely misplaced the details of this project. When I have them, I'll post an update here! 21/02/14: As generously pointed out by Bob MacLean, the project in question is the 'Dirty Books' project at the University of St Andrews, see Rudy, 2010]

Together these projects kill the text, at least the text as a linear thing that represents – either individually or, more often, alongside other such texts – historical phenomena, past experience. But then we all knew ‘that’ text was dead… right?
