[This is a thought piece written for HIST946: Digital Humanities with Professor William Thomas during the Fall 2011 semester. This week’s readings were Jerome McGann, Radiant Textuality: Literature after the World Wide Web; Stephen Ramsay, “Algorithmic Criticism”; Matt Kirschenbaum, “So the Colors Cover the Wires”; and Geoff Rockwell, “What is Textual Analysis, Really?”. You can find related posts here.]
As Jerome McGann writes, all the critical tools we bring to analysis have always been “prostheses for acting at a distance,” the same “distance that makes reflection possible” (McGann 103). The central theme tying his book together is the distinction between gnosis (concepts) and poiesis (construction), a distinction that allows for the “concrete acts of imagining” and for “the imagining of what you don’t know” (McGann 83). After all, humanists prize above all “interpretation and self-aware reflection” (McGann xii). McGann challenged scholars to think hard about, and experiment with, ways of expanding our “interpretational procedures” or otherwise risk “the general field of humanities education and scholarship not taking seriously the use of digital technology” (McGann xii). Only through the development of new tools to improve the ways we explore and explain texts, and through their applied use in scholarship, would the digital humanities begin to have value for the broader field of humanistic study.
If digital texts allow a new way to think critically and interpretively about documents, how can the digital humanities lend themselves to that task? That challenge was answered by new digital tools (based on old theoretical methodologies) designed for the analysis of text. Literary critics, like historians, remain interested in the interpretation of cultural artifacts. Digital textual analysis, as Geoff Rockwell explained, meant humanists could become independent of paper concordances and use electronic tools instead (Rockwell 4). Upending the text itself gives humanists new ways of looking at it. Reading works as isolated textual elements, or displacing the text altogether, helps us see, as McGann argues, what usually “escapes our scrutiny” (McGann 116), because we no longer look to the documentary features of a text but to what it means linguistically. Thus the possibilities of failure, play, and serendipity generate knowledge; we are, as McGann writes, imagining what we don’t know (McGann 83).
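To make the idea of an electronic concordance concrete, the sketch below shows a minimal keyword-in-context (KWIC) routine of the sort such tools are built around. The sample sentence, the function name, and the window size are my own illustrative choices, not anything drawn from the readings.

```python
# A minimal keyword-in-context (KWIC) concordance: the basic operation
# behind the electronic tools that replaced paper concordances.
# All names and the sample text here are illustrative assumptions.

def kwic(text, keyword, window=3):
    """Return each occurrence of keyword with `window` words of context."""
    words = text.lower().split()
    results = []
    for i, word in enumerate(words):
        # Strip trailing punctuation so "text," still matches "text".
        if word.strip('.,;:!?') == keyword:
            left = ' '.join(words[max(0, i - window):i])
            right = ' '.join(words[i + 1:i + 1 + window])
            results.append(f"{left} [{word}] {right}")
    return results

sample = ("The analysis of text allows humanists new ways of looking at text, "
          "reading text as isolated elements rather than documents.")
for line in kwic(sample, "text"):
    print(line)
```

Isolating every occurrence of a word with its immediate context is precisely the kind of displacement described above: the documentary form of the text falls away, and its linguistic patterns come into view.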
What remains essential in the digitally aided analysis of text is interpretation, debate, and discussion. As Stephen Ramsay notes, and what I called in passing last week the “science of the humanities,” textual analysis is “the most quantitative approach” to the study of humanistic artifacts and, “in the opinion of many, the most scientific form of literary investigation.” The problem with ascribing too much value to the data generated by textual analysis is the assumption that a human reading of textual elements is somehow deficient when set against a machine reading of the same text. According to some, the humanities needed a cure, and the antidote was provided by the digital humanities and their supposedly scientific character. Digital humanists subscribing to this view felt that the computer, in Ramsay’s words, delivered “corrective tendencies … against the deficiencies of ‘human reading’” (Ramsay 3). Textual analysis arose within a framework of scientific methodology and hypotheses. Ramsay, however, rightly argues that the humanist endeavor does not operate with the same specificity of “fact, metric, verification, and evidence” (Ramsay 4), because the aim of the field is not to settle things definitively but to generate “richer, deeper, and ever more complicated” discussions (Ramsay 7).
Working with digital text inherently requires the use of computers. Over many years, and through the development of editing standards, humanists have reached a point where the markup and encoding of textual elements outstrip the ability of most tools to use those texts to their fullest extent (Rockwell 9). How we represent the information texts contain is just as important as representing the texts themselves. Humanists have already developed ways of representing texts, most notably the book, and the computer has introduced another method for conveying information. Matt Kirschenbaum introduces examples of innovative ways of displaying information with aesthetics that invite the playful and spontaneous creation of knowledge. Being able to interact with and manipulate text directly leads to what Rockwell described as disciplined play, but that ability also requires interfaces that allow a broad set of intuitions to be acted upon as ideas are confronted and theorized.
To return to the opening of this piece, that tools are prostheses for reflection: the computer, the interfaces we interact with, and the deconstruction and algorithmic interpretation of text allow us new ways of asking questions and communicating ideas. The tools at our disposal allow for a broad range of analysis and visualization useful not just to amass data, but to think differently about our humanistic questions and to engage in the continuous bouts of discussion that define our field.