The viability of the digital humanities as a means of scholarship is a question I have been thinking about since the beginning of my Masters. Often closely aligned with English Literature and Linguistics departments (Kirschenbaum 2010), the digital humanities has never fully defined itself as a field (Svensson). In fact, the digital humanities sometimes prides itself on its usefulness to multiple disciplines, as I witnessed at a recent HASTAC conference where scholars from History, English, Writing, and Philosophy sat on the same panel. Yet alongside this usefulness there is still a struggle for verification, visible in the need for validation through documents such as the MLA guidelines for evaluating digital work, or through journals like Digital Humanities Quarterly. This emphasis on authenticity and worthiness echoes throughout much digital humanities work and is reflected in the very process of digital documentation itself, perhaps suggesting why the digital humanities finds it difficult to generate academic currency in terms of respected value.
As Susan Hockey examines in her history of humanities computing, many scholars saw humanities computing as mere tools rather than a valuable part of the research equation; a shift in attitudes among some scholars, however, brought about journals and the regulation of humanities computing, enabling a more standardized approach to its use. A similar pattern appears in the ephemerality and fluidity of digital documentation that Matthew Kirschenbaum engages with in Mechanisms: New Media and the Forensic Imagination. As Kirschenbaum notes, “we tend to fixate on the fluidity of electronic text, its malleability and putative instability” (56), even though data is often far more permanent than it appears. This misconception, or rather misdirection, of public perception is then correlated to authenticity, as “questions of authenticity are directly related to an electronic object’s ability to not only resist but also to expose tampering” (Kirschenbaum 57). I have looked at digital forensics in the past, and I hope to “practice” it here, expanding on my previous research to illustrate how this translates into problems for the digital humanities, as well as its possible opportunity for activism. In special editions of comic books and graphic novels, companies often include original script pages from the authors, using a never-before-seen tagline to attract readers; companies do the same with DVDs and easter eggs, or books and handwritten scripts (Kirtz 2014). What is unusual, however, is that the majority of these scripts are typewritten, connoting a material process before the text even enters the digital realm. Regardless, special edition graphic novels such as Daredevil Vol. 1 and Batman: Year One, by famous author Frank Miller, sell at increased prices with these “original” scripts as part of the special packaging.
What a forensic approach reveals is the careful digital manipulation of these scripts. For example, in Batman: Year One, the edges are covered in scanning marks, the borders are misaligned, and stains are left in place (Kirtz). This is not merely for the reader’s enjoyment but serves to authenticate the text; in the digital era there is much uncertainty around text, due to exactly what Kirschenbaum describes. This is further illustrated in the backscript of Daredevil Vol. 1, where the scanning marks follow a perfect pattern from page to page, the coloration is markedly uniform, and there are no ink bleeds or smudges on the typewritten letters, features that occur on nearly every other scan (Kirtz).
This illustrates the lack of authentic presence and the digital manipulation, and possibly even the digital creation, of this script. An interesting example appears at the end of Arkham Asylum: A Serious House on Serious Earth, which includes the entirety of Grant Morrison’s script as a typewritten, scanned document, but with red computer-font overlays that give additional context to the writing. These red overlays are Morrison’s current commentary, released on the fifteenth anniversary of Arkham Asylum. The comments restore scenes and explain choices, such as: “Robin appeared in a few scenes at the beginning […] Dave McKean, however, felt that he had already compromised his artistic integrity sufficiently by drawing Batman and refused point blank to bend over for the Boy Wonder–so after one brave but ridiculous attempt to put him in a trench coat, I wisely removed him from the script” (Morrison). However, the font of the overlays is not Times New Roman or another popular computer font; it is exactly the same as the typewritten font. What differentiates the original script from the new annotations is the straightness of the lines, the ink smudges and thickness, and the overall uniformity of the words. Thus even a document that is openly digitally manipulated still perpetuates physical, human-like characteristics in order to convey certain truths about authenticity.
What each of these examples illuminates is the way scholars can forensically examine documents, as well as general society’s preconceived biases about document safety and authenticity. By contrast, born-digital documents, such as comic scripts circulated by online communities, are accepted only skeptically. Websites dedicated to collecting digital scripts, whether in PDF, Word, or Pages format, are often met with hostility and skepticism by the general comic book public. To assess the validity of one of these documents, I tried to engage with Kirschenbaum’s methods of understanding data encryption and storage. For example, when examining an Alan Moore Word-document script, one can check the registered author, file location, and version (through a hex editor and by inspecting the document’s properties).
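The properties check described above can also be scripted. A modern .docx file is a ZIP container whose docProps/core.xml part records the registered creator, the last person to modify the file, and timestamps. The sketch below, using only Python’s standard library, is one minimal way to pull those fields out; the filename in the usage note is hypothetical.

```python
import zipfile
import xml.etree.ElementTree as ET

# Namespaces used by the Office Open XML core-properties part (docProps/core.xml).
NS = {
    "cp": "http://schemas.openxmlformats.org/package/2006/metadata/core-properties",
    "dc": "http://purl.org/dc/elements/1.1/",
    "dcterms": "http://purl.org/dc/terms/",
}

def docx_provenance(path):
    """Return the creator, last modifier, and timestamps stored in a .docx."""
    with zipfile.ZipFile(path) as zf:
        root = ET.fromstring(zf.read("docProps/core.xml"))
    # A field may be absent entirely, in which case find() returns None.
    field = lambda tag: getattr(root.find(tag, NS), "text", None)
    return {
        "creator": field("dc:creator"),
        "last_modified_by": field("cp:lastModifiedBy"),
        "created": field("dcterms:created"),
        "modified": field("dcterms:modified"),
    }
```

Calling, say, `docx_provenance("alan_moore_script.docx")` would surface exactly the kind of registered-author evidence discussed below, though such metadata is itself editable and so only one signal among several.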
The author of this particular script was listed as Tim Simmons, the website owner, as seen in the properties section of the document. This casts doubt on the accuracy of the script, but it also permits the possibility of its realness: he could have saved it on his computer before uploading it to the site. In thinking about digital rights and the locality of the text, I was able to determine with more certainty whether the text itself came from the webmaster’s computer or someone else’s. This contrasts with another Alan Moore script from Simmons’s webpage, which was obviously obtained from a fax transmission, as it has scanning marks, slight distortion of the letters, and a scanning date and paratext at the bottom of each page (it’s linked here: Youngblood-2-script). Further markers, such as time signatures, electronic signatures, and website authority/IP address, help determine the level of authenticity of these scripts. Because of this skepticism and the apparent inability to assert authority over documents, people approach anything involving the digital with apprehension, perhaps illustrating both the difficulty of defining the digital humanities and the obstacles to the digital humanities evolving into a discipline of its own.
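One low-level check of the hex-editor kind mentioned earlier can be sketched briefly: the first bytes of a file (its “magic number”) identify the actual container format regardless of the extension, so a “script.doc” that is really a renamed PDF or ZIP betrays itself immediately. The signature table below covers only a few common formats, and the filename in the usage note is hypothetical.

```python
# Magic numbers: the leading bytes a hex editor would show for common containers.
SIGNATURES = {
    b"PK\x03\x04": "ZIP container (e.g. modern .docx)",
    b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1": "OLE compound file (e.g. legacy .doc)",
    b"%PDF": "PDF document",
}

def file_signature(path):
    """Identify a file by its leading bytes rather than trusting its extension."""
    with open(path, "rb") as f:
        head = f.read(8)
    for magic, label in SIGNATURES.items():
        if head.startswith(magic):
            return label
    return "unknown format"
```

Running, say, `file_signature("youngblood_2_script.doc")` is a first sanity check before trusting any metadata the file claims to carry, since the container format itself cannot be faked as trivially as a filename.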
Hockey, Susan. “The History of Humanities Computing.” A Companion to Digital Humanities. Ed. Susan Schreibman, Ray Siemens, and John Unsworth. Oxford: Blackwell, 2004. Web. 13 Sept 2015.
Kirschenbaum, Matthew G. Mechanisms: New Media and the Forensic Imagination. Cambridge, MA: MIT, 2008. Print.
Kirtz, Jaime Lee. “Computers, Comics and Cult Status: A Forensics of Digital Graphic Novels.” Digital Humanities Quarterly 8.3 (2014). Web. 13 Sept 2015.
Morrison, Grant. Arkham Asylum: A Serious House on Serious Earth. 15th Anniversary Edition. New York: DC Comics, 2005. Print.
Svensson, Patrik. “The Landscape of Digital Humanities.” Digital Humanities Quarterly 4.1 (2010). Web. 13 Sept 2015.