Ainsworth, Peter, and Michael Meredith. “e-Science for Medievalists: Options, Challenges, Solutions and Opportunities.” Digital Humanities Quarterly 3.4 (Fall 2009): Web. 15 May 2011. http://www.digitalhumanities.org/dhq/vol/3/4/000071/000071.html
Ainsworth and Meredith first describe the difficulties medievalists face in working with primary research materials. After reviewing recent aids offered through digitization, such as image compression and juxtaposition, they introduce Virtual Vellum (www.sheffield.ac.uk/hri/projects/projectpages/virtualvellum.html), the manuscript viewer tool that they developed, and compare it to similar tools. Ainsworth and Meredith also discuss the shift in research methods from the traditional model of the lone scholar to interdisciplinary, collaborative work.
Burrows, John. “Textual Analysis.” A Companion to Digital Humanities. Ed. Susan Schreibman, Ray Siemens and John Unsworth. Malden: Blackwell, 2004.
This article looks at the study of the words that occur most frequently in writing. For example, British writers make more use of “which” and “should” than American and Australian writers do, and Australian writers show a preference for “we,” “our,” and “us” where British and American writers favor “me.” Statistical analysis of such words is necessary because they hold together whatever is being said; they are “the underlying fabric of the text.”
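The kind of frequency comparison Burrows describes can be sketched in a few lines of code. The word list and sample sentence below are illustrative assumptions, not Burrows's actual data; the point is only to show how relative frequencies of function words are computed.

```python
from collections import Counter
import re

# Common function words of the kind Burrows analyzes (illustrative subset).
FUNCTION_WORDS = ["the", "and", "of", "which", "should", "we", "our", "us", "me"]

def relative_frequencies(text, words=FUNCTION_WORDS):
    """Return each target word's frequency per 1,000 tokens of the text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(tokens)
    total = len(tokens)
    return {w: 1000 * counts[w] / total for w in words}

sample = "We should keep the notes which we made and share our results with us all"
freqs = relative_frequencies(sample)
```

Comparing such frequency profiles across two corpora (say, British and Australian novels) is what lets the stylistician claim that one group "favors" a word: the per-thousand rates differ consistently across many texts.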
Cohen, Matt. “Untranslatable? Making American Literature in Translation Digital.” Modern Language Studies 37.1 (Summer 2007): 43-53. Print.
This article takes The Whitman Archive as its object in order to examine the politics of both choosing editions and having one’s chosen edition institutionalized and widely disseminated. The article is useful to our class’s considerations in several respects: 1) it explores the divide in textual criticism between physical and digital editions, 2) it discusses the politics involved in edition selection in the humanities academy, and 3) it looks at some of the challenges of digital editing, especially where translation is involved.
Deegan, Marilyn, and Kathryn Sutherland. “The Cultural Work of Editing.” Transferred Illusions: Digital Technology and the Forms of Print. Burlington, VT: Ashgate, 2009. 59-88.
The second half of this chapter, subtitled “Information and Noise: Text Encoding,” is useful for its description of the tasks involved in encoding, as opposed to editing, a text, and for the stakes involved when the encoders are not literary scholars. The authors give an overview of the history and vocabulary of text encoding, and they summarize some debates we have considered in class. For instance, they refer to McLeod’s analysis of “Easter-Wings” when discussing the problems encoders face with a text’s presentational features. They also look toward the future of encoding, arguing that advances in digital imaging open the possibility of radical changes in the transcription and mark-up process.
Duggan, Hoyt N. “Some Unrevolutionary Aspects of Computer Editing.” The Literary Text in the Digital Age. Ed. Richard J. Finneran. Ann Arbor: University of Michigan Press, 1996. 77-98.
This chapter addresses some of the issues that we have discussed in class, namely whether annotating a text electronically differs from publishing an edition. Duggan argues that “a golden age of textual editing has not arrived” and reminds us that texts are often chosen because their copyrights have expired, calling for scholars to choose texts based on their quality instead (77). His main point, however, is that the scientific methods used in digital editing tools should be supported rather than distrusted (92). Duggan outlines his process of creating an electronic archive to construct a new critical edition of a fourteenth-century text, Piers Plowman, and the difficulties he encounters in working with witnesses in the absence of an authorial text. To convey the form and divisions of the fifty-four manuscripts, in addition to transcribing their words, he provides visual reproductions and uses a mark-up language that makes the variants searchable.
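What “a mark-up language that makes the variants searchable” means in practice can be sketched with a TEI-style critical apparatus, where each variant passage groups the readings of the individual manuscript witnesses. The encoding below is a generic illustration, not Duggan’s actual scheme; the element names follow TEI conventions (`app` for an apparatus entry, `rdg` for a reading, `wit` for the witness siglum), and the witness labels and readings are invented.

```python
import xml.etree.ElementTree as ET

# A TEI-style apparatus entry (illustrative, not Duggan's actual encoding):
# each <app> groups the readings (<rdg>) of one passage across witnesses.
SAMPLE = """
<line n="1">
  <app>
    <rdg wit="MS-A">somer</rdg>
    <rdg wit="MS-B">sommer</rdg>
  </app>
  seson
</line>
"""

def readings_for(xml_text, witness):
    """Collect every reading attested by the given manuscript witness."""
    root = ET.fromstring(xml_text)
    return [r.text for r in root.iter("rdg") if witness in r.get("wit", "")]

variants = readings_for(SAMPLE, "MS-B")
```

Because each reading is tagged with its witness, a query like the one above can pull every spelling a single manuscript attests, or invert the search to find every witness that attests a given form, which is exactly what a printed apparatus cannot do automatically.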
Hoover, David. “The End of the Irrelevant Text: Electronic Texts, Linguistics, and Literary Theory.” Digital Humanities Quarterly 1.2 (2007): Web. http://digitalhumanities.org/dhq/vol/1/2/000012/000012.html.
This article states that scholarly digital editions deepen our understanding of texts because they provide easy access to multiple versions of those texts. Hoover argues that it is time to return to the text, specifically the electronic text, to observe what digital editions, along with text analysis, statistical stylistics, and text alteration, can reveal about textual meaning.
McGann, Jerome. Black Riders: the Visible Language of Modernism. Princeton: Princeton University Press, 1993.
This book focuses on the explosion of modernist form as it appeared in print. McGann takes the approach that graphic and typographic design are key to creating and conveying meaning, and he explores how a book’s design and presentation are just as important to the understanding derived from an edition as the Platonic form of the words it represents. This extends our thinking about different editions to the actual shape of the book and the look of the type. Of particular interest is his discussion of the different versions of Pound’s Cantos and how the physical presentation of the books constitutes a display of their meanings.
McGann, Jerome. “From Text to Work: Digital Tools and the Emergence of the Social Text.” Romanticism on the Net 41-42 (February-May 2006): Web. http://www.erudit.org/revue/ron/2006/v/n41-42/013153ar.html.
This article examines critical editions and how they work through an analysis of J.C.C. Mays’s edition of Coleridge’s poetry. This analysis establishes the terms for exploring the prospects that digital technology provides for scholars tracking the socio-historical character of literary works. The article also looks at the work of D.F. McKenzie, whose theory of the social-text edition argues for a more inclusive editorial method, and claims that this method can best be realized through digital resources.
McGann, Jerome. Radiant Textuality: Literature after the World Wide Web. New York: Palgrave, 2001.
McGann’s book chronicles his decade-long work on the digital humanities project The Rossetti Archive (www.rossettiarchive.org). He discusses his theoretical choices and considerations about the nature of book textualities and editorial methods. Although the project mainly took place in the 1990s and technological tools have greatly evolved since then, Radiant Textuality is a good resource for readers interested in the creation of a successful project in the early days of digital humanities.
Rommel, Thomas. “Literary Studies.” A Companion to Digital Humanities, ed. Susan Schreibman, Ray Siemens, John Unsworth. Oxford: Blackwell, 2004. http://www.digitalhumanities.org/companion/view?docId=blackwell/9781405103213/9781405103213.xml&chunk.id=ss1-2-8&toc.depth=1&toc.id=ss1-2-8&brand=9781405103213_brand.
This chapter takes a historical approach, discussing the fusion of the cultures of science and the humanities with the advent of computers in the field of literary study, as databases enabled a more thorough textual criticism. However, “literary computing” was seen as a lesser form of literary criticism, despite the fact that most literary computing is concerned with issues of theory and method, uses empirical textual data to strengthen its criticism, and remains critically aware that markup is itself an act of interpretation that happens even before analysis. The essay ends with an argument for the increasing importance of literary computing as a meta-critical method for innovative approaches to literary texts, especially electronic scholarly editions.
Tanselle, G. Thomas. A Rationale of Textual Criticism. Philadelphia: University of Pennsylvania Press, 1989.
This short book addresses the roles of the textual critic and the literary critic. While it is commonly held that the task of the textual critic is to provide reliable texts so that the literary critic can carry out textual analysis, Tanselle’s view is that the two tasks are actually inseparable.
Taylor, Gary. “In Media Res: From Jerome through Greg to Jerome (McGann).” Textual Cultures 4.2 (Autumn 2009): 88–103. Web. 29 May 2011. http://lion.chadwyck.com/searchFulltext.do?id=R04380134&divLevel=0&queryId=../session/1306719320_15084&trailId=12FA40BB18A&area=mla&forward=critref_ft.
This is a very interesting article that bears directly on our discussions of different theories of editing. Taylor compares three editors and their methods in order to discuss editing practices. First, W.W. Greg explicitly transcribed rather than translated, meaning that he refused to editorialize texts to make them more readable to an audience. Second, Saint Jerome, who edited religious texts, did the opposite of Greg: he did not transcribe but translated, or editorialized, the text. As such, Taylor maintains that Greg would not have considered St. Jerome an editor but a translator or even an author, since changing a text is authoring a new one. This sets up the entry of digital humanists like Jerome McGann, who do not think the distinction between translating and transcribing is so clear, since both are acts of interpretation. Taylor then introduces a new term, “transmediation,” to describe editors as interpreters: “they mediate between the past and the future, the present and the distant, but attempt to do so in ways that render invisible their own acts of mediation and remediation.” He ends by suggesting that editors can never escape re-coding or encoding, but that if we seek to multiply it across many different media forms instead of minimizing it, readers will benefit by learning how to “read across media.”