Digital Research Tools

Jordan, Ellen, Hugh Craig, and Alexis Antonia. “The Brontë Sisters and the Christian Remembrancer: A Pilot Study in the Use of the ‘Burrows Method’ to Identify the Authorship of Unsigned Articles in the Nineteenth-Century Periodical Press.” Victorian Periodicals Review 39.1 (Spring 2006): 21-45.

This article considers an entirely different aspect of digital humanities research: a program that uses key words and phrases, computed for statistical probability, to determine the possible authorship of anonymous articles in the Victorian print media. Developed by literary scholar and early digital humanities figure John Burrows at the University of Newcastle, Australia, the method analyzes unsigned texts for the recurrence of words and phrasal patterns, the frequency of personal or collective pronouns, the balance of interrogative and declarative statements, and other linguistic “tells.” The researchers claim that, while not quite fingerprints, these markers are nevertheless reliable indicators of the gender and even the identity of individual authors when compared against their signed texts.
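To make the logic of the method concrete for myself, here is a minimal sketch of a Burrows-style comparison in Python. It illustrates only the general idea (relative frequencies of the most common words, standardized and compared across candidate authors), not the authors’ actual program; the texts, author labels, and parameters are placeholders.

```python
# Minimal sketch of a Burrows-style "Delta" comparison (illustrative only).
# Relative frequencies of the most common words are turned into z-scores
# across the candidate authors, and the unsigned text is attributed to the
# author whose profile differs least on average.
from collections import Counter
import statistics

def frequencies(text, vocabulary):
    words = text.lower().split()
    counts = Counter(words)
    total = len(words) or 1
    return [counts[w] / total for w in vocabulary]

def burrows_delta(unsigned_text, signed_corpora, top_n=30):
    # Shared vocabulary: the most frequent words in the signed corpora.
    all_words = Counter(" ".join(signed_corpora.values()).lower().split())
    vocabulary = [w for w, _ in all_words.most_common(top_n)]

    profiles = {a: frequencies(t, vocabulary) for a, t in signed_corpora.items()}
    target = frequencies(unsigned_text, vocabulary)

    scores = {}
    for author, profile in profiles.items():
        diffs = []
        for i in range(len(vocabulary)):
            column = [profiles[a][i] for a in profiles]
            mu = statistics.mean(column)
            sigma = statistics.pstdev(column) or 1e-9  # avoid division by zero
            diffs.append(abs((profile[i] - mu) / sigma - (target[i] - mu) / sigma))
        scores[author] = statistics.mean(diffs)
    # Smallest Delta = closest stylistic match.
    return sorted(scores.items(), key=lambda item: item[1])
```

Called with something like burrows_delta(unsigned_review, {"Charlotte": charlotte_texts, "Anne": anne_texts}), it returns the candidates ranked by stylistic distance; the real method, of course, involves far more careful tokenization, culling, and statistical testing than this toy version.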

Bonnett, John, and Kevin Kee. “Transitions: A Prologue and Preview of Digital Humanities Research in Canada.” Digital Studies 1.2 (2009): Web. http://www.digitalstudies.org/ojs/index.php/digital_studies/article/view/167/222.

Bonnett and Kee’s editorial introduction provides a good overview of digital humanities scholarship in Canada. They begin by looking at the past, tracing the first project, a study of the French population of New France, to the Université de Montréal in 1966. The authors then provide brief descriptions of significant projects and scholars across Canada today. (If anyone else is looking at a Canadian project for their critique, this part might be useful for an environmental scan.) Lastly, they address three challenges faced by digital humanists: new platforms that require a change in the work culture, “the topographic revolution” (domains that are multidimensional, dynamic, and autonomous), and the challenge of the aggregation of content and its implications. The authors conclude that digital scholars will constantly work in transition, meaning that “their thought and practice continually remains in a state of transition.”

Brown, John Seely. “Learning in the Digital Age.” The Internet & the University: Forum 2001. Ed. Maureen Devlin, Richard Larson, and Joel Meyerson. Published as a joint project of the Forum for the Future of Higher Education and EDUCAUSE, 2002. Web. http://net.educause.edu/ir/library/pdf/FFPIU015.pdf.

This piece looks at learning as a social process. For the current and next generations, technology is practically synonymous with learning, and colleges and universities have embraced it for the opportunity it presents to create new learning environments that enrich the ways humans learn. Brown examines the ways that teaching and learning have moved into the digital age.

Brown, Susan, et al. “Between Markup and Delivery; Or, Tomorrow’s Electronic Text Today.” Mind Technologies: Humanities Computing and the Canadian Academic Community. Eds. Raymond Siemens and David Moorman. Calgary: University of Calgary Press, 2006. 15-32.

I found this chapter to be a good model of a project critique similar to the ones we are completing for this class. A crucial difference, however, is that the authors of this text are critiquing their own project rather than the work of others and can therefore be quite specific in their analysis. The authors look at the Orlando Project (http://www.arts.ualberta.ca/orlando/), an electronic experiment on the history of women’s writing. They identify the project’s major strength as its large and complex tagging scheme, though they see that same scheme as its chief drawback. For instance, tags exist not only for items such as titles and divisions, but also for attitudes to writing and intertextuality. When discussing the project’s usability, the authors make the important point that academic projects should not be limited to delivering “packets of information,” as is often the case online; they should instead “attempt to foster knowledge” (18). With this idea in mind, the authors propose six improvements to the delivery of the Orlando Project.

Choi, Haeryun, and Joseph M. Pire. “Expanding Arts Education in a Digital Age.” Arts Education Policy Review 110.3 (2009): 27-34.

Choi and Pire discuss the narrowing of the public education system due to an emphasis on “market readiness,” and the consequent decline in the teaching of the arts. They suggest new technology as a way for the arts to “reinvent themselves as pathways of innovation,” which ties into the digital humanities. After describing how technological competency is being encouraged among teachers and academics, they describe a database they worked on, “Rembrandt and Collections of His Art,” to illustrate how linking technologies to the humanities opens new possibilities for the pedagogy of the arts. Most interesting in this review is the discussion of the “digital reconstruction” of Rembrandt’s fragmentary works. This article would relate well to any discussion of the process of recensio written about by Maas in Textual Criticism.

Flanders, Julia. “Trusting the Electronic Edition.” Computers and the Humanities 31.4 (1998): 301-310.

In this article from back in the days when the digital edition was still relatively young, Flanders explores the question of what place images have in electronic editions of print texts, as well as in digital editions created exclusively for online circulation. I mostly thought this could be of interest to us because ECGText is ECGText; we haven’t even considered the idea of illuminated manuscripts, different versions of story-telling tapestries, or anything else of that variety. It may help to envision the social edition in terms of how imagery is represented, in order to think outside the paradigms usually associated with text editing and representation.

Gray, Brenna Clarke. “Using Wikipedia in the Classroom: A Pedagogical Approach.” Paper presented at the ACCUTE conference, Fredericton, NB, May 28, 2011.

In this presentation of her paper at the ACCUTE conference today, Brenna Clarke Gray of Douglas College in New Westminster, BC, discussed her ongoing Wikipedia projects with first-year college students. In the project, she has students research relatively minor or regional Canadian authors who nevertheless have a notable body of work, and then has them create a Wikipedia entry for each author.

Mentioning the students’ extensive participation in online media outside of the classroom setting, she framed the project as a way to channel their regular activities into getting them thinking and writing about literature. Students mentioned that the “pressure” of a Wikipedia entry (a public forum, as opposed to the “closed loop” of a term paper) pushed them into more thorough research than they might do for a paper, and that they experienced great satisfaction at contributing to a body of knowledge that others would use. As well, the Wikipedia “peer review” system (which includes editors who “patrol” certain subjects quite rigorously) brought an outside, objective presence into the evaluation of the students’ work.

For example, an editor known as “Bearcat,” who seems to exercise a certain grumpy authority over Canadian literature, changed, edited, and even deleted a few of the students’ entries until they were suitably revised.

As Gray stated in her lecture, the project was successful in that it engaged with the question of “what tools students are using on the web, and how can these be used or molded to assist in education?” 

Liu, Alan. “Transcendental Data: Toward a Cultural History and Aesthetics of the New Encoded Discourse.” Critical Inquiry 31 (Autumn 2004): 49-84.

This article from 2004 charts several methods of data representation and management that were becoming available at the beginning of the 21st century, including data pours, XML code, and other features of the metaindustrial world. There are a lot of moving parts to Liu’s piece, but a few main points stand out for us. In particular, Liu talks about Discourse network 2000 and its goals of making technological discourse prioritize transformability, autonomous mobility, and automation, all of which work together to enforce Discourse network 2000’s ideology of separating content from its presentation. This segues well from our discussions about materiality, and several possible viewpoints are extrapolated.

Marshall, Catherine C. “Rethinking Personal Digital Archiving, Part 1.” D-Lib Magazine 14.3/4 (2008): Web. http://www.dlib.org/dlib/march08/marshall/03marshall-pt1.html.

This article presents four challenges for personal digital archiving: digital stewardship, distributed assets, value and accumulation, and retrieval for long-term storage. Marshall shows how the very same characteristics that make personal digital assets appealing are also what make digital stewardship burdensome.

Matheson, Jennifer L. “The Voice Transcription Technique: Use of Voice Recognition Software to Transcribe Digital Interview Data in Qualitative Research.” The Qualitative Report 12.4 (Winter 2007): 547-560.

Growing from the theme of tool development for humanities research, this article discusses the evolving technology of voice recognition software and the difficulties in using it faced by transcribers of audio materials. Voice Recognition Software, or VRS, is apparently useful only up to a point, and encounters particular challenges in transcribing interviews (due to the trouble VRS has with processing more than one voice at a time). In fact, using VRS on its own to transcribe interviews is often only slightly more efficient than manual transcription. Matheson has found a way around this by employing a technique that has the VRS work in tandem with a digital voice editor. The bulk of the article explains her technique in detail. This piece offers a detailed example of digital humanities tools at work, but more importantly provides another sample of digital humanists at work: it seems that tools are only meant to carry us so far, at which point creativity and a DIY attitude are supposed to take over to make the tools truly effective. 

McGann, Jerome. “Culture and Technology: The Way We Live Now, What Is to Be Done?” New Literary History 36.1 (2005): 71-83.

McGann criticizes humanities scholars for the “degree of ignorance about information technology and its critical relevance to humanities education and scholarship” among them. He advocates an interdisciplinary approach that treats the digital humanities not only theoretically but practically as well. To make it easier for humanities scholars to start implementing this praxis, McGann describes the Networked Infrastructure for Nineteenth-Century Electronic Scholarship (NINES) project, which is a comprehensive digital resource as well as an advocacy group “to protect the interests and rights of scholars.”

Interestingly, he draws on Bruno Latour’s call for “experience and experimentation” to argue that breaking down “big totalities” like, he insinuates, the non-digital humanities, is a form of social action. He calls for academics to be citizens of the world and to embrace the practical. Funny, though: what he deems “practical” is the digital humanities, which still involves sitting in front of a computer and thinking about stuff! So the invocation of Marx and his proletariat is misplaced, I think. It is also interesting that he encourages journal publication and the peer review process to move online.

Nowviskie, Bethany. “A Scholar’s Guide to Research, Collaboration, and Publication in NINES.” Romanticism on the Net 47 (2007): Web. http://www.erudit.org/revue/ravon/2007/v/n47/016707ar.html.

This article describes the design and scholarly use of Collex, the social software and browsing system that powers NINES, a “networked infrastructure for nineteenth-century electronic scholarship.” Collex allows scholars to search, browse, collect, and annotate digital resources pertaining to nineteenth-century studies. The article includes helpful screenshots in a simple, step-by-step user ‘how-to’ guide.

O’Gorman, Marcel. E-Crit: Digital Media, Critical Theory and the Humanities. Toronto: University of Toronto Press, 2006. Web. 5 June 2011.

O’Gorman describes the Electronic Critique (E-Crit) program at the University of Detroit Mercy, an interdisciplinary attempt to adapt the university to the rise of new technologies and media. Echoing the sentiments of some of the other digital humanities practitioners we have read, the program makes “resistance and vigilant critique” a large part of its underlying theory and “opposes the compartmentalization of knowledge.” The book suggests that a new discourse is forming, one that combines poststructuralist critical theory and technology.

Rettie, Ruth. “Exploiting Freely Available Software for Social Research.” Social Research Update 48 (2005).

This short article by Ruth Rettie explains a number of methods for making both research and writing on a computer more efficient. Rettie first shows how the Microsoft Word autocorrect function can be used to create a type of shorthand when doing qualitative or academic research, and even when writing papers. Her technique involves devising a simple shorthand grammar that can easily be programmed into the autocorrect function, so that many common words take only two or three keystrokes to enter. In subsequent sections, Rettie instructs readers on the usefulness of desktop search engines for poring over data; she makes particular mention of Google, which allows one’s desktop search to be interlaced with an online search. The article finishes with a section on intelligent searching methods, i.e. the kind of techniques employed in advanced searches. I happened upon this article while writing my journal entry on textual standardization; I wanted to see if any theoretical or discursive literature existed on autocorrect. Instead I found this piece, which takes quite a different tack in addressing the technology, but I think it constitutes a great do-it-yourself approach to using basic tools to increase efficiency. The section on search engines is also useful, though the portion on intelligent searching doesn’t provide much information that anyone who’s ever done an advanced search wouldn’t already know.
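As a loose analogy to Rettie’s autocorrect shorthand (her actual technique lives entirely inside Word’s autocorrect settings, not in code), here is a small Python sketch of the same idea; the abbreviation table is invented for the example.

```python
import re

# Invented shorthand table, analogous to entries one might add to Word's autocorrect.
SHORTHAND = {
    "dh": "digital humanities",
    "qr": "qualitative research",
    "ms": "manuscript",
}

def expand(text, table=SHORTHAND):
    """Replace whole-word shorthand tokens with their expansions."""
    def replace(match):
        word = match.group(0)
        return table.get(word.lower(), word)
    return re.sub(r"\b\w+\b", replace, text)

print(expand("The dh project digitizes each ms for qr."))
# -> The digital humanities project digitizes each manuscript for qualitative research.
```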

Rydberg-Cox, Jeffrey A. “Helping Readers Understand Scholarship.” Digital Libraries and the Challenges of Digital Humanities. Oxford: Chandos Publishing, 2006. 51-69.

In this chapter, the author considers the stage following the creation of a digital humanities project – when users access the project – and focuses on tools that can be included to make projects easier to use. He believes that technologies already widely used commercially should also be integrated into digital texts to help readers understand them. He discusses the use of parsers and tags, keyword extraction and analysis, information retrieval (query expansion, visualization, and multilingual searching), and the linking of similar passages.

When considering potential changes that could be made to ECGText, I found that the query expansion technique would be helpful. It takes a user’s initial search for information and tries to automatically connect it with related terms. For instance, this technique could be interesting for collating synonyms. I also like the author’s suggestion of representing grouped information visually rather than in a standard list format.
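To see what query expansion might look like in practice, here is a toy sketch of my own (not Rydberg-Cox’s code); the synonym table and documents are placeholders, and a real system would draw its related terms from a thesaurus or from co-occurrence statistics rather than a hand-made dictionary.

```python
# Toy query expansion: widen the user's terms with related terms before searching.
SYNONYMS = {
    "letter": {"correspondence", "epistle"},
    "author": {"writer", "novelist"},
}

DOCUMENTS = [
    "An epistle from the novelist to her publisher",
    "Notes on the weather in Haworth",
]

def expand_query(terms):
    expanded = set(terms)
    for term in terms:
        expanded |= SYNONYMS.get(term, set())
    return expanded

def search(query):
    terms = expand_query(query.lower().split())
    return [doc for doc in DOCUMENTS if any(t in doc.lower() for t in terms)]

print(search("letter author"))
# -> ['An epistle from the novelist to her publisher'], found even though
#    neither original query term appears in the document.
```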

Sansing, Chandler. “Case Study and Appeal: Building the Ivanhoe Game for Classroom Flexibility.” TEXT Technology 2 (2003): 43-52.

This article discusses one specific example of how a renowned digital humanities tool, the Ivanhoe Game, can be adapted to suit any audience, even sixth graders. The Ivanhoe Game is meant to illustrate the possibility of multiple, collaborative interpretations of a piece of literature. To make a move in the game, users write the move themselves, playing the role of one of the people involved in the chosen book, from characters in the story itself to players in its production, including editors. After describing how the game works and summarizing its pedagogical benefits (which include strengthened critical thinking skills), the author outlines the problems of running such a game at a lower academic level; namely, that middle school students are not interested in narrative and critical theory. However, the author concludes that the game is still valuable in this context, especially in terms of classroom-specific pedagogical goals rather than the critical theoretical concerns that generally surround the Ivanhoe Game: the students still learned story structure, characterization, collaboration, and public speaking, among other more abstract skills.

Schreibman, Susan. “Computer-mediated Texts and Textuality: Theory and Practice.” Computers and the Humanities 36.3 (2002): 283-293.

This article is more about digital research than digital archiving, but that may make it a good point of reference (or at least a decent point of interest) when we are undertaking our project critiques. Schreibman’s essay could serve as a reference when evaluating digital research tools, as it describes the major theories involved in creating evaluation criteria (though more than one viewpoint is espoused, and some appear at first glance to be a tad contradictory of one another).

Trillini, Regula Hohl, and Sixta Quassdorf. “A ‘key to all quotations’? A corpus-based parameter model of intertextuality.” Literary and Linguistic Computing 25.3 (2010): 269-286.

I was inspired by Kalervo’s journal entry on multiple editors and editions (posted May 12th) to think about intertextuality and the problems (and opportunities!) it poses for digital editing. This very technical article problematizes the contradictory definitions of stylistic methods of intertextuality such as quotations or allusions, and proposes a digital humanities resource that enables the user to dynamically categorize the intertextual text. Using the database HyperHamlet, the authors illustrate the advantages of such a comprehensive electronic analysis, which include a clearer rendering of the different parameters for intertextuality that does not compromise their independence but shows how they interact.

Wulfman, Clifford E. “The Perseus Garner: Early Modern Resources in the Digital Age.” College Literature 36.1 (2009): 18-28. Web. http://0-lion.chadwyck.com.mercury.concordia.ca/searchFulltext.do?id=R04154813&divLevel=0&area=mla&DurUrl=Yes&forward=critref_ft.

This article discusses the Perseus Garner, a digital library of primary and secondary materials from the early modern period in England. Rather than building just an archive of e-prints, the creators of the Perseus Garner understood their project as a commentary on the new ways of understanding literature that computational technology allows: asking different questions, making different links. They are very conscious of the markup on their textual resources, which they standardize according to the TEI. After discussing questions of freedom from editorial bias and the proper amount of annotation, Wulfman suggests the solution of a hypothetical “hypervariorum,” a huge hypertext that gathers every source of secondary criticism on a particular author at once and “substitutes the rhetoric of [editorial] judgment with the rhetoric of [reader] choice.” His theory is interesting, but I suspect it is a bit idealistic.
