Jerome McGann's work explores the convergence of traditional textual analysis and modern digital technologies. He discusses how theoretical views on textuality, once deemed impractical, have found new relevance in the digital age.
Radiant Textuality by Jerome McGann opens with an explanation of its own narrative: “how certain theoretical views of textuality once considered weird, impractical, and unserious discovered their moment of realization in the digital world of the late twentieth century.”
Jerome McGann
https://www.engl.virginia.edu/people/jjm2f
http://www2.iath.virginia.edu/jjm2f/
Traditional Humanities Vs. Digital Humanities
Resistance is futile; the struggle isn’t between books and computers but among people. “From now on scholarship will have both. The question is— the choice is— whether those with an intimate appreciation of literary works will become actively involved in designing new sets of tools for studying them.” (p. 186)
The Book as a Machine of Knowledge
The differences between an original manuscript and its later typographical settings become apparent only through the kind of textual analysis that humanities computing makes possible.
From the book, to the codex, to the electronic meta-book
The Rewards of Failure
Alas, the coming of humanities computing does not bring with it the death of the book; it brings only a new form of book technology.
“The algorithmic character of traditional text”: “text generates text.” There are codes and mathematical properties extant in traditional text, but we need humanities computing to analyze them. (p. 151)
Points of Departure
The work at IATH (the University of Virginia’s Institute for Advanced Technology in the Humanities) is considered to bookend the first “distinct phase in the history of humanities computing.” (p. 3)
http://www.iath.virginia.edu
A Hypermedia Research Archive: “The Complete Writings and Pictures of Dante Gabriel Rossetti.” A research project “built under the auspices of the University of Virginia’s Institute for Advanced Technology in the Humanities (IATH).” (p. 3)