I had the opportunity to meet Michael Ullyot, assistant professor of English at the University of Calgary, last week when we both attended the annual Digital Humanities Summer Institute (DHSI) at the University of Victoria. While I spent the week trying to wrap my head around data analysis, Michael was diving into the fundamentals of text encoding, eager to get his hands on some EEBO-TCP texts and start experimenting!
He’s written a really useful blog post reflecting on the process and implications of text encoding, and on how computers and humans can work together both to create these texts and (sometimes!) interpret them. Here’s an excerpt:
My project this past week was to learn the language of these tags, so I could overlay my readings and interpretations on TCP’s already-encoded files — so I could, more precisely, add my tags to theirs. I began with an old favorite: Thomas Heywood’s elegy for Henry, Prince of Wales (d. 1612). Since TCP had already marked the stanzas, lines, and emphasized words (among other elements), I tagged references to historical figures, places, and motifs.
It’s that last category that got me thinking about tagging text as formalizing vs. enabling interpretation. To begin, how much of encoding is formalizing things you recognize, making explicit what would otherwise pass unnoted? How objective, or definitive, is it?
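In TEI terms, the kind of overlay Michael describes might look something like the sketch below. This is purely hypothetical: the element names (`lg`, `l`, `hi`, `persName`, `placeName`) are standard TEI P5, the verse lines are invented placeholder text rather than Heywood's, and TCP's actual markup may differ in detail.

```xml
<!-- Structural markup of the kind TCP already provides:
     a line group (stanza), verse lines, and emphasis -->
<lg type="stanza">
  <!-- Interpretive tags (persName, placeName) added on top
       of the existing structural encoding -->
  <l>Mourn now for <persName>Henry</persName>, <hi rend="italic">Prince of Wales</hi>,</l>
  <l>whom all of <placeName>Britain</placeName> here laments.</l>
</lg>
```

The layering is the point: the structural tags and the interpretive tags coexist in one file, so a later reader (or program) can attend to either level independently.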
The whole post is worth a read, so do check it out! We at the TCP look forward to hearing more from Michael as his work progresses.