I was pleased to attend the annual meeting of the Society for Cinema and Media Studies recently, where I organized a workshop on the topic of “Digital Humanities & Media Studies: Exploring the Intersections.” The workshop deliberately built on last year’s session, organized by Miriam Posner and Jason Mittell, on “DH & Media Studies: Staging an Encounter” and on MediaCommons’ Front Page Survey Question from April 2013: “What are the differentiations and intersections of media studies and the digital humanities?” (You can read my answer to that question in my response “The Boolean Logic of the Digital Humanities.”)
Whenever discussing the ways in which digital humanities intersects with any discipline, my preference is to root the discussion not in the abstract but in specific examples, embedding our theorization of that work in concrete projects. I was very happy, then, to be joined by representatives of four projects that operate in this space where DH meets media studies. All four workshop participants have received support for their work through an NEH Digital Humanities Start-Up Grant. Participants included Anne Balsamo (Dean of the School of Media Studies at the New School for Public Engagement; AIDS Memorial Quilt Digital Experience Project), Dene Grigar (Associate Professor and Director of The Creative Media & Digital Culture Program at Washington State University Vancouver; Pathfinders: Documenting the Experience of Early Digital Literature), Eric Kaltman (graduate student in Computer Science at UC Santa Cruz’s Expressive Intelligence Studio; Preserving Cultural Software, with Noah Wardrip-Fruin, Henry Lowood, and Christy Caldwell), and Lauren Klein (Assistant Professor in the School of Literature, Media, and Communication at Georgia Tech; TOME: Interactive TOpic Model and MEtadata Visualization).
I opened the panel with the following remarks. [Note that the content that follows reflects my opinion and should not be taken as official NEH policy.]
Digital Humanities and Media Studies: Exploring the Intersections
Thank you for joining us here today, on the final day of the conference. I’m very happy to have here this wonderful group of workshop participants who represent points of intersection between media studies and this eclectic set of activities that has been largely gathered under the rubric “digital humanities.” This panel, and its subtitle “Exploring the Intersections,” deliberately builds on a panel organized last year by Miriam Posner and Jason Mittell (that one was entitled “Digital Humanities and Media Studies: Staging an Encounter”).
Posner opened this previous session by way of Johanna Drucker, who asserted that “We must theorize digital technology through critical engagement with the medium itself, through making and breaking and building and reflecting,” an observation that mirrors Tara McPherson’s own calls for the “multimodal scholar,” one that is likely drawn from the ranks of media studies, because, as she writes, “Who better to reimagine the relationship of scholarly form to content than those who have devoted their careers to studying narrative structure, representation and meaning, or the aesthetics of visuality? Who better to address the utopian registers of much popular commentary on technology than historians of media and scholars of political economy?”
Posner closed her remarks last year with the following: “I urge us to see this as an opportunity to draw on those qualities at which media studies excels…and to ask what they can bring to the digital humanities.”
Our guiding subtitle today–exploring the intersections–is also meant to suggest that: firstly, digital humanities has no clear definition, nor is its history or origin to be found in a single discipline or approach, and secondly, that DH might best be considered, as many have suggested, as a community–or better yet, communities–of practice, which constitute a fairly broad “possibility space” of intersecting points along three vectors (following Etienne Wenger):
with overlapping domains of expertise and knowledge bases across the different disciplines;
overlapping communities of scholars — the actual people and their interaction, be it online, on campus, or in shared convention halls;
and, overlapping practices–the variable methodological and theoretical approaches that comprise ways of doing research.
Are you a tweeting, social network analyzing media scholar from the SW, with memberships in SCMS and AoIR and HASTAC? Or, are you a geospatially-oriented film scholar from a NE liberal arts campus with membership in SCMS, MLA, and CAA, with a love of Facebook, a disdain for Twitter, and an irrational fear of blogging? As Katie King argues, “movement among knowledge worlds require understanding authorships, audiences and agencies in ways that keep redrawing forms of inclusion and exclusion, virtually moment to moment,” and I believe this informs the ways that the phrase ‘digital humanities’ can feel either inclusive or alienating at any given time to any given person engaged in work along these multiple vectors in this possibility space.
With so many concerns about what counts, who’s in and who’s out, and so on, I think it reassuring that this formation of a “possibility space” is a process of dynamic iteration that should open possibilities rather than foreclose them. For better or worse, an arrangement of these vectors often manifests most readily in the form of a project, which serves as a kind of defining entity of DH, containing the domains, collaborative groups, and practices that inform them. Scholars and their projects in media studies that share one very particular kind of vector — support through NEH funding — include some of the following:
MediaCommons was one of our earliest Start-Up Grants and is itself a sort of gathering of intersecting vectors: a social network supporting a community of practice, and offering mechanisms like the Survey Questions for theorizing the different inflection points shaping media studies. Of course, MediaCommons became a model for recasting online scholarly communities and methods of publication, offering a legacy that traces down to sites like MLA Commons.
Project Arclight, funded through a Digging into Data grant, and presented here yesterday by Eric Hoyt.
And also presenting yesterday, Michael Casey (Bregman Music & Audio Research) and Mark Williams (Film and Media Studies) from Dartmouth with the ACTION toolkit for cinematic information retrieval.
Scalar and Vectors, which received programming support under the auspices of three different Institutes for Advanced Topics in the Digital Humanities (Tara McPherson & her team remain, I think, the group that has received the largest sum of funding from the Office of Digital Humanities).
Grant support has been provided to those studying the history of various types of film, like Kevin Hamilton’s (University of Illinois) project: Online Video Archive & Prototype Interface for America’s Nuclear Test Films.
We’ve been actively engaged in exploring DH as a possibility space. Media Systems, a workshop jointly supported by NEH, NEA, NSF, and Microsoft Research, was held last year at UC Santa Cruz, led by Noah Wardrip-Fruin and Michael Mateas, in order to address this very problem of fostering a productive, overlapping community of practice centered on computational media. They brought in developers of technical systems, media scholars that study those systems, and artists/designers who create using them. You can read the final report online—newly released this week, so I encourage you to take a look.
There are a number of funding opportunities that support the study of media and film. You’ve heard about several DH Start-Up Grants, but other funding sources include the DH Implementation Grant program, Digging into Data, Fellowships & Summer Stipends (for that article or book project), Media Projects (for documentary films and radio), and the new Digital Projects for the Public.
In fact, we’ve been doing a little digging and have found several projects that reach back into some of NEH’s earliest history that support similar kinds of media work – this is a McBee record card from 1974, detailing a grant to Andy van Dam (Brown University) in “computer/film” for “an experimental program to teach a college-level English poetry course, utilizing a new form of computer-based ‘manuscript,’ called a hypertext.”
Today, we have four scholars who have worked under the auspices of digital humanities, inasmuch as they received funding for projects from the Office of Digital Humanities, although I’m not sure many of them would readily identify themselves first-and-foremost as “DH’ers” per se (which, by the way, is not a requirement for receiving a grant); rather, they found themselves in this possibility space of intersecting vectors.
First, we’ll hear from Anne Balsamo. She is the Dean of the School of Media Studies at the New School for Public Engagement. Her recent book, Designing Culture: The Technological Imagination at Work (Duke, 2011) examines the relationship between culture and technological innovation. Previously she was a Full Professor at the University of Southern California with joint appointments in the Annenberg School of Communication & the Interactive Media Division of the School of Cinematic Arts.
Dene Grigar is an Associate Professor and Director of The Creative Media & Digital Culture Program at Washington State University Vancouver who works in the area of electronic literature, emergent technology and cognition, and ephemera. She is the author of net art works, multimedia performances and installations, and nonfiction mobile projects like “Fort Vancouver Mobile.” She is President of the Electronic Literature Organization and Associate Editor of Leonardo Reviews.
Eric Kaltman is a graduate student in Computer Science at UC Santa Cruz’s Expressive Intelligence Studio. He was a member of the Preserving Virtual Worlds II project team, which investigated the archival significant properties of computer games. From 2008 to 2012 he was a project archivist for the digital games collections at Stanford University, cataloging and archiving the Cabrinety Collection and Steve Meretzky’s Infocom Papers. He is currently a member of an IMLS-funded project focused on videogame metadata and citation practices.
Lauren Klein is an assistant professor in the School of Literature, Media, and Communication at Georgia Tech, where she also directs the Digital Humanities Lab. Her writing has appeared in American Literature, American Quarterly, and In Media Res. The recipient of a NEH Digital Humanities Start-Up Grant, she is at work on a tool that will allow scholars to visualize text-based archives, as well as a book on the cultural history of data visualization from the eighteenth century to the present day. (This visualization is based on her article “The Image of Absence: Archival Silence, Data Visualization, and James Hemings,” which was published in the December 2013 issue of American Literature.)
I’ve asked each of the workshop participants to speak for around 7-8 minutes, and I hope what follows is a conversation about both pragmatic and theoretical aspects of ‘doing DH,’ whatever that might mean.
What followed was a lively conversation about both the pragmatics and the theory informing each of these DH projects, the possible value of using DH to approach media objects, and the many ways that media studies can inform and challenge our understanding of the digital humanities. I’m grateful to the workshop participants and the audience for a dynamic and exciting two hours.
Recently I gave the following talk as part of the 2014 Game Developers Conference panel: U.S. National Investment in the Future of Games? The panel was arranged by Noah Wardrip-Fruin (associate professor of computer science and co-director of the Expressive Intelligence Studio at UC Santa Cruz), and other participants included William S. Bainbridge (Program Director for the National Science Foundation) and Elaine Raybourn (Principal Member of the Technical Staff in Cognitive Systems at Sandia National Laboratories, on assignment to the Advanced Distributed Learning Initiative, Office of the Deputy Secretary of Defense). You can read Noah’s framing remarks here.
In an earlier session for the GDC Education Summit, I provided a review of many of the games-related projects that had been supported by NEH over the years, but as this panel’s audience was more likely to include a stronger mix of industry, I wanted to speak a bit more broadly about the kind of considerations that might go into a humanities-centered project, and the challenges of working with humanities material. [Note that the content that follows reflects my opinion and should not be taken as official NEH policy.]
Gaming & the Humanities: a description, a challenge, an appeal (& a rocket cat)
With my ten minutes, I want to talk about gaming and the humanities, and in doing so offer three things: a description, a challenge, and an appeal.
The description is this: I work in the Office of Digital Humanities, which is a small office at the National Endowment for the Humanities (NEH). NEH is an independent federal research funding agency for humanities subjects like history, philosophy, art history, and archaeology, and the study of media like film, TV, and (yes) video games. (We are to the humanities as NSF is to the sciences, with some crossover in subjects like history of science, anthropology, archaeology, and so on.)
NEH gives grants to nonprofits, universities, museums, and other cultural organizations for all kinds of work, and those organizations in turn often collaborate with developers, documentary film makers, designers, and programmers in developing these projects.
Some of the scholarship we fund matches the kind of ideas you might have of literature scholars or art historians: grants to write books on Shakespeare or to digitize a rare manuscript in an archive, like this 1607 German manuscript with rocket cats, held by the Folger Shakespeare Library. Apparently there are several of these images in 16th- and early 17th-century manuscripts: cats and birds with a flaming jetpack of hate attached to their backs, used to infiltrate and burn villages and castles. At least, that was the idea [see more about rocket cats at Atlas Obscura].
Here’s another one (this one from Penn State).
And we fund some pretty innovative work, and weird collaborations, like Egyptologists and cardiologists using large-scale data analysis and MRIs on all the world’s mummies to better understand heart disease and the cultural practices of ancient Egypt. We have wide-ranging interests.
If you want to learn more about those interests and the kinds of things we fund, I encourage you to visit our website at www.neh.gov/odh.
One of those interests is games. Among our more successful ventures is Metadata Games, from Mary Flanagan at Tiltfactor and her team of designers and archivists. Metadata Games is helping crowdsource metadata for huge amounts of digitized cultural heritage — pictures like these — making them in the process more discoverable. Mary received both a Start-Up Grant and an Implementation grant from the Office of Digital Humanities to support this work.
We’ve also dipped our toe into funding some educational games. With the Corporation for Public Broadcasting, NEH helped support Mission US, developed by a team at WNET (NY Public Media), a series of episodic adventure games on topics like the events leading up to the American Revolution,
or the flight from slavery and the 1850 Fugitive Slave Act.
Mission US was supported through our Division of Public Programs, which is about to release a new program–Digital Projects for the Public–that can support games that inform the public about humanities topics. There’s an informational sheet in the back if you’re interested in this new program.
The Office of Digital Humanities—where I work—has funded a variety of game and game-like experiments through a program called Digital Humanities Start-Up Grants, which encourages early-stage prototyping. These grants are mostly small, $30,000–$60,000, meant to experiment, try out new things, take some risks, see what works and what doesn’t, and share your results. I’ll talk about a few of them in a minute.
But that’s the description of what NEH is and does. And in that description, you likely picked up on some of the things that are important for scholars to do their work. They use primary documents–old manuscripts like that rocket cat–but documents can come in the form of oral histories, or novels or poems or old newspapers or even financial or shipping records. With that evidence and their own knowledge of events over time, scholars piece together the best understanding they can of the historical record. I’m sure to many of you this sounds fairly familiar.
It’s a quest, of sorts.
And so here’s the challenge: building games about humanities topics–games that really draw on the historical record, that show an interpretation of history, or literature, or art — is really kind of difficult. The humanities are messy, and the historical record is almost always going to be incomplete, as histories — and stories — often are. But we try to use this knowledge to understand how things work on a macro scale and on a micro scale. Let me give you an example.
This is the Trans-Atlantic Slave Trade Database, developed by a team at Emory University led by David Eltis, a historian, and Martin Halbert, a librarian. It represents years of labor: the creation of the datasets that inform this project started in the 1960s, and the work has involved a large team of developers and historians, and decades of scholarship.
And they’ve created a pretty good record of how the slave trade worked, where the ships went, and from what ports,
but they’ve also been able to start reconstructing who these people– slaves taken from their home, shipped overseas–who they actually were, at least as best they can, recorded here in the African Names Database. This is the macro and the micro, the broad idea and the individual person. How do you translate this into a game environment, into a series of mechanics, a set of rules?
The whole of it, not just the movement of people (this broad idea of migration ….
… and movement, the journey, which is so important a theme in the humanities and in games),
but also the complicity of it, the suffering of it, the confinement of it (contrasting moments of mobility/confinement also so important in the humanities, an idea so elegantly captured in Train)…
the sorrow of it (explored thoughtfully in Terry Cavanagh’s retelling of the legend of Orpheus and Eurydice)?
How do you tell the story with sensitivity and respect to the historical record,
but also with sensitivity and respect to the individual lives involved?
As you can see from the images, there are efforts and models out there, not just something like Oregon Trail, from which we all learned about migration and dysentery, but also Journey, and Train, and Don’t Look Back, and even the historically informed Advanced Squad Leader and other wargames, which generate so many volumes of historical written record to inform their counterfactual play. But balancing the counterfactual and historical record, the macro level and the micro, the big picture and the individual story is really what I feel is the challenge of gaming in the humanities.
Through the Start-Up Grant program, we’ve also funded small experiments, such as telling one girl’s story of living in Japanese American internment camps in the Jim Crow South during World War II, through Emily Roxworthy’s (UCSD) Drama in the Delta.
Or this recent transmedia project that received a small Start-Up Grant—it’s just getting started–called Traveling While Black — a collaboration between filmmaker Roger Ross Williams, game studio Playmatics, and a deep advisory board of historians and scholars. It aims to explore what it meant to travel in the Jim Crow South when segregation meant that black travelers often couldn’t find a hotel, or a place to eat, or a place to buy gas, or even a place to use the bathroom.
It centers on an extraordinary textual document, The Negro Motorist Green Book, published from 1936 to 1966 by Victor Green, a mailman and travel agent in New York. For more than thirty years, this book provided a list of safe havens and warned of dangerous locations.
These are just some initial steps. It’s a huge challenge, and not one that I believe either humanities scholars or game designers will be able to tackle alone. I’m watching with keen interest the work of designers like Tracy Fullerton, with Walden and Night Journey,
and Navid Khonsari’s development of the first episode of Revolution 1979. Both are drawing on cultural contexts, textual evidence, and scholarship to inform design.
So that is the challenge–one that I believe can only be truly addressed collaboratively, drawing on mutual expertise, and growing those rare talents that operate in this hybrid space. I promised a description, a challenge, and an appeal. Part of my appeal is this: if you’re interested in these challenges, or if you’re a person who works in these hybrid spaces, I’d love to hear from you. I’ll note that we often ask game designers who have knowledge of the humanities to serve as reviewers for grants, and even more generally I’d be grateful to have your feedback on how we might address challenges like these with national funding. I also encourage you to consider our grant programs if you’re working as–or with–a nonprofit or at a university.
To conclude, my final appeal is this: save the rocket cats.
What do I mean by this? The games being produced, and those that have already been produced, comprise an amazing corpus of cultural heritage material, and this includes the records of how you’re figuring out really difficult and fascinating challenges and creating genres of play. Scholars now and in the future will want to know how that was done: the ways you approached the problem, the methods you tried, the scholarship you used to inform your game, just as they try to understand the form and content of a novel by William Faulkner or Zora Neale Hurston — not just the final novel, but the notes and papers and letters, all the contextual material that shaped it. That’s how scholars create an interpretative framework.
While I know some companies and game labs have local archivists, I also know — mostly through the amazing work done by the Preserving Virtual Worlds project (funded by the Library of Congress and the Institute of Museum and Library Services) — that most do not. And even if the executable file of your game remains available, I’d like to leave you with a thought from literary scholar Matt Kirschenbaum, who has been working on born-digital preservation for a while. In talking about software, he argues the following: “Software is the product of white papers, engineering specs, marketing reports, conversations and collaborations, intuitive insights and professionalized expertise, venture capital (in other words, money), late nights (in other words, labor), Mountain Dew, and espresso. These are material circumstances that leave material traces…”
These are your archival materials, as much as the game itself.
And the lesson of the rocket cat is that you don’t know what hidden Easter egg, what manuscript, what “material trace” will cause a future scholar’s heart to swell with joy, or what trends or insights scholars and enthusiasts will find when they take a look at your work with the eye of a critic, a historian, an anthropologist, a literary scholar, an art historian.
The humanities employ a wide array of methods, and just as they would dig into the archives to understand the slave trade, or use MRI scans of mummies to understand ancient Egypt, or design a game to understand the complex racial relations of an internment camp placed in the deep South,
so too do scholars, even now, want access to the archives, tools, code, memos, and especially, when possible, game analytics. There may well be all sorts of legal and competitive reasons that one can’t share all of that, but I appeal to any of you who have access to or control of these materials to share what you can and, even more importantly, to work with scholars and librarians to preserve the rest in a dark archive until you can share it. What you do is valuable, and our mission at the NEH is to help preserve and interpret our shared, valuable cultural heritage.
[The following was posted originally as a solicited response to MediaCommons’ Survey Question: What are the differentiations and intersections of media studies and the digital humanities? on April 18, 2013.]
Step back in time with me, if you will, just twelve years. In April of 2001, the Society for Cinema and Media Studies (SCMS) was simply known as the Society for Cinema Studies; the organization did not formally adopt “and Media” into its name until 2002, a simple Boolean operation that formally institutionalized more than a decade of active interdisciplinary growth within that organization. In that same April of 2001, as we’re told in a few versions of the origin story, John Unsworth, Susan Schreibman, and Ray Siemens were just beginning conversations with the acquiring editor for Blackwell Publishing for what would later be entitled the Companion to Digital Humanities, chosen only after discarding — through a different Boolean operation — the alternatives: NOT humanities computing; NOT digitized humanities.
Humanities computing was the prevalent term in 2001, and it too had its own sort of logic at work, an elaborate Venn diagram of digital libraries and archives, linguistics, and other computational methods. At the University of Maryland, where I was a PhD student in the English department at the time, the two phrases at play were “humanities computing” and “digital (media) studies,” with the former most often referring to the creation of archives and tools, and the latter to the study of electronic literature, videogames, and the changing face of cinema. Our colleagues over in American studies were engaged in “Constructing Cyberculture(s).” This was the title of their local 2001 conference, which I remember David Silver opening with remarks about the shape of this growing field where scholars were grappling with performance theory and Internet protocols, videogames and critical race theory. We all swapped articles and debated terms.
At that time, we were just barely into our second year of an NEH challenge grant to form the Maryland Institute for Technology in the Humanities (MITH) under the leadership of Martha Nell Smith. MITH wasn’t a DH center (that phrase wasn’t popularized yet, although we had plenty of good nearby examples at places like UVA, Brown, George Mason, and elsewhere), but rather a “new technology center in the university library.” Scholar-fellows came from all over the College of Arts and Humanities: women’s studies, American studies, ethnomusicology and comparative literature. They weren’t “digital humanists” (no such thing existed either, really). They were media scholars and literary historians. Feminists and formalists. Filmmakers and textual editors. Like most, I suspect, they were looking for ways to engage technology to enhance their scholarship and teaching, sifting through possible methods and technologies, all while theorizing the shifting landscape of cultural (and academic) production.
It’s within the context of these dozen or so years that I’d like to foreground the intersections of digital humanities and media studies. Boolean logic is a relatively straightforward series of choices (AND, OR, NOT) that can generate complex results; it’s also a method that can control fields and establish taxonomies. A lot of recent conversation about the digital humanities has focused on how it should be defined, how it is institutionalized, and what it excludes. To be sure, definitions can be useful, but all too often they are seen as acts of foreclosure or negation, a movement to capture a certain present, often for strategic impact, and often obscuring messy histories and generative futures.
Instead of focusing only on defining DH, as though we can come to a single result from a complex Boolean query, I’d like to suggest that we also consider the practice of DH as a recurring process of refining. Boolean logic presumes winnowing and filtering, but as any scholar who has spent a few hours in the library knows, it also presumes iteration. The value in Boolean logic is that it allows us to start with some basic principles and come to very different results of equal value. How else to explain that digital humanities can describe the use of lasers and helicopters to investigate Maya civilization, on the one hand, and the study of game software as cultural artifact, on the other? The messy histories remind us that DH is a term in its relative infancy deployed — yes, strategically, tactically, rhetorically — to encompass a broader set of traditions that themselves have complex backstories threaded through a host of disciplinary backgrounds and, importantly, institutional types: not just universities, but galleries, libraries, archives, and museums (the GLAM quartet), small historic homes and historical societies.
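The refinement process described above can be made concrete with a toy sketch: the same Boolean operators (AND, NOT), applied iteratively to the same small data, yield different but equally valid results. The scholar names and tags below are purely illustrative, invented for this example; only the AND/NOT mechanics come from the discussion above.

```python
# Hypothetical "communities of practice" tagged by domain interests.
scholars = {
    "archivist": {"libraries", "metadata", "preservation"},
    "film_scholar": {"cinema", "visual_culture", "narrative"},
    "game_scholar": {"software", "narrative", "preservation"},
}

def query(include, exclude=frozenset()):
    """Return scholars whose tags contain all of `include` (AND)
    and none of `exclude` (NOT)."""
    return {
        name for name, tags in scholars.items()
        if include <= tags and not (exclude & tags)
    }

# Iterative refinement: each pass narrows or reslices the same data.
preservationists = query({"preservation"})            # AND preservation
non_library = query({"preservation"}, {"libraries"})  # ... NOT libraries
storytellers = query({"narrative"})                   # a different slice
```

Each call is one pass of winnowing; rerunning with different operands produces a different, equally legitimate community, which is the point of treating DH as refinement rather than a single fixed query result.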
When people ask me now how I define DH, I answer only: “broadly.” If I’ve learned anything in reviewing thousands of grant applications in the digital humanities, it’s that I could never sufficiently define the term to suit all disciplines and institutional profiles. However, many of the overlapping interests of media scholars and digital humanities practitioners tend to be those same institutional concerns that permeate our contemporary academic culture: changes in scholarly communication practices; the positive and negative effects of IT infrastructure on teaching and research; the study of computational forms and objects and their influence; the possibilities enabled for new knowledge through joined collections and increased access to data; the creation of tools to search, collect, mine, and visualize; fostering collaborations across disciplines and institutions.
Media scholars are particularly well-positioned to challenge assumptions we might make when it comes to the software and media that undergird much of this kind of DH work, which is one reason why ODH encourages grant submissions that focus on the history, criticism, and philosophy of digital culture and its impact on society. It’s worth noting that this emphasis is not new to the agency. In April 2000, NEH released a report (PDF) that not only committed “to ensuring that intellectual and cultural content in the humanities is available in digital form for our nation’s citizens,” but also emphasized that the agency has “an important role to play in supporting projects that will examine and interpret the historical and cultural impact of this technology.” This report served as one of the early documents that shaped the eventual development of the Office of Digital Humanities several years later.
Given the broad range of institutional types and disciplines, media scholars have been actively represented in the formation of digital humanities work as reflected in the Office of Digital Humanities’ list of funded projects (which, I should note, is just one of many measures of what constitutes DH). In 2008, in one of our earliest sets of awards, we funded a start-up called MediaCommons to explore innovations in “peer-to-peer” review. Since then, ODH has funded platforms for film analysis; institutes for multimodal scholarship; software for cultural analytics that has been used to examine manga and computer games; investigations of how scholars can access born-digital materials in the archives; and processes for archiving born-digital materials like computer games. This is a partial list (you can explore the full list here), and it’s only a fraction of the much longer history of digital humanities at NEH (most of which happened before ODH even existed, or before “digital humanities” as a phrase was in vogue).
I’m not sure how many of our grantees would self-identify (without some reservation) as primarily “digital humanists.” I suspect they would identify first with their home discipline, not so different from the scholar-fellows at MITH from a dozen years ago, who looked to add to their knowledge base and their methodological approaches. In that respect, I like to think that DH, taken broadly, operates as a kind of Boolean composition — a process of invoking and refining combinations of disciplines, methods, subjects, and theories to investigate research questions of interest. Few people actually just “do DH.” Rather, they topic model feminist texts, or analyze the social network of art dealers in 19th-century Europe, or visualize videogame speed runs, or use helicopters and lasers to do digital archaeology. Some code while others interpret code. Some create archives, or digital scholarly editions. Some build tools and others theorize them. Overall, however, you’ll notice there are relatively few digital humanities efforts — even collaborative, interdisciplinary ones — that do not, in some way, carry forward the traditions, theories, and practices of home disciplines. In short, the humanities AND…
[Note that the following post was originally published on April 1, 2013. Cross-posted with Day of DH 2013 site]
The Office of Digital Humanities (ODH), a grant-making office for the National Endowment for the Humanities (NEH), announced today that a prototype for a formal definition of “digital humanities” is currently undergoing testing and would be released “soon.” ODH director Brett Bobley stated, “Given the contentious debates over the definition of ‘digital humanities,’ we thought it would be better to address the issue sooner rather than later.”
The task for completing the definition was assigned based on government procurement standards. An RFP was announced last year, with top bids reportedly coming in from Oxford University Press (publishers of the Oxford English Dictionary), an anonymous hive editorial team submitted through the shadow group “the Wikipedia Foundation,” and Donald Trump, revealed a source who spoke on condition of anonymity because of the privacy surrounding government procurement procedures. However, to the surprise of some, the bid was given to government defense contractor Lockheed Martin. “We are delighted to receive this contract and keep our workers on the job despite this period of austerity,” stated a Lockheed spokesperson, who further noted that “the digital humanities seems to be where you can find all the jobs these days.”
The definition of digital humanities is currently in “a testing phase,” noted Senior Program Officer Jennifer Serventi. She recently visited Lockheed Martin’s “Theory Tunnel,” where the development team was checking the definition’s “theorydynamics.” “We’re trying to make sure the definition responds equally well under different theory conditions, and that no one theory creates too much drag,” explained one unnamed senior theory scientist. “Right now we’re working on psychoanalysis, which is just bringing the definition to a full stop,” revealed a source who wished to remain anonymous due to the sensitivity of the testing. According to the source, psychoanalytic drag is causing the definition to adopt a “weird Da Vinci-style mirror script, which is obviously problematic with the definition in its infancy … we’re hoping to get past this ‘mirror stage’ soon.” Unsurprisingly, humanities disciplines began lobbying for their individual methodological approaches once word of the definition leaked late last month. “We’ve had some issues with security,” noted Senior Program Officer Perry Collins, who thanked the library community for standardizing security procedures yesterday. Jason Rhody, also a Senior Program Officer, revealed that just last week before the new security protocols took effect, a small group of game studies scholars collaborating with linguists “Zerg-rushed the definition… it took us three days to remove all the prepositions.”
Once the definition is finished, it will be copied and delivered to the Office of Digital Humanities, where it will be placed in the Office’s official vault. The original version of the definition will be transported to the National Institute of Standards and Technology (NIST), where it will take its rightful place between the formal representations of the measurements for the inch and the mile. “Of course,” notes one NIST employee, “once we switch to the metric system like the rest of the world, all of these standards and definitions will be just so many bits of scrap metal and word cloud.”
Further information about how to define the digital humanities can be found at Day of DH 2013 (http://dayofdh2013.matrix.msu.edu/members/), a project that documents a day’s activity for digital humanities practitioners in a variety of disciplines and contexts. Day of DH 2013 is on April 8, 2013.
[Disclaimer: the above is a fictional April Fool’s Day amusement and is the personal work of the author written during his own personal time and representing neither endorsement nor opinion from the federal government, Donald Trump, or Lockheed Martin.]
While the instructions below can help remove the lines of code inserted into your php pages, they don’t necessarily remove the *exploit* that allowed such an incursion in the first place. Here’s what I’ve learned after the code re-appeared in the past 24 hours on ~7 blogs I host (for reference, I’m on Dreamhost):
1. Delete all unused, old themes. The “blue kino” theme looks like a possible culprit, but just get rid of whatever you aren’t using, and upgrade the one you are.
2. Update all plug-ins you are using, and delete the ones you are not.
3. Make sure WordPress itself is up-to-date.
4. Look for odd files that don’t fit. If you’ve been hacked, contact your host – they can run scripts to help you track these down. For example, on one site there was:
5. Consider a database dump and re-install (I believe @wayne_graham might be planning a blog post to outline a clear process for this).
Note: I’m hopeful these steps will work, but I’m also expecting to be surprised by a fresh round of cleaning (and full re-installations) tomorrow. So, caveat emptor.
I’ve found the Sucuri.net blog (http://blog.sucuri.net/) an incredibly valuable resource when Wordherders blogs have been hit with various hacks. Recently, George’s workbook.wordherders.net was hacked, and I was able to use the same script that Sucuri provided in a May 2010 posting to clean up the files.
The hack puts one line of php code in each of your php files. It begins with the following script:
<?php /**/ eval(base64_decode("aWY....
Cleaning the site requires extracting that php code from all pages in all directories of your WP installation. The Sucuri solution uses sed to accomplish this. If you want to make sure this is the hack that affected you, download one of your php files by ftp, or SSH in and read one. A very, very long line of php code should begin with what you see above.
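If you prefer not to wrestle with sed, the extraction step can be sketched in Python. To be clear: this is a rough equivalent of the sed approach, not the Sucuri script itself. It assumes the injected line takes exactly the form shown above (payloads vary by infection, so check the pattern against your own files), and you should back up everything before running it.

```python
import pathlib
import re

# Assumed shape of the injected code, based on the sample above:
#   <?php /**/ eval(base64_decode("...")); ?>
# Payloads vary by infection, so verify this pattern against your own files.
INJECTED = re.compile(r'<\?php /\*\*/\s*eval\(base64_decode\("[^"]*"\)\);\s*\?>')

def clean_tree(root: str) -> int:
    """Strip the injected line from every .php file under root.

    Returns the number of files that were modified.
    """
    cleaned = 0
    for path in pathlib.Path(root).rglob("*.php"):
        text = path.read_text(errors="ignore")
        stripped = INJECTED.sub("", text)
        if stripped != text:
            path.write_text(stripped)
            cleaned += 1
    return cleaned
```

As with the sed script, spot-check a few php files afterward to confirm the eval line is actually gone.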
Here is an old Sucuri post from May 2010 where I downloaded the original fix (which I used to clean a hack in 2010):
The link to the file they provided is broken, so here’s the copy that I have (again, all credit to the original Sucuri post):
Follow the directions from the May 2010 post under the section “via web” — this same script worked in cleaning up last week’s attack on http://workbook.wordherders.net/ (and also worked just now on my own site, which I had to clean before posting this). Remember that you have to change the file name to wordpress-fix.php
Be patient…it can take a few seconds to run. It will give you a notice when it is done. Then go and check some of your php files to make sure it worked.
Another possible solution: in the comments feed from this Feb 2012 Sucuri post (http://blog.sucuri.net/2012/02/malware-campaign-from-rr-nu.html), Walker de Alencar provides this link to his github script rrnuVaccine:
Good luck all!
This brief paper was offered as my contribution to the Close Playing: Literary Methods and Video Game Studies roundtable at MLA 2012 in Seattle. I very much appreciate the dynamic audience (a full house is a wonderful thing for the final session of a 4-day conference) and a terrific group of colleagues to present alongside: Mark Sample (GMU), Edmond Chang (Univ. of Washington), Steven E. Jones (Loyola Univ., Chicago), Anastasia Salter (Univ. of Baltimore), Timothy Welsh (Loyola Univ., New Orleans), and Zach Whalen (Univ. of Mary Washington). The format involved six six-minute papers, which allowed for nearly an hour of active discussion. Many thanks to Mark Sample for keeping us on task.
[It should not need to be said, but: opinions expressed are those of Jason Rhody and do not necessarily reflect official positions of the National Endowment for the Humanities on gaming, narrative, or any other topic.]
These remarks stem from a much longer book-length project called “Game Fiction,” in which I offer an approach to moving past a discourse about games and narrative stymied by absolutes rather than models. Although some computer games are clearly a site of narrative production and consumption, the relationship between narrative, on the one hand, and games, on the other, has remained somewhat uneasy, if not contentious, within the critical literature of both game studies & narrative theory.
As a path forward, I’ve proposed a rubric under the term “game fiction,” which I identify as a category of game that draws upon and uses narrative strategies to create, maintain, and lead the user (player) through a fictional environment in order to actualize a narrative and ludic goal.
I offer four principal properties of game fiction – they must be ergodic, competitive, progressive, and hold a goal of actualization (in combined narrative and game terms).
I don’t have time to fully detail each of these qualities, but it is important to note that these qualities are NOT absolutes; they serve as an analogue scale so that we can approach the issue with appropriate levels of nuance.
What drew me to this project was the fact that we still lacked an adequate vocabulary to address narrative in games. I’m less interested in questions of “are they or aren’t they,” and much more interested in questions of When are they narratives, and How are they narratives? These are questions of genre, and as we all know, the border-cases are often the most interesting cases of all.
So I’m less interested in how Tetris is not Prince of Persia, but more how StarCraft is different from StarCraft (the multiplayer map is on the left, and the single-player map is on the right… note the differences in symmetry).
In taking this turn toward genre, I’d like to identify two key shifts that distinguish game fictions from other fictional genres, and to consider how we might understand the changing face of narrative in a world increasingly governed by computational function. I believe these shifts lie at the root of the narratology/ludology debate and further reflect the value of studying games from a literary perspective.
First, there is a fundamental shift in the narrative communication situation, as articulated by Seymour Chatman.
When we introduce a consistent feedback loop that involves player input, it creates a new dynamic of power and requires altered models of production and consumption, transmission and exchange.
Second, the black box that represents the narrative text in Chatman’s model becomes, with a game fiction, quite literally a black box of computational operations, which are often hidden or obscured to create the puzzles and challenges that are foundational to gaming.
So there is a need to explore what Matt Kirschenbaum calls the formal materialities of software, a call reflected also in the rise of software studies and platform studies.
To better understand how plotted events are embedded within a game fiction, we must move beneath the interface to explore relationships of narrative setting to database, for example, and of the quest to the query.
StarCraft, a real-time strategy game with both single and multi-player modes, serves as a particularly useful example of how the single player works as a game fiction while the multiplayer does not. While both single and multiplayer versions are ergodic and competitive, only the single-player maps tend to be progressive with actualizable moments embedded in the game.
Single-player matches in StarCraft have clearly articulated, deeply encoded requirements for staged progression; Mission Six of the Terran campaign (with triggers and pre-planned events annotated here on the screen) is a useful example. The properties of data, and their placement and use in the game, can tell us a great deal about how narrative functions in game fictions. Despite modern methods for encoding games, game fictions still function like early data models, which were 1) primarily hierarchical, 2) stressed parent-child or networked relationships between data sets, and 3) were, above all, navigational. Hierarchical and simple network databases often required user knowledge of the data design because navigation and query occurred via predefined relationships. Navigation, query, and data were tightly intertwined. These principles hold true for game fictions, which tend to be highly navigational and spatial. The stronger the tie between an objective and a location, the stronger the tendency towards progression and actualization.
You can use map modification tools like StarEdit to show how the design is staged. A designer would embed data structures within the fictional setting, and then imbue the data object (here, a battleship) with conditions and actions – precisely the way one programs triggers within a typical (non-game) database structure.
The difference between common conditions and actions in multi-player maps versus single-player maps is particularly telling. The default StarCraft multiplayer maps uniformly rely on three sets of common conditions/actions, whereas the single-player maps often invite hundreds of variables (in this case, the player must bring the Jim Raynor character to a particular location).
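The trigger idea above can be made concrete with a small, purely hypothetical sketch of the condition/action pairing; none of these names come from StarEdit or StarCraft’s actual data format, and the real triggers are of course far richer.

```python
# A map trigger as a condition/action pairing: the condition watches game
# state, and the action fires once the condition holds. All names invented.
trigger = {
    "condition": lambda state: state["raynor_location"] == "beacon",
    "action": lambda state: state.update(mission_complete=True),
}

state = {"raynor_location": "start", "mission_complete": False}

# The player moves the Jim Raynor character to the goal location...
state["raynor_location"] = "beacon"

# ...and the game loop checks the trigger each tick:
if trigger["condition"](state):
    trigger["action"](state)
```

The point of the sketch is the asymmetry noted above: a multiplayer map needs only a handful of such pairings, while a single-player campaign mission stacks up hundreds of them to stage its plot.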
The manner in which players engage with data in a representational questing environment still recalls the fundamental concepts of data manipulation language (DML), most colloquially known as ‘the query.’ Commands such as SELECT, UPDATE, DELETE, or INSERT can be mapped to quest objectives such as find, deliver, slay, recover, and so on (in this case on the right, Update Location 0 with a character type). Data is staged to enable plot, and the quest functions as a query.
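As a purely illustrative sketch of the quest-as-query mapping, here is the “bring Jim Raynor to a particular location” objective re-expressed with Python’s built-in sqlite3 module; the table and column names are invented for the analogy and bear no relation to how StarCraft actually stores its data.

```python
import sqlite3

# A toy "game state" database with one goal location. All names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE locations (location_id INTEGER, occupant TEXT)")
conn.execute("INSERT INTO locations VALUES (0, NULL)")  # Location 0, empty

# The quest objective "deliver character X to location Y" behaves like UPDATE:
conn.execute(
    "UPDATE locations SET occupant = ? WHERE location_id = ?",
    ("Jim Raynor", 0),
)

# The victory trigger ("is Raynor at Location 0 yet?") behaves like SELECT:
row = conn.execute(
    "SELECT occupant FROM locations WHERE location_id = 0"
).fetchone()
objective_met = row[0] == "Jim Raynor"
```

Likewise, a “slay” objective behaves like DELETE and a “recover” objective like SELECT followed by UPDATE: the data is staged so that the player’s traversal performs the query.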
If the rise of the novel as a form of prose fiction in the 18th century reflected a growing “tendency for individual experience to replace collective tradition,” as Ian Watt argues (14), then comparatively the rise of game fiction could be seen to reflect a tendency towards collective tradition under the guise of individual experience.
The following slides and notes guided my presentation for the #alt-ac: The Future of ‘Alternative Academic’ Careers roundtable at the 2012 MLA convention in Seattle. I was grateful to be invited by the MLA Office of Programs, and pleased to join Bethany Nowviskie (UVA), Donald Brinkman (Microsoft Research), Neil Fraistat (UMD), Robert Gibbs (Univ. of Toronto), Charles Henry (CLIR), and Elliott Shore (Bryn Mawr) for the discussion. The bulk of the roundtable was organized toward discussion, but we each took about six minutes to discuss alt-ac from our individual perspective.
This panel immediately followed another alt-ac roundtable session; both were written up by William Pannapacker in the Chronicle of Higher Education.
I say it elsewhere, but it is worth repeating: opinions expressed are those of Jason Rhody and do not necessarily reflect official positions of the National Endowment for the Humanities.
Thank you to Bethany and the MLA for organizing this session. Concerns about employment for humanities graduates are not new, even for organizations like the NEH.
There was even some alt-ac in 1983, as you can see in this grant made to Noel Stowe with the title “Historians and the Private Sector: A Graduate Program Preparing Historians for Business Careers” (discovered by Brett Bobley when making some changes to how we categorize our grants).
Fast-forward approximately three decades, and we can point to discussions of employment and professionalization in a slightly different context: digital humanities centers.
In 2010, Tanya Clement and Doug Reside (then both of UMD) hosted the NEH-funded workshop “Off the Tracks—Laying New Lines for Digital Humanities Scholars.” The workshop outcomes are discussed in full in their report (and I suspect that Neil Fraistat will offer some thoughts here as well), but it’s worth noting that alongside the crisis of humanities employment, we also see a shift in the academic ecosystem that supports and works alongside faculty in today’s colleges and universities.
I’m personally interested in this as an alt-ac’er myself, and particularly invested in theorizing how we might reframe how we think about ‘service’ in and alongside the academy and, further, how this relates to the rise of the public humanities (a trend visible in many of the grants funded in recent years). I’m also particularly keen to discuss how reframing service allows us to uncover the hidden networks of collaboration (with librarians and archivists, for example) so often obscured by the myth of the solitary scholar.
I would be happy to talk more about these more theoretical issues during the Q&A, but due to limited time I would like to give a bit more attention to practical advice in looking for work in government service, or…
Given this somewhat tongue-in-cheek title, this might be an appropriate time to remind you that this presentation is a reflection of my own opinion, and does not reflect the opinion or position of the NEH. Disclaimers are one of the things I’ve had to get used to working in government service – and so this serves both as a disclaimer, and also my own caution that alt-ac comes with tradeoffs. I think it’s exciting that these kinds of positions are getting increased attention, and that scholarly societies like MLA and the AHA are working to make them more available, attractive, and rewarding. But I’m also wary of being overly Pollyannaish. These are jobs, and no matter how much you love your job, all jobs come with certain challenges and restrictions (again, I’d be happy to talk more about that later).
So, here are a few tips as you consider alt-ac employment. I hope you find them useful whether you are a current graduate student, currently seeking employment, or an advisor who might help students look for these kinds of jobs in the future.
Search for government jobs at http://www.usajobs.gov/. You can search by keyword, but also try searching by agency (NEA, NEH, NSF, IMLS, Smithsonian, NHPRC, NARA, LOC). Explore working for the government while still in school (http://www.studentjobs.gov/) or consider opportunities such as the Presidential Management Fellows program (https://www.pmf.opm.gov/).
What were KSAs? The acronym stands for Knowledge, Skills, and Abilities, which were essay statements frequently required along with a resume when applying for government jobs. KSAs are no longer officially used, per se, because of changes in hiring practices during the Obama administration. However, many Human Resources offices – usually the first group to evaluate incoming applications before drawing up a short list for the ‘selecting official’ – have been using KSAs for decades, and that just doesn’t go away (as you can see under “how you will be evaluated” in the slide below). Instead, KSAs are often reintroduced (though not by name) through a questionnaire required during the application process. Be sure to check the advertisement for a sample questionnaire so you can prepare your answers early.
What follows is the abstract for my dissertation, Game Fiction, defended in November 2010.
“Game Fiction” provides a framework for understanding the relationship between narrative and computer games and is defined as a genre of game that draws upon and uses narrative strategies to create, maintain, and lead a user through a fictional environment. Competitive, ergodic, progressive (and often episodic), game fictions’ primary goal must include the actualization of predetermined events. Building on existing game and new media scholarship and drawing from theories of narrative, cinema, and literature, my project details the formal materiality that undergirds game fiction and shapes its themes. In doing so, I challenge the critiques of narrativism levied at those scholars who see a relationship between computer games and narrative forms, while also detailing the ways that computational media alter and reform narratological preconceptions. My study proposes a methodology for discussing game fiction through a series of ‘close playings,’ and while not intended to be chronological or comprehensive, provides a model for understanding narrative and genre in this growing field.
Chapter one, “Defining Game Fiction,” locates video games within the larger context of computer-mediated narrative design, and interrogates the power structure of reader to author, consumer to producer, and media object to its user. I articulate a framework for approaching computer games that acknowledges a debt to previous print, cinematic, and ludological forms, while taking into account computer games’ unique ergodic and computational status. Chapter two, “Paper Prototypes,” examines the principles of game fiction in three analogue forms: the choice book, the board game, and the tabletop role-playing game. My third chapter, “Playing the Interface,” theorizes the act of narrative communication within the ludic, multimodal context of Prince of Persia: The Sands of Time. Chapter four, “Data, Set,” posits the game quest as analogous to the database query in Adventure and StarCraft. Much like data exists in a database, requiring only the proper query for access, narrative exists in game fiction, shaped by quests through fictional settings. Chapter five, “The Game Loop,” argues that the grammar of user input within the game loop shapes the player’s relationship to the character and, in MediEvil, the subsequent themes of redemption.
I started the blog many years ago as a research and thinking space, and though I’ve used it very little publicly over the past few years, a great deal of the early writing and interaction here eventually made it in some form into my recently finished dissertation, “Game Fiction” (more about which soon).
I’m sure the look of the place will change here and there, and I’m not promising frequent updates, but I do suspect Misc will see a bit more activity than, say, during the past 2 years.
Note: those interested in the earlier MoveableType version of this blog may find the archive here: http://misc.wordherders.net/mt/
It has recently been argued that the generation of large data sets is the new science. I agree only insofar as the data sets are used to ask and answer unique questions about life.
Bigger Faster Better
By Craig Venter | Posted November 20, 2008
(and yes, it has been over a year since the last post.)
Lovely fan-created video for Grandaddy’s “Jed’s Other Poem (Beautiful Ground),” found via if:book.
The if:book post details the video better than I can at the moment, except to say that Grandaddy kept me going through several late nights during graduate school, and that this is a wonderful example of constrained writing/animation (see the creator’s statement below).
Just came across something lovely. Video for “Jed’s Other Poem (Beautiful Ground)” by the now disbanded Grandaddy from their great album The Sophtware Slump (2000). Jed is a character who weaves in and out of the album, a forlorn humanoid robot made of junk parts who eventually dies, leaving behind a few mournful poems.
Creator Stewart Smith: “I programmed this entirely in Applesoft BASIC on a vintage 1979 Apple ][+ with 48K of RAM — a computer so old it has no hard drive, mouse, up/down arrow keys, and only types in capitals. First open-source music video, code available on website. Cinematography by Jeff Bernier.” A nice detail of the story is that this was originally a fan vid but was eventually adopted as the “official” video for the song.
If you find yourself in London on the 21st… [via JISC’s blog]
Developing International Collaboration for Digitisation: the JISC – National Endowment for Humanities perspective
In celebration of their transatlantic digitisation collaboration grants, JISC (Joint Information Systems Committee) and the NEH (National Endowment for Humanities) are hosting an evening panel session looking at issues related to international digitisation. The evening will draw on the experiences of projects in the area and will also involve discussion to inform future directions.
Hosted by King’s College London. Monday 21st January, 5.30pm – 6.45pm (Room 2B08, Strand Campus)
Chaired by Sarah Porter, Head of Development, JISC, with presentations and commentary from:
- Bruce Cole, Chairman, National Endowment for Humanities
- Malcolm Read, Executive Secretary, JISC
- Paul Ell, Director, The Centre for Data Digitisation and Analysis, Queen’s University Belfast
- Robert K. Englund, Professor of Assyriology at the University of California and Director of the Cuneiform Digital Library Initiative
The event is open to all. The evening will be followed by a wine reception for all attendees.
JISC and the NEH are grateful to the Centre for Computing in the Humanities at King’s College London for hosting the event.