What Does it Mean to “Go Digital”? – By Dr. T. Mills Kelly
October 19, 2010
Back in the late 1990s – the days of Web 0.5 – I was a pioneer of sorts when it came to thinking about how new media might be changing the way students thought about the past. I got started with research on new media because I had an itch that needed scratching… What I wanted to know was whether or not the work I was putting into my website and into creating web-based assignments for my students was remotely worth it. I decided I needed to do a little research to see what I could learn about how my students used the digital learning materials I was creating for them and whether their use of those materials was changing their thinking at all.
As often happens with “little research projects,” the work I did that year transformed my career in that it opened me up to an entirely new way of thinking about teaching and learning. And because the results of my project found their way into an online journal, which then won an award, which then led to a job at George Mason University’s Center for History and New Media, I was suddenly an expert of sorts on digital pedagogy.
Other awards and a series of increasingly larger grants have followed, but I am still trying to get at that same itch that started bothering me in 1998. Like most “experts,” the more I know the less I am sure of.
Of course, I say all of that with the historian’s favorite tool – 20/20 hindsight. When I was in the middle of my transformation into a digital historian, I just knew I felt like I was getting closer and closer to something worth knowing. I still feel that way and so I keep scratching and scratching.
If you’ve ever taken a history class (and who hasn’t?), you know that historians are a deeply conservative tribe when it comes to their pedagogy and their research methods. Not many disciplines have a lineage as long as ours. After all, Herodotus published his book on the Persian Wars five centuries before the Common Era began. Sad to say, not a lot has changed in the past 2,500 years in either the way historians pursue evidence or teach students about the past.
That is, until the past decade.
The digital revolution has challenged so many assumptions about the way historians do what they do that if I were to list them all here it would require several blog posts. So, rather than list them all and bore you to death, I thought I would point to just a few that, to me anyway, seem worthy of more careful consideration, especially with respect to teaching and learning.
GIS and the blurring of boundaries: Historians have always borrowed freely from other disciplines, but only rarely have we allowed the boundaries we’ve set for ourselves to blur. The advent of cheap and easy GIS technology combined with the rapid growth of massive databases of humanities content has suddenly made it possible, if not imperative, for historians to think about ways that GIS can help us (and our students) understand the past better. In my own work I now geolocate every single source I can so that I can throw those sources up on a map to look at them in both time and physical space. I’ve only begun doing this with my newest project, but already I’m starting to see patterns in my data that I wouldn’t have seen unless I put them on a map.
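For readers curious what “looking at sources in both time and physical space” can mean in practice, here is a minimal sketch. The sources, titles, dates, and coordinates below are all invented for illustration, and real work of this kind would use a GIS package and an actual map rather than printed centroids; the idea is simply that once each source carries a date and a latitude/longitude, patterns over time and space become computable.

```python
from datetime import date

# Invented sample data: each "source" gets a date and a lat/lon,
# the way one might geolocate letters, pamphlets, or diaries.
sources = [
    {"title": "Letter A", "date": date(1848, 5, 1), "lat": 48.21, "lon": 16.37},
    {"title": "Letter B", "date": date(1849, 2, 11), "lat": 48.15, "lon": 17.11},
    {"title": "Pamphlet", "date": date(1861, 7, 4), "lat": 50.08, "lon": 14.44},
    {"title": "Diary", "date": date(1867, 9, 30), "lat": 50.06, "lon": 14.42},
]

def centroid_by_decade(items):
    """Average the coordinates of sources in each decade -- a crude
    first pass at spotting geographic drift over time."""
    buckets = {}
    for s in items:
        decade = (s["date"].year // 10) * 10
        buckets.setdefault(decade, []).append((s["lat"], s["lon"]))
    return {
        d: (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))
        for d, pts in sorted(buckets.items())
    }

for decade, (lat, lon) in centroid_by_decade(sources).items():
    print(f"{decade}s: centroid at ({lat:.2f}, {lon:.2f})")
```

With real data, the same grouping could feed a web map (the coordinates would typically be exported as GeoJSON) so that the patterns can be seen rather than computed.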
Mobile computing: Historians are now confronted with the possibility of the wide adoption of what computer scientists like to call “itinerant, distributed, and ubiquitous computing.” You and I might call this the smart-phone revolution. Whatever we call it, we now need to come to grips with the fact that our students and the audience for our work increasingly can access previously unthinkable amounts of historical content whenever and wherever they choose. As a tribe historians are only just starting to debate what this revolution means. I happen to think that the single most important outcome of what we might call “history to go” will be the breaking down of the walls, both literal and figurative, of the history classroom. Why keep our students chained to their desks when they can access and work with historical content anywhere they choose? I’ll be experimenting with this idea in the spring 2011 semester when I teach a course called “Dead in Virginia” that forces my students to work with local family cemeteries as their primary historical sources, work they can only do somewhere other than their classroom.
Malleability: How do you feel about Wikipedia? This question often defines the parameters of an important argument about the digital culture we are watching emerge all around us. While the argument is often cast as a Wikipedia good/Wikipedia bad binary, the real issue, it seems to me, is about both the malleability of online content and the degree to which we are willing to accept the participation of the public at large in the creation of what used to be known as “expert content.” Whether we like it or not, our students see digital content as malleable, and they are often frustrated by attempts by their professors to tell them that mashups, remixes, and other forms of creative activity are somehow bad. With each passing month I find myself more and more excited about the ways that my students are trying to find new ways to make sense of the past by doing interesting (and sometimes strange) things with the sources they find. Yes, many of those things would make a traditional historian cringe, but if we are going to tell our students that they have to work within the limits grownups have set for them, I think we can count on seeming more and more irrelevant with each passing year.
To cycle back to my original question above, I would say that the answer is pretty simple. Going digital means being open, even if it makes us cringe, to a teaching, learning, research, and creative landscape that is in a state of extreme flux at this particular moment. Boundaries are shifting. Rules are changing. No one is sure what the final result will be. The ride we’re on will likely speed up over the next few years, so I’m fastening my seatbelt and looking forward to wherever it takes us.
T. Mills Kelly is the Director of the Global Affairs Program at George Mason University. He is an Associate Professor of History and an Associate Director of the Center for History and New Media. In 2005 he received the Commonwealth of Virginia’s Outstanding Faculty Award, the state’s highest recognition of faculty excellence. Most recently he was an Associate Dean of the College of Humanities and Social Sciences. He blogs at http://edwired.org.
* The opinions expressed by guest bloggers do not necessarily reflect those of the MUA, its staff, or its partner organizations.