Wednesday, April 30, 2008
Showing the Extra Stuff
I recently came across a neat-looking equivalent of that sort of thing while catching up on various world events on BBC News' site last week. BBC News' articles increasingly have sidebars, popup windows, small videos, etc., which add extra substance or detail or visual examples to complement the text of their articles. While reading one article about the chaos in Zimbabwe and the international reaction to same, I found the infobox midway through the article to be a really good example of this sort of thing. Using nothing but very basic HTML (check the page source!), the article manages to briefly describe and illustrate the stances of ten African countries near Zimbabwe not only clearly, but in a way that only takes up a few square inches of screen real estate. Those who don't want to follow up can just keep scrolling through the article, and those who do can get a decent amount of information (by news-article standards) easily, compactly, and with minimal requirements for outside software (like the generally-overdone use of Flash for just about everything).
I can't argue much with the timing of coming across that kind of example (or at least coming across it while equipped to notice it), in any case. Having recently finished one educational project as part of the public history program, set off on another as part of a summer internship, and tentatively begun putting together material for a third major one in the fall, I've had the presentation of supplemental information on my mind for a while. I generally think it's better to present too much information than too little. At the same time, I know full well that if I'm erring on the side of excess, doing so in a manner that isn't overwhelming is kind of important. This one does that job quite well, and it's got me thinking of ways to adapt it to various purposes in my current and future projects.
Monday, April 21, 2008
Eye candy and the creation thereof
To that end, I'm trying to dig up some resources on visualization in interactive projects - stuff as much on a theoretical/technical level as anything else, sort of like the interaction design textbook we were using in the digital history course but more focused. A friend of mine out west who has worked at BioWare for a while - and who is thus completely surrounded by interactive visualization of the highest order - less recommended than demanded that I check out Colin Ware's Information Visualization: Perception for Design as a starting point. I plan to pillage it from Western's library tomorrow.
Newbie to the field on any kind of detailed level as I am, I'm curious as to what else could be out there. Is anyone reading this who's familiar with the field aware of any other resources I might want to try checking out?
Saturday, April 19, 2008
Here, have a neat gadget
A stroll around the ancient city of Pompeii will be made possible this week thanks to an omni-directional treadmill developed by European researchers.
The treadmill is a "motion platform" which gives the impression of "natural walking" in any direction.
The platform, called CyberCarpet, is made up of several belts which form an endless plane along two axes.
Scientists have combined the platform with a tracking system and virtual reality software recreating Pompeii.
The bulk of what I have to say about this one is "Cool!" coupled with a big list of other places I'd like to see something like this used, so I think I'll leave it at that for now.
Wednesday, April 16, 2008
Information Wants To Be Anthropomorphized
In my last post I spent a while talking about the growth of amateur culture from a fairly niche sort of thing to a broader - and far more accessible - phenomenon. Considering the extent to which we've been throwing terms like "MySpace Generation" or "YouTube Generation" around in the last several years, I think it's pretty safe to suggest that the Net might have had something to do with this. It makes sense to suggest that the Net is going to shape the interpretation and presentation of even traditionally-academic subjects: it's utterly ubiquitous in the developed world these days, and large swathes of its original incarnation developed out of academia in the first place anyway.
While it's weird to think so these days, with the Internet's recent proliferation of just about any kind of frivolity, inaccuracy, or Thing Man Was Not Meant To See, the Internet was once, at its heart, a fundamentally academic institution. Its culture was largely that of the American computer-science student community, with rhythms organized around the academic year. Many still use the term "September that Never Ended" to describe all time since September 1993 - referring to the annual influx of newbies with little awareness of proper netiquette which became an unending flood that fall - and speak with an almost religious hope of the prophesied October 1, 1994, which has yet to arrive. "What's your major?" was a question equivalent to "what's your sign?" and so on.
While it's somewhat obvious that the bulk of online culture is no longer centered around those kinds of patterns, a core of them is still around and at least influences a large portion of the Net. Large swathes of the parts of the Net with a useful signal-to-noise ratio have much in common with the Net's original large-scale discussion system, Usenet, before its collapse in the mid-1990s. Usenet, and its current successors[1], had a combination of an anything-goes approach subject-wise and a lack of any barriers to participation beyond simple access to the network in the first place. The results varied from the completely silly (such as most of the thousands of groups in the "alternate" hierarchy[2]), to general-interest groups (such as rec.pets.cats), to artistic communities (such as alt.fiction.original, which a couple of friends and I created in the mid-90s as an escape from the ubiquity of horrific fanfiction), to more academic topics (such as sci.archaeology).
Unless someone was already known by name otherwise - a known scholar in a given field, or one of the various alpha geeks of the Internet at the time[3] - there were no clear credentials or pecking orders to go by other than reputations as established in a given forum. It resulted in an odd sort of situation where someone using a real name and some postnominals and someone else with an elaborate and obvious pseudonym would generally end up on an equal footing in discussion from the get-go and diverge based on shown competence alone. (As an example, someone I knew some years back, who wound up hugely respected in the spam-fighting field, was - and to my knowledge, still is - known only as "Windigo the Feral.") While a lot of people were, and still are, dismayed at the potential for this kind of anonymity, many others found (and find) it liberating.
This kind of digital culture - gleefully eccentric, technically minded, generally libertarian and explicitly meritocratic - and its descendants have had a tremendous influence on a lot of digital-humanities concepts, between their broader impacts on digital culture in general and the fact that a lot of people working in the digital humanities tend to have strong computing backgrounds. This kind of community is the source of the whole open source movement, and thus of any number of tools I've mentioned on this blog. Zotero, Blender, Celestia, and Freemind are just four examples that immediately come to mind, and each is a program which came out of this movement which I've managed to turn to one historical purpose or another at least once this year. Each of those has a commercial equivalent, but they have the little advantage of being free. When compared to Endnote ($300), Lightwave ($1100 - thousands of dollars cheaper than when I last checked!), Voyager 4 ($200) or Mind Manager Pro ($350), I know full well where my preferences are going to lie - particularly since both sets of software are often similar or identical in terms of capabilities, even when correcting for open-source software's tendency towards lousy interfaces. There are moves to unify these sorts of things towards educational purposes, even in the realm of physical computers themselves. And to make it even better, should I feel moved to and have the capabilities, I can, as Dr. Turkel put it, feel free to fix anything that is bugging me.
This sort of attitude - that I (and you!) should have a say in what's going on if we're able to do so in a competent manner - is, I find, a healthy one to have in the first place, and fairly central to a lot of the ideas of what academia was meant to be to begin with. I don't like gatekeepers who don't have a really good reason to be such, like the trauma surgeon I mentioned in my classically unfair example in my previous post. When you combine it with the capability for easy exchange of information modern technology tends to give us, it's only a short trip to a whole crop of notions like folksonomies, the pursuit of databases of intentions, turning filtering techniques meant to catch spam into more sophisticated information-extraction methods, and any number of other things. Between the basic potential of these sorts of techniques and the willingness of so many pursuing them to share their research and tools all over the place (notice on that netiquette link above that one of the items was "share your expertise"?), I think it's a pretty exciting time to be an historian. Dealing with the mass of information currently sitting around due to electronic archiving is one thing; turning a lot of these tools on the stuff we've traditionally been condemned to looking through via microfiche readers is another entirely.
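To make the spam-filter-repurposing idea a little more concrete: the same naive Bayesian scoring that sorts mail into "spam" and "not spam" can sort documents into historical topics instead. The sketch below is a minimal, from-scratch illustration of that idea only - the tiny training snippets and topic labels are entirely made up, and a real project would need far more data and preprocessing.

```python
import math
from collections import Counter, defaultdict

def tokenize(text):
    return text.lower().split()

class NaiveBayes:
    """Bare-bones naive Bayes classifier, the same idea behind Bayesian spam filters."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word -> count
        self.label_counts = Counter()            # label -> number of training docs
        self.vocab = set()

    def train(self, text, label):
        words = tokenize(text)
        self.word_counts[label].update(words)
        self.label_counts[label] += 1
        self.vocab.update(words)

    def classify(self, text):
        best_label, best_score = None, float("-inf")
        total_docs = sum(self.label_counts.values())
        for label in self.label_counts:
            # log prior plus log likelihood, with add-one smoothing so that
            # unseen words don't zero out a whole category
            score = math.log(self.label_counts[label] / total_docs)
            total_words = sum(self.word_counts[label].values())
            for word in tokenize(text):
                count = self.word_counts[label][word] + 1
                score += math.log(count / (total_words + len(self.vocab)))
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Invented miniature training set for illustration
clf = NaiveBayes()
clf.train("treaty negotiations archives parliament", "politics")
clf.train("excavation pottery stratigraphy site report", "archaeology")
clf.train("election cabinet minister debate", "politics")

print(clf.classify("pottery shards from the excavation site"))  # archaeology
```

Swap "spam"/"ham" for any pair of categories a historian cares about and the machinery is unchanged, which is exactly why these filtering techniques travel so well.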
There are potential problems with the Net and its cultures as a new base for scholarship as well, of course. Pretty much the standard debate to do with the Internet's implications for scholarship is the reliability/verifiability issue. This is something I've discussed before on this blog, and will probably continue to discuss for some time; it's a pet topic of mine. Even a cursory reading of what I have to say over my time writing here should make it clear that I fall clearly in the "digital resources are useful and often reliable" camp - and, more to the point, that I despise the casual elitism which says one must have a tangible degree to even attempt to contribute to knowledge on issues. Despite that, I have my own problems with content on the Net and try to turn a critical eye onto it. Besides, the reliability issue isn't limited to history online. Academic history isn't infallible, sometimes being startlingly uncritical about sources (as I found out during my undergraduate thesis, watching Thomas Szasz get invoked as a reliable source about mental illness several times). Public history has its own problems, as evidenced by controversial manipulation of exhibits at the War Museum or Smithsonian, or the mendacious silliness that is documentaries like The Valour and the Horror. There are differences of degree, but we do need to take the beam out of our own eye on that account, no?
While I have a thorough appreciation for the attitudes of openness and general egalitarianism which were fundamental parts of digital culture since well before I started seeing it in the early nineties, I think they can be taken too far or too blindly at times. As an example, I'm part of that rather small subset of big-on-digital-stuff historians who doesn't like Wikipedia, or more to the point, the predominant intellectual culture on it. Jaron Lanier echoes a good chunk of my views in his essay on Edge (another site on my "everyone should go here sometimes" list), though he takes his objections a little further than I would by invoking terms like "digital Maoism." On a more scholarly and financial level, there's some debate over the issue of open-access content for traditional scholarship, with arguments ranging from the silly ("socialized science!") to more serious ones.
These sorts of issues are going to be hovering around academia for some time. The "problem" is that they will do so regardless of whether academia goes down the path of open access or Wikiality or whatever else. Many institutions are cheerfully doing this kind of thing, with some advocates going so far as to make religious allusions through the use of terms like "liberation technology." Those who don't, however, will still have to respond to it in one way or another, as it becomes more and more obvious that simply saying "don't use online sources" is not an acceptable approach.
The interesting aspect of this shows up when we start to see a fusion of the different cultures I've been talking about lately. We've got that amateur culture, big on do-it-yourselfing and drawn into their various pursuits out of a genuine desire to do something; we've got the loosely meritocratic and not-so-loosely anarchistic elements of digital culture, with its appreciation of openness in design or knowledge and growing use of hugely collaborative projects; we've got the traditional academy, feeling around the edges of the potential of this merger. Of course there's room for uneasiness between the three elements, but at their heart all three are (in theory) built around an appreciation for knowledge and ability.
[1] - Google has since largely revived Usenet with its establishment of Google Groups a few years ago (and the newsgroup links are courtesy of them); a chunk of the "original" Internet is once again out there, active, vibrant and wasting bazillions of man-hours per day. Huzzah!
[2] - This image from Wikipedia's (I know, I know) article on Usenet, which shows the meanings of the different "hierarchies" of Usenet newsgroups, illustrates the alt.* hierarchy perfectly in the eyes of anyone who remembers the original thing.
[3] - "Anyone ... [who] knows Gene, Mark, Rick, Mel, Henry, Chuq, and Greg personally" is one of popular definition of such people. The Net used to be a small world.
Tuesday, April 15, 2008
For Its Own Sake
In terms of my historical pursuits, and many others besides, I consider myself an amateur and very much hope to stay that way.
This has little to do with hoping to maintain humility in the face of a vast amount of knowledge, or recognition of living in a postmodern world where I'm doomed not to really know anything about anything, or anything else along those lines. (There's an element of truth to each, of course; if there's anything that approaching completion of an advanced degree in history has taught me, it's precisely how little I know.) What I'm talking about is a rather older definition of the term.
The word "amateur" doesn't exist merely as an antonym to "professional" (or, more insidiously, "competent"). Being an amateur, ideally, is being an amator: doing or pursuing something out of a love of the subject and a desire to pursue it, with vocational aspects as secondary. This doesn't necessarily require years of formal training in a discipline, though I obviously think that helped in my own case. Sometimes that lack of training can be worked around, often in very surprising ways, by dedicated amateurs, however.
Consider chemistry sets. As we have been Thinking Of The Children rather more than is necessary for the last several years, they have, along with rather too many other things, fallen into disuse and obscurity to the point where it’s somewhere between difficult and impossible to find a “real” one. Depending on the safety level of various science-related hobbies, this is probably sometimes a good thing, but on the whole I think we’re losing something when we eliminate those kinds of incentives for learning. Others seem to agree with me. As a reaction to this sort of thing, many people and, rather likely, no shortage of kids have set about creating their own chemistry sets or ad-hoc equivalents thereof from scratch. However, the supplies needed are themselves difficult to impossible to find for the same reason a lot of the sets are.
This hasn’t prevented Darwin’s pager from being set off, however. Rather than abandon their interests due to the difficulty of getting modern materials, quite a few would-be amateur chemists have gotten together and formed their own communities [1] in which they’ve gone back to older textbooks – which often presuppose far less equipment or financial resources – in order to learn how to create the reagents or gear they need from scratch. This probably upsets the Department of Homeland Security or Public Safety Canada, but on the other hand just about anything does anyway, and I like the existence of environments like these where people can get together to learn about things.
That’s simply a more spectacular example of what happens when a bunch of amateurs get together and decide to pursue their interests than most. Similar things abound on more conventional, less high-entropy levels: ham radio operators, historical reenactors, astronomers and so on have long provided an avenue for this sort of thing. Thanks to the various services and methods of communication brought about by the Internet, these cultures are changing in several ways. Perhaps the most obvious impact is that groups of amateurs are no longer restrained by geography. While the Hamilton Amateur Astronomers are rather obviously connected to the city of Hamilton, the members of Science Madness are less concerned about being in the same place. This can often be a hook for people who have an interest in one thing or another, but found no avenue or reason to seriously pursue it due to a lack of resources, knowledge, or like-minded individuals; such groups create some nice positive feedback loops which encourage further learning or practice. I know I’ve picked up or maintained several interests as a result of encountering such groups, and I’m sure several of those who may be reading this have run into similar sorts of things.
The spread of these amateur cultures has been seen as a mixed blessing, of course. There are areas in which a lack of formal training or instruction really is a problem. Amateur historians or even chemists can often make those of us within the formal discipline twitch, but I’d be much happier dealing with either than I would an amateur trauma surgeon. [2] On a less life-and-death level, dealing with matters involving creativity, activism, history and so on, there are more shades of grey and room for vigorous debate, which often generates more heat than light but winds up illuminating nonetheless. At this point, in any case, there is no shortage of subjects, sites and organizations where anyone - amateur or expert - can dive into something without necessarily short-changing themselves.
A good chunk of my current history-related amateur geekery involved 513 over the last little while, as a component of our major class project where we were mounting a series of interactive exhibits built around the (very) general theme of "the sky." (My group was focused on the space race in general, and Sputnik in particular.) I've been interested in interactivity and visualization as tools to make exhibits more interesting for awhile, and thought I'd try to find some good visualizations and then ways for people to engage with them. The visualization part was a breeze, given the subject: I made use of Celestia, a fantastic - and free! - space simulation program that anyone interested in space needs to download right now. (I mean that. Why are you still here?)
For interacting with it I had planned on doing something a little more esoteric. Working with the SMART Board project for the main public history course got me interested in the idea of interactive whiteboards in general, especially when we got to the point of pushing the limits of their capabilities somewhat. I like touchscreens in general, as a way of getting over that boring keyboard-talks-to-the-computer tradition. [3] To that end, I thought I'd try to build an interactive whiteboard in a roundabout way with a combination of a projector hooked up to my desktop and a sensor built from a homebuilt IR light pen and a Wii remote, inspired by Johnny Chung Lee's blog "Procrastineering." The idea was to project the image onto a wall, which could then be manipulated with one (or, preferably, two) pens to rotate, zoom, etc. the final product. I got partway through the process before the suddenly-Schrödingerian status of our having a projector - and the end-of-semester crunch season in general - caused the Wii component of the whole project to fall by the wayside, forcing us to settle on the visualization and some audio I drew together towards the end. The whiteboard shall exist in time - indeed, it must, since it's going to remain a splinter in my mind until I get the thing working - but it has been relegated to a summer project.
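The mathematical core of that whiteboard setup is the calibration step: the Wii remote's IR camera reports pen positions in its own coordinate space, and a four-point calibration (tap the pen at four projected corners) yields the perspective transform mapping camera readings onto screen coordinates. The sketch below is a from-scratch illustration of that transform only, not Johnny Chung Lee's actual code; all the coordinate values are invented for the example.

```python
def solve(matrix, rhs):
    """Gaussian elimination with partial pivoting for a small dense linear system."""
    n = len(matrix)
    aug = [row[:] + [rhs[i]] for i, row in enumerate(matrix)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        for r in range(col + 1, n):
            factor = aug[r][col] / aug[col][col]
            for c in range(col, n + 1):
                aug[r][c] -= factor * aug[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (aug[r][n] - sum(aug[r][c] * x[c] for c in range(r + 1, n))) / aug[r][r]
    return x

def homography(camera_pts, screen_pts):
    """Eight coefficients of the perspective transform from four point pairs."""
    A, b = [], []
    for (x, y), (u, v) in zip(camera_pts, screen_pts):
        A.append([x, y, 1, 0, 0, 0, -x * u, -y * u]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -x * v, -y * v]); b.append(v)
    return solve(A, b)

def apply_homography(h, x, y):
    """Map a camera-space point (x, y) into screen coordinates."""
    a, b, c, d, e, f, g, k = h
    w = g * x + k * y + 1
    return ((a * x + b * y + c) / w, (d * x + e * y + f) / w)

# Calibration: what the Wiimote's camera saw at the four projected corners
# (left, made-up values) versus where those corners actually sit on a
# 1024x768 projected image (right).
camera = [(120, 90), (880, 110), (860, 700), (140, 680)]
screen = [(0, 0), (1024, 0), (1024, 768), (0, 768)]
h = homography(camera, screen)

# Any later pen reading can now be mapped into screen space to drive
# rotation, zooming, and so on in the projected application.
u, v = apply_homography(h, 500, 400)
```

The same four-point warp is what lets the projected image be a skewed quadrilateral on the wall while the pen still lands exactly where it should on screen.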
The thing is, I wouldn't have come up with that kind of idea on my own, and would have been hard-pressed to get hold of some of the basic theory to follow through on it, without these kinds of established and tolerated amateur cultures devoted to various esoteric fields of knowledge existing in the first place. While there are obviously going to be concerns about significant aspects of them - reliability for a page on historical maps, safety issues on amateur science sites specifically devoted to "fires and loud noises," etc. - I do like the variety, openness, and sheer weirdness of a lot of these sorts of resources, and I approve of living in a world where it's fairly easy to take just about any hobby or discipline and find a thriving community of engaged, helpful devotees already involved in it at any level from greenest beginners to world-renowned experts.
This is getting far more tome-ish than I originally planned, so I think I'll be belatedly polite and cut it off here. Next I want to talk about the digital aspects of this sort of thing. Between the Internet's ubiquity, its ability to help spread and coordinate amateurs of all skill levels from around the world, and its own distinct cultures as regard ideas and the transmission thereof, there are some pretty profound implications - and challenges - for historians to consider.
[1] – I was casting about earlier for some URLs connected to this, as I remembered coming across mention of it via a couple of news stories and blog posts a year or so ago. Upon asking a friend who had pointed me at said stories in the first place if he remembered where we'd seen them, he pointed me at this URL, a domain dedicated to the subject, instead. I (foolishly) expressed surprise at that and was informed that I "underestimate the Internet at [my] peril. ;)" I do indeed.
[2] – I’m entitled to one or two shameless straw men per semester, so I don’t feel too guilty about this one.
[3] - I was also inspired on this note by Jeff Han's magnificent presentation at the TED Talks in February 2006, in which he demos an intuitive interface that gets rid of mice and keyboards altogether. I want three.
Sunday, April 13, 2008
Mindsets in the Digital Humanities
It’s easy to get lost in the technical aspects of digital humanities. After all, isn’t the “digital” part of the term fundamental to the condition of the discipline? It would seem, anyway, that that was the core part of this whole area of study. Most of the elements we’ve been studying in the course of this year’s Digital History course at Western have had that technical nature about them. These have varied in complexity from subject to subject, of course; there is little that is really technical in a computing sense about how not to design web pages, but discussions of spam filtering techniques require some knowledge of computing theory to grasp, and their historical connections aren’t immediately apparent. All of these, however – and most of what we’ve discussed in the last year – have in common that substantial “digital” focus. The impression at first glance is that we’re talking about *technology* and history, not technology *and* history or technology and *history*.
It is important, however, to keep the “humanities” half of “digital humanities” in mind. The technologies we use these days, either for historical research and presentation or for any number of other uses, are – for now, mostly – designed and used by people. As we know as historians, people who aren’t mathematicians are likely to bring quite a lot of outside baggage into their work. The tools, theories and outputs used in the digital humanities – and anything else, for that matter – are going to reflect certain cultures, backgrounds and key assumptions inherent to those who produce them. In my previous post, I discussed Bonnett’s use of the term “hieroglyph” to refer to interfaces or tools which are too difficult for layfolk to readily understand. This can also be applied to the mindsets of people who produce those tools, which can either provide another layer of obfuscation or simply be the source of the initial problems. This can produce some elements of culture shock on top of the learning curves involved in using new technologies.
As an example: in class about a month ago, we were talking about locative technologies and ubiquitous computing. In the middle of the readings for that day was an article by Bruce Sterling on “blobjects” which started something of a stir in the discussion. A lot of this seemed to be about Sterling’s writing style in his article. Fair enough: technophile and spec-fic geek though I am, even I found it hyperbolic, annoying and laden with for-its-own-sake jargon. But there was some substantial context behind the words being written in that article. Sterling is a science fiction writer; not only that, but one of the writers whose work helped define and establish one of the most computer-centric fields of science fiction, cyberpunk, as a thriving genre. He was speaking at SIGGRAPH, a prestigious conference on computer graphics and research on same. There is going to be a different set of approaches, of expectations, of worldviews in a group of people who are likely to non-ironically talk about The Future (with Emphatic Capitalization, of course) than there would be for those who tend to get published in the Journal of Hellenic Studies. (There’s also going to be certain expectations from the audience on the author. I’m certain Sterling delivered on that front, but I also wasn’t the audience so I can’t be sure.)
So where am I going with all of this?
The plan here is to get a series of four or so posts up in the next few days, where I'll try to look at some of the approaches and mindsets out there which strongly influence different aspects of digital humanities (while also trying to draw together a bunch of material from my time in this course and program). I’m convinced that a combination of the digital cultures which developed in academia and migrated onto the net, and the rebirth of a broader amateur culture over the last few years, provides a lot of the foundation beneath the digital humanities in general, and is only going to influence them more as they become more popular over the next several years. People don't, of course, need to be fully involved in, or even that aware of, what's going on behind the scenes of the tools they use in their day-to-day lives or projects in order to use said tools. It helps, however, especially when it comes to encountering concepts which are relatively new or strange, such as a growing emphasis on technology in the presentation of history and other humanities.