Witty Yet Insightful Title<br />pstewart - http://www.blogger.com/profile/13634698609139164994<br /><br />Showing the Extra Stuff (2008-04-30)<br /><br />While it's often tempting - and fun - to think of digital history applications (or digital applications at all) as large, flashy, extroverted things laden with all manner of bells, whistles and gongs, some of the neater ones are just as likely to be subtler or halfway invisible.<br /><br />I recently came across a neat equivalent of that sort of thing while catching up on various world events on <a href="http://news.bbc.co.uk">BBC News</a>' site last week. BBC News' articles increasingly have sidebars, popup windows, small videos, and the like, which add extra substance, detail or visual examples to complement the text. While reading <a href="http://news.bbc.co.uk/2/hi/africa/7360979.stm">one article about the chaos in Zimbabwe</a> and the international reaction to it, I found the infobox midway through the article to be a really good example of this sort of thing. Using nothing but very basic HTML (check the page source!), the article manages to briefly describe and illustrate the stances of ten African countries near Zimbabwe not only clearly, but in a way that takes up only a few square inches of screen real estate. Those who don't want to follow up can just keep scrolling through the article, and those who do can get a decent amount of information (by news-article standards) easily, compactly, and with minimal requirements for outside software (like the generally-overdone use of Flash for just about everything).<br /><br />I can't argue much with the timing of coming across that kind of example (or at least coming across it while equipped to notice it), in any case. 
Having recently finished one educational project as part of the public history program, setting off on another as part of a summer internship, and tentatively putting together material for a third major one in the fall, I've had the presentation of supplemental information on my mind for a while. I generally think it's better to err on the side of presenting more information rather than less. At the same time, I know full well that if I'm erring on the side of excess, doing so in a manner that isn't overwhelming is rather important. This one does that job quite well, and it's got me thinking of ways to adapt it to various purposes in my current and future projects.<br /><br />Eye candy and the creation thereof (2008-04-21)<br /><br />I've been banging the visualization drum this past year, in case it wasn't obvious. It's something I'm thinking of putting some more work into over the next few months, more along the lines of creation than simple use. There's also likely to be a substantial component of it in my summer internship, but that's as much excuse to get into this as it is reason.<br /><br />To that end, I'm trying to dig up some resources on visualization in interactive projects - stuff as much on a theoretical/technical level as anything else, sort of like the <a href="http://www.amazon.com/Designing-Interaction-Creating-Applications-Devices/dp/0321432061/ref=pd_bbs_sr_1?ie=UTF8&s=books&qid=1208840142&sr=8-1">interaction design textbook</a> we were using in the <a href="http://digitalhistory.uwo.ca/h513_0708/">digital history course</a> but more focused. 
A friend of mine out west, who has worked at <a href="http://www.bioware.com/">BioWare</a> for a while - and who is thus completely surrounded by interactive visualization of the highest order - less recommended than demanded that I check out <a href="http://ccom.unh.edu/vislab/CWBio.html">Colin Ware</a>'s <a style="font-style: italic;" href="http://www.amazon.com/Information-Visualization-Perception-Interactive-Technologies/dp/1558605118">Information Visualization: Perception for Design</a> as a starting point. I plan to pillage it from Western's library tomorrow.<br /><br />Newbie to the field as I am, on any kind of detailed level, I'm curious as to what else could be out there. Is anyone reading this who's familiar with the field aware of any other resources I might want to try checking out?<br /><br />Here, have a neat gadget (2008-04-19)<br /><br />Just some mindless link propagation for the day. I recently stumbled over <a href="http://news.bbc.co.uk/2/hi/technology/7346325.stm">this link</a> from the <a href="http://news.bbc.co.uk">Beeb</a>:<br /><blockquote><p class="first">A stroll around the ancient city of Pompeii will be made possible this week thanks to an omni-directional treadmill developed by European researchers. </p><p> The treadmill is a "motion platform" which gives the impression of "natural walking" in any direction. </p><p> The platform, called CyberCarpet, is made up of several belts which form an endless plane along two axes. </p><p>Scientists have combined the platform with a tracking system and virtual reality software recreating Pompeii.</p></blockquote><br /><br />The bulk of what I have to say about this one is "Cool!" 
coupled with a big list of other places I'd like to see something like this used, so I think I'll leave it at that for now.<br /><br />Information Wants To Be Anthropomorphized (2008-04-16)<br /><br />Anyone else ever notice that generations seem to be measured in months rather than decades these days? On the one hand, I'm exceedingly annoyed at the frequency with which these terms get reinvented and thrown around; people need an attention span, rather than recharacterizing entire generations after The Next Big Thing. On the other hand, the fusion of the old biological or cultural standard of generations with a new definition that feels more like <a href="http://news.bbc.co.uk/2/hi/technology/7080772.stm">Moore's Law</a> than anything else is rather fascinating.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://imgs.xkcd.com/comics/wikipedian_protester.png"><img style="margin: 0pt 0pt 10px 10px; float: right; cursor: pointer; width: 440px; height: 240px;" src="http://imgs.xkcd.com/comics/wikipedian_protester.png" alt="Image courtesy of xkcd.com" border="0" /></a>In my <a href="http://p-stewart.blogspot.com/2008/04/for-its-own-sake.html">last post</a> I spent a while talking about the growth of amateur culture from a fairly niche sort of thing to a broader - and far more accessible - phenomenon. Considering the extent to which we've been throwing terms like "MySpace Generation" or "YouTube Generation" around in the last several years, I think it's pretty safe to suggest that the Net <span style="font-style: italic;">might </span>have had something to do with this. 
It makes sense to suggest that the Net is going to shape the interpretation and presentation of even traditionally academic subjects: it's utterly ubiquitous in the developed world these days, and large swathes of its original incarnation developed out of academia in the first place anyway.<br /><br />While it's weird to think so these days, with the Internet's recent proliferation of just about any kind of frivolity, inaccuracy, or Thing Man Was Not Meant To See, the Internet was once, at its heart, a fundamentally academic institution. Its culture was largely that of the American computer-science student community, with rhythms organized around the academic year. Many <span style="font-style: italic;">still </span>use the term "<a href="http://catb.org/jargon/html/S/September-that-never-ended.html">September that Never Ended</a>" to describe all time since September 1993 - referring to the annual influx of newbies with little awareness of proper <a href="http://www.albion.com/netiquette/corerules.html">netiquette</a>, which became an unending flood that fall - and speak with an almost religious hope of the prophesied October 1, 1994, which has yet to arrive. "What's your major?" was a question equivalent to "What's your sign?", and so on.<br /><br />While it's somewhat obvious that the bulk of online culture is no longer centered around those kinds of patterns, a core of them is still around and at least influences a large portion of the Net. Large swathes of the parts of the Net with a useful <a href="http://catb.org/jargon/html/S/signal-to-noise-ratio.html">signal-to-noise ratio</a> have much in common with the Net's original large-scale discussion system, Usenet, before its collapse in the mid-1990s. Usenet, and its current successors[1], combined an anything-goes approach to subject matter with a lack of any barriers to participation beyond simple access to the network in the first place. 
The results varied from the completely silly (such as most of the thousands of groups in the "alternate" hierarchy[2]) to general-interest groups (such as <a href="http://groups.google.ca/group/rec.pets.cats/topics?lnk=gschg">rec.pets.cats</a>), to artistic communities (such as <a href="http://groups.google.ca/group/alt.fiction.original/topics">alt.fiction.original</a>, which a couple of friends and I created in the mid-90s as an escape from the ubiquity of horrific fanfiction), to more academic topics (such as <a href="http://groups.google.com/group/sci.archaeology/topics?lnk=rgh">sci.archaeology</a>).<br /><br />Unless someone was already known by name otherwise - a known scholar in a given field, say, or one of the various alpha geeks of the Internet at the time[3] - there were no clear<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjN2DZbsjHhy11-Y7i4ftkXAFbC-_NL8zEYMQtYH21sxlgN_taBlnKfDd82qA_VMf6Mkf9-RUD4QvlW9MQZ25uFJN3eTybTqc3KIGoly9gIWAHiWF8t-y6bM0cRWD4oOEQvwU8L8fIK6Xst/s1600-h/internet-dog.gif"><img style="margin: 0pt 10px 10px 0pt; float: left; cursor: pointer;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjN2DZbsjHhy11-Y7i4ftkXAFbC-_NL8zEYMQtYH21sxlgN_taBlnKfDd82qA_VMf6Mkf9-RUD4QvlW9MQZ25uFJN3eTybTqc3KIGoly9gIWAHiWF8t-y6bM0cRWD4oOEQvwU8L8fIK6Xst/s320/internet-dog.gif" alt="" id="BLOGGER_PHOTO_ID_5190082316630083330" border="0" /></a> credentials or pecking orders to go by, other than reputations as established in a given forum. The result was an odd sort of situation in which someone using a real name and some postnominals and someone else with an elaborate and obvious pseudonym would generally start out on an equal footing in discussion and diverge based on demonstrated competence alone. 
(As an example, someone I knew some years back, who wound up hugely respected in the spam-fighting field, was - and to my knowledge still is - known only as "Windigo the Feral.") While a lot of people were, and still are, dismayed at the potential for this kind of anonymity, many others found (and find) it liberating.<br /><br /><span>This kind of digital culture - gleefully eccentric, technically minded, generally libertarian and explicitly meritocratic - and its descendants have had a tremendous influence on a lot of digital-humanities concepts, between their broader impact on digital culture in general and the fact that a lot of people working in the digital humanities tend to have strong computing backgrounds. This kind of community is the source of the whole <a href="http://www.opensource.org/">open</a> <a href="http://sourceforge.net/">source</a> <a href="http://code.google.com/opensource/">movement</a>, and of any number of tools I've mentioned on this blog. <a href="http://www.zotero.org/">Zotero</a>, <a href="http://www.blender.org/">Blender</a>, <a href="http://www.shatters.net/celestia/">Celestia</a>, and <a href="http://freemind.sourceforge.net/wiki/index.php/Main_Page">Freemind</a> are just four examples that immediately come to mind; each came out of this movement, and each is a program I've managed to turn to one historical purpose or another at least once this year. Commercial equivalents of each already exist, but these have the little advantage of being free. 
When compared to <a href="http://www.endnote.com/">Endnote</a> ($300), <a href="http://www.newtek-europe.com/uk/price/price_e.html#lightwave">Lightwave</a> ($1100 - thousands of dollars cheaper than when I last checked!), <a href="http://www.carinasoft.com/voyager4/index.html">Voyager 4</a> ($200) or <a href="http://shop.mindjet.com/site/MindManagerPro7.html">Mind Manager Pro</a> ($350), I know full well where my preferences are going to lie - particularly since both sets of software are often similar or identical in terms of capabilities, even when correcting for open-source software's tendency towards lousy interfaces. There are moves to <a href="http://www.schoolforge.net/">unify these sorts of things for educational purposes</a>, even in the realm of <a href="http://laptop.org/">the physical computers themselves</a>. And to make it even better, should I feel moved to and have the capability, I can, as Dr. Turkel put it, <a href="http://digitalhistoryhacks.blogspot.com/2008/02/freedom-of-expression.html">feel free to fix anything that is bugging me</a>.<br /><br />This sort of attitude - that I (and you!) should have a say in what's going on if we're able to contribute in a competent manner - is, I find, a healthy one to have in the first place, and fairly central to a lot of the ideas of what academia was meant to be to begin with. I don't like gatekeepers who don't have a really good reason to be such, unlike the trauma surgeon I mentioned in my classically unfair example in my previous post. 
When you combine it with the capability for easy exchange of information that modern technology gives us, it's only a short trip to a whole crop of notions like <a href="http://infotangle.blogsome.com/2005/12/07/the-hive-mind-folksonomies-and-user-based-tagging/">folksonomies</a>, the pursuit of <a href="http://battellemedia.com/archives/000063.php">databases of intentions</a>, the turning of filtering techniques meant to <a href="http://www.paulgraham.com/spam.html">catch spam</a> into more sophisticated <a href="http://www.siefkes.net/ie/">information-extraction methods</a>, and any number of other things. Between the basic potential of these sorts of techniques and the willingness of so many of those pursuing them to share their research and tools all over the place (notice that one of the items on that netiquette link above was "share your expertise"?), I think it's a pretty exciting time to be an historian. Dealing with the mass of information currently sitting around due to electronic archiving is one thing; turning a lot of these tools on the stuff we've traditionally been condemned to looking through via microfiche readers is another entirely. </span><br /><br />There are potential problems with the Net and its cultures as a new base for scholarship as well, of course. Pretty much <span style="font-style: italic;">the </span>standard debate about the Internet's implications for scholarship is the reliability/verifiability issue. This is something I've <a href="http://p-stewart.blogspot.com/2007/12/enemy-at-gates.html">discussed</a> <a href="http://p-stewart.blogspot.com/2008/03/on-haystacks.html">before</a> on this blog, and will probably continue to do so for some time; it's a pet topic of mine. 
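(An aside, to make the spam-filter-to-information-extraction jump above a little more concrete: here's a toy sketch - my own illustration, not code from any of the linked projects - of the Bayesian word-scoring idea behind filters like the one Paul Graham describes, applied to two invented piles of text.)

```python
# Toy sketch of Bayesian word scoring, the statistical trick behind
# spam filters of the Graham variety. The training texts are invented
# for illustration; any two labelled piles of documents would do.
from collections import Counter

def train(docs_a, docs_b):
    """Return, per word, the probability that it signals pile A."""
    count_a = Counter(w for d in docs_a for w in d.lower().split())
    count_b = Counter(w for d in docs_b for w in d.lower().split())
    vocab = set(count_a) | set(count_b)
    probs = {}
    for w in vocab:
        # Laplace smoothing so unseen words don't zero everything out.
        pa = (count_a[w] + 1) / (sum(count_a.values()) + len(vocab))
        pb = (count_b[w] + 1) / (sum(count_b.values()) + len(vocab))
        probs[w] = pa / (pa + pb)
    return probs

def score(probs, text):
    """Naively combine the per-word scores into one 0-to-1 verdict."""
    p_yes = p_no = 1.0
    for w in text.lower().split():
        p = probs.get(w, 0.5)  # unknown words are neutral
        p_yes *= p
        p_no *= 1 - p
    return p_yes / (p_yes + p_no)

spam = ["buy cheap pills now", "cheap pills cheap deals"]
ham = ["meeting notes from the seminar", "seminar reading list notes"]
probs = train(spam, ham)
print(score(probs, "cheap pills"))    # high: resembles the first pile
print(score(probs, "seminar notes"))  # low: resembles the second pile
```

The machinery doesn't care whether the two piles are "spam" and "not spam" or, say, two candidate authors or periods for a batch of documents - which is roughly why repurposing it for information extraction is so tempting.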
Even a cursory reading of what I have to say over my time writing here should make it clear that I fall firmly into the "digital resources are useful and often reliable" camp - and, more to the point, that I despise the casual elitism which says one needs a tangible degree to even attempt to contribute to knowledge on these issues. Despite that, I have my own problems with content on the Net, and try to turn a critical eye onto it. Besides, the reliability issue isn't limited to history online. Academic history isn't infallible, sometimes having a real problem with uncritical use of sources (as I found out during my undergraduate thesis, watching Thomas Szasz get invoked as a reliable source on mental illness several times). Public history has its own problems, as evidenced by controversial manipulation of exhibits at the <a href="http://www.warmuseum.ca/">War Museum</a> or <a href="http://www.si.edu/">Smithsonian</a>, or the mendacious silliness of documentaries like <span style="font-style: italic;">The Valour and the Horror</span>. There are differences of degree, but we do need to take the beam out of our own eye on that account, no?<br /><br />While I have a thorough appreciation for the attitudes of openness and general egalitarianism which have been fundamental parts of digital culture since well before I started seeing it in the early nineties, I think they can be taken too far or too blindly at times. As an example, I'm part of that rather small subset of big-on-digital-stuff historians who <span style="font-style: italic;">doesn't </span>like Wikipedia - or, more to the point, the predominant intellectual culture on it. Jaron Lanier <a href="http://www.edge.org/documents/archive/edge183.html">echoes</a> a good chunk of my views in his essay on <a href="http://www.edge.org/">Edge</a> (another site on my "everyone should go here sometimes" list), though he takes his objections a little further than I would by invoking terms like "digital Maoism." 
On a more scholarly and financial level, there's <a href="http://chnm.gmu.edu/resources/essays/d/2">some debate</a> over the issue of open-access content for traditional scholarship, with arguments ranging from the silly ("socialized science!") to more <strike>sane</strike> measured discussions of the IP or financial effects of scholars' self-publishing or otherwise having more control over their work once it's in the field.<br /><br />These sorts of issues are going to be hovering around academia for some time. The "problem" is that they will do so regardless of whether academia goes down the path of open access or Wikiality or whatever else. <a href="http://ocw.mit.edu/OcwWeb/web/home/home/index.htm">Many</a> <a href="http://youtube.com/user/ucberkeley">institutions</a> <a href="http://cnx.org/aboutus/">are</a> <a href="http://www.perseus.tufts.edu/hopper/#">cheerfully</a> <a href="http://ocw.usu.edu/">doing</a> <a href="http://ocw.jhsph.edu/">this</a> <a href="http://ocw.tufts.edu/">kind</a> of thing, with some advocates going so far as to make religious allusions through terms like "<a href="http://www3.isrl.uiuc.edu/%7Eunsworth/liberation.html">liberation technology</a>." Those who don't, however, will still have to respond to it in one way or another, as it becomes more and more obvious that simply saying "don't use online sources" is not an acceptable approach.<br /><br />The interesting aspect of this shows up when we start to see a fusion of the different cultures I've been talking about lately. 
We've got the amateur culture, big on do-it-yourselfing and drawn into its various pursuits out of a genuine desire to do something; we've got the loosely meritocratic and not-so-loosely anarchistic elements of digital culture, with its appreciation of openness in design and knowledge and its growing use of hugely collaborative projects; and we've got the traditional academy, feeling around the edges of the potential of this merger. Of course there's room for uneasiness between the three elements, but at heart all three are (in theory) built around an appreciation for knowledge and ability.<br /><br />[1] - Google has since largely revived Usenet with its establishment of <a href="http://groups.google.com/">Google Groups</a> a few years ago (and the newsgroup links are courtesy of them); a chunk of the "original" Internet is once again out there, active, vibrant and wasting bazillions of man-hours per day. Huzzah!<br /><br />[2] - <a href="http://en.wikipedia.org/wiki/Image:Usenet_Big_Nine.svg">This image</a> from Wikipedia's (I know, I know) <a href="http://en.wikipedia.org/wiki/Usenet">article on Usenet</a>, which explains the meanings of the different "hierarchies" of Usenet newsgroups, captures the alt.* hierarchy <span style="font-style: italic;">perfectly</span> in the eyes of anyone who remembers the original thing.<br /><br />[3] - "Anyone ... [who] knows Gene, Mark, Rick, Mel, Henry, Chuq, and Greg personally" is one popular definition of such people. 
The Net used to be a small world.<br /><br />For Its Own Sake (2008-04-15)<br /><br /><p class="MsoNormal">In terms of my historical pursuits, and many others besides, I consider myself an amateur, and very much hope to stay that way.<br /><br />This has little to do with hoping to maintain humility in the face of a vast amount of knowledge, or with recognition of living in a postmodern world where I'm doomed not to really know anything about anything, or anything else along those lines. (There's an element of truth to each, of course; if there's anything that approaching completion of an advanced degree in history has taught me, it's precisely how little I know.) What I'm talking about is a rather older definition of the term.<br /><br />The word "amateur" doesn't exist merely as an antonym to "professional" (or, more insidiously, "competent"). Being an amateur, ideally, is being an <i style="">amator</i>: doing or pursuing something out of love of the subject and a desire to pursue it, with any vocational aspects secondary. This doesn't necessarily require years of formal training in a discipline, though I obviously think that helped in my own case. That lack of training can often be worked around by dedicated amateurs, however, sometimes in very surprising ways.<br /></p><p class="MsoNormal">Consider chemistry sets. 
As we have been Thinking Of The Children rather more than is necessary for the last several years, they have, along with rather too many other things, fallen into disuse and obscurity, to the point where it's somewhere between difficult and impossible to find a "real" one. Depending on the <a href="http://xkcd.com/382/">safety level</a> of various science-related hobbies, this is probably sometimes a good thing, but on the whole I think we're losing something when we eliminate those kinds of incentives for learning. Others seem to agree with me. As a reaction to this sort of thing, many people - and, rather likely, no shortage of kids - have set about creating their own chemistry sets or ad-hoc equivalents thereof from scratch. However, the supplies needed are themselves difficult or impossible to find, for the same reason a lot of the sets are.</p><p class="MsoNormal">This hasn't prevented Darwin's pager from being set off, however. Rather than abandon their interests due to the difficulty of getting modern materials, quite a few would-be amateur chemists have gotten together and <a href="http://www.sciencemadness.org/madscifaq.html">formed their own communities</a> [1] in which they've gone back to older textbooks - which often presuppose far less equipment or financial resources - in order to learn how to create the reagents or gear they need from scratch. This probably upsets the <a href="http://www.dhs.gov/index.shtm">Department of Homeland Security</a> or <a href="http://www.ps-sp.gc.ca/index-en.asp">Public Safety Canada</a>, but on the other hand just about anything does anyway, and I like the existence of environments like these where people can get together to learn about things.</p><p class="MsoNormal">That's simply a more spectacular example than most of what happens when a bunch of amateurs get together and decide to pursue their interests. 
Similar things abound on more conventional, less high-entropy levels: <a href="http://www.rac.ca/">ham radio operators</a>, <a href="http://www.rhcask.ca/debate-of-the-century/">historical reenactors</a>, <a href="http://amateurastronomy.org/">astronomers</a> and so on have long provided an avenue for this sort of thing. Thanks to the various services and methods of communication brought about by the Internet, these cultures are changing in several ways. Perhaps the most obvious impact is that groups of amateurs are no longer restrained by geography. While the Hamilton Amateur Astronomers are rather obviously connected to the city of Hamilton, the members of Science Madness are less concerned about being in the same place. This can often be a hook for people who have an interest in one thing or another but found no avenue or reason to pursue it seriously, due to a lack of resources, knowledge, or like-minded individuals to create some nice <a href="http://xkcd.com/140/">positive feedback loops</a> which encourage further learning or practice. I know I've picked up or maintained several interests as a result of encountering such groups, and I'm sure several of those reading this have run into similar sorts of things.</p><p class="MsoNormal">The spread of these amateur cultures has been seen as a mixed blessing, of course. There are areas in which a lack of formal training or instruction really <i style="">is</i> a problem. Amateur historians or even chemists can often make those of us within the formal discipline twitch, but I'd be much happier dealing with either than with an amateur trauma surgeon. 
[2] On a less life-and-death level, in matters involving creativity, activism, history and so on, there are more shades of grey and room for <a href="http://www.lessig.org/blog/2007/05/keens_the_cult_of_the_amateur.html">vigorous debate</a>, which often generates more heat than light but winds up illuminating nonetheless. At this point, in any case, there is <a href="http://www.wikipedia.org">no</a> <a href="http://www.amateurastronomy.com/">shortage</a> <a href="http://www.diynetwork.com/">of</a> <a href="http://www.instructables.com/">subjects</a>, <a href="http://www.wikihow.com/Main-Page">sites</a> <a href="http://www.reenactor.net/">and</a> <a href="http://www.eventplan.co.uk/history_of_reenactment.htm">organizations</a> where anyone - amateur or expert - can dive into something without necessarily short-changing themselves.</p><p class="MsoNormal">A good chunk of my current history-related amateur geekery has involved <a href="http://digitalhistory.uwo.ca/h513_0708/">513</a> over the last little while, as a component of our major class project, for which we mounted a series of interactive exhibits built around the (very) general theme of "the sky." (My group focused on the <a href="http://www.thespacerace.com/">space race in general</a>, and <a href="http://history.nasa.gov/sputnik/">Sputnik</a> in particular.) I've been interested in interactivity and visualization as tools to make exhibits more engaging for a while, and thought I'd try to find some good visualizations, and then ways for people to engage with them. The visualization part was a breeze, given the subject: I made use of <a href="http://www.shatters.net/celestia/">Celestia</a>, a fantastic - and free! - space simulation program that anyone interested in space needs to download <span style="font-style: italic;">right now</span>. (I mean that. 
Why are you still here?)<br /></p><p class="MsoNormal">For interacting with it, I had planned on doing something a little more esoteric. Working with the SMART Board project for <a href="http://digitalhistory.uwo.ca/h500_1_0708/">the main public history course</a> got me interested in interactive whiteboards in general, especially once we got to the point of pushing the limits of their capabilities somewhat. I like touchscreens in general, as a way of getting past that boring keyboard-talks-to-the-computer tradition. [3] To that end, I thought I'd try to <span style="font-style: italic;">build </span>an interactive whiteboard in a roundabout way, with a combination of a projector hooked up to my desktop and a sensor built from a homebuilt IR light pen and a Wii remote, inspired by Johnny Chung Lee's blog "<a href="http://procrastineering.blogspot.com/">Procrastineering</a>." The idea was to project the image onto a wall, which could then be manipulated with one (or, preferably, two) pens to rotate, zoom, etc. the final product. I got partway through the process before the suddenly-Schrödingerian status of our having a projector - and the end-of-semester crunch season in general - caused the Wii component of the whole project to end up by the wayside, forcing us to settle on the visualization and some audio I drew together towards the end. The whiteboard shall exist in time - indeed, it <span style="font-style: italic;">must</span>, since it's going to remain a splinter in my mind until I get the thing working - but it has been relegated to a summer project.</p><p class="MsoNormal">The thing is, I wouldn't have come up with that kind of idea on my own, and would have been hard-pressed to get hold of some of the basic theory to follow through on it, without these kinds of established and tolerated amateur cultures around various esoteric fields of knowledge existing in the first place. 
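(For the curious: the mathematical heart of that Wiimote-whiteboard scheme is a one-time calibration. The Wii remote reports pen positions in its own camera coordinates, and touching the pen to the four projected corners gives enough point pairs to solve for the perspective mapping onto the desktop. A minimal sketch of that calibration step - my own illustration with invented numbers, not code from Lee's project:)

```python
# Sketch of four-point calibration for a Wiimote whiteboard: solve for
# the planar homography mapping the Wii remote's IR-camera coordinates
# onto desktop-screen coordinates, then map live pen positions through it.
# All the coordinate values below are invented for illustration.
import numpy as np

def solve_homography(src, dst):
    """Solve the 8 unknowns of a planar homography from 4 point pairs."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1).reshape(3, 3)  # fix the ninth entry at 1

def map_point(H, x, y):
    """Map one camera-space pen position into screen space."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Hypothetical calibration: pen touched to the four projected corners,
# as seen by the Wiimote's camera, paired with the 1280x800 desktop corners.
camera_corners = [(120, 90), (900, 110), (880, 700), (140, 680)]
screen_corners = [(0, 0), (1280, 0), (1280, 800), (0, 800)]
H = solve_homography(camera_corners, screen_corners)
print(map_point(H, 500, 400))  # a pen position somewhere mid-screen
```

Once the mapping exists, each IR blob the Wiimote reports just gets pushed through it and fed to the window system as a cursor position; the hard part in practice is keeping the pen's IR LED visible to the camera.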
While there are obviously going to be concerns about significant aspects of them - reliability for a page on <a href="http://www.thomaslessman.com/History/Maps.html">historical maps</a>, safety issues on <a href="http://www.roguesci.org/">amateur science</a> sites specifically devoted to "fires and loud noises," etc. - I do like the variety, openness, and sheer weirdness of a lot of these sorts of resources, and I approve of living in a world where it's fairly easy to take just about any hobby or discipline and find a thriving community of engaged, helpful devotees already involved in it, at every level from greenest beginners to world-renowned experts.<br /></p><p class="MsoNormal">This is getting far more tome-ish than I originally planned, so I think I'll be belatedly polite and cut it off here. Next I want to talk about the digital aspects of this sort of thing. Between the Internet's ubiquity, its ability to help spread and coordinate the work of amateurs of all skill levels from around the world, and its own distinct cultures as regards ideas and the transmission thereof, there are some pretty profound implications - and challenges - for historians to consider.<br /></p><p class="MsoNormal">[1] - I was casting about earlier for some URLs connected to this, as I remembered coming across mention of it via a couple of news stories and blog posts a year or so ago. Upon asking a friend, who had pointed me at said stories in the first place, whether he remembered where we'd seen them, he pointed me instead at the URL above, a domain dedicated to the subject. I (foolishly) expressed surprise at that and was informed that I "underestimate the Internet at [my] peril. ;)" I do indeed.</p><p class="MsoNormal">[2] - I'm entitled to one or two shameless straw men per semester, so I don't feel <i style="">too </i>guilty about this one. 
<span style="font-style: italic;"><br /></span></p><p class="MsoNormal">[3] - I was also inspired on this note by Jeff Han's <a href="http://www.ted.com/index.php/talks/view/id/65">magnificent presentation</a> at the <a href="http://www.ted.com/">TED Talks</a> in February 2006, in which he demos an intuitive interface that gets rid of mice and keyboards altogether. I want three.<br /><span style="font-style: italic;"></span></p>pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0tag:blogger.com,1999:blog-2249968205425769817.post-91909446485974794022008-04-13T11:55:00.000-07:002008-04-13T12:42:39.300-07:00Mindsets in the Digital Humanities<p style="margin-bottom: 0in;">It’s easy to get lost in the technical aspects of digital humanities. After all, isn’t the “digital” part of the term fundamental to the condition of the discipline? It would seem, anyway, that that was the core part of this whole area of study. Most of the elements we’ve been studying in the course of this year’s <a href="http://digitalhistory.uwo.ca/h513_0708/">Digital History course</a> at <a href="http://www.uwo.ca/">Western</a> have had that technical nature about them. These have varied in complexity from subject to subject, of course; there is little that is <i>really</i> technical in a computing sense about <a href="http://chnm.gmu.edu/resources/essays/topten.php">how not to design web pages</a>, but <a href="http://www.paulgraham.com/spam.html">discussions of spam filtering techniques</a> require some knowledge of computing theory to grasp, and their historical connections aren’t immediately relevant. All of these, however – and most of what we’ve discussed in the last year – have in common that substantial “digital” focus. 
The impression at first glance is that we’re talking about <i>technology</i> and history, not technology and <i>history</i> or <i>technology and history</i>.</p><p style="margin-bottom: 0in;"><br /></p><p style="margin-bottom: 0in;"> </p><p style="margin-bottom: 0in;">It is important, however, to keep the “humanities” half of “digital humanities” in mind. The technologies we use these days, either for historical research and presentation or for any number of other uses, are – for now, mostly – designed and used by people. As we know as historians, people <a href="http://xkcd.com/263/">who aren’t mathematicians</a> are likely to bring quite a lot of outside baggage into their work. The tools, theories and outputs used in the digital humanities – and anything else, for that matter – are going to reflect certain cultures, backgrounds and key assumptions inherent to those who produce them. In my <a href="http://p-stewart.blogspot.com/2008/03/omnia-mihi-lingua-graeca-sunt.html">previous post</a>, I discussed Bonnett’s use of the term “hieroglyph” to refer to interfaces or tools which are too difficult for layfolk to readily understand. This can also be applied to the mindsets of people who produce those tools, which can either provide another layer of obfuscation or simply be the source of the initial problems. This can produce some elements of culture shock on top of the learning curves involved in using new technologies.</p><p style="margin-bottom: 0in;"><br /></p> <p style="margin-bottom: 0in;"></p> <p style="margin-bottom: 0in;">As an example: in class about a month ago, we were talking about locative technologies and ubiquitous computing. In the middle of the <a href="http://digitalhistory.uwo.ca/h513_0708/?page_id=23">readings for that day</a> was an <a href="http://www.boingboing.net/images/blobjects.htm">article</a> by<b> </b><a href="http://blog.wired.com/sterling/">Bruce Sterling</a> on “blobjects” which started something of a stir in the discussion. 
A lot of this seemed to be about Sterling’s writing style in his article. Fair enough: technophile and spec-fic geek though I am, even <i>I </i>found it hyperbolic, annoying and laden with for-its-own-sake jargon. But there was some substantial context behind the words being written in that article. Sterling is a science fiction writer; not only that, but one of the writers whose work helped define and establish one of the most computer-centric fields of science fiction, <a href="http://project.cyberpunk.ru/idb/scifi.html">cyberpunk</a>, as a thriving genre. He was speaking at <a href="http://www.siggraph.org/">SIGGRAPH</a>, a prestigious conference on computer graphics and research on same. There is going to be a different set of approaches, of expectations, of worldviews in a group of people who are likely to non-ironically talk about The Future (with Emphatic Capitalization, of course) than there would be for those who tend to get published in the Journal of Hellenic Studies. (There’s also going to be certain expectations from the audience on the author. I’m certain Sterling delivered on that front, but I also wasn’t the audience so I can’t be sure.)</p><p style="margin-bottom: 0in;"><br /></p><p style="margin-bottom: 0in;"> </p><p style="margin-bottom: 0in;">So where am I going with all of this?</p><p style="margin-bottom: 0in;"><br /></p><p style="margin-bottom: 0in;">The plan here is to get a series of four or so posts up in the next few days, where I'll try to look at some of the approaches and mindsets out there which strongly influence different aspects of digital humanities (while also trying to draw together a bunch of material from my time in this course and <a href="http://history.uwo.ca/gradstudy/publichistory/">program</a>). 
I’m convinced a combination of parts of the digital cultures which developed in academia and migrated onto the net, and the <a href="http://news.bbc.co.uk/1/hi/entertainment/4685471.stm">rebirth of a broader amateur culture</a> from the last few years, provides a lot of the foundation beneath digital humanities in general, and is only going to influence them more as they become more popular over the next several years. People don't, of course, need to be fully involved in, or even that aware of, what's going on behind the scenes of the tools they use in their day-to-day lives or projects in order to use said tools. It helps, however, especially when it comes to encountering concepts which are relatively new or strange, such as a growing emphasis on technology in the presentation of history and other humanities. <span style="font-weight: bold;"></span></p> <p style="margin-bottom: 0in;"> </p>pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0tag:blogger.com,1999:blog-2249968205425769817.post-20368136638698633882008-03-14T13:15:00.000-07:002008-03-14T13:29:21.675-07:00Omnia Mihi Lingua Graeca Sunt<p class="MsoNormal">Yesterday, in a fairly crowded <a href="http://www.lib.uwo.ca/weldon/"><u>Weldon</u></a>, I found myself in the nice and ironic position of being stymied by the self-checkout system to the point where I gave up and resigned myself to being stuck in front of a line of people, each of whom was checking out an entire floor of the place. The fact that the books I was trying to check out were a Marshall McLuhan book and another book on using technology to facilitate learning, of course, helped the inherent awesomeness of the whole situation. </p> <p class="MsoNormal">After I got bored of looking silly and checked the things out in <a href="http://xkcd.com/354/"><u>the manner of the previous century</u></a>, I got to thinking about the learning processes for various different things in general, and technology in particular.
It’s been bouncing through my head a lot this semester, which is probably a good thing when one of my textbooks is on interaction design. Much of that lately has come from my making heavy use of <a href="http://www.blender.org/"><u>Blender</u></a> for various personal and school projects, from figuring out the foibles of the <a href="http://smarttech.com/"><u>SMART Board</u></a> software for another project, and from considering seeing if I can pick up <a href="http://www.python.org/"><u>Python</u></a> as well. Only one of the three really strikes me as especially esoteric at this point, though the two I’ve got some experience messing with these days have each routinely set me off on some proper rants.</p><p class="MsoNormal"> </p><p class="MsoNormal">I’ve been reading Raymond Siemens’ and David Moorman’s <u>Mind Technologies: Humanities Computing and the Canadian Academic Community</u> for the last few days. It’s more or less what the title implies – a survey and discussion of the state of technology use in academia in the country, in the form of a series of essays or chapters by various scholars in different fields, discussing how they’re applying technologies to their various projects. One of the chapters was by John Bonnett, whose work I’ve <a href="http://p-stewart.blogspot.com/2007/11/ending-our-fences.html"><u>written about</u></a> here before. The chapter mainly discussed his <a href="http://3dlearning.iit.nrc.ca/3DVirtualBuildings/"><b style="font-weight: bold;"><u>Virtual Buildings Project</u></b></a><span style="font-weight: bold;">,</span> although the secondary focus of it was on, as he puts it, “how hieroglyphs get in the way.” Bonnett defines “hieroglyphs” in his chapter not as the pictographs used by Egyptians, Mayans, Mi’kmaq and so on, but rather expands the concept to cover any sort of system which is unnecessarily complex or abstruse.
When we encounter something we just can’t figure out, either due to a lack of knowledge on our part or a lack of effective design on the part of what we’re trying to use, it might as well be in another language.<span style=""><br /></span></p><p class="MsoNormal"> </p><p class="MsoNormal">Part of learning these other “languages” is, to be fair, our own burden to deal with. There are very few things out there which one <i style="">can’t</i> figure out, but quite a few that one <i style="">won’t </i>figure out, due to frustration, a lack of time or energy, and so on. And the more “languages” one knows, the more one can pick up as time goes by. With too little background, just about anything can come across as Linear A, frustratingly indecipherable despite our wishes or efforts.</p><p class="MsoNormal"> </p> <p class="MsoNormal"> </p> <p class="MsoNormal">But that doesn’t excuse opaque interfaces or user-hostile design in the process of developing whatever you’re working on, be it a scanner, a piece of graphics software, a book, or a simple tool. Back in December, <a href="http://carrieannlunde.blogspot.com/"><u>Carrie</u></a> <a href="http://carrieannlunde.blogspot.com/2007/12/week-12-assignment.html"><u>asked</u></a>, “why can there not be more formulated programs for the <i style="">not-so-computer-savvy </i>historian?” (emphasis added.) It’s a good question, one which should be asked more often and more forcefully than is done these days. There are a lot of incredibly powerful tools for any number of tasks just sitting out there – including the tools to, if need be, make additional ones – that are seeing very little use either because they’re hard to find or they’re difficult for the uninitiated to get their heads around.<span style=""> </span>As I said above, part of the responsibility to deal with that falls on those of us who want to use these things: a certain minimum of effort is necessary to learn how to do anything, of course.
At the same time, not everyone is going to have a background in computer science or the like, especially if they’re in the humanities.<span style=""><br /></span></p> <p class="MsoNormal"> </p> <p class="MsoNormal">Rather than use this lack of background as an excuse to avoid the entire field, we ought to be putting some effort into finding what’s out there that <i style="">can</i> be readily used and making it known to others. We could be finding tools that are out there and suffer from these learning-curve problems and trying to figure out how they can be improved through better interfaces, documentation, and so on. And probably most ambitiously – but most rewardingly – we could see what we can do about making, or at least planning or calling out for, some of those tools ourselves, as we (usually) know what we want in such things. Each of these is necessary for those of us wanting to incorporate new tools into our work as historians (whether public or academic), and aspects of each are at least feasible for any of us.<br /></p>pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0tag:blogger.com,1999:blog-2249968205425769817.post-63182348641560898412008-03-12T12:02:00.000-07:002008-03-12T12:50:13.208-07:00On Haystacks (Reports of my death, etc. Let's try taking that post-turned-gigantic-monster and converting it into a series rather than a single Epic Tome of a post!)<br /><br />“Oh, they just look things up online now,” people often say of students (or maybe just undergraduates) these days. As loaded statements go, it’s a pretty good one. The implications are manifold. It implies that Kids These Days lack the work ethic of us Real Scholars - people who, of course, <em>never</em> had to scramble to fill the necessary sources on a paper or started anything at the last minute. It implies that this will hasten the Imminent Death of the Library, or that it necessarily requires a decline in the quality of the students’ scholarship.
The idea that electronic sources are intrinsically bad or unreliable notwithstanding – a concept, it should be obvious by now, which I reject out of hand – there’s another implication about the above quote which matters rather more to me. The fact that people are willing to look beyond the world of monographs and journal articles, provided they don’t leave those behind entirely, is a good thing in my opinion. What does worry me is not <em>the fact</em> that students will “just look things up online” as much as the question of <em>whether they can</em>.<br /><br />To say there’s a <em>lot</em> of material out there is something of an understatement on the order of saying “setting yourself on fire is often counterproductive.” <a href="http://www.intute.ac.uk/artsandhumanities/history/">A</a> <a href="http://astronautix.com/">lot</a> <a href="http://www.theodoregray.com/PeriodicTable/">of</a> <a href="http://owl.english.purdue.edu/internet/resources/index.html">this</a> <a href="http://www.ted.com/index.php">material</a> <a href="http://www.bbc.co.uk/history/">is</a> <a href="http://fletcher.tufts.edu/inter_resources/Default.htm">nothing</a> <a href="http://swordforum.com/">short</a> <a href="http://conceptart.org/">of</a> <a href="http://conceptart.org/">fantastic</a> for casual, professional or academic reference, and really ought to be more obvious or widely-used than it is, either for the purposes of casual browsing or for actual study. (This isn't even taking amateur culture into account, something I want to address in a future post.)<br /><br />The difficulty arises from actually finding the stuff. A given resource,<em> once found</em>, may be laid out in an intuitive, accessible manner (<a href="http://www.webpagesthatsuck.com/">or not</a>), but getting there in the first place can often prove difficult. 
The Net as a whole is anarchic, indexed mainly in an ad-hoc manner when it is indexed at all, with the indices themselves usually hidden under the hoods of search engines (<a href="http://www.google.com/dirhp">Google Directory</a> and <a href="http://www.dmoz.org/">similar</a> <a href="http://dir.yahoo.com/">sites</a> notwithstanding). The visible indices that do actually exist are often obscure, arcane or both. Much like <a href="http://www.loc.gov/catdir/cpso/lcco/">another arcane system</a> of managing information many of us use as scholars often without a second thought, these systems can be adjusted to and eventually mastered.<br /><br />In searching for materials online, there’s usually a lot more involved than simply plugging a word or three into Google. This can often work – <a href="http://www.google.com/search?q=Dalhousie+University&rls=com.microsoft:en-ca&ie=UTF-8&oe=UTF-8&startIndex=&startPage=1">searching for Dalhousie University</a> is not likely to be a challenge, and the first page of searches for nearly anything is going to bring up a Wikipedia article or two (although that <em>does</em> bother me, and I'll rant on it later). On the other hand, ambiguity or obscurity can cause otherwise simple searches to become annoyances – the first five hits when <a href="http://www.google.com/search?q=Saint+Mary%27s+University&rls=com.microsoft:en-ca&ie=UTF-8&oe=UTF-8&startIndex=&startPage=1">searching for Saint Mary’s University</a> point to five separate universities. (Of course, in my entirely unbiased opinion, the One True SMU is the first hit.) Context makes it obvious which is which in cases like these, but that is not always going to be the case.<br /><br />As part of a project I’m working on – actually, a piece I intended to write here but which has spiraled out of proportion like some kind of even-more-nightmarish <em>katamari</em> – I’ve got a small pile of books sitting on a shelf at home (or in my spine-ending backpack here). 
One of them is Tara Calishain and Rael Dornfest’s <a href="http://www.oreilly.com/catalog/googlehks/">Google Hacks</a>, part of the vast horde of <a href="http://www.oreilly.com/">O’Reilly reference books</a>. A lot of it is more or less what it sounds like – a series of ways to game Google’s search engine and its other applications to do various <a href="http://douweosinga.com/projects/googlenewsmap">useful</a>, <a href="http://googlesightseeing.com/">entertaining</a>, or <a href="http://googleblog.blogspot.com/2005/09/googlebombing-failure.html">malign</a> things – but the main thing which interested me about it is the fact that it’s a 330-page book on the detailed use of an online tool which is so utterly ubiquitous at this point as to either be invisible or supposedly simple. Recognizing even a handful of the points raised by the book turns a “simple” search engine into a fairly complex and powerful tool whose capabilities aren’t recognized by the majority of people who use it on a day-to-day basis. Now, I believe that just about anything can be used in a variety of different, useful and creative ways, but as aware as I <em>thought </em>I was about what you could do with Google, quite a few things in this book managed to surprise me. It leaves me wondering what else is out there, either useful for its own sake or for direct application to my own fields of interest.<br /><br />I don’t think the problem is the fact that people are simply looking things up online at all, but I do think a lot of them are probably doing so poorly, and lack the basic knowledge to fix that. Tempting as the “I’m Feeling Lucky” button may be, a search for anything moderately obscure or ambiguous is going to have little success or yield problematic results unless the searcher knows how the game is supposed to work. Having an idea of what you’re looking for is by far the most important aspect of looking for anything, whether online or off.
If that condition is satisfied, however, the next step of knowing how to find information can often be a challenge as well. People might scoff, but using something like Google is a skill, and not simply a form with a button attached, and I think it should be taken less for granted than it currently is. Despite the intimidating nature of books like <u>Google Hacks</u>, the basic gist of how to use it, or other forms of computer-assisted searching in general, is something that can be taught or learned readily enough. A little bit of getting one’s head under the hood of how these engines operate, and a little bit of understanding about how the system they are meant to navigate works, can go a long way.<br /><br />I don’t see it as the Internet’s fault whenever someone uses it to receive shoddy information, any more than I believe a library is at fault when someone checks out a book by Erich von Daniken or Anatoly Fomenko and uncritically takes them at face value. Rather than dismiss the utility of using these kinds of resources at all – something which I consider futile at the <em>very</em> best – we really need to pay more attention to them, learn how they work (for Google at least, this is far easier than many may think), and teach others how to manage and interpret the results they find.pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com1tag:blogger.com,1999:blog-2249968205425769817.post-83992463247308422152007-12-13T22:49:00.000-08:002007-12-14T00:04:55.378-08:00eNemy At The Gates If you want to see an entertainingly polarized debate among a lot of historians - or academics in general, but it seems historians in particular lately - you need do no more than invoke online sources. Particularly in the context of assignments, those two words tend to result in a couple of common reactions:<br /><ol><li>"It's an online source. So what?
They're just as good as conventional ones."</li><li>"Online sources are intrinsically bad and should never ever be used."</li></ol>Of course there are others, but most of the time I've discussed the idea I've heard variants of those. I'll wax provocative for a moment and suggest that exhibiting either opinion likely shows a certain, perhaps conscious, lack of thought on the issue. Both views are highly problematic and not questioned often enough, which is where I'm going to try to come in tonight. I'm mainly going to focus on the latter claim with tonight's article - it is by far the more commonly-heard one. (Before doing so, I <span style="font-style: italic;">will </span>cheerfully say that I consider automatic trust of Web sources to be at least as silly as automatic trust of AM talk radio, or perhaps the <a href="http://www.weeklyworldnews.com/"><span style="font-style: italic;">Weekly World News</span></a>.)<br /><br />There are usually a few, pretty predictable arguments presented when people argue for an automatic rejection or disdain of online sources. I'm going to address the most common ones I've run into, in order from most to least absurd. (I'm not going into arguments which go so far as to dismiss government or university sources for being online; that, I hope, is too self-evidently ridiculous to warrant refutation.)<br /><br /><span style="font-weight: bold;">Objection the First: </span><span style="font-style: italic;">"It's too difficult to track down references. You can't cite Web pages as specifically as you can books or other materials: there's no page numbers!</span>" The crux of this argument is that online sources are not print sources - duh - and therefore are too difficult/unreliable/etc. to source to bother citing, because of inconsistent layout and the fact that it may not be immediately obvious where one may get all the information needed to do a <a href="http://library.osu.edu/sites/guides/chicagogd.php#2wwwsite">proper, full citation</a>.
A number of simple solutions exist here. If all the information isn't there, then that's fine; it's not <span style="font-style: italic;">your </span>fault if the specific author or organization behind a Web site isn't explicit enough, for instance, provided most of the information (and the location itself) is there. As far as citing specific parts of a site goes, of course Web sites aren't going to have page numbers. They aren't books. I don't see a problem here. On the other hand, most sites out there - and all more static media like PDFs - are organized into small enough chunks that you can usually narrow down a cite to a moderately specific page. (They're also often equipped with <a href="http://www.utoronto.ca/webdocs/HTMLdocs/NewHTML/anchors.html">anchors</a>, which are great in properly-designed sites.) If one can't, due to a large block of text, browsers come equipped with search functions for a reason, at least if one's simply concerned with confirming that the information's there.<br /><br />There <span style="font-style: italic;">is </span>a real problem with some aspects of online sources. "<a href="http://www.lib.berkeley.edu/TeachingLib/Guides/Internet/InvisibleWeb.html">Deep Web</a>" materials - material which is usually procedurally generated, only accessible in its specific form through cookies, searches or other forms of interaction, and so on - are considerably more difficult to get a hold of. As if that wasn't bad enough, they're growing: the Deep Web is <a href="http://www2.sims.berkeley.edu/research/projects/how-much-info-2003/internet.htm#wbsamp">hundreds of times larger</a> than the "surface" one right now. There will have to be mechanisms to deal with this in time; handling them on a case-by-case basis is a bare minimum, however.
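As an aside, the anchors mentioned above cost almost nothing to add when building a page, and they are what let a citation point at a section rather than at a whole document. A minimal sketch of how they work (the id value and URL here are invented purely for illustration):

```html
<!-- In the cited page: give a section an id attribute
     (older pages use <a name="..."> for the same purpose). -->
<h2 id="sources">Primary Sources</h2>

<!-- In the citation: append the fragment identifier to the page's URL,
     and the browser jumps straight to that section. -->
<a href="http://example.com/article.html#sources">the "Primary Sources" section</a>
```

A Web citation written with a fragment like this narrows the reference down nearly as well as a page number does, which answers most of the "no page numbers" complaint.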
I'm not convinced of the desirability of rejecting an entire field because of some slight inconvenience.<br /><br /><br /><span style="font-weight: bold;">Objection the Second</span>: <span style="font-style: italic;">"Just anyone can put up a Web page!"</span> Oh ho! Yes, this is true - and so what? If you believe it's difficult to have a particularly absurd piece of work <a href="http://www.daniken.com/e/bibliography.shtml">show up in book form</a> - or, in the right circumstances, <a href="http://physics.nyu.edu/%7Eas2/">appear as a published article in an academic journal</a> - then I have a bridge I'd like you to consider buying. This argument doesn't impress me at all, mainly because its main underlying assumptions - that "just anyone" <span style="font-style: italic;">can</span> only put something up online, and that "real" repositories can effectively prevent people from getting their crackpottery in among them - are both flatly untrue.<br /><br />Another implication of this claim bothers me considerably more. It is the claim, sometimes explicit but usually not, that the identity of a person making an argument has some bearing on the quality of the argument itself - or, indeed, is more important than said argument. This is a contemptible idea, built around <a href="http://www.nizkor.org/features/fallacies/appeal-to-authority.html">a set</a> <a href="http://www.nizkor.org/features/fallacies/ad-hominem.html">of logical</a> <a href="http://www.nizkor.org/features/fallacies/circumstantial-ad-hominem.html">fallacies</a> that all but the most sophistic freshmen are usually aware of. If we are talking about a world of debate and scholarship - and even amateurs can engage in either! - then these arguments should rise or fall on their own merits. An historical stance should be effective regardless of its creator, provided it stands up to scrutiny - but using its creator's identity as the sole point of that scrutiny is not an appropriate way to handle such things.
The identity of a person <span style="font-style: italic;">can </span>influence an argument to a point - after all, consistently good (or bad) arguments can imply more of either in the future - but in the end the effectiveness of a stance should be determined by, well, its effectiveness, and not its creator.<br /><br />With that in mind, I also think it's <span style="font-style: italic;">fantastic </span>that it's easier for people to put information up for all the world (or at least a specific subset of it) to see. The amount of lousy history - and economics, and science, and art, and recipes - will go way up as a result, but there's room for the good stuff as well. We shouldn't ignore the latter because of the presence of the former, any more than we should shun good archaeologists because von Daniken ostensibly published in the field. We're dealing with a medium here which allows people to do end runs around the gatekeepers for various fields. So what if things get somewhat nuts and over-varied as a result? Personally, I want to embrace the chaos.<br /><br /><span style="font-weight: bold;">Objection the Third: </span><span style="font-style: italic;">"Online material isn't peer-reviewed and therefore shouldn't be used</span>." While this is often used synonymously with #2, above, it is a distinct complaint, and the only one of these three which I don't see as entirely without merit. While the first two complaints are ones of mere style or elitism, this is an issue of quality control. While the lack of (obvious) peer review - detailed criticism and corroboration by a handful of experts in a specific field - is indeed a problem, it is one which provides some good opportunities for readers, both lay and professional, to hone some abilities.<br /><br />A huge component of the discipline of history, on the research side of things, is the notion of critical examination of sources. Note that this is not the same as merely rejecting them!
We are taught to look with a careful, hopefully not too jaundiced, eye at <span style="font-style: italic;">any </span>source or argument with which we are presented, keeping an eye out for both weaknesses and strengths. The things to which historians have applied this have diversified dramatically in the last several generations, moving out of libraries and national archives and accepting - sometimes grudgingly, sometimes not - everything from oral traditions to modern science to (as in public history) popular opinions and beliefs about the issues of the day or the past. It's a good skill, and probably a decent chunk of why people with history degrees tend to wind up just about everywhere despite the expected "learn history to teach history" cliche (which, of course, I plan to pursue, but hey!). Online sources shouldn't get a free pass from this - but they should not get the automatic fail so many seem to desire either.<br /><br />To one degree or another, we are all equipped with what Carl Sagan referred to in <span style="font-style: italic;">The Demon-Haunted World - </span>find and read this - as baloney detection kits - a basic awareness of what may or may not be problematic, reliable, true or false about anything we run into in day-to-day affairs. There are <a href="http://homepages.wmich.edu/%7Ekorista/baloney.html">semi-formal versions</a> of it for different things, but to one level or another even the most credulous of us have thought processes along these lines.
It's a kit which needs to be tuned and applied towards historical sources online - just like all other sources - and in a far more mature way than the rather kneejerk <a href="http://hotcupofjoe.blogspot.com/2007/08/defining-psuedoskepticism.html">pseudoskepticism</a> which is common these days.<br /><br />(I compiled a sample BDK for evaluating online resources a couple of years ago as part of my TAing duties at <a href="http://www.smu.ca/">SMU</a>; once I'm back home for the holidays I intend to try to dig that up and I'll follow up with this post by sticking it here.)<br /><br />The reflexive dismissal of sources of information based entirely on their medium is not just an unfortunate practice. It involves a certain abdication of thought, of the responsibility to at least attempt to see some possibility in any source out there, even if it doesn't share the basic shape and style of academic standards. Besides, as I mentioned earlier, there are opportunities in this as well. The nature of online sources isn't simply the "problem" that someone else didn't do our work for us, pre-screening them for our consumption ahead of time. Their nature is such that it underscores the fact that <span style="font-style: italic;">we need to be taking a more active role in this anyway</span>. For the basic materials out there, it's far easier to vet for basic sanity than many might think - I did effectively show a room full of non-majors how to do it for historical sources in an hour, anyway - and giving everyone a little more practice in this sort of thing can't exactly hurt. In other words, we need to approach online sources with a <span style="font-style: italic;">genuine</span> skepticism.<br /><br />But guess what? This whole thing's just a smokescreen for a larger issue anyway. We're willing, indeed eager, to hold varying degrees of skepticism towards online sources, but why are we singling them out?
Why the complacency as regards citations of interviews, of magazine articles, of books? If you're going to go swinging the questioning mallet, you should at least do so evenly, don't you think?<br /><br /><br />And on that note, I head off to be shoehorned into a thin metal tube and hurled hundreds of kilometers. I shall post at you next from Halifax!pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0tag:blogger.com,1999:blog-2249968205425769817.post-29522460706173272352007-12-04T22:42:00.000-08:002007-12-04T22:50:03.920-08:00Silently PosturingAs an aside exercise for my digital history class, we were <a href="http://digitalhistory.uwo.ca/h513_0708/?page_id=14">asked</a> to read a <a href="http://www.chi-sa.org.za/articles/posture.htm">paper</a> by Alan Cooper called "Your Program's Posture." Cooper categorizes programs as sovereign, transient, daemonic, or parasitic, the specific classification depending on how it interacts with the user, and the assignment asked us to consider where the programs we use in the course of our work lie in that grouping. I already had a good idea of where the software I use would lie, but I also felt I should read the article before going with my gut instinct of classifying everything as daemonic.<br /><br />Cooper's categories are described in terms of "postures," essentially their dominant "style" or gross characteristics which determine how users approach, use, and react to them. The first of these four postures is the "sovereign" posture: sovereign programs are paramount to the user, filling most or all of the screen's real estate and functioning as the core of a given project.<br /><br />The second is "transient," and is the opposite of sovereign software both visually and in terms of interfaces. 
Intended for specific purposes, meant to be up only temporarily (or, if up for a long time, not constantly interacted with), transient programs can get away with being more exuberant and less intuitive than sovereign applications.<br /><br />I realized my gut reaction of describing half the annoying stuff I use as daemonic was mistaken when I saw that the third posture refers to <em>daemon</em> in the <a href="http://catb.org/jargon/html/D/daemon.html">computing sense of the word</a> rather than the more traditional <a href="http://www.deliriumsrealm.com/delirium/demonlistA-E.asp">gaggle of evil critters with cool names</a>. (Computing jargon tends to come from <a href="http://ei.cs.vt.edu/%7Ehistory/Daemon.html">the oddest places</a>.) Daemonic postures are subtle ones, running constantly in the background but not necessarily being visible to the user at any given time. Daemonic programs tend to have either no interface (for all practical purposes) or a very minimal one, as the user tends not to do much with them, if anything. They're usually invisible, like printer drivers or the two dozen or so processes a typical computer has running at any time.<br /><br />The final set of programs is called "parasitic," in the sense that they tend to park on top of another program to fulfill a given function. Cooper describes them as a mixture of sovereign and transient in that they tend to be around all the time, but running in the background, supplementary to a sovereign program. Clocks, resource meters, and so on, generally qualify.<br /><br />In the interest of this not being entirely a CS post, I should probably answer the initial request on the syllabus as to how it can affect my historical research process. I'm not sure, fully, but I'm also answering this entirely on the fly and am more concerned with how it <em>should</em> affect my process. At present, I'm not using many programs specifically for research purposes. 
Firefox and <a href="http://www.openoffice.org/">OpenOffice</a> (which I use in lieu of Microsoft Office, more so since that hideous new interface in Office '07 began to give me soul cancer), the main programs I tend to have up at any given time and which I obviously do a lot of my work in, are definitely sovereign programs, taking up most of my screen's real estate. The closest thing I have to a work-related application that's transient is Winamp, which is usually parked in the semi-background cheerfully producing background noise I need to function properly. I don't make much use of parasitic programs, mainly due to a lack of familiarity with the options, and of course my daemonic ones are usually invisible.<br /><br />The chunks of this I make use of are mostly a case of "if it ain't broke, don't fix it." I've got my browser, through which I access a lot of my research tools (including <a href="http://www.zotero.org/">Zotero</a>, the most obvious parasitic application I have, and the aggregator functions of <a href="http://www.bloglines.com/">Bloglines</a>, the, uh, <em>other</em> most obvious parasitic application I have); I've got my word processor, through which I process my words; I've got Photoshop for 2D graphics work and hogging system resources; I've got <a href="http://www.blender.org/">Blender</a> for <em>3</em>D graphics stuff (much though I am annoyed by its <a href="http://zibzib.sandwich.net/wyis/blender_screenshot.gif">coder-designed interface</a>); I've got FreeMind, which is great for planning stuff out. I've no shortage of big, screen-eating sovereign applications, in other words, most of which do their often highly varied jobs quite well.<br /><br />Some of these can wander from one form to another, of course. I spent an hour earlier this evening working with Blender's animation function to produce a short CG video. 
When I started the program rendering the six hundred frames of that video, I wasn't going to be doing anything else with it for a while, and was thus able to simply shunt it out of the way. That left me with a small window showing the rendering process in one corner of my screen, allowing me to work on some other stuff, albeit slightly more slowly as the computer chugged away. Cast down from the throne, the sovereign program became transitorily transient.<br /><br />What I'm wondering about now, though, are applications which fill the other two postures; stuff that you can set up and just let fly to assist with research or other purposes. A simple and obvious example of this sort of thing would be applications which can trawl RSS feeds for their user. With some care in setting the application up in the first place - search, like research, is something which can occasionally take significant skill to get useful results - you could kick back (or deal with more immediate or physical research and other issues) and allow your application to sift thousands of other documents for things you're interested in. Things like this are not without their flaws - unless you're a <a href="http://catb.org/jargon/html/W/wizard.html">wizard</a> with searches or otherwise incredibly fortunate, you're as likely as not to miss quite a bit of stuff when trawling fifty or five hundred or five thousand feeds. Then again, that's going to happen anyway no matter what you're researching in this day and age, and systems like this would greatly facilitate at least surveying vast bases of information that would otherwise tie up scores of undergraduate research assistants to get through.<br /><br />The information is out there; there just need to be some better tools (or better-known tools) to dig through it. 
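The feed-trawling tool imagined here is simple enough at its core to sketch out. The snippet below is a minimal, hypothetical Python illustration of the keyword-matching step only - the sample feed, keywords, and function name are all invented for the example, and a real daemonic tool would fetch live feeds over HTTP (e.g., with urllib) and run on a schedule:

```python
import xml.etree.ElementTree as ET

def matching_items(feed_xml, keywords):
    """Return (title, link) pairs for RSS items mentioning any keyword."""
    keywords = [k.lower() for k in keywords]
    root = ET.fromstring(feed_xml)
    hits = []
    for item in root.iter("item"):
        title = item.findtext("title") or ""
        desc = item.findtext("description") or ""
        text = (title + " " + desc).lower()
        if any(k in text for k in keywords):
            hits.append((title, item.findtext("link")))
    return hits

# A tiny inline feed stands in for a real fetch over the network.
SAMPLE = """<rss version="2.0"><channel><title>Demo</title>
<item><title>New Somme archaeology findings</title>
  <link>http://example.org/somme</link>
  <description>Survey work at Beaumont-Hamel.</description></item>
<item><title>Unrelated economics post</title>
  <link>http://example.org/econ</link>
  <description>Interest rates.</description></item>
</channel></rss>"""

print(matching_items(SAMPLE, ["beaumont-hamel", "archaeology"]))
```

Point the same filter at a few hundred feeds on a timer and you have the skeleton of the background research assistant described above; the hard part, as noted, is choosing keywords well enough that the interesting items actually surface.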
Properly done, something like this would need minimal interaction once it gets going; you set it up, tell it to trawl your feeds (or Amazon's new books sections, or <a href="http://www.h-net.msu.edu/">H-Net</a>'s vast mailing lists, or more specialized databases for one thing or another, etc.), and only need to check back in daily or weekly or whenever your search application beeps or blinks or sets off a road flare, leaving you to spend more of your attention on whatever else may need doing. Going through the <em>results</em> would still involve some old-fashioned manual sifting, as likely as not, but if executed properly you would be far more likely to come up with some interesting results than you would by sifting through a tithe of the information in twice the time.<br /><br />Something like this could help get data from more out-of-left-field areas, as well; setting up a search aggregator as an historian and siccing it, with the terms of whatever you're interested in, on another field like economics or anthropology or law or botany or physics might be a bit of a crapshoot, but could well also yield some surprising views on your current topic from altogether different perspectives, or bring in new tools or methods that the guys across campus thought of first (and vice versa). That sort of collision is what resulted in classes like this (or, at a broader level, public history in general), of course. I want to see more of that - much more.<br /><br />It could be interesting to see what kind of mashups would result if people in history and various other fields began taking a more active stance on that sort of thing. 
Being able to look over other disciplines' shoulders is one of those things that simply can't hurt - especially if we have the tools to do so more easily than we could in the past.<br /><br />I meant to segue into daemonic applications by talking some about distributed computing research, as much to see if I could find ways to drag history into <em>that</em> particularly awesome and subtle area of knowledge, but as usual my muse has gotten away from me and forced a tome onto your screen. So I do believe I shall keep that for some other time...pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0tag:blogger.com,1999:blog-2249968205425769817.post-895084439885494072007-11-15T15:35:00.000-08:002007-11-15T12:39:54.304-08:00Ending our Fences<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSIspYPcOPTsHXqkEcM_MqA_xzCqZouNcvIe1NPTd5LGZEHXQECOAcyeiheBaEwWHa63uIzNyae1Q3KyRrHLcUM1vZGvu2etLbz-UeZUgiWNRY3bQ2VmVW_oVlYq7mSDXh3vtWF4bjzSQ9/s1600-h/picketfence.jpg"><img id="BLOGGER_PHOTO_ID_5133168493334880578" style="FLOAT: left; MARGIN: 0px 10px 10px 0px; CURSOR: hand" alt="" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSIspYPcOPTsHXqkEcM_MqA_xzCqZouNcvIe1NPTd5LGZEHXQECOAcyeiheBaEwWHa63uIzNyae1Q3KyRrHLcUM1vZGvu2etLbz-UeZUgiWNRY3bQ2VmVW_oVlYq7mSDXh3vtWF4bjzSQ9/s320/picketfence.jpg" border="0" /></a><br /><div>I've been thinking about barriers between disciplines lately - in particular, what happens when we can tear those barriers down. </div><div></div><br /><div>Last Saturday I only managed to catch a little bit of the Remembrance Day programming on TV, but I was pleased at the one thing I did see for a few reasons. 
It was part of a <a href="http://www.history.ca/ontv/titledetails.aspx?titleid=99344">documentary</a> about evidence from the <a href="http://www.vac-acc.gc.ca/general/sub.cfm?source=history/firstwar/canada/Canada8">Battle of the Somme</a> in general, and the fate of the (Royal) <a href="http://www.army.forces.gc.ca/37CBG_HQ/1nfa_home.htm">Newfoundland Regiment's 1st Battalion</a> at <a href="http://www.firstworldwar.com/today/beaumonthamel.htm">Beaumont-Hamel</a> in particular. Most of the part that I was able to catch involved trying to identify and evaluate some of the footage allegedly taken during the battle, a particularly important thing to keep in mind considering that even then war footage was <a href="http://www.bbc.co.uk/schools/worldwarone/hq/wfront3_03.shtml">staged for propaganda reasons</a>.<br /><br />To confirm (or disprove) the veracity of the footage, the researchers drew together people from several different disciplines: historians, archaeologists, archivists, surveyors, video experts, forensic scientists, and I'm fairly sure I'm missing a few. The main piece of footage they focused on was the detonation of one of the great explosive mines at the very start of the battle (visible, very prominently, just under one and a half minutes into <a href="http://youtube.com/watch?v=eewXEOfuIsQ">this compilation of clips</a> from the battle). 
This one wound up confirmed as accurate through piecing together footage and accounts of the battle, a large amount of surveying and GPS work around the mine crater, conversations with descendants of the Somme's veterans who had been shown the <a href="http://www.battlefield-site.co.uk/no_mans_land_01_beaumont_hamel.jpg">still</a>-<a href="http://www.battlefield-site.co.uk/no_mans_land_02_beaumont_hamel.jpg">scarred</a> <a href="http://www.battlefield-site.co.uk/shell_holes_beaumont_hamel.jpg">battlefield</a> by their parents or grandparents after the war, and other research, which turned up the records of the cameraman who had shot the scene. The confirmation was a spectacular success, as the crew found the exact point, to within a couple of feet, where the cameraman had stood that day. Demonstrating this, fading the original footage in and out on top of the new footage, created a fairly eerie effect, blurring the lines between past and present in an interesting way.<br /><br />The project also confirmed the veracity of a few other sequences, which turned out to have been shot within minutes of that one, from the same spot, as the cameraman panned the camera to one side to capture some footage of the battalion's disastrous advance. I think that was an unintended discovery, but a good one nonetheless, another brick of This Really Happened in the knowledge wall. Alas, I surrendered the TV at that point to the roomies and the sacred tradition of The Game (and just when they were taking those videos a step further by trying to ID the figures in them - nice!), and didn't get to see what happened next.<br /><br />But what I did see was a neat enough application.<br /><br />The day before, a few of the other digital history students and I went to a guest lecture at the <a href="http://www.uwo.ca/">university</a> given by <a href="http://www.brocku.ca/history/faculty/jbonnett/index.php">Dr. John Bonnett</a> of <a href="http://www.brocku.ca/">Brock University</a>. Dr. 
Bonnett, an historian and Canada Research Chair in Digital Humanities, was giving a talk with the triple-fisted title of "new challenges, new opportunities for history: collaborative environments, high-performance computing, and the future of the historian's craft." The talk was, to be honest, a little on the disorganized and ill-paced side, and would have gone better in ninety minutes instead of sixty. On the other hand, we had a time slot of an hour, and Bonnett's mere <span style="FONT-STYLE: italic">hundreds </span>of minutes' advance warning would have made it difficult to get across fairly simple topics, never mind the highly technical ones he discussed.<br /><br />So what did he discuss? I could be a <a href="http://xkcd.com/336/">smartass</a> and say that he talked about new challenges and opportunities for history by discussing collaborative environments, high-performance computing, and the future of the historian's craft, but I should probably give at least some detail. Dr. Bonnett's talk outwardly appeared to be something one would expect to see coming from a computer science (or at least <a href="http://www.fims.uwo.ca/mlis/index.htm">information science</a>) department, but there was a lot of meat in there with potential uses in either digital history specifically, or the broader field as a whole.<br /><br />Much of the first half of the talk focused on the versatility of various sources of information - even original, primary documents - when combined with new tools and techniques which have become available over the course of the last generation. This was explained in the context of a <a href="http://3dlearning.iit.nrc.ca/3DVirtualBuildings/">project</a> (description at another site <a href="http://www.intute.ac.uk/artsandhumanities/cgi-bin/fullrecord.pl?handle=humbul2731">here</a>, for those who tire of the awkward site design) Dr. 
Bonnett was engaged in, where various primary sources such as photographs, street plans and so on were used to generate three-dimensional recreations of Canadian streetscapes from the late nineteenth and early twentieth centuries. The reconstruction process also involved some judicious use of <em>educated </em>guesses (a notion I also consider sorely underrated) to fill in gaps, e.g., determining what the east side of a building may look like if photos only show the north and west faces.<br /><br />The buildings were not the only result of the project; the plan was to produce not merely an exhibit, but a research tool in and of itself. To do this, various other sources - including the primary materials used in the original reconstruction, modern sources, various other audio/visual records, and so on - could be brought into the reconstruction. On top of this, the reconstructions weren't limited to a specific point in time, either; the view of a street could be switched among different eras accordingly. The materials were all brought together in a hub-and-spoke model, where objects were defined as much by their relationships with one another as in isolation. Done properly, this approach results in a detailed, interactive and highly nonlinear narrative, allowing the user to notice unexpected connections or create and explain their own. There's a lot of potential in this kind of arrangement, to say the least.</div><br /><div></div><div>The remainder of the lecture had a far more technical focus involving two major concepts: the use of dedicated or distributed networks as collaborative research environments, and agent-based modeling as a source for simulation or experimentation in historical research. I want to go into some detail on those, but it would greatly expand an already-large post. If anyone's interested, prod me and I'll talk about those in a subsequent post. Instead, I'm going to go on to Dr. 
Bonnett's conclusions from all of this, as well as my own.</div><div></div><br /><div>Dr. Bonnett made a rather bold - and, in my opinion, accurate - statement about the significance of all of these tools. He argues that the development and proliferation of these sorts of research and collaboration environments is <em>at least as significant to the spread of human knowledge as the development of the book itself</em>. Implications of <a href="http://www.jargon.net/jargonfile/s/SturgeonsLaw.html">Sturgeon's Law</a> aside, the changes these sorts of things are potentially bringing into history in particular and communication in general really are a difference of kind, not simply degree. A lot of the results of the digital revolution that's been ricocheting around the world in the last few decades, whether research tools like Dr. Bonnett's or exhibitions for the public which make the most use of new tools (something I <a href="http://p-stewart.blogspot.com/2007/10/exhibits-of-futuretm.html">discussed in a previous post</a>), simply could not exist, at all, in earlier years. Now they're here, and they're not going anywhere. </div><br /><div></div><div>I'm convinced that tools and methods of these sorts are woefully misunderstood, in both a passive and a very, very active sense, in the field of history. 
Modern tools such as computing or other sci-tech applications are certainly <em>studied</em> a great deal in universities - <a href="http://www.fordham.edu/halsall/science/sciencesbook.html">a</a> <a href="http://www.fas.harvard.edu/~hsdept/">simple</a> <a href="http://www2.lib.udel.edu/subj/hsci/internet.htm">Google</a> <a href="http://hos.princeton.edu/">search</a> <a href="http://histsci.wisc.edu/">can</a> <a href="http://www.ou.edu/cas/hsci/">find</a> <a href="http://echo.gmu.edu/center.php">a</a> <a href="http://ocw.mit.edu/OcwWeb/Science--Technology--and-Society/STS-310Fall-2005/CourseHome/index.htm">veritable</a> <a href="http://www.clas.ufl.edu/users/rhatch/pages/10-HisSci/links/">cornucopia</a> <a href="http://depts.washington.edu/hssexec/library_list.html">of</a> <a href="http://www.lib.lsu.edu/sci/chem/internet/history.html">examples</a> <a href="http://hss.sas.upenn.edu/mt-static/">of</a> <a href="http://digicoll.library.wisc.edu/HistSciTech/">this</a> <a href="http://www.indiana.edu/~hpscdept/">sort</a> <a href="http://web.jhu.edu/hsmt">of</a> <a href="http://groups.physics.umn.edu/hsci/">thing</a> - but not nearly enough effort is being put into <em>applying </em>them, or even understanding them at a level beyond theory. This really does need to change; expanding the discipline's knowledge base in these sorts of directions (and others, such as merely interfacing with other disciplines considerably more than we tend to) will gain scholars and students both a great deal in terms of resources, topics and other opportunities. Avoiding this gains little at best. </div><br /><div></div><div>While it's more or less taken for granted in an academic environment that we'll tend to erect our little picket fences (or trench lines) between departments or concepts or the like, I'm convinced that doing so too actively is a Very Bad Thing for a number of reasons. 
Rants about active refusal to learn an available topic at a university aside, I think that there are simply too many potential opportunities for most aspects of history - research, teaching, presentation on both the academic and public levels - to discard out of hand or mischaracterize as pointless. While I don't take things quite so far in the generalist direction as, say, <a href="http://www.elise.com/quotes/a/heinlein_specialization_is_for_insects.php">Heinlein did</a>, I do believe that these sorts of changes <a href="http://www.youtube.com/watch?v=pMcfrLYDm2U">aren't going anywhere</a>, and we should do a better job of recognizing that than we currently do. </div>pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com3tag:blogger.com,1999:blog-2249968205425769817.post-3848718017222360672007-11-11T20:35:00.000-08:002007-11-11T21:42:01.045-08:00Lest We RememberIt is, for at least the next half-hour or so, Remembrance Day: the day of the year so explicitly built around the concept of memory that the notion is enshrined in its very name.<br /><br />So, of course, I'm going to spend a good chunk of this post talking about its inverse.<br /><br />At the appropriate hour today in his time zone, a friend of mine in Australia made a fairly simple commemorative post on his personal blog. It consisted simply of the date and time of the Armistice and the word "Remember." I can certainly appreciate the minimalist nature of that kind of comment; it says most of what needs to be said about the date where it really matters. (Myself, I tend to post some form or another of antiwar poetry at whatever online presence I've been loudest at that year.)<br /><br />What caught my interest, however, was a statement in the discussion comments beneath the post. 
Someone mentioned that they'd forgotten the date altogether, but seemed to find that alright, because "[w]ar isn't something to be remembered."<br /><br />My initial reaction to that kind of statement tends to boil down to "the hell it isn't," but that's largely a combination of my inner historian and (as far as I can comfortably carry the concept) pacifist speaking. I'm old-fashioned enough as far as the notion of remembering events goes that I can say I think George Santayana got it right in one without feeling silly. But there's something else in there that warrants thinking about, as there is with all but the most inane statements.<br /><br />I'm pretty sure everyone reading this - and everyone who isn't - has a few files parked in their brains' storage that they'd like to be able to delete for one reason or another. They might be big things - being on the wrong end of violence (including, yes, war) or a natural disaster, a major personal failure leaving one with a nagging case of the wouldacouldas - or they might not, being simply minor slights or shortcomings which anyone else would consider not a big deal but, to their bearer, ache like an old wound years after the fact. Most people do have a few things that they simply Don't Want To Remember, but we're largely forced to deal with not having that capacity outside of damaging levels of repression. (That is <a href="http://www.emaxhealth.com/32/17942.html">beginning to change</a>, no doubt raising temptation and concern in equal measure in quite a few heads.)<br /><br />That's forgetting on a very personal and individual level, of course, and I'm (generally) libertarian enough to think that people should think or do as they will in that regard - if we own anything at all, we certainly own our own minds. With statements like "war isn't something to be remembered," however, a larger issue comes up. 
It implies that there are experiences and memories which are best excised not just from individuals' minds, but from the collective memory of entire cultures.<br /><br />Cultures - families, towns, nations - deal with the things they'd rather not have experienced on a somewhat more diffuse level than you or I do, but the notion is still there. They handle it in different ways, some good and some bad. Witness Japan, still <a href="http://news.bbc.co.uk/2/hi/programmes/from_our_own_correspondent/3632699.stm">struggling</a> with its role and experiences in the Second World War; witness Rwanda, <a href="http://www.wilsoncenter.org/index.cfm?topic_id=1417&categoryid=09B744E1-65BF-E7DC-46BC6A31FBCAF62A&fuseaction=topics.events_item_topics&event_id=64588">taking very much the opposite route</a>; witness Canada, which has integrated its current status as a (largely) tolerant and open multicultural society to the point where few people in my generation have the least clue that <a href="http://archives.cbc.ca/IDC-1-71-1579-10644/conflict_war/echoes_of_auschwitz/">we have our own moments of shame</a>. Different reactions for each one: Japan is simultaneously trying to remember and forget; Rwanda is drawing its memories out as much as possible to face and address them; Canada has largely <em>successfully</em> forgotten some of its own black marks.<br /><br />I'm torn on this kind of thing. On a personal level I despise the idea of excising events from the community's memory. I don't like denial, and the idea that [insert concept here] should not be remembered or thought about is something I generally find deeply appalling. Of course, I'm one person; other historians may not have as much of a problem with this, and historians as a community don't exactly have very strong control over the community's memory at large. I'm not hubristic enough to see myself as The Gatekeeper Of Historical Memory. 
Simply put, the question of what to remember and what to forget just isn't my decision beyond an immediate level: my own mind, those of people I teach or speak to or write to. If a society at large decides that something is to be forgotten, I must admit that while I may have some influence over that decision, in practice I'm relatively powerless.<br /><br />So here I am, left wondering precisely how to react to the attitude and the concept. I'd prefer not to vanish wailing over the precipice of <a href="http://despair.com/despair.html">despair</a>. I believe historical memory is pretty important - especially when we're talking about it in the context of the current holiday, where we mainly remember a series of intertwined conflicts (I was going to say "from this century," but I reminded myself it's been "the previous century" for some time now) where over a hundred million lost their lives in the name of purest good, blackest evil and everything in between. "Really" remembering those events - having those who had been through them in person around to remind us in a more visceral way than a textbook ever could - is only going to become more difficult in the next decade or so as the survivors of that time pass on. What do we do <strike>to</strike>about the people who believe that we should gloss over things like that, or forget their existence altogether, though?<br /><br />Obviously we have it in us to continue to present material others would prefer not to think about. That's one of the fun things about the field, after all. (There's a running gag in political science that if you don't anger someone now and then, <a href="http://zibzib.sandwich.net/wyis/wrong05.jpg">you're doing it wrong</a>; I believe that applies to history as well, or possibly more so.) Of course, on the public history side of things, there are going to be situations where that's not an option. 
We're likely to be told now and then that we must put such-and-such a face on things, emphasising one set of memories while pushing another set - which may be every bit as relevant and important and interesting - aside. We're going to be told now and then that we <em>should</em> whitewash, distort, forget or refuse to mention and discuss something, as the War Museum and Smithsonian (<a href="http://www.jsonline.com/story/index.aspx?id=679606">among</a> <a href="http://www.canada.com/reginaleaderpost/news/viewpoints/story.html?id=d11494ae-efdb-4183-9191-12d06a20ec7c">other</a> <a href="http://www.economist.com/world/europe/displaystory.cfm?story_id=10064654">places</a>) have discovered in recent years.<br /><br />This issue is here, it likely always has been, and it likely always will be. So what do we <em>do</em> about it? Do we muddle through like we always have, or ignore the people who advocate such ignore-ance (to borrow Michael Frisch's term), in effect forgetting them? Do we engage, or possibly confront them? How should we respond to having to choose between a representation (or misrepresentation) of history which mandates that we forget something we consider important on the one hand, and our careers on the other?<br /><br />I have no particular clue at the moment. But then again, it's no longer the eleventh; it's 12:30 on the twelfth, and it's starting to feel like it. So I'll waive my personal responsibility to answer my own rhetorical questions even as I shout 'em into the darkness, and claim retroactively that it was merely my point all along to stick those questions into your head, allowing them to fester in a multiplicity of minds rather than just one. I meant for that to be the case. Really.<br /><br />At least, I'd prefer you remember it that way.<br /><br /><br /><br /><br />And it's overdue, but I mentioned it as a tradition of mine at the start of the post. 
As it is still Remembrance Day in my head, I invite you all to have some Wilfred Gibson poetry.<br /><br />I also invite you to think for a while of that "long war" that consumed most of three decades of the twentieth century (for the Great War was not an isolated one); not just its courses and the numbers, but the reasons, the ideas, the dreams, and most importantly, the people it affected. History is a human thing, composed of humans' stories and experiences; it cannot exist without us and is diminished when we are. That bloody century we're still staggering out of robbed us of far too many stories and storytellers both; if we hope to do a better job with this century, then remembering the last one is at least a start.<br /><br /><blockquote> They ask me where I've been,<br />And what I've done and seen.<br />But what can I reply<br />Who know it wasn't I,<br />But someone just like me,<br />Who went across the sea<br />And with my head and hands<br />Killed men in foreign lands...<br />Though I must bear the blame,<br />Because he bore my name.<br /> - Wilfred Gibson (1878-1962), "Back"<br /></blockquote><blockquote></blockquote>pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0tag:blogger.com,1999:blog-2249968205425769817.post-7963522330792511002007-10-17T08:50:00.000-07:002007-10-17T09:09:22.593-07:00A Review of LiviusAs an assignment for my public history seminar, we were required to review an historical website. For my own target, I chose <a href="http://www.livius.org/">Livius - Articles on Ancient History</a>, created and maintained by Jona Lendering in Amsterdam. This site has been around for a <em>long</em> time, as I'll mention below, and I've been familiar with it for most of its existence. I hadn't looked at it in a few years as of the assignment, actually; I figured that would be an interesting target for my newfound Mad History Skillz, and reviewed it last week. 
Without further ado:<br /><br />Asked why he created Livius on his site's FAQ, Dutch historian Jona Lendering cites his impatience with scholars' tendency to write for specialists more than for the general audience. That, along with the lack of clearly-written and easily-accessible, yet still scholarly, material for non-specialists, inspired him to launch his considerable website on ancient and classical history in 1996. For the last eleven years – an eternity in “Internet time!” – Livius has remained more or less exclusively a one-man endeavor. The site is regularly maintained, being modified or expanded roughly once or twice a week. Lendering refuses to accept outside help producing content for the site, preferring to bear sole responsibility – and blame – for any errors. (The site's pictures are the main exception, many of them taken by his colleague Marco Prins.)<br /><br />For a personal project, Livius' scope is vast. As of its September 29 update, the site boasts over 3,200 separate pages. While many of these can be quite short, with Lendering promising to expand them later, several hundred are <a href="http://www.livius.org/jo-jz/judaea/judaea.htm">substantial</a>, <a href="http://www.livius.org/ga-gh/gallic_empire/gallic_empire.html">encyclopedia</a>-<a href="http://www.livius.org/as-at/assyria/assyria.html">style</a> <a href="http://www.livius.org/oa-om/olympias/olympias.htm">articles</a>. Nearly all articles are illustrated to one extent or another, with a mixture of maps, images of coins or ancient artwork, and photographs of what the regions discussed look like today. Several articles expand into large subsections in their own right. 
For example, the <a href="http://www.livius.org/caa-can/caesar/caesar00.html">section on Julius Caesar</a> is a twelve-section biography with two dozen annotated and translated excerpts from primary sources, a single link in the main index branching into thirty-seven separate pages. The vast majority of articles on the site are heavily cross-linked to others, with some off-site links as well. The site's scope is also impressive geographically and chronologically: the broadest sections of the site are nine of the major regions generally accepted within ancient and classical history (<a href="http://www.livius.org/anatolia.html">Anatolia/Asia Minor</a>, <a href="http://www.livius.org/carthage.html">Carthage/North Africa</a>, <a href="http://www.livius.org/egypt.html">Egypt</a>, <a href="http://www.livius.org/germinf.html">Germany</a>, <a href="http://www.livius.org/greece.html">Greece</a>, <a href="http://www.livius.org/judaea.html">Judaea/Palestine</a>, <a href="http://www.livius.org/babylonia.html">Mesopotamia</a>, <a href="http://www.livius.org/persia.html">Persia</a> and <a href="http://www.livius.org/rome.html">Rome</a>), with the sections on Greece and Rome the most developed. Other sections have their own strengths: for example, a large collection of <a href="http://www.livius.org/cg-cm/chronicles/chron00.html">Mesopotamian primary sources</a> with images, transliterations, and translations.<br /><br />Two major problems exist with the site's content: the issue of sourcing and Livius' inward-looking nature. The first is perhaps the more serious: very few articles have formal bibliographies, although several (particularly in the Greco-Roman and Jewish sections) do discuss primary sources, often at length with excerpts. This is by no means consistent across the site, unfortunately. (Lendering mentions in his site's FAQ that he is reluctant to reference secondary sources because of growing plagiarism using Livius.) 
The balance of links is another major problem, as the vast majority are within the site itself. While this means the site is <em>very</em> well cross-referenced, it limits the site's use as a jumping-off point to other resources, at least directly. Livius' front page does have a collection of links to “related websites,” however.<br /><br />Lendering begins to run into accessibility problems with how he organizes and presents his information. Livius is organized as several layers of indices, which means someone accessing the site will usually have to encounter one or two alphabetized lists (sometimes roughly subcategorized into geography, biography, etc.) before getting to the articles they seek. This can be daunting if a reader is seeking general information rather than a particular topic. Lendering has recently added a Google custom search to his main page, however, which makes finding specific articles easier than in the past.<br /><br />In terms of appearance, Livius betrays its age. Lendering first launched his site in 1996, before the combination of ubiquitous broadband and greatly expanded computer capabilities began to shape Web design. Lendering has continued to use many of the design principles of that earlier era, keeping to a very minimalist, no-frills design which may appear (please pardon the pun!) rather Spartan to contemporary eyes. This approach, combined with navigation bars at the top or bottom of most pages, makes navigating the site quite easy: links are obvious and pages load quickly, even on dialup connections. However, this sometimes causes problems visually; images are often sized by the standards of lower-resolution monitors. Many appear <a href="http://www.livius.org/as-at/asellius/clodius_albinus.jpg">unpleasantly small</a> on modern screens, particularly for those viewers who like lots of detail or close-ups. 
As with the rest of the site, however, the images themselves are being steadily updated, with <a href="http://www.livius.org/a/1/romanempire/gladiators_sIBCE_mus_munchen.JPG">more “modern” sizes</a> appearing in newer articles.<br /><br />Lendering seems to have had mixed success in his stated goals for Livius. He accomplishes part of his intended purpose by providing a free online resource from which readers can get a fairly good picture of the ancient world, particularly classical times. However, ease of access to this information is limited by the site's significant organizational problems and some gaps in its selection. Livius is a work in progress, and due to the scope of the era which Lendering is attempting to document – and the fact that it is, at its heart, a personal project – it will likely remain so for some time. Perhaps unfortunately for Lendering's intentions, it is likely to be more accessible to students or hobbyists who already have some ancient history knowledge under their belt before visiting, especially if his idea of the “general audience” is those just beginning to study the period. Livius is a site which aspires to be both comprehensive and accessible to the wider public, but does not quite – as a living site, perhaps does not yet – meet these goals.pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0tag:blogger.com,1999:blog-2249968205425769817.post-38457830194568332432007-10-11T15:02:00.000-07:002007-10-11T15:19:33.746-07:00Exhibits of The Future!(tm)In my last post I linked to an interview on "technologies of persuasion." There's a pretty heavy advertising element to that, obviously, but it's an element I think could be used in producing history at times. 
Anyway, I'm bringing that up mainly because I found an example the other day that could fairly easily serve both as an illustration of that concept - it is, at its heart, an advertisement - and as a neat way of presenting history, one that shows the kinds of things you can do with contemporary technology, a bit of creativity, and a tremendous amount of caffeine.<br /><br />A few days ago, a friend of mine pointed me at a neat example of how one could present a history exhibit with modern technology in the form of <a href="http://www.halo3.com/believe">this advertisement for Halo 3</a>. What could a science-fiction FPS have to do with the presentation of history, you ask? Well, take a look at that site. It's Flash-heavy, with audio and video components - fair warning for those of you whose computers may not be up to the task - but anyone with a moderately recent machine shouldn't have a problem.<br /><br />For those of you who can't (or won't) check out the URL, the basic premise of the advertisement, with all the game-setting stuff boiled out, is that it is a historical exhibit - specifically, a war diorama/memorial. It's a very large one, hence the tremendous amount of caffeine, but what's neat about this is the way it's displayed. The viewer's perspective isn't looming over the entire display, the way we tend to stand over most such exhibits in a typical museum, but down at the display's ground level as the camera pans and weaves through it. (That panning and weaving is largely under the user's control; you can go through it relatively freely.) That's just neat on a visual level, but what makes it <em>especially</em> neat, at least in my opinion, is how additional content is worked in at various points. At regular intervals in the tour through it, a link will pop up over one figurine or another. 
Those links lead to content which expands the context of the scene - a "first person account" in the form of a statement from Someone Who Was There on this link, a biographical sketch of another person on that one, a video of a veteran being interviewed elsewhere in the museum for another, a description of one alien baddie or another at yet another. There's also the occasional spot where the tour pauses to allow a full panoramic view of an important location.<br /><br />At first I simply looked at it thinking "well, this is certainly a damn cool piece of work" - I tend to have a healthy respect for anything that was obviously done painstakingly and well, and this is no exception. But after a few minutes I started thinking about it some more. This advertisement is in the form of an exhibit at a fictional museum, of course. It's an ad for a computer game, after all. But what if we got a few other people together and gave them some modern midrange hardware and software, a bit of creativity, and a tremendous amount of caffeine?<br /><br />This thing isn't just an advertisement to me, although it is (at least to this semi-casual Halo fan) a pretty effective and extremely good-looking one. It is also, perhaps after one distills the game's elements out of it and looks at it on a more abstract level, a template for a pretty impressive, interactive type of exhibit in general. 
On top of the eye candy factor, it's a neat way of taking a diorama - normally a pretty passive sort of display, much like most things you'll see in museums - and turning it into something interactive.<br /><br />If this could be made, then why not, say, a similar treatment of a diorama of Stalingrad?<br /><br />Or Rome at its height?<br /><br />Or 1930s New York City?<br /><br />Or <span style="font-style: italic;">anything</span> else, for that matter?pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com2tag:blogger.com,1999:blog-2249968205425769817.post-40470016382230046872007-10-03T08:57:00.001-07:002007-10-03T09:50:10.751-07:00You Are Getting Verrrrry Innnnnterested....A few years ago I earned the nickname "Patient Zero" among several of my friends. Fortunately for all involved, the infections involved were mental: I had a tendency for awhile to get bitten by one interest or another in such a way that those around me managed to pick it up as well. A couple of them would take advantage of that: "I want the guys to listen to this new album, so I'll get Patrick interested in it and the rest will take care of itself." Given how esoteric my interests, and those of my friends, are, this has caused some spectacular feedback loops at times.<br /><br />So when I just stumbled across <a href="http://www.worldchanging.com/archives/007341.html">this interview</a> over on <a href="http://worldchanging.com/">WorldChanging </a>with author Doug Rushkoff, in which he's asked about his recently-created course on "technologies of persuasion," my curiosity was piqued. Early in the interview, he takes issue with some popular ideas on what persuasion entails:<br /><blockquote></blockquote><blockquote>Seriously, I wouldn't want to use any tactic to <em>get</em> someone to take my course, or to do anything at all. Once a person has been cajoled, there's almost always a negative effect later on. 
Chairman Mao used to talk about this – how people can't be inspired to foist a revolution, but that it has to come from them. (Not that he lived or led true to this dictum.)<br /><br />I get asked all the time, "how can we <em>get</em> people to be more this or more that?" Usually by Jewish groups looking to get kids to be more Jewish, progressive groups looking to get people to be more politically active (or at least to contribute money to the right PAC), or my editors asking me to get more people to buy my books. And I think the object of the game is to get out of the mindset of "getting people to do something" and instead just create a really nice, really open invitation.</blockquote> <blockquote></blockquote><br />The key to doing this, Rushkoff believes, comes in the form of <span style="font-style: italic;">connections</span>:<br /><br /><blockquote>My whole pitch on marketing and communications is for companies to stop creating mythologies and persuasion campaigns around the products that they're disconnected with, and to<span style="font-weight: bold;"> start getting involved in some aspect of the thing they're selling</span>. [emphasis added]<br /></blockquote><br />It definitely has a larger focus on things like, say, marketing or politics than a broader, in-general How To Convince People About Stuff sort of persuasion, but I also believe there's room for some overlap here into "our" topics such as presentation of history outside the academy. As historians, we may not be selling a product in the conventional, give-us-money-we-give-you-stuff sense - though the universities may well be, given the rise of the student-as-customer mindset (which is a whole other rant anyway). 
But we <span style="font-style: italic;">are</span> trying to get ideas across to others, and most of us at least aren't trying to limit that to a stagnant, preaching-to-the-choir sort of situation.<br /><br />I'm not, at least, as someone who's studying public history, and also as someone who has his own portion of that vaguely ivorytowerian "why oh why don't people know anything about their history woe woe arrgh?" angst (which I'm sure I share with many of those who read this). Looking around for ways to reach audiences, or perhaps even create them, seems like something worth chasing to me. I'm normally allergic to marketing lingo, joking that people should need a license to use the word "paradigm" in a sentence, but this interview at least piqued my interest enough to try to <strike>get</strike> <em>persuade</em> you guys to take a look at it and think about some of it.<br /><br />It's worthwhile for those two points I quote above, I believe: that you can't really <span style="font-style: italic;">make</span> someone be interested in something (after all, as several of us discussed yesterday, the consumers' - and audience's - thoughts and beliefs <span style="font-style: italic;">will</span> remain their own, beyond our feasible reach, unless they themselves decide otherwise), and that some kind of involvement and connection - <span style="font-style: italic;">doing</span> and <span style="font-style: italic;">being</span> instead of simply <span style="font-style: italic;">selling</span> or <span style="font-style: italic;">pushing</span> - is probably a better way to spark interest in others.<br /><br />Great! It's all so clear now!<br /><br />Well, aside from the implementation part. Yeah.<br /><br />I definitely like and agree with the idea. The question of how we can <span style="font-style: italic;">do </span>these things, of course, depends on as many separate variables as our interests and circumstances and projects may present. 
So I don't know. On the one hand, the advice may seem unnecessarily vague, especially if we're a little outside the box as historians. On the other hand, it's still useful for all its vagueness: blank checks can be fun!pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0tag:blogger.com,1999:blog-2249968205425769817.post-88959393291745363932007-09-26T09:38:00.000-07:002007-09-26T09:52:22.479-07:00Technopeasants of the Academy, Unite!Academia isn't, of course, the only realm where people are currently <strike>going at it hammer and tongs</strike> debating the implications of the Internet as a tool for production and distribution of ideas both new and old. Some of those realms might surprise people who are entering or becoming fully aware of the debate within history or other such fields, though.<br /><br />Back in April, a fellow by the name of Howard Hendrix <a href="http://community.livejournal.com/sfwa/10039.html">flew off on a self-described "rant"</a> condemning writers who use the Internet to give their work away freely. He says that he is "opposed to the increasing presence in our organization of webscabs, who post their creations on the net for free," going on to define the neologism "webscab" as someone who undercuts his fellow workers (or in this case writers), thereby undermining the fight for better pay and working conditions, etc. He says they are "rotting our organization from within" along with a few other similarly loaded turns of phrase, and goes on to describe the victims of webscabs - the people who sell their work in the traditional venue - as being converted into "Pixel-stained Technopeasant Wretch[es]." The existence of these webscabs, in fact, offends him so much that, as part of his right to resist technology he sees as "destructive to [his] ways of life and [his] beliefs," he's decided not to seek a renewed term as the vice-president of his organization. 
After his term ended, he would in fact step away from technology altogether, saying he'll answer emails but "won't blog, wiki, chat, post, LiveJournal, lounge or lurk -- and [he]'ll be the happier for it."<br /><br />So what's so unusual about this? It does, after all, sound like a kind of complaint that has come from a variety of different directions in the last few years, though worded in less confrontational terms. And confrontational those terms are; Howard Hendrix's words sparked outrage of terrible power, still palpable when people in his field discuss it today, several months later.<br /><br />Oh, I forgot to tell you what his organization is? I should probably do that - Howard Hendrix was vice-president of the Science Fiction and Fantasy Writers of America, one of the primary SF/F-related organizations on the planet, source of the Nebula Award, one of the higher honours a writer in either field can receive.<br /><br />In a blog article over on <a href="http://www.boingboing.net/">Boing Boing</a>, Cory Doctorow - who in January 2003 released his first novel, <u>Down and Out in the Magic Kingdom</u>, online for free in addition to releasing it as "a physical object" in bookstores, under a Creative Commons license - <a href="http://www.boingboing.net/2007/09/24/debate-pixelstained.html">gives his own views on Hendrix's statements</a> after having followed a debate between Hendrix and "web-novelist/podcast-novelist" Scott Sigler at a science-fiction event in San Francisco last week. As one can safely assume from the fact that Doctorow <i>is</i> one of the "webscabs" Hendrix rails against, he takes strong exception to the arguments against using the Net as a medium for releasing new material, particularly if it's being done freely. Hendrix made various arguments, ranging from economic problems to mere rhetoric, and these seem to be addressed in turn well enough. 
While Hendrix is obviously in the minority within the SFWA on this opinion - witness the vitriol in the LiveJournal thread relaying his original statement - the debate continues there to the present day.<br /><br />So if big names in the <i>science fiction community</i>, one of the more technology-friendly bodies of people on the planet (at least for the most part), are arguing back and forth over the Net as a medium, there's clearly something worth discussing here. As I am posting it here, I rather obviously believe that there's some connection to digital history practices.<br /><br />One of Doctorow's points, and <acronym title="In My Humble Opinion">IMHO</acronym> his most important one, is that the Net as a medium "diversifies the ways in which works find audiences and vice-versa, undoing the 20th century's enormous trend to concentration and more bargaining power for fewer media companies." The concerns about monopolization of knowledge are probably less pressing in academia - at least in the humanities or social sciences, once one escapes the freshman-level tomes. However, the potential benefit of getting information to audiences which <i>want</i> information but may not have access to it - or indeed, may not even know the information they want is out there[1] - is fairly obvious to me. I, like Doctorow, flatly reject arguments that suggest people will stop buying books, or the "ugly straw-man, visibly untrue" that those who support this kind of ready distribution are naive optimists.<br /><br />What I <i>do</i> see as an issue in online distribution - particularly of the free and unfettered kind, <i>particularly</i> particularly of the free and unfettered kind dealing with academic topics such as history - is the problem of quality control. <a href="http://www.science.uva.nl/~mes/jargon/s/sturgeonslaw.html">Sturgeon's Law</a> holds for a lot of things released online, to the point where one may think that Sturgeon was perhaps being a little optimistic. 
There is no shortage of <a href="http://timecube.com/">incomprehensibly weird, if sincerely-expressed</a> material out there[2], and I do believe separating the good from the bad, or the bad from oh-my-God-make-me-unsee-that, is a problem. While it's different in degree from what we run into in the average bookstore, or even the average university library, I don't think the matter is that different in <i>kind</i>.<br /><br />Others may disagree, of course; I know full well that I'm nowhere near the <a href="http://digitalhistoryhacks.blogspot.com/2007/04/luddism-is-luxury-you-cant-afford.html">"self-proclaimed Luddite"</a> camp (and in fact often joke semi-seriously that I would love a brain-to-Photoshop interface for working with graphics). I certainly see more potential than risk or threat in the digital age, though. I'm unconvinced by the arguments surrounding The Imminent Death Of The Book and other such things, and have always believed that if someone wants to get their work out there for free (or for whatever they want to request in exchange for it), more power to them. The Internet and the media shooting off from it make it easier to do things like that <a href="http://youtube.com/watch?v=bDaB-NNyM8o">in some unorthodox and interesting ways</a>, and I enjoy seeing some of the things that can result from that.<br /><br />As for the quality issue? I dunno. Even the bad stuff out there can spark discussions which can lead in interesting directions. And the bad stuff out there that doesn't do that, which doesn't languish either, but merely incites or misleads or otherwise displays itself as the result of abuses of history or cryptohistory? Well, they put their stuff out there, so what's preventing us from issuing forth refutations? 
Hendrix complained about the impact of free releases on the SF industry; Doctorow's response in both words (his refutation of Hendrix's argument) and deed (his first novel was a commercial success <i>despite being available for no-strings-attached free download</i>, and nearly won a Nebula besides[3]) is perfectly clear.<br /><br />[1] - I assume you've all had your fair share of "so <i>that's</i> what that is! now I have a <i>name</i> for it!" moments. If not, what's your excuse?<br /><br />[2] - A friend of mine is attempting to popularize the idea of using "the Timecube" as a unit of measurement for just how, well, stark raving mad a given source or person or argument sounds. "Some guy called into the radio talk show this morning, and flew into this incoherent rant that topped out around 0.8 Timecubes!"<br /><br />[3] - As a result of this whole debacle, <a href="http://www.boingboing.net/2007/04/15/april-23-is-internat.html">International Pixel-Stained Technopeasant Day</a> was declared for April 23rd, on which authors (and anyone else who wanted in on it) could release "a professional-quality work" for free on their websites. Somewhere between scores and hundreds of people, including some fairly big names in the field, participated, and thousands of amateurs had fun with it as well.pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0tag:blogger.com,1999:blog-2249968205425769817.post-4634158138934056912007-09-09T13:05:00.000-07:002007-09-10T15:24:42.252-07:00Cleverly-Named Introductory Post"First post!" as they say over on <a href="http://slashdot.org/">Slashdot</a>. Not quite as satisfying since I'm the only one who <span style="font-style: italic;">can</span> post here, but oh well. That just means I can claim the right to a moderately rambling and self-indulgent introduction, doesn't it?<br /><br />Who I am can be seen off to the right of this post, and this blog exists as a component of that public history program. 
Specifically, it's a component of <a href="http://digitalhistory.uwo.ca/h513_0708/">History 513</a>, also known as Digital History: Methodology for the Infinite Archive, taught by Professor Turkel here at the <a href="http://www.uwo.ca/">University of Western Ontario</a>.<br /><br />One of the fun things about being in a program as (relatively) obscure as public history is the raised eyebrows. If I had a nickel for every time in the last few months I've been asked "public history? What's that?" I would probably be gazing down upon you all in air-conditioned comfort from the privacy of my newly-purchased <a href="http://news.bbc.co.uk/2/hi/science/nature/6253054.stm">space hotel</a>. It tends to result in interesting discussions, at least, and the topic is usually quickly understood, even if it's seen as a bit weird, not the standard "History of [Topic] in [Place] during [Time]" that most people associate with history classes.<br /><br />When course schedules come up, and it's learned that I have something called <span style="font-style: italic;">Digital </span>History on Wednesday afternoons, the same sort of question shows up, with a little less confusion, a little less "what's that?" and a little more incredulity, a little more "<span style="font-style: italic;">why's</span> that?" Shouldn't history courses, after all, deal primarily with the hows and whens and whos and whats of the past - preferably involving, to complete the cliche, <a href="http://www.hup.harvard.edu/loeb/">Dusty Old Tomes</a> and entweeded professors with four hyphens in their surnames? Short of its use as an archiving tool, maybe, isn't most modern technology either irrelevant to the study of history or even an active hindrance to it?<br /><br />I obviously don't think so, otherwise I wouldn't be here, doing this.<br /><br />On the one hand, I'm well aware of the arguments and controversies and confusion and even contempt about the ubiquity of technology in modern life. 
I remember a large part of it firsthand, particularly the aspects to do with the Internet back in the early nineties. ("The Information Superhighway: Threat Or Menace?") On the other hand, I grew up sufficiently around and engaged with most of it that I take much of it completely for granted, wondering what the <span style="font-style: italic;">fuss</span> over new technology is rather than wondering what the <span style="font-style: italic;">point </span>of it is.<br /><br />It's an interesting position to be in. Even though I can sympathise with some of the revulsion towards the more <a href="http://xkcd.com/296/">absurd</a> or <a href="http://xkcd.com/281/">annoying</a>[1] aspects of connected culture, and tend to take issue with a lot of the jargon and hype within it - I'm allergic to the prefixes "<a href="http://www.worldwidewords.org/articles/cyber.htm">cyber</a>" or (when followed by <span style="font-style: italic;">anything </span>other than "-mail") "e" - I probably qualify as one of the digerati or whatever term is used to describe netizens this week. At least, I've been around net.culture long enough to remember "what's your major?" being a pickup line and to understand why today is September 5122, 1993, so I know it's got its share of unique quirks and familiar mundanities. I think I've chosen my camp.<br /><br />So here I am, in any case, one more drop in a delightfully growing noosphere. Obviously I'll be posting about digital history; it is, of course, <strike>required on the syllabus</strike> a terrific confluence of several of my main interests. This kind of topic, the wealth of resources and opportunities under its umbrella, brings together a lot of things: history proper, its research and presentation[2], ways of outreach that bring the materials to an audience outside the classroom or archives, and the various facets of digital and online cultures. 
It helps create the kind of environment in which someone can easily <a href="http://www.perseus.tufts.edu/cgi-bin/ptext?doc=Perseus%3Atext%3A1999.01.0125;layout=;loc=1.1.0;query=chapter%3D%232">read Herodotus in the original</a>, catch <a href="http://www.bbc.co.uk/radio4/history/document/document_20070723.shtml">radio shows on obscure conspiracies</a>, see the past <a href="http://www.worldwaronecolorphotos.com/">without</a> <a href="http://www.worldwaronecolorphotos.com/">the</a> <a href="http://www.loc.gov/exhibits/empire/">monochrome</a>, discuss <a href="http://groups.google.com/group/soc.history.what-if/topics?lnk=gschg">counterfactual history</a> (or <a href="http://groups.google.com/group/soc.history.ancient/topics?lnk=rgh">more conventional fields</a>) in a lively and active environment... the list goes on, and I am most happy that it does.<br /><br />Much as I am aware and understanding of the anxiety and controversy over many aspects of the Net in this day and age, and much as I <a href="http://xkcd.com/218/">wonder about some of the people on it at times</a>, I remain a pretty enthusiastic supporter of its adoption by historians (and people in just about any other field, from <a href="http://mathworld.wolfram.com/">mathematics</a> to <a href="http://mailleartisans.org/">metalworking</a>). It's an area where a lot of the potentials are only starting to be tapped, and a lot of the concerns are, I believe, somewhat overblown, and it's something I intend to explore in considerable detail.<br /><br />When I first became aware of this sort of field, back before I had a name for it, I almost immediately thought "I want in on this." As I start the transition from "just" studying history to "doing" it, that remains the case.<br /><br />[1] - Don't lie: you've done this.<br /><br />[2] - One of the more amazing examples of this I've encountered can be seen <a href="http://www.youtube.com/watch?v=RUwS1uAdUcI">here</a>. 
In this video, Hans Rosling, a professor of international health at the Karolinska Institute in Sweden, makes a hard-to-ignore point about the usefulness of new technologies as teaching tools. Taking the fairly abstruse topic of developing-world demographics over the last several decades, he handily spins together a presentation which is both riveting and very easy to understand. Rosling's point - that a lot of relevant, important, practical information is lying fallow, and could be understood and applied very easily given just a different perspective - becomes clear as day starting at about five minutes in.pstewarthttp://www.blogger.com/profile/13634698609139164994noreply@blogger.com0