published: March 2004
More than just papers, MW2003 offered a chance for dialogue. The conference featured a variety of differently formatted interactions so that attendees could learn from the concrete experiences of others. Focused one-hour Mini-Workshop sessions introduced tools, methods, or techniques for developing, maintaining, and evaluating museum Web sites. Crit Rooms featured reviews of museum Web sites in "real time," and testing of attendees' Web sites took place in the Usability Lab. Interactions are listed chronologically below, or you can see an overview of the program.
A new format for 2003, the Professional Forum allowed MW attendees to voice their opinions about issues of concern to the profession. Conveners presented their cases for new kinds of professional information exchange and for new methods of museum content aggregation. Attendees spoke out about the proposals and made their own suggestions.
In the Crit Room sessions, experienced Web designers reviewed real museum Web sites and offered their comments. In this interaction, modeled on the art-school critique, Web sites were volunteered in advance by MW2003 attendees, who were present to respond. New in 2003 was the chance to offer in-depth feedback on a site under development: "A Day At Qumran."
On Friday, March 21, a "User Testing" laboratory ran all day. The purpose of the session was to give conference participants an opportunity to 1) observe user testing of museum Web sites in action; 2) volunteer as a user-test subject and discover some of the problems users have on unfamiliar sites; and 3) volunteer their own site to be tested. People were encouraged to drift in and out of the session throughout the day, for example as they moved from one talk to another. Each user test lasted about 20 minutes, with time for audience comments and questions, so it was easy for individuals to observe and even participate without sacrificing a large amount of time.

Mike Twidale and Paul Marty administered the user tests. Sites to be tested were not evaluated in advance; anyone could sign up for a time to have their site tested, and volunteer user testers were selected at random. The volunteer user temporarily left the room while the owner of the site described a typical scenario of use: something the average visitor to the site would be trying to do. These scenarios were converted into tasks which, together with some randomly selected standard tasks, were given to the user to perform during the test. The site was projected on a big screen so the audience could follow the user's experience. The user was then brought back into the room for a simple, low-cost, high-speed user test. Twidale and Marty demonstrated a variety of testing techniques throughout the day but emphasized the "thinking aloud" method, so that the audience could easily follow the test subject's thoughts. After each test, the user, the site owner, the test administrators, and the audience discussed what was learned.