What Clicks? An Interim Report on Audience Research

Jim Ockuly, Minneapolis Institute of Arts, Minneapolis, Minnesota, USA.

Abstract

The Minneapolis Institute of Arts is conducting a major research and development project assessing its audiences' awareness, usage, and satisfaction regarding its interactive media/Web resources. The Institute, an encyclopedic art museum, produces and maintains two Web sites (http://www.artsmia.org and http://www.artsconnected.org, the latter in conjunction with Walker Art Center). It also provides its physical visitors with a host of interactive media programs in the museum itself, from a museum directory to a large number of permanent collection-based programs located throughout the building. The research has been designed to a) measure audience awareness, usage, and satisfaction regarding these resources; b) respond to those findings with improvements; then c) re-measure to gauge the effect of the improvements. A further goal of this project is to share its logic model, methodology, instruments, and findings with the museum community. To date, the benchmarking data has been compiled and analyzed. Production and marketing work are now in progress. This paper is a mid-project report on what we've learned so far, and where we're going. It includes detailed descriptions of the instruments used for data collection (from Usability Lab to Web and in-gallery surveys), presentation of the initial findings, and a discussion of how those findings informed the actions now being taken. What Clicks? is funded by an Institute of Museum and Library Services (IMLS, http://www.imls.gov) National Leadership Grant.

Keywords: audience research, evaluation, interim report

Figure 1: The Minneapolis Institute of Arts
Figure 2: MIA Interactive Directory

Introduction

In 2000, The Minneapolis Institute of Arts (MIA) received a National Leadership Grant from the Institute of Museum and Library Services (IMLS, http://www.imls.gov). The grant's purpose was to fund a media and technology-oriented audience research and development project. The project, titled What Clicks?, is now underway. This is an interim report on the project's progress, findings, and activities to date.

Figure 3: African Art and Culture Interactive Learning Station
Figure 4: General Web site home page (www.artsmia.org)

What Clicks? Project Profile

The specific purpose of the What Clicks research and development project is to study how effectively The Minneapolis Institute of Arts' digital media resources serve their audiences. This includes measurement of audience awareness, use, and satisfaction as applied to the museum's Interactive Directory, Interactive Learning Stations, and Web site. Further, the project aims to interpret and respond to the initial baseline findings by setting aside significant time to improve the above resources, and then to re-measure in an effort to gauge the impact those improvements have made. Ultimately, it is hoped that the What Clicks project will benefit The Minneapolis Institute of Arts and its audience through the identification of audience needs, and in turn benefit other museums and their audiences through the publication of both the project's process and its findings.

Museum Profile

Established in 1883 and now considered one of the top ten art museums in the United States, The Minneapolis Institute of Arts has built an encyclopedic collection of approximately 100,000 objects dating from classical to contemporary times. The Institute offers regular programs that include permanent collection display, special exhibitions, lectures, classes, and tours. It also has a longstanding record of using media and technology to help connect its audiences with art, particularly under current Museum Director Evan Maurer.

Figure 5: Entrance to the special exhibition Eternal Egypt: Masterworks of Ancient Art from The British Museum
Figure 6: Detail of one of the museum's main entrances

The Minneapolis Institute of Arts employs a staff of seven full-time professionals in its Interactive Media Group, or IMG. The IMG works with other museum staff to design, implement, and support the Institute's media and technology resources.

Currently, the museum houses an Interactive Directory as an information aid for visitors, and 17 Interactive Learning Stations situated throughout the building. The museum also offers both special exhibition and permanent collection audio tours, as well as an electronic Daily Events screen.

For the off-site audience, the Institute provides two Web sites: the general museum site (http://www.artsmia.org) and ArtsConnectEd, a resource designed for K-12 teachers and students (http://www.artsconnected.org). ArtsConnectEd is a joint project of The Minneapolis Institute of Arts and Walker Art Center (http://www.walkerart.org).

The museum recorded approximately 500,000 on-site museum visits in fiscal year 2001/2002, while the Institute's general Web site logged roughly 2.5 million visits, and ArtsConnectEd approximately 500,000. For several years, on-line visitation has grown consistently at a rate of around 50% per year.

Interactive Directory

The Directory consists of three touch-screens that are used from a standing position or from a wheelchair. It is located in the inner lobby at the museum's most frequently used entrance. Contents include information about special exhibitions, permanent collection galleries, lectures, films, Family Days, tours, membership, and amenities (restrooms, coat check, cafes, etc.).

Interactive Learning Stations

The museum's 17 Interactive Learning Stations each concentrate on a specific area of The Minneapolis Institute of Arts' permanent collection (e.g. photography, Prairie School architecture, African art, etc.). They provide further content and context for works of art on display. Early thinking was to avoid creating a centralized media ghetto. Most are installed in discrete spaces in or near the galleries whose objects they address. Some are installed in plain view in permanent collection galleries. They range from video "jukeboxes" with a small set of linear segments to highly interactive Web programs with database components.

Web Site

The Web site acts as both an aid for museum visitation and an on-line art resource. Its major sections are The Collection, Special Exhibitions, Events, Visit, General Info, Education, Interactive Media, Join, Shop, and Electronic Postcards. There are also prominent links on the home page to ArtsConnectEd, press releases, and highlighted exhibitions and events. The site has been noted for its in-depth permanent collection programs (e.g. Modernism, Arts of Asia) and on-line curriculum units (also available through ArtsConnectEd). Awards and high use have testified to the quality of these programs. There is presently no e-commerce or transaction-based activity on the site.

What Specifically is What Clicks Measuring?

Awareness > Use > Satisfaction

Audience awareness, use, and satisfaction, with each condition leading to the next, constitute the What Clicks mantra. Because of the breadth of the Institute's electronic media resources and the impossibility of studying audience relationships within that entire range, there was a desire to limit the scope of the What Clicks project. It was decided that the main focus of What Clicks, in terms of media resources, would be the Interactive Directory, two of the 17 Interactive Learning Stations (Arts of Asia and African Art and Culture), and the museum's general Web site, artsmia.org. General public awareness of all of these resources was also measured, and basic demographics were captured to see if awareness levels were consistent across groups.

Process Outline

Baseline Research > Interpretation/Analysis > Enrichment/Redesign > Follow-up Research

The project process can be reduced to a general arc, starting with baseline audience measurement for benchmarking; then analysis and interpretation of findings; a six-month period of enrichment and redesign; then a second round of audience measurement, performed exactly a year after the baseline measurement and conducted with identical instruments. Interim and summary reporting, both internally (to MIA staff) and to the field, are woven into the process. The project is currently in the enrichment and redesign phase.

Getting Started

The first step was to form a project team that included the museum's Assistant Director and members of the Interactive Media Group, the Education Division, the External Affairs Division (which includes Marketing and Communications), and the museum's Visitor and Member Services department. These people would work directly with the research data and are primary decision-makers in matters regarding the Institute's media programs.

Logic Model and Evaluation Plan

To focus the project team's thinking, a consultant (evaluator Mary Ellen Murphy) was hired to work with the group in developing a project evaluation plan, including a logic model. This also addressed a grant component, since an outcome-based process and final report had been requested by the Institute of Museum and Library Services. Before any audience research was done, the project team attempted to envision a process whereby the initial goals of the project might be met (the goals being to increase audience awareness, use, and satisfaction regarding the Institute's electronic media resources, and to share the process that led to that outcome with peer institutions). The resulting logic model is expressed in a spreadsheet that includes columns headed by the following questions (one such row is sketched after the list):

  • What are the desired outcomes?
  • Who will benefit?
  • What key activities will bring about the intended changes?
  • What are the inherent values of the proposed actions?
  • What are the — perhaps hidden — assumptions about the actions?
  • What kind of lasting impact is expected?
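
For readers adapting this approach, each row of the spreadsheet pairs one desired outcome with answers to the remaining questions. As a minimal sketch only, here is how such a row might be typed in TypeScript; the field names paraphrase the column headings above, and the example values are invented, not drawn from the actual What Clicks logic model.

  // Hypothetical typing of one logic-model row; the example values are
  // illustrative, not taken from the Institute's actual document.
  interface LogicModelRow {
    desiredOutcome: string;   // What are the desired outcomes?
    beneficiaries: string[];  // Who will benefit?
    keyActivities: string[];  // What key activities will bring about the changes?
    inherentValues: string[]; // What are the inherent values of the actions?
    assumptions: string[];    // What are the (perhaps hidden) assumptions?
    lastingImpact: string;    // What kind of lasting impact is expected?
  }

  const exampleRow: LogicModelRow = {
    desiredOutcome: "Increase visitor awareness of the Interactive Directory",
    beneficiaries: ["museum visitors", "Visitor and Member Services staff"],
    keyActivities: ["redesign the attract screen", "add entrance signage"],
    inherentValues: ["self-directed wayfinding", "better-informed visits"],
    assumptions: ["visitors will notice the redesigned kiosk"],
    lastingImpact: "Media planning grounded in measured audience needs",
  };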

The project team met several times in this phase, each time getting closer to consensus, an important condition if the project was to get off to a good start. This process revealed the kinds of questions that the team was most interested in answering and eventually informed the survey instruments themselves.

An evaluation plan was developed so there would be a way for the project team to measure success after what was sure to be a long and complex process, and also to provide some early, agreed-upon project structure. The plan matrix included:

  • Outcomes (e.g. increase on-line visitors' awareness of the depth and breadth of artsmia.org)
  • Indicators (e.g. increased duration of user sessions)
  • Methods (e.g. Web stats pre-/post-comparison of user session duration and server directories visited; a sketch of such a comparison follows this list)
  • Person(s) Responsible
  • Timeline.
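
As an example of how a Methods entry might translate into practice, the following TypeScript sketch compares mean user session duration before and after the enrichment phase. It assumes durations have already been extracted from server logs; the numbers are placeholders, not project data.

  // Compare mean session duration (in minutes) before and after the
  // enrichment phase. The arrays are made-up placeholder values standing
  // in for durations parsed from Web server logs.
  function meanDuration(sessions: number[]): number {
    const total = sessions.reduce((sum, d) => sum + d, 0);
    return total / sessions.length;
  }

  const preSessions = [2.5, 4.0, 1.2, 6.3, 3.1];   // hypothetical baseline
  const postSessions = [3.8, 5.5, 2.0, 7.1, 4.4];  // hypothetical follow-up

  const pre = meanDuration(preSessions);
  const post = meanDuration(postSessions);
  console.log(`Mean session: ${pre.toFixed(1)} min before, ` +
              `${post.toFixed(1)} min after (+${(post - pre).toFixed(1)} min)`);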

Survey Instruments and Methodology

With an agreed-upon logic model and evaluation plan in hand, a consulting firm (Cincinnatus, Inc., http://www.cincinnatus.com) was hired to develop the instruments to be used in the audience research and then to carry out the baseline study itself. The project team and consulting firm identified the following research instruments:

  • an Internal Focus Group, involving staff identification of existing best practices in interactive media production at The Minneapolis Institute of Arts
  • a Usability Lab Study
  • a Technology Awareness Survey to be given to people walking into the museum
  • a Directory Survey for people who had just used the Interactive Directory
  • a Learning Station Survey for people who had just used one of the two Learning Stations being studied
  • an Online Web Survey (delivered via both pop-up and pop-under windows) which would appear on the artsmia.org Web site for a limited time period.

There was also existing data from a previous audience survey that included some questions about media and technology, as well as demographic information for comparison.

Instrument Details

Each instrument was developed collaboratively among the project team, the research firm, and the IMG. Feeding into this process were the logic model and evaluation plan developed earlier.

Internal Focus Group

The What Clicks project came at a time in the Institute's development when a great deal of successful work in the realm of interactive media had already taken place over the course of more than a decade. To identify the best practices that had contributed to the museum's success so far, a roundtable discussion was held. It was facilitated and recorded by members of the Cincinnatus consulting firm, and included several members of the Interactive Media Group and the Chair of The Minneapolis Institute of Arts' Education Division. A transcript of the conversation resulted, as did lists of Best Practices and Goals. This instrument was implemented before any of the others and proved to be a catalyst for thinking about how far the museum had come, and where it was hoping to go next in terms of interactive technology.

Usability Lab Study

The Institute was fortunate to partner with Minneapolis-based Target Corporation in the Usability Lab Study. While the What Clicks project team brought questions, ideas, and desired outcomes to Target, it was Target's well-developed process, their ability to accommodate a slightly unusual (read non-commercial) client, and their newly reinstalled facilities that made this study work.

Three days of testing were planned, with three subjects (users) per day. (In the end, one subject was unable to attend, so the total was eight.) The users had been screened to ensure a good mix of people who had personally visited the Institute versus those who had not; museum members versus non-members; and people with a high degree of interest in art versus those with moderate interest. Experience with Web design disqualified potential subjects. A monetary incentive of $50 was offered to compensate each user for the time commitment.

At the start of each session, the subject was led into the facility and briefly interviewed. Some of the initial screening questions were repeated (for example, Have you been to The Minneapolis Institute of Arts?) and, for the first time, users were asked whether they had been to the Institute's Web site.

Then, sitting at a monitored computer with a browser open to an unrelated Web site, each user was asked to find The Minneapolis Institute of Arts' site. Following this step, users were presented with scenarios one at a time and allowed several minutes to complete each. The facilitator got them started, then left the room, encouraging them to verbalize their experience and impressions as they went. Audio and video recordings were made as the sessions unfolded. In the control room, members of the project team and other Institute staff observed the tests and recorded their own comments.

The first scenario given the users was simply to browse the site based on their personal interest. The next was to imagine they had visitors coming in a few weeks and wanted to get an idea of what would be going on at the museum at that time. Then, with their visitors' arrival just days away, they were asked to get more logistical information. Sometimes the facilitator would re-enter the room to ask a specific question like, "Would you be able to see [a particular film] as part of your visit?" or "What would you do if your visitors' children were especially energetic?"

Other scenarios were designed to get users to specific parts of the site. In one, users were encouraged to research the work of Frank Lloyd Wright. Ideally, this would get them into the Permanent Collection section as well as to one of the site's many rich interactive sections, in this case the Unified Vision project (http://www.artsmia.org/unified-vision). Another scenario encouraged users to find and send an E-Postcard.

After each session, the project team met with Target staff in a conference room to review the session and record observations. This led to a list of findings, some of which turned up time and again over the three-day period. The list became an important working document; the IMG is relying heavily on it during enrichment and redesign.

Technology Awareness Survey

In an effort to measure general awareness of museum technology among museum visitors, 379 visitors aged 15 or older were randomly intercepted on their way into the museum between August 2 and August 15, 2002, and given an interviewer-administered questionnaire. The questions were designed to gauge visitor awareness of various facets of the museum (restaurant, coffee shop, Interactive Directory, etc.) and to allow comparisons between technology-based and non-technology-based amenities. Visitors were also asked when, if ever, they had last visited, and basic demographic data was gathered. A small incentive gift (a packet of MIA postcards) was offered.

Directory Survey

A total of 128 Interactive Directory users were given a self-administered survey upon being observed using the Directory. This took place between August 16 and 29, 2002. The original sampling plan based on usage history was abandoned because of unexpectedly low traffic during the study period, and, instead, the researchers intercepted everyone who touched the Directory screen. The questions measured awareness (e.g. How did you first become aware of the Interactive Directory?), motivation (e.g. What initially motivated you to use the Directory?), and satisfaction (Users chose from a list ranging from Extremely Satisfied to Not at All Satisfied). Further questions asked specifically what could be done to improve the Directory, and, again, basic demographic information was captured. Packages of MIA postcards were given as an incentive for participation.

Learning Stations Survey

Museum volunteers intercepted 105 users of the two Interactive Learning Stations between August 16 and September 1, 2002. Because two Learning Stations in different parts of the building were being studied, motion detectors and pagers were used to alert the volunteer when a Station was in use. Each Learning Station's motion detector triggered an auto-dialer which caused the volunteer's pager to indicate which Station was occupied. The volunteer could then go to the Station in time to intercept the user. Users were given a self-administered questionnaire similar to the one used for the Directory. Again, packages of postcards were offered to survey participants.

Online Web Survey

Between August 28 and September 11, 2002, a pop-up survey appeared as visitors entered the artsmia.org Web site. The pop-up window, designed to graphically complement the site's aesthetic, contained text inviting visitors to participate in a survey by providing an email address. Those who complied received an email invitation to complete the on-line survey. (It was essential that visitors fill out the survey post-visit.) If visitors initially declined, a second window popped under their main browser window, with the idea that visitors would eventually see it upon exiting their browser. The pop-under again invited participation through email, but also offered the option of completing the survey immediately. A chance at a $500 Amazon.com gift certificate was offered as an incentive. A total of 573 people completed the survey; given 35,357 site visits during the survey period, the response rate was 1.62%. The on-line survey was designed to measure awareness, use, and satisfaction regarding the Web site itself, and collected demographic information consistent with the other instruments.
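
For those curious about the mechanics, the two-stage invitation described above can be approximated with standard browser scripting of the era. The sketch below is a TypeScript reconstruction based on this description, not the Institute's actual code; the URLs, window names, and decline handling are assumptions.

  // Hypothetical URLs; the real survey pages are not named in the paper.
  const surveyInviteUrl = "/survey/invite.html"; // asks for an email address
  const surveyFormUrl = "/survey/form.html";     // immediate-completion option

  // Stage 1: pop-up invitation shown as the visitor enters the site.
  function showInvite(): void {
    window.open(surveyInviteUrl, "surveyInvite", "width=400,height=300");
  }

  // Stage 2: on decline, open a second window and immediately push it
  // behind the main window, so the visitor discovers it on exiting.
  function showPopUnder(): void {
    const under = window.open(surveyFormUrl, "surveyPopUnder", "width=400,height=500");
    under?.blur();  // take keyboard focus away from the new window...
    window.focus(); // ...and return it to the main site window
  }

  showInvite();
  // The decline signal would presumably come from the invite window itself
  // (e.g. via window.opener calling showPopUnder()); modern pop-up blockers
  // would intercept much of this, but it was common practice in 2002.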

Major Findings

With reams of data from the surveys and countless subjective impressions from the more observation-based instruments, the project team and the members of the Interactive Media Group faced the formidable tasks of interpretation and analysis. A pattern quickly emerged. It was immediately clear that satisfaction was extremely high across the board. Once visitors found their way to the Institute's electronic resources, they generally reported positive experiences with the Directory, Learning Stations, and Web site. On the other hand, general awareness and, in some cases, usage, scored relatively low. This suggested that the biggest opportunity would be in getting more people to the resources. In brainstorming actions to be pursued in the enrichment phase, emphasis was placed on strategies that might increase awareness and use.

Of course, much light was shed on possible satisfaction enhancements as well, particularly through the in-depth research such as the usability lab study. Some relatively simple methods were subsequently employed to get at the satisfaction questions. (For example, Visitor and Member Services staff and Security personnel confirmed in interviews that museum visitors' top request is for help in locating specific works of art or types of art in the museum. This information supported both the survey data and decisions about improvements for the Interactive Directory.)

It should be noted that curatorial perspective was brought to the project at this point, and that the initial findings were reported to the entire staff. In fact, an internal communications plan was developed to get staff thinking and talking about the project, and to help generate interest in and support for eventual changes.

Directory Findings

  • Awareness of the Interactive Directory, when compared with other museum elements, scored relatively low (35% total aided and unaided awareness compared with 89% for both the museum's Information Desk and Shop, and 72% for the cafe).
  • Most awareness came through direct observation during a museum visit (78%).
  • Just 22% of museum visitors reported using the Interactive Directory.
  • Directory users said they were looking for exhibition and event information (which the Directory does provide).
  • Thinking the Directory was primarily a way-finder, 47% of visitors who had used it once or never said they knew their way around well enough.
  • 88% of users found the Directory extremely or very easy to use.
  • 59% were extremely or very satisfied with what they experienced.
  • 73% found the information extremely or very clear.
  • Ideas for improvements included (in order of popularity):
    • Show me where specific works of art are in the museum,
    • Add a daily events calendar,
    • Make Directories available at additional locations, and
    • Tell me what's going on in the museum at the moment.

Interactive Learning Station Findings

  • Learning Stations scored slightly higher than the Directory in terms of awareness (43% to 35%).
  • Most awareness came from direct observation (75%).
  • 58% of users of one Learning Station are not aware there are other Learning Stations in the museum.
  • Half of users say the Stations are "not very visible" or "easy to miss."
  • More visitors report having used the Learning Stations than the Directory (34% to 22%).
  • 70% of users find the Learning Stations extremely or very satisfying.
  • 87% find them extremely or very easy to use.
  • 82% say the information is extremely or very clear.
  • 76% say the Learning Station enhanced their understanding of the art very much or quite a bit.
  • When asked how the Learning Stations could be improved, users tended to favor improvements to the physical setting.
  • People tend to feel the Learning Stations are too isolated. 78% of users would like this kind of resource "in the galleries close to the art."

Web Site Findings

  • Museum visitors have a higher awareness of the museum's Web site (54%) than of its Directory (35%) or Learning Stations (43%).
  • A higher percentage of museum members (63%) than non-members (49%) are aware of the Web site.
  • One-quarter of Web site visitors are museum members, while one-third of on-site museum visitors are members.
  • Top motivators for visiting the Web site, in order of popularity, were: to find specific information about the museum, to plan a trip to the museum, to learn about art for personal enrichment, just browsing, to learn about art for a class assignment, and to look for employment opportunities.
  • Web visitors tend to be younger than museum visitors and reside further from the museum, and a higher percentage are employed.
  • Repeat visitation to the Web site is high (78%).
  • 80% of Web visitors found what they were looking for. An additional 10% were just browsing.
  • Top ideas for improvement included more works of art on the site and more information about works currently on view in the museum and on the site.
  • Usability Lab findings showed that while users didn't necessarily expect the depth and breadth of art information on the site, they were particularly pleased to find it. General impressions were very positive (look of site, amount of information). Most said the on-line experience revived their interest in visiting the museum itself.
  • Usability Lab findings also indicated some trouble with the relationship between the Visit and General Info sections, navigation back to the home page, understanding the calendar function and image zooming controls, and the relationship between the Collection section, the on-line multimedia programs, and ArtsConnectEd.

From Analysis to Action

The next step was to approach the enrichment and redesign phase with four separate but related projects in mind. Three would be based on the medium-specific resources (Interactive Directory, Learning Stations, and Web site) and the fourth would be a general marketing campaign that would raise awareness of all of the above.

The project team, having worked through an analysis of the data with input from the IMG, decided to form four working groups with each group addressing one of the major projects to be completed during enrichment and redesign. Each working group was to be chaired by a member of the project team, and was to include resource members from the IMG, Visitor and Member Services, and The Minneapolis Institute of Arts' External Affairs division (which includes Marketing and Communications). The working groups would meet anywhere from one to three times and report their recommendations back to the project team.

This process got underway and led to a clearer picture of which action steps were to be taken. In the course of planning these actions, the project team drew up a list of criteria that specific ideas would have to meet.

These criteria included:

  • Supported by visitor research
  • Doable within What Clicks enrichment/redesign timeframe (January through July, 2003)
  • Doable with What Clicks budget/MIA staff resources
  • Supports MIA Strategic Plan
  • Consistent with Grant/Logic Model
  • Benchmarks exist
  • Minimal long-term cost implications
  • Director's Approval

The process yielded a realistic set of proposed actions that were then taken to the Museum Director, who, having been primed with regular project updates, provided a thoughtful and thorough review. Discussions between the Director and the project team took place and, ultimately, the Director's approval paved the way for enrichment and redesign.

Case by Case: Recommended Actions

As the project is currently in the midst of the enrichment and redesign phase, what follows are general descriptions of the kinds of actions being taken in the four major projects.

Interactive Directory

Key ideas to emerge regarding the Directory include recasting the concept to make it clearer to visitors and to better address their immediate needs: What's going on right now in the museum? How can I find a specific work of art or type of art? The plan is to redesign the Directory's interface, contents, and installation in its current location. Next, additional locations will be tested. The Directory, currently a stand-alone multimedia program not connected to the event database or collections management system that feed the Web site, will be rewritten as a Web-based program. In response to findings indicating a desire for livelier graphics and images, a video preview of the museum's galleries and other offerings is under consideration; it would most likely be incorporated into the design as an additional non-interactive screen. The Directory will also be included in the general marketing campaign as an important component of in-museum technology.
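
To illustrate the kind of connection the Web-based rewrite makes possible, here is a minimal TypeScript sketch of answering "What's going on right now in the museum?" against an events feed. The record shape and sample data are assumptions; the paper does not describe the event database's actual schema.

  // Hypothetical event record; the real schema is not described here.
  interface MuseumEvent {
    title: string;
    location: string;
    start: Date;
    end: Date;
  }

  // Filter today's events down to those currently in progress.
  function eventsHappeningNow(events: MuseumEvent[], now = new Date()): MuseumEvent[] {
    return events.filter((e) => e.start <= now && now <= e.end);
  }

  // Example with made-up data:
  const todaysEvents: MuseumEvent[] = [
    { title: "Highlights Tour", location: "Rotunda",
      start: new Date("2003-02-01T13:00"), end: new Date("2003-02-01T14:00") },
    { title: "Family Day Workshop", location: "Studio",
      start: new Date("2003-02-01T10:00"), end: new Date("2003-02-01T16:00") },
  ];
  console.log(eventsHappeningNow(todaysEvents, new Date("2003-02-01T13:30")));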

Interactive Learning Stations

Given the finding that visitors find the Learning Station contents highly satisfying but have difficulty finding the Stations in the first place, most of the changes will be physical ones. Work is underway to make the Stations more visible through consistent signage and livelier attract screens. In some cases, dramatic changes to installations will be made, including the addition of two-sided vitrines in walls that currently hide the Stations. To address the desire to better incorporate these resources in the galleries, closer to the art, several instances have been identified where small LCDs can be installed next to objects. These screens will display object-specific video clips. If successful, this idea will open a whole new avenue for in-museum media. Also, the general marketing campaign will attempt to raise visitor awareness of the Interactive Learning Stations as effective tools for learning about the collection.

Web Site

Again, with awareness being a key factor, the major effort for the Web site will be to get more people to the site. Planned steps include featuring the site's URL more frequently and prominently in print publications and on take-away items, making better use of the members' magazine as a promotional vehicle, creating an on-line version of the Recent Acquisitions exhibitions, asking museum staff to incorporate the URL in their email signatures, and purchasing on-line ads. The findings of the usability lab will guide changes to the site's navigation, structure, and contents. Also under consideration is the addition of a new section featuring some of the Institute's recent acquisitions, a concept mirrored by the ongoing Recent Acquisitions exhibitions in the museum itself.

Marketing Campaign

As of this writing, a museum technology marketing campaign is being designed to attract more people to the Web site, Learning Stations, and Directories. The campaign is timed to have had substantial impact by August 2003, when re-measurement will be done. A healthy portion of the grant funding has been set aside for this very purpose.

Questions and Issues

As we've worked to get a grasp of the findings, proposed enhancements, and process complexities, several questions have arisen.

  1. Given that we've decided to use the very same instruments in both the baseline and follow-up research, how do we measure new ideas that have come out of the process? For example, the study clearly indicated that users of the Interactive Learning Stations would like to see pieces of those programs more closely integrated with the objects in the galleries. This led to the idea of installing LCD screens near some works of art. What this means is that the environment itself will be changing in a way that the survey instrument was not designed to measure. The Learning Station surveys were given to people who had just finished using one of the two Stations that were part of the study. How will we measure the impact of the LCDs? There are two potential solutions. One is to design a mini-study focused specifically on the LCDs. The other is to see if the Technology Awareness Survey indicates any increase specifically due to this new technology. It is likely that both solutions will be pursued. This methodological challenge also applies to the proposed physical changes to Learning Stations not initially studied. Of course, we could have decided to be strict with ourselves and require that each suggested change be measurable with the existing instruments, but that would have defeated the larger purpose.
  2. With the Web site constantly evolving, with changes big and small, how will we remember what state it was in during the initial survey period? How can we freeze the site so as to return to it a year later? While it is next to impossible to technically preserve a snapshot of such a dynamic site, we have incorporated a Web Change Log into our work process. When changes are made, they are recorded in a spreadsheet with Nature of Change, Date, and Reason for Change. (Some changes are happening in response to the What Clicks project, while others are part of the normal process that would be happening in the absence of What Clicks.) The Web Change Log will allow at least a crude way to mentally reconstruct the site as it was when the initial research was conducted. (A sketch of such a log follows this list.)
  3. Given a record of Web visitation growth of about 50% per year over the last several years, can we assume that rate would have continued this year, and that any increase beyond 50% could be attributed to enrichments coming out of the What Clicks project? (The arithmetic is sketched after this list.) Again, separating natural growth from growth caused by project initiatives will be tricky. Even without a major research project in the works, the Web site would be undergoing change; those changes simply wouldn't be as driven by audience data. One thing to watch, to the extent possible, will be where people are going on the site and where they're coming from. Also, any available external benchmarks for Web usage, particularly for museum sites, will be monitored.
  4. When multiple changes are made over the course of many months, how will we know specifically which change caused which result? This is a hazard of almost any research project. Resources don't allow for the measurement of each change, yet it may be possible to find opportunities for measurement as changes are implemented (e.g. the use of Web stats to show the popularity of a new section of the site). As much in-process measurement as is possible will be done during the current phase.
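
For question 2, here is a minimal sketch of a Web Change Log entry, assuming a simple typed record mirroring the spreadsheet columns named above; the entry shown is invented for illustration.

  // Sketch of one Web Change Log entry (Nature of Change, Date, Reason
  // for Change). The example entry is hypothetical.
  interface WebChangeEntry {
    date: string;           // when the change went live
    natureOfChange: string; // what was changed
    reasonForChange: "What Clicks enrichment" | "normal site maintenance";
  }

  const changeLog: WebChangeEntry[] = [
    { date: "2003-02-15",
      natureOfChange: "Added artsmia.org URL to staff email signatures",
      reasonForChange: "What Clicks enrichment" },
  ];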
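
And for question 3, the crude arithmetic of separating natural growth from project effects might look like the following; the roughly 2.5 million baseline visits and 50% historical growth rate come from the figures reported above, while the observed value is hypothetical.

  // If the historical ~50% annual growth simply continued, expected
  // visits are baseline * 1.5; anything observed above that is the
  // (crude) candidate for attribution to What Clicks.
  function excessGrowth(baselineVisits: number, observedVisits: number,
                        naturalRate = 0.5): number {
    const expected = baselineVisits * (1 + naturalRate);
    return observedVisits - expected;
  }

  // FY2001/02 general-site visits (~2.5 million) and a hypothetical
  // observed 4.0 million the following year:
  console.log(excessGrowth(2_500_000, 4_000_000)); // 250000 beyond trend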

Conclusion

Since this is an interim report, ultimate conclusions are still to come. The proof of the pudding, as they say, is in the eating. While much has been learned, it remains to be seen what impact the current phase of enrichment and redesign will have on audience awareness, use, and satisfaction. The emphasis is now on implementation structure and creative development. Currently, the project team is meeting monthly, the working groups are each keeping regular meeting schedules, the IMG has its own structure of project leadership and coordination, and ongoing museum-wide communication is taking place.

We look forward to the follow-up measurement and to sharing the results with our museum colleagues.