
The Accessibility of Museum Web Sites: Results from an English Investigation and International Comparisons

Helen Petrie, Neil King, Centre for Human Computer Interaction Design, City University London, and Marcus Weisen, The Council for Museums, Libraries and Archives, United Kingdom

Abstract

Museums, like many other organizations, now use the Web to interact with their audiences. Thus the accessibility of museum Web sites is as important as the accessibility of their premises and services. This paper presents the results of an audit of 125 museum Web sites: 100 in England and a comparison group of 25 museums from around the world. The technical accessibility of the Web sites was assessed against the WCAG1 Checkpoints using the accessibility tool WebXM™. In addition, the extent to which disabled people could use the Web sites was assessed by a panel of 15 people with disabilities (5 blind people, 5 partially sighted people and 5 people with dyslexia) who undertook two simple tasks on each Web site and rated the site on a number of dimensions. The results of the automated testing showed that only 34% of English museum Web sites and 20% of the international group could meet even the most basic accessibility criteria. The user testing showed that disabled people could successfully complete very basic tasks on the Web sites only 76.2% of the time, and over one third of them felt lost at least once while navigating the sites. However, further analyses showed that these problems originated from a relatively small number of accessibility issues that are not especially complex to address.

Keywords: web accessibility, automated Web site testing, user testing, standards compliance, e-government policies, disability legislation

 

Introduction

Disabled people are among the most excluded in society and face many barriers in accessing the premises and services of museums. Disability legislation in a number of countries (e.g. Australia, Germany, the U.K. and the U.S.A.) establishes a right of access to services for disabled people, with some legislation specifically recognising Web sites as a service. For example, in the U.K., although the Disability Discrimination Act (1995) makes no explicit reference to Web sites, the accompanying legally recognised Code of Practice produced by the Disability Rights Commission (2002) makes it clear that a Web site constitutes a service. U.K. government policy also requires that public sector Web sites meet moderate levels of web accessibility (WCAG1 Level AA, see the Methodology section for a further explanation) by 2005. Similarly, in Germany, the disability equality law (Bundesbehindertengleichstellungsgesetz, 2002, see 217.160.60.235/BGBL/bgbl1f/bgbl102s2654.pdf), in which barrier-free access is a central concept, requires that Web sites meet the same level of accessibility. The European Union's eEurope Action Plan 2005 (see europa.eu.int/information_society/eeurope/2005/index_en.htm) also states that public Web sites should be accessible. In the cultural sector, the European MINERVA Project (www.minervaeurope.org) has established quality principles that will help the sector turn these policy objectives into practice. In the U.S.A., the Americans with Disabilities Act (ADA) also makes no explicit reference to Web sites, but a number of legal cases (although not all) have found that Web sites come within the scope of the Act. For example, a recent ruling in New York State found that two major travel Web sites were in breach of the ADA; the Web site owners were fined and required to make efforts to make their Web sites more accessible (see www.oag.state.ny.us/press/2004/aug/aug19a_04.htm).

Providing access to Web sites for disabled people is clearly a challenge museums cannot afford to ignore. In 2003, the English Council for Museums, Libraries and Archives (MLA) commissioned an investigation of the accessibility of museum, library and archive Web sites in England with an appropriate international comparison.  MLA is the government agency that provides strategic guidance, advice and advocacy to U.K. Government departments on museum, library and archive matters.  MLA's Operational and Strategic Plan sets out a vision for the central role of the sector in enabling easy access to knowledge, information and inspirations for all; supporting the Government's access, learning and inclusion agendas and driving forward the knowledge society and economy. The main objectives of the audit were:

  • to establish the current state of accessibility of museum, library and archive Web sites in England;
  • to benchmark the current state of accessibility against national and international standards;
  • to identify current areas of best practice and those which require improvement; and
  • to create a strategy for improving accessibility of Web sites in the sector. 

This paper will concentrate on the audit of English museum Web sites and the international comparison.

The audit was inspired by a study conducted by City University London in 2003/2004 of the accessibility of 1,000 general Web sites for the Disability Rights Commission (2004). This was the largest and most comprehensive web accessibility audit ever undertaken, and it was unusual in prominently involving extensive user testing to complement automatic testing of Web sites. A similar methodology was used for the MLA audit of museum Web sites, creating baseline data of unprecedented scope and breadth.

Methodology for the audit

A sample of 100 English museum Web sites was chosen to reflect the range of different museums: academic museums, those funded by local authorities, national museums and independent museums. Within each of these domains, a random selection was made from the complete listing of museums provided by MLA. The number of Web sites in each domain is given in Table 1, below. A sample of 25 national museums from across the world was selected to form the International comparison group. Three criteria were used in making this selection:

  • that the museum have an English language Web site (to allow us to understand the results of the automated accessibility testing and to have English-speaking disabled people conduct user testing with some of the Web sites; see below);
  • that the museums be drawn from all the continents of the world and a broad range of countries; and
  • that the museums be of national or international standing.

English museum Web sites            Number
Academic                                20
Local authority                         31
Independent                             30
National                                19
English museum total                   100

International comparison museums        25

Total sample                           125

Table 1:  Sample of Museum Web Sites

The criterion for technical accessibility was the extent to which the Web sites conformed to the World Wide Web Consortium's (W3C) Web Content Accessibility Guidelines (WCAG, version 1, henceforth WCAG1) (see www.w3.org/TR/WCAG10/) developed by the Web Accessibility Initiative (see www.w3.org/WAI/). WCAG1 provides 14 Guidelines that are divided into 65 specific Checkpoints. Each Checkpoint has a priority level (1, 2 or 3) assigned to it, based on the Checkpoint's expected impact on accessibility. Thus, violations of Priority 1 Checkpoints are said to have the largest impact on a Web site's accessibility, while violations of Priority 3 Checkpoints are said to have the least impact. If a web page passes all the Priority 1 Checkpoints, it is said to be Level A conformant; if it passes all Priority 1 and 2 Checkpoints, it is said to be Level AA conformant; and if it passes all Checkpoints (Priority 1, 2 and 3), it is said to be Level AAA conformant. If a whole Web site achieves a particular level of conformance, its owners may wish to display the corresponding logo from the WAI (see Figure 1).

    

Figure 1: WAI logos for Web site accessibility conformance
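
To make the relationship between Checkpoint priorities and conformance levels concrete, the short sketch below (our illustration, not part of the audit toolchain) derives the conformance level a page could claim from the priority levels at which automatically detectable violations remain. It deliberately ignores the manual checks that, as explained below, are also required before a genuine conformance claim can be made.

    def wcag1_conformance_level(violations_by_priority: dict) -> str:
        """Return the WCAG1 conformance level a page could claim, given the
        number of remaining Checkpoint violations at each priority (1, 2, 3).
        Illustrative only: real conformance also depends on manual checks."""
        p1 = violations_by_priority.get(1, 0)
        p2 = violations_by_priority.get(2, 0)
        p3 = violations_by_priority.get(3, 0)
        if p1 > 0:
            return "None"   # fails even Level A
        if p2 > 0:
            return "A"      # all Priority 1 Checkpoints pass
        if p3 > 0:
            return "AA"     # all Priority 1 and 2 Checkpoints pass
        return "AAA"        # all Checkpoints pass

    # Example: a home page with two Priority 2 violations and one Priority 3 violation
    print(wcag1_conformance_level({1: 0, 2: 2, 3: 1}))   # -> "A"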

An automated tool was used to check conformance to WCAG1: the accessibility module of WebXM™ from Watchfire (see www.watchfire.com/products/webxm/accessibilityxm.aspx). Like other automated testing tools (see www.w3c.org/WAI/ER/existingtools.html for a comprehensive listing of such tools), it can check the conformance of a Web site against some of the 65 Checkpoints. However, it should be remembered that many of the Checkpoints require some human judgement. For example, an automated tool can check whether images on a web site have descriptions associated with them (Checkpoint 1.1: Provide a text equivalent for every non-text element), but at the moment an automated tool cannot check whether the colour contrast on the page is adequate (Checkpoint 2.2: Ensure that foreground and background color combinations provide sufficient contrast when viewed by someone having color deficits or when viewed on a black and white screen); this requires human judgement. However, automated tools can give warnings, that is, indicate aspects of a page that should be manually checked by a human. To ensure that a site is fully conformant with the WAI guidelines, both automatic and manual checking of WCAG1 Checkpoints is required.
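
To illustrate the difference between a check that can be fully automated and one that can only be flagged for manual review, the sketch below (our own, and no reflection of how WebXM™ works internally) scans a page for images using Python's standard library: an image with no alt attribute at all is reported as a violation of Checkpoint 1.1, while an image that does carry alt text is reported as a warning, because only a human can judge whether that text is genuinely equivalent to the image.

    from html.parser import HTMLParser

    class AltTextChecker(HTMLParser):
        """Toy Checkpoint 1.1 checker: images without an alt attribute are
        violations; images with alt text are warnings needing a manual check."""

        def __init__(self):
            super().__init__()
            self.violations = []   # images with no alt attribute at all
            self.warnings = []     # images whose alt text needs human judgement

        def handle_starttag(self, tag, attrs):
            if tag != "img":
                return
            attrs = dict(attrs)
            src = attrs.get("src", "(unknown image)")
            if "alt" not in attrs:
                self.violations.append(src)
            else:
                self.warnings.append((src, attrs["alt"]))

    page = '<img src="logo.gif"><img src="map.png" alt="Floor plan of the museum">'
    checker = AltTextChecker()
    checker.feed(page)
    print("Violations (no alt text):", checker.violations)
    print("Warnings (check alt text manually):", checker.warnings)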

The home pages of all 125 museum Web sites were tested using the accessibility module of WebXM.  Following this initial audit, 7 English and 5 international museum Web sites were selected for in-depth automated and user testing. The selection criteria for these Web sites took into account the different museum domains, the varying popularity of the sites, the results of the initial automated testing and whether the site was embedded into a host site.  Up to 700 pages of each of these 12 sites (or the whole site if smaller) were tested with the WebXM™ accessibility module.

The 12 Web sites were also tested by a User Panel of 15 disabled people and accessibility experts at City University. The User Panel included blind people, partially sighted people and people with dyslexia. Previous research conducted into Web site accessibility by City University showed that these three groups are amongst the most disenfranchised users of the Web (Disability Rights Commission, 2004) and that testing by these groups detects most accessibility problems. 

The User Panel consisted of:

  • five people who are totally blind or have no functional vision (who use screen readers with synthetic speech or Braille output to interact with the Web)
  • five people who are partially sighted (who may use screen magnification programs or large screen monitors to interact with the Web)
  • five people with dyslexia (who may use specialist text to speech Web browsers to interact with the Web)

As far as possible, the User Panel reflected the diversity of disabled people in Britain in terms of age, gender, computing and Internet experience, and assistive technologies used. Although the research took place in London, we included people from other parts of England in the User Panel.

The user evaluations of Web sites were run individually in dedicated testing areas at City University.  Participants were provided with the assistive technologies they normally use such as JAWS, ZoomText, Read-Please and large screen monitors.  All the sites were evaluated three times – once by a member of each of the three User Panel groups. 

Each User Panel member assessed four Web sites, undertaking two representative tasks with each site: 

  • What time does the museum open on Mondays?
  • What facilities does the museum provide for disabled visitors?

Information collected for each task included:

  • Time spent attempting to complete the task
  • Whether the User Panel members succeeded or failed to complete the task
  • How easy the Panel members found it to perform the task, irrespective of whether or not they succeeded (rated on a scale of 1 to 7, where 1 indicates "very difficult" and 7 indicates "very easy")
  • What made it particularly easy or difficult to do the task
  • How easy the Panel members found it to navigate the site when attempting the task (rated on a scale of 1 to 7)
  • Problems encountered, as articulated by the User Panel members or observed by the experts

Additional information collected for a whole site included:

  • The extent to which the Panel members believed each site took their impairment into account (rated on a scale of 1 to 7)
  • Whether the Panel members experienced a feeling of being 'lost' when navigating around the site
  • What Panel members most liked and disliked about the site

Results

WCAG Conformance

34% of the English museum home pages and 20% of the International comparison group had no WCAG1 Priority 1 Checkpoint violations that automated testing could detect (see Table 2). However, all these home pages did attract WCAG1 Priority 1 'warnings' (for an average of 7.9 Checkpoints for the English sites and 9.6 Checkpoints for the International sites). For pages to be WAI Level A conformant, they must pass both the automated Level A checks and the manual checks indicated by the warnings. It is almost certain that some of the home pages would have failed some of the manual checks, so 34% is an upper bound on the proportion of English home pages that are actually Level A conformant.

Of the different domains in the English museums, the Local Authority and National museums fared best, each with 42% of home pages passing the automated WCAG1 Level A tests. Academic museums fared somewhat less well, with 30% of home pages passing, and only 23% of home pages for Independent museums passed. However, all these figures are higher than the corresponding figure found in the Disability Rights Commission's (2004) general investigation of British Web sites, which found that only 19% of home pages in a more general sample passed the automated Level A tests.

                         Priority 1    Priority 1 + 2    Priority 1 + 2 + 3
                         (Level A)     (Level AA)        (Level AAA)
Academic                    30%             0%                  0%
Independent                 23%             0%                  0%
Local Authority             42%             0%                  0%
National                    42%             5%                  0%
All English Museums         34%             1%                  0%
International museums       20%             0%                  0%

Table 2: Percentage of home pages that passed automated WCAG1 testing

Only one home page, that of an English national museum, had no automatically detectable Priority 1 and Priority 2 Checkpoint violations and was thus potentially Level AA conformant. However, this home page did have both Priority 1 and 2 'warnings' and may therefore not have been Level AA conformant. Nonetheless, this museum is to be congratulated for its efforts in accessibility.

No home page was free of automatically detectable Priority 1, 2 and 3 Checkpoint violations, so none could achieve Level AAA conformance.

Designer and User Measures of Web Site Accessibility

Looking at whether a web page passes a particular WCAG1 level of accessibility provides only a very basic and rather crude measure of its technical accessibility. A web page might miss achieving Level A conformance by failing on only one instance of one Priority 1 Checkpoint, or it might have 20 instances of violations of 10 different Priority 1 Checkpoints. At the other extreme, one can try to digest the full set of figures available on the technical accessibility of a web page: for example, the number of different Checkpoints violated, the number of instances of those violations, the number of different Checkpoints giving warnings and the number of instances of warnings (see Table 3). This overwhelms the reader with information and makes it very difficult to make meaningful comparisons.

                   Priority 1                  Priority 2                  Priority 3
                   Violations    Warnings      Violations    Warnings      Violations    Warnings
                   CPs   Insts   CPs   Insts   CPs   Insts   CPs   Insts   CPs   Insts   CPs   Insts
Academic           0.8   10.5    8.0   42.6    2.4   20.3    14.6  52.9    2.1   6.7     8.9   8.9
Independent        1.0   8.9     7.8   51.1    2.7   44.0    15.0  68.2    2.4   10.8    8.8   8.8
Local Authority    0.6   5.7     7.4   30.6    2.5   18.5    14.6  44.6    2.3   8.4     8.5   8.5
National           0.9   5.8     9.9   92.5    2.7   41.6    17.0  115.2   3.3   19.6    10.3  9.8
International      1.0   12.8    9.7   73.4    3.0   40.0    15.9  88.3    2.8   15.1    9.6   9.8

Table 3: Mean number of Checkpoints (CPs) violated, instances (Insts) of such violations and mean number of Checkpoints with warning indications and instances of warnings, for each level of Checkpoint priority

To try to provide more detailed, but comprehensible, information about technical accessibility, we have developed two measures of accessibility: the "Designer measure" and the "User measure" (Petrie and Hamilton, 2004). The number of different Checkpoints violated (at any level of priority) we have termed the "Designer measure" of accessibility, because it indicates the number of different accessibility issues the web designers of a particular site need to consider and understand in order to develop an accessible Web site. The number of instances of violations of all Checkpoints we have termed the "User measure" of accessibility, because every single instance of a violation of a Checkpoint is a potential stumbling block for a disabled user of the web page. These stumbling blocks may range from a problem which completely prevents a disabled web user from progressing any further in a Web site (which, following Nielsen, 19xx, might be termed an "accessibility catastrophe") to aspects of a page which simply cause irritation. Of course, not all disabled web users will be affected by all the instances of all the accessibility problems. Some instances relate specifically to the use of the Web by blind people, others relate to use by dyslexic people, and so on. In addition, disabled users will not always read every part of a web page, and will therefore not encounter every accessibility problem. Nonetheless, the total number of instances of violations of Checkpoints on a web page is a reasonable measure of its overall accessibility and usability from the viewpoint of disabled users, and of the quality of the disabled user experience. This analysis of automatically detected violations of Checkpoints, in terms of number of Checkpoints and instances of violations of those Checkpoints, can also be applied to the warnings for manual checks.

To illustrate the relationship between the Designer Measure and the User Measure, consider the following example. On a particular home page there may be violations of two Checkpoints: failure to provide ALT text for images (Checkpoint 1.1) and failure to identify row and column headers in tables (Checkpoint 5.1). Therefore the number of different Checkpoints violated, the Designer Measure, is 2. However, if there are 10 images that lack ALT text and 3 tables with a total of 22 row and column headers that are not identified as such, then the number of instances of violations of all Checkpoints, the User Measure, is 32. This example also illustrates how violations of a small number of Checkpoints can easily produce a large number of instances of violations, and potentially a very poor disabled user experience.
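
Expressed as a calculation, the two measures are simply the number of distinct Checkpoints violated and the total count of instances. The sketch below uses the hypothetical checkpoint counts from the example above, not data from the audit.

    from collections import Counter

    # Hypothetical violations detected on one home page:
    # 10 images without ALT text (Checkpoint 1.1) and 22 table headers
    # not identified as such (Checkpoint 5.1).
    violations = Counter({"1.1": 10, "5.1": 22})

    designer_measure = len(violations)        # distinct Checkpoints violated
    user_measure = sum(violations.values())   # total instances of violations

    print(designer_measure)   # 2  -> issues the design team must understand
    print(user_measure)       # 32 -> potential stumbling blocks for a disabled user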

                         Designer Measure    Designer Measure    User Measure    User Measure
                         (Violations)        (Warnings)          (Violations)    (Warnings)
English museum domains:
  Academic                    5.2                 31.7                37.5            104.4
  Independent                 6.0                 31.7                63.7            128.1
  Local Authority             5.4                 30.5                32.6             83.7
  National                    6.8                 37.2                67.1            217.5
All English Museums           5.8                 32.4                49.4            126.6
International museums         6.9                 35.2                67.9            171.5

Table 4: Average Designer and User Measure per museum home page

The Designer Measure (Violations), the mean number of different Checkpoints violated, was 5.8 per home page for the English museums and 6.9 for the International comparison museums (see Table 4). In other words, the average museum home page violates approximately 6 to 7 different WCAG1 Checkpoints. This means that there are 6 to 7 issues that the average web design team need to address to make the home page accessible – not such a daunting task. The difference between the English and International museums was significant (F(1, 123) = 5.3, p < 0.05), showing that the English museums fared significantly better on this measure.

The differences between the four English museum domains approached significance (F(3, 96) = 2.5, p = 0.065). The Academic museum home pages fared best, with an average of only 5.3 different Checkpoints violated, and the National museums fared worst, with an average of 6.9 different Checkpoints violated. Note that this ordering of the different domains is different from that revealed by the percentage of museums passing Priority Level 1. The higher number of Checkpoints violated by national museums may reflect the use of more ambitious web designs by these museums (a possibility being investigated in current research at City University).

The User Measure (Violations), the mean number of instances of all Checkpoint violations, was 49.4 per home page for English museums and 67.9 for the International comparison museums. Note how the typically small number of different Checkpoints violated produces a large number of instances of violations, and hence a potentially poor user experience. The difference between the English and International museums was not significant, in spite of the large difference in the mean values (F(1, 123) = 1.84), undoubtedly because of the large variance within the English museum domains. Clearly the division here is between the English Academic and Local Authority funded museums, which fare much better, and the English Independent and National museums and the International museums, which fare much less well. Again, this may well reflect more ambitious web designs in the latter group, and is currently being investigated further.

Taking the same analysis for warnings rather than actual violations, the Designer Measure (Warnings) produced a mean of 32.4 for English museums and 35.2 for the International comparison museums. Thus there are, on average, over 30 Checkpoints per home page that the web design team would need to check manually – which seems a far more daunting task. However, the Disability Rights Commission (2004) study found quite a low rate of "conversion" of warnings into actual violations. This is clearly an area where the automated tools could improve the information they provide to web developers, and indeed the groups who provide such tools are working to improve the support they offer. The difference between the English and the International museums was significant (F(1, 123) = 4.05, p < 0.05), with English museums again faring better on this measure. The differences between the different English domains were also significant (F(3, 96) = 5.05, p < 0.005), with the National museums clearly faring worse than the other domains.

The User Measure (Warnings) produced a mean of 126.6 instances of warnings for English museums and 171.5 instances of warnings for the International comparison museums. One now begins to understand why disabled people find browsing web pages so difficult. The average English museum home page presents a disabled user with a mean of 176.0 potential stumbling blocks (49.4 instances of actual violations and 126.6 instances of warnings of possible problems); the corresponding figure for the International comparison group is 239.4 potential stumbling blocks (67.9 instances of actual violations and 171.5 instances of warnings of possible problems). The difference between the English and International museums was not significant for this measure. However, the differences between the different English domains were highly significant (F(3, 96) = 7.10, p < 0.0005), with Local Authority funded museums faring best and National museums faring worst.

User evaluations

Technical accessibility, the conformance to the WCAG1 Checkpoints, is one vital aspect of the accessibility of a Web site.  However, equally important is whether users with disabilities can use the site.  As outlined above, for a subset of 12 of the 125 Web sites studied, this was investigated by a user evaluation study.

The User Panel members succeeded in 76.2% of the attempted tasks for the English museums and 73.3% of attempted tasks for the International comparison museums (see Table 5). Blind User Panel members had the most difficulty in using the Web sites, succeeding in only 62.5% of their tasks, compared with success rates of 82.5% for partially sighted User Panel members and 83.3% for dyslexic User Panel members. These success rates compare favourably with the Disability Rights Commission study, which found an overall task success rate of 76%, but only 53% for blind Web site users. (Inferential statistics comparing the different museum and user categories will not be presented in this section, as a factorial design was not employed – not all users assessed all the museums, making such an analysis very difficult with the numbers involved.)

User Group            English museums    International museums    All museums
                      (N = 7)            (N = 5)
Blind                      64.3%               60.0%                  62.5%
Dyslexic                   85.7%               80.0%                  83.3%
Partially sighted          78.6%               88.0%                  82.5%
All users                  76.2%               73.3%                  75.0%

Table 5: Percentage success at basic Web site tasks for each user group

The members of the Panel were asked to rate the ease of performing each task (see Table 6). Mean ratings were all close to the mid-point of the 7-point rating scale, indicating that User Panel members found task performance neither especially easy nor especially difficult. Interestingly, partially sighted User Panel members gave the lowest mean ease of task performance ratings, although the difference between their ratings and those of the blind User Panel members was small, and for the international museums there was no difference between the two groups. The ratings of dyslexic User Panel members were substantially higher, indicating a perception of greater ease of task performance.

User Group            English museums    International museums    All museums
Blind                       4.3                  4.4                    4.3
Dyslexic                    5.3                  4.8                    5.1
Partially sighted           3.8                  4.4                    4.1
All users                   4.5                  4.5                    4.5

NB. 1 = very difficult, 7 = very easy

Table 6: Mean ease of task performance ratings for each user group

The members of the Panel were asked to rate the ease of navigation when attempting a task (see Table 7). Again, mean ratings were close to the mid-point of the scale, and partially sighted User Panel members gave the lowest ratings, indicating the most difficulty in navigation, although for the international museum group their ratings were higher than those of the blind or dyslexic User Panel members.

User Group            English museums    International museums    All museums
Blind                       4.4                  4.6                    4.5
Dyslexic                    5.1                  4.4                    4.8
Partially sighted           3.9                  5.2                    4.4
All users                   4.5                  4.7                    4.6

NB. 1 = very difficult, 7 = very easy

Table 7: Mean ease of navigation ratings for each user group

The members of the User Panel were asked whether they felt 'lost' at any point when exploring the Web sites (see Table 8). Over a third of all User Panel members felt 'lost' on at least one occasion, with dyslexic Panel members reporting this more frequently than the other two user groups.

User Group            English museums    International museums    All museums
Blind                      28.6%               40.0%                  33.4%
Dyslexic                   42.9%               40.0%                  41.7%
Partially sighted          42.9%               20.0%                  33.4%
All users                  38.1%               33.3%                  36.1%

Table 8: Percentage of each user group feeling 'lost' on at least one occasion

The problems reported by the Panel members and those observed by the researchers assisting in the evaluations were collated and categorised. Overall, 112 instances of problems were identified during the user evaluations. Below, we outline the most common problems users encountered, the number of instances in which each was reported, and whether the problem is covered by the WAI guidelines.


Problem                   Description                                                     International   English   Total
                                                                                          museums         museums
Links - targets           Target of links not clearly identified                                 8           14       22
Use of color              Inappropriate use of colors and poor contrast between                 11            5       16
                          content and background
Text presentation         Information presented in dense blocks with no clear                    6           11       17
                          headings to identify informational content
Navigation                Navigation mechanisms used in an inconsistent manner                   4            5        9
Image description         ALT tags on images non-existent or unhelpful                           5            1        6
Ordering of information   Important information not located at top of list, page etc.            3            3        6
Content complexity        Meaning of text makes it hard to read and understand                   1            4        5
Links - grouping          Links not logically grouped, no facility to skip navigation            0            4        4
Graphical text            Images and graphical text used in place of plain text                  2            1        3

Table 9: Most frequent accessibility problems from user testing of Web sites

The 9 problems listed in Table 9 constituted 79% of all the problems uncovered during the user testing evaluations. Over half of these problems relate to orientation and navigation issues.

It is worth considering the 9 most frequent problems in more detail to highlight the very real difficulty and frustration experienced by disabled User Panel members – difficulties which prevented them from utilising the full potential of museum Web sites. The problems below are grouped under three headings: alternative descriptions for images and other media; issues related to the presentation of content; and navigation and orientation problems.

Alternative descriptions for images and other media

All controls, links and other elements should have clear and informative labels or descriptions associated with them that are both meaningful and add to a user's understanding. This was not often the case, especially on the International museum Web sites where four of the five sites evaluated carried image description violations, two of them so critical that they rendered the site impossible for a blind person to successfully use. One of these museums even stated that: "Our staff is committed to making the Museum accessible to all" but as one blind User Panel member commented: "...they certainly aren't with regards to their Web site ... none of the links have Alt tags so I can't even get past the homepage!"

This problem applies not only to links, images and pictures, but also to graphical text. For example, one site used graphical text for its 'Accessible Site' link but failed to provide any form of ALT tag for it; as a result, blind users were unaware that this option even existed.

Issues related to presentation of content

Partially sighted and dyslexic User Panel members often found that the use of presentation features made text harder to read and understand. For example, bold text used within the main content in a manner that does not highlight specific information, or convey any extra information, becomes a distraction. The presentation of the content on a web page needs to be carefully chosen so as to aid the user experience, with special presentation features limited to headings or key words.

Many of the complaints from the dyslexic and partially sighted members of the User Panel related to the color scheme and color contrast used for page designs. While some of these complaints were of a purely subjective nature, the color scheme often affected these Panel members' ability to perform tasks, particularly when the contrast between the text and the background was inadequate. Pale text on pale backgrounds was a common problem. Moreover, different people benefit from different colour schemes. For example, while many people are happy with black text on a white background, some partially sighted people find this contrast too 'harsh' and benefit from a very strong inverted contrast such as yellow text on a black background. On the other hand, dyslexic people often prefer a reduced contrast, with a pastel background. The ability to easily change the color of text and background allows people to view a page and read its text with presentation combinations that suit their particular needs.
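
To give a sense of how text/background contrast can be quantified rather than judged by eye, the sketch below computes the relative-luminance contrast ratio that was later formalised in WCAG 2.0 (WCAG1 Checkpoint 2.2 itself gives no numeric formula, so this is offered only as an illustration, not as the criterion used in this audit).

    def _linearise(c: int) -> float:
        """Convert one sRGB channel (0-255) to its linear value."""
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    def relative_luminance(rgb) -> float:
        r, g, b = (_linearise(v) for v in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(fg, bg) -> float:
        """Contrast ratio between two colours, from 1:1 (identical) to 21:1."""
        lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
        return (lighter + 0.05) / (darker + 0.05)

    # Pale grey text on a white background: a common complaint from the User Panel.
    print(round(contrast_ratio((170, 170, 170), (255, 255, 255)), 1))   # about 2.3 - very low
    # Black text on white: maximum contrast, though some users prefer softer schemes.
    print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))         # 21.0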

Navigation and orientation problems

In terms of navigating sites, two key problems emerged. Firstly, ambiguously named links that led to unexpected content were responsible for many of the navigation problems User Panel members encountered. For example, opening times were often found under the link Contact Us. As one dyslexic User Panel member commented, "...important information like opening times and disabled access should not be hidden under other obscure titles... why can't they just put a link saying Opening Times". Secondly, many sites offered inconsistent navigational mechanisms. Links were not located in navigation bars but dispersed across the page, and icons and images were inconsistently used as active links.

Poor page design (in terms of layout) led to recurrent orientation problems for all the user groups involved in the evaluations.  Both the experts at City University and the members of the User Panel considered many sites to have overly complex and "cluttered" pages with dense blocks of text.  No clear indication of main headings, secondary headings and so on was a recurring problem throughout the sites evaluated.  While sighted people would be able to infer some of this logic from text sizes, colour coding, etc, blind User Panel members did not have access to this visual information and so pages were experienced as "illogical", apparently lacking a logical structure.

Good accessible design features

In addition to the specific problems they encountered, the Panel members were also asked to report what they particularly liked about the sites they evaluated. Perhaps unsurprisingly, many of the positive aspects were the opposite of the problems outlined above.  For example, blind Panel members appreciated headings and links marked-up with "sensible names".  The other user groups appeared to share these sentiments, with Panel members also liking sites that had clear navigation mechanisms, logical page layouts, clear contrast and straight-forward language.

Discussion and Conclusions

This research has shown that the level of accessibility of English and international museum Web sites is not high. On technical accessibility, only 34% of English museum sites and 20% of international museum sites passed the basic accessibility Level A, even on the subset of automated tests. However, when one investigates these findings further, the average museum home page has approximately six different known accessibility problems (Designer Measure (Violations) – mean of 5.8 for English museums and 6.9 for international museums), which means there are not a large number of distinct problems to be solved. The calculation of the Designer and User Measures shows how this relatively small number of different problems can lead to a poor user experience, with an average of approximately 50 to 70 instances of known accessibility problems per home page (49.4 for English museums and 67.9 for international museums).

These findings on technical accessibility are reinforced by the user testing data, which found that disabled users could successfully complete only 75% of very basic tasks on the Web sites, and that 36% of disabled users felt lost on at least one occasion. Blind people in particular found the sites difficult to use, with only a 62.5% success rate. However, the user testing data also showed that the list of problems that developers of museum Web sites need to address is neither particularly long nor particularly difficult to tackle: nearly 80% of all the problems encountered by the disabled users were accounted for by just nine accessibility issues (see Table 9).

Web sites are now an integral part of the public-facing presence of most museums. Disabled people find information provided on the Web extremely useful if it is presented in ways that are appropriate for them, or that allow them to transform the information into appropriate formats, such as synthetic speech for visually impaired people. Currently museum Web sites are not meeting that challenge very well. However, this research has shown that the issues that need to be addressed to change this situation and provide exemplary digital access to museums are not difficult to meet.

References

Disability Rights Commission. (2002).  Code of Practice: Rights of access, goods, facilities, services and premises.  London: The Stationery Office.

Disability Rights Commission. (2004).  The Web: Access and inclusion for disabled people.  London: The Stationery Office.

Petrie, H., and Hamilton, F. (2004). The Disability Rights Commission Formal Investigation into Web Site Accessibility. In Dearden, A. and Watts, L. (Eds.), Proceedings HCI 2004: Design For Life.  London: British Computer Society.

Cite as:

Petrie, H., King, N. and M. Weisen, The accessibility of museum Web sites: results from an English investigation and international comparisons, in J. Trant and D. Bearman (eds.). Museums and the Web 2005: Proceedings, Toronto: Archives & Museum Informatics, published March 31, 2005 at http://www.archimuse.com/mw2005/papers/petrie/petrie.html