The Australian Government's study into the Accessibility of the Portable Document Format for people with a disability

Phase three: user evaluations – the lived experience

The phase two technical evaluations identified seven assistive technologies used by people who are blind or have low vision that provide ‘sufficient’ or ‘partially sufficient’ technical capability for the Portable Document Format. However, the user perspectives offered in phase one identified that users still experienced difficulties in accessing PDF files through technologies deemed to provide ‘sufficient’ or ‘partially sufficient’ capability.

User evaluations play a critical role in identifying technical and practical accessibility issues from the end user's perspective. While conformance to standards is important, it is widely accepted that user evaluations provide the most reliable information as to whether a product or service achieves its purpose.

In terms of web accessibility, the W3C notes that “web accessibility evaluation often focuses on conformance to accessibility standards such as WCAG. While conformance is important, there are many benefits to evaluating with real people to learn how your website or web tool really works for users and to better understand accessibility issues. Evaluating with users with disabilities and with older users identifies usability issues that are not discovered by conformance evaluation alone” [13].

The user experience evaluations carried out in this Study were conducted against the ISO's approach to assessing usability and accessibility defined earlier. This was interpreted to mean that, for the PDF format to be usable and accessible, participants should be able to use the PDF files with their chosen adaptive strategy to achieve their goals.

Twenty-three user experience evaluations were undertaken on eight PDF files by people using different adaptive strategies. Participants were asked to perform six tasks on two sets of PDF documents and were assessed by a Vision Australia evaluator. Participants were interviewed before and after their evaluations to ascertain their level of experience prior to undertaking the tasks and then to determine their overall satisfaction with the result. A full breakdown of the interview questions is included in the Supplementary Report.

In developing the user experience evaluations, findings from the focus groups conducted in phase one were used to highlight three core dimensions for consideration:

  1. The type of disability and the user’s adaptive strategies, including the type of assistive technology used to interact with PDF files;
  2. The skill and experience of the user; and
  3. The way the PDF file is created.

Participants

The accessibility of PDF files potentially affects a wide range of people with a disability. The main disability groups, when considering how people with a disability use the web, are:

  1. Blind (including Deafblind)
  2. Low vision
  3. Cognitive
  4. Mobility
  5. Hearing [14]

Within these disability groups, participants were recruited who use various assistive technologies that were classified as ‘sufficient’ or ‘partially sufficient’ in the Technical Evaluation. Experience and skill level had been highlighted during the Phase one user consultations as factors that affected a participant's ability to use PDF files. As such, users with different levels of skill and experience with their assistive technology, particularly screen reader users, were chosen.

Dragon NaturallySpeaking Professional, Read Out Loud, Read & Write Gold, the ZoomText speech component, and the MAGic speech component were omitted from the technical testing in Phase two because these assistive technologies do not interact with the structure of the PDF file. However, they were included in the user evaluations to explore any usability considerations that may exist when interacting with PDF files.

The Supplementary Report includes participant profiles by disability group and provides a summary of the adaptive strategies they use.

The PDF test documents

Phase one focus group participants highlighted that the layout and structure of PDF files contributed to their negative experiences. This is largely dependent upon how PDF files are created. To provide a comparative analysis, two sets of document collections were created for the evaluation, Collection A and Collection B.

Collection A comprised a set of PDF documents that had been optimised for accessibility using the Adobe-published characteristics of an accessible PDF document [15]. The documents were created in line with these characteristics.

Collection B included a sample of documents recently published by government departments in Australia.
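To illustrate what “optimised for accessibility” means in machine-readable terms, the sketch below probes a PDF for the basic markers of a tagged document: the marked-content flag, a structure tree and a document language. This is a minimal illustration only and was not part of the Study's methodology; it assumes the third-party pikepdf library, and the file name is hypothetical.

```python
# Minimal sketch (not part of the Study): probe a PDF for the basic markers
# of a tagged, accessibility-optimised document, using the third-party
# pikepdf library. The file name below is hypothetical.
import pikepdf

def tagged_pdf_markers(path: str) -> dict:
    with pikepdf.open(path) as pdf:
        root = pdf.Root  # the document catalog
        mark_info = root.get("/MarkInfo", None)
        return {
            # Tagged PDFs carry /MarkInfo << /Marked true >> in the catalog.
            "marked_as_tagged": bool(mark_info is not None
                                     and mark_info.get("/Marked", False)),
            # The structure tree holds the tags and logical reading order.
            "has_structure_tree": "/StructTreeRoot" in root,
            # A document language lets screen readers choose the right voice.
            "language": str(root.get("/Lang", "")) or None,
        }

print(tagged_pdf_markers("collection_a_sample.pdf"))
```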

Document categorisation

To capture the diversity of documents mentioned in the Phase one focus groups, four types of documents were included in the user evaluations: brochures, forms, short documents and long documents.

Two common PDF document types were excluded from the Study.

Document selection process

To ensure objectivity in the selection of the PDF documents used in this Study, a four-stage selection process was used. It included:

  1. An invitation to Adobe to provide documents in PDF that met the characteristics of an accessible PDF document for Collection A; and an invitation to government departments, via AGIMO, to nominate typical government documents in PDF for Collection B;
  2. The documents were assessed for accessibility characteristics by Vision Australia, using the Adobe Pro tool with both a screen reader and screen magnification software. They were categorised into one of the four document types;
  3. Many of the documents provided did not meet the specified characteristics required for Collection A, so Vision Australia sourced documents from the public domain and asked Adobe to adapt (retrofit) them to meet the characteristics; and
  4. Vision Australia completed an independent verification of the documents for Collection A against the accessibility characteristics (defined by Adobe).

Adobe was unable to provide examples for each of the document types that met all the accessibility characteristics, but it did provide examples of the brochure and form document types. The short and long document examples were sourced from the public domain, and the accessibility characteristics were then retrofitted to these documents by Adobe's accessibility experts; however, the process and time taken for retrofitting was not recorded.

Similarly, the files provided by government departments did not meet the accessibility characteristics. In most cases, the departments believed the documents had been created with accessibility in mind. This aligns with observations from the public consultation, which highlighted that misinformation exists about the accessibility of PDF files; it also points to a lack of information about how to create accessible PDF files.

The Supplementary Report provides summary examples of the documents used in the evaluations.

User evaluation tasks

Six tasks were created, based on the three core activities people perform when interacting with documents and forms: read, navigate and interact. The specific tasks chosen for the user experience evaluations were guided by a number of factors.

The six tasks were:

Read

  1. Read a short document (general navigation and interacting with the text);
  2. Navigate information in a table structure;
  3. Access and understand information portrayed through an image/alt text;

Navigate

  1. Identify and move through a document using document structure;
  2. Navigate through a large document using page numbers;

Interact

  1. Interact and complete a form.

Activities such as opening a PDF file, saving a PDF file and editing a PDF file were excluded from the set of tasks.

User evaluation results (measure of effectiveness)

Overall, the participants succeeded in 77% (188/245) of the tasks they attempted across both document collections. Table 5: Task success rates for each user group highlights the overall findings for each of the core disability groups. In each case the percentage of tasks completed successfully is shown, together with the number of successful tasks out of the total number of tasks attempted. In many cases, not all of the tasks for Collection B were completed by participants due to time constraints.

The PDF files that were optimised for accessibility (Collection A) provided an enhanced user experience. Overall, participants achieved a 90% success rate for tasks completed on these documents, compared with a 60% success rate for tasks completed on general PDF files (Collection B).
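As a check on these headline figures, the per-collection totals from Table 5 below can be aggregated directly. The short sketch that follows is purely illustrative; it reproduces the 90%, 60% and 77% rates from the (successful tasks, tasks attempted) totals.

```python
# Illustrative check of the headline success rates, using the per-collection
# totals (successful tasks, tasks attempted) from Table 5.
totals = {
    "A": (123, 137),  # documents optimised for accessibility
    "B": (65, 108),   # typical government PDF documents
}

for collection, (passed, attempted) in totals.items():
    print(f"Collection {collection}: {passed}/{attempted} = {passed / attempted:.0%}")

# Combined rate across both collections: 188/245 = 77%
passed = sum(p for p, _ in totals.values())
attempted = sum(a for _, a in totals.values())
print(f"Overall: {passed}/{attempted} = {passed / attempted:.0%}")
```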

The user experience testing confirmed that people who are blind and use screen readers experience the greatest difficulties when accessing and interacting with PDF files. Participants who are blind were the only disability group to fail tasks on the documents that were optimised for accessibility.

Table 5: Task success rates for each user group

Collection  Blind          Low Vision     Mobility       Cognitive      Hearing        Overall

Read a short document (general navigation)
A           100% (11/11)   100% (5/5)     100% (3/3)     100% (3/3)     100% (1/1)     100% (23/23)
B           0% (0/9)       75% (3/4)      100% (2/2)     67% (2/3)      100% (1/1)     42% (8/19)

Navigate information in a table structure
A           64% (7/11)     100% (5/5)     100% (3/3)     100% (3/3)     100% (1/1)     83% (19/23)
B           0% (0/7)       100% (4/4)     100% (2/2)     100% (2/2)     100% (1/1)     56% (9/16)

Access and understand information portrayed through an image/alt text
A           100% (11/11)   100% (5/5)     100% (3/3)     100% (3/3)     100% (1/1)     100% (23/23)
B           0% (0/9)       100% (3/3)     100% (2/2)     100% (3/3)     100% (1/1)     50% (9/18)

Identify and move through a document using structure such as headings
A           100% (11/11)   100% (5/5)     100% (2/2)     100% (3/3)     100% (1/1)     100% (22/22)
B           90% (9/10)     100% (2/2)     100% (2/2)     100% (2/2)     100% (1/1)     94% (16/17)

Navigate through a large document using page numbers
A           9% (1/11)      100% (5/5)     100% (3/3)     100% (3/3)     100% (1/1)     57% (13/23)
B           63% (5/8)      100% (4/4)     100% (3/3)     67% (2/3)      100% (1/1)     79% (15/19)

Interact and complete a form
A           100% (11/11)   100% (5/5)     100% (3/3)     100% (3/3)     100% (1/1)     100% (23/23)
B           0% (0/10)      100% (4/4)     100% (2/2)     50% (1/2)      100% (1/1)     42% (8/19)

TOTAL
A           79% (52/66)    100% (30/30)   100% (17/17)   100% (18/18)   100% (6/6)     90% (123/137)
B           26% (14/53)    95% (20/21)    100% (13/13)   80% (12/15)    100% (6/6)     60% (65/108)

Acceptance of time (measure of efficiency)

The actual time taken for users to complete a task was not used as a measure because of the significant variation in task completion time resulting from the different adaptive strategies of each user group. For example, people who are blind and use a screen reader typically take longer than a person who does not rely solely on audio feedback.

A more appropriate measure of efficiency is to ask the users to rate whether the time taken to complete a task was acceptable or not. Though this measure is subjective, it enables the user to judge the acceptability of the time taken against similar experiences based on their interaction method.

Overall, users gave a 91% acceptance rating to the time taken to complete the tasks on the accessibility-optimised documents in Collection A. For those documents representative of government PDFs, Collection B, users rated their acceptance as 84% for tasks completed.

Satisfaction ratings (measure of satisfaction)

For each of the document collections, participants were also asked to rate their overall satisfaction with the PDF format. Minimal statistical differences were evident between the blind (screen reader) group and the other participants.

Participants were generally satisfied with their use of the Collection A documents and indicated they would be very comfortable in using PDF documents like these again.

After using documents in Collection B, most participants noted they would not be as comfortable using those documents again. However, people who are blind and use screen readers expressed that they were not at all comfortable with these documents, which correlates with their very low level of success achieved with the test tasks.

Problems experienced by users

Where participants encountered an issue during the user evaluation sessions, this was recorded. Full details of all the test issues are available in the Supplementary Report.

For context, the issues encountered by the participants during the user experience testing have been categorised into four groups. Importantly, none of these factors directly relate to the Portable Document Format. These are summarised in Table 6: Factors affecting PDF files and their impact.

Table 6: Factors affecting PDF files and their impact

Document Design: 76 issues (51%)
The design of the document (e.g. missing tags or elements, problems with reading order, or the way the information was presented) created a barrier for the user. This required the use of an alternative approach or prevented them from interacting with the document.

AT Support: 41 issues (28%)
The user's AT did not provide sufficient functionality to enable the user to interact with the PDF file using their chosen approach. In some cases this confirmed findings from the technical testing; in others it highlighted new areas of technical incompatibility.

User Skill: 24 issues (16%)
A lack of knowledge by the user about their AT, Adobe Reader, or PDF files led to confusion and difficulty in completing the task.

Adobe Reader: 8 issues (5%)
Features provided by Adobe Reader did not support the user in interacting with the document using their adaptive strategy.

Notes: The number of issues comprises 'unique' issues encountered by each assistive technology on each task. It does not take into account the frequency with which an issue occurred for a specific task, or how many users of a specific assistive technology were affected by the issue on a specific task, as the number of users for each assistive technology varied.
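The percentage shares in Table 6 follow directly from the raw counts; the short illustrative calculation below reproduces them from the 149 unique issues recorded.

```python
# Reproducing the percentage shares in Table 6 from the raw issue counts.
issue_counts = {
    "Document Design": 76,
    "AT Support": 41,
    "User Skill": 24,
    "Adobe Reader": 8,
}
total = sum(issue_counts.values())  # 149 unique issues in all

for factor, count in issue_counts.items():
    print(f"{factor}: {count} ({count / total:.0%})")
```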


Document design

Document design was the most frequent cause of the issues encountered in the user evaluations. While these issues primarily related to the PDF files in Collection B, a small number of document design issues were also uncovered in the documents optimised for accessibility in Collection A.

The prevalence of document design issues, and the impact these had on users successfully completing tasks, highlights the importance of correct approaches to the creation of accessible PDF files. This confirms the observations made by participants during the user consultation sessions. It is concerning that users still experienced significant barriers even though many of the documents from both collections were created with some measure of accessibility in mind.

Page number task

The task relating to navigating through a large document using page numbers in Collection A had a very low success rate for the eleven screen reader users (9%). As this task was so problematic, and the experience was consistent across all screen reading software, it is discussed here. Table 7: Success rates for navigating by page number highlights the severity of the issue.

Table 7: Success rates for navigating by page number

  JAWS NVDA SATOGO Window-Eyes
Success 1 0 0 0
Fail 6 1 1 2

Only one of the eleven screen reader users successfully completed this task – the Deafblind participant who used his remaining sight alongside JAWS.

The high failure rate was due to the document design. The underlying cause of the problem was that the document uses a spread layout: two print pages side by side on each A4 page. This layout is intended to facilitate printing, but it presents problems when the document is viewed online.

The task required the user to move to page 16 of the document. However, because of the two-page spread layout, Adobe Reader recognised the document as having only 14 screen pages. In essence, two print pages are combined on each screen page.
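The arithmetic of the mismatch can be made explicit. Assuming the straightforward two-up mapping (print pages 1 and 2 on screen page 1, and so on, which the Study does not spell out), print page 16 sits on screen page 8 of the 14 screen pages Adobe Reader reports; the sketch below is illustrative only.

```python
# Illustrative only: map a print page number to its screen page under a
# two-up spread layout (assumes print pages 1-2 share screen page 1, etc.).
import math

def screen_page_for(print_page: int, pages_per_screen: int = 2) -> int:
    return math.ceil(print_page / pages_per_screen)

print(screen_page_for(16))  # -> 8, within the 14 screen pages Reader reports
```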

Page numbers were specifically identified by participants in the phase one user consultations as a significant problem for using PDF files with screen readers; this result confirms that finding. The spread layout (used in the Collection A task) adds extra complexity, as screen reader users are unable to ascertain the correct page number using the Adobe Reader Page Navigation toolbar. When documents present only one print page per screen page, this issue is largely alleviated (see the Collection B results for the same task).

Assistive technologies

Several assistive technology support issues were exposed during the user evaluations. While some of these supported findings from the Phase two technical evaluations, new issues were found with some of the screen readers.

The first two issues relate to technical problems that were neither identified by vendors nor addressed in the technical testing.

The final issue, paragraph navigation with JAWS, does not correspond to a specific WCAG 2.0 requirement and therefore does not constitute a failure against the technical evaluation criteria. Instead, it is a user experience issue. In other formats, users are able to navigate by paragraph; with JAWS, however, this feature is not available in PDF files. While the participant was able to complete the required task using another method, they reported that the overall experience was not satisfactory because it did not compare to their experience with other common document formats.

Since the PAC Mate vendor claimed some support for PDF files, the intention was to include PAC Mate in the phase three user evaluations. However, PAC Mate (using the recommended Orneta PDF viewer) was unable to open any of the test PDF files, so the evaluation could not be conducted. Based on this experience, the overall level of technical capability provided by PAC Mate is deemed not sufficient in this Study.

Overall combined accessibility testing results

By combining the results of each phase of this Study, a comprehensive picture of the status of PDF accessibility emerges. Table 8: Combined summary results provides a comparison.

As noted earlier, the Adobe Test Suite has limitations in satisfying all the relevant Success Criteria of WCAG 2.0. This finding, combined with the lack of Sufficient Techniques for WCAG 2.0 available for PDF files (as at August 2010), leads to the conclusion that there is insufficient evidence to prove that PDF files can conform to WCAG 2.0. As WCAG 2.0 is the internationally accepted benchmark for testing the accessibility of web content, and is the endorsed web standard for Australian Government websites, the Australian Government is unable to classify PDF as an ‘Accessibility Supported’ technology at this time.

While JAWS and ZoomText satisfied all of the test cases provided in the Adobe Test Suite when accessing PDF files, the lack (at present) of Sufficient Techniques to support conformance to WCAG 2.0 means this is not sufficient evidence of ‘Accessibility Support’. Further, JAWS users were unable to complete all of the user evaluation tasks in either document collection (A or B). While the test results do indicate that JAWS provides better access to PDF files than other assistive technologies (SATOGO or NVDA, for example), the assertion that the use of PDF files with JAWS is accessible is incorrect.

The Study did demonstrate that some assistive technologies and some people with a disability can use and benefit from PDF documents optimised for accessibility. Agencies are encouraged to ensure that, where PDF files are used, these meet the Adobe accessibility characteristics – in addition to providing alternatives.

Table 8: Combined summary results

Adaptive Strategy    AT Device                               Version       Adobe Test Suite    User evaluation tasks (A / B collections)
Braille Notetaker    BrailleNote                             N/A           Not tested          Not evaluated
                     PAC Mate                                6.5           Not tested          Technical failure – Could not test
Screen Reader        JAWS                                    8-11          43/43               A 35/42, B 8/29
                     NVDA                                    2009.1        v 2009.1            A 4/6, B 2/6
                     SATOGO                                  3.0           v 3.0               A 4/6, B 1/6
                     Window-Eyes                             10.5 & 10.6                       Not evaluated
                     VoiceOver                               7                                 A 9/12, B 3/12
Screen Magnifier     MAGic                                   9.5-11        v 9.5               A 6/6, B 3/4
                     ZoomText                                8 & 9         21/21               A 24/24, B 17/17
Other ATs tested     Adobe Read Out Loud                                   Not tested          A 6/6, B 2/5
in the user          Read & Write Gold                                     Not tested          A 6/6, B 4/4
evaluation phase     Dragon NaturallySpeaking Professional                 Not tested          A 6/6, B 4/4
                     Keyboard only                                         Not tested          A 11/11, B 9/9

Footnotes:

  13. World Wide Web Consortium (W3C), 2010, Involving Users in Evaluating Web Accessibility, viewed October 2009, http://www.w3.org/WAI/eval/users
  14. World Wide Web Consortium, 2005, How People with Disabilities Use the Web, Working Group Internal Draft, 5 May 2005, viewed 20 June 2010, http://www.w3.org/WAI/EO/Drafts/PWD-Use-Web/
  15. Adobe Systems Incorporated, 2008, Adobe® Acrobat® 9 Pro Accessibility Guide: PDF Accessibility Overview, United States, viewed October 2009, http://www.adobe.com/accessibility/products/acrobat/pdf/A9-pdf-accesibility-overview.pdf
  16. Round Table on Information Access for People with Print Disabilities Inc, 2009, Guidelines for Accessible E-text, section 2: Accessible, report prepared by E-text Working Group, http://www.e-bility.com/roundtable/guidelines.php

 

Contact for information on this page: wcag2@finance.gov.au



Last Modified: 25 November, 2010