
INTERNATIONAL STANDARD

ISO/IEC 25066
First edition
2016-06-15

Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — Common Industry Format (CIF) for Usability — Evaluation Report

Ingénierie des systèmes et du logiciel — Exigences de qualité et évaluation des systèmes et du logiciel (SQuaRE) — Format de l'industrie commune pour l'utilisation — Rapport d'évaluation

Reference number: ISO/IEC 25066:2016(E)

© ISO/IEC 2016

COPYRIGHT PROTECTED DOCUMENT

© ISO/IEC 2016, Published in Switzerland

All rights reserved. Unless otherwise specified, no part of this publication may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below or ISO's member body in the country of the requester.

ISO copyright office
Ch. de Blandonnet 8 • CP 401
CH-1214 Vernier, Geneva, Switzerland
Tel. +41 22 749 01 11
Fax +41 22 749 09 47
copyright@iso.org
www.iso.org


Contents

Foreword
Introduction
1 Scope
2 Conformance
3 Terms and definitions
4 Purpose and types of usability evaluations
4.1 Purpose of an evaluation
4.2 Types of usability evaluations
4.3 Assessing conformity of the object of evaluation against specified criteria
5 Content elements of usability evaluation reports
5.1 Selecting content elements
5.2 Description of the content elements for each type of evaluation
5.2.1 Executive summary (if used)
5.2.2 Description of the object of evaluation
5.2.3 Purpose of the evaluation
5.2.4 Method
5.2.5 Procedure
5.2.6 Results
5.2.7 Interpretation of results and recommendations
5.2.8 Additional content for conformity assessment (as part of a usability evaluation report)
Annex A (normative) Overview on required and recommended content elements for each type of evaluation
Annex B (informative) Usability test report example
Bibliography


Foreword

ISO (the International Organization for Standardization) and IEC (the International Electrotechnical Commission) form the specialized system for worldwide standardization. National bodies that are members of ISO or IEC participate in the development of International Standards through technical committees established by the respective organization to deal with particular fields of technical activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the work. In the field of information technology, ISO and IEC have established a joint technical committee, ISO/IEC JTC 1.

The procedures used to develop this document and those intended for its further maintenance are described in the ISO/IEC Directives, Part 1. In particular the different approval criteria needed for the different types of document should be noted. This document was drafted in accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent rights. Details of any patent rights identified during the development of the document will be in the Introduction and/or on the ISO list of patent declarations received (see www.iso.org/patents).

Any trade name used in this document is information given for the convenience of users and does not constitute an endorsement.

For an explanation on the meaning of ISO specific terms and expressions related to conformity assessment, as well as information about ISO's adherence to the WTO principles in the Technical Barriers to Trade (TBT), see the following URL: Foreword — Supplementary information.

The committee responsible for this document is ISO/TC 159, Ergonomics, Subcommittee SC 4, Ergonomics of human-system interaction, and Joint Technical Committee ISO/IEC JTC 1, Information technology, Subcommittee SC 7, Software and system engineering.

Introduction

The human-centred design approach of ISO 9241-210 is well established and focuses specifically on making systems usable. Usability can be achieved by applying human-centred design throughout the life cycle. In order to enable a human-centred approach to be adopted, it is important that all the relevant types of information related to usability (information items) are identified and communicated. The identification and communication of relevant types of information related to usability enables the design and testing of the usability of a system.

This International Standard provides a framework and consistent terminology for reporting the evaluation of an interactive system. It is intended to assist usability specialists and developers in documenting and communicating usability-related information as part of the system development life cycle.

The Common Industry Format (CIF) for Usability family of International Standards is described in ISO/IEC TR 25060 and is part of the SQuaRE (Systems and software Quality Requirements and Evaluation) series of standards on systems and software product quality requirements and evaluation (ISO/IEC 25000 1), ISO/IEC 25001, ISO/IEC 25021 2), ISO/IEC 25023 3), ISO/IEC 25040, ISO/IEC 25041 and ISO/IEC 25051).

The CIF family of standards uses definitions that are consistent with the ISO 9241 series of standards (Ergonomics of human-system interaction), as this is the terminology that is normally used for this subject matter. In some cases, these definitions differ from those in ISO/IEC 25000.

CIF standards are published or planned for the following information items:

— Common Industry Format (CIF) for usability test reports (ISO/IEC 25062);

NOTE ISO/IEC 25062 provides more detail for the content of a user observation report for performance measurement.

— Context of use description (ISO/IEC 25063);
— User needs report (ISO/IEC 25064);
— User requirements specification (ISO/IEC 25065);
— Evaluation reports (ISO/IEC 25066);
— User interaction specification (planned);
— User interface specification (planned);
— Field data report (planned).

The CIF standards are part of the "Extension Division" of the ISO/IEC 25000 SQuaRE series of International Standards. Table 1 presents an overview of the structure and the contents of the SQuaRE series of International Standards.

1) Withdrawn.
2) Withdrawn.
3) Under development.


Table 1 — Organization of SQuaRE series of International Standards

SQuaRE Architecture and Sub-projects:
— ISO/IEC 2500n: Quality Management Division
— ISO/IEC 2501n: Quality Model Division
— ISO/IEC 2502n: Quality Measurement Division
— ISO/IEC 2503n: Quality Requirement Division
— ISO/IEC 2504n: Quality Evaluation Division
— ISO/IEC 25050–25099: SQuaRE Extension Division, comprising ISO/IEC 25051 (Requirements for quality of Ready to Use Software Product (RUSP) and instructions for testing) and the ISO/IEC 2506n Common Industry Format Division

Figure 1 — Relationship of CIF documents to human-centred design in ISO 9241-210 and system lifecycle processes in ISO/IEC 15288

Figure 1 illustrates the interdependence of these information items with the human-centred design activities described in ISO 9241-210, as well as the corresponding System Life Cycle processes described in ISO/IEC 15288 4).

The following discussion also serves as alternative text for the figure. The figure depicts the activities as a set o f intersecting circles. The circles overlap to represent that the activities are not separate, but rather overlapping in time and scope, and the outcome o f each activity provides the input to one or more other activities. As each human-centred design activity can provide input to any other, no starting point, end point, or linear process is intended.

4) Withdrawn. Replaced with ISO/IEC/IEEE 15288:2015.

The human-centred design is composed of four interacting activities represented as overlapping circles in the diagram, where User Needs are at the centre.

The first activity involves Context of Use. Human-centred design relies on user needs that are first identified during the Context of Use analysis. User needs are documented in the User needs report (ISO/IEC 25064), which is an intermediate deliverable that links the Context of Use Description (ISO/IEC 25063), which contains information about the users, their tasks and the organizational and physical environment, to the user requirements. These items are developed during the Stakeholders requirements definition process described in ISO/IEC 15288.

The second activity involves Derived Requirements. The User requirements specification (ISO/IEC 25065) provides the basis for design and evaluation of interactive systems to meet the user needs. User requirements are developed in conjunction with, and form part of, the overall requirements specification of an interactive system.

The third activity involves Designed Solutions. The "Produce design solutions" activity focuses on designing user interaction that meets user requirements. This activity takes place during the Architectural Design, Implementation, and Integration processes described in ISO/IEC 15288 and produces the information items "User interaction specification" and "User interface specification".

The fourth activity involves Evaluation Results. The "Evaluate" activity starts at the earliest stages in the project, evaluating design concepts to obtain a better understanding of the user needs. Design solutions can be evaluated multiple times as the interactive system is being developed and can produce various types of evaluation reports and usability data such as that described in ISO/IEC 25062. These evaluations can support the ISO/IEC 15288 Validation Process that confirms that the system complies with the stakeholders' requirements.


INTERNATIONAL STANDARD


Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — Common Industry Format (CIF) for Usability — Evaluation Report

1 Scope

This International Standard describes the Common Industry Format (CIF) for reporting usability evaluations. It provides a classification of evaluation approaches and the specifications for the content items (content elements) to be included in an evaluation report based on the selected evaluation approach(es).

The intended users of the usability evaluation reports are identified, as well as the situations in which the usability evaluation report can be applied.

The usability evaluation reports in this International Standard are applicable to software and hardware systems, products or services used for predefined tasks (excluding generic products, such as a display screen or a keyboard). The content elements are intended to be used as part of system-level documentation resulting from development processes such as those in ISO 9241-210 and ISO/IEC JTC 1/SC 7 process standards. The content elements for documenting evaluations can be integrated in any type of process model.

NOTE For the purpose of establishing process models, ISO/IEC TR 24774 and ISO/IEC 33020 specify the format and conformance requirements for process models, respectively. In addition, ISO/IEC 15289 defines the types and content of information items developed and used in process models for system and software life cycle management. ISO/IEC 15504-5 and ISO/IEC 15504-6 (to be replaced by ISO/IEC 33060) define work products, including information items, for the purpose of process capability assessment. Process models and associated information items for human-centred design of interactive systems are contained in ISO/TR 18529 and ISO/TS 18152.

2 Conformance

An evaluation report conforms to this International Standard if it contains all the required content elements in Clause 5 that are applicable to the type(s) of evaluation, including:

— additional optional content elements that were selected to be part of the evaluation;
— the content elements for the conformity assessment (if used).
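The conformance rule above can be checked mechanically. The following minimal Python sketch (an editorial illustration, not part of this International Standard) shows such a check; the element identifiers and the per-type requirement mapping are hypothetical placeholders, and the authoritative lists of required content elements are those in Clause 5 and Annex A.

# Illustrative sketch only: checks that a draft report contains all required
# content elements for its evaluation type(s). The element names below are
# hypothetical placeholders; the authoritative lists are in Clause 5/Annex A.

REQUIRED_ELEMENTS = {
    "inspection": {"object_of_evaluation", "purpose_of_evaluation", "method"},
    "user_observation": {"object_of_evaluation", "purpose_of_evaluation",
                         "method", "participants"},
    "user_survey": {"object_of_evaluation", "purpose_of_evaluation",
                    "method", "participants"},
}

def missing_elements(report_elements, evaluation_types):
    """Return the required elements absent from the report: per Clause 2,
    a report must contain every required element applicable to each
    evaluation type it covers."""
    required = set()
    for ev_type in evaluation_types:
        required |= REQUIRED_ELEMENTS.get(ev_type, set())
    return required - set(report_elements)

# Example: a draft report covering user observation plus a user survey.
print(missing_elements(
    {"object_of_evaluation", "purpose_of_evaluation"},
    ["user_observation", "user_survey"],
))  # -> {'method', 'participants'}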

3 Terms and definitions

For the purposes of this document, the following terms and definitions apply.

NOTE The CIF family of standards uses definitions that are consistent with the ISO 9241 series of standards, as this is the terminology that is normally used for this subject matter. In some cases, these definitions differ from those in ISO/IEC 25000.

3.1
accessibility
extent to which products, systems, services, environments and facilities can be used by people from a population with the widest range of characteristics and capabilities to achieve a specified goal in a specified context of use

Note 1 to entry: Context of use includes direct use or use supported by assistive technologies.

[SOURCE: ISO 26800:2011, 2.1; modified, Note 2 to entry deleted]


3.2
action
user behaviour that a system accepts as a request for a particular operation

[SOURCE: ISO/IEC TR 11580:2007, 2.3; modified, Example deleted]

3.3
conformity assessment
demonstration that specified requirements relating to a product, process, system, person or body are fulfilled

[SOURCE: ISO/IEC 17000:2004, 2.1; modified, Notes deleted]

3.4
context of use
users, tasks, equipment (hardware, software and materials), and the physical and social environments in which a product is used

[SOURCE: ISO 9241-11:1998, 3.5]

3.5
dialogue
interaction between a user and an interactive system as a sequence of user actions (inputs) and system responses (outputs) in order to achieve a goal

Note 1 to entry: User actions include not only entry of data but also navigational actions of the user.

Note 2 to entry: Dialogue refers to both the form (syntax) and the meaning (semantics) of interaction.

[SOURCE: ISO 9241-110:2006, 3.2]

3.6
effectiveness
accuracy and completeness with which users achieve specified goals

[SOURCE: ISO 9241-11:1998, 3.2]

3.7
efficiency
resources expended in relation to the accuracy and completeness with which users achieve goals

[SOURCE: ISO 9241-11:1998, 3.3]

3.8
goal
intended outcome

[SOURCE: ISO 9241-11:1998, 3.8]

3.9
information item
separately identifiable body of information that is produced and stored for human use during a system or software life cycle

[SOURCE: ISO/IEC/IEEE 15289:2011, 5.7]

3.10
inspection-based evaluation
evaluation based on the judgment of one or more evaluator(s) who examine or use a system to identify potential usability problems (including deviations from established criteria)

Note 1 to entry: The evaluators making the inspections typically are usability specialists but can also include end users and members of the design team.


Note 2 to entry: Established criteria typically include user requirements, usability guidelines in standards, design conventions contained in manufacturer guidelines and style guides, task models to be supported, as well as standardized principles.

Note 3 to entry: The evaluation can be conducted with or without the help of reference documents.

Note 4 to entry: Inspection-based evaluation is a generic term for methods that include but are not limited to heuristic evaluation, cognitive walkthroughs, standards inspection, pluralistic walkthroughs, and consistency inspections.

Note 5 to entry: Inspection-based evaluation can be conducted by machines in some cases, e.g. when consistency with required terminology is being evaluated. In this case, the machine represents the evaluator.

3.11
requirement
condition or capability that must be met or possessed by a system, system component, product, or service to satisfy an agreement, standard, specification, or other formally imposed documents

[SOURCE: ISO/IEC/IEEE 24765:2010, 3.2506]

3.12
satisfaction
freedom from discomfort, and positive attitudes towards the use of the product

[SOURCE: ISO 9241-11:1998, 3.4]

3.13
stakeholder
individual or organization having a right, share, claim, or interest in a system or in its possession of characteristics that meet their needs and expectations

[SOURCE: ISO/IEC/IEEE 15288:2015, 4.1.44]

3.14
system
combination of interacting elements organized to achieve one or more stated purposes

Note 1 to entry: A system may be considered as a product or as the services it provides.

Note 2 to entry: In practice, the interpretation of its meaning is frequently clarified by the use of an associative noun, e.g. aircraft system. Alternatively, the word system may be substituted simply by a context dependent synonym, e.g. aircraft, though this may then obscure a system principles perspective.

[SOURCE: ISO/IEC/IEEE 15288:2015, 4.1.46; modified, Note 3 to entry deleted]

3.15
task
activities required to achieve a goal

Note 1 to entry: The term "task" is used here, as in ISO 9241-11:— 5), in its widest sense, rather than in reference to the specifics of use of the dialogue system.

[SOURCE: ISO 9241-11:1998, 3.9; modified, Notes changed]

3.16
usability
extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use

Note 1 to entry: According to ISO/IEC 25010, "Usability can either be specified or measured as a product quality characteristic in terms of its sub-characteristics, or specified or measured directly by measures that are a subset of quality in use." The definition of usability in this International Standard is consistent with the second approach.

5) Under preparation.


[SOURCE: ISO 9241-210:2010, 2.13; modified, Notes changed]

3.17
usability defect
product attribute(s) that lead(s) to a mismatch between user intentions and/or user actions and the system attributes and behaviour

Note 1 to entry: Typical usability defects include the following:
— additional unnecessary steps not required as part of completing a task;
— misleading information;
— insufficient and/or poor information on the user interface;
— unexpected system responses;
— limitations in navigation;
— inefficient use error recovery mechanisms;
— physical characteristics of the user interface that are not suitable for the physical characteristics of the user.

Note 2 to entry: Deviations of product attributes of the object of evaluation from established criteria are also usability defects.

3.18
usability finding
identified usability defect and/or usability problem or positive usability-related attribute

3.19
usability problem
situation during use resulting in poor effectiveness, efficiency or satisfaction

3.20
use error
user action or lack of user action while using the interactive system that leads to a different result than that intended by the manufacturer or expected by the user

Note 1 to entry: Use error includes the inability of the user to complete a task.

Note 2 to entry: Use errors can result from a mismatch between the characteristics of the user, user interface, task, or use environment.

Note 3 to entry: Users might be aware or unaware that a use error has occurred.

Note 4 to entry: An unexpected physiological response of the patient is not by itself considered a use error.

Note 5 to entry: A malfunction of an interactive system that causes an unexpected result is not considered a use error.

[SOURCE: IEC 62366-1:2015, 3.21; modified, Medical device replaced by interactive system, Notes changed]

3.21
user
person who interacts with a system, product or service

Note 1 to entry: Users include people who operate a system, people who use the output provided by a system and people who conduct support tasks using the system (including maintenance and training).

Note 2 to entry: According to ISO/IEC 25010, User is defined as "individual or group that interacts with a system or benefits from a system during its utilization".


Note 3 to entry: Primary and secondary users interact with a system, and primary and indirect users can benefit from a system. This definition includes a broader understanding of individuals and organisations that act as users.

[SOURCE: ISO 26800:2011, 2.10; modified, Notes changed]

3.22
user-based evaluation
evaluation that involves representative users performing tasks with the system to enable identification of usability problems and/or measurements of efficiency, effectiveness, user satisfaction or other user experiences

3.23
user experience
a person's perceptions and responses that result from the use and/or anticipated use of a product, system or service

Note 1 to entry: User experience includes all the users' emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviours and accomplishments that occur before, during and after use.

Note 2 to entry: User experience is a consequence of: brand image, presentation, functionality, system performance, interactive behaviour, and assistive capabilities of the interactive system, the user's internal and physical state resulting from prior experiences, attitudes, skills and personality, and the context of use.

Note 3 to entry: Usability, when interpreted from the perspective of the users' personal goals, can include the kind of perceptual and emotional aspects typically associated with user experience. Usability criteria can be used to assess aspects of user experience.

[SOURCE: ISO 9241-210:2010, 2.15]

3.24
user need
prerequisite identified as necessary for a user, or a set of users, to achieve an intended outcome, implied or stated within a specific context of use

EXAMPLE 1 A presenter (user) needs to know how much time is left (prerequisite) in order to complete the presentation in time (intended outcome) during a presentation with a fixed time limit (context of use).

EXAMPLE 2 An account manager (user) needs to know the number of invoices received and their amounts (prerequisite), in order to complete the daily accounting log (intended outcome) as part of monitoring the cash flow (context of use).

Note 1 to entry: A user need is independent of any proposed solution for that need.

Note 2 to entry: User needs are identified based on various approaches including interviews with users, observations, surveys, evaluations, expert analysis, etc.

Note 3 to entry: User needs often represent gaps (or discrepancies) between what should be and what is.

Note 4 to entry: User needs are transformed into user requirements considering the context of use, user priorities, trade-offs with other system requirements and constraints.

[SOURCE: ISO/IEC 25064:2013, 4.19]

3.25
user requirements
usage requirements
requirements for use that provide the basis for design and evaluation of interactive systems to meet identified user needs

Note 1 to entry: User requirements are derived from user needs, characteristics and capabilities in order to make use of the system in an effective, efficient, safe and satisfying manner.

Note 2 to entry: User requirements specify the extent to which user needs, characteristics and capabilities are to be met when using the system. They are not requirements on the users.

Note 3 to entry: In software-engineering terms, user requirements comprise both "functional" and "non-functional" requirements based on user needs and capabilities.

[SOURCE: ISO/IEC TR 25060:2010, 2.21]

4 Purpose and types of usability evaluations

4.1 Purpose of an evaluation

The content of a usability evaluation report varies based on the purpose of the evaluation. An evaluation could be performed to test whether specified user requirements have been implemented or to test whether specified accessibility recommendations have been implemented. Or an evaluation could be performed as the basis for a procurement decision. This International Standard describes the contents of usability evaluation reports produced for a broad range of usability evaluation objectives.

The purpose of ISO/IEC 25062 is to facilitate incorporation of usability as part of the procurement decision-making process for software to assist in judging if a product meets usability goals. Examples of decisions include purchasing, upgrading and automating. ISO/IEC 25062 is an example of a user observation report for performance measurement in accordance with Annex A. ISO/IEC 25062 provides a common format for human factors engineers and usability professionals in supplier companies to report the methods and results of usability tests to customer organizations. Since the procurement environment is the intended audience, ISO/IEC 25062 is more prescriptive in the format and the required elements.

4.2 Types of usability evaluations

Usability evaluation is a systematic process using one of the following types of evaluation approaches. The content of an evaluation report depends on the type of evaluation approach used. The classification of evaluation approaches described below is used in Clause 2.

a) Inspection to identify usability defects and potential usability problems including:

— deviations of the object of evaluation from specified criteria such as user requirements, principles, design guidelines or established conventions;
— potential usability problems when attempting to complete one or more tasks with the object of evaluation.

b) Observation of users including:

— observing user behaviour to identify actual usability findings;
— measuring user performance and response (e.g. time taken to perform a task, number of use errors, skin conductance or eye pupil dilation).

NOTE 1 The observation of users can be carried out as an explicit usability test and/or conducted in a "real life" setting.

NOTE 2 The usability problems are either identified during the observation or are identified from subsequent analysis.

c) User surveys including:

— eliciting problems, opinions and impressions from users (qualitative user surveys);
— measuring level of user satisfaction or perception, e.g. rating scale values for satisfaction or for subjectively perceived effectiveness or efficiency (quantitative user surveys);
— other user reported data (e.g. data collected from an individual in conjunction with observation data).

NOTE 3 Collection of information about participants such as demographic data does not constitute a user survey.
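As an editorial illustration of the measurement-oriented evaluation types above: effectiveness, efficiency and satisfaction measures in the sense of ISO 9241-11 are commonly computed as in the Python sketch below. The formulas are common practice rather than requirements of this International Standard, and all names and data are hypothetical.

# Illustrative sketch only: common-practice usability measures in the sense
# of ISO 9241-11 (effectiveness, efficiency, satisfaction). These formulas
# are widely used conventions, not requirements of ISO/IEC 25066.

def effectiveness(completed, attempted):
    """Task completion rate: completeness with which users achieve goals."""
    return completed / attempted

def time_based_efficiency(completions, times_s):
    """Mean goals achieved per second (results vs. resources expended)."""
    rates = [c / t for c, t in zip(completions, times_s)]
    return sum(rates) / len(rates)

def mean_satisfaction(ratings):
    """Mean of quantitative survey ratings, e.g. on a 1 (min) to 5 (max) scale."""
    return sum(ratings) / len(ratings)

# Hypothetical data for three participants attempting one task each:
print(effectiveness(completed=2, attempted=3))               # completion rate
print(time_based_efficiency([1, 1, 0], [45.0, 60.0, 90.0]))  # goals per second
print(mean_satisfaction([4, 3, 5]))                          # 4.0 on a 1..5 scale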

A usability evaluation report contains information about one or more types of the evaluations listed above.

EXAMPLE 1 A usability test report describes problems encountered by users when carrying out tasks (type of information is "Observing user behaviour"). A quantitative usability test report based on ISO/IEC 25062 contains measures of effectiveness, efficiency and satisfaction (types of information are "Measuring user performance" and "User survey").

When reporting findings in usability evaluation reports, it is important to differentiate usability defects from their consequences. While usability defects are typically inappropriate attributes of the interactive system, their consequences describe the negative effect on the user that is either likely to occur or has been observed or reported.

EXAMPLE 2 A usability defect could be the fact that, within a web form, required entry fields are not marked as such. The consequences could be that users fail to fill in required entry fields and therefore make use errors repeatedly.

Usability evaluation content can be further categorized by the types of evaluation involved. Usability evaluations can be differentiated in terms of "inspection-based" versus "user-based". The following clauses introduce the general types of usability evaluation reports.

4.3 Assessing conformity of the object of evaluation against specified criteria

Evaluation report data can be used for different purposes. One purpose is to show that the object of evaluation meets specified requirements, also referred to as conformance criteria. A conformity assessment of the object of evaluation against specified criteria is defined in ISO/IEC 17000 as a "demonstration that specified requirements relating to a product, process, system, person or body are fulfilled". Assessment of conformity consists of comparing the evaluation results with pre-defined conformance criteria. The conformance criteria can be defined within a project or by a third party (e.g. a regulatory body). A rigorous evaluation is required to produce data that can be used for a conformity assessment. When a conformity assessment is used, it shall be documented in conformance with the requirements of this International Standard.

NOTE A formal conformity assessment requires a defined "conformity assessment scheme". The formal scheme provides a) legal defensibility, b) evidence of contractual compliance, and c) consistency of application and comparability of results across assessors and organizations. Conformity assessment schemes are implemented at an international, regional, national and sub-national level.

The conformity assessment can be included in a usability evaluation report or can be issued as a separate "conformity assessment report". Table 2 shows the different types of conformance criteria that can be specified as the basis for a conformity assessment. There can be various sets of specified conformance criteria for one conformity assessment, if the underlying evaluation consisted of more than one type of evaluation (e.g. inspection plus user observation plus user survey).


Table 2 — Conformance criteria used for conformity assessment and corresponding types of usability evaluation reports

Inspection-based evaluation report:
— Specified user requirements (e.g. "The user shall be able to sort flights by duration." or "The user shall be able to select alternative modes of input or output to carry out a task.")
— Specified principles (e.g. "error tolerance") and guidelines (e.g. "Required entry fields shall be visually distinct from optional entry fields.")
— Specified design conventions (e.g. "The edit-button is always at the top-right corner of the form.")

User observation report:
— Specified user requirements (e.g. "The user shall be able to detect that one or more patients need immediate attention.")
— Specified user requirements for performance (e.g. "The user shall be able to complete the sales order within 60 seconds")

User survey report:
— Specified scores for subjectively perceived effectiveness, efficiency, satisfaction and other measures perceived by users (e.g. 3,5 on a scale ranging from 1 (min) to 5 (max))
— Specified attributes for reported experiences (e.g. "If any of the reported usability problems is judged as unacceptable then the object of evaluation fails the conformity assessment.")

Principles and guidelines that can be used as conformance criteria are published in various sources

including the ISO 9241 series. These principles and guidelines often apply across operating systems and development environments, e.g. "Colour should not be used as the only means to code information." or "Required entry fields should be visually distinct from optional entry fields." User-interface related recommendations can be found in the ISO 9241 series of standards:

— ISO 9241-12 — Presentation of information;
— ISO 9241-13 — User guidance;
— ISO 9241-14 — Menu dialogues;
— ISO 9241-15 — Command dialogues;
— ISO 9241-16 — Direct manipulation dialogues;
— ISO 9241-20 — Accessibility guidelines for information/communication technology (ICT) equipment and services;
— ISO 9241-110 — Dialogue principles;
— ISO 9241-129 — Guidance on software individualization;
— ISO 9241-143 — Forms;
— ISO 9241-151 — Guidance on World Wide Web user interfaces;
— ISO 9241-171 — Guidance on software accessibility;
— ISO 9241-303 — Requirements for electronic visual displays;
— ISO 9241-400 — Principles and requirements for physical input devices;
— ISO 9241-410 — Design criteria for physical input devices;
— ISO 9241-920 — Guidance on tactile and haptic interactions.


Established conventions that can also be used as conformance criteria typically include rules published by suppliers of operating systems (e.g. "Windows", "Mac OS", "iOS", "Android") and development environments (e.g. ".NET" or "Java").

EXAMPLE An example of an established convention is "a dialogue box always has an 'OK' and 'Cancel' button at the bottom right corner of the dialogue box".
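As an editorial illustration of 4.3, assessing conformity amounts to comparing evaluation results against the pre-defined conformance criteria, such as those quoted in Table 2. The Python sketch below compares hypothetical results against two of the Table 2 criteria (the 60 s performance requirement and a satisfaction score of at least 3,5); the data structures are illustrative only and are not prescribed by this International Standard.

# Illustrative sketch only: comparing evaluation results against pre-defined
# conformance criteria (see 4.3 and Table 2). Criteria and results below are
# hypothetical examples, not content mandated by ISO/IEC 25066.

criteria = [
    # (description, key of the measured value, pass predicate)
    ("Complete the sales order within 60 seconds",
     "sales_order_time_s", lambda v: v <= 60.0),
    ("Mean satisfaction of at least 3,5 on a scale from 1 (min) to 5 (max)",
     "mean_satisfaction", lambda v: v >= 3.5),
]

results = {"sales_order_time_s": 52.4, "mean_satisfaction": 3.2}

def assess_conformity(criteria, results):
    """Return (verdict, per-criterion outcomes): the object of evaluation
    conforms only if every specified criterion is fulfilled."""
    outcomes = [(desc, check(results[key])) for desc, key, check in criteria]
    return all(ok for _, ok in outcomes), outcomes

verdict, outcomes = assess_conformity(criteria, results)
for desc, ok in outcomes:
    print(("PASS" if ok else "FAIL"), "-", desc)
print("Conforms:", verdict)  # False: the satisfaction criterion is not met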

5 Content elements of usability evaluation reports

5.1 Selecting content elements

The following clauses describe the content elements that can be included in a usability evaluation report. The content elements are described in subclauses organized on the basis of sections that can be included in an evaluation report.

Depending on the purpose of the evaluation, a usability evaluation report can include the following sections:

— Executive Summary;
— Description of the object of evaluation;
— Purpose of evaluation;
— Method;
— Procedure;
— Results;
— Interpretation of results and recommendations (optional).

Within each subclause, the required, recommended and permitted content elements for each type of evaluation are indicated within a table at the end of each subclause. Each content element is specified as: mandatory, i.e. required ("shall"), recommended ("should") or permitted ("may") for each type of evaluation (i.e. inspection, user observation and user survey). Requirements describe elements that are essential in all situations. Recommendations are also important but they might not apply in all situations.

The content elements for each section of an evaluation report are determined by the type(s) of evaluation to be conducted. Furthermore, there are elements that are always required, and conditional elements that can be selected for the evaluation if used (e.g. statistical analysis or provided recommendations) and/or applicable (e.g. parts of the object that were evaluated or measures used in evaluation). Evaluations often contain more than one type of evaluation (e.g. user observation and subsequent user survey). As a result, the evaluation report would include the content elements for both types of evaluations.

The order in which the sections and the elements within them are introduced does not prescribe a required order for a usability evaluation report. Furthermore, the grouping of the content elements themselves can be defined by the author of the report (e.g. combining information such as methods and procedures into one section of the evaluation report).

The evaluation report should provide sufficient information to determine the appropriateness of the evaluation and to assess the validity of the results.

NOTE For user observation, the context of use for evaluation needs to reproduce the key aspects of a subset of the context of use in order for evaluation results to be valid.

Annex A contains a table that gives an overview of all required and recommended content elements for each type of evaluation.
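Because a report covering several types of evaluation must include the content elements for each of them, selecting the elements is effectively a union over the per-type columns of the Annex A table. The following minimal Python sketch illustrates this; the element names and requirement levels are hypothetical placeholders for the normative table in Annex A.

# Illustrative sketch only: selecting content elements for a combined
# evaluation. The per-type requirement levels below are hypothetical
# placeholders; the authoritative overview is the table in Annex A.

ELEMENT_LEVELS = {
    # element: {evaluation type: "shall" | "should" | "may" | "N/A"}
    "table_of_participants": {"inspection": "N/A",
                              "user_observation": "shall",
                              "user_survey": "shall"},
    "intended_context_of_use": {"inspection": "should",
                                "user_observation": "should",
                                "user_survey": "should"},
}

def elements_for(evaluation_types, level="shall"):
    """Union of elements at the given requirement level across all
    evaluation types covered by the report (see 5.1)."""
    return {name for name, levels in ELEMENT_LEVELS.items()
            if any(levels.get(t) == level for t in evaluation_types)}

print(elements_for(["inspection"]))                      # set(): none required
print(elements_for(["inspection", "user_observation"]))  # {'table_of_participants'}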


The following subclauses enumerate the content elements for a usability evaluation report. The report sections described in the following clauses refer to all three types of evaluation (inspection-based evaluation, user observation and user survey).

5.2 Description of the content elements for each type of evaluation

5.2.1 Executive summary (if used)

This section of the usability evaluation report provides a concise overview of the evaluation. The intent of this section is to provide information for those who might not read the technical body of the report. An executive summary can include:

a) Name and description of the object of evaluation;
b) Summary of method(s) and the procedure;
c) Summary of results including key findings, related conclusions and recommendations (if applicable).

Table 3 specifies the required and recommended items for each type of evaluation.

Table 3 — Executive Summary

Content element — Inspection / User observation: observing user behaviour / User observation: measuring user performance and response / User survey

a) Name and description of the object of evaluation — shall / shall / shall / shall
b) Summary of method(s) and the procedure — shall / shall / shall / shall
c) Summary of results including key findings, related conclusions and recommendations (if applicable) — shall / shall / shall / shall

5.2.2 Description of the object of evaluation

This section of the usability evaluation report identifies the entity which was actually evaluated.

NOTE Examples of objects of evaluation include concepts, user interface prototypes, functioning software systems, hardware products, or components of a product or a service.

Information about the object of evaluation can include:

a) Formal name and release or version;
b) Parts of the object that were evaluated (if applicable);
c) User groups for which the object is intended;
d) Brief description of the object and its purpose;
e) Intended context of use;
f) Prior usability evaluation report summaries (if applicable);
g) Expected impact (e.g. on performance, safety, finances) of the object;
h) Citations to market research for the object.

The context of use for the object of evaluation needs to be described. Further guidance on the description of the context of use is given in ISO/IEC 25063. Each of the four components of the context of use (users, tasks, equipment, environment) is not always applicable for every type of evaluation (for example, tasks are not always used for inspection-based evaluations).

i n s p e c tion-b a s e d eva luation s) .

Table 4 s p e ci fie s the re qu i re d and re com mende d and p erm itte d item s

for

e ach typ e o f eva luation .

Table 4 — Description of the object of evaluation User observation

Type of evaluation:

Inspection

Content elements to be included in report:

Observing user behaviour

Measuring user performance and response

User survey

a)

Forma l name and releas e or vers ion

sha l l

sha l l

sha l l

sha l l

b)

P a r ts o f the o b j e c t th at were e va lu ate d

shal l

sha l l

sha l l

sha l l

shou ld

shou ld

shou ld

shou ld

shou ld

shou ld

shou ld

shou ld

(i f appl icab le) c)

Us er group s

fo r

wh ich the ob j e c t i s

i ntende d d)

B r ie f de s c r ip tion o f the ob j e c t a nd its

purp o s e e)

I ntended context of us e

shou ld

shou ld

shou ld

shou ld

f)

P r io r u s ab i l ity e va lu ation rep or t

shou ld

shou ld

shou ld

shou ld

m ay

m ay

m ay

m ay

m ay

m ay

m ay

m ay

s um maries (i f appl icab le) g)

E xp e c te d i mp ac t o f the ob j e c t

h)

C itation s to m a rke t re s e a rch

5.2.3

for

the ob j e c t
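As an editorial illustration, the content elements of 5.2.2 map naturally onto a simple record. The Python sketch below shows such a record for items a) to e); the field names and example data are hypothetical and not prescribed by this International Standard.

# Illustrative sketch only: a record for the "description of the object of
# evaluation" section (5.2.2). Field names mirror content elements a)-e);
# they are illustrative, not prescribed identifiers.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ObjectOfEvaluation:
    formal_name: str                                           # a) formal name
    version: str                                               # a) release or version
    parts_evaluated: List[str] = field(default_factory=list)   # b) if applicable
    intended_user_groups: List[str] = field(default_factory=list)  # c)
    description: Optional[str] = None                          # d) brief description and purpose
    intended_context_of_use: Optional[str] = None              # e)

obj = ObjectOfEvaluation(
    formal_name="Flight booking client", version="2.4",
    parts_evaluated=["search", "checkout"],
    intended_user_groups=["travel agents"],
)
print(obj.formal_name, obj.version, obj.parts_evaluated)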

5.2.3 Purpose of the evaluation

This section of the usability evaluation report identifies the reasons for which the evaluation was conducted and which parts of the object were evaluated and why.

a) Description of the purpose

NOTE 1 Purposes for an evaluation can include:
— improving design by providing feedback into the design process;
— identifying usability defects and usability problems;
— confirming/eliciting user requirements;
— confirming assumptions;
— testing concepts;
— measuring the level of usability (i.e. effectiveness and/or efficiency and/or user satisfaction);
— establishing benchmarks;
— assessing whether a product, system or service meets specific conformance criteria/acceptance criteria;
— identifying strengths and weaknesses of a product, system or service;
— identifying the consequences that could arise from poor usability;
— resolving disputes between users and/or stakeholders;
— identifying whether a product, system or service is accessible;
— acquiring a certification, e.g.
  — to pass an internal quality gate;
  — to pass a certification of a certification body.

b) Functions and components evaluated (if applicable)

NOTE 2 It is not necessary to describe the functions and components, if all the functions and components were evaluated.

c) Reasons for only evaluating a subset of the object (if applicable)

NOTE 3 It is not necessary to describe the reasons why only a part of the object was evaluated, if all the functions and components were evaluated.

Table 5 specifies the required and recommended items for each type of evaluation.

Table 5 — Purpose of the evaluation

Content element — Inspection / User observation: observing user behaviour / User observation: measuring user performance and response / User survey

a) Description of the purpose — shall / shall / shall / shall
b) Functions and components evaluated (if applicable) — shall / shall / shall / shall
c) Reasons for only evaluating a subset of the object (if applicable) — shall / shall / shall / shall

5.2.4 Method

5.2.4.1 General

This section of the evaluation report describes how the evaluation was conducted. The goal of the description is to provide enough information to determine the appropriateness of the method and to assess the validity of the results, as well as enabling replication.

a) Type(s) of evaluation used

Usability evaluation reports can include data based on one or more than one type of evaluation (see 4.2). The usability evaluation report states which type(s) of evaluation have been used.

NOTE Types of evaluations are inspection-based evaluation, observing user behaviour, measuring user performance and user survey.

b) Sufficient information to replicate the procedure used during the evaluation

Table 6 specifies the required items for each type of evaluation.

Table 6 — General

Content element — Inspection / User observation: observing user behaviour / User observation: measuring user performance and response / User survey

a) Type(s) of evaluation used — shall / shall / shall / shall
b) Sufficient information to replicate the evaluation procedure used during the evaluation — shall / shall / shall / shall

5.2.4.2 Evaluators/participants

This section of the evaluation report provides information about the people taking part in the usability evaluation.

Evaluators are the people who run the evaluation, and include people who carry out inspections. Participants are people who are actual or potential users of the object of evaluation, who take part in observational studies where their behaviour and/or task performance is monitored. The people who provide survey data are also participants.

Information about the evaluators and participants enables readers of a report to judge whether the information presented is applicable to their own circumstances.

Total numb er of evaluators/p ar ticip ants

T his element rep or ts the total numb er of evaluators or p ar ticip ants . b)

S egmentation of tes t p ar ticip ants or evaluators/ins p ec tors (if more than one s egment)

When

obser ving

user

b ehaviour,

segmentation

of tes t

p ar ticip ants

into

group s

b ased

on

their

charac teris tics can b e used as an exp erimental variable (s ee 5 . 2 . 5 .1) . S egmentation of evaluators enable s the re s u lts pro duce d b y d i fferent c ategorie s o f eva luator to b e comp are d .

E X AM PLE 1 —

eva luator/i n s p e c tor with doma i n exp er ti s e;



eva luator/i n s p e c tor with u s abi l ity e xp er ti s e;



eva luator/i n s p e c tor repre s enti ng the u s ers;



s egmentation of tes t p ar ticip ants (if more than one) .

E X AM PLE 2 —

in frequent us ers vers us habitual users .

c)

Key C ha rac teri s tic s o f te s t p ar tic ip a nts or u s ers con s idere d

for

i n s p e c tion

Key ch arac teri s tics o f te s t p ar tic ip ants ch arac teri z e attribute s o f the i ntende d u s er p opu l ation that are relevant to the va l id ity o f the eva luation .

NO TE 1

Ke y ch a rac ter i s tic s c a n i nclude:



demo graph ic s that a re u s e d to identi fy i ntende d u s er group s o f s p e c i fic i ntere s t,



tas k-relate d ch arac teri s tics ,



phys ic a l and s en s or y ch arac teri s tics ,



p s ycholo gic a l and s o ci a l charac teri s tics ,



s o c ia l a nd orga n i z ationa l charac teri s tic s ,

for

for

exa mp le, age;

exa mple, trai n i ng , ski l l level a nd e s tabl i s he d b ehaviou rs;

for

exa mple b o dy d i men s ion s , s treng th, vi s ion a nd he ari ng;

for

e xample re ad i ng age, habits , language and c u ltu re;

for

e xample pro fe s s ion or j ob title, re s i s ta nce to change

a nd a ri s k ta ki ng c u lture;

© I SO /I E C 2 0 1 6 – All rights res erved

13

ISO/IEC 25066:2016(E)



— user group membership (i.e. the groups that the test participant represents for this evaluation, e.g. smart-phone users, land-line phone users).

d) Differences between sample and the user population (if applicable)

This element describes any differences between the participant sample and the actual user population. In particular, differences in key characteristics are described.

EXAMPLE 3  Actual users might attend a training course whereas test subjects were untrained.

e) Table of participants by characteristics

Tables help summarize and improve the readability of the key characteristics of participants for the reader.


NOTE 2  A table can include:
— participants (rows) by characteristics (columns), for key characteristics such as computing experience, age, gender and abilities.
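Purely as an illustration (such code is not part of this International Standard), a participants-by-characteristics table of the kind suggested in NOTE 2 could be tabulated from structured records; all names and values below are hypothetical:

    # Illustrative sketch: participants (rows) by characteristics (columns).
    participants = [
        {"id": "P1", "age": 24, "gender": "M", "computing_experience": "average"},
        {"id": "P2", "age": 52, "gender": "M", "computing_experience": "average"},
        {"id": "P3", "age": 62, "gender": "F", "computing_experience": "average"},
    ]

    columns = ["id", "age", "gender", "computing_experience"]
    print("".join(f"{c:<22}" for c in columns))  # header row
    for p in participants:
        print("".join(f"{str(p[c]):<22}" for c in columns))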

Table 7 specifies the required and recommended items for each type of evaluation.

Table 7 — Evaluators/participants

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

Content elements to be included in report:
a) Total number of evaluators/participants: shall | shall | shall | shall
b) Segmentation of test participants or evaluators/inspectors (if more than one segment): shall | shall | shall | shall
c) Key characteristics of test participants or users considered for inspection: shall | shall | shall | shall
d) Differences between sample and the user population (if applicable): N/A | shall | shall | shall
e) Table of participants by characteristics: N/A | shall | shall | shall

5.2.4.3 Tasks (if used in the evaluation)

This section of the evaluation report describes the tasks used for the evaluation. When observing user behaviour, measuring user performance or gathering survey data, tasks are typically specified. Inspections can also be task-based, but not all inspections are task-based. For inspections, the usability evaluation report shall explicitly state whether or not tasks are specified. If tasks are used, the information can include:

a) Tasks used for evaluation

The tasks used for evaluation are expressed in terms of the title and intended outcomes that people are expected to achieve, without referencing any specific means of achieving them.

b) Task scenarios for each task

A task scenario is the information provided to the participants, including any materials handed out.

c) Selection criteria for the tasks

The selection criteria for the tasks explain why the selected tasks were deemed to be important for the evaluation.

EXAMPLE 1  The most frequent tasks for each selected user group.

EXAMPLE 2  Tasks that give rise to the greatest potential risk.

d) Source of selected tasks

The source of the selected tasks explains what the tasks are based on.

EXAMPLE 3  Observation of customers using similar products, product marketing specifications, discussion with users or design team.

e) Task data given to participants and/or inspectors (if applicable)

EXAMPLE 4  Data to be processed.

EXAMPLE 5  Recorded simulation of the customer request.

f) Criteria for task completion or task abandonment for each task

These are the criteria for terminating the task, either finishing the task or quitting.

EXAMPLE 6  After more than 30 min, the task is terminated.

EXAMPLE 7  After three unsuccessful attempts, the task is terminated.

EXAMPLE 8  As soon as the user believes the task has been completed.
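To make EXAMPLES 6 to 8 concrete, a test team could encode such termination criteria roughly as follows. This sketch is illustrative only and not part of this International Standard; the limits simply mirror the examples above:

    MAX_MINUTES = 30         # EXAMPLE 6: terminate after more than 30 min
    MAX_FAILED_ATTEMPTS = 3  # EXAMPLE 7: terminate after three unsuccessful attempts

    def task_terminated(elapsed_min, failed_attempts, user_believes_done):
        """Return True when any of the example termination criteria is met."""
        return (elapsed_min > MAX_MINUTES
                or failed_attempts >= MAX_FAILED_ATTEMPTS
                or user_believes_done)  # EXAMPLE 8: user believes the task is done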

Table 8 specifies the required and recommended items for each type of evaluation.

Table 8 — Tasks (if used in the evaluation)

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

Content elements to be included in report:
a) Tasks used for evaluation: shall | shall | shall | shall
b) Task scenarios for each task: shall | shall | shall | shall
c) Selection criteria for the tasks: shall | shall | shall | shall
d) Source of selected tasks: shall | shall | shall | shall
e) Task data given to participants and/or inspectors (if applicable): shall | shall | shall | shall
f) Criteria for task completion and task abandonment for each task: N/A | may | shall | N/A

5.2.4.4 Evaluation environment

a) Physical environment and facilities

This section of the evaluation report describes the information related to the physical environment and facilities.

NOTE 1  This can include:
— a description of the setting and type of space in which the evaluation was conducted;
— any relevant features of the setting or circumstances that could affect the results;
— a description of the physical environment.

NOTE 2  This is especially important in situations in which the context of evaluation is different to the intended context of use.

EXAMPLE 1  Usability lab, cubicle office, meeting room, home office, home family room, manufacturing floor, remote usability testing using video and audio conferencing and desktop sharing, etc.

EXAMPLE 2  Video and audio recording equipment, one-way mirrors, or automatic data collection equipment.

EXAMPLE 3  Dark environment (lighting) in a radiological screening room.

EXAMPLE 4  The user's choice of location or environment (e.g. home, car, office, etc.) is reported for remote testing.

b) Technical environment (if applicable)

This element describes both the software and computing environment.

NOTE 3  This can include:

— computer configuration, including model, operating system version, required libraries or settings; if browser-based, browser name and version, and relevant plug-in names and versions;
— any other display devices, audio devices, and input devices.

c) Evaluation administration tools (if used)

This element describes any hardware or software used to control the evaluation or to record data.

NOTE 4  This can include:
— a description or specification of a standard questionnaire;
— any hardware or software used to control the test or to record data.

d) Evaluation administrators (if applicable)

This element identifies the number of evaluation facilitators/administrators and their roles and responsibilities.

Table 9 specifies the required and recommended items for each type of evaluation.

Table 9 — Evaluation environment

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

Content elements to be included in report:
a) Physical environment and facilities: N/A | shall | shall | may
b) Technical environment (if applicable): shall | shall | shall | should
c) Evaluation administration tools (if used): should | should | should | should
d) Evaluation administrators (if applicable): may | should | should | should

5.2.5 Procedure

5.2.5.1 Design of the evaluation

This section of the evaluation report details each step in the execution of the usability evaluation. It summarizes what was done and how it was done, and identifies the type of evaluation, the specific experimental manipulations, as well as instructions to the participants.

a) Description of the evaluation design

This element describes the type of evaluation performed and the experimental design of the evaluation, including the plan for assigning experimental conditions to participants and the specific experimental manipulations, if applicable.

NOTE 1  Experimental manipulations can include:

— randomization;
— counterbalancing;
— other control features.
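As a minimal sketch of what randomization and counterbalancing can look like in practice (illustrative only; nothing here is prescribed by this International Standard, and the task names are hypothetical):

    import random

    TASKS = ["T1", "T2", "T3"]

    def randomized_order(seed):
        """Randomization: shuffle the task order independently per participant."""
        order = TASKS[:]
        random.Random(seed).shuffle(order)
        return order

    def counterbalanced_order(participant_index):
        """Counterbalancing: rotate the task list so each task leads equally often."""
        k = participant_index % len(TASKS)
        return TASKS[k:] + TASKS[:k]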

b) Independent variables (if applicable)

This element describes those variables that are manipulated by the evaluator (independent variables).

EXAMPLE 1  Different levels of training, age of participants, noise levels, lighting, which prototype participants experience.


c) Predefined criteria (for inspection or observation) (if used)

This element describes the criteria used for inspection or observation.

NOTE 2  This can include:
— for inspection:
  — principles;
  — guidelines;
  — established conventions;
— for inspection or user observation:
  — user requirements.

EXAMPLE 2  A specified principle to be inspected against is "error tolerance".

EXAMPLE 3  A specified guideline to be inspected against is "required entry fields shall be visually distinct from optional entry fields".

EXAMPLE 4  A specified accessibility guideline to be inspected against is "Whenever moving, blinking, scrolling, or auto-updating information is presented, software shall enable the user to pause or stop the presentation, except for simple progress indicators".

EXAMPLE 5  A specified user requirement to be inspected against is "The user shall be able to sort flights by duration".

EXAMPLE 6  A specified design convention to be inspected against is "The edit button is always at the top-right corner of the form".

d) Measures used in evaluation (if applicable)

This element describes the measures for which data were recorded for each set of conditions.

EXAMPLE 7  Number of use errors participants make during a task.

e) Operational definitions of criteria or measures (if applicable)

This element describes what constitutes a criterion or measure.

EXAMPLE 8  What constitutes a participant use error: "An incorrect navigational choice".

f) Interaction between individuals taking part in each evaluation session (if applicable)

This element describes the allowed interactions between individuals taking part in the evaluation.

NOTE 3  This can include:
— number and roles of testing staff and participants who will interact during the evaluation session;
— number and roles of participants and if and how they will interact with each other during the evaluation session.

g) Other individuals present in evaluations (if applicable)

This element identifies any other individuals expected to be present during the evaluation, if applicable.

h) General instructions given to the participants

This element describes the general instructions given to the participants of the evaluation.

NOTE 4  This can include:
— the actual instructions given to the participants (here or in an Appendix);
— instructions on how participants were to interact with any other persons present, including how they were to ask for assistance and interact with other participants, if applicable.

i) Specific instructions on tasks (if applicable)

This element describes the specific task instructions, including time limits for completing the tasks, given to participants or evaluator(s).

NOTE 5  This can include:
— task instruction summary;
— time limits per task.

j) Sequence of activities for conducting the evaluation

This element describes the sequence of organizational activities for conducting each session within the evaluation, from greeting the participants to dismissing them.

NOTE 6  This can include:
— steps followed to execute the test sessions and record data;
— details of nondisclosure agreements, form completion, warm-ups, pre-task training, and debriefing;
— whether participants were paid or otherwise compensated;
— verification that the participants knew and understood their rights as human subjects.

Table 10 specifies the required and recommended items for each type of evaluation.

Table 10 — Design of the evaluation

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

Content elements to be included in report:
a) Description of the evaluation design: shall | shall | shall | shall
b) Independent variables (if applicable): N/A | shall | shall | shall
c) Predefined criteria (for inspection or observation) (if used): shall | shall | N/A | N/A
d) Measures used in evaluation (if applicable): N/A | may | shall | shall
e) Operational definitions of criteria or measures (if applicable): shall | shall | shall | shall
f) Interaction between individuals taking part in each evaluation session (if applicable): N/A | shall | shall | should
g) Other individuals present in evaluations (if applicable): N/A | should | should | should
h) General instructions given to the participants: N/A | shall | shall | shall
i) Specific instructions on tasks (if applicable): should | shall | shall | N/A
j) Sequence of activities for conducting the evaluation: N/A | should | should | may


5.2.5.2 Data to be collected

This subclause describes the elements of the procedure that describe the data to be collected during the evaluation.

a) Usability defects in terms of deviations from predefined criteria (if criteria are used)

This element identifies that deviations from predefined criteria (see 5.2.5.1) will be collected during the evaluation.

NOTE 1  Deviations include all attributes of the interaction with the object of evaluation that deviate from criteria and are expected to cause usability problems.

EXAMPLE 1  A dialogue box that does not have a "Cancel" button.

EXAMPLE 2  In a context of use where there is a user requirement that critical controls are readily identifiable (criterion), no user has found the emergency shutdown button (deviation).

b) Observed user behaviour

This element identifies all the types of observed user behaviour that are to be collected. These data can be used in order to identify usability findings in the interaction with the object of evaluation.

NOTE 2  User observation data can include situations in which:
— users do not know how to proceed with the task;
— use errors occur;
— users communicate frustration;
— users engage in positive behaviours;
— users exhibit discomfort;
— there is evidence that one or more specific user requirements are met (or not met).

c) Observed performance data

This element identifies all the types of performance data relating to effectiveness and efficiency that are to be collected.

Performance data are a specific case of observation data where numerical values are obtained with a focus on measurement.

NOTE 3  Performance data measures can include:
— accuracy and completeness of task results (effectiveness) (if applicable);
— task completion;
— time taken on task;
— use errors and frequency of occurrence;
— number of mouse clicks, touch events or gestures;
— number of key strokes;
— distance moved on screen with pointing device (e.g. mouse);
— eye tracking paths;
— behavioural data (e.g. emotional, fidgeting, level of attention);
— physiological data (e.g. skin conductance, blood pressure).
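For illustration only (such a structure is not part of this International Standard), the measures listed above could be captured per participant and task in a simple record; all field names are hypothetical:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TaskPerformanceRecord:
        """One participant's performance measures for one task (illustrative)."""
        participant_id: str
        task_id: str
        completed: bool            # task completion
        time_on_task_s: float      # time taken on task, in seconds
        use_errors: int            # count of use errors observed
        mouse_clicks: int = 0      # number of mouse clicks or touch events
        key_strokes: int = 0       # number of key strokes
        notes: List[str] = field(default_factory=list)  # free-form observations

    # Example: participant P1 completed task T1 in 274 s with two use errors.
    record = TaskPerformanceRecord("P1", "T1", True, 274.0, 2, mouse_clicks=41)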

d) User-reported qualitative data

This element identifies the type(s) of user-reported qualitative data to be collected and the data instrument that will be used.

User-reported qualitative data are statements made by users on their experience with the object of evaluation.

NOTE 4  A questionnaire is a typical instrument for collecting user-reported data. A questionnaire used to collect qualitative data uses open-ended questions.

NOTE 5  User-reported qualitative data can include:
— experienced problems;
— positive experiences;
— how the users use the object of evaluation;
— expectations;
— concerns;
— suggestions.

NOTE 6  User-reported problems can be accompanied by subjective severity ratings.

e) User-reported quantitative data

This element identifies the type(s) of user-reported quantitative data to be collected and the data instrument that will be used to collect them.

User-reported quantitative data are experiences rated on a predefined scale.

NOTE 7  A questionnaire is a typical instrument for collecting user-reported data. A questionnaire used to collect quantitative data uses closed questions, typically with an associated rating scale.

NOTE 8  User-reported quantitative data can include subjective ratings of the object of evaluation in terms of:
— satisfaction;
— comfort;
— trustworthiness;
— attitude;
— appeal;
— effort;
— perceived effectiveness;
— perceived efficiency.

Table 11 specifies the required and recommended items for each type of evaluation.


Table 11 — Data to be collected

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

Content elements to be included in report:
a) Usability defects in terms of deviations from predefined criteria (if criteria are used): shall | N/A | N/A | N/A
b) Observed user behaviour: N/A | shall | should | N/A
c) Observed performance data: N/A | may | shall | N/A
d) User-reported qualitative data: N/A | may | N/A | shall
e) User-reported quantitative data: N/A | N/A | may | shall

5.2.6 Results

5.2.6.1 Data analysis

This section of the usability evaluation report describes the data collected and the statistical or data analytic treatment used. Sufficient detail is reported to justify the conclusions. This section presents the results, a discussion of the results and the implications or inferences of the results.

a) Approach used for the analysis of observed, measured or collected data

This element describes in which way the observed, measured or collected data were analysed.

b) Differences in planned and collected data (if applicable)

This element describes the differences between the data that was planned to be collected and the data that was actually collected, if applicable.

c) Portion of data used in the analysis (if applicable)

This element describes the portion of the gathered data that was actually used for the analysis.

EXAMPLE 1  How missing data was treated. How data was treated with respect to exclusion of outliers.

d) Data scoring (if used)

This element describes the mapping between the data values that were collected and the values used in the subsequent analysis.

EXAMPLE 2  How use errors were categorized. How actual ages map to age ranges. How assisted use errors are mapped to a set of values.
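A hypothetical sketch of the kind of mapping meant in EXAMPLE 2 (not part of this International Standard; the range boundaries are invented):

    def age_range(age):
        """Map an actual age to the age range used in the analysis."""
        if age < 25:
            return "18-24"
        if age < 45:
            return "25-44"
        if age < 65:
            return "45-64"
        return "65+"

    assert age_range(36) == "25-44"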

e) Data reduction

This element identifies the method used to generate summaries of the raw data.

EXAMPLE 3  Which measure of central tendency was used (e.g. mean or mode). How variation was measured (e.g. standard deviation or range).

EXAMPLE 4  Systematic characterization of open-ended responses.

f) Statistical analyses (if used)

This element identifies and describes the statistical analyses used to analyse the data, including the statistical procedure.

EXAMPLE 5  How groups were compared (e.g. t-test, F-test).

For data that are calculated as means, calculate the standard deviation and the standard error of the mean.
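As a minimal sketch of that calculation (illustrative only; the values are invented and the function is not prescribed by this International Standard):

    import math

    def mean_sd_sem(values):
        """Return the mean, sample standard deviation and standard error of the mean."""
        n = len(values)
        mean = sum(values) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))  # sample SD
        sem = sd / math.sqrt(n)  # standard error of the mean
        return mean, sd, sem

    # Example: time-on-task in seconds for five participants.
    m, sd, sem = mean_sd_sem([274.0, 310.5, 255.0, 402.0, 289.5])
    print(f"mean={m:.1f} s, sd={sd:.1f} s, sem={sem:.1f} s")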


Table 12 specifies the required and recommended items for each type of evaluation.

Table 12 — Data analysis

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

Content elements to be included in report:
a) Approach used for the analysis of observed, measured or collected data: shall | shall | shall | shall
b) Differences in planned and collected data (if applicable): shall | shall | shall | shall
c) Portion of data used in the analysis (if applicable): shall | shall | shall | shall
d) Data scoring (if used): N/A | shall | shall | shall
e) Data reduction: N/A | shall | shall | shall
f) Statistical analyses (if used): N/A | shall | shall | shall

5.2.6.2 Presentation of the results

This section of the evaluation report shows the presentation of the data collected, based on the data analysis. Individual scores or raw data are not included. For data that are reported as means, include the standard deviation and the standard error of the mean ("confidence interval"); see 5.2.6.1.

Tables and graphs can often be used to present complex results in a concise manner. Both tables and various graphical formats are effective in describing usability data at a glance. Bar graphs are useful for describing subjective data such as that gleaned from Likert scales. A variety of plots can be used effectively to show comparisons of expert benchmark times for the object of evaluation vs. the mean participant performance time. If necessary, the details of the interpretation are described in "findings and recommendations".

The results summarize:

a) Usability defects in terms of the deviations of attributes of the object of evaluation from established criteria

This element summarizes the usability defects in terms of the deviations of attributes of the object of evaluation from principles, guidelines and established conventions or specified user requirements.

NOTE 1  A mapping of deviations of attributes of the object of evaluation to specified criteria can be used to present the results.

b) Potential usability problems that are likely to arise from identified usability defects

This element summarizes the potential usability problems (and the associated rationales for the prediction) that result from the identified usability defects, i.e. deviations of attributes of the object of evaluation from principles, guidelines and established conventions or specified user requirements.

c) Usability findings identified during observations

This element summarizes the usability findings identified during observations.

d) Performance data based on measurements

This element summarizes the collected measures that characterize the performance results per task or task group.

NOTE 2  Performance data can be accumulated while observing user behaviour. These can include:
— accuracy and completeness of task results (effectiveness);
— task completion rate;
— time on task;
— efficiency;
— use errors and frequency of occurrence;
— number of assists;
— number of mouse clicks, touch events and gestures;
— number of key strokes;
— distance moved on screen with pointing device (e.g. mouse);
— eye tracking paths;
— psychological data (e.g. emotional, fidgeting, level of attention);
— physiological data (e.g. skin conductance, blood pressure).

e) Problems, opinions and impressions reported by users

This element summarizes the problems, opinions and impressions reported by users.

NOTE 3  Users can report problems, opinions and impressions spontaneously while being observed.

f) Measured level of user satisfaction or perception

This element summarizes the measured level of user satisfaction or perception.

Table 13 specifies the required and recommended items for each type of evaluation.

Table 13 — Presentation of the results

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

Content elements to be included in report:
a) Usability defects in terms of the deviations of attributes of the object of evaluation from predefined criteria (if criteria are used): shall | may | may | may
b) Potential usability problems that are likely to arise from identified usability defects: shall | N/A | N/A | N/A
c) Usability findings identified during observations: N/A | shall | should | N/A
d) Performance data based on measurements: N/A | may | shall | N/A
e) Problems, opinions and impressions reported by users: N/A | should | may | shall
f) Measured level of user satisfaction or perception: N/A | N/A | should | should

5.2.7 Interpretation of results and recommendations

This section of the usability evaluation report provides interpretation of results and recommendations, which help to identify the issues to be examined in detail.

NOTE  In some cases, only the key interpretations and recommendations are included.

a) Interpretation of results

This element provides conclusions based on the interpretation of the results.

b) Recommendations

This element provides a set of recommendations for the improvement of the object of evaluation based on the evaluation results and their interpretation.

Table 14 specifies the required and recommended items for each type of evaluation.

Table 14 — Interpretation of results and recommendations

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

Content elements to be included in report:
a) Interpretation of results: should | should | should | should
b) Recommendations: should | should | should | should

5.2.8 Additional content for conformity assessment (as part of a usability evaluation report)

This section of the usability evaluation report provides conformity assessment. Usability evaluation reports for conformity assessment include additional content elements.

a) Conformity assessment scheme (if used)

This element describes the conformity assessment scheme (title, version).

NOTE  Conformity assessment schemes can exist at international, regional, national or sub-national level.

b) Conformance criteria

This element describes the conformance criteria used.

c) Statement whether all conformance criteria have been met

This element provides a statement and describes whether all conformance criteria have been met.

d) Evidence showing why conformance criteria were not met (identified nonconformities)

This element provides the findings that show why specific conformance criteria have not been met (identified nonconformities).

Table 15 specifies the required and recommended items for each type of evaluation.

Table 15 — Additional content for conformity assessment (if used)

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

Content elements to be included in report:
a) Conformity assessment scheme (if used): shall | shall | shall | shall
b) Conformance criteria: shall | shall | shall | shall
c) Statement whether all conformance criteria have been met: shall | shall | shall | shall
d) Evidence showing why conformance criteria were not met (identified nonconformities): shall | shall | shall | shall


Annex A
(normative)

Overview on required and recommended content elements for each type of evaluation

Table A.1 — Overview on required and recommended content elements for each type of evaluation

Type of evaluation: Inspection | Observing user behaviour (user observation) | Measuring user performance and response (user observation) | User survey

5.2.1 Executive summary
a) Name and description of the object of evaluation: shall | shall | shall | shall
b) Summary of method(s) and the procedure: shall | shall | shall | shall
c) Summary of results including key findings, related conclusions and recommendations (if applicable): shall | shall | shall | shall

5.2.2 Description of the object of evaluation
a) Formal name and release or version: shall | shall | shall | shall
b) Parts of the object that were evaluated (if applicable): shall | shall | shall | shall
c) User groups for which the object is intended: should | should | should | should
d) Brief description of the object and its purpose: should | should | should | should
e) Intended context of use: should | should | should | should
f) Prior usability evaluation report summaries (if applicable): should | should | should | should
g) Expected impact of the object: may | may | may | may
h) Citations to market research for the object: may | may | may | may

5.2.3 Purpose of the evaluation
a) Description of the purpose: shall | shall | shall | shall
b) Functions and components evaluated (if applicable): shall | shall | shall | shall
c) Reasons for only evaluating a subset of the object (if applicable): shall | shall | shall | shall

5.2.4 Method

5.2.4.1 General
a) Type(s) of evaluation used: shall | shall | shall | shall
b) Sufficient information to replicate the evaluation procedure used during the evaluation: shall | shall | shall | shall

5.2.4.2 Evaluators/participants
a) Total number of evaluators/participants: shall | shall | shall | shall
b) Segmentation of test participants or evaluators/inspectors (if more than one segment): shall | shall | shall | shall
c) Key characteristics of test participants or users considered for inspection: shall | shall | shall | shall
d) Differences between sample and the user population (if applicable): N/A | shall | shall | shall
e) Table of participants by characteristics: N/A | shall | shall | shall

5.2.4.3 Tasks (if used in the evaluation)
a) Tasks used for evaluation: shall | shall | shall | shall
b) Task scenarios for each task: shall | shall | shall | shall
c) Selection criteria for the tasks: shall | shall | shall | shall
d) Source of selected tasks: shall | shall | shall | shall
e) Task data given to participants and/or inspectors (if applicable): shall | shall | shall | shall
f) Criteria for task completion and task abandonment for each task: N/A | may | shall | N/A

5.2.4.4 Evaluation environment
a) Physical environment and facilities: N/A | shall | shall | may
b) Technical environment (if applicable): shall | shall | shall | should
c) Evaluation administration tools (if used): should | should | should | should
d) Evaluation administrators (if applicable): may | should | should | should

5.2.5 Procedure

5.2.5.1 Design of the evaluation
a) Description of the evaluation design: shall | shall | shall | shall
b) Independent variables (if applicable): N/A | shall | shall | shall
c) Predefined criteria (for inspection or observation) (if used): shall | shall | N/A | N/A
d) Measures used in evaluation (if applicable): N/A | may | shall | shall
e) Operational definitions of criteria/measures (if applicable): shall | shall | shall | shall
f) Interaction between individuals taking part in each evaluation session (if applicable): N/A | shall | shall | should
g) Other individuals present in evaluations (if applicable): N/A | should | should | should
h) General instructions given to the participants: N/A | shall | shall | shall
i) Specific instructions on tasks (if applicable): should | shall | shall | N/A
j) Sequence of activities for conducting the evaluation: N/A | should | should | may

5.2.5.2 Data to be collected
a) Usability defects in terms of deviations from predefined criteria (if criteria are used): shall | N/A | N/A | N/A
b) Observed user behaviour: N/A | shall | should | N/A
c) Observed performance data: N/A | may | shall | N/A
d) User-reported qualitative data: N/A | may | N/A | shall
e) User-reported quantitative data: N/A | N/A | may | shall

5.2.6 Results

5.2.6.1 Data analysis
a) Approach used for the analysis of observed, measured or collected data: shall | shall | shall | shall
b) Differences in planned and collected data (if applicable): shall | shall | shall | shall
c) Portion of data used in the analysis (if applicable): shall | shall | shall | shall
d) Data scoring (if used): N/A | shall | shall | shall
e) Data reduction: N/A | shall | shall | shall
f) Statistical analyses (if used): N/A | shall | shall | shall

5.2.6.2 Presentation of the results
a) Usability defects in terms of deviations of attributes of the object of evaluation from predefined criteria (if criteria are used): shall | may | may | may
b) Potential usability problems likely to arise from identified usability defects: shall | N/A | N/A | N/A
c) Usability findings identified during observations: N/A | shall | should | N/A
d) Performance data based on measurements: N/A | may | shall | N/A
e) Problems, opinions and impressions reported by users: N/A | should | may | shall
f) Measured level of user satisfaction or perception: N/A | N/A | should | should

5.2.7 Interpretation of results and recommendations
a) Interpretation of results: should | should | should | should
b) Recommendations: should | should | should | should

5.2.8 Additional content for conformity assessment (if used)
a) Conformity assessment scheme (if used): shall | shall | shall | shall
b) Conformance criteria: shall | shall | shall | shall
c) Statement whether all conformance criteria have been met: shall | shall | shall | shall
d) Evidence showing why conformance criteria were not met (identified nonconformities): shall | shall | shall | shall


Annex B
(informative)

Usability test report example

B.1 General

This is an example of an evaluation report. Since the order of the sections and the elements within an evaluation report is not prescribed, this example demonstrates that it is not necessary to follow the order in which the content elements were presented in this International Standard. However, the report does need to include all the required elements for conformance. This example evaluation report also illustrates that the grouping of the content elements themselves can be defined by the author of the report (e.g. combining information such as methods and procedures into one section of the evaluation report). All headings as introduced in this International Standard have been included, although they do not contain information in detail or in some cases include repetition of material. It is not necessary to use all the recommended headings in every evaluation report.

Evaluations often contain more than one type of evaluation, as in this example (e.g. user observation and subsequent user survey). The content elements for each section of an evaluation report are determined by the type(s) of evaluation conducted. As a result, this example evaluation report includes the content elements for both observing user behaviour and user survey. The "user survey" in this example uses the verbal comments made by participants as they completed each test as the basis for evaluation. The data, results and recommendations from both types of evaluation are presented together in the relevant sections of the report.

Within this example, the footnotes identify the conformance tables within this International Standard that apply to that section and demonstrate that the example report does include all the required elements for conformance.

This Annex has been adapted from an evaluation report provided by UXQB — The International Usability and User Experience Qualification Board. The website names and other identifying information (such as figures and screen shots) have been altered or eliminated for anonymity.

NOTE  This example is not a complete report. It does not include all the findings or the figures which are mentioned within the example.

B.2 rentmytruck website usability test report

B.2.1 Executive summary 6)

The running version of www.rentmytruck.com was usability tested in March 2011 with five members of the target group. The evaluation was based on observing user behaviour and the comments made by participants as they completed seven test tasks. Participants answered a set of questions about themselves at the end of the test. The test method was unmoderated remote usability testing, where typical users (test participants) carried out tasks while their actions and what they said were recorded. Screen and audio recordings of the test sessions were analysed by usability professionals after the test. The purpose was to determine usability strengths and weaknesses of the rentmytruck website. This report describes findings and recommendations from the test.

6) Conforms to Table 3 for observing user behaviour and user survey.


Main positive usability findings for www.rentmytruck.com:

— Back button always works without any problems

Test participants always got what they expected when they pressed the Back button in their browser. On some comparable websites, the Back button does not always work as users expect.

— The website preserves the contents of the shopping cart well

Test participants never lost an item in the shopping cart. On some comparable websites, there is a timeout that clears the shopping cart and asks participants to start over. If a timeout limit exists on this website, none of our test participants encountered it.

Main improvement areas for www.rentmytruck.com:

— Users can rent trucks without being asked about damage coverage

Several participants rented a truck without ever seeing the Damage protection page. Their rental by default included no damage coverage, and the website did not ask them to make a choice.

— Taxes, fees and total price are not shown

Participants wanted to know the full price of the rental. The website only shows what users must pay in the store excluding taxes and fees. Participants were displeased that they could not see the total price in the shopping cart. They were even more displeased when they found out that they could not even see the total price on the checkout page.

B.3 Findings 7)

B.3.1 General

B.3.1.1 Positive usability findings

— Back button always works without any problems

Test participants always got what they expected when they pressed the Back button. On some comparable websites the Back button does not always work as users expect.

— The website preserves the contents of the shopping cart well

Test participants never lost an item in the shopping cart. On some comparable websites, there is a timeout that clears the shopping cart and asks participants to start over. If a timeout limit exists on this website, none of our test participants encountered it.

B.3.1.2 Usability problem

— FAQs use slang without explanation

Examples of terms that confused test participants: Damage Waiver, Medical-life coverage, Liability coverage. User comments included:

"It would be nice if there was a little bit more to help me out with that"

"[The FAQs] didn't explain it well"

Recommendation

— Make terms that are hard to understand clickable. When users click these terms, show a pop-up that explains the meaning using commonly known terms and examples.

7) Conforms to Tables 13 and 14 for observing user behaviour and user survey.


B.3.2 Rent a Truck

B.3.2.1 Positive usability finding

— Test participants easily realized how truck price depends on truck size

See the example in Figure 1 (not included in this Annex).

B.3.2.2 Usability problem

— Total price not shown on the checkout page

Test participants were displeased that they could not see the total price in the shopping cart. They were even more displeased when they found out that they could not even see the total price on the checkout page. See Figure 3 (not included in this Annex). User comments included:

"'Due today' is $30.45. Where are they getting that from? It would be nice if they told me what this was like, what fees are being included here."

"Where is my final price?"

Recommendations:

— In the shopping cart and on the checkout page, show both what is due today and the total price of the order, including truck rental and storage rental.

— Offer prices both with and without taxes. If there is no room for both, omit the prices without taxes.

— Ask the user for the expected ending date of any storage rental included in the offer. Use the ending date to compute the total price of the storage rental.

— Make it completely clear how the website arrives at any total or subtotal. If there is insufficient room for a full explanation, add the link "explain" next to the total or subtotal. Clicking this link must display a detailed breakdown.

B.3.3 Insurance

B.3.3.1 Positive usability finding

—

B.3.3.2 Usability problem

—

B.4 Description of the object of evaluation 8)

The website that was evaluated was www.rentmytruck.com, which was the one available to the public in March 2011. The home page of the website at the time of testing is shown in Figure 11 (not included in this Annex). The target group for the website is individuals who are in possession of a valid driver's license and who want to rent a moving truck for a local or long distance move. The website also sells and rents moving supplies and moving tools. It also provides storage space.

Test participants were explicitly asked to refrain from submitting orders. Apart from this restriction, the website was fully available to them.

8) Conforms to Table 4 for observing user behaviour and user survey.


The website is intended to be used in a typical office or home desktop environment.

B.5 Purpose of the evaluation 9)

The purpose was to determine usability strengths and weaknesses of the rentmytruck website.

B.6 Evaluation method

B.6.1 Overview of the evaluation method 10)

B.6.1.1 General

The usability evaluation focused on observing user behaviour. This usability test was conducted as an unmoderated remote usability test by the company. This company specializes in unmoderated usability tests with users recruited from their user base. In an unmoderated usability test, users are not observed live while they carry out tasks. Instead, their interactions with the website and their verbal comments are audio recorded for later analysis.

Five users each carried out seven tasks on the website in separate test sessions. At the end of each test session, they answered a number of pre-defined questions about themselves.

Subsequently, the screen and audio recordings were analysed by three usability testers working independently. They each identified usability problems and strengths as well as the underlying usability defects. They then reviewed individual findings and agreed on a consolidated set of usability findings.

B.6.1.2 Methodological basis

This usability test used observation of the users' behaviour as evidenced by screen recordings. This enabled the identification of successful completion of tasks, together with problems encountered in carrying out the tasks. In addition, the recognized "think-aloud" method was used as a basis for the "user survey". This method is described, for example, in Dumas and Redish (1999): A Practical Guide to Usability Testing, and Hartson and Pyla (2012): The UX Book.

B.6.1.3 Test sessions

The evaluation is based on an analysis of screen recordings from five unmoderated test sessions of www.rentmytruck.com carried out in late March 2011 by the company, together with analysis of audio recordings of users' comments. Each test session lasted between 13 min and 23 min. The total time for each test session, including answering questions, was less than 30 min.

B.6.1.4 Target group for the system

The target group for the website is individuals who are in possession of a valid driver's license and who want to rent a moving truck for a local or long distance move. The target group includes a large part of the adult, English-speaking US population. The target group is expected to have some knowledge of computers and the internet, but they do not have to be computer professionals.

9) Conforms to Table 5 for observing user behaviour and user survey.

10) Conforms to Tables 6 and 7 for observing user behaviour and user survey.

Table B.1 — Usability test results for each test participant

Participant | Sex | Age | Occupation | Web savvy
1 | M | 24 | Missionary | Average
2 | M | 52 | Small business manager | Average
3 | F | 62 | Retired. Formerly a television news producer, then licensed paralegal. | Average
4 | F | 36 | Housewife | Average
5 | M | 31 | Sales and marketing | Average

B.6.1.5 Recruitment of users

All test participants were recruited by the company, which specializes in unattended usability testing. Both sexes were represented. Test participants were required to have a valid driver's license and to be considering renting a truck. Otherwise, no restrictions were imposed on the recruiting.

B.6.2 Usability test script 11)

B.6.2.1 Briefing

The company does not publish the instructions given to test participants ahead of a usability test session.

B.6.2.2 General instructions given to the users

Each user was given instructions online for each of the seven tasks. They were asked to carry out the task and were told to verbally describe any problems or particular issues that they noticed.

B.6.2.3 Test tasks

The test tasks were defined to test the most frequently occurring tasks on the website based on website statistics. Tasks were reviewed by a user experience specialist working for rentmytruck and by several independent usability professionals. The usability of the test tasks and the instructions for the test participants in the unattended usability test were tested in two dry-runs. Test tasks and instructions were subsequently improved based on the feedback.

The following task set was used for all sessions. The tasks were carried out in the same order by each participant:

Scenario: Your friends Mike and Anna are about to move from Pittsburgh, PA to Denver, CO. They have an apartment in Pittsburgh consisting of a living room, a bedroom, a kitchen, and a bathroom. They want to find the cheapest service for the move to Colorado. They expect to make the move themselves with some help from a few friends.

They are planning to move out on April 14th and they expect the trip to take three days. The couple plans to return to Pittsburgh after two years so they want to rent a self-storage unit in Pittsburgh for the stuff they do not need in Denver.

Task 1: The couple needs a truck that is suitable for all the furniture and belongings in their three-room apartment. Please find the total price the couple will have to pay for the truck.

NOTE  They are moving on April 14th from XXX1 Rd. in Pittsburgh, PA 15217 to XXX2 St. in Denver, CO 80218.

11) Conforms to Tables 8 and 10 for observing user behaviour and user survey.


Expected answer: According to rentmytruck, a 14 ft truck is required. The price of the truck is $1,165 plus moving insurance $196 plus environmental fee $5. Taxes are not included. The tax rate does not seem to be available from the website.

Task 2: Before you go any further, you want to check if Mike and Anna need a special driver's license to drive the truck across country. Where would you find that info?

Expected answer: An ordinary driver's license is OK according to the FAQ "Do I need a special driver's license".

Task 3: They also need an indoor storage unit in Pittsburgh that can hold 10 moving boxes (18 in × 18 in × 16 in) and a large fridge. Find the per month cost of the storage.

Expected answer: The price of storage at XXX Self Storage, 1st floor, 5 ft × 5 ft × 8 ft for 24 months is $59 a month.

Task 4: […]

Expected answer: […]

Task 5: […]

Expected answer: […]

Task 6: […]

Expected answer: […]

Task 7: […]

Expected answer: […]

B.6.2.4 Task completion

Each task was considered complete as soon as each test participant moved on to the next task. (Since no moderator was present, it was not possible to help participants if they got stuck or when they arrived at an incorrect answer.)

B.6.2.5 Post-session questions

After each test session, the test participant answered several questions including the following:

— Are you male or female?
— How old are you?
— Have you ever rented a truck from this truck company before?
— Where do you live (town, state)?
— What is your occupation?
— How would you rate your web knowledge?
— Did you complete the tests at home or in your workplace, or other (if other, please specify)?
— Which type of device did you use, desktop computer or notebook/laptop (manufacturer and model)?
— Which browser did you use (including version)?


B.7 Findings for each test participant 12)

For each test participant, the following table shows the analysis of the test participant's observed performance on each task. Tasks were assessed and scored as follows:

— The task was solved correctly without problems; scored as 1.
— Problems occurred which delayed the test participant in carrying out the task; scored as 2.
— The test participant encountered considerable problems but eventually succeeded in completing the task correctly; scored as 3.
— The test participant was unable to complete the task or arrived at a result that deviated significantly from the correct result; scored as 4.

Table B.2 — Analysis of the test participant's observed performance

Participant:                                    1    2    3    4    5

Task 1 — See notes.
Price for truck rental, Pittsburgh to Denver    3    1    4    3    3

Task 2
Check need for special driver's license         1    1    1    1    1

Task 3 — See notes.
Cost of indoor storage unit                     1    3    3    3    3

Task 4
Phone number of nearest rentmytruck location    1    1    1    1    2

Task 5 — See notes.
Rent truck and purchase moving supplies         4    4    4    4    4

Task 6 — See notes.
Liability for repair costs                      4    4    2    1    4

Task 7
Find nearest rentmytruck location               1    1    4    2    1

Notes to these findings

Problems in task 1: See the following findings in section 1.
— Insufficient help for selecting right truck size.
— Unclear if included mileage is sufficient.
— Taxes and fees are not shown.
— Total price not shown; website only shows what user must pay in store.

Problems in task 3: See the following finding in section 1.
— No help offered for selecting the right size of the storage room.

Problems in task 5: See the following findings in section 1.
— Adequate liability coverage is not included for all types of moves.
— Users can rent trucks without being asked about coverage.

12) Conforms to Table 12 for observing user behaviour and user survey.


Problems in task 6: See the following findings in section 1.
— Unclear if vandalism is covered by insurance coverage.
— Unclear if there is a deductible.
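As an editorial aside (not part of the original report), the scores in Table B.2 could be reduced to per-task summaries in the sense of 5.2.6.1; the data below simply repeat the scores shown above:

    # Scores from Table B.2: five participant scores per task.
    scores = {
        "Task 1": [3, 1, 4, 3, 3],
        "Task 2": [1, 1, 1, 1, 1],
        "Task 3": [1, 3, 3, 3, 3],
        "Task 4": [1, 1, 1, 1, 2],
        "Task 5": [4, 4, 4, 4, 4],
        "Task 6": [4, 4, 2, 1, 4],
        "Task 7": [1, 1, 4, 2, 1],
    }
    for task, values in scores.items():
        mean = sum(values) / len(values)
        solved = sum(1 for v in values if v < 4)  # a score of 4 means not completed
        print(f"{task}: mean score {mean:.1f}, completed by {solved} of {len(values)}")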


Bibliography

[1] ISO 9241 series, Ergonomics of human-system interaction
[2] ISO 26800:2011, Ergonomics — General approach, principles and concepts
[3] ISO/IEC/IEEE 15288:2015, Systems and software engineering — System life cycle processes
[4] ISO/IEC/IEEE 15289:2011 13), Systems and software engineering — Content of life-cycle information items (documentation)
[5] ISO/IEC 17000:2004, Conformity assessment — Vocabulary and general principles
[6] ISO/IEC/IEEE 24765:2010, Systems and software engineering — Vocabulary
[7] ISO/IEC 25000:2014, Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — Guide to SQuaRE
[8] ISO/IEC 25010:2011, Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — System and software quality models
[9] ISO/IEC 25040:2011, Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — Evaluation process
[10] ISO/IEC 25063:2014, Systems and software engineering — Systems and software product Quality Requirements and Evaluation (SQuaRE) — Common Industry Format (CIF) for usability: Context of use description
[11] ISO/IEC 25064:2013, Systems and software engineering — Software product Quality Requirements and Evaluation (SQuaRE) — Common Industry Format (CIF) for usability: User needs report
[12] ISO/IEC 25065 14), Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — Common Industry Format (CIF) for Usability — User Requirements Specification
[13] ISO/IEC 33020:2015, Information technology — Process assessment — Process measurement framework for assessment of process capability
[14] ISO/TS 18152:2010, Ergonomics of human-system interaction — Specification for the process assessment of human-system issues
[15] ISO/TR 18529:2000, Ergonomics — Ergonomics of human-system interaction — Human-centred lifecycle process descriptions
[16] ISO/IEC TR 11580:2007, Information technology — Framework for describing user interface objects, actions and attributes
[17] ISO/IEC TR 24774:2010, Systems and software engineering — Life cycle management — Guidelines for process description
[18] ISO/IEC TR 25060:2010, Systems and software engineering — Systems and software product Quality Requirements and Evaluation (SQuaRE) — Common Industry Format (CIF) for usability: General framework for usability-related information
[19] IEC 62366-1:2015, Medical devices — Part 1: Application of usability engineering to medical devices

13) Withdrawn. Replaced with ISO/IEC/IEEE 15289:2015.
14) Under preparation.


ICS 35.080

Price based on 37 pages
