CHAPTER 3: EVALUATION TECHNIQUE - Coggle Diagram
Main Goals of Evaluation
assess effect of interface on user
assess extent of system functionality
identify specific problems
Evaluation by User Participation
Observational techniques
Protocol analysis
audio/video transcription is difficult and requires skill
Mixed use in practice
Automated analysis
Workplace project
Post task walkthrough
Cooperative evaluation
variation on think aloud
user collaborates in evaluation
both user and evaluator can ask each other questions throughout
Post-task walkthroughs
transcript played back to participant for comment
identify reasons for actions and alternatives considered
Think Aloud
user is asked to describe what they are doing and why, and what they think is happening
user observed performing task
Query techniques
Interviews
analyst questions user on a one-to-one basis, usually based on prepared questions
informal, subjective and relatively cheap
Questionnaires
Set of fixed questions given to users
Need careful design
what information is required?
Styles of question
open-ended
e.g. ‘Can you suggest improvements to the interface?’
scalar
e.g. ‘It is easy to recover from mistakes’ (user rates agreement on a numeric scale)
multi-choice
ranked
general
establish background of user
Field studies
Conducted in the work environment or “in the field”.
Advantages
natural environment
longitudinal studies possible
context retained (though observation may alter it)
Disadvantages
noise
distractions
Appropriate
where context is crucial; for longitudinal studies
Laboratory studies
Disadvantages
difficult to observe several users cooperating
lack of context
Appropriate
if system location is dangerous or impractical; for constrained single-user systems; to allow controlled manipulation of use
Advantages
uninterrupted environment
specialist equipment available
closed (controlled) setting
Empirical Methods: Experimental Evaluation
To support a particular claim or hypothesis
It can be used to study a wide range of different issues at different levels of detail
Example: evaluating a spreadsheet package
Technique
Representative tasks
Participants
Measurements
Outline plan
Factors Considered In Experimental Design
Variables
Experiments manipulate and measure variables under controlled conditions in order to test the hypothesis
types
Independent variables (manipulated/changed)
elements of the experiment that are manipulated to produce different conditions for comparison
Examples: interface style, level of help
Dependent variables (measured)
can be measured
Example : time taken (user performance time), number of errors (accuracy)
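The independent/dependent split above can be made concrete with a small instrumentation sketch. This is a hypothetical helper, not part of the chapter: the condition label is the independent variable, while completion time and error count are the measured dependent variables.

```python
import time

def run_trial(participant, condition, task):
    """Run one experimental trial.

    'condition' is the independent variable level (e.g. interface style);
    completion time and error count are the dependent variables measured.
    """
    start = time.perf_counter()
    errors = task()  # the task callable reports how many errors occurred
    return {
        "participant": participant,
        "condition": condition,                  # independent variable
        "time_s": time.perf_counter() - start,   # dependent: performance time
        "errors": errors,                        # dependent: accuracy
    }

# Hypothetical usage: participant P01 under the "menu" condition,
# with a stand-in task that made 2 errors.
record = run_trial("P01", "menu", lambda: 2)
```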
Hypotheses
A prediction of the outcome of an experiment,
or of an expected relationship between at least two variables
Participants
should be chosen to match the expected user population as closely as possible
If participants are not actual users, they should be chosen to be of similar age and level of education as the intended user group
Consider sample size
Statistical measures
Non-parametric
Do not assume normal distribution
Less powerful
More reliable
Contingency table
Classify data by discrete attributes
Count the number of data items in each group
Parametric
Robust
powerful
Assume normal distribution
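The parametric/non-parametric distinction can be illustrated by computing both kinds of test statistic by hand. This is a sketch with made-up sample data; a real study would use a statistics package and look up p-values.

```python
from math import sqrt

def welch_t(a, b):
    """Parametric: Welch's t statistic (assumes roughly normal data)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / sqrt(va / len(a) + vb / len(b))

def mann_whitney_u(a, b):
    """Non-parametric: Mann-Whitney U counts how often values in 'a'
    beat values in 'b' (ties count half), so it uses order only and
    makes no normality assumption."""
    return sum((x > y) + 0.5 * (x == y) for x in a for y in b)

# Made-up task completion times (seconds) under two interface conditions
menu = [10, 12, 11, 13]
cmd  = [14, 15, 16, 13]
t = welch_t(menu, cmd)         # large |t| suggests a real difference
u = mann_whitney_u(menu, cmd)  # U near 0 or near len(a)*len(b) likewise
```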
Experimental design:
between subject
Each participant is assigned to a different condition
within subject
Also known as repeated measures
Each user performs under each different condition.
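The two designs differ in how participants are assigned to conditions, which a short sketch can show. The rotated ordering in the within-subjects case is a simple counterbalance against learning effects (a common practice, assumed here rather than stated in the chapter).

```python
def between_subjects(participants, conditions):
    """Between-subjects: each participant sees exactly one condition."""
    return {p: conditions[i % len(conditions)]
            for i, p in enumerate(participants)}

def within_subjects(participants, conditions):
    """Within-subjects (repeated measures): each participant sees every
    condition; orders are rotated to counterbalance learning effects."""
    n = len(conditions)
    return {p: [conditions[(i + j) % n] for j in range(n)]
            for i, p in enumerate(participants)}

conditions = ["menu", "command line"]
people = ["P01", "P02", "P03", "P04"]
between = between_subjects(people, conditions)
within = within_subjects(people, conditions)
```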
Expert Analysis
Model-based evaluation
A third expert-based approach.
example, GOMS (Goals, Operators, Methods and Selection).
GOMS is a model that predicts user performance with a particular interface and can be used to filter design options.
Goals specify what the user wants and intends to achieve.
Operators are the building blocks for describing human-computer interaction at the concrete level.
Methods are programs built with operators that are designed to accomplish goals.
Selection rules predict which method will be used.
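A worked example of model-based prediction is the Keystroke-Level Model, a simplified GOMS variant: a method is written as a string of operators, and predicted execution time is the sum of their per-operator times. The operator times below are the commonly quoted approximate values from Card, Moran & Newell; the two methods compared are hypothetical.

```python
# Approximate Keystroke-Level Model operator times (seconds)
KLM_TIMES = {
    "K": 0.2,   # keystroke (average skilled typist)
    "P": 1.1,   # point at a target with the mouse
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predicted_time(operators):
    """Predict execution time for a method written as a string of
    KLM operators, e.g. 'MPK' = think, point, click."""
    return sum(KLM_TIMES[op] for op in operators)

# Filtering two hypothetical design options for 'delete a file':
menu_method = predicted_time("MHPKPK")  # move to mouse, two menu picks
key_method  = predicted_time("MKKK")    # three-key keyboard shortcut
```

The model predicts the shortcut is faster, so a designer could rule out the menu-only design for frequent expert use before building anything.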
Cognitive Walkthrough
Negatives
Can be subject to personal ‘quirks’.
Not always easy to draw generalizable conclusions.
Danger of over-reaction.
Positives
Throws up surprises
Extremely practical and easy to implement.
Real insights gained into potential issues with new users.
identify potential problems using psychological principles
evaluates design on how well it supports user in learning task
Heuristic Evaluation
H2-1: Visibility of system status
keep users informed about what is going on
H2-2: Match between system & real world
speak the users’ language
H2-3: User control & freedom
don’t force down fixed paths
The freedom to undo any accidental actions
“exits” for mistaken choices, undo, redo
H2-4: Consistency & standards
Consistency of effects
same words, commands, actions will always have the same effect in equivalent situations
Consistency of language and graphics
same info/controls in same location on all screens/dialog boxes
Consistency of input
consistent syntax across complete system
H2-5: Error prevention
try to make errors impossible
modern widgets: only “legal commands” selected, or “legal data” entered
provide reasonableness checks on input data
H2-6: Recognition rather than recall
Computers are good at remembering things; people aren’t!
Promote recognition over recall
relies on visibility of objects to the user (but less is more!)
menus, icons, choice dialog boxes vs. command lines, field formats
H2-7: Flexibility and efficiency of use
Experienced users should be able to perform frequently used operations quickly
H2-8: Aesthetic and minimalist design
No irrelevant information in dialogues
Aesthetic and minimalist design means presenting only relevant information and removing clutter
H2-9: Help users recognize, diagnose, and recover from errors
error messages in plain language
precisely indicate the problem
constructively suggest a solution
H2-10: Help and documentation
Tutorial and/or getting started manuals
Reminders
Reference manuals