Goal: Get a little practice in creating a flow model for an enterprise.
Activities:
• Follow up on the initial flow model sketch you made in Exercise 4-1.
• Again represent each work role or system entity as a node in the diagram.
• Use arcs between nodes to show all communication and coordination necessary to do the work of the enterprise.
• Use arcs to represent all information flow and flow of physical artifacts.
• Include all forms of communication, including direct conversations, email, phone calls, letters, memos, meetings, and so on.
• Include both flow internally within the enterprise and flow externally with the rest of the world.
Deliverables: One flow model diagram for your system, with as much detail as feasible.
Schedule: This could take a couple of hours.
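If it helps to think about the node-and-arc structure in concrete terms before you draw, here is a minimal sketch in Python of how a flow model can be captured as nodes (work roles and system entities) and labeled arcs (flows). The roles and flows named below are hypothetical examples, not part of the exercise.

# Minimal sketch: a flow model as nodes (work roles / system entities)
# and labeled arcs (information or artifact flow). All names are hypothetical.
nodes = {"customer", "ticket_agent", "event_manager", "ticket_database", "advertiser"}

# Each arc: (source, destination, what flows and how it is communicated)
arcs = [
    ("customer", "ticket_agent", "ticket request (in person or by phone)"),
    ("ticket_agent", "ticket_database", "seat availability query"),
    ("ticket_database", "ticket_agent", "seat availability results"),
    ("ticket_agent", "customer", "printed tickets (physical artifact)"),
    ("event_manager", "ticket_agent", "event schedules (email, memos)"),
    ("advertiser", "event_manager", "ad copy (external flow)"),
]

for src, dst, flow in arcs:
    print(f"{src} -> {dst}: {flow}")

Listing the arcs this way is only a thinking aid; the deliverable is still the drawn diagram.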
Exercise 6-6: Hierarchical Task Inventory for Your System
Goal: Get some practice creating a hierarchical task inventory diagram.
Activities: Using your task-related contextual data notes, make a simple hierarchical task inventory diagram for your system.
Deliverables: Simple HTI diagram(s) for the system of your choice.
Schedule: An hour should be enough to get what you need from this exercise.
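If a concrete starting point helps, the following minimal sketch (Python, with hypothetical task names) shows one way to rough out a hierarchical task inventory as a nested structure before drawing the diagram; the indentation of the printed output mirrors the levels of the HTI.

# Minimal sketch: a hierarchical task inventory as a nested dict.
# Task names are hypothetical, for illustration only.
hti = {
    "Buy a ticket": {
        "Find an event": {
            "Browse by category": {},
            "Search by keyword": {},
        },
        "Select seats": {},
        "Pay for tickets": {
            "Enter payment information": {},
            "Confirm purchase": {},
        },
    }
}

def print_hti(tasks, depth=0):
    """Print the task hierarchy with indentation showing levels."""
    for task, subtasks in tasks.items():
        print("  " * depth + task)
        print_hti(subtasks, depth + 1)

print_hti(hti)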
Exercise 6-7: Usage Scenarios for Your System
Goal: Get some practice in writing usage scenarios.
Activities:
• Select one or two good representative task threads for the most interesting user class, for example, the customer.
• Write a couple of detailed usage scenarios, referring to user roles, tasks, actions, objects, and work context.
• Work quickly; you can clean it up as you go.
Hints and cautions: Do not worry too much about the design yet; we will get to that.
Deliverables: A few usage scenarios to share and discuss.
Schedule: An hour should be enough time for this one.
Exercise 6-8: Design Scenarios for Your System
Goal: Get some practice in writing design scenarios.
Activities:
• For the same usage scenarios you wrote in the previous exercise, write a couple of detailed design scenarios, again referring to user roles, tasks, actions, objects, and work context.
• Make up anything you need about the design on the fly.
• Do this quickly; you can clean it up as you go.
Deliverables: A few design scenarios to share and discuss.
Schedule: An hour should be enough time for this one.
Exercise 6-9: Identifying Information Objects for Your System
Goal: Get a little practice in identifying information objects for a system.
Activities:
• Review the ontology of your system.
• Identify the entities within your application that are operated on by users: searched and browsed for, accessed and displayed, modified and manipulated, and stored back again.
• Sketch an outline or list of these information objects, their attributes, and the relationships among them.
Deliverables: The list just described.
Schedule: A half hour should do it.
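As a concrete illustration of the kind of list this exercise asks for, the minimal sketch below records a few hypothetical information objects, their attributes, and their relationships in Python. The object and attribute names are invented for illustration; a plain paper outline works just as well.

# Minimal sketch: information objects, attributes, and relationships.
# Object and attribute names are hypothetical examples only.
information_objects = {
    "Event": {
        "attributes": ["title", "date", "venue", "ticket price"],
        "relationships": ["has many Tickets", "held at one Venue"],
    },
    "Ticket": {
        "attributes": ["seat number", "price", "purchase date"],
        "relationships": ["belongs to one Event", "bought by one Customer"],
    },
    "Customer": {
        "attributes": ["name", "email", "payment method"],
        "relationships": ["owns many Tickets"],
    },
}

for name, details in information_objects.items():
    print(name)
    print("  attributes:   ", ", ".join(details["attributes"]))
    print("  relationships:", "; ".join(details["relationships"]))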
CHAPTER 7 EXERCISES
Exercise 7-1: Creating a User Persona for Your System
Goal: Get some experience at writing a persona.
Activities:
• Select an important work role within your system. At least one user class for this work role must be very broad, with the user population coming from a large and diverse group, such as the general public.
• Using your user-related contextual data, create a persona, give it a name, and get a photo to go with it.
• Write the text for the persona description.
Deliverables: One- or two-page persona write-up.
Schedule: You should be able to do what you need to learn from this in about an hour.
Exercise 7-2: Practice in Ideation and Sketching
Goal: To get practice in ideation and sketching for design.
Activities:
• Doing this in a small group is strongly preferable, but you can do it with just one other person.
• Get out blank paper, appropriately sized marking pens, and any other supplies you might need for sketching.
• Pick a topic: a system or device. Our recommendation is something familiar, such as a dishwasher.
• Start with some free-flow ideation about ways to design a new and improved concept of a dishwasher. Do not limit yourself to conventional designs.
• Go with the flow and see what happens.
• Remember that this is an exercise about the process, so what you come up with for the product is not that crucial.
• Everyone should sketch the ideas that arise about a dishwasher design as the ideation proceeds.
• Start with design sketches in the ecological perspective. For a dishwasher, this might include your dining room, kitchen, and the flow of dishes in their daily cycle. You could include something unorthodox: sketch a conveyor belt from the dinner table through your appliance and out into the dish cabinets. Sketch how avoiding the use of paper plates can save resources and keep trash out of landfills.
• Make some sketches from the interaction perspective showing different ways you can operate the dishwasher: how you load and unload it and how you set wash cycle parameters and turn it on.
• Make sketches that project the emotional perspective of a user experience with your product. This might be more difficult, but it is worth taking some time to try.
• Ideate. Sketch, sketch, and sketch. Brainstorm and discuss.
Deliverables: A brief written description of the ideation process and its results, along with all your supporting sketches.
Schedule: Give yourself enough time to really get engaged in this activity.
Exercise 7-3: Ideation and Sketching for Your System
Goal: More practice in ideation and sketching for design.
Activities: Do the same as you did in the previous exercise, only this time for your own system.
CHAPTER 8 EXERCISES
Exercise 8-1: Conceptual Design for Your System
Goal: Get a little practice in initial conceptual design.
Activities:
• Think about your system and contextual data and envision a conceptual design, including any metaphors, in the ecological perspective. Try to communicate the designer's mental model, or a design vision, of how the system works as a black box within its environment.
• Think about your system and contextual data and envision a conceptual design in the interaction perspective. Try to communicate the designer's mental model of how the user operates the system.
• Finally, think about your system and contextual data and envision a conceptual design in the emotional perspective. Try to communicate a vision of how the design elements will evoke emotional impact in users.
Deliverables: Brief written descriptions of your conceptual design in the three perspectives and/or a few presentation slides of the same to share with others.
Schedule: You decide how much time you can afford to give this. If you cannot do this exercise in all three perspectives, just pick one, perhaps the ecological perspective.
Exercise 8-2: Storyboard for Your System
Goal: Get a little practice in sketching storyboards.
Activities:
• Sketch storyboard frames illustrating narrative sequences of action in each of the three perspectives.
• Include things like these in your storyboards:
  - Hand-sketched pictures annotated with a few words
  - All the work practice that is part of the task, not just interaction with the system; for example, include telephone conversations with agents or roles outside the system
  - Sketches of devices and screens
  - Any connections with system internals, for example, flow to and from a database
  - Physical user actions
  - Cognitive user actions in "thought balloons"
  - Extra-system activities, such as talking with a friend about what ticket to buy
• For the ecological perspective, illustrate high-level interplay among human users, the system as a whole, and the surrounding context.
• In the interaction perspective, show screens, user actions, transitions, and user reactions.
• Use storyboards in the emotional perspective to illustrate deeper user experience phenomena such as fun, joy, and aesthetics.
Schedule: You decide how much time you can afford to give this. If you cannot do this exercise in all three perspectives, just pick one, perhaps the ecological perspective.
CHAPTER 9 EXERCISES
Exercise 9-1: Intermediate and Detailed Design for Your System
Goal: Get some practice in developing a few parts of the intermediate and detailed design.
Activities:
• If you are working with a team, get together with your team.
• Choose just one principal work role for your system (e.g., the customer).
• Choose just one key task that work role is expected to perform.
• For that work role and task, make a few illustrated scenarios to show some of the associated interaction.
• Sketch some screen layouts to support your scenarios, along with some representation of the navigational structure.
• Go for a little depth, but not much breadth.
• Make a few annotated wireframes for the same scenarios.
Hints, cautions, and assumptions:
• Do not get too involved in design guidelines issues yet (e.g., icon appearance or menu placement).
• Control time spent arguing; learn the process!
• Base your screen designs on the contextual analysis and design you have done so far.
Deliverables: Just the work products that naturally result from these activities.
Schedule: Whatever you can afford. At least give it an honest try.
CHAPTER 10 EXERCISES
Exercise 10-1: Identifying User Experience Goals for Your System
Goal: A little experience in stating user experience goals.
Activities: Review the WAAD and user concerns in the social model for the system of your choice, noting user or customer concerns relating to user experience goals.
Deliverables: A short list of user experience goals for one user class of the system of your choice.
Schedule: A half hour or so (it should be easy by now).
Exercise 10-2: Creating Benchmark Tasks and UX Targets for Your System
Goal: To gain experience in writing effective benchmark tasks and measurable UX targets.
Activities:
• We have shown you a rather complete set of examples of benchmark tasks and UX targets for the Ticket Kiosk System. Your job is to do something similar for the system of your choice.
• Begin by identifying which work roles and user classes you are targeting in evaluation (a brief description is enough).
• Write three or more UX target table entries (rows), including your choices for each column. Have at least two UX targets based on a benchmark task and at least one based on a questionnaire.
• Create and write up a set of about three benchmark tasks to go with the UX targets in the table.
  - Do NOT make the tasks too easy.
  - Make the tasks increasingly complex.
  - Include some navigation.
  - Create tasks that you can later "implement" in your low-fidelity rapid prototype.
  - The expected average performance time for each task should be no more than about 3 minutes, just to keep things short and simple for you during evaluation.
• Include the questionnaire question numbers in the measuring instrument column of the appropriate UX target.
Cautions and hints:
• Do not spend any time on design in this exercise; there will be time for detailed design in the next exercise.
• Do not plan to give users any training.
Deliverables:
• Two user benchmark tasks, each on a separate sheet of paper.
• Three or more UX targets entered into a blank UX target table on your laptop or on paper.
• If you are doing this exercise in a classroom environment, finish up by reading your benchmark tasks to the class for critique and discussion.
Schedule: Work efficiently and complete in about an hour and a half.
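If you want a concrete picture of what filled-in UX target table rows can look like, the minimal sketch below represents two hypothetical rows as Python dictionaries. The column names follow the usual work role and user class, UX goal, UX measure, measuring instrument, UX metric, baseline, target, and observed-results pattern; every value here is invented for illustration only.

# Minimal sketch: hypothetical UX target table rows (values are illustrative only).
ux_targets = [
    {
        "work_role_user_class": "Ticket buyer (casual public user)",
        "ux_goal": "Walk-up ease of use",
        "ux_measure": "Initial user performance",
        "measuring_instrument": "Benchmark task 1 (buy a movie ticket)",
        "ux_metric": "Average time on task",
        "baseline": "3 min",
        "target": "2.5 min",
        "observed": None,  # filled in during evaluation
    },
    {
        "work_role_user_class": "Ticket buyer (casual public user)",
        "ux_goal": "Initial satisfaction",
        "ux_measure": "First impression",
        "measuring_instrument": "Questionnaire, questions 1-5",
        "ux_metric": "Average rating (1-10 scale)",
        "baseline": "6",
        "target": "8",
        "observed": None,
    },
]

for row in ux_targets:
    print(f"{row['ux_measure']}: target {row['target']} "
          f"(instrument: {row['measuring_instrument']})")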
CHAPTER 11 EXERCISES
Exercise 11-1: Building a Low-Fidelity Paper Prototype for Your System
Goal: To obtain experience with rapid construction of a low-fidelity prototype for early stages of user interaction design and to have a real paper prototype to generate lots of critical incidents later in your evaluation exercise.
Activities: This should be one of your most fun exercises, but it can also be a lot of work.
• Following the guidelines for paper prototype construction given in Section 11.6.5, build a paper prototype for your system or product design.
• Make sure that the prototype will support at least the benchmark tasks whose descriptions you wrote in the previous exercise.
• Add in some other "decoy" interaction design "features," widgets, and objects so that the prototype does not look tailored to just your benchmark tasks.
Hints and cautions:
• It is normal for you to have to do more design work during this exercise, to complete details that were not fully designed in previous exercises.
• Remember: You are learning the process, not creating a perfect design or prototype.
• Assuming you are doing this as a team: Get everyone on your team involved in drawing, cutting, taping, and so on, not just one or two people.
• You will be done much faster if everyone pitches in.
• This is not art class, so do not worry too much about straight lines, exact details, etc.
• Pilot test to be sure it will support your benchmark tasks for evaluation.
Deliverables: A right smart "executable" paper prototype that will support your benchmark tasks in user experience testing, and your pilot tests passed with flying colors (no monochromatic flying).
Schedule: Just git 'er done. It could take several hours, but it is essential for all the exercises that follow.
CHAPTER 13 EXERCISES
Exercise 13-1: Formative UX Inspection of Your System
Goal: Get a little practice in doing a UX inspection.
Activities:
• Unless you have another prototype, use the paper prototype you built in the previous exercise. If your paper prototype is not suitable for an effective exercise in UX inspection, select an application or an appropriate Website as the target of your inspection.
• Perform a UX inspection as described in Chapter 13.
• If you are working with a team, use the team approach described in Chapter 13.
Deliverables: A list of UX problems identified by your UX inspection.
Schedule: An hour and a half.
CHAPTER 14 EXERCISES
Exercise 14-1: Formative UX Evaluation Preparation for Your System
Goal: To get some practice in preparation for a simple empirical evaluation.
Activities:
• If you are working with a team, get together with your team.
• Decide roles for team members. Include at least a facilitator and a prototype executor, plus a quantitative data recorder and one or more critical incident recorders.
• In addition, if you are doing this exercise in a classroom with other teams, assign two team members as participants to trade to another team when you start data collection in the next exercise.
• The prototype executor should get out the paper prototype you made in Exercise 11-1 and make sure the prototype works without breaking.
• If you developed a programmed prototype, everything will be the same except that you will not need a prototype executor. You will, instead, need someone to make sure the prototype hardware and software are set up, installed, and running properly for evaluation.
• This activity works well for a team of about four. If you have more or fewer members in your team, it is easy to make adjustments. If there are only two of you, for example, one person can be the executor and the other person can record critical incidents and time the benchmark tasks. If there are four or five of you, the extra people will be valuable in helping record critical incidents. If you have been working alone on all the previous exercises, you may want to find a couple of other people to help you run the evaluation. In addition, and in any case, you need to recruit two people to serve as participants to evaluate your prototype.
• Get out the UX target table you made in Exercise 10-2.
• Have at least two benchmark tasks that you created in Exercise 10-2, each written on a separate piece of paper.
• Assuming you will use a questionnaire for subjective data in your evaluation session, get out copies of the questionnaire, one for each participant you will be using, and circle the questions you want participants to answer.
• Review your evaluation protocols.
Deliverables: Have everything just mentioned ready for the next exercise, data collection.
Schedule: It should not take too long to get ready for evaluation.
CHAPTER 15 EXERCISES
Exercise 15-1: UX Evaluation Data Collection for Your System
Goal: To get a little practice in the data collection part for a very simple formative UX evaluation using a paper prototype.
Activities: This is perhaps the most fun and most rewarding of all the exercises when you finally get to see some users in action with your interaction design.
• New team formation:
  - This is described in terms of multiple teams in a classroom setting. For other setups, make appropriate adjustments.
  - After all the teams are gathered and sitting around a table, make the switch of participants with another team.
  - Send the two people in the participant role from your team to another team. Curb the potential confusion here by doing the swap in an orderly, circular fashion among the teams.
  - You will now have new participants from a different team who are unfamiliar with your design. These new participants are now permanently on your team for the rest of these exercises, including data collection, analysis, and reporting.
  - As an alternative, if you do not have multiple teams, try recruiting a couple of co-workers or friends as participants.
• Sitting together in your newly formed teams, get out your UX target table form, your benchmark task descriptions, and your questionnaires.
• Dismiss your two participants (the new team members you just got) to the hallway or other waiting area.
• Data collection:
  - Assemble and boot up your prototype, per the instructions in Section 15.3.6.
  - Call your first participant into the "lab," greet the participant, and explain the evaluation session.
  - Have this first participant perform your first benchmark task for your objective UX targets.
  - Have the participant read the first benchmark task aloud.
  - Ask the participant to perform that task while thinking aloud.
  - The executor moves prototype parts in response to participant actions.
  - The facilitator directs the session and keeps it moving.
  - The timer(s) write down or enter timing and error count data as indicated in the UX targets while the user performs the task (do not count the participant's reading aloud of the task in the task timing).
  - Everyone else available should take notes on critical incidents and UX problems.
  - Remember the rules about not coaching or anticipating user actions. And the computer may not speak!
  - Have this first participant perform your second benchmark task for your objective UX targets.
  - Have the participant read the second task aloud and perform it while thinking aloud.
• How much data to collect?
  - You need to collect a dozen or more critical incidents in this overall exercise (i.e., from both participants doing both benchmark tasks).
  - If you do not get at least a half dozen from each participant, continue with that participant doing exploratory use of your prototype until you get enough critical incidents.
  - For example, have them browse through each screen, looking at each object (button, menu, etc.), commenting on and giving their opinion about the quality of the user experience relating to various features.
• Have this participant complete your questionnaire and then give them their "reward."
• Keep your first participant as a new team member for the rest of the session to help with observations.
• Bring in the second participant and run the same session again.
Deliverables: All your data.
Schedule: Complete by end of class (about an hour and a half, if you are efficient).
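If your data recorders want a ready-made shape for what they write down, the minimal sketch below (Python, with hypothetical field names and example values) shows one simple way to structure the timing, error count, and critical incident records collected during a session.

# Minimal sketch: simple records for session data (field names are hypothetical).
timing_records = []      # one entry per participant per benchmark task
critical_incidents = []  # free-form observer notes

def record_task(participant, task, time_sec, errors):
    """Log time-on-task and error count for one benchmark task."""
    timing_records.append(
        {"participant": participant, "task": task,
         "time_sec": time_sec, "errors": errors}
    )

def record_incident(participant, task, description):
    """Log a critical incident observed during the task."""
    critical_incidents.append(
        {"participant": participant, "task": task, "description": description}
    )

record_task("P1", "Benchmark task 1", 142, 2)
record_incident("P1", "Benchmark task 1",
                "Hesitated at the payment screen; expected a 'Back' button.")
print(len(timing_records), "timed tasks;", len(critical_incidents), "critical incidents")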
CHAPTER 16 EXERCISES
Exercise 16-1: UX Data Analysis for Your System
Goal: To get some practice with the analysis part of a very simple formative UX evaluation.
Activities:
• If you are working with a team, get together with your team, including any new participants you picked up along the way.
• Fill in the "Observed results" column of the UX target table.
• Together, as a team, compile and compare the quantitative results to determine whether your UX targets were met.
• Review your raw critical incident notes and write a UX problem list.
• Organize the UX problem list and perform cost-importance analysis:
  - Using a paper cost-importance table or laptop spreadsheet, list a dozen or more UX problems from critical incidents.
  - Assign an importance (to fix) rating to each observed problem.
  - Propose solutions (without doing all the work of redesign).
  - Group together any related problems and list them as a single problem.
  - Assign cost values (in person-hours) to each solution.
  - Compute priority ratios.
• Compile your results:
  - Move your "Must fix" problems to the top of your cost-importance table.
  - Sort the remaining problems by decreasing priority ratio to determine the priority rank of the UX problems.
  - Fill in the cumulative cost column.
  - Assume a hypothetical value for available time resources (something to make this exercise work).
  - Draw the cutoff, the line of affordability.
  - Finalize your "management" decisions (resolutions) about which changes to make now and which in the next version.
Deliverables:
• Summary of quantitative results, written in the "Observed results" column of your UX target table form (for comparison with UX targets).
• List of raw critical incidents.
• Cost-importance table form containing three UX problems selected as interesting to present to the class or your work group (complete across all three rows).
• Choose someone to give a brief report on your evaluation results.
Schedule: Given the simplicity of the domain, we expect this exercise to take about 30 to 60 minutes.
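If the arithmetic behind priority ratios, cumulative cost, and the line of affordability is not clear, the minimal sketch below works through it in Python with invented numbers; the problems, importance ratings, cost values, and the available-resource figure are all hypothetical.

# Minimal sketch: cost-importance analysis with invented numbers.
# priority ratio = importance / cost; "M" marks Must-fix problems.
problems = [
    {"problem": "Confusing label on Buy button",   "importance": "M", "cost_hours": 2},
    {"problem": "No feedback after seat selection", "importance": 5,   "cost_hours": 8},
    {"problem": "Search results hard to scan",      "importance": 3,   "cost_hours": 4},
    {"problem": "Help text uses jargon",             "importance": 2,   "cost_hours": 1},
]

must_fix = [p for p in problems if p["importance"] == "M"]
others = [p for p in problems if p["importance"] != "M"]
# Sort the remaining problems by decreasing priority ratio.
others.sort(key=lambda p: p["importance"] / p["cost_hours"], reverse=True)

available_hours = 12  # hypothetical resource limit (sets the line of affordability)
cumulative = 0
for p in must_fix + others:
    cumulative += p["cost_hours"]
    resolution = "fix now" if cumulative <= available_hours else "defer to next version"
    print(f"{p['problem']}: cost {p['cost_hours']}h, cumulative {cumulative}h -> {resolution}")

Must-fix problems go to the top regardless of ratio; everything whose cumulative cost falls above the available-resource line is resolved as "fix now," and the rest is deferred.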
CHAPTER 17 EXERCISES
Exercise 17-1: Formative Evaluation Reporting for Your System
Goal: Write a report of the formative UX evaluation you did on the system of your choice.
Activities:
• Report on your informal summative evaluation results using a table showing UX targets, benchmark tasks, questionnaires, and so on used to gather data, along with target values and observed values.
• Add brief statements about whether or not each UX target was met.
• Write a full report on a selected subset (about half a dozen) of UX problems found in the qualitative part of your formative UX evaluation. Follow the guidelines in this chapter regarding content, tone, and format, being sure to include redesign proposals for each problem.
• Report on the results of your cost-importance analysis, including problem resolutions, for all the problems you reported previously and, if appropriate, some others for context.
Deliverables: Your formative evaluation report.
Schedule: We expect this exercise to take about an hour.
Index
Note: Page numbers followed by b indicate boxes, f indicate figures and t indicate tables.
A
Abridged methods
contextual analysis process, 157-159 contextual inquiry process, 120
cost-importance analysis, 589
designer-ability-driven models, 247
design-informing models, 246-248
hybrid of WAAD and other models, 247-248 on-the-fly modeling during interviews, 248
Active learning, 887
Affinity diagrams, 159-160 Affordances
cognitive, 650
characterization, 651-652
role of, 646
use of, 646, 647
concept, 643, 653-654
definition, 644
description, 643
emotional, 660-661 false cognitive
dial marks, power settings, 657, 657f door sign, 655, 655f
form, insurance company, 656, 656f misdirection, 657, 657f
radio switch, 656, 656f Web page links, 656, 656f
functional, 649
physical, 647
sensory, 647-649, 653
role of, 647-648
text legibility, 647-648
summary, 649
types, 644, 649, 652
user-created
adhesive label, 660, 660f
cobbled design modifications, 658 copier darkness settings, 659, 659f cup-holder artifact, 659, 659f
glass door, 658-659, 658f physical, 658
road sign, 660, 660f sidewalk patterns, 657-658
After Scenario Questionnaire (ASQ), 450 Agile SE methods
characteristics communication, 621
description, 620
goals, 621
practitioners, 621
principle, 621 planning
controlling scope, 623-624 customer stories, 622-623, 622f story-based, 623, 623f
sprints
acceptance test creation, 624 acceptance testing and deployment, 625 code testing, 625
description, 624
implementation coding, 625 unit code test creation, 624
Alpha and beta testing, 490-491 Amazon's Kindle(tm), 308, 325-328, 788
Ambient computing, 331
The American National Standards Institute (ANSI), 598 Apple's iPad(tm), 788
Artifact model, 72b
ASQ. See After Scenario Questionnaire AttrakDiff, questionnaires
administration, 454
alternatives, 457
description, 454
employment, 455
factors, 454
list, 454, 455, 455t
use, 454
variation, 454, 456t
versions, 454
word choices and terminology, 456-457 Automatic UX evaluation method, 492
B
Benchmark tasks content, construction
ambiguities, 370 ecological validity, 373b
midtask change, intention, 373 parameters, 372
rubrics for special instructions, 373 start and end points, timing, 371-372 task script, 373-374
words, usage, 371
work context and usage-centered wording, 371 degraded modes, 369
description, 366
designer questions, 366-367
design scenarios, 368
engineering judgment, 375
error recovery, 369
initial user performance, 368 navigation, 368
power users, 369
selection, 367
establishing targets, 370
Ticket Kiosk System example first impression, 374, 375t initial performance, 374
measuring instrument, 374, 374t objective, measure, 365t, 374
typing skills, 368
using combinations, 368-369 user tasks, spectrum, 367
Borland's 3-D Home Architect(tm), 705 Brainstorming
breakout groups, 280 contextual analysis process, 282 "deep dive" approach, 281-282 description, 280
Kiva, 281, 281f
physical mock-ups, 281, 281b rules of engagement, 282 sub-teams, 280-281
C
Calendar Management System, 739-740 Choosing process instance
instantiation, 60
mapping project parameters to process parameter, 63-64
process parameters, 63
project parameters, 61-62 Chunking
description, 697
grouping and recoding, 698
phone number and Miller estimate, 698 trick case, 698
CIF. See Common Industry Format Cognitive affordance
clarity, 719
complete information, 744-745
long labels, 744 consistency
continue/retry, 725
Find dialogue box, Microsoft Word, 729, 729f Nero Express, 727
presentation, 716-717, 717f
problems, 730
Select Pay Stub year, 726, 727 View Pay Stub Summary, 726, 726f
controlling complexity
airline departure board, 733, 734f decomposing, 731
instructions, bewilderment, 731, 731f layout and spatial grouping, 731 light and call switch, flight, 735
push-button controls, washing machine, 734, 735f Search button, 733
distinguishability, choices elimination process, 723 tragic airplane crash, 723
error recovery feedback, 749
undo actions, 749
error avoidance, 746-748 existence
feed-forward, 710
Microsoft PowerPoint, 710, 710f, 711f requirements, 708-709
legibility, 714 noticeability
log-in boxes, 714
status lines, 714
users' attention, 713-714
precise wording, 719-723
presentation complexity, 714-715
sensory needs, 712 timing, presentation
paper towel dispenser, 715, 715f pasting, 715
user choices and useful defaults current date, 736
planning events, calendar system, 737 tragic choice, defaults, 736
visibility invisible, 712
store user, deodorant, 712
Cognitive directness heater control, car, 743
knob arrangement, cook top, 741 Macromedia Dreamweaver(tm), 741
natural mapping, 740-741 rotation, graphical object, 741
Cognitive load theory, 699 Common Industry Format (CIF)
reporting formal summative UX evaluation results ANSI, 598
NIST, 597
requirements, 598
reporting qualitative formative results, 595 Comparative Usability Evaluation (CUE) series, 497 Complex interaction
complex work domain, 66-69 simple work domain, 70-71
Computer-printed paper prototypes, 408
Computer System Usability Questionnaire (CSUQ), 450 Conceptual design
description, 305
domain-complex systems, 305-306 ecological perspective
Amazon Kindle, 308
description, 308
iPods and iTunes, 308-311 emotional perspective
description, 312
designer work flow and connections, 312, 312f ideation and sketching, 305
interaction perspective description, 311
Microsoft Outlook, 311-312
use, animation, 311
metaphors (see Metaphors, conceptual design) screen designs and widgets, 305
Ticket Kiosk System example
communicating and social networking, 314, 316f communication connection, 314, 315f
early, 314, 314f
feature, 314, 315f
Conceptual design (Continued )
immersion, emotional perspective, 313, 313f interaction perspective, 316, 317f
Wheel lifecycle template, 299, 300f Constructing design-informing models
abridged methods
creation, on-the-fly modeling, 248 designer-ability-driven models, 247 hybrid of WAAD and other models,
247-248
selective, 246
barriers, work practice, 185-186, 242-244 contextual inquiry and analysis, 184-185 data, 184
exercises, 6-7, 8-9
extract inputs, 184
model consolidation, 244-245
MUTTS (see Middleburg University Ticket Transaction Service (MUTTS))
second span bridge (see Second span bridge) slideshow presentations, 186
software use cases, 248-249 sources, 246
Ticket Kiosk System (see Ticket Kiosk System) usage models
flow model, 209-215
hierarchical task inventory, 216-219 information object model, 232-235 task interaction models, 219-232 task models, 216
user models personas, 209
social models, 196-209
work roles, 187-190
Web accessability (see Web accessability) Wheel lifecycle template, 181, 182f work environment models
artifact model, 235-238
physical model, 238-242 Contextual analysis
abridged contextual analysis process, 157-159
affinity diagrams, 159-160
data interpretation, 130, 132
exercises, 3-5
flow model and work roles identification, 133-134
managing complexity, 133
sketching, 134-135 MUTTS
flow model, sketching, 135b work activity note synthesis, 141b work role, 134b
raw work activity data, 131
WAAD (see Work activity affinity diagram (WAAD))
Wheel lifecycle template, 129, 130f
work activity notes, creating and managing anticipated data bins, 143-144
interview and observation, 136 printing, 144
raw user work activity data, 136-137 synthesization, 137-143
work roles, 132 Contextual inquiry
abridged contextual inquiry process, 120 activity theory, 125
application MUTTS, 94-95
organizational context, 95 Ticket Kiosk System, 95-96
contextual user studies
design and iteration, 124-125 measures, 125
problem, 123-124
cross-cultural user-experience design (see Cross- cultural user-experience design)
data-driven vs. model-driven inquiry, 121-125
domain complex systems (see Domain complex systems)
emotional aspects, work practice, 120 ethnography, 126
exercises, 2-3
existing system/new system, 93-94 goals, 99
MUTTS, user data gathering, 116b observing and interviewing, 91-93, 92f participatory design, 127
people's work practice, 89-90 SnakeLight, 93
system concept statement, 96-98
task analysis/marketing survey, 90-91 Ticket Kiosk System, system concept
statement, 97b voting process, 88-89
Wheel lifecycle template, 87-88, 88f work, 91
work activities, 91
work practice, 91
Cooperative user-system task performance path variations
description, 671
interaction cycle task context instances, 673, 673f multiuser task, 671-672
secondary tasks and intention shifts, 672 stacking and restoring task context, 672
primary tasks description, 670
environment, system/users, 671
user-initiated, 671
Cost benefit and business case analysis, UX casting, net, 841-842
correcting, usability problem, 845 politics and business
champion, 852
credibility, 853-854
engineering, 850
inertial resistance, 853
investment, 850-852
marketing, 852
reward policies, 852
selling, process, 849-850
poor usability, 844-845
ROI, 845-847
savings, 847-848
strategic planning, 848
system development process, 843 techniques, 842
Cost-importance analysis abridged, 589
cost to fix (see Cost to fix) cumulative cost, 584 importance to fix, 577-578 line of affordability, 584-586
multiple problem solutions, 588
priorities, emotional impact problems, 589 priority rankings, 582-584
priority ratio, 581-582
problem groups straddling, 576, 589 solutions
photo album problem, 579 Ticket Kiosk System, 579
Cost to fix
actual vs. predicted costs, 581 cost values, 580
Critical incident
data collection mechanisms comments, 546-547 manual note taking, 547 markers, 546
raw data filtration, 546, 547f video recording, 546
information collection, 545-546 Critiquing vs. ideation
designers, cross-disciplinary team, 275 implementation constraints, 274-275
low-fidelity prototypes, 275
Cross-cultural user-experience design cautions, considerations and developments,
105-106
culture models and dimensions, 105 cultures, 104, 106
localization, 104, 105
CSUQ. See Computer System Usability Questionnaire Cultural conventions, 650-651
D
Data collection techniques critical incident identification
capture and document, 439 description, 436-437
evaluator's recognition, 439-440, 439f formative evaluation, objective, 436 form of, 437
notable indication, 437
observance, 437
optimum time, 440
origins, technique, 438
relevance, 437
self-reporting mechanism, 438-439
software tools, 438
variation, 438 emotional impact
AttrakDiff, questionnaires, 454-458
bio-metrics, 459-460
description, 452-453
indicators, 453
physiological responses, observation, 458-459 self-reported indicators, 453
phenomenological aspects, 460-464 questionnaires (see Questionnaires, UX evaluation) think-aloud (see Think-aloud technique)
Data-driven inquiry, 107b Data value formats, 721-723 Design
creative activity, 252
design-thinking, 256-258
engineering, 253-254
human information processing (HIP), 254-256 paradigms, 253-259
ideation (see Ideation) perspectives
ecological, 261
emotional, 261-264
interaction, 261 phenomenology
HCI, 294, 295
hermeneutics, 295
humanistic studies, 294
method, 294
personal engagement and attachment, 291 presence, 295-296
usage and interaction, 296-297 sketching, 284-291
thinking, 256-258
visual appeal, emotion and usability alarm management system, 262, 263f calm computing, 263
mission-critical system, 264f snap decisions, 262
Design of Everyday Things. See The Design of Everyday Things
Designer's mental models description, 300 ecological perspective
description, 301
thermostat example, 302
emotional perspective, 303 interaction perspective
thermostat example, 302, 303
description, 302
mapping, 300, 301f Design guidelines, UX
accommodating, user differences, 800-801 anthropomorphism
avoidance, 794-795
direction-finding tasks, 796
feedback, 795
user-computer dialogue, 796 assessment
feedback (see Feedback, interaction cycle) information displays, 786-789
system response, 773-774 consistency
absolute, 792-793
innovation, 793
structural, 791-792
gratuitous graphics, 799
GUIs, 693-694
help, 801
human-computer interaction (HCI), 693 human memory
chunking, 697-698
cognitive load, 699
long-term memory, 700
muscle memory, 701-702 recognition vs. recall, 700-701 sensory memory, 697
shortcuts, 701
short-term/working memory, 697
stacking, 699
humor, 793
interaction cycle, parts, 702, 702f internal and external review, 694 outcomes
automation issues, 770-773
system functionality, 769 system response time, 770
physical actions
help user, 762-768
sensing objects, 761-762 planning
clear system task model, 703-705 efficient task paths, 705-706 progress indicators, 706
transaction completion slips, avoidance, 706-708 psychology principles, 694
scope and universality, 689-693 simplicity
consumer appliances, 789 digital phone system, 790 functionality, 790
machines, more controls, 789 sound and color
blinking red, 797-798
blue, 798
bright colors, 797
chromostereopsis, 798, 799f
text legibility, 799-800
tone and psychological impact, 796 translation
cognitive affordance (see Cognitive affordance) sensory and cognitive actions, 708
task structure, 751-761 usability principles, VE, 691b user interfaces, handheld devices
conventional user interfaces, 690 description, 690
user preferences, 800 using and interpreting bewilderment, 695
consistency, 695-696
errors, 695
simplicity, 696
UAF, 696
Design production exercises, 12
interaction specifications defined, 350
multiple, overlapping representation techniques, 352
prototype usage, 351-352
resources, design and iterative refinement, 351 intermediate, 337-339
macro view, lifecycle iterations conceptual, 336
detailed, 336-337
ideation, 335
intermediate, 336
refinement, 337
maintaining, custom style guide defined, 348
rules, organizational signature elements, 349-350
Social Security Administration (SSA), 349-350 user interface objects, 349
uses, 348-349
participatory design, 352-356 UX lifecycle process, 333
Wheel lifecycle template, 333, 334f
Design walkthroughs and reviews description, 469
group, 469
materials, 469-470
practitioner, 470 Detailed design
annotated wireframes, 339
visual comps, 339
Dilbertian HFTAWR (high-frivolity-to-actual-work ratio) approach, 562-563
Discount UX engineering methods goals, 498-499
Nielsen and Molich's original heuristics, 492, 493t
risk management
evaluation errors, mitigation, 499 false negatives, 499-500
false positives, 500
studies, 501
Domain complex systems
complex and esoteric domains, 99-100 customer organization, visit, 99
data collection, 114
emotional impact, 116
goals, 108
observation and interview, task data, 109 phenomenological aspects, 116
process, 111-112
product perspective, 103-106
work roles, 115-116
E
Ecological validity definition, 375-376
using telephones for, 376
Ticket Kiosk System example, 376b Embedded computing, 331 Embodied interaction
advantage, 330
cognitive actions, 329-330
description, 328
embodiment, 329
physical mock-ups, 329
Scrabble, 330-331
shape and augment, 330
Emotional impact, data collection techniques aesthetics and affect
cognitive paradigm, 29
interaction design, 29
processing model, 30
subjective view, 29
symmetrical designs, 29
AttrakDiff, questionnaires (see AttrakDiff, questionnaires)
bio-metrics definition, 459
monitoring equipment, 459
polygraph/lie detector, 460
pupillary dilation, 459
centrality, context, 31
description, 452-453
fun, work, 32
indicators, 453
physiological responses, observation behavioral observations, 458
faceAPI, 459
facial and bodily expressions, 458 limitations, 459
monitoring, 458
reviewing process, 458
software-assisted recognition, 458-459 potential breadth
blood and adrenaline pumping, 28 cross-disciplinary approach, 25-28 social and cultural interactions, 25 software system, 24
standard, expectation/desire, 24-25 self-reported indicators
advantages, 453
dependance, 453
questionnaires, 453
reactions, 453
think-aloud technique, 453
written diaries/logs describing, 453 Entities, modeling
activity
groups and subgroups, roles, 197 system-related roles, 197
work domain, 197
workplace ambiance, 197
nodes, 210
slideshow presentation social model, 198f Ethnography
characteristics, 126
contextual inquiry, 126
ethnographic-based approach, 126 Evaluation lab, Bloomberg LP
observation room description, 517
mobile prototype, 517-518
stakeholders, 518 participant room
formative evaluation session, 517 mobile prototype, evaluation, 518 multi-monitor workstation, 517
Evaluation reporting exercises, 18 formative
content, 599-600
customer/client, 604
description, 601
format and vocabulary, 605-606
inform and/or influence, management, 603-604
problem report effectiveness, 608-609
project team, 602-603
qualitative data, 609-610
time, 607-608
tone, 606-607
UPA workshop report, 601 UX engineering, 601-602
informal summative results
formative evaluation, 595
product design, 595
participant anonymity, 594-595 qualitative formative results
CIF, 597-599
rapid methods, 597
UX practitioners, 597
quality communication, 593-594 Extracting interaction design requirements
abridged methods anticipating needs, 179
using the WAAD directly, 178 work activity notes, 179
contextual analysis, 161-162
exercises, 891-892
formal requirements extraction constraints, 175-176
customers and users, validation, 177 deductive reasons, 165-166
document structure, 169
emotional impact and user experience, 170-171
extrapolation, 171-172
generic structure, 168-169
"hinges", 166
marketing inputs, 173-175
missing data, 172
preparation, 166
prioritizing, 176-177
resolve organizational, social and personal issues, customer, 177-178
statements, 167-168 system support needs, 173
terminology consistency, 167
WAAD, 165, 170
gap, 162-163
needs and requirements, 163-165 Ticket Kiosk System
extraction, 173b statement, 169b
Wheel lifecycle template, 161, 162f
F
Feedback, interaction cycle clarity, 779-780
completeness, 780-781
consistency, 784-786 existence
database system, 775
Unix operating system, 774-775 precise wording, 780
presentation complexity, 777
consistency, 778
legibility, 777
medium, 778
noticeability, 776-777
timing, 777-778
visibility, 776
tone, expression, 782 usage centeredness
error-handling routine, 783 Gobbledygook email message, 782f system-centered "error" message, 783f
user control, 786 Fidelity, prototypes
high
description, 397
use, 397
level of, 395 low
aesthetic quality, 396
description, 396
experience, differences, 396
"kindergarten" activity, 396
paper, 396
medium, 397
Fitts' law, 764-765, 764b
Formal requirements extraction constraints
legacy system, 175 MUTTS, 176b
products, 175-176
customers and users, validation, 177 deductive reasoning
design, 165-166
work activity note, WAAD, 165
document structure, Ticket Kiosk System, 169b emotional impact and user experience, 170-171 extrapolation, 171-172
generic structure, 168-169
"hinges", 166
marketing inputs, 173-175
missing data, 172 MUTTS, constraints, 176b preparation, 166
prioritizing, 176-177
social and personal issues, customer, 177-178
statements, 167-168 system support needs, 173
terminology consistency, 167 Ticket Kiosk System, 169b, 173b work activity data, 165
Formative and informal summative methods advantages and disadvantages, 494-495 analytic vs. empirical
axe design, 434-435 critical incident, 434b description, 433-434
emotional impact factors, 435 expert usage, 434
intrinsic methods, 434
payoff and intrinsic approaches, 434 payoff methods, 434
think aloud technique, 434b UX inspection, 434
classification, dimensions, 432
CUE series, 497
"damaged merchandise", 496
description, 492-497
dimensions intersection, 435, 435f effectiveness, 493
evaluators and problems, 494
interactive software systems, 495 lab-based approach, 494
practical problems, 496 rigorous vs. rapid
description, 433 ecological validity, 433b expense, 433
quality vs. cost trade-offs, 433
usability inspection methods vs. lab-based testing, 495
usability metrics, 495-496 Formative (qualitative) data analysis
abridged approach, 575
clarification and amplification, emotional impact data, 563-564
clean up, raw data, 563
consolidating, merging and grouping, UX problem data, 562f
consolidation, raw critical incident notes
critical incidents vs. UX problem instances, 565 single UX problem instance, 566b
UX problem instance concept, 565 critical incident, 561b
description, 561
Dilbertian HFTAWR (high-frivolity-to-actual-work ratio) approach, 562-563
early UX problem data records, 563 exercises, 17-18
individual critical incident descriptions, 564 photo album example, 567
problem instance, 561b sources, 564
UX problem instances content, 567-568
data management, 574-575
group records, 571-573
merging into UX problem records, 569-571
project context, 569 Formative reporting
content
cost-importance data, 600 emotional impact problems, 600 individual problem, 599-600
video clips, 600
customer/client, 604 format and vocabulary
evaluation, 605
jargon, 605
precision and specificity, 606
inform and/or influence, management, 603-604
problem report effectiveness law, 608-609
redesign proposals, 609
usability problem, 609
project team, 602-603
qualitative data, 609-610
time, 607-608 tone
positive approaches, 607
respect feelings, 606-607 UX engineering
concepts, 601-602
persuasion and selling, concept, 602
rapport and empathy, 602 teaching, 602
Formative vs. summative evaluation description, 429, 430
design, 429
education and curriculum, 429 engineering, informal summative, 432 engineering vs. science
fundamental differences, 431-432
quantitative metrics, 431
validity, 432
informal summative, engineering design phase, 430
lab-based UX testing sessions, 430 qualitative data, 429b
quantitative data, 429b
G
Gods Must Be Crazy (see The Gods Must Be Crazy)
Green Machine User-Experience Design behavior-changing process, 327 business data bases, 328
energy use, comparisons, 327 funding, Smart Grid, 326 home-consumer context, 326
mental model and navigation, 328 product purchase, 327
prototypes, 326 Smart Grid data, 326 testing, 328
H
Handheld devices, 690b Haptics
BMW iDrive, 767
car radio, 768 defined, 763b microwave, 767
HCI. See Human-computer interaction HE. See Heuristic evaluation
Heuristic evaluation (HE) advantages, 473
default practitioner, 472
heuristics, 473
limitations, 478-479
procedure, 474-475
reporting, 475-477
rule-based method, 472 variations
participatory, 477-478
perspective-based usability inspection, 478 problem reporting, 478
UX inspections, 477
walkthroughs, 478 Hierarchical task inventory (HTI)
description, 214b
envisioned task structure model, 219 MUTTS, 217b, 218f
task inventories, 216-217
temporal implications, 217-219 Ticket Kiosk System example, 219b
Horizontal vs. vertical prototypes depth, 394
description, 393-394, 394f
functionality, 394
product overview, 394
workflow, 394
HTI. See Hierarchical task inventory Human-computer interaction (HCI)
activity theory, 253
automated cockpit warning systems, 255-256 community, 356
contextual design, 126-127 creativity and innovation, 259 description, 253
designers, 402 design-thinking
architects, 258
car design, 257-258
description, 256-257
iPad, 258
participatory design techniques, 257, 257b engineering, 253-254
ethnography, 92
frameworks, 258
human information processing (HIP), 254-256 identification, 262
iterative lifecycle, 350
methods, 254
participatory design, 353-354
PICTIVE approach, 356
prototyping tools, 410-411, 425-426
research community, 422-423
utilitarian engineering approach, 258-259 work activity theory, 125b
world-view, 255
Human information processing (HIP) paradigm, 254-256
Human-Machine Interaction Network on Emotions (HUMAINE) project, 553-554
Human memory limitations
Calendar Management System example, 739-740 chunking
description, 697
designed, phone number, 698 grouping and recoding, 698 trick case, 698
cognitive load defined, 699
task closure, 699
command vs. GUI selection interaction styles recognition vs. recall, 700-701
shortcuts, 701 long-term memory
capacity, 700
hypnosis, 700
learning, 700 muscle memory
"on" and "off", electrical switch, 701-702 rhythm, 701
sensory memory persistence, sensory, 697
visual persistence, 697 short-term/working memory
proactive interference, 697
throw-away data, 697 stacking
execution context, 699
large and complex tasks, 699 task performance, 699
Human spirit, UX connectedness, music, 26-27
disconnection, absorption, 27
serendipity, projects, 27-28
work, spirit, 28
I
Ideation brainstorming
breakout groups, 280 contextual analysis process, 282 "deep dive" approach, 281-282 description, 280
Kiva, 281, 281f
physical mock-ups, 281, 281b rules of engagement, 282 sub-teams, 280-281
vs. critiquing
designers, cross-disciplinary team, 275 implementation constraints, 274-275
low-fidelity prototypes, 275
user interface, 276
description, 274
emotional factors, 278
exercises, 9-10
exploration, 274
input bin, 278
Magitti activity-aware leisure guide, 278b team assembling, 277-278
Ticket Kiosk System community outreach, 284
emotional impact, 283 features and coverage, 283 ontological artifacts, 282 themes and motifs, 283 ubiquitous locations, 284
work space, set up
individual and group designer, 277, 277f Kiva, 276-277, 276f
Informal summative (quantitative) data analysis UX targets
convergence toward quality user experience, 560 descriptive statistics, 557
inferential statistical analyses, 556 iteration, 556
Observed Results column, 557
partial informal quantitative testing results, 557t Information displays
organization, presentation complexity control, 787
train passengers example, 787 visual bandwidth
limited horizontal, 788-789, 788f limited vertical, 788-789, 789f reading devices, 788
Inspection, UX description, 470-471
design, 470
inspectors, 471-472 practical approach
co-discovery/team approach, 480
design guidelines/heuristics, 480
emotional impact, 482
evolution, 479-480
feedback and credibility, 479 inspector, role, 481
note-taking and analysis, 483-484 problems, 481, 482
reporting, 484
time and effort, 481
usage-based approach, 480
user-surrogate role, 483
user tasks, 481
tool, 471 Interaction
complex and domain-complex systems, 66-69
phenomenological aspects, 70b simple and domain-complex systems,
69-70
Interaction cycle
assessment, 684-685, 773-789
concepts, HCI, 664
cooperative user-system task performance, 670-673
defined, core functionality, 683 effectiveness, 683-684 existence
functionality/feature, 684
unwanted automation, 684 gulfs, user and system
description, 665
evaluation, 667
execution, 666-667
hierarchical structure, 676 human user vs. machine, 663
knowledge base, design concept, 663
non-user-interface system functionality, 683 Norman's stages-of-action model, 664-665,
668-670
outcomes, 768-773
parts, 702
physical actions, 680-683, 761-768
planning, 676-677, 703-708 principles and guidelines, 663 quality, functionality, 684
translation, 678-679, 708-761
UAF, affordances, 685-686
usability problem, 664
user action, 683
UX design, concepts and issues, 664, 675
Interaction Design Association (IxDA), 834 Interactive prototypes
amount, 398
click-through, 398 fully programmed
project team, requirement, 398 proposals, 398-399
real programming language, 399 physical mock-ups
description, 400
fidelity, 400
handheld, 400
hardware, 400
paper-in-device, 401
physicality, 400
power, 401
use, 401
wood block, 400-401
scripted, 398 video animations
animated sketches, 402
description, 402 Wizard of Oz
description, 399
human evaluator, 399
use, 399, 400
users, unawareness, 399 Intermediate design
application ontology, information objects graphics-drawing example, 337
Ticket KioskSystem example, 338 communication, 338-339
goal, 337
screen layout and navigational structure, 339 strategies, realization, 337
IxDA. See Interaction Design Association
K
K-YAN project
emotional impact, form, 288, 293f flip-open mechanism, 288, 292f ideation sketches, 288, 290f
mid-fidelity exploration sketches, 288, 291f
L
Legacy system, 175b Lifecycle process
concept
calibration, 48-49
interaction design process, 50 repeatable formula, 49
rigid structure, 50 described, 47b influences on, 50-53 iterative process, 47b
misbegotten approach, 47-48
UX process template (see Lifecycle template, UX process)
Web user experience design, 51b Lifecycle template, UX process
activities (see Process activities) choosing process instance
instantiation, 60
mapping project parameters to process parameter, 63-64
process parameters, 63
project parameters, 61-62 commercial product perspective, 72 complex interaction
complex work domain, 66-69 simple work domain, 70-71
evaluation activity, 54
fundamental activities involved, 78-79 gradations, 72-73
implementation, 53-54
interface engineering, 75-76
iteration for interaction design refinement, 81-83
lifecycle diagram, ISO 13407 standard, 77f parallel streams, software and interaction process
activities, 79-81
phases, 76 prototype
horizontal, 56b local, 56b
T, 56b
vertical, 56b scope, 75
simple interaction
complex work domain, 69-70 simple work domain, 70-71
sub-activities, 55
system complexity space
low interaction complexity, 65 MUTTS example, 65b
PhotoShop, Lightroom, and Aperture, 65b work domain complexity, 65, 66
think-aloud technique, 55 universal abstract activity cycle, 53f usability engineering, 76
Usability Engineering for Bioinformatics, 67b usability testing, 77-78
user interface team, 73-75 Local prototypes
description, 395
design discussions, 395
Local prototypes (Continued ) dialogue box, 395
use, 395
Local UX evaluation method, 491
M
Mac Mail(tm) program, 707, 707f Macromedia Dreamweaver(tm), 741
Master Document feature, Microsoft Word(tm), 704 Measuring instruments
benchmark tasks (see Benchmark tasks) description, 365
initial user performance, 366 time-on-task, 365
user satisfaction questionnaires description, 376-377
first-impression UX measure, 377, 377t goals, measures, and measuring instruments,
377, 378t
performance, 377-378
Measuring the usability of multi-media systems (MUMMS), 450
Mental models description, 299
designer's (see Designer's mental models) exercises, 10-11
mapping, 301f, 304
role, conceptual design, 304 user's
cars, 304
comedy curve balls, 304 description, 303
knowledge, 303, 304
mapping, 301f, 303
Norman's binary switch explanation, 304
thermostat, 303-304 Metaphors, conceptual design
description, 300, 306
ecological perspective, 306
emotional perspective, 307 interaction perspective
components, 307
description, 306
'desktop', 307
Macintosh platform, 307
time machine feature, 306-307 typewriter, 306
use, 306
Metrics and targets abridged approach, 389 baseline level
description, 381
Ticket Kiosk System example, 383b, 384t
description, 361-362
engineering process, 388-389
exercises, 12, 13
measures (see UX measures)
measuring instruments (see Measuring instruments) metrics (see UX metrics)
observed results, 386 practical tips and cautions
class definitions, 386 measures and levels, 387-388 target level values, 387
trade-offs, 387
usefulness and emotional impact, 388 project context
completeness level, 359-360
creation, 360
evaluation, 359
interaction design process, 360 quantifiable end, 360-361
roots, 361 setting levels
description, 382
formative evaluation sessions, 383-386
problem-solving skills, 383
values, 382
tables, 362, 362t target level
description, 381
experience test, 382
performance, 382
quantification, goal, 381
Ticket Kiosk System example, 383b, 385t Ticket Kiosk System example, 362b
Wheel lifecycle template, 359, 360f work roles and user classes
measuring instrument, 363-364 Ticket Kiosk System example,
363b, 363t
Middleburg University Ticket Transaction Service (MUTTS)
bins as inputs, 185b business process, 95
customers, 117-119
description, 94 essential use case, 231b flow model, 211b
consolidation, 244b sketching, 135b
hierarchical task inventory (HTI), 217b information objects and attributes, 233b physical model, 241b
social model, 205b
step-by-step task interaction model, 192b
task interaction branching and looping, 228b usage scenario, 221b
user class, 195b
user data gathering, 116b
work activity note synthesis, 141b work role, 134b
work roles and sub-roles, 188b Model-driven inquiry, 107b Modes
bad mode, 750
email system, 749
good mode, 750
meaning change, user action, 749
MUMMS. See Measuring the usability of multi-media systems
MUTTS. See Middleburg University Ticket Transaction Service
N
National Institute of Standards and Technology (NIST), 597
NDAs. See Non-disclosure agreements
NIST. See National Institute of Standards and Technology
Non-disclosure agreements (NDAs), 523 Norman's stages-of-action model
adoption, 664, 665f
business report creation example print dialogue box, 670
print report, 670
steps, financial status, 669
sub-steps, task decomposition, 669-670 defined, cognitive walkthrough, 665 goals, 664-665
outcomes and system response, 668 partitioning, 668
significance and importance, translation, 668-670 transition, 669f
O
Onion-layers effect, 590 Organizational structure
development organization, 857 human factors engineer, 856 implementation, 858
practitioners, 856
strategic approaches, 857
team interaction, 857
P
Paper prototypes
coding blocker, role, 407-408 computer-printed
description, 408
graphical images, 408 hand-eye feedback loop, 409
OmniGraffle/Microsoft Visio, 408
software tool, 408, 409
stopgap measure, 409
time spend, 409
Paper prototypes (Continued ) construction, approaches
adhesive-backed colored circles, 417, 417f corners cutting, 415
creative techniques, use, 413 data entry, 416, 416f
decoy user interface objects, 416 drawing, 411
executor, task threads, 417
foam-core board easels, 411, 411f, 412f formative evaluation exercise, 415 foundation screen, underlying,
411-412, 412f
highlighting object, 413, 415f machine/scanner, use, 415
materials, set, 410-411
"not yet implemented" message, 415-416, 416f
paper cutouts, 412, 412f, 414, 415f
pilot test, 417
plastic interaction sheets, 413 preferences dialogue box, 412, 413f pull-down menu, 412, 414f screen/display, buildup, 414 scrolling, cutting slits, 412-413, 414f sketching and storyboarding, 415 time management, 410
work faster, 411
description, 407
design reviews and demos, 408 hand-drawn, 408
low fidelity, 407 program, low-fidelity
dead time, 410
execution, 409
run-time, 409
Web page production tool, 410
writing code, 408
Participants selection
demographic survey, 512
expert, 512-513
lab-based and non-lab-based methods, 511
need and budget establishment, 511-512
number, determination, 513
representative users, 512
sampling, 511
user class attributes, 512
"three to five users" rule (see "Three to five users" rule)
Participatory design
HCI history and literature, 352
interaction situations, 353
Joint Application Design, 355
PICTIVE
objective, 354
paper prototyping, 354
UTOPIA, 353-354
project UTOPIA, 355
reciprocal learning, 353
rules, engagement, 355
Scandinavian approach, 356
user participation, 355
Personal information ecosystem (PIE)
computational power, 308-309
definition, 309-310
designing and assessing usability, 310 email management, 310
email programs, 309
equilibrium, 309
information flow, 311
information practices, 310
multiple devices, 309
system architecture, 310
wicked problem, 309-310
workflow, 310
Phenomenological aspects, data collection techniques
diaries
description, 462
digital voice recorder, 463
mobile phone, 462
verbal reports, 463
voice-mail method, 462-463
direct observation and interviews, 464
goals, 461-462
long-term studies
audio, 461
constant attention, 461
description, 460
inquiry and ethnography, 461
iPod, 461
participants report, 461
studying and evaluating, 460-461
systems and product, 460
timeline, 460
periodical, 463-464
reporting, 463
Photo album problem, cost-importance analysis, 579
Physical actions, interaction cycle
affordance, 680
components, 680
defined, snap-dragging, 681
description, 680
Fitts' law, 680
haptics and physicality, 682
help user
awkwardness and physical disabilities, 763-764
haptics and physicality, 767-768
manual dexterity and Fitts' law, 764-765
overshoot errors, avoidance, 765-766
menu choices, 681-682
sensing objects, 761-762
software modification, 681
in UAF
existence, physical affordances, 682
manipulation, UI objects, 682-683
Physical model
creation, 239
description, 238
envision, 242
MUTTS example, 241b, 241f
slideshow presentations, 239b, 240f
PIE. See Personal information ecosystem
Pilot test, 417
Planning, interaction cycle
clear system task model
library information system, 705
Master Document feature, Microsoft Word™, 704
support users, 703
tab reorganization, 704, 704f
concepts, 676
efficient task paths, 705-706
hierarchy, plan entities, 676
progress indicators
task sequencing, 706
Turbo-Tax™, 706
transaction completion slips, avoidance
attachment, forgotten, 707
defined, 706
Gmail reminder, file attachment, 707f
Mac reminder, attach file, 707f
microwave, 707
Ticket Kiosk System example, 706-707
in UAF
goal decomposition, 677
task/step, work flow, 677
use and exploration, 677
user knowledge, 677
user model and high-level system, 677
user work context, environment, 677
Post-study system usability questionnaire (PSSUQ), 450
Preparation, rigorous empirical evaluation
lab-based and field-based evaluation, 503-504
method and data collection techniques
adaptation, 511
critical incident, think-aloud and co-discovery, 510
emotional impact, 510
goal driven, 510
questionnaires, 510, 511
number of participants, 529-536
participants
number, determination, 529-536
preparation, 516-528
recruitment, 513-516
selection, 511-513
pilot testing, 528-529
planning
cost-effective decisions and trade-offs, 504
description, 505
goals, 505-506
tasks
benchmark, 508
exploratory free use, 509
unmeasured, 508
user-defined, 509
team roles
evaluation activities, 506
facilitator, 506
practitioners and observers, 506
prototype executor, 506-507
qualitative data collectors, 507
quantitative data collectors, 507
supporting actors, 507
Wheel lifecycle template, 503, 504f
Priority rankings, 582-584, 588
Priority ratio, 581-582
Process activities, UX
analysis, 55
design, 56
evaluation, 56
flow
iteration, 58-59
managing with transition criteria, 57-58 not always orderly, 56-57
lifecycle streams, 60
prototype
horizontal, 56b
local, 56b
T, 56b
vertical, 56b
Prototyping
advantages, 418
aspects, design, 403
breadth and depth, effects, 407, 407f
depth and breadth choices, approach, 393
horizontal vs. vertical, 393-394
local, 395
"T" prototypes, 394-395
description, 402
dilemma and solution
product version, time, 391-392
Scandinavian origins, 393
significance, 392
traditional development approaches, 392
universality, 392-393
ecological perspective
conceptual design, 404
description, 404b
hallway methodology, 405
IBM's Olympic Message System, 404
system structure, level, 404
validity, 405b
emotional perspective, 405, 405b
exercises, 14
fidelity (see Fidelity prototypes)
fidelity level and interactivity amount
breadth and depth, effects, 407
description, 402
design perspective, 403-405
risk and cost management, 406-407 stage of progress, 402-403
horizontal, 56b
interaction perspective
computer-printed paper/mock-up, 405
conceptual design, 405
description, 405b
design iteration, 405
wireframes, use, 405
interactivity (see Interactive prototypes)
local, 56b
paper (see Paper prototypes)
potential pitfalls
buy-in, 418
cooperation, 418
limitations, 419
overwork, 419
project management, 419
selling, 418
risk and cost management
behavior and sequencing, 406
low- vs. high-fidelity, 406, 407t
user interaction design, parts, 406
software tools
autocompletion, 423-424
functional behaviors, 424
HCI research community, 422-423
programming, 423
UIMSs, 423
stage of progress
audience and explaining, 402-403
increase, progression, 403
iteration kinds, 403
T, 56b
transition, product
formative evaluation, 420
interaction design, reuse, 421
investment, 420
keeping, 421
prototype code, 420
recoding, 420
tail, lifecycle, 420
UX and SE collaboration, 421-422
UX team, 422
vertical, 56b
Wheel lifecycle template, 391, 392f
PSSUQ. See Post-study system usability questionnaire
Q
Qualitative UX data, generation and collection
critical incident
data collection mechanisms, 546-547
information collection, 545-546
lab-based testing, 545
think-aloud data collection, 548
Quantitative UX data, generation and collection
benchmark tasks, 543
objective
online help, 544
"oops" errors, 544-545
timing measurements, 543
user errors, counting, 544
subjective, 545
Quasi-empirical UX evaluation method
data analysis and reporting, 489
experienced practitioners, 487-488
formal protocols and procedures, 487
preparation, 488
session and data collection, 488-489
task driven, 488
Questionnaire for user interface satisfaction (QUIS)
calculation, 446
categories, 445
description, 445
time, 446
Questionnaires, UX evaluation
ASQ, 450
CSUQ, 450
description, 444
hedonic quality evaluation, 450
modification
data collection technique, 450-451
formative evaluation, 452
scale values, 451
semantic differential scales, 451, 451b
SUS, 451
term substitution, 451-452
"user-friendliness", 451
warning, 452
Websites, 452
MUMMS, 450
PSSUQ, 450
QUIS (see Questionnaire for user interface satisfaction (QUIS))
semantic differential scales
assessment, agreement, 445
description, 445
discrete points, 445
SUMI, 450
SUS (see System usability scale (SUS))
traditional usability, 444-445
USE
applications, questions sets, 449, 449t
bottom line, 449
description, 448-449
WAMMI, 450
QUIS. See Questionnaire for user interface satisfaction
R
Rapid iterative testing and evaluation (RITE) method
collaborative process, 485
data collection, 487
description, 484
problem fixing, 485
procedure
evaluation session, 485-486
follow-up evaluation, 486
practitioner selection and team preparation, 485
problem fixing, 486
Rapid UX evaluation methods
alpha and beta testing, 490-491
automatic, 492
characteristics, 467-468
design walkthroughs and reviews
description, 469
group, 469
guidelines and style guides, 470 materials, 469-470
practitioner, 470
discount UX engineering methods (see Discount UX engineering methods)
exercises, 14-15
fast-track projects, 467
HE (see Heuristic evaluation (HE))
informal, 468
inspection, 469, 469b
interactive prototype, 468-469
local, 491
quasi-empirical, 487-489
questionnaires, 490
remote, 491
RITE (see Rapid iterative testing and evaluation (RITE) method)
think-aloud technique, 468-469, 468b
UX inspection
description, 470-471
design, 470
inspectors, 471-472
practical approach, 479-484
tool, 471
Wheel lifecycle template, 467, 468f
Remote UX evaluation method, 491
Requirements. See also Extracting interaction design requirements
domain-complex systems, 164
generic structure, 168f
legacy system, 175
software engineering traditions, 163-164
usability, 163
WAAD, 178
work role, 177
Return on investment (ROI)
analysis, 845
incremental approach, 847
NYNEX project, 845
usability, 845
Rigorous empirical evaluation
cost and information extraction, 530
ecological validity
A330 Airbus, 525-526
description, 524, 524b
SSA Model District Office, 526-527
third age suit, 525
usage/design scenarios, 525
exercises, 15-16
informed consent
data collection, 519
form, 519-521, 522f
permission application, 520
lab and equipment
Bloomberg LP, 516, 517-518
data collection, 516
DRUM, 516-519
novice UX practitioners, 530
paperwork
data collection forms, 523-524
instructions, 521-523
NDAs, 523
questionnaires and surveys, 523
planning room usage, 524
recruitment
co-discovery evaluation, 515
database, 514
incentives and remuneration, 514
management, 515-516
methods and screening, 513-514
subsequent iterations, 516
user participants, 515
rules of thumb, 530-531
sampling, 529
session parameters
full lifecycle iterations, 519
task and session lengths, 519
session work package
benchmark tasks, 528
contents, 527-528
training materials, 524
Ripple model, SE
constraint subsystem, 826-827
environment, 824-825, 825f
project definition subsystem, 826
repository subsystem, 827
software implementation, 825
UX role, 826
ROI. See Return on investment
S
SAM. See Self-Assessment Manikin
SE. See Software engineering
Second span bridge
barrier, 183
models, 182
Self-Assessment Manikin (SAM), 457-458
Sessions running, rigorous empirical evaluation
emotional impact data
nonverbal techniques, 550
questionnaires, 550
think-aloud technique, 548-549
exercises, 16-17
HUMAINE project, 553-554
participants, preliminaries
benchmark tasks, 539
data collection, 539
design and process, 538
paperwork, 538-539
reception room, 537
setup and lab, 538
phenomenological evaluation data
diary-based technique, 550
direct observation and interviews, 551
questionnaires, 551
self-reporting, 550
usage changes, 551-552
voice-mail and per-call payment, 550-551
post-session probing, 552-553
protocol issues and participants
assistance, 541
comfortableness, 541
interaction, 540-541
low-fidelity prototypes, 541-542
partnership cultivation, 540
UX problems, attitude, 539
qualitative UX data
critical incident information and data collection, 545-547
lab-based testing, 545
think-aloud data collection, 548
quantitative UX data
benchmark tasks, 543
objective, 543-545
subjective, 545
reset, next participant
paper prototype, 553
Web-based evaluation, 553
Wheel lifecycle template, 537, 538f
Shared cultural conventions, 650-651
Simple interaction
complex work domain, 69-70
simple work domain, 70-71
Situated awareness, 332
Sketching, design process
conversation, 285
description, 284
embodied cognition, 286
exercises, 9-10
ideation and design, 285
K-YAN project (see K-YAN project)
language
characteristics, 287-288
designers, 288-290, 289f
Ticket Kiosk System, 287, 288f, 289f
vocabulary, 287
mobile phone example, 285
physical mock-ups
description, 290
rough and finished, 290, 293f, 294f
vs. prototypes, 285-286
supplies, 286-287
Small UpFront Analysis (SUFA)
aim, user stories, 636-637
goals, 636
user interviews and observation, 636
UX
lifecycle process, 634-635
role, planning, 635, 635f
SnakeLight, 93
Social models
commercial product perspective, 208
concerns and perspectives, 199-200
entities (see Entities)
envisioned, 208-209
influences, 200-208
MUTTS example, 205b, 205f
slideshow presentation example
arcs representing influences, 203b
concerns, 200b
entities, 198b
Ticket Kiosk System example, 208b
Social Security Administration (SSA)
policy, 349-350
telephone interviews, 349
Software engineering (SE)
agile development, 819
connections, lifecycles, 821, 822f
developing interactive systems, 830
differences, lifecycles
design usage, 806
UX iteration, 805
UX practitioners, 805
UX roles, 805
functional core, 804
HCI, 811
interaction design, 829
lifecycle, 819
mechanism, communication, 823
organization, locus of influence
business role, 806
description, 806
design role, 806-807
factors, 807
The Inmates Are Running the Asylum, 808 roles, 808
software/development role, 806
parallel connections, lifecycles, 822-823, 823f
ripple model (see Ripple model, SE)
risk management
parallel connections, lifecycles, 823f, 824
UX and SE lifecycles, 820f, 824
role, interaction design, 827
serial connection, iterative version, 821, 822f
similarities, lifecycles, 805
team members, 828-829
UI (see User Interface (UI))
UI changes, 821
user-interface, 803-804
UX
communication, 812
coordination, 813
dependency and constraint enforcement, 814-817
dependency type, 818
evaluation, 817-818
people, 821
roles, 811
and SE goals, 804-805
synchronization, 813-814
and UX lifecycles, series, 820, 820f
Software usability measurement inventory (SUMI), 450
SSA. See Social Security Administration
Step-by-step task interaction models
barriers, 225, 243t
branching and looping structures, 229f
creation, 225
description, 224
information and needs, 225
MUTTS example, 230f
task and step goals, 224
task interaction model, 224
task triggers, 224
Storyboards
components, 317-318
description, 316
ecological perspective, 318
emotional perspective, 318-321
frame transitions
cognitive affordance, 321b
description, 321
dynamics, interaction, 321
state changes, 322, 324f
value, 321-322
ideation and sketches, 317
interaction perspective, 318
Ticket Kiosk System example
differences, ecological perspective, 319, 319f
purchase, sample sketches, 321, 322f
sequence of sketches, ecological perspective, 318, 318f
three-screen kiosk design, interaction perspective, 320-321
Subjective questionnaire data analysis, 561
SUFA. See Small UpFront Analysis
SUMI. See Software usability measurement inventory
SUS. See System usability scale
System complexity space
low interaction complexity, 65
MUTTS, 65b
PhotoShop, Lightroom and Aperture, 65b
work domain complexity, 65, 66
System usability scale (SUS)
analogy, use, 448
analysis, 448
calculation, 448
description, 447
dimensionality, 447-448
evaluation grade, 448
numerical score, 448
possibilities, 447
questionnaires (see Questionnaires, UX evaluation)
significances, 447
statements, 447
T
Task interaction model, 71b
Task models
task interaction models
design scenarios, 222-223
envisioned, 232
essential use case, 228-232
scenarios use, 219-222
step-by-step (see Step-by-step task interaction models)
task structure models
envisioned, 219
HTI (see Hierarchical task inventory)
inventory, 216-217
Task structure, interaction cycle
designing, flexibility and efficiency, 751-752
direct manipulation and natural interaction control
adding appointment example, 760
commands, 759
GUIs, 759
physicality, 760, 760b
grouping
hardware store organization, 753
Ticket Kiosk System example, 753
human working memory loads, 751
task thread continuity
description, 753-754
online shopping, 755
Outline view, 755
query screen, 754
"Save As" task, Microsoft Office, 755, 756f
select, item, 757
undoing user work, 757-758
users, control
EndNote™, 759
interaction dialogue and bossy attitude, 758
TCO. See Total cost of ownership
The Design of Everyday Things, 650
The Gods Must Be Crazy, 651
Think-aloud technique
co-discovery
interactive conversation, 443
natural conversation, 443
origin, 443
participant personalities, 443-444
planning, 444
quantitative task performance metrics, 444
significances, 443
time verbalizing statements, 443
description, 440
management, 442
natural, participants, 441-442
retrospective, 442
types, participants, 441
use
analyst and participant, 440
evaluation session, 440
rigorous and rapid empirical methods, 441
"Three to five users" rule
approach and practical outcome, 535-536
assumptions, 534-535
cumulative percentage, problems, 533
detection rates, 532-533
interaction design, 532
marginal added detection and cost-benefit, 533-534
probability function, 531
UX problem detection, 532
Throw-away data, 697
Ticket Kiosk System
constructing design-informing models
conceptual design (see Conceptual design)
design scenario, 222b
envisioned flow model, 215b, 215f
envisioned hierarchical task inventory, 219b
envisioned social model, 208b
envisioned work roles, 190b
ontological elements, 233b
storyboards (see Storyboards)
system concept statement, 97b
cost-importance analysis, 579, 585t
grouping related problems, 572b
priority ratios, 582b
problem resolutions, 587t
UX benchmark tasks
baseline level values, 383b
benchmark tasks, 374b
ecological validity, 376b
questionnaire, 377b
target level values, 383b
UX goals, 362b
UX measures, 365b
UX metrics, 380b
work role and user class, 363b
Total cost of ownership (TCO), 841
"T" prototypes
description, 394-395
role, 395
Translation, interaction cycle
concepts, 678
description, 678
in UAF
content, meaning, 679
existence, cognitive affordance, 679
presentation, 679
task structure, interaction control, preferences and efficiency, 679
use, UI objects, 679
U
UAF. See User action framework
Ubiquitous computing, 331
Ubiquitous interaction
ambient intelligence, 5-6
computing
commercial application, 2-3
healthcare rehabilitation, 3
multimodal receptors and sensors, 3
Smart-ITs, 2
wearable computers, 2
desktops, graphical user interfaces and Web, 1
highway signage, 7
human-computer interaction (HCI), 6
implementation technology, 5
quality user experience, 3-5
radio-frequency identification technology, 6
UI. See User interface
UIMSs. See User Interface Management Systems
UIST. See User Interface Software and Technology Symposium
UPA. See Usability Professionals Association
Usability
computer science growth, graphics, 46
hardware and software developments, 45
interaction technique, 45
linguistic structure, 45
programming language translation, 44
standardization, 46
User Interface Management Systems (UIMS), 45
computer usage, 7
disastrous system development, 9
disciplines
civil engineering, 37
ergonomics, hardware devices, 37
effects, 20
extensive training, 9
formal methods, 44
human factors and systems engineering
cockpit control layouts, 39
products maintenance, 38
scientific management, 38
testing systems, 40
human work activity and ethnography, 44
intellectual gratification, 8
interaction design, 7
psychology and cognitive science
developmental approach, 41-42
empiricism, 40-41
information processors, 41
interaction design, 41
software engineering
architectural implications, 46
development lifecycles, 46
functional modules, 46
task analysis, 42
theory, 42-44
user experience
expanding concept, quality, 10-11
misconceptions, 10
traditional concept, 9-10
user satisfaction, 11
Usability Engineering for Bioinformatics, 67b
Usability principles, VEs, 691b
Usability Professionals Association (UPA), 833
Usage models
flow model
architecture, 209
creation, 210-213
envisioned, 214-215
MUTTS example, 213f
product perspective, 213-214
slideshow presentations, contextual inquiry, 211b, 212f
work roles, 190b, 190f
information object model, 232-235
task models (see Task models)
Usefulness, satisfaction, and ease of use (USE) questionnaire, 448-449
User Action Framework (UAF). See also Interaction cycle
advantages
organized and structured usability data, 687
richness, usability problem analysis schemes, 687
usability data reuse, 688
vocabulary and communicate design issue, 686
affordance
interaction cycle, user actions, 686, 686f
sensory and cognitive, 686
users connection, design, 685, 685f
structured knowledge base
completeness, 675
design concept, 674-675
device independent, 674
interaction cycle, 674, 674f
User experience (UX)
broad definition
initial awareness, product, 23
shared design vision, 23
business strategy
goals, increased productivity, 36
instructional bulletin example, 35-36
policy, law, 35-36
coders, 625-626
components
aesthetics, food presentation, 19
minimum errors and frustration, 15-16
nutritional value, 19
controlling scope, 632-633
customer and user representatives, 631
description, 620-625
design beyond technology, 15
domain-complex systems, 619
environment, 619
functionality
hedonic quality, 12
stellar interaction design, 12
usability testing, 12
fuss over usability
field support, 33
software design, 33
sub-standard product, 34
high quality designing, 26b
High-Tech/"Cool"
intrinsic benefactors, 13
loss, enthusiasm, 14
low-end model, 14-15
Microsoft software packaging design, 13, 13f
Vista's gratuitous redesign, 13-14
hotcakes, 34-35
ideation, design, 626
interaction and usage context, 21, 21f
lifecycle aspects, 620-622, 622f
marketing department, 34
paradigm shift
customer, 632
traditional UX process, 631
problems, anticipate, 633
productivity-enhancing tools, 11
prototype, 810
qualitative data, 20
role, branding and corporate culture
emotional responses, 22
interaction design, 23
spectacular design, 22
SE
approaches, 620
characteristics, 620-622
description, 620-625
lifecycle aspects, 620-622
planning, 622-624
prototypes, 810
requirement, 811
roles, 809, 810
sprints, 624-625
synthesized approach, integrating UX
communication, 642
counterpart activities, 637, 637f
customers and users, 641
design partners, 638
feedback, value, 640-641
goal, 641
impact, 642
planning, iterations, 641-642
practitioners, 638
prototype integration, 640
prototyping and UX evaluation, 640
role, 637
and SE activities, 638
style guides, 642
SUFA, 634-637
user-centered design techniques, 634
UX and SE work flow, 638, 639f
usage context, 19
UX component, 630-631
UX lifecycle, 626
User experience (UX) work
administrative preparation
commitment, 834-835
UX lab, 835
UX leadership establishment, 835
video clips, managers, 835-836
agile methods, 860
analytics rise, 861
cost-justifying
articles and books, 840
benefit and business case analysis, 841-848
cutting costs, 841
human factors, 854-855
legacy systems, 858-859
organizational structure, 855-858
transition, 859-861
description, 831
design project, 837
evaluation session, 838
formative evaluation
description, 837b
prototype, 837
practitioner, 838-839
professionalism, 839-840
professional preparation
apprentice, 831
consulting help, 832
IxDA, 834
portfolio, 834
training, project team members, 831-832 UIST, 833-834
UPA, 833
UX activities, 833
UX design, 832
proliferation, platforms, 860
technical preparation
personalize and actualize, process, 836
practice, contextual inquiry and analysis, 836
UX activities, 836
UX lab, 836-837
techniques, 837
users observations, 838
User interaction
software, 818
software design and implementation, 818, 818f
software requirements, 819
UX lifecycle, 819
User interface (UI)
graphical, 1
objects, 349
team, 73-75
User Interface Management Systems (UIMSs), 423
User Interface Software and Technology Symposium (UIST), 833-834
User models
social model (see Social models)
user classes
experience-based characteristics, 194-195
knowledge and skills-based characteristics, 191
physiological characteristics, 192-194
user personas, 209
work roles
envisioned, 189-190
mediates, 187-189
relationship, 190
sub-roles, 187
User personas
candidate, identification, 268
characteristics
commercial products/systems, designing, 270
memorability, 270
relevance and believability, 270
richness, 270
work role, 271
Cooper's in-flight entertainment system, 272-274
creation mechanics, 269-270
description, 264-274
ecstatic customers, 266-267
edge cases and breadth, 267
entertainment events, 272b
functionality and flexibility, design, 266
goal-based consolidation, 268
goals for design, 271
selection, primary, 269
stories, 271
usage, design, 271-272
work role, 268
User's behavior
Amazon Kindle™ device, 325-328
bringing carts, 325
domain, airport baggage, 325
idea, design, 325
slanty design, 324-325
sloped reading desks, 325
User's mental models. See Mental models
UX. See User experience
UX evaluation
architect designer, 618
data collection techniques
critical incident identification, 436-440
emotional impact, 452-460
phenomenological aspects, 460-464
questionnaires, 444-452
think-aloud, 440-444
data, types
description, 435-436
objective vs. subjective, 436
quantitative vs. qualitative, 436
description, 611
design concepts, 615
emotional impact and phenomenological aspects, 616
flexibility goals, 616
quantitative and qualitative data, 617
formative and informal summative methods
analytic vs. empirical, 433-435
classification, dimensions, 432
dimensions intersection, 435
rigorous vs. rapid, 433
formative results, variations
detection rates, problems, 465
evaluator effect, 464
inspection methods, 465
lab-based testing, 465
limitations, 465
metal detector, 465
screen/Web page, 465
formative vs. summative
description, 429, 430
design, 429
education and curriculum, 429
engineering, informal summative, 432
engineering vs. science, 431-432
informal, 430
and informal summative, engineering, 430-431
qualitative data, 429b
quantitative data, 429b
goals, 611
in situ vs. user reflections, 615-616
lifecycle, 618
measurability
productivity/ease, 428
questionnaires, 428
teaching and learning, 427-428
methods
design representations stages, 613, 613t
hybrid approaches, 614
inspection, 613-614
lab-based test, 612
prototypes, 613, 614
resources, 612
process, 612, 617
prototype, 427
testing, 428-429
Wheel lifecycle template, 427
UX measures
characteristics, 364
description, 364
long-term performance, 364
performance, initial, 364
quantitative metrics, 364
targets, 364
Ticket Kiosk System example, 365b, 365t
user performance requirements, 365
UX metrics
characteristics, 379
description, 378-379
frustration/satisfaction, 379
numeric average, 379
performance trade-offs, 379-380
project context, 359-361
roots, 361
Ticket Kiosk System example, 380b, 380t
UX problem instances
analysis, 573
content, 567-568
as feedback to process improvement, 590-591
group records, 571-573
merging into UX problem records
find and merge, 569-570
records creation, 570-571
project context, 569
V
Verbal instruments, 457
Verbal protocol, 440-444
W
WAAD. See Work Activity Affinity Diagram
WAMMI. See Website analysis and measurement inventory
WCAG. See Web Content Accessibility Guidelines
Web accessibility
government Web sites, 194
impairments, 193
people, disabilities, 192-193
policy changing, 193
WCAG 1.0 guidelines influence, 193
Web Content Accessibility Guidelines (WCAG), 193
Website analysis and measurement inventory (WAMMI), 450
Web User Experience Design
conceptual model design, 52
detailed UX design, 53
information architecture design, 52
page design standards design, 52
Wheel lifecycle template, 503, 504f. See also Lifecycle template, UX process
Wireframes
building
drawing/word processing software, 345
information architecture, 345
windows/container elements, 346
workflow, 345-346
defined, 340
drawing aspects, 340-341
elaboration, conceptual design and layout, 342, 342f
high-level conceptual design, 341-342, 341f
layers, 346-347
path, 340
prototypes, 346
sketchy
conventional drawing tools, 347
description, 347
strong community, 347
uses
feedback, potential users and stakeholders, 344
hyperlinking capabilities, deck, 344
interaction design specifications, 344
Wizard of Oz
description, 399
human evaluator, 399
prototypes (see Interactive prototypes)
use, 399, 400
users, unawareness, 399
Work Activity Affinity Diagram (WAAD)
builds, 144
clusters, 148-149
colors of label, 152
consolidation and communication, 155-157
data ownership, 151
elimination, 179
grouping groups, 153
growing clusters, 147
hands of analysts, 151
hierarchical and nonhierarchical relationships, 154-155
hierarchical structure, 185
hybrid, 247-248
labeling groups, 152-153
mind-sets, 146-147
monitoring note groups, 151-152
MUTTS example, 171
number of levels, 153
process, 170
and requirements, 183
set rules, 145-146
source node ID, 168-169
speed, 150
team members, 145
use, 178
user statements, 171
work activity note groups, 150
work roles, 147-148
Work activity data, domain-complex system
analyst and designer ideas, 113
complex and esoteric domains, 99-100
contextual data "bins", creation, 106-107
customer and user people, 100-101
customer organization, visit, 99
data collection, 114
description, 99b
designers create, 112
design ideas, 113
emotional impact, 116
goals, 108
group interview, 103
"key" people, 101-102
note taking, 110
numbering system, use, 110-111
observation and interview, task data, 109
partnerships, users, 108
phenomenological aspects, 116
process, 111-112
product perspective, 103-106
real users, 102
right conditions, 102-103
team, 100
trust and rapport, 108
video recording, 109-110
visits, 107
work artifacts, collection, 114
work roles, 115-116
Work activity notes
creating and managing
anticipated data bins, 143-144
interview and observation, 136
prints, 144
raw user work activity data, 136-137
synthesization, 137-143
groups, 150
mind-sets, work activity notes, 146-147
Work activity theory, 125b, 355
Work environment models
artifact
construction, 237-238
restaurant, 236b, 236f
slideshow presentations, 238b
work products, 235-237
physical
creation, 239
description, 238
envision, 242
MUTTS, 241b
slideshow presentations, 239b