This book is destined to become a primary reference for just about anyone involved in the development of interactive products of almost any kind. It addresses both the design process and design principles and goes beyond traditional usability to address all aspects of the user experience. The authors have distilled two careers' worth of research, practice and teaching into a concise, practical and comprehensive guide for anyone involved in designing for the user experience of interactive products.-Deborah J. Mayhew, Deborah J. Mayhew & Associates
The UX Book covers the methods and guidelines for interaction design and evaluation that have been shown to be the most valuable to students and professionals. The students in my classes have been enthusiastic about the previous versions of this text that they used. This book will benefit anyone who wants to learn the right way to create high quality user experiences. Like good user interfaces, this text has been refined through multiple iterations and feedback with actual users (in this case, feedback from students and faculty who used earlier versions of the book in classes), and this is evident in the final result.- Brad A. Myers, Professor, Human-Computer Interaction Institute, School of Computer Science, Carnegie Mellon University
The UX Book takes on a big challenge: a comprehensive overview of what it takes to design great user experiences. Hartson and Pyla combine theory with practical techniques: you leave the book knowing not just what to do, but why it's important.-Whitney Quesenbery, WQusability, author, Global UX: Design and research in a connected world
The UX Book
Process and Guidelines for Ensuring a Quality User Experience
REX HARTSON
PARDHA S. PYLA
AMSTERDAM • BOSTON • HEIDELBERG • LONDON • NEW YORK • OXFORD • PARIS • SAN DIEGO • SAN FRANCISCO • SINGAPORE • SYDNEY • TOKYO
Morgan Kaufmann is an imprint of Elsevier
Acquiring Editor: Rachel Roumeliotis
Development Editor: David Bevans
Project Manager: André Cuello
Designer: Joanne Blank
Cover Designer: Colin David Campbell of Bloomberg L.P.
Morgan Kaufmann is an imprint of Elsevier 225 Wyman Street, Waltham, MA 02451, USA
© 2012 Elsevier, Inc. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher's permissions policies, and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency can be found at our website: www.elsevier.com/permissions.
This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).
Notices
Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods or professional practices may become necessary. Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information or methods described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.
To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors, assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.
Library of Congress Cataloging-in-Publication Data
Application submitted
British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library.
ISBN: 978-0-12-385241-0
Printed in the United States of America
12 13 14 10 9 8 7 6 5 4 3 2 1
For information on all MK publications visit our website at www.mkp.com
"Don't panic!"¹
¹ Douglas Adams, The Hitchhiker's Guide to the Galaxy
Preface
GOALS FOR THIS BOOK
Our main goal for this book is simple: to help readers learn how to create and refine interaction designs that ensure a quality user experience (UX). A good user interface is like an electric light: when it works, nobody notices it. (We used to be able to use the telephone as a similar example, but now multifunction cell phones with all kinds of modalities have thrown that example under the bus.) A good user interface seems obvious, but what is not obvious is how to design it so that it facilitates a good user experience. Thus, this book addresses both what constitutes a positive user experience and the process by which it can be ensured.
Books need to be designed too, which means establishing user (reader) experience goals, requirements, user role (audience) definitions, and the like. Our goals for the reader experience include ensuring that:
• the book is easy to read
• the material is easy to learn
• the material is easy to apply
• the material is useful to students and practitioners
• the reader experience is at least a little bit fun
Our goals for the book content include:
• expanding the concept of traditional usability to a broader notion of user experience
• providing a hands-on, practical guide to best practices and established principles in a UX lifecycle
• describing a pragmatic process built on an iterative, evaluation-centered UX lifecycle template for managing the overall development effort
• expanding the traditional role of design in the iterative lifecycle to embrace design thinking and ideation to address the new characteristics embodied within user experience
• providing interaction design guidelines, including in-depth discussion of affordances and other foundational concepts
• facilitating an understanding of key interaction design creation and refinement activities, such as:
  – contextual inquiry to understand user work that the design is to support
  – contextual analysis to make sense of the raw contextual inquiry data
  – requirements extraction
  – design-informing modeling
  – conceptual and detailed design
  – establishing user experience goals, metrics, and targets
  – building rapid prototypes
  – performing formative user experience evaluation
  – iterative interaction design refinement
• describing alternative agile UX development methods
• providing pointers on how to get started with these ideas in your own work environment
Our goals for scope of coverage include:
• depth of understanding: detailed information about different aspects of the UX process (like having an expert accompanying the reader)
• breadth of understanding: as comprehensive as space permits
• range of application: the process and the design infrastructure and vocabulary, including guidelines, are not just for GUIs and the Web but for all kinds of interaction styles and devices, including ATMs, refrigerators, road signs, ubiquitous computing, embedded computing, and everyday things
As we were wrapping up this book, the following quote from Liam Bannon (2011) came to our attention:
Some years ago, HCI researcher Panu Korhonen of Nokia outlined to me how HCI is changing, as follows: In the early days the Nokia HCI people were told "Please evaluate our user interface, and make it easy to use." That gave way to "Please help us design this user interface so that it is easy to use." That, in turn, led to a request: "Please help us find what the users really need so that we know how to design this user interface." And now, the engineers are pleading with us: "Look at this area of life, and find us something interesting!" This, in a nutshell, tells a story of how HCI has moved from evaluation of interfaces through design of systems and into general sense-making of our world.
We were struck by this expressive statement of past, present, and future directions of the field of HCI. It was our goal in this book to embrace this scope of historical roots, the changing perspectives of thought, and future design directions.
USABILITY IS STILL IMPORTANT
The study of usability, a key component of ensuring a quality user experience, is still an essential part of the broad and multidisciplinary field of human- computer interaction. It is about getting our users past the technology and focusing on getting things done for work. In other words, it is about designing the technology as an extension of human capabilities to accomplish something and to be as transparent as possible in the process.
A simple example can help boost this oft-unexplained imperative, "make it transparent," into more than a nice platitude. Consider the simple task of writing with pencil and paper. The writer's focus is all about capturing expressions to convey content and meaning. Much mental energy can be directed toward organizing the thoughts and finding the right words to express them. No thought at all should be necessary toward the writing tools, the pencil and paper, or computer-based word processor. These tools are simply an extension of the writer. Until, that is, the occurrence of a breakdown, something that causes an attention shift from the task to the tools.
Perhaps the pencil lead breaks or a glitch occurs in the word processor software. The writer must turn attention away from the writing and think about how to get the software to work, making the tool that was transparent to the writer in the writing task become the focus of a breakdown recovery task (Heidegger, 1962; Weller & Hartson, 1992). Similarly, interaction designs that cause usability breakdowns for users turn attention away from the task to the computer and the user interface.
BUT USER EXPERIENCE IS MORE THAN USABILITY
As our discipline evolves and matures, more and more technology companies are embracing the principles of usability engineering, investing in sophisticated usability labs and personnel to "do usability." As these efforts are becoming effective at ensuring a certain level of usability in the products, leveling the field on that front, new factors have emerged to distinguish the different competing products.
While usability is essential to making technology transparent, in these days of extreme competition among different products and greater consumer awareness, that is not sufficient. Thus, while usability engineering is still a foundation for what we do in this book, it does not stop there. Because the focus is still on designing for the human rather than focusing on technology, "user-centered design" is still a good description. We now use a new term to express a concern beyond just usability: "user experience."
The concept of user experience conjures a broader image of what users come away with, inviting comparisons with theatre (Quesenbery, 2005), updating the old acronyms-for example, WYXIWYG, What You eXperience Is What You Get (Lee, Kim, & Billinghurst, 2005)-and spawning conferences-for example, DUX, Designing for User Experience. We will see that, in addition to traditional usability attributes, user experience entails social and cultural interaction, value-sensitive design, and emotional impact-how the interaction experience includes "joy of use," fun, and aesthetics.
A PRACTICAL APPROACH
This book takes a practical, applied, hands-on approach, based on the application of established and emerging practices, principles, and proven methods to ensure a quality user experience. The process is about practice, drawing on the creative concepts of design exploration and visioning to make designs that appeal to the emotions of users, while also drawing on engineering concepts of cost-effectiveness-making things as good as the resources permit, but not necessarily perfect.
The heart of the book is an iterative and evaluation-centered UX lifecycle template, called the Wheel, for interaction design in Part I: Process. Lifecycle activities are supported by specific methods and techniques spelled out in Chapters 3 through 19, illustrated with examples and exercises for you to apply yourself. The process is complemented by a framework of principles and guidelines in Part II: Design Infrastructure and Guidelines for getting the right content into the product. And, throughout, we try to keep our eye on the prize, the pragmatics of making it all work in your development environment.
ORDER OF THE MATERIAL
We faced the question of whether to present the process first or the design infrastructure material. We chose to start with the process because the process contains development activities that should precede design. We could just as well have started with the design infrastructure chapters, especially the interaction design guidelines, and you can read the book in that order, too.
One important reason for covering the process first is a practical consideration in the classroom. In our experience, we have found it effective to teach process first so that students can get going immediately on their semester-long team project. Perhaps their designs might be a little better if they had the guidelines first, but we find that it does not matter, as their projects are about learning the process, not making the best designs. Later, when we do get into the design guidelines, the students appreciate it more because they have a process structure for where it all goes.
Use the Index
Use the index! We have tried to keep the text free of inter-section references. So, if you see a term you do not understand, use the index to find out where it is defined and discussed.
OUR AUDIENCE
This book is not a survey of human-computer interaction, usability, or user experience. Nor is it about human-computer interaction research. It is a how-to-do-it handbook, field guide, and textbook for students aspiring to be practitioners and practitioners aspiring to be better. The approach is practical, not formal or theoretical. Some references are made to the related science, but they are usually to provide context to the practice and are not necessarily elaborated.
Anyone involved in, or wishing to learn more about, creating interaction designs to ensure a quality user experience will benefit from this book. It is appropriate for a broad spectrum of readers, including all kinds of practitioners: interaction designers, graphic designers, usability analysts, software engineers, programmers, systems analysts, software quality-assurance specialists, human factors engineers, cognitive psychologists, cosmic psychics, trainers, technical writers, documentation specialists, marketing personnel, and project managers. Practitioners in any of these areas will find the hands-on approach of this book to be valuable and can focus mainly on the how-to-do-it parts.
Researchers in human-computer interaction will also find useful information about the current state of user interaction design and guidelines in the field. Software engineers will find this book easy to read and apply because it relates interaction design processes to those in software engineering.
Academic readers include teachers or instructors and students. The perspectives of student and practitioner are very similar; both have the goal of learning, only in slightly different settings and perhaps with different motivations and expectations.
We have made a special effort to support teachers and instructors for use in a college or university course at the undergraduate or graduate level. We are especially mindful that many of our teacher/instructor readers might be faced with teaching this material for the first time or without much background of their own. We have included, especially in the separate instructor's guide, much material to help them get started.
In addition to the material for course content, we have compiled a wide range of pedagogical and administrative support materials, for example, a comprehensive set of course notes, suggested course calendar, sample syllabi, project assignments, and even sample course Web pages. The exercises are adapted easily for classroom use in an ongoing, semester-long set of in-class activities to design, prototype, and evaluate an interaction design. As instructors gain experience with the course, we expect they will tailor the materials, style, and content to the needs of their own particular setting.
We also speak to our audiences in terms of their backgrounds and needs. We want those working to develop large domain-complex systems in large-scale projects to have a sufficiently robust process for those jobs. We also want to address young "UXers" who might think the full process is overly heavy and engineering-like. We offer multiple avenues to lighter-weight processes. For many parts of the full process we offer abridged approaches.
In addition, we have added a chapter on rapid evaluation techniques and a chapter on agile UX methods, paralleling the agile software engineering processes in the literature. But we want these readers to understand that the abridged and agile processes they might use for product and small system development are grounded in full and robust processes used to develop systems with complex domains. Even if one always takes the abridged or agile path, it helps to appreciate the full process, to understand what is being abridged. Also, no matter what part of this book you need, you will find it valuable to see it set in a larger context.
Some readers will want to emphasize contextual inquiry, whereas others will want to focus on design. Although many of the process chapters have an engineering flavor, the design chapter takes on the more "designerly" essence of design thinking, sketching, and ideation. Still others will want the heaviest coverage on evaluation of all kinds, as that is the "payoff" activity. We take the approach that the broadest coverage will reach the needs of the broadest of audiences. Each reader can customize the way of reading the book, deciding which parts are of interest and ignoring and skipping over any parts that are not.
INCREASING MATURITY OF THE DISCIPLINE AND AUDIENCE
We are approaching two decades since the first usability engineering process books, such as Nielsen (1993), Hix and Hartson (1993), and Mayhew (1999), and human-computer interaction as a discipline has since evolved and matured considerably. We have seen the World Wide Web mature to become a stock medium of commerce. The mobile communications revolution keeps users connected to one another at all times. New interaction techniques emerge and become commonplace overnight to make the users' information literally a "touch" away.
Despite all these technological advances, the need for a quality user experience remains paramount. If anything, the importance of ensuring a positive user experience keeps increasing. Given the pervasive information overload, combined with the expectation that everyone is computer savvy, the onus on designing for a quality user experience is even more critical these days.
Among all these advances, many of the concepts of existing design and development paradigms are more or less unchanged, but emerging new paradigms are stretching our understanding and definition of our primary mandate-to create an interaction design that will lead to a quality user experience. Approaches to accomplish this mandate have evolved from engineering-oriented roots in the early 1990s to more design-driven techniques today.
Although much has been added to the literature about parts of the interaction development process, the process is still unknown to or misunderstood by many, and its value often goes unrecognized. For example, many still believe it is just about "usability testing."
Since our first book (Hix & Hartson, 1993), we have conducted many short courses and university courses on this material, working with literally hundreds of students and user experience practitioners at dozens of locations in business, industry, and government. We have learned quite a bit more about what works and what does not.
It is clear that, in this same period of time, the level of sophistication among our audiences has increased enormously. At the beginning we always had to assume that most people in our classes had no user experience background, had never heard of user experience specialists, and, in fact, needed some motivation to believe in the value of user experience. As time went on, we had to adjust the short course to audiences that required no motivation and audiences increasingly knowledgeable about the need for quality user experience and what was required to achieve it. We started getting user experience specialists in the class, both self-taught and graduates of other user experience courses.
WHAT WE DO NOT COVER
Although we have attempted a broad scope of topics, it is not possible to include everything in one book, nor is it wise to attempt it. We apologize if your favorite topic is excluded, but we had to draw the line somewhere. Further, many of these additional topics are so broad in themselves that they cannot be covered adequately in a section or chapter here; each could (and most do) fill a book of their own.
Among the topics not included are:
• Accessibility and the Americans with Disabilities Act (ADA)
• Internationalization and cultural differences
• Ergonomic health issues, such as repetitive stress injury
• Specific HCI application areas, such as societal challenges, healthcare systems, help systems, training, and designing for elders or other special user populations
• Special areas of interaction, such as virtual environments or 3D interaction
Additionally, our extensive discussions of evaluation, such as usability testing, are focused on formative evaluation, evaluation used to iteratively improve interaction designs. Tutorials on performing summative evaluation (to assess a level of performance with statistically significant results) are beyond our scope.
ABOUT THE EXERCISES
The Exercises Are an Integral Part of the Course Structure
A Ticket Kiosk System is used as an ongoing user interaction development example for the application of material in examples throughout the book. It provides the "bones" upon which you, the reader or student, can build the flesh of your own design for quality user experience. In its use of hands-on exercises based on the Ticket Kiosk System, the book is somewhat like a workbook. After each main topic, you get to apply the new material immediately, learning the practical techniques by active engagement in their application.
Take Them in Order
As explained earlier, we could have interchanged Part I and Part II; either part can be read first. Beyond this option, the book is designed mainly for sequential reading. Each process chapter and each design infrastructure chapter build on the previous ones and add a new piece to the overall puzzle. Because the material is cumulative, we want you to be comfortable with the material from one chapter before proceeding to the next. Similarly, each exercise builds on what you learned and accomplished in the previous stages-just as in a real-world project.
For some exercises, especially the one in which you build a rapid prototype, you may want to spread the work over a couple of days rather than the couple of hours indicated. Obviously, the more time you spend working on the exercises, the more you will understand and appreciate the techniques they are designed to teach.
Do the Exercises in a Group if You Can
Developing a good interaction design is almost always a collaborative effort, not performed in a vacuum by a single individual. Working through the exercises with at least one other interested person will enhance your understanding and learning of the materials greatly. In fact, the exercises are written for small teams because most of these activities involve multiple roles. You will get the most out of the exercises if you can work in a team of three to five people.
The teamwork will help you understand the kinds of communication, interaction, and negotiation that take place in creating and refining an interaction design. If you can season the experience by including a software developer with responsibility for software architecture and implementation, many new communication needs will become apparent.
Students
If you are a student in a course, the best way to do the exercises is to do them in teams, as in-class exercises. The instructor can observe and comment on your progress, and you can share your "lessons learned" with other teams.
Practitioners: Get buy-in to do the exercises at work
If you are a practitioner or aspiring practitioner trying to learn this material in the context of your regular work, the best way of all is an intensive short course with team exercises and projects. Alternatively, if you have a small interaction design team in your work group, perhaps a team that expects to work together on a real project, and your work environment allows, set aside some time (say, two hours every Friday afternoon) for the team exercises. To justify the extra overhead to pull this off, you will probably have to convince your project manager of the value added. Depending on whether your manager is already UX literate, your justification may have to start with a selling job for the value of a quality user experience (see Chapter 23).
Individuals
Do not let the lack of a team stop you from doing the exercises. Try to find at least one other person with whom you can work or, if necessary, get what you can from the exercises on your own. Although it would be easy to let yourself skip the exercises, we urge you to do as much on each of them as your time permits.
PROJECTS
Students
Beyond the exercises, more involved team projects are essential in a course on development for a quality user experience. The course behind this book is, and always has been, a learn-by-doing course-both as a university course and in all of our short courses for business and industry.
In addition to the small-scale, ongoing example application used by teams as a series of in-class activities in conjunction with the book exercises, we cannot emphasize enough the importance of a substantial semester-long team project outside of class, using a real client from the community-a local company, store, or organization that needs some kind of interactive software application designed. The client stands to get some free consulting and even a system prototype in exchange for serving as the project client.
Instructors: See the instructor's guide for many details on how to organize and conduct these larger team projects. The possibilities for project applications are boundless; we have had students develop interaction designs for all kinds of applications: electronic mail, an interactive Monopoly game, a personnel records system, interactive Yellow Pages, a process control system, a circuit design package, a bar-tending aid, an interactive shopping cart, a fast-food ordering system, and so on.
Practitioners
As a way of getting started in transferring this material to your real work environment, you and your existing small team can select a low-risk project. You or your co-workers may already be familiar and even experienced with some of those activities and may even already be doing some of them in your development environment. By making them part of a more complete and informed development lifecycle, you can integrate what you know with new concepts presented in the book.
For example, many development teams use rapid prototyping. Nonetheless, many teams do not know how to make a low-fidelity prototype (as opposed to one programmed on a computer) or do not know what to do with such a prototype once they have one. Many teams bring in users and have them try out the interaction design, but teams often do not know what data are most important to collect during user sessions and do not know the most effective analyses to perform once they have collected those data. Many do not know about the most effective ways to use evaluation data to get the best design improvements for the money. And very few developers know about measurable user experience targets-what they are, how to establish them, and how to use them to help improve the user experience of an interaction design and to manage the process. We hope this book will help you answer such questions.
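The measurable user experience targets mentioned above can be pictured with a small sketch. This is a purely hypothetical illustration, not code from the book: the `UXTarget` class, the measure name, and all the numbers are invented here. The idea it shows is only that a target pairs a measure with a baseline level and a target level, and that data observed in a formative evaluation session can be checked against the target.

```python
from dataclasses import dataclass

@dataclass
class UXTarget:
    """One measurable UX target (hypothetical sketch)."""
    measure: str      # what is measured, e.g., task time in seconds
    baseline: float   # level observed with the current design
    target: float     # level the new design should achieve

    def met_by(self, observed: float) -> bool:
        # In this sketch, lower is better (time- or error-style measures)
        return observed <= self.target

# Invented numbers for a Ticket Kiosk System task:
kiosk_target = UXTarget(measure="ticket purchase time (s)",
                        baseline=120.0, target=45.0)
print(kiosk_target.met_by(38.5))   # mean observed in an evaluation session
```

In practice such targets are recorded in a table rather than code, but the comparison step is the same: each evaluation iteration either meets the target, ending iteration on that measure, or points to where the design needs further refinement.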
ORIGINS OF THE BOOK
Real-World Experience
Although we have been researchers in human-computer interaction, we have both also been teachers and practitioners who have successfully used the techniques described in this book for real-world development projects, and we know of dozens, if not hundreds, of organizations that are applying this material successfully.
One of us (RH) has been teaching this material for 30 years in both a university setting and a short course delivered to hundreds of practitioners in business, industry, government, and military organizations. Obviously a much broader audience can be reached by a book than can be taught in person, which is why we have written this book. Because this book is rooted in those courses, the material has been evaluated iteratively and refined carefully through many presentations over a large number of years.
Research and Literature
In the Department of Computer Science at Virginia Tech, we (RH and colleagues) established one of the pioneering research programs in human-computer interaction back in 1979. Over the years, our work has had the following two important themes.
• Getting usability, and now UX, right in an interaction design requires an effective development process integrated within larger software and systems development processes.
• The whole point of work in this discipline, including research, is to serve effective practical application in the field.
The first point implies that human-computer interaction and designing for user experience have strong connections to software and systems engineering. Difficulties arise if human-computer interaction is treated only as a psychology or human factors problem or if it is treated as only a computer science problem. Many people who enter the HCI area from computer science do not bring to the job an appreciation of human factors and the users. Many people who work in human factors or cognitive psychology do not bring an appreciation for problems and constraints of the software engineering world.
The development of high-quality user interaction designs depends on cooperation between the roles of design and implementation. The goals of much of our work in the past decade have been to help (1) bridge the gap between the interaction design world and the software implementation world and (2) forge the necessary connections between UX and software engineering lifecycles.
The second defining theme of our work over the past years has been technology exchange between academia and the real world-getting new concepts out into the real world and bringing fresh ideas from the field of praxis back to the drawing boards of academia. Ideas from the labs of academia are just curiosities until they are put into practice, tested and refined in the face of real needs, constraints, and limitations of a real-world working environment.
Because this book is primarily for practitioners, however, it is not formal and academic. As a result, it contains fewer references to the literature than would a research-oriented book. Nonetheless, essential references have been included; after all, practitioners like to read the literature, too. The work of others is acknowledged through the references and in the acknowledgments.
AROUSING THE DESIGN "STICKLER" IN YOU
We are passionate about user experience, and we hope this enthusiasm will take hold within you, too. As an analogy, Eats, Shoots & Leaves: The Zero Tolerance Approach to Punctuation by Lynne Truss (2003) is a delightful book entirely about punctuation (imagine!). If her book rings bells for you, it can arouse what she calls your inner punctuation stickler. You will become particular and demanding about proper punctuation.
With this book, we hope to arouse your inner design stickler. We could think of no happier outcome in our readers than to have examples of poor interaction designs and correspondingly dreadful user experiences trigger in you a ghastly private emotional response and a passionate desire to do something about it.
This book is for those who design for users who interact with almost any kind of device. The book is especially dedicated to those in the field who get "hooked on UX," those who really care about the user experience, the user experience "sticklers" who cannot enter an elevator without analyzing the design of the controls.
FURTHER INFORMATION ON OUR WEBSITE
Despite the large size of this book, we had more material than we could fit into the chapters, so we have posted a large number of blog entries about additional but related topics, organized by chapter. See this blog on our website at TheUXBook.com. At this site you will also find additional readings for many of the topics covered in the book.
ABOUT THE AUTHORS
Rex Hartson is a pioneer researcher, teacher, and practitioner-consultant in HCI and UX. He is the founding faculty member of HCI (in 1979) in the Department of Computer Science at Virginia Tech. With Deborah Hix, he was co-author of one of the first books to emphasize the usability engineering process, Developing User Interfaces: Ensuring Usability Through Product & Process. Hartson has been principal investigator or co-PI at Virginia Tech on a large number of research grants and has published many journal articles, conference papers, and book chapters. He has presented many tutorials, invited lectures, workshops, seminars, and international talks. He was editor or coeditor for Advances in Human-Computer Interaction, Volumes 1-4, Ablex Publishing Co., Norwood, New Jersey. His HCI practice is grounded in over 30 years of consulting and user experience engineering training for dozens of clients in business, industry, government, and the military.
Pardha S. Pyla is a Senior User Experience Specialist and Lead Interaction Designer for Mobile Platforms at Bloomberg LP. Before that he was a researcher and a UX consultant. As an adjunct faculty member in the Department of
Computer Science at Virginia Tech he worked on user experience methodologies and taught graduate and undergraduate courses in HCI and software engineering. He is a pioneering researcher in the area of bridging the gaps between software engineering and UX engineering lifecycle processes.
Acknowledgments
I (RH) must begin with a note of gratitude to my wife, Rieky Keeris, who provided me with a happy environment and encouragement while writing this book. While not trained in user experience, she playfully engages a well-honed natural sense of design and usability with respect to such artifacts as elevators, kitchens, doors, airplanes, entertainment controls, and road signs that we encounter in our travels over the world. You might find me in a lot of different places but, if you want to find my heart, you have to look for wherever Rieky is.
I (PP) owe a debt of gratitude to my parents and my brother for all their love and encouragement. They put up with my long periods of absence from family events and visits as I worked on this book. I must also thank my brother, Hari, for being my best friend and a constant source of support as I worked on this book.
We are happy to express our appreciation to Debby Hix, for a career-long span of collegial interaction. We also acknowledge several other individuals with whom we've had a long-term professional association and friendship at Virginia Tech, including Roger Ehrich, Bob and Bev Williges, Tonya Smith-Jackson, and Woodrow Winchester. Similarly, we are grateful for our collaboration and friendship with these other people who are or were associated with the Department of Computer Science: Ed Fox, John Kelso, Sean Arthur, Mary Beth Rosson, and Joe Gabbard. We are also grateful to Deborah Tatar and Steve Harrison of the Center for Human-Computer Interaction at Virginia Tech
for steering us to consider more seriously the design thinking paradigm of HCI.
We are indebted to Brad Myers of Carnegie Mellon University for the use of ideas, words, examples, and figures in the contextual inquiry and modeling chapters. Brad was instrumental in the evolution of the material in this book through his patient adoption of and detailed feedback from early and incomplete trial versions.
In addition, we wish to thank Janet Davis of Grinnell College for her adoption of an early draft of this book and for her detailed and insightful feedback.
Thanks also to Jon Meads of Usability Architects, Inc. for help with ideas for the chapter on agile UX methods and to John Zimmerman of CMU for suggesting alternative graphical representations of some of the models.
Additionally, one paragraph of Chapter 4 was approved by Fred Pelton.
Susan Wyche helped with discussions and introduced us to Akshay Sharma, in the Virginia Tech Department of Industrial Design. Very special thanks to Akshay for giving us personal access to the operations of the Department of Industrial Design and to his approach to teaching ideation and sketching.
Akshay also gave us access to photograph the ideation studio and working environment there, including students at work and the sketches and prototypes they produced. And finally our thanks for the many photographs and sketches provided by Akshay to include as figures in design chapters.
It is with pleasure we acknowledge the positive influence of Jim Foley, Dennis Wixon, and Ben Shneiderman, with whom friendship goes back decades and transcends professional relationships.
We thank Whitney Quesenbery for discussions of key ideas and encouragement to keep writing. Thanks also to George Casaday for many discussions over a long-term friendship. We would like to acknowledge Elizabeth Buie for a long and fruitful working relationship and for helpful discussions about various topics in the book. And we must mention Bill Buxton, a friend and colleague who was a major influence on the material about sketching and ideation.
We are grateful for the diligence and professionalism of the many, many reviewers over the writing lifecycle, for amazingly valuable suggestions that have helped make the book much better than what it started out to be. Especially to Teri O'Connell and Deborah J. Mayhew for going well beyond the call of duty in detailed manuscript reviews.
We wish to thank the Department of Computer Science at Virginia Tech for all the support and encouragement.
Among those former students especially appreciated for volunteering untold hours of fruitful discussions are Terence Andre, Steve Belz, and Faith McCreary. I (RH) enjoyed my time working with you three and I appreciate what you contributed to our discussions, studies, and insights.
Susan Keenan, one of my (RH) first Ph.D. students in HCI, was the one who started the User Action Framework (UAF) work. Jose (Charlie) Castillo and Linda van Rens are two special friends and former research collaborators.
We wish to thank all the HCI students, including Jon Howarth and Miranda Capra, we have had the pleasure of working with over the years. Our discussions
about research and practice with Jon and Miranda have contributed considerably to this book. We extend our appreciation to Tejinder Judge for her extensive help with studies exploring contextual inquiry and contextual analysis.
We also acknowledge all the students in classes where early drafts of this book were tested for their feedback and suggestions.
We also wish to acknowledge Mara Guimarães da Silva for very dedicated, generous, and conscientious help in gathering and formatting the references in this book.
Special thanks to Colin David Campbell of Bloomberg L.P. for the design of the book cover and many diagrams in the book.
Thanks to Mathilde Bekker and Wolmet Barendregt for discussions during my (RH) visits to Technische Universiteit Eindhoven (TU/e) in the Netherlands.
Many thanks to Phil Gray and all the other nice people in the Department of Computing Science at the University of Glasgow for hosting my (RH) wonderful sabbatical in 1989. Special thanks to Steve Draper, Department of Psychology, University of Glasgow, for providing a comfortable and congenial place to live while I was there in 1989. And thanks to Dan Olson for good memories of doing contextual studies on the Isle of Mull.
And thanks to Jeri Baker, the director of the ONE Spirit organization (www.nativeprogress.org), who has put up with my (RH) absence from my post in helping her with that organization while working on this book.
It is not possible to name everyone who has contributed to or influenced our work, professionally or personally, and it is risky to try. We have interacted with a lot of people over the years whose inputs
have benefitted us in the writing. If you feel that we have missed an acknowledgement to you, we apologize; please know that we appreciate you nonetheless. Our thanks go out to you anonymous contributors.
Finally, we thank the students for the fun we have had with them at Usability Day parties and at dinners and picnics at Hartveld. In particular, we thank Terence Andre for creating the UAF hat, used at many meetings, and Miranda Capra for baking a UAF cake for one of our famous Fourth of July parties.
Finally, we are grateful for all the support from André Cuello, Dave Bevans, Steve Elliot, and all the others at Morgan Kaufmann. It has been a pleasure to work with this organization.
Guiding Principles for the UX Practitioner
Be goal-directed.
Don't be dogmatic; use your common sense.
Context is everything.
The answer to most questions is "it depends."
It's about the people.
Everything should be evaluated in its own way.
Improvise, adapt, and overcome.
Contents
PREFACE ix
ACKNOWLEDGMENTS xxiii
GUIDING PRINCIPLES FOR THE UX PRACTITIONER xxvii
Chapter 1: Introduction 1
1.1 Ubiquitous interaction 1
1.2 Emerging desire for usability 7
1.3 From usability to user experience 9
1.4 Emotional impact as part of the user experience 24
1.5 User experience needs a business case 33
1.6 Roots of usability 36
Chapter 2: The Wheel: A Lifecycle Template 47
2.1 Introduction 47
2.2 A UX process lifecycle template 53
2.3 Choosing a process instance for your project 60
2.4 The system complexity space 64
2.5 Meet the user interface team 73
2.6 Scope of UX presence within the team 75
2.7 More about UX lifecycles 75
Chapter 3: Contextual Inquiry: Eliciting Work Activity Data 87
3.1 Introduction 87
3.2 The system concept statement 96
3.3 User work activity data gathering 98
3.4 Look for emotional aspects of work practice 120
3.5 Abridged contextual inquiry process 120
3.6 Data-driven vs. model-driven inquiry 121
3.7 History 125
Chapter 4: Contextual Analysis: Consolidating and Interpreting Work Activity Data 129
4.1 Introduction 129
4.2 Organizing concepts: work roles and flow model 132
4.3 Creating and managing work activity notes 136
4.4 Constructing your work activity affinity diagram (WAAD) 144
4.5 Abridged contextual analysis process 157
4.6 History of affinity diagrams 159
Chapter 5: Extracting Interaction Design Requirements 161
5.1 Introduction 161
5.2 Needs and requirements: first span of the bridge 163
5.3 Formal requirements extraction 165
5.4 Abridged methods for requirements extraction 178
Chapter 6: Constructing Design-Informing Models 181
6.1 Introduction 181
6.2 Design-informing models: second span of the bridge 181
6.3 Some general "how to" suggestions 184
6.4 A new example domain: slideshow presentations 186
6.5 User models 187
6.6 Usage models 209
6.7 Work environment models 235
6.8 Barrier summaries 242
6.9 Model consolidation 244
6.10 Protecting your sources 246
6.11 Abridged methods for design-informing models extraction 246
6.12 Roots of essential use cases in software use cases 248
Chapter 7: Design Thinking, Ideation, and Sketching 251
7.1 Introduction 251
7.2 Design paradigms 253
7.3 Design thinking 259
7.4 Design perspectives 261
7.5 User personas 264
7.6 Ideation 274
7.7 Sketching 284
7.8 More about phenomenology 291
Chapter 8: Mental Models and Conceptual Design 299
8.1 Introduction 299
8.2 Mental models 299
8.3 Conceptual design 305
8.4 Storyboards 316
8.5 Design influencing user behavior 324
8.6 Design for embodied interaction 328
8.7 Ubiquitous and situated interaction 331
Chapter 9: Design Production 333
9.1 Introduction 333
9.2 Macro view of lifecycle iterations for design 334
9.3 Intermediate design 337
9.4 Detailed design 339
9.5 Wireframes 340
9.6 Maintain a custom style guide 348
9.7 Interaction design specifications 350
9.8 More about participatory design 352
Chapter 10: UX Goals, Metrics, and Targets 359
10.1 Introduction 359
10.2 UX goals 361
10.3 UX target tables 362
10.4 Work roles, user classes, and UX goals 363
10.5 UX measures 364
10.6 Measuring instruments 365
10.7 UX metrics 378
10.8 Baseline level 381
10.9 Target level 381
10.10 Setting levels 382
10.11 Observed results 386
10.12 Practical tips and cautions for creating UX targets 386
10.13 How UX targets help manage the user experience engineering process 388
10.14 An abridged approach to UX goals, metrics, and targets 389
Chapter 11: Prototyping 391
11.1 Introduction 391
11.2 Depth and breadth of a prototype 393
11.3 Fidelity of prototypes 395
11.4 Interactivity of prototypes 398
11.5 Choosing the right breadth, depth, level of fidelity, and amount of interactivity 402
11.6 Paper prototypes 407
11.7 Advantages of and cautions about using prototypes 418
11.8 Prototypes in transition to the product 420
11.9 Software tools for prototyping 422
Chapter 12: UX Evaluation Introduction 427
12.1 Introduction 427
12.2 Formative vs. summative evaluation 429
12.3 Types of formative and informal summative evaluation methods 432
12.4 Types of evaluation data 435
12.5 Some data collection techniques 436
12.6 Variations in formative evaluation results 464
Chapter 13: Rapid Evaluation Methods 467
13.1 Introduction 467
13.2 Design walkthroughs and reviews 469
13.3 UX inspection 470
13.4 Heuristic evaluation, a UX inspection method 472
13.5 Our practical approach to UX inspection 479
13.6 Do UX evaluation rite 484
13.7 Quasi-empirical UX evaluation 487
13.8 Questionnaires 490
13.9 Specialized rapid UX evaluation methods 490
13.10 More about "discount" UX engineering methods 492
Chapter 14: Rigorous Empirical Evaluation: Preparation 503
14.1 Introduction 503
14.2 Plan for rigorous empirical UX evaluation 504
14.3 Team roles for rigorous evaluation 506
14.4 Prepare an effective range of tasks 508
14.5 Select and adapt evaluation method and data collection techniques 509
14.6 Select participants 511
14.7 Recruit participants 513
14.8 Prepare for participants 516
14.9 Do final pilot testing: fix your wobbly wheels 528
14.10 More about determining the right number of participants 529
Chapter 15: Rigorous Empirical Evaluation: Running the Session 537
15.1 Introduction 537
15.2 Preliminaries with participants 537
15.3 Protocol issues 539
15.4 Generating and collecting quantitative UX data 543
15.5 Generating and collecting qualitative UX data 545
15.6 Generating and collecting emotional impact data 548
15.7 Generating and collecting phenomenological evaluation data 550
15.8 Wrapping up an evaluation session 552
15.9 The humaine project 553
Chapter 16: Rigorous Empirical Evaluation: Analysis 555
16.1 Introduction 555
16.2 Informal summative (quantitative) data analysis 556
16.3 Analysis of subjective questionnaire data 561
16.4 Formative (qualitative) data analysis 561
16.5 Cost-importance analysis: prioritizing problems to fix 576
16.6 Feedback to process 589
16.7 Lessons from the field 590
Chapter 17: Evaluation Reporting 593
17.1 Introduction 593
17.2 Reporting informal summative results 595
17.3 Reporting qualitative formative results 597
17.4 Formative reporting content 599
17.5 Formative reporting audience, needs, goals, and context of use 601
Chapter 18: Wrapping Up UX Evaluation 611
18.1 Goal-directed UX evaluation 611
18.2 Choose your UX evaluation methods 612
18.3 Focus on the essentials 615
18.4 Parting thoughts: be flexible and avoid dogma during UX evaluation 616
18.5 Connecting back to the lifecycle 618
Chapter 19: UX Methods for Agile Development 619
19.1 Introduction 619
19.2 Basics of agile SE methods 620
19.3 Drawbacks of agile SE methods from the UX perspective 625
19.4 What is needed on the UX side 626
19.5 Problems to anticipate 633
19.6 A synthesized approach to integrating UX 634
Chapter 20: Affordances Demystified 643
20.1 What are affordances? 643
20.2 A little background 644
20.3 Four kinds of affordances in UX design 646
20.4 Affordances in interaction design 650
20.5 False cognitive affordances misinform and mislead 655
20.6 User-created affordances as a wake-up call to designers 657
20.7 Emotional affordances 660
Chapter 21: The Interaction Cycle and the User Action Framework 663
21.1 Introduction 663
21.2 The interaction cycle 664
21.3 The user action framework: adding a structured knowledge base to the interaction cycle 674
21.4 Interaction cycle and user action framework content categories 675
21.5 Role of affordances within the UAF 685
21.6 Practical value of UAF 686
Chapter 22: UX Design Guidelines 689
22.1 Introduction 689
22.2 Using and interpreting design guidelines 695
22.3 Human memory limitations 696
22.4 Selected UX design guidelines and examples 702
22.5 Planning 703
22.6 Translation 708
22.7 Physical actions 761
22.8 Outcomes 768
22.9 Assessment 773
22.10 Overall 789
22.11 Conclusions 801
Chapter 23: Connections with Software Engineering 803
23.1 Introduction 803
23.2 Locus of influence in an organization 806
23.3 Which scenario is right for you? 811
23.4 Foundations for success in SE-UX development 812
23.5 The challenge of connecting SE and UX 818
23.6 The ripple model to connect SE and UX 824
23.7 Conclusions 827
Chapter 24: Making It Work in the Real World 831
24.1 Putting it to work as a new practitioner 831
24.2 Be a smart UX practitioner 838
24.3 UX professionalism 839
24.4 Cost-justifying UX 840
24.5 UX within your organization 848
24.6 Parting words 861
REFERENCES 863
EXERCISES 887
INDEX 905
Chapter 1: Introduction
Fine art and pizza delivery, what we do falls neatly in between.
- David Letterman
1.1 UBIQUITOUS INTERACTION
1.1.1 Desktops, Graphical User Interfaces, and the Web Are Still Here and Growing
The "old-fashioned" desktop, laptop, and network-based computing systems are alive and well and seem to be everywhere, an expanding presence in our lives. And domain-complex systems are still the bread and butter of many business, industry, and government operations. Most businesses are, sometimes precariously, dependent on these well-established kinds of computing. Web addresses are commonplace in advertisements on television and in magazines. The foreseeable future is still full of tasks associated with "doing computing," for example, word processing, database management, storing and retrieving information, spreadsheet management. Although it is exciting to think about all the new computing systems and interaction styles, we will need to use processes for creating and refining basic computing applications and interaction styles for years to come.
1.1.2 The Changing Concept of Computing
That said, computing has now gone well beyond desktop and laptop computers, well beyond graphical user interfaces and the Web; computing has become far more ubiquitous (Weiser, 1991). Computer systems are being worn by people and embedded within appliances, homes, offices, stereos and entertainment systems, vehicles, and roads. Computation and interaction are also finding their way into walls, furniture, and objects we carry (briefcases, purses, wallets, wrist watches, PDAs, cellphones). In the 2Wear project (Lalis, Karypidis, & Savidis, 2005), mobile computing elements are combined in different ways by short-distance wireless communication so that system behavior and functionality adapt to different user devices and different usage locations. The eGadget project (Kameas & Mavrommati, 2005) similarly features self-reconfiguring artifacts, each with its own sensing, processing, and communication abilities.
Sometimes, when these devices can be strapped on one's wrist or in some way attached to a person's clothing, for example, embedded in a shoe, they are called wearable computers. In a project at MIT, volunteer soldiers were instrumented with sensors that could be worn as part of their clothing, to monitor heart rate, body temperature, and other parameters, to detect the onset of hypothermia (Zieniewicz et al., 2002).
"Smart-its" (Gellersen, 2005) are embedded devices containing microprocessors, sensors, actuators, and wireless communication to offer additional functionality to everyday physical world artifacts that we all "interact" with as we use them in familiar human activities. A simple example is a set of car keys that help us track them so we can find them if they are lost.
Another example of embedding computing artifacts involves uniquely tagging everyday objects such as milk and groceries using inexpensive machine-readable identifiers. It is then possible to detect changes in those artifacts automatically. For example, using this technology it is possible to remotely poll a refrigerator using a mobile phone to determine what items need to be picked up from the grocery store on the way home (Ye & Qiu, 2003). In a project at MIT that is exactly what happened, or at least was envisioned: shoes were instrumented so that, as the wearer gets the milk out for breakfast in the morning, sensors note that the milk is getting low. Approaching the grocery store on the way home, the system speaks via a tiny earphone, reminding of the need to pick up some milk (Schmandt, 1995).
Most of the user-computer interaction attendant to this ubiquitous computing in everyday contexts is taking place without keyboards, mice, or monitors. As Cooper (2004) says, you do not need a traditional user interface to have interaction.
Practical applications in business already reveal the almost unlimited potential for commercial application. Gershman and Fano (2005) cite an
INTRODUCTION 3
example of a smart railcar that can keep track of and report on its own location, state of repair, whether it is loaded or empty, and its routing, billing, and security status (including aspects affecting homeland security). Imagine the promise this shows for improved efficiency and cost savings over the mostly manual and error-prone methods currently used to keep track of railroad cars.
Proof-of-concept applications in research labs are making possible what was science fiction only a few years ago. Work at the MIT Media Lab (Paradiso, 2005), based on the earlier "Smart Matter" initiative at Xerox PARC, employs sensate media (Paradiso, Lifton, & Broxton, 2004) arranged as surfaces tiled with dense sensor networks, in the manner of biological skin, containing multimodal receptors and sensors. The goal is to use this kind of embedded and distributed computing to emulate living, sensitive tissue in applications such as robotics, telemedicine, and prosthetics. Their Tribble (Tactile Reactive Interface Built By Linked Elements) is an interesting testbed using a spherical structure of these nodes that can sense pressure, temperature, sound, illumination, and tactile stimulations and can respond with sound, vibration, and light.
More and more applications that were in research labs are now moving into commercial adoption. For example, robots in more specialized applications than just housecleaning or babysitting are gaining in numbers (Scholtz, 2005). There are robotic applications for healthcare rehabilitation, including systems to encourage severely disabled children to interact with their environment (Lathan, Brisben, & Safos, 2005), robotic products to assist the elderly (Forlizzi, 2005), robots as laboratory hosts and museum docents (Sidner & Lee, 2005), robot devices for urban search and rescue (Murphy, 2005), and, of course, robotic rover vehicles for unmanned space missions (Hamner et al., 2005).
1.1.3 The Changing Concept of Interaction
Sitting in front of a desktop or laptop usually conveys a feeling of "doing computing" to users. Users are aware of interacting with a computer and interaction is purposeful: for exchanging information, for getting work done, for learning, for play or entertainment, or just for exploring.
When we drive a car we are using the car's built-in computer and maybe even a GPS, but we do not think of ourselves as "doing computing." Tscheligi (2005) paraphrases Mark Weiser: "the world is not a desktop." Perhaps the most notable and most recognizable (by the public) example of interaction away from the desktop is seen in mobile communications. With an obviously enormous market potential, mobile communications are perhaps the fastest growing area of ubiquitous computing with personal devices and also represent one of the most intense areas of designing for a quality user experience (Clubb, 2007; Kangas & Kinnunen, 2005; Macdonald, 2004; Venkatesh, Ramesh, & Massey, 2003).
Designing for a Quality User Experience in 3D Applications
Doug A. Bowman, Department of Computer Science, Virginia Tech
Motion controls. Freehand gestures. "Natural" user interfaces. They go by many names, but interfaces involving physical interaction in 3D space are cropping up everywhere these days. Instead of pressing buttons or pushing on joysticks, gamers are swinging their arms, jumping up and down, or leaning their whole bodies to play in 3D virtual worlds. Instead of using a remote control, people are making mid-air gestures to control the components of their home theaters. Instead of looking for restaurants on a 2D map, mobile phone users look at augmented views of the real world through their phone's cameras. All this 3D interaction is certainly very cool, but does it necessarily make interfaces more "natural" or usable? How should we design 3D interaction to ensure a quality user experience?
Three-dimensional user interfaces (3D UIs) are very much an open field of research; there is much we do not yet know. What I am going to review here are a few of the major things we have learned over the last couple of decades of research in this area. For a comprehensive introduction to the field of 3D UIs, see the book 3D User Interfaces: Theory and Practice (Addison-Wesley, 2005).
As you might expect, 3D UIs that replicate an action that people do in the real world can be very successful. We call these "natural" or "high-fidelity" 3D UIs. For example, using physical turning and walking movements (measured by a position tracking system) to change your view of the virtual world is easy to comprehend and results in high levels of spatial understanding. Swinging your arms to make your character swing a virtual golf club is fun and engaging, requiring no special expertise. But natural 3D interaction has its limitations, as well. It can be difficult to reproduce exactly the action people use in the real world, resulting in misunderstanding. An experienced golfer might expect a slight twitch of the wrists at impact to cause the ball to draw from right to left, but it is unlikely that the interface designer included this in the technique. In fact, if an extremely realistic golf swing technique were developed, it probably would not be very fun for most players-I personally would only hit the ball 50 yards much of the time!
Another limitation of natural 3D interaction is that the user is constrained to things they can do in the real world. This leads to our second guideline, which is that "magic" 3D interaction can allow users to perform many tasks more quickly and effectively. It is a virtual world, after all, so why restrict ourselves to only real-world abilities? Magic techniques can be used to enhance our physical abilities (e.g., a person can pick up a 10-story building and place it somewhere else in the virtual city), our perceptual abilities (e.g., we can give the user "X-ray vision" like Superman so she can see what is on the other side of the wall), and even our cognitive abilities (e.g., the system can provide instructions to users to help them navigate through a complicated 3D world).
While we do not want to constrain the user's abilities in a 3D UI, we do want to provide constraints that help the user to interact more easily and effectively. For example, in an application for interior designers, even though we could allow users to place furniture anywhere in 3D space, it only makes sense to have furniture sitting upright on the floor. Therefore, 3D manipulation techniques in this case should only allow the user to control three parameters: 2D position on the floor and rotation around the vertical axis. Many 3D input devices are inherently
underconstrained because they allow the user to move them freely in 3D space and do not remain in place when the user lets go. Helpful constraints can be added to the system with the use of haptic feedback, which can be passive (e.g., using a physical piece of plastic to provide a surface for 2D input) or active (based on a force feedback display, such as the Sensable Phantom).
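The furniture constraint just described, reducing a freely tracked 6-DOF pose to the three parameters that make sense for the task, can be sketched in a few lines. This is only an illustration of the idea; the function name, parameter layout, and floor convention here are hypothetical, not from the book:

```python
import math

def constrain_furniture_pose(x, y, z, yaw, pitch, roll, floor_y=0.0):
    """Snap a 6-DOF tracked pose down to the 3 DOF appropriate for
    furniture placement: 2D position on the floor plane plus rotation
    about the vertical axis. (Hypothetical helper for illustration.)"""
    # Keep horizontal position, but force the object onto the floor.
    constrained_position = (x, floor_y, z)
    # Keep only yaw (normalized to [0, 2*pi)); discard pitch and roll
    # so the object always stays upright.
    constrained_rotation = (yaw % (2 * math.pi), 0.0, 0.0)
    return constrained_position, constrained_rotation
```

Applying such a filter each frame means that however the user waves the input device around, the virtual sofa can only slide across the floor and spin in place, which is exactly the kind of helpful constraint the paragraph above argues for.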
If appropriate constraints are not provided, users not only become less precise, they may also become fatigued (imagine how tired your arm would feel if you tried to sketch 3D shapes in mid-air for 15 minutes). So the last guideline I want to highlight is to design for user comfort. In many computer interfaces, physical comfort is not a major issue, but 3D interaction usually involves large-scale physical movements and the use of many parts of the body (not just the hand and fingers). What is more, 3D UIs for virtual reality often involve big, surrounding 3D displays that can make users feel dizzy or even nauseated. As a result, 3D UI designers have to take special care to design interfaces that keep users feeling as comfortable as possible. For example, manipulation techniques should allow users to interact with their arms propped against their bodies or a physical surface. 3D UIs should avoid rapid movements through the virtual world or unnatural rotations of the view that can make people feel sick. And if stereoscopic displays are used, keeping virtual objects at a comfortable distance can help avoid eye strain.
Well-designed 3D UIs can make for an engaging, enjoyable, and productive user experience. Knowing the foundational principles of human-computer interaction and UX design is a great start, but using 3D-specific results and guidelines such as these will help ensure that your 3D interaction is a success.
As an aside, it is interesting that even the way these devices are presented to the public reveals underlying attitudes and perspectives with respect to user-centeredness. For example, among the synonyms for the device, "cellphone" refers to their current implementation technology, while "mobile phone" refers to a user capability.
Interaction, however, is doing more than just reappearing in different devices such as we see in Web access via mobile phone. Weiser (1991) said ".. . the most profound technologies are those that disappear." Russell, Streitz, and Winograd (2005) also talk about the disappearing computer-not computers that are departing or ceasing to exist, but disappearing in the sense of becoming unobtrusive and unremarkable. They use the example of electric motors, which are part of many machines we use daily, yet we almost never think about electric motors per se. They talk about "making computers disappear into the walls and interstices of our living and working spaces."
When this happens, it is sometimes called "ambient intelligence," the goal of considerable research and development aimed at the home living environment. In the HomeLab of Philips Research in the Netherlands (Markopoulos et al., 2005), researchers believe "that ambient intelligence technology will mediate, permeate, and become an inseparable companion of our everyday social interactions at work or at leisure."
In these embedded systems, of course, the computer only seems to disappear. The computer is still there somewhere and in some form, and the challenge is to design the interaction so that the computer remains invisible or unobtrusive and interaction appears to be with the artifacts, such as the walls, directly. So, with embedded computing, certainly the need for a quality user experience does not disappear. Imagine embedded computing with a design that leads to poor usability; users will be clueless and will not have even the familiar menus and icons to find their way!
Even interaction via the olfactory senses, that is, aromatic output, has been suggested for human-computer interaction (HCI) (Kaye, 2004), based on the claim that the sense of smell, well used in ordinary daily life, is a human sense underused in HCI.
So far, our changing concepts of interaction have involved at least some kind of computation element, even if it is embedded electronic devices that do very specialized computation. Given the many different definitions of "interaction" in the HCI literature, we turned to the English definition of the word: mutual or reciprocal action, effect, or influence, as adapted from Dictionary.com. So, interaction involves an exchange, but is definitely not limited to computer systems.
In the realm of user experience, this concept of mutual effect implies that interaction must be considered within a context or environment shared between system and user. User input, if accepted by the system, causes a change in the internal system state and both user and system can cause changes in the external world, for example, move a mechanical part or adjust another system.
The user's part of interaction is often expressed through explicit user actions, used to direct the interaction toward a goal. A user-related input to a system in his or her environment can also be extracted or sensed by the environment, without a deliberate or conscious action by the user. For example, a "smart wall," a wall with ambient intelligence, can proactively extract inputs it needs from a user by sensing the user's presence and identifying the user with something like radio-frequency identification technology instead of just responding to a user's input actions. It is still user-system interaction, only the system is controlling the inputs. Here the dictionary definition given earlier, relating technology to an effect or influence, definitely makes sense, with "action" being only part of that definition.
The system can also extract other inputs, absent any users, by sensing them in the state of its own environment, for example, a high-temperature warning sensor. It may then act to change its own internal state and, possibly, its external environment, for example, to adjust the temperature lower, without involving a user. This kind of automated system operation probably does not come under the aegis of human-machine interaction, although such a system would surely also involve human interaction for start-up, setting parameters, and other overall controls.
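The sense-and-respond behavior described above, where the system controls its own inputs and acts without any deliberate user action, can be sketched as a simple control loop. This is only an illustrative sketch; the class and method names here are hypothetical and not drawn from the text:

```python
# A minimal sketch of system-controlled input: the system extracts an
# input from its environment (a sensor reading), updates its internal
# state, and may act on the external world, with no user involved.

class Thermostat:
    def __init__(self, setpoint=22.0, tolerance=1.0):
        self.setpoint = setpoint    # desired temperature (deg C)
        self.tolerance = tolerance  # allowed deviation before acting
        self.cooling = False        # internal system state

    def sense_and_act(self, reading):
        """Sense the environment, update internal state, and possibly
        change the environment (start or stop cooling)."""
        if reading > self.setpoint + self.tolerance:
            self.cooling = True     # act on the external world
        elif reading < self.setpoint - self.tolerance:
            self.cooling = False    # stop acting
        return self.cooling

t = Thermostat()
t.sense_and_act(25.0)  # high reading: system starts cooling
t.sense_and_act(20.0)  # low reading: system stops cooling
```

The human interaction the text mentions (start-up, setting parameters) corresponds here only to choosing the setpoint; the loop itself runs without user input actions.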
As another example of how our concept of interaction is intended to be very inclusive, consider road or highway signage. A road sign is like a computer message or user interface label in that it helps users (drivers) know what to do. In response, drivers take (driving) actions within the larger highway system. Most of the material in this book can be considered to be about interaction in a much more general sense than traditional HCI, including human-machine interaction, for example, with telephones and ATMs, and even human-world interaction, such as interacting to navigate the structure of a museum.
1.2 EMERGING DESIRE FOR USABILITY
In the distant past, computer usage was esoteric, conducted mostly by a core of technically oriented users who were not only willing to accept the challenge of overcoming poor usability, but who sometimes welcomed it as a barrier to protect the craft from uninitiated "outsiders." Poor usability was good for the mystique, not to mention job security.
Sometimes, even more recently, we have what Cooper (2004, p. 26) calls "dancing bear" software, where a great idea triumphs over poor design. It is about having features so good that users cannot do without them, even if the product has a terrible interaction design. Just having a bear that can dance leads one to overlook the fact that it cannot dance very well. Users are so grateful to have the functionality that they are willing to work around an interaction design that fell out of the ugly tree and hit every branch on the way down. Success despite poor interaction design can be used as a justification for resisting change and keeping the bad design ideas: "We have been doing it that way, our product is selling phenomenally, and our users love it." Think of how much better it could be with a good design.
As more people began to use computers, the general public and the press were slow to realize that we all can demand a better user experience. Statements of misplaced blame fail to inform or educate the public about the role of user experience in design. For example, the failure of voting machines in Florida was blamed by the press on improperly trained poll workers and confused voters. No one publicly asked why it takes so much training to operate a simple ballot machine, or why citizens experienced with voting were confused by this system.
We are now seeing comments by people about the usability of everyday situations. The first three paragraphs of The Long Dark Tea-Time of the Soul (Adams, 1990, pp. 1-2) by one of our favorite authors, Douglas Adams (decidedly not a user experience specialist), open with this amazingly perspicacious observation on the design of most airports:
It can hardly be a coincidence that no language on earth has ever produced the expression "As pretty as an airport."
Airports are ugly. Some are very ugly. Some attain a degree of ugliness that can only be the result of a special effort. This ugliness arises because airports are full of people who are tired, cross, and have just discovered that their luggage has landed in Murmansk (Murmansk airport is the only known exception to this otherwise infallible rule), and architects have on the whole tried to reflect this in their designs.
They have sought to highlight the tiredness and crossness motif with brutal shapes and nerve-jangling colors, to make effortless the business of separating the traveler forever from his or her luggage or loved ones, to confuse the traveler with arrows that appear to point at the windows, distant tie racks, or the current position of Ursa Minor in the night sky, and wherever possible to expose the plumbing on the grounds that it is functional, and conceal the location of the departure gates, presumably on the grounds that they are not.
Poor designs can indeed look so bad to users that they are forced to assume they could not be that bad unless it was deliberate, as this character in Douglas Adams' novel did. And that is only half the story when you consider designs that look beautiful but are totally unusable. In contrast, we want to use technology to learn things, to be entertained, to connect with others, and to do good in the world. In technology now, people look beyond sheer functionality or even usability to beauty, emotional satisfaction, meaning in what they do, and for intellectual gratification.
To many, one of the most significant motivations for the field of user experience is a concern about software product quality. Unfortunately, the software industry does little to dispel concerns about quality. For example, consider this "warranty," taken verbatim from a software product and typical of what we get with most software we buy:
This software is provided without warranty of any kind. The manufacturer does not warrant that the functions contained in the software will meet your requirements, or that the operation of the software will be uninterrupted or error-free, or that defects in the software will be corrected.
Does this not seem to say: "We do not do a good job. We do not care. And you cannot do anything about it."? Who would buy any other kind of consumer product, a TV or a car, with this kind of "warranty"? So why have we put up with this in software products?
Disastrous system development case studies give much more depth to motivating the need for usability and user experience. Marcus and Gasperini (2006) tell of an emergency-response system developed for the San Jose (CA) Police Department, a mobile, in-vehicle communication system for dispatchers and officers in cars. The police had a good working system that they had perfected and customized through years of use, but the underlying technology was too old. Unfortunately, the committee appointed to gather requirements did not include police officers and their focus was on functionality and cost, not usability. No user focus groups or contextual inquiry were considered and, not surprisingly, the mobile response functions and tasks were addressed minimally in requirements.
The resulting system had serious flaws; key information was missing while unneeded information was highlighted. Layouts were confusing and labeling was inconsistent: the typical list you would expect from an early user experience evaluation, only this was in the final system. Officer users were confused and performed poorly, to the point of creating risks to their safety in the field.
The lack of feedback channels from initial users precluded fixing problems in subsequent versions. Extensive training was prescribed but could not be given due to cost. In the end, a very expensive new system had led to life-threatening perils for officer users, the situation became highly politicized, emotions ran high, and lawsuits were threatened. Much more money had to be spent in an attempt to fix major problems after the fact. This is a clear story of how a failure to take a user-experience-oriented and user-centered approach to design led to truly extensive and awful consequences. A process to ensure a quality user experience that may seem to complicate things upfront can benefit everyone (customers, users, UX practitioners, designers, marketing people, and the public) in the long run.
1.3 FROM USABILITY TO USER EXPERIENCE
1.3.1 The Traditional Concept of Usability
Human-computer interaction is what happens when a human user and a computer system, in the broadest sense, get together to accomplish something. Usability is that aspect of HCI devoted to ensuring that human-computer interaction is, among other things, effective, efficient, and satisfying for the user. So usability1 includes characteristics such as ease of use, productivity, efficiency, effectiveness, learnability, retainability, and user satisfaction (ISO 9241-11, 1997).
1.3.2 Misconceptions about Usability
While usability is becoming more and more an established part of the technology world, some misconceptions and mischaracterizations still linger. First, usability is not what some people used to call "dummy proofing." While it might have been mildly cute the first time it was used, this term is insulting and demeaning to users and designers alike. Similarly, usability is not equivalent to being "user-friendly." This is a misdirected term; to say that it is about friendliness trivializes the scope of the interaction design process and discounts the importance of user performance in terms of user productivity, etc. As users, we are not looking for amiability; we need an efficient, effective, safe, and maybe aesthetic and fun tool that helps us reach our goals.
To many not familiar with the field, "doing usability" is sometimes thought of as equivalent to usability testing. While usability evaluation plays a very important part, maybe even a starring role, in interaction design, it is by no means all there is in the interaction design creation and refinement process, as we will see in this book.
Finally, another popular misconception about usability has to do with visual appeal. We know of cases where upper management said something to the effect that "after the software is built, I want the usability people to make it look pretty." While visual design is an integral and important part of usability, it is not the only part of interaction design.
1.3.3 The Expanding Concept of Quality in Our Designs
The field of interaction design has grown slowly, and our concept of what constitutes quality in our designs has expanded from an engineering focus on user performance under the aegis of usability into what is now widely known as user experience. As with most new concepts, it takes a while for even those who embrace the concept to agree on its definition (Dagstuhl, 2010).
Within the evolution of a growing field it is natural to see aspirations for considerable breadth. For example, Thomas and McCredie (2002) call for "new usability" to account for "new design requirements such as ambience or attention." At a CHI 2007 Special Interest Group (SIG) meeting (Huh et al., 2007), the discussion focused on "investigating a variety of approaches (beyond usability) such as user experience, aesthetic interaction, ambiguity, slow technology,2 and various ways to understand the social, cultural, and other contextual aspects of our world."
1Also sometimes referred to as "pragmatic quality" or "ergonomic quality" (Hassenzahl et al., 2000) and includes such attributes as simplicity and controllability.
1.3.4 Is Not Emotional Impact What We Have Been Calling User Satisfaction?
Some say the emphasis on these emotional factors is nothing new; after all, user satisfaction, a traditional subjective measure of usability, has always been a part of the concept of traditional usability shared by most people, including the ISO 9241-11 standard definition. Also, user satisfaction questionnaires are about how users feel, or at least about their opinions. As Hassenzahl et al. (2000) point out, at least in practice and as reflected in most usability questionnaires, this kind of user satisfaction has been thought of as a result of how users experience usability and usefulness.
As a result, these user satisfaction questionnaires have elicited responses that are more intellectual responses than emotional ones; they have not traditionally included much about what we call emotional impact.3 We as a profession did not focus on those aspects as much as we did on objective user performance measures such as efficiency and error counts. Technology and design have evolved from being just productivity-enhancing tools to more personal, social, and intimate facets of our lives. Accordingly, we need a much broader definition of what constitutes quality in our designs and quality in the user experience those designs beget.
1.3.5 Functionality Is Important, but a Quality User Experience Can Be Even More So
All other things being equal, a product that affords a better user experience often outsells one with even more functionality. For example, take the BlackBerry: once a market leader in smartphones, it is now outclassed by the iPhone, a later entrant into the market with fewer functional capabilities. There are many factors governing the relative market share of each product, but given comparably capable products, user experience is arguably the most important. The iPod, iPhone, and iPad are products that represent cool high technology with excellent functionality, but they are also examples that show the market is now not just about the features; it is about careful design for a quality user experience as a gateway to that functionality.
2From the abstract of this workshop summary paper: slow technology [is] a design agenda for technology aimed at reflection and moments of mental rest rather than efficiency in performance.
3Also sometimes referred to as hedonic quality (Schrepp, Held, & Laugwitz, 2006), perceived or experienced hedonic quality (Hassenzahl, Beu, & Burmester, 2001), or emotional usability (Logan, 1994).
Most users assume that they are getting correct and complete functional capability in their software, but the interface is their only way to experience the functionality. To users, the interaction experience is the system. And plain old usability still plays a role here. Users have an effort threshold beyond which they give up and are not able to access the desired functionality. Larry Marine (1994) puts it this way: "If the users can't use a feature, it effectively does not exist." He describes usability testing of a new version of a system and how users commented that they wished they had a certain feature on the current system and how frequently they would use it. But the current product already had that feature and designers wondered why users would ask for something they already had. The answer was clear: the users did not have it because it was not accessible to them.
Another instructive example once again comes from Apple. When Apple introduced the functionality for users to back up their data on the Macintosh platform, a seemingly mundane and somewhat boring task for most of us, they did so with a stellar interaction design. They introduced a cool, fun metaphor, that of a time machine (also the name of this feature) that users can take to go "back in time" to retrieve files that were deleted or lost accidentally. The backup procedure itself was automated for the most part, and all the user needed to do was connect a backup medium to their Mac. The interesting thing here is that Microsoft, Apple's competitor, had backup capabilities in their operating systems at least since Windows 95! However, because of poor usability, most users did not know it existed, and those of us who did rarely used it. The effort software engineers spent to include the feature in the application functionality was wasted, another cost of poor usability.
Hassenzahl and Roto (2007) state the case for the difference between the functional view of usability and the phenomenological view of emotional impact. People have and use technical products because "they have things to do"; they need to make phone calls, write documents, shop on-line, or search for information. Hassenzahl and Roto call these "do goals," appropriately evaluated by the usability and usefulness measures of their "pragmatic quality." Human users also have emotional and psychological needs, including needs involving self-identity, relatedness to others, and being satisfied with life. These are "be goals," appropriately evaluated by the emotional impact and phenomenological measures of their "hedonic quality."
1.3.6 A Good User Experience Does Not Necessarily Mean High-Tech or "Cool"
Often when a new cool and high-tech product is announced, technology enthusiasts and the public alike are impressed and many equate this product sizzle with amazing user experience. Much of the world culture, except the dispossessed, who are excluded from the mixed blessing of new technology, has come almost to worship high technology just because it is so cool. But for actual users the reaction can quickly shape-shift from amazement to annoyance to abomination when a failed interaction design in the cool new device becomes a barrier to its use. Clearly, while it is possible to harness new technology to serve real usability, "cool" and high technology are not intrinsic benefactors of a quality user experience.
As a case in point, in Figure 1-1 we show what was once a new Microsoft packaging design for Vista4 and some Office products, such as this one for Office Accounting Professional 2007.
As posted in a Windows Vista blog, the Microsoft designer proudly proclaims: "With Windows Vista and 2007 Office system, we didn't just redesign the software packages themselves, but are also introducing new packaging for the two products. The packaging has been completely revised and, we hope, foreshadows the great experience that awaits you once you open it." Later in the posting, it says, "Designed to be user-friendly, the new packaging is a small, hard, plastic container that's designed to protect the software inside for life-long use. It provides a convenient and attractive place for you to permanently store both discs and documentation. The new design will provide the strength, dimensional stability and impact resistance required when packaging software today. Our plan is to extend this packaging style to other Microsoft products after the launch of Windows Vista and 2007 Office system."
Other follow-up postings by readers of that blog declare, "It looks really nice and should really stand out on the shelves. Good job folks!" and "This looks awesome, really." And "Wow! I must say, I'm very, very impressed by this; excellent job guys." But these are reactions from people who have only seen a picture of the packaging. The reaction from actual users might eventually cause Microsoft to rethink their plan of switching to this as their "standard" packaging.
A glimpse of the same design from the user's, in this case the opener's, stance can be seen in Joel Spolsky's on-line column "Joel on Software" (Spolsky, 2007).
Figure 1-1. A new Microsoft software packaging design.
4Now we are delighted to see an updated version of Vista: Windows 7, otherwise known as Hasta la Vista (baby).
In an article entitled "Even the Office 2007 box has a learning curve," Spolsky says: "I simply could not figure out how to open the bizarre new packaging. It represents a complete failure of industrial design; an utter 'F' in the school of Donald Norman's Design of Everyday Things. To be technical about it, it has no true affordances and actually has some false affordances: visual clues as to how to open it that turn out to be wrong." And: "[This] is just the first of many ways that Office 2007 and Vista's gratuitous redesign of things that worked perfectly well shows utter disregard for all the time you spent learning the previous versions." Postings elsewhere by actual users contained similar sentiments.
Looking at these boxes displayed in stores, some of them actually have small instruction sheets on how to open the box taped on the outside. Upon closer inspection, this box design is a victim of false affordances (Chapter 20). With what looked like hinges on one side, the box looked like a book, a shared design convention, but would not open like one, a violation of using shared conventions to explain an affordance. In our informal testing, several people with advanced degrees in computer science had significant trouble opening the box. Furthermore, the box was difficult to stack and wasteful of desk drawer space.
To give the benefit of the doubt, we expect that Microsoft attempted to create an extraordinary user experience, starting from the time a user lays eyes on the software box in a store. However, the designers probably forgot that less box-savvy people would have to use this complicated design with curves and hinges. Clearly, even in just packaging, the best user experience requires a balance of functionality, usability, aesthetics, branding, identity, and so on.
Just as a good user experience is not necessarily about being cool, it is also not about technology for technology's sake. Many years ago our university changed its phone system over to an all-digital exchange. At the time, the new phones seemed cool and powerful; users heard all about the different kinds of things they could do with call forwarding, paging, conference calls, and so on.
However, their initial enthusiasm for all this functionality faded quickly when they saw the 90-page "summary" user manual; no one read it, and by now almost everyone has lost it. No one ever saw or mentioned the presumably larger "full" manual. Loss of enthusiasm turned to rebellion when the university sent out word that they expected everyone to take a half-day training course on using this new phone system. One of the faculty expressed the feeling of many, "I've been using a telephone all my life and I certainly don't need a training course about a telephone now. All I want to do is make phone calls like I used to."
When many complained to the communications services department, they were actually told that they had a "low-end model" and that they might appreciate the new phones better if they had a model with even more functionality! Surely this is another case where the thing that will likely make the least improvement in ease of use is adding new technology or functionality.
Years later, we still use these same phones almost exclusively for just making and answering ordinary phone calls, and we mostly ignore the other blinking lights and arrays of buttons with intimidating labels. When we need to set up the occasional conference call, we follow the button presses and sequences on a label stuck on the bottom of the phone, steps that were passed down by word of mouth from co-workers.
1.3.7 Design beyond Just Technology
In this book we consider technology as just one design context, a platform for certain types of design. The design itself is the focus and the reader will feel as much at home in a discussion about principles and guidelines for the design of ATMs or highway signage as about design for high-tech user interfaces.
Design is about creating artifacts to satisfy a usage need, in a language that can facilitate a dialog between the creator of the artifact and the user. That artifact can be anything from a computer system to an everyday object such as a doorknob.
So do not think of this book as being just about interaction design or design of user interfaces for software systems. The interaction design creation and refinement activities described herein apply more universally; they are about design to support human activities-work and play in a context. The context does not have to include software or even much technology. For example, what we say here applies equally well to designing a kitchen for two people to cook together, to the workflow of the DMV, or to the layout of an electronic voting machine.
1.3.8 Components of a User Experience
Let us start by declaring that the concept of usability has not been made obsolete by the new notions of user experience. All of the performance- and productivity-oriented usability factors, such as ease of use and learnability, are still very important in most software systems and even in many commercial products. Especially in the context of using systems associated with complex work domains, it is just as important as ever for users to get work done efficiently and effectively with minimum errors and frustration. The newer concept of user experience still embodies all these implications of usability. How much joy of use would one get from a cool and neat-looking iPad design that was very clumsy and awkward to use? Clearly there is an intertwining, in that some of the joy of use can come from extremely good ease of use.
The most basic reason for considering joy of use is the humanistic view that enjoyment is fundamental to life.
- Hassenzahl, Beu, and Burmester5
As a result, we have expanded the scope of user experience to include:
• effects experienced due to usability factors
• effects experienced due to usefulness factors
• effects experienced due to emotional impact factors
5Hassenzahl, M., Beu, A., & Burmester, M. (2001). Engineering joy. IEEE Software, 18(1), pp. 70-76.
On Designing for the "Visitor Experience"*
Dr. Deborah J. Mayhew, Consultant, Deborah J. Mayhew & Associates1; CEO, The Online User eXperience Institute2
Here I will adopt the definition of "user experience" proposed in this book, that is, that it is something entirely in the head of the user. As product designers, we do everything we can to design something that will result in a good user experience for our target users. As the move from designing desktop software products to designing for Websites has made clear, the user experience may be impacted by more design qualities than usability alone. As a Web user interface designer, I use the term "visitor experience," and I recognize the need to address at least five different qualities of Websites that will impact the experience of a site's visitors:
• Utility
• Functional integrity
• Usability
• Persuasiveness
• Graphic design
These I define as follows.
Utility
It is easy to overlook utility as a quality of a Website design that will impact visitor experience, as it is perhaps the most fundamental. The utility of a Website refers to the usefulness, importance, or interest of the site content (i.e., of the information, products, or services offered by the site) to the visitor. It is of course relative to any particular site visitor: what is interesting or useful to you may not be to me. It is also a continuous quality, that is, some Websites will feel more or less useful or interesting to me than others. For example, many Website visitors love to use social networking sites such as YouTube or Facebook, whereas others find these a total waste of time. I will have no need for a Website that sells carpenter's tools, whereas my neighbor might visit and use that site on a regular basis. This highlights an important fact for designers to keep in mind: a single design will result in multiple visitor experiences depending on variations in the Website visitors themselves. This is why it is always so important to design for a particular target audience, based on solid knowledge about that audience.
1http://drdeb.vineyard.net
2http://www.ouxinstitute.com
Functional Integrity
A Website's functional integrity is simply the extent to which it works as intended. Websites may have "dead" links that go nowhere, they may freeze or crash when certain operations are invoked, they may display incorrectly on some browsers or browser versions, they may download unintended files, etc. A lack of functional integrity is the symptom of buggy, incorrect, or even malicious code. Functional integrity is also a continuous quality: some Websites may have only a few insignificant bugs, others may be almost nonfunctional, and anything in between is possible. In addition, a visitor using one browser or browser version may experience a Website's functional integrity differently as compared to a visitor using another browser.
Usability
Usability of course refers to how easy it is to learn (for first-time and infrequent visitors) and/or use (for frequent visitors) a Website. A site can have high utility and high functional integrity and still be very difficult to learn or inefficient and tedious to use. For example, the Website you use to submit your tax returns may be implemented in flawless code and be relevant to almost every adult, with great potential for convenience and cost savings, but be experienced by many visitors as unacceptably hard to learn or inefficient to use. Conversely, a site might feel very usable, but not very useful to a given visitor, or have low functional integrity. It might be very easy and intuitive to figure out how to perform a task, but the site may consistently crash at a certain point in the task flow so that the task can never be accomplished.
Persuasiveness
Persuasiveness refers to the extent to which the experience visitors have on a Website encourages and promotes specific behaviors, which are referred to as "conversions." What constitutes a conversion varies from site to site, and even non-eCommerce sites may be promoting some type of conversion (e.g., newsletter signup, switching to online tax filing, looking up and using medical information). But persuasiveness is a particularly important design quality on an eCommerce Website, and the primary type of conversion in this case is a sale. So in the case of eCommerce sites, persuasiveness refers mainly to the extent to which the visitor's experience encourages and promotes sales.
Two examples of persuasiveness involve the presence, quality, and location of two types of information: vendor information (e.g., company name, physical address and contact information, company history, testimonials of past customers, and the like) and product information (things such as product color, material, care instructions, and the like). Visitors look for evidence that they can trust an online vendor, especially if they have never heard of it before. Also, they are often unwilling to order a product if they cannot find all the information they need in order to judge whether it will meet their needs. This is why many people will often look for a product on Amazon.com first because it is a trusted vendor and usually provides comprehensive product information, including detailed reviews by other customers. Note that a Website may be experienced as fully functional and highly usable in terms of task completion and offer just what a visitor is looking for, but if it lacks key aspects of persuasiveness, such as adequate vendor and product information, potential sales may be lost. This is not just a loss for the Website owner; it wastes the time of the visitor and foils their goals as well, that is, it impacts their experience negatively.
Graphic Design
Finally, the "look and feel," that is, the graphic design, of a Website can have a significant impact on the visitor experience. The graphic design of a Website-primarily the ways colors, images, and other media are used-evokes emotional reactions in visitors that may or may not contribute to the site's goals. As with other aspects of design that impact the visitor, each visitor's reaction to a given graphic design may be different. You may be bored by soft pastel colors while I may feel reassured and calmed by them. You may find a straightforward and simple graphic design boring while to me it may feel professional and reassuring. I may be put off by sound and animation while you may find it exciting and appealing.
While utility and functional integrity are fairly independent design qualities, the lines among usability, persuasiveness, and graphic design are more blurred. Clearly usability and effective graphic design can contribute to the experience of persuasiveness, and graphic design can contribute significantly to the experience of usability.
Nevertheless, it is useful to consider these design qualities separately in order to understand their importance and apply them effectively during design.
Designing for a great visitor experience requires an interdisciplinary team of experts. The age-old profession of market research is the relevant discipline to employ to achieve the quality of utility. Competent Web development professionals are necessary to ensure functional integrity. Software and Web usability engineering is the expertise needed to achieve usability. There is currently a small but growing field of experts with experience applying marketing and persuasion psychology to eCommerce Web design. Finally, graphic design professionals specializing in Website design provide the design skills and expertise in branding and target audience appeal that Websites need.
The real key here, beyond simply finding resources with the aforementioned skill sets, is to build an effective interdisciplinary design team. Often professionals with these different backgrounds and skill sets are unfamiliar with the other disciplines and with how they can and must work together to design for an optimal visitor experience for a given target audience. At the very least, Website stakeholders need product development team members who respect the expertise of others and are willing to learn to collaborate effectively toward the common goal of a design that results in an optimized experience for intended Website visitors. Together, specialists in these different disciplines can have the most positive impact on the success of Websites by applying their different bodies of knowledge to the site design in a way that will evoke a positive visitor experience in the target audience.
*This essay is a modified excerpt from a chapter called "The Web UX Design Process-A Case Study" that I have written for the forthcoming book Handbook of Human Factors in Web Design (2nd ed.) by Kim-Phuong L. Vu and Robert W. Proctor (Eds.), Taylor & Francis, 2011.
To illustrate the possible components of user experience, we borrow from the domain of fine dining. The usefulness of a meal can be evaluated by calculating the nutritional value, calories, and so on in comparison with the technical nutritional needs of the diner's body. The nutritional value of a meal can be viewed objectively, but can also be felt by the user insofar as the prospect of good nutrition can engender feelings of value added to the experience.
Usefulness can also be reckoned, to some extent, with respect to the diner's immediate epicurean "requirements." A bowl of chilled gefilte fish balls just will not cut it for a gourmand with a taste for a hot, juicy steak. And, when that steak is served, if it is tough and difficult to cut or chew, that will certainly impact the usability of the dining "task."
Of course, eating, especially for foodies, is a largely emotional experience.
Perhaps it starts with the pleasure of anticipation. The diners will also experience a perception of and emotional response to the dining ambiance, lighting, background music, and décor, as well as the quality of service and how good the food tastes. The menu design and information about ingredients and their sources contribute to the utility and the pleasure and value of the overall experience. Part of the emotional impact analogous to the out-of-the-box experience might include the aesthetics of food presentation, which sets the tone for the rest of the dining experience.
1.3.9 User Experience Is (Mostly) Felt Internally by the User
Most in the field will agree that user experience, as the words imply, is the totality of the effect or effects felt (experienced) internally by a user as a result of interaction with, and the usage context of, a system, device, or product. Here, we give the terms "interaction" and "usage" very broad interpretations, as we will explain, including seeing, touching, and thinking about the system or product, including admiring it and its presentation before any physical interaction, the influence of usability, usefulness, and emotional impact during physical interaction, and savoring the memory after interaction. For our purposes, all of this is included in "interaction" and "usage context."
But is user experience entirely felt internally by the user? What about the performance-related parts of usability? Certainly the user experiences and feels internally the effects of performance-related parts of usability, such as increased productivity. However, there are also externally observable manifestations of usability, such as time on task, that represent a component not necessarily felt internally by the user and not necessarily related to emotion. The same holds for usefulness, too. If usability and usefulness are parts of the user experience, and we feel it is useful to consider them as such, then technically not all user experience is felt internally by the user. It is nonetheless convenient to gloss over this exception and, as a general rule, say that:
• usability and usefulness are components of user experience
• user experience is felt internally by the user
When we use the term "usability" by itself we usually are referring to the pragmatic and non-emotional aspects of what the user experiences in usage, including both objective performance measures and subjective opinion measures, as well as, of course, qualitative data about usability problems. In contrast, when we use the broader term "user experience" we usually are referring to what the user does feel internally, including the effects of usability, usefulness, and emotional impact.
1.3.10 User Experience Cannot Be Designed
A user experience cannot be designed, only experienced. You are not designing or engineering or developing good usability or designing or engineering or developing a good user experience. There is no usability or user experience inside the design; they are relative to the user. Usability occurs within, or is revealed within, the context of a particular usage by a particular user. The same design but used in a different context-different usage and/or a different user-could lead to a different user experience, including a different level of, or kind of, usability.
We illustrate this concept with a non-computer example, the experience of enjoying Belgian chocolates. Because the "designer" and producer of the chocolates may have put the finest ingredients and best traditional processes into the making of this product, it is not surprising that they claim in their advertising a fine chocolate experience built into their confections. However, by the reasoning in the previous paragraph, the user experience resides within the consumer, not in the chocolates. That chocolate experience includes anticipating the pleasure, beholding the dark beauty, smelling the wonderful aromas, the deliberate and sensual consumption (the most important part), the lingering bouquet and after-taste, and, finally, pleasurable memories.
When this semantic detail is not observed and the chocolate is marketed with claims such as "We have created your heavenly chocolate experience," everyone still understands. Similarly, no one but the most ardent stickler protests when BMW claims "BMW has designed and built your joy!" In this book, however, we wish to be technically correct and consistent so we would have them say, "We have created sweet treats to ensure your heavenly chocolate experience" or "BMW has built an automobile designed to produce your ultimate driving experience."
To summarize our point in this section, in Figure 1-2 we illustrate how an instance of user experience occurs dynamically in time within an instance of interaction and the associated usage context between design and user. It is almost like a chemical reaction that gives off a by-product, such as caloric⁶ or an extra neutron.
Almost everything in this book depends on this simple, but enormously important, notion of the user experience being the result of a user's interaction with, and usage context of, a design. Although the meaning of this diagram may not be clear at this point in your reading, we hope that these concepts will unfold as you go through this book.
Figure 1-2: User experience occurs within interaction and usage context.
⁶Introduced as the very substance of heat by Lavoisier in the 1770s to debunk the phlogiston theory, but you knew that.
1.3.11 Role of Branding, Marketing, and Corporate Culture
In some cases, the user experience goes even beyond the response to usability, usefulness, and joy of use. There are times when social, cultural, marketing, and political aspects, hardware choices, and the like can influence user experience. Users can get wrapped up in the whole milieu of what the manufacturer stands for, their political affiliations, how the product is marketed, and so on. What image does the brand of a product stand for? Is it a brand that uses environmentally sustainable manufacturing practices? Do they recycle? Consequently, what does the fact that someone is using a product of that particular brand say about them? These factors are more difficult to define in the abstract and more difficult to identify in the concrete.
Clearly these kinds of emotional responses are evoked by more than just product design. For some companies, many of the factors that contribute to this level of user experience may be part of the corporate DNA. For such companies, a quality user experience can be a call to action that aligns all roles toward a common mission, lived through their daily practice.
For example, consider the case of Apple. The culture of designing for user experience is so deeply engrained in their corporate culture that everything they produce has a stamp of tasteful elegance and spectacular design. This kind of fanatic emphasis on quality user experience at Apple extends beyond just the products they produce and even seeps into other areas of their company. When they make an employment offer to a new employee, for example, the package comes in a meticulously designed envelope that sets the stage for what the company stands for (Slivka, 2009b).
Similarly, when Apple sent call center technical support employees a T-shirt as a gift, it arrived in a carefully designed box with the T-shirt folded in a way that inspires a sense of design emphasis (Slivka, 2009a). From the time one walks into an Apple store to the sleek industrial design of the device, everything comes together in one harmonious whole to ensure that users love the device. (NB: We are agnostic in the PC vs. Mac religious wars, so please consider this objectively.) And, again, it is all about design for the user experience. A New York Times article (Hafner, 2007) extols the enchanting aura of Apple stores, "Not only has the company made many of its stores feel like gathering places, but the bright lights and equally bright acoustics create a buzz that makes customers feel more like they are at an event than a retail store." The goal of one new store in Manhattan was to make it "the most personal store ever created." This carefully designed user experience has been very successful in generating sales, return visits, and even tourist pilgrimages.
BMW embodies another corporate example of the importance of designing for emotional impact as part of a company's worldview. The makers of BMW cars have elevated the user experience to new heights in the industry. While this manufacturer could stake their reputation on the engineering aspects of these fine machines, instead their top claim to the world (BMW AG, 2010) is "Joy is BMW! More driving pleasure." And their follow-up statement really shows that it is all about user experience: "What you make people feel is just as important as what you make. And we make joy. Joy is why we built this company; joy is our inspiration. At BMW, we don't just make cars; we make joy."
We mention emotional response in the user experience as part of a corporate culture for completeness here, but it is beyond the scope of this book to say how to build this kind of emotional ambiance surrounding the company and the product. In this book we have to focus on the things we can do something about with the guidelines and processes-and that is design, mainly interaction design.
1.3.12 Why Have Such a Broad Definition?
Why do we want to include so much in our definitions of usage context and user experience? We believe that the user experience can begin well before actual usage. It can start as early as when the user beholds a system or product and its packaging or presentation. It does not necessarily end with actual usage.
After usage, the pleasure, or displeasure, can persist in the user's mind.
This perspective of what the user experiences about the product includes initial awareness of the product, to seeing its advertising, to visiting the store, to viewing it and buying it, to taking it out of the box, to using it, to talking with others who have used it-in other words, it is about a broad cultural and personal experience.
When we put forward this definition at conferences and workshops, sometimes we get criticism that such breadth makes it difficult to enforce, operationalize, and take ownership of user experience-related practices and responsibilities in an organization. But that is exactly the reason why the definition needs to be broad: it needs to implicitly recognize the need for multiple roles to work together, to collaborate and communicate, and to work synergistically to ensure a quality user experience. It frames the efforts toward designing for a user experience in an interdisciplinary context, where everyone from hardware engineers, to visual designers, to branding experts, to interaction designers need to collaborate and coordinate their efforts to define and execute a shared design vision.
1.4 EMOTIONAL IMPACT AS PART OF THE USER EXPERIENCE
The emotional aspects of user experience are just what the term implies. We are talking about pleasure, fun, aesthetics, novelty, originality, sensations, and experiential features-the affective parts of interaction. In particular, it is about the emotional impact of interaction on the user.
Users are no longer satisfied with efficiency and effectiveness; they are also looking for emotional satisfaction.
- Shih and Liu⁷
1.4.1 The Potential Breadth of Emotional Impact
Sometimes a user's reaction to a system or product is extremely emotional, a user experience with a deep, intimate, and personal emotional impact. At other times a user might be mildly satisfied (or dissatisfied) or just a bit pleased. Not all user experiences evoke throes of ecstasy, nor should they. Often just being well satisfied without it rising to a personally emotional level is all a user can afford in terms of emotional involvement with a software system.
But, of course, we all live for the moments when the user experience hits the high end of the emotional impact range, when we experience amazingly cool products (software systems almost never reach these heights). We are talking about a product for which the user experience sets the product apart from the rest in the hearts and minds of discriminating users. Have you ever had something that you really loved to use? Something with a beauty earned by truly fine design?
While other similar products may have an equally usable and useful design, they just do not have that something extra that sparks a deep emotional chord of affinity. The others do not have that indefinable something that transcends form, function, usability, and usefulness, something that elevates the usage experience to pure joy and pleasure, something akin to the appreciation of well- crafted music or art.
Buxton (2007b, p. 127) relates an entertaining and enlightening story of his experiences with personal orange juice squeezers, where subtle design differences made enormous differences in his usage experience. He really likes one above all the rest and the difference is something that, as Buxton (2007b, p. 129) puts it, "sets a whole new standard of expectation or desire." The differences in the product are not necessarily something you can capture in a diagram, specifications, or even photographs of the product. It is something you have to experience; as Buxton again puts it, you "just can't use it without a smile." But you can be sure that the difference is the result of deliberate and skillful design.
⁷Shih, Y.-H., & Liu, M. (2007). The Importance of Emotional Usability. Journal of Educational Technology Systems, 36(2), pp. 203-218.
There is an interesting story from General Motors about product passion. In October 2010, the board of directors quietly discontinued the Pontiac car from the GM line of brands. Of course, the direct cause was the transition through bankruptcy, but the beginning of the end for Pontiac started 26 years earlier.
Before that, Pontiac had its own separate facilities for design, production, and manufacturing with its own people. Owners and wannabe owners were passionate about Pontiac cars and Pontiac employees had been devoted to the brand. The brand had its own identity, personality, and cachet, not to mention the notoriety from custom muscle cars such as the GTO and the Firebird TransAm in Smokey and the Bandit.
In 1984, however, in its great corporate wisdom, GM lumped the Pontiac works in with its other GM facilities. The economically based decision to merge facilities meant no separate ideas for design and no special attention to production. After that, there was really nothing to be devoted to and the passion was lost. Many believe that decision led to the decline and eventual demise of the brand.
So what constitutes real emotional impact in usage? While most of the emotional impact factors are about pleasure, they can be about other kinds of feelings too, including affective qualities such as love, hate, fear, mourning, and reminiscing over shared memories. Applications where emotional impact is important include social interaction (Dubberly & Pangaro, 2009; Rhee & Lee, 2009; Winchester, 2009) and interaction for cultural problem solving (Ann, 2009; Costabile, Ardito, & Lanzilotti, 2010; Jones, Winegarden, & Rogers, 2009; Radoll, 2009; Savio, 2010).
Social and cultural interactions entail emotional aspects, such as trustworthiness (especially important in e-commerce) and credibility. Design for emotional impact can also be about supporting human compassion, for example, in sites such as CaringBridge.org and CarePages.com.
Although there were earlier academic papers about emotion in the user experience, Norman (2004) was one of the first to bring the topic to light on a broad scale, relating it to his theme of everyday things. There are conferences dedicated specifically to the topic, including the biennial Conference on Design & Emotion, the goal of which is to foster a cross-disciplinary approach to design and emotion. Also, the topic is definitely blossoming in the academic literature
(Hassenzahl, 2001; Shih & Liu, 2007). Boucher and Gaver (2006) introduce the notion of ludic values, aimlessly playful qualities such as joy, surprise, delight, excitement, fun, curiosity, play, and exploration.
Attractive things make people feel good
- Donald A. Norman⁸
⁸Norman, D. A. (2004). Emotional Design: Why We Love (Or Hate) Everyday Things. New York: Basic Books.
Connections That Make "Spirit" a Part of UX
Elizabeth Buie, Luminanze Consulting
UX work speaks to the human spirit. Now, before you think I have gone all woo-woo on you, let me explain: By "human spirit," I mean the part of us that seeks connection with something larger than ourselves. This "something larger" can be different things to different people or to the same people at different times and in different contexts. It can be as mundane as nature, a cause, or being a parent; it can be as mystical as God/dess, the Universe, or even, if we stretch it, the Force. It is whatever evokes in us a sense of deep connection, and the human spirit is the part of us that feels this connection.
Let me illustrate with three stories from my own experience.
THE CONNECTEDNESS OF MUSIC
I sing in a group that performs Medieval and Renaissance polyphony-Catholic a cappella music from the 13th to the 17th centuries. Now, I am not by any means traditionally religious (and I have never been Catholic), but this music just speaks to me. The several independent voices in these songs weave in and out to create complex harmonies that are deep, ethereal, and glorious.
For someone raised in the 20th century, learning this stuff is just plain hard. A month in advance of the first rehearsal for each concert, our director sends out learning files in Musical Instrument Digital Interface (MIDI) format. I import these files into music notation software, make my part a French horn played loudly, and make the other parts different instruments played more softly. This allows me to pick out my part easily and in context. I save the results as MP3s, load them onto my iPod, and play them in the car.
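The part-isolation trick described above can be modeled in a few lines. This is a hypothetical sketch, not actual MIDI processing: the track structure and the `solo_part` helper are illustrative simplifications of what the music notation software does when one part is made loud and reassigned to a distinctive instrument.

```python
from typing import Dict, List

# In General MIDI, program 60 (zero-indexed) is French Horn.
FRENCH_HORN = 60

def solo_part(tracks: List[Dict], my_part: str) -> List[Dict]:
    """Return a copy of the tracks with one part made prominent.

    The chosen part is reassigned to a French horn and played loudly;
    every other part keeps its instrument but is played softly, so a
    singer can pick out their own line while still hearing the context.
    """
    result = []
    for track in tracks:
        track = dict(track)  # copy; leave the originals untouched
        if track["name"] == my_part:
            track["program"] = FRENCH_HORN
            track["velocity"] = 110  # loud foreground
        else:
            track["velocity"] = 45   # soft background
        result.append(track)
    return result

score = [
    {"name": "alto", "program": 52, "velocity": 80},
    {"name": "tenor", "program": 52, "velocity": 80},
]
mix = solo_part(score, "alto")
# mix[0] is now a loud French horn; mix[1] is quiet background
```

In practice a notation package applies this kind of per-track transformation to real MIDI data before exporting the audible MP3.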
One morning I was driving to a client meeting, listening to my learning MP3s. The date was close enough to the performance that I knew my melodic lines fairly well (if not the words) and was singing along. In the middle of the Washington, DC rush hour (one of the worst in the United States), my spirit soared. I have since realized that the connection I felt that morning-that sense of oneness with everything around me-was part of my user experience of these technologies ... and so is the even deeper joy I feel when we perform this glorious music together for an audience. Creating this experience involves three pieces of equipment (four, if you count the car) and three software applications, and this soaring of spirit is part of my UX of all of them.
It is, in fact, for me their primary purpose.
THE DISCONNECTION OF ABSORPTION
The flip side is, of course, disconnection. These technologies can be absorbing and engrossing-to the point that if we are not careful, they can create distance and disconnection between us and those we care about. For example, I spend a lot of time in front of the computer, what with working mostly at home and not having a TV. I answer the phone that is by my desk, and it is exceedingly difficult for me to tear myself away from the screen to attend properly to a call. Most times I divide my attention somewhat, and I am sure my callers can tell.
My mother never seemed to take offense at this; she was proud of my work and always thought she was interrupting something important. One evening some years ago, she called. After a few short minutes she asked, "Are you on the computer?" I apologized and turned away from the screen; and we talked a brief while. I resolved to do better.
Three days later, however, she had an auto accident. Although she eventually regained consciousness, she had suffered a severe traumatic brain injury and was never her old self again. Seven months after the accident, she died.
So my last conversation with my mother was colored by this disconnection. I do not feel guilty about it-I did spend a lot of high-quality time with her in those months-but I do feel sad. And yet, I continue to find it inordinately difficult not to divide my attention between the phone and the screen.
Disconnection, too, can be part of the UX of technology.
THE SERENDIPITY OF NEW PROJECTS
In the winter of 2011 I started working on a project that provides information and exercises to support sexual health in cancer survivors. Two Websites-one for women and one for men-will supply the service. I conducted usability testing on the women's site, which was still in beta and undergoing a clinical trial with cancer survivors, to see how well it helped improve their sexual health. I am optimistic that my findings and my recommendations for design changes will help both of these sites to improve their users' lives.
This project has special meaning for me. In fact, when the client told me what it was, I had to stop and catch my breath.
Ten years earlier, you see, my husband had died of prostate cancer. Antonio and I had lived with this disease for almost 10 years, and the hormone therapy that had worked so well against the cancer for several years had also destroyed his libido. You can imagine what kind of challenges that brings to a relationship.
So this project has a deep special meaning for me. I feel a profound connection with this user population, even though they are unaware of it. Most UX professionals can develop empathy with most user populations, but it is extra special when you have lived the problems that your users face. It is too late, of course, for this program to help Antonio and me, but I used my UX knowledge and skills to help make it easier for people in similar situations to address their problems.
UX IS WORK OF THE SPIRIT
Like many UX professionals, I got into this field because I want to help make people's lives better. Sure, I find the work challenging and fascinating; if I did not, I probably would have found some other work. But for me the key is knowing that what I do for a living matters. That it helps connect me with my users, my clients, and my best self. That it is larger than myself.
Life is about connection, and UX is no different. I submit that our work needs to nurture our own spirit and those of our users. Even when we are working on a product that has no obvious component of connection, we will serve our users best if we keep the possibility present in our minds.
Maybe the best illustration of the difference between utilitarian product usability and a real user experience packed with emotional impact comes from Buxton's pictures of mountain bikes. He begins with a beautiful picture, his Figure 32, of a Trek mountain bike, just sitting there inviting you to take an exciting ride (Buxton, 2007b, pp. 98-99).
But the next picture, his Figure 33, is all about that exciting ride (Buxton, 2007b, pp. 100-101). A spray of water conveys the fun and excitement and maybe a little danger to get the blood and adrenaline pumping. In fact, you can hardly see the bike itself in this picture, but you know it is how we got here. The bike just sitting there is not really what you are buying; it is the breathtaking thrill of screaming through rocks, mud, and water-that is the user experience!
1.4.2 A Convincing Anecdote
David Pogue makes a convincing case for the role of emotional impact in user experience using the example of the iPad. In his New York Times story he explains why the iPad turned the personal devices industry upside down and started a whole new class of devices. When the iPad came out, the critics dubbed it "underwhelming," "a disappointment," and "a failure." Why would anyone want or need it?
Pogue admits that the critics were right from a utilitarian or rational standpoint: "The iPad was superfluous. It filled no obvious need. If you already had a touch-screen phone and a laptop, why on earth would you need an iPad? It did seem like just a big iPod Touch" (Pogue). And yet, as he claims, the iPad is the most successful personal electronic device ever, selling 15 million in the first months. Why? It has little to do with rational, functional, and utility appeal and has everything to do with emotional allure. It is about the personal experience of holding it in your hand and manipulating finely crafted objects on the screen.
1.4.3 Aesthetics and Affect
Zhang (2009) makes the case for aesthetics as part of an emotional or affective (about feeling or emotion) interaction. The movement from functionality and usability to aesthetics takes us from a utilitarian to an experiential orientation, from a cognitive paradigm to an affective-centric paradigm (Norman, 2002, 2004; Zhang & Li, 2004, 2005).
Interaction design can "touch humans in sensible and holistic ways" (Zhang, 2009). The term aesthetics is used to describe a sense of pleasure or beauty, including sensual perceptions (Wasserman, Rafaeli, & Kluger, 2000).
Zhang presents a theoretical linkage between aesthetics and affect.
Aesthetics, a branch of philosophy and often associated with art, is considered an elusive and confusing concept (Lindgaard et al., 2006). A key issue in studies regarding aesthetics is objectivity vs. subjectivity. The objective view is that aesthetic quality is innate in the object or the design and is known by certain features or characteristics regardless of how they are perceived. This means that objective aesthetic qualities can be evaluated analytically.
The subjective view of aesthetics is that it depends on how they are perceived. Aesthetics has different effects on different people and must be evaluated with respect to users/people. It is all about perceived aesthetic quality.
However, operationally, things are still a bit fuzzy. It is difficult to state goals for aesthetic design and there is no standard for measuring aesthetics: "...there is a lack of agreement and a lack of confidence on how to measure aesthetics related concepts" (Zhang, 2009). It is typical to think of one-dimensional metrics for aesthetics, such as subjective ratings of visual appeal.
Lavie and Tractinsky (2004) draw a distinction between classical aesthetics- defined by orderliness in clean, pleasant, and symmetrical designs-and expressive aesthetics-defined by creativity, innovation, originality, sophistication, and fascinating use of special effects.
In any case, it is agreed that the result of aesthetic design can be affect, in the form of a mood, emotion, or feeling. The assessment of affect is tricky, mainly relying on subjective assessment of an individual's perception of the ability of an object or design to change his or her affect.
Zhang is interested in the relationship between aesthetics and affect. In particular, how are the objective view and the subjective view connected with respect to design? How can the aesthetics of a product or system evoke a change in a person's or user's affect? Norman (2004) proposes a three-level processing model for emotional design, making the connection between aesthetics and emotion explicit:
• Visceral processing requires visceral design: about appearance and attractiveness; appeals to "gut feeling"
• Behavioral processing requires behavioral design: about pleasure and effectiveness (usability and performance)
• Reflective processing requires reflective design: about self-image, identity, personal satisfaction, memories
Kim and Moon (1998) describe emotions, the immediate affective feelings about a system, in seven dimensions:
• attractiveness
• symmetry
• sophistication
• trustworthiness
• awkwardness
• elegance
• simplicity
As Zhang notes, these dimensions are "non-basic" as compared to basic emotions such as joy and anger and can be domain specific. They also seem a bit arbitrary and could allow for quite a few other alternatives. In the end, it is not clear if, or how, these criteria can relate aesthetics in the design to affect in the users.
Zhang's example convinces us that the relationship is, indeed, subjective and that perceived aesthetic quality does determine affective reaction. She describes a beautiful pop-up ad on the Internet, with pleasing images and music. You experience a feeling beyond mere pleasantness: it gets your attention and activates your mind. You have an affective reaction, and the perceived affective quality is positive.
Now consider exactly the same ad, still inherently beautiful and musical, but because of other factors (for example, you are focusing on something else, trying to solve a problem) the ad is irritating and annoying. You feel distracted, your attention stolen away from the task at hand, and you try to shut the ad out. You might even get a little angry if you cannot shut it out. The ad has the same objective aesthetic quality, but it has a different effect on your affect. Your mind's alert level is still high, but you are annoyed; you have a negative affect.
The point of Zhang's example is that the same aesthetics can lead to different user experiences depending on perceived, or subjective, aesthetic quality.
1.4.4 The Centrality of Context
Context has always been important in interpreting the meaning of usability in any situation. Now, context is even more important; it is essential and central to the meaning of emotional and phenomenological impact in situated usage.
As an example of how anticipated usage context influences how a product is viewed, consider the Garmin GPSMAP 62st handheld GPS device. In Field and Stream, a hunting magazine, an advertisement stresses an impressive list of features and functionality, including such esoteric technology as "3-axis tilt-compensated 100K topo mapping, Birds-Eye Satellite imagery, and quad helix antenna." The message for hunters is that it will get you to the right place at the right time in support of the goals of hunting.
In contrast, in Backpacker magazine, apparently catering to the idea that the typical backpacker is more interested in the enjoyment of the outdoors, while the hunter is more mission oriented, an ad for the same device appeals strongly to emotion. In a play on words that ties the human value of self-identity with orienteering, Garmin puts presence in life style first: "Find yourself, then get back." It highlights emotional qualities such as comfort, cozy familiarity, and companionship: "Like an old pair of boots and your favorite fleece, GPSMAP 62st is the ideal hiking companion."
Because the resulting user experience for a product depends on how users view the product, and strongly on the usage context, designers have to work hard. In general, there is no formula for creating an interaction design that can be expected to lead to a specific kind of user experience. That is a factor that adds much difficulty to designing for what we hope will be a quality
user experience. However, the more designers know about users and usage context, the better they will be equipped to create a design that can lead to a desired user experience.
1.4.5 What about Fun at Work?
Emotional impact factors such as fun, aesthetics, and joy of use are obviously desirable in personal use of commercial products, but what about in task-oriented work situations? Here usability and usefulness aspects of user experience are obvious, but the need for emotional impact is not so clear.
It is easy to think that fun and enjoyment are just not a good match to computer usage for work. Some, including most Vulcans, say that emotions interfere with the efficiency and control needed for work.
But there is evidence that fun can help at work, too, to break monotony and to increase interest and attention span, especially for repetitive and possibly boring work, such as that performed in call centers. Fun can enhance the appeal of less inherently challenging work, for example, clerical work or data entry, which can increase performance and satisfaction (Hassenzahl, Beu, & Burmester, 2001). It is easy to see how fun can lead to job satisfaction and enjoyment of some kinds of work.
It is also obvious from the fact that emotional and rational behaviors play complementary roles in our own lives that emotional aspects of interaction are not necessarily detrimental to our reasoning processes for doing work. For example, software for learning, which can otherwise be dull and boring, can be spiced up with a dose of novelty, surprise, and spontaneity.
However, fun and usability can conflict in work situations; for example, less boring means less predictable and less predictable usually goes against traditional usability attributes, such as consistency and ease of learning (Carroll & Thomas, 1988). Too simple can mean loss of attention, and consistency can translate as boring. Fun requires a balance: not too simple or boring, but not too challenging or frustrating.
Some work roles and jobs are not amenable at all to fun as part of the work practice. Consider a job that is inherently challenging, that requires full attention to the task, for example, air traffic control. It is essential for air traffic controllers to have no-nonsense software tools that are efficient and effective. Any distraction due to novelty or even slight barriers to performance due to clever and "interesting" design features will be hated and could even be dangerous. For this kind of work, users often want less mental effort, more predictable interaction paths, and more consistent behavior. They especially do not want a system or software tool adding to the complexity.
Certainly the addition of a game-like feature is welcome in an application designed primarily for fun or recreation, but imagine an air traffic controller having to solve a fun little puzzle before the system gives access to the controls so that the air traffic controller can help guide a plane heading for a mountain top in the fog.
1.5 USER EXPERIENCE NEEDS A BUSINESS CASE
Ingenious by design; hassle-free connectivity
- On a Toshiba satellite receiver box
1.5.1 Is the Fuss over Usability or User Experience Real?
As practitioners in this field, one of the frequent challenges we face is getting buy-in toward user experience processes from upper management and business stakeholders. So what is the business case for UX?
That computer software of all kinds is in need of better design, including better user interaction design, is indisputable. Mitch Kapor, the founder of Lotus, has said publicly and repeatedly that "The lack of usability of software and the poor design of programs are the secret shame of the industry" (Kapor, 1991, 1996). Those who know the industry agree. Poor user experience is an uncontrolled source of overhead for companies using software, overhead due to lost user productivity, the need for users to correct errors, data lost through uncorrected errors, learning and training costs, and the costs of help desks and field support.
Charlie Kreitzburg, founder of Cognetics Corporation, tells of chaos, waste, and failure, attributing this sorry state of software development primarily to software development practices that are "techno-centric rather than user-centric." He recommends that the industry "rethink current software design practice to incorporate user-centered design" principles.
These critical assessments of the software industry are not based on personal opinion alone but on large surveys conducted by groups with strong reputations in the software industry. The Standish Group (Cobb, 1995; The Standish Group, 1994, 2001) surveyed 365 IT executive managers from companies of small, medium, and large sizes and found that the lack of attention to user inputs is one of the most important reasons why many software projects were unsuccessful. This translated to costing corporations $80 billion a year.
Some estimate that the percentage of software projects that exceed their budgets is higher than 60% (Lederer & Prasad, 1992). According to May (1998), the average software development project is 187% over budget and 222% behind schedule and implements only 61% of the specified features.
A posting by Computer World (Thibodeau, 2005) declared: "Badly designed software is costing businesses millions of dollars annually because it's difficult to use, requires extensive training and support, and is so frustrating that many end
users underutilize applications, say IT officials at companies such as The Boeing Co. and Fidelity Investments." Keith Butler of Boeing said that usability issues can add as much as 50% to the total cost of software ownership.
Such startling reports on the dismal performance of the software development industry are not hard to find. Kwong, Healton, and Lancaster (1998) cite (among others) the Gartner Group's characterization that the state of software development is chaos: "25% of software development efforts fail outright. Another 60% produce a sub-standard product. In what other industry would we tolerate such inefficiency? As Kreitzburg has put it, imagine if
25% of all bridges fell down or 25% of all airplanes crashed."
1.5.2 No One Is Complaining and It Is Selling Like Hotcakes
It is easy to mistake other positive signs as indicators that a product has no user experience problems. Managers often say, "This system has to be good; it's selling big time" or "I'm not hearing any complaints about the user interface." This is a more difficult case to make to managers because their usual indicators of trouble with the product are not working. On closer inspection, it often appears that a system is selling well because it is the only one of its kind, or because the strength of its marketing department or advertising obscures the problems.
And, sometimes, project managers are the only ones who do not hear the user experience complaints. Also, despite demands for an improved user experience, some users simply will not complain.
If you wonder about the user experiences with your own product, but your users are not complaining, here are some indicators to watch for, characteristics of prime candidates for having problems with usability and user experience:
• Your users are accessing only a small portion of the overall functionality your system offers.
• There are a significant number of technical support calls about how to use a particular feature in the product.
• There are requests for features that already exist in the product.
• Your competitors' products are selling better even though your product has more features.
This book can help you address these issues. It is designed for those who have been struck by the importance of a good user interface and who want to find out more about what a quality user experience means, how to ensure it, and how to know when you have it. This book is especially aimed toward practitioners- people who put theory into practice in a real-world development environment.
The methods and techniques described here can be used by anyone who is involved in any part of the development of a user interaction design for a user interface.
1.5.3 A Business Strategy: Training as a Substitute for Usability in Design
"It might not be easy to use right off, but with training and practice, it will be a very intuitive design." Sounds silly and perverse, but that is what many people are really saying when they suggest training as a way to fix usability problems.
Unfortunately, the following real-world example is representative of many.
A very large governmental organization serving the public sometimes attempts to solve user experience problems with "instructional bulletins" sent to all field users. These are real user experience problems that increase the time to do tasks, introduce significant opportunities for errors, and require users to remember special-case instructions for each particular situation. Also, these bulletins are issued only once, and then their complicated contents become the responsibility of the users, including those hired after the bulletins are issued, who therefore have never received them.
In one such case, the relevant situation arises when an applicant, a client outside the organization, calls in on an 800 phone number. The call is answered by an agent working for the organization, the actual system user, acting as an information intermediary for the client/applicant. If the applicant requests certain information, to which access is not allowed, the request is denied and policy based on law requires that an explanatory written notice be sent via regular mail.
Screens referred to in the "instructional bulletin" about this kind of interaction are used to make a record of the request and the information denial decision, and to automatically generate and send out the notice. The opportunities for errors are abundant and the applicant will not receive the legally required notice if the user, the agent using the computer, fails to follow these instructions to the letter. We are told, without perceptible nodding or winking, that most agents should understand the jargon. The essence of the main part of the bulletin states:
The 800 Number LDNY System is a 2-screen process. It issues an electronic form #101A, annotates the LPFW worksheet with a record of the closeout action, and automatically purges the lead when the closeout expires based on data propagated to the LPFW. However, the LDNY screen must be completed properly in order to propagate the current date to the REC field and "INFORMAL
DENIAL" to the REMARKS field on the LPFW screen. If this data is not propagated to the LPFW, the applicant will not receive the notice. IMPORTANT: To get the REC date and the REMARKS to propagate to the LPFW screen, you must remember two things:
1. On page 2 of the LDNY, you must answer YES to PRINT NOTICE, otherwise the REC date and REMARKS will not propagate to the LPFW.
2. When you press ENTER on page 2 of the LDNY screen, you are returned to the LPFP screen, a screen you have already completed. You must ENTER through this screen. This will return you to the 800 Number screen. Do NOT use the normal procedure of using the PF3 key to return to the 800 Number screen because it will prevent the appropriate "INFORMAL DENIAL" from propagating to REMARKS on the LPFW screen.
Will a user remember all this, say, a few months after it was released? Multiply this situation by many other functions, forms, situations, and "instructional bulletins" and you have a formula for errors, frustration, lost productivity, and underserved clients on a massive scale. Training as a substitute for usability is an ongoing per-user cost that often fails to meet the goals of increased productivity and reduced risk, errors, and cost. The question that sticks in our minds is: how could someone send out this memo with a straight face? How could the memo author not see the folly of the whole situation? Perhaps that person had been part of the bureaucracy and the system for so long that he or she truly believed it had to be that way because "this is how we have always done it."
1.6 ROOTS OF USABILITY
It is a matter of debate exactly when computer usability was born. It was clearly preceded by usability work for non-computer machines in industrial design and human factors. We know that computer usability was a topic of interest to some by the late 1970s and, by the early 1980s, conferences about the topic were being established. No field exists in isolation and ours is no exception. Human-computer interaction in general and usability in particular owe much of their origin and development to influences from many other related fields.
Human factors is about making things work better for people. For example, think about building a bridge: You use theory, good design practice, and engineering principles, but you
can't really know if it will work. So you build it and have someone walk over it. Of course, if the test fails, ... well, that's one of the reasons we have graduate students.
- Phyllis Reisner
From cognitive and behavioral psychology and psychometrics, concepts such as user modeling and user performance metrics were adopted into HCI. Much of the predesign analysis, such as business process modeling, has its roots in the field of systems engineering. Also, ideas such as software architectures that could abstract the user interface and functional core concerns, rapid prototyping tools, and software environments were borrowed from the discipline of computer science (Hartson, 1998).
Our caveat to the reader: In this and similar sections on history and related work at the end of most chapters, the coverage is by no means a survey of the vast contributions on any particular topic. The topics and references included are to be taken as examples. Please forgive any omission of your favorite references, and see other books on this topic for surveys that do it justice.
1.6.1 A Discipline Coming of Age
Compared to venerable disciplines such as architecture or civil engineering, computer science is an infant and human-computer interaction is still an embryo. The oldest computer science departments are in their 40s or 50s, and the PC has been around only about 30 years as of this writing. As is often the case, evolution accelerates; it is safe to say that more major changes have occurred within computer science in these 40 years than in civil engineering, for example, in the past hundred or more years (Mowshowitz & Turoff, 2005). As young as it is, HCI has experienced its own accelerated evolution.
Although work was being done on "human factors in computers" in the 1970s and earlier, HCI was born at Virginia Tech and several other universities in the late 1970s and 1980s; related work had been going on at IBM (Branscomb, 1981), the National Bureau of Standards (now the National Institute of Standards and Technology), and other scattered locations before that. This early work mainly focused on specific topics such as ergonomics of hardware devices (CRT terminals and keyboards), training, documentation (manuals), text editors, and programming, with little general or theoretical work yet evolved.
Many believe that HCI did not coalesce into a fledgling discipline until the CHI conferences began in Boston in 1983. But it probably began a couple of years before with the "unofficial first CHI conferences" (Borman & Janda, 1986) at the May 1981 ACM/SIGSOC conference, called the Conference on Easier and
More Productive Use of Computer Systems, in Ann Arbor, Michigan, and the March 1982 Conference on Human Factors in Computer Systems in Gaithersburg, Maryland.
Also, who does not like cake and candles? So CHI (the conference) celebrated its 25th birthday in 2007 (Marcus, 2007). Marcus says, "I can remember in the mid-1980s when an HP staff member announced with amazement that the amount of code for input-output treatment had finally surpassed the amount of code that was devoted to actual data manipulation. A watershed moment." Watershed, indeed!
1.6.2 Human Factors and Industrial and Systems Engineering
Some people think that human factors got its start in "Taylorism," an early 20th-century effort to structure and manage the processes for producing a product efficiently. Through his principles of "scientific management," Frederick Winslow Taylor sought to define "best practices" of the time to reform our inefficient and wasteful, even lazy, ways of operating private and government enterprises and factories (Taylor, 1911). He is also known for helping formulate a national imperative for increased industrial efficiency.
Later, U.S. Air Force officials became concerned with airplane crashes experienced by World War II pilots. In an effort to reduce cockpit errors by pilots and to improve safety, engineers began to study critical incidents that may have led to airplane crashes. Work by Fitts and Jones (1947) is the best known in this regard. Then it grew into goals of improved production and safety in control systems for other kinds of machines, such as power plants. Eventually it has become part of the field of HCI, where it is concerned with critical incidents during interaction by computer users. This is where we got our early emphasis on simple user performance metrics (Tatar, Harrison, & Sengers, 2007).
According to Mark S. Sanders, as quoted by Christensen, Topmiller, and Gill (1988), "human factors is that branch of science and technology that includes what is known and theorized about human behavior and biological characteristics that can be validly applied to the specification, design, evaluation, operation, and maintenance of products and systems to enhance safe, effective, and satisfying use by individuals, groups, and organizations." Not far from our definition of usability, eh?
When human factors entered the computer age, it made a good fit with the emerging field of human-computer interaction. The focus on critical incidents persisted, but now the focus was on problems in HCI.
Human-computer interaction is clearly about human behavior and is used to drive system design, and human performance is the measurable outcome in using those systems (Bailey, 1996). As Bailey says, the human is the most complex part of almost any system and the most likely cause of accident or system failure, which is the reason why so much effort has gone into engineering for the performance of the human component.
We agree with all but the conclusion that the human is the most likely cause of errors or system failure; the whole point of human factors engineering is to take into account the human's susceptibility to error and to design the system to prevent errors. So, our take on it is that the human user is what he or she is, namely human, and a design that does not take this into account is the most likely cause of errors and failures.
It is said that human factors got its start with aircraft cockpit design in World War II. The overarching assumption at that time was that humans could be trained to fit a design, the extent of the fit directly proportional to the amount of training. However, no matter how extensive the training and irrespective of the amount of flying experience, pilots were making dangerous mistakes while operating the controls in the cockpit. Researchers were brought in to investigate what were called "pilot errors."
Early investigators such as Fitts and Jones (1947) interviewed scores of pilots and started detecting design problems that ranged from lack of consistency among different cockpit control layouts to placement of unrelated controls together without visual or tactile differentiators to alert the pilots when wrong controls were being operated. The reports of Fitts and Jones are among the very earliest that recognized the causal connection between design flaws, rather than human errors, and mishaps in user performance.
In one such instance, as the folklore goes (not a finding of Fitts and Jones), pilots began bailing out at all the wrong times and for no apparent good reason. It seems that an update by designers included switching the locations of the ejection release and the throttle. When the finger of suspicion pointed at them, the engineers were indignant; "there were good reasons to change the design; it should have been designed that way in the first place. And pilots are very intelligent, highly trained, and already had shown that they could adapt to the changes." However, it turned out that, when under stress, the pilots sometimes involuntarily reverted to earlier learned behavior, and the result was an untimely, embarrassing, and dangerous alley-oop maneuver noted for its separation of pilot from plane.
In fact, the connection of human factors to HCI and usability is close; much of the early HCI work was referred to as "human factors in software engineering" (Gannon, 1979). In 1987 (Potosnak, 1987), for example, the place where human factors fit into the software engineering process was stated as providing a research base, models to predict human behavior, standards, principles, methods for learning about users, techniques for creating and testing systems, and tools for designing and evaluating designs.
Furthermore, many ideas and concepts from human factors laid the basis for HCI techniques later on. For example, the idea of task analysis was first used by human factors specialists in analyzing factory workers' actions on an assembly line. For many working in human factors engineering, the move to focus on HCI was a natural and easy transition.
1.6.3 Psychology and Cognitive Science
In addition to the major influence of human factors and engineering, HCI experienced a second wave of formative influence (Tatar, Harrison, & Sengers, 2007) from a special brand of cognitive science, beginning with Card, Moran, and Newell (1983), offering the first theory within HCI.
Like human factors engineering, cognitive psychology has many connections to the design for, and evaluation of, human performance, including cognition, memory, perception, attention, sense and decision making, and human behavioral characteristics and limitations, elements that clearly have a lot to do with user experience. One difference is that psychology is more about the human per se, whereas human factors engineering looks at the human as a component in a larger system for which performance is to be optimized.
However, because of the influence of psychology on human factors and the fact that most human factors practitioners then were trained in psychology, the field was known at least for a while as occupational psychology.
Because the field of human factors is based on a foundation in psychology, so are HCI and user experience. Perhaps the most fundamental contribution of psychology to human-computer interaction is the standard bearer, Card, Moran, and Newell (1983), which is still today an important foundational reference.
The empiricism involved in statistical testing in human factors and HCI has especially apparent common roots in psychology; see, for example, Reisner (1977). Hammond, Gardiner, and Christie (1987) describe the role of cognitive psychology in HCI to include observing human behavior, building models of human information processing, inferring understanding of the same, and scientific, or empirical, study of human acquisition, storage, and use of
knowledge/information. Cognitive psychology shares with human factors engineering the goal of system operability and, when connected to HCI, computer-based system operability.
Perhaps the most important application of psychology to HCI has been in the area of modeling users as human information processors (Moran, 1981b; Williges, 1982). Most human performance prediction models stem from Card, Moran, and Newell's Model Human Processor (1983), including the keystroke level model (Card, Moran, & Newell, 1980), the command language grammar (Moran, 1981a), the Goals, Operators, Methods, and Selections (GOMS) family of models (Card, Moran, & Newell, 1983), cognitive complexity theory of Kieras and Polson (1985), and programmable user models (Young, Green, & Simon, 1989). In the earliest books, before "usability" was a common term, "software psychology" was used to connect human factors and computers (Shneiderman, 1980).
Carroll (1990) contributed significantly to the application of psychology to HCI in fruitful ways. Carroll says, "... applied psychology in HCI has characteristically been defined in terms of the methods and concepts basic psychology can provide. This has not worked well." He goes on to explain that too much of the focus was on psychology and not enough on what it was being applied to. He provides a framework for understanding the application of psychology in the HCI domain.
As an interesting aside to the role of cognitive psychology in HCI, Digital Equipment Corporation researchers (Whiteside et al., 1985; Whiteside & Wixon, 1985) made the case for developmental psychology as a more appropriate model for interaction design than behavioral psychology and as a framework for studying human-computer interaction. The behavioral model, which stresses behavior modification by learning from stimulus-response feedback, leads to a view in which the user adapts to the user interface. Training is invoked as intervention to shape the user's behavior. The user with "wrong" behavior is importuned with error messages. Simply put, user behavior is driven by the interaction design.
In contrast, developmental psychology stresses that surprisingly complex user behavior springs from the person, not the design. The developmental view studies "wrong" user behavior with an eye to adapting the design to prevent errors.
Differences between system operation and user expectations are opportunities to improve the system. "User behavior is not wrong; rather it is a source of information about the system's deficiencies" (Whiteside & Wixon, 1985, p. 38).
Finally, as even more of an aside, Killam (1991) proffers the idea that humanistic psychology, especially the work of Carl Rogers (Rogerian psychology, as it is called), is an area of psychology that has been applied unknowingly, if not directly, to HCI. A client-centered approach to therapy, Rogerian psychology, like the developmental approach, avoided the normative, directive style of prescribing "fixes" for the patient to adopt, instead listening to the patient's needs that must be met to effect healing.
The tenets of Rogerian psychology translate to some of our most well-known guidelines for interaction design, including positive feedback to encourage, especially at times when the user might be hesitant or unsure, and keeping the locus of control with the user, for example, not having the system try to second-guess the user's intentions. In sum, the Rogerian approach leads to an interaction design that provides an environment for users to find their own way through the interaction rather than having to remember the "right way."
As in the case of human factors engineering, many people moved into HCI from psychology, especially cognitive psychology, as a natural extension of their own field.
1.6.4 Task Analysis
Task analysis was being performed in human factors contexts long before HCI came along (Meister, 1985; Miller, 1953). In order to design any system to meet the needs of its users, designers must understand what tasks users will use the system for and how those tasks will be performed (Diaper, 1989b). Because tasks using machines involve manipulation of system/device objects such as icons, menus, buttons, and dialogue boxes in the case of user interfaces, tasks and objects must be considered together in design (Carroll, Kellogg, & Rosson, 1991).
The process of describing tasks (how users do things) and their relationships is called task analysis and is used to drive design and to build predictive models of user task performance. Much work was done in the 1980s and 1990s in the United Kingdom on developing task analysis to make it connect to interaction design to support users, including task analysis for knowledge description (Diaper, 1989a) and the task action grammar (Payne & Green, 1986, 1989).
1.6.5 Theory
Much of the foundation for HCI has been closely related to theory in psychology, as much of it derived from adaptations of psychological theory to the human information processor known to HCI. Cognitive psychology (Barnard, 1993; Hammond, Gardiner, & Christie, 1987) and cognitive theory are the bases for much of what we do-claims analysis (Carroll & Rosson, 1992), for example.
The theory of work activity (Bodker, 1989, 1991) is embodied in techniques such as contextual inquiry.
Norman's (1986) theory of action expresses, from a cognitive engineering perspective, human task performance-the path from goals to intentions to actions (inputs to the computer) back to perception and interpretation of feedback to evaluation of whether the intentions and goals were approached or met. The study of learning in HCI (Carroll, 1984; Draper & Barton, 1993) also has obvious roots in cognitive theory. Fitts' law, relating cursor travel time to distance and size of target (MacKenzie, 1992), is clearly connected to kinesthetics and human performance.
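To make the kind of quantitative prediction Fitts' law supports concrete, here is a minimal sketch using the Shannon formulation popularized by MacKenzie (1992). The coefficient values `a` and `b` are hypothetical placeholders; in practice they are fit by regression to movement-time data measured for a particular user population and pointing device.

```python
import math

def fitts_movement_time(distance, width, a=0.05, b=0.12):
    """Predict pointing time (seconds) with the Shannon formulation
    of Fitts' law: MT = a + b * log2(D/W + 1).

    a and b are device- and user-specific regression coefficients;
    the defaults here are illustrative placeholders only.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A distant, small target is predicted to take longer than a
# near, large one -- the intuition behind, e.g., large toolbar buttons.
near_large = fitts_movement_time(distance=100, width=50)
far_small = fitts_movement_time(distance=800, width=10)
print(f"near/large: {near_large:.3f} s, far/small: {far_small:.3f} s")
```

The practical design lesson is visible directly in the formula: halving a target's distance or doubling its width lowers the index of difficulty, and hence the predicted movement time, by the same amount.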
As a prerequisite for task analysis and a significant legacy from cognitive psychology, models of humans as cognitive information processors are used to model and understand the full gamut of user cognition and physical actions needed to interact with computers (Card, Moran, & Newell, 1983). The command language grammar (Moran, 1981a) and the keystroke model (Card, Moran, & Newell, 1980), which attempt to explain the nature and structure of human-computer interaction, led directly to the Goals, Operators, Methods, and Selection (GOMS) (Card, Moran, & Newell, 1983) model. GOMS-related models, quantitative models combining task analysis and the human user as an information processor, are concerned with predicting various measures of user performance-most commonly task completion time based on physical actions in error-free expert task performance.
Direct derivatives of GOMS include Natural GOMS Language (Kieras, 1988) and cognitive complexity theory (Kieras & Polson, 1985; Lewis et al., 1990), the latter of which is intended to represent the complexity of user interaction from the user's perspective. This technique represents an interface as the mapping between the user's job-task environment and the interaction device behavior.
GOMS-related techniques have been shown to be useful in discovering certain kinds of usability problems early in the lifecycle, even before a prototype has been constructed. Studies, for example, by Gray et al. (1990), have demonstrated a payoff in some kinds of applications where saving a number of user actions, for example, keystrokes or mouse movements, can improve user performance enough to have an economic impact, often due to the repetitiveness of a task.
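The flavor of this kind of predictive modeling can be sketched with a minimal Keystroke-Level Model (KLM) calculation in the spirit of Card, Moran, and Newell (1980). The operator times below are commonly cited textbook approximations, not definitive values, and the example task sequence is invented for illustration.

```python
# Approximate KLM operator times in seconds (commonly cited
# approximations; actual values vary by user, device, and study).
KLM_OPERATORS = {
    "K": 0.2,   # keystroke or button press (skilled typist)
    "P": 1.1,   # point with mouse to a target on screen
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation for a unit action
}

def klm_predict(sequence):
    """Sum operator times for an error-free expert task sequence."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical task: mentally prepare, point at a form field,
# click it, home back to the keyboard, type a five-character code.
task = ["M", "P", "K", "H"] + ["K"] * 5
print(f"predicted task time: {klm_predict(task):.2f} s")
```

Comparing such sums for two candidate designs of the same repetitive task is exactly the kind of early, prototype-free economic argument the Gray et al. studies made.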
Carroll and Rosson's task-artifact cycle (1992) elicits cognitive theories implicit in design, treating them as claims by the designer. They propose an iterative design cycle in which a scenario-based design representation depicts artifacts in different situations of use. These artifacts are then analyzed to
capture design rationale via the extraction of claims (design tradeoffs), which inform the design.
1.6.6 Formal Methods
While not theory per se, formal methods have been the object of some interest and attention for supporting both theory and practice in HCI (Harrison & Thimbleby, 1990). The objectives of formal methods-precise, well-defined notations and mathematical models-in HCI are similar to those in software engineering. Formal design specifications can be reasoned about and analyzed for various properties, such as correctness and consistency. Formal specifications also have the potential to be translated automatically into prototypes or software implementation.
1.6.7 Human Work Activity and Ethnography
Work activity theory (Bodker, 1991; Ehn, 1990) has had a continuing and deep impact on HCI theory and practice. Originating in Russia and Germany and now flourishing in Scandinavia, where it has been, interestingly, related to the labor movement, this view of design based on work practice situated in a worker's own complete environment has been synthesized into several related mainstream HCI topics.
A natural progression from work activity theory to a practical tool for gathering design requirements driven by work practice in context has led to the eclectic inclusion in some HCI practices of ethnography, an investigative field rooted in anthropology (LeCompte & Preissle, 1993). Indeed, the conflux of work activity theory and ethnographic techniques was refined by many pioneers of this new direction of requirements inquiry and emerged as contextual design in the style of Beyer and Holtzblatt (1998).
1.6.8 Computer Science: Interactive Graphics, Devices, and Interaction Techniques
In parallel to, but quite different from, the human factors, psychology, and ethnography we have been describing, several related threads were appearing in the literature and practice on the computer science side of the HCI equation. This work on graphics, interaction styles, software tools, dialogue management systems, programming language translation, and interface "widgets" was essential in opening the way to practical programming techniques for bringing interaction designs to life on computers.
The origin of computer graphics is frequently attributed to pioneers such as Ivan Sutherland (1963, 1964) and solidified by masters such as Foley and
colleagues (Foley & Van Dam, 1982; Foley et al., 1990; Foley & Wallace, 1974) and Newman (1968). For an insightful account of the relationship of graphics to HCI, see Grudin (2006).
The 1980s and 1990s saw a burgeoning of hardware and software developments to support the now familiar point-and-click style of interaction, including the Xerox Star (Smith et al., 1989) and the Lisa and Macintosh by Apple. This work was a rich amalgam of interaction techniques, interaction styles, user interface software tools, "dialogue management systems," and user interface programming environments.
"An interaction technique is a way of using a physical input/output device to perform a generic task in a human-computer dialogue" (Foley et al., 1990). A very similar term, interaction style, has evolved to denote the behavior of a user and an interaction object, for example, a push button or pull-down menu, within the context of task performance (Hartson, 1998). In practice, the notion of an interaction technique includes the concept of interaction style plus full consideration of internal machine behavior and software aspects.
In the context of an interaction technique, an interaction object and its supporting software is often referred to as a "widget." Libraries of widgets, software that supports programming of graphical user interfaces, are an outgrowth of operating system device handler routines used to process user input-output in the now ancient and impoverished interaction style of
line-oriented, character-cell, text-only, "glass teletype" terminal interaction. Early graphics packages took interaction beyond text to direct manipulation of graphical objects, eventually leading to new concepts in displays and cursor tracking. No longer tied to just a keyboard or even just a keyboard and mouse, many unusual (then, and some still now) interaction techniques arose (Buxton, 1986; Hinckley et al., 1994; Jacob, 1993).
Myers led the field in user interface software tools of all kinds (Myers, 1989a, 1989b, 1992, 1993, 1995; Myers, Hudson, & Pausch, 2000), and Olsen is known for his work in treating the linguistic structure of human-computer dialogue from a formal computing language perspective as a means for translating the language of interaction into executable program code (Olsen, 1983).
So many people contributed to the work on User Interface Management Systems (UIMS) that it is impossible to even begin to recognize them all. Buxton and colleagues (1983) were among the earliest thinkers in this area. Others we remember are Brad Myers, Dan Olsen, Mark Green, and our researchers at Virginia Tech. Much of this kind of work was reported in the ACM Symposium on User Interface Software and Technology (UIST), a conference specifically for the software-user-interface connection.
The commercial world followed suit and we worked through quite a number of proposed "standard" interaction styles, such as OSF Motif (The Open Group). Developers had to choose from those available mainly because the styles were tied closely to software tools for generating the programming code for interaction designs using the devices and interaction styles of these approaches. Standardization, to some extent, of these interactive graphical interaction techniques led to the widgets of today's GUI platforms and corresponding style guides intended for ensuring compliance with a style, but sometimes thought of mistakenly as usability guides.
This growth of graphics and devices made possible one of the major breakthroughs in interaction styles-direct manipulation (Shneiderman, 1983; Hutchins, Hollan, & Norman, 1986)-changing the basic paradigm of interaction with computers. Direct manipulation allows opportunistic and incremental task planning. Users can try something and see what happens, exploring many avenues for interactive problem solving.
1.6.9 Software Engineering
Perhaps the closest kin of usability engineering, or interaction development, on the computer science side is the somewhat older discipline of software engineering. The development lifecycles of both these disciplines have similar and complementary structure in a development project with similar kinds of activities, such as requirements engineering, design, and evaluation. However, for the most part, these terms have different philosophical underpinnings and meanings in the two disciplines.
In an ideal world, one would expect close connections between these two lifecycles as they operate in parallel during development of a unified interactive system. For example, when usability engineers see the need for a new task, it is important to communicate that need to the software engineers in a timely manner so that they can create necessary functional modules to support
that task.
However, in reality, these two roles typically do not communicate with one another until the very end, when actual implementation starts. This is often too late, as many interaction design concerns have serious software architectural implications. One of the reasons for this apparent lack of connection between the two lifecycles is the way the two disciplines grew: without either one strongly influencing the other. In fact, with a few exceptions, software engineering and usability engineering researchers and practitioners have mostly ignored one another over the years. We discuss this important topic of connecting with the software engineering lifecycle in Chapter 23.
The Wheel: A Lifecycle Template
He believed in love; he was married many times.
- Fred, on iteration
2.1 INTRODUCTION
The iterative, evaluation-centered UX lifecycle template described in this chapter sets the stage for the whole interaction design process part of this book. It is a map of all the choices for activities to create and refine a design that will lead to a quality user experience. These activities are where all the work products are created, including versions of the product or system being developed.
2.1.1 Flying without a Process
To set the stage, consider this all too common scenario for a misbegotten approach to interaction lifecycle activities within an interactive software development project (with many thanks to our good friend Lucy Nowell of Battelle Pacific Northwest Laboratories):
About 25% into the project schedule, the user experience specialist is contacted and brought in to do some screen designs. "Where is the task analysis?" "The What?" "Ok, you have done contextual inquiry and analysis and you have requirements, right?" "Oh, yes, lots of requirements lists-we pride ourselves in gathering and documenting all the necessary functionality beforehand." "Ummm ..., Ok, do you have any usage scenarios?" "Uh, well, we've got a bunch of O-O use cases."
At this point the user experience specialist has the privilege of working overtime to get up to speed by poring through functional requirements documents and trying to create some usage scenarios. When the scenarios are sent out to prospective users, it is revealed that this is the first time anyone has asked them anything about the new system. The result is a deluge of feedback ("That's not how we do it!") and tons of changes suggested for the requirements ("Hey, what about this?"), including lots of brand new requirements. A very different view of the target system is emerging!
This is a critical point for management. If they are lucky or smart and there is time (a small percentage of projects), they decide to go back and do the upfront work necessary to understand the work activities and needs of users and customers. They dig into the context of real work, and users get involved in the process, helping to write usage scenarios. The requirements soon reflect real usage needs closely enough to drive a useful and fairly major redesign.
If they are not lucky or smart or do not have the time (a large percentage of product development projects), they will ignore all the commotion from users and plow ahead, confidence unshaken. The group continues on its chosen "clueless but never in doubt" path to produce yet another piece of shelfware. This project cannot be saved by any amount of testing, iteration, field support, or maintenance effort.
It is easy to fall into this kind of scenario in your own projects, and none of us is fond of how it ends. The ending is not necessarily anyone's fault; avoiding it is a matter of awareness of a guiding UX process.
2.1.2 The Concept of Process
Calibration: What process means to us and others
To most people, including us:
• the term "process" connotes a set of activities and techniques
• the term "lifecycle" suggests a skeleton structure on which you can hang specific process activities, imbuing them with temporal relationships
Fine distinctions are unnecessary here, so we use the terms "process," "lifecycle," and "lifecycle process" more or less interchangeably. Here we introduce a lifecycle template, a skeleton representation of a basic lifecycle that you get to tailor to your needs by instantiating it for each project.
In your instantiation you get to determine your own process, choosing which activities to do and which techniques to use in doing them, as well as how much and how often to do each activity, and (perhaps most importantly) when to stop. Here, and in the other process chapters (Chapters 3 through 19), we offer guidelines on how to make these decisions.
What is a process?
A process is a guiding structure that helps both novices and experts deal with the
complex details of a project. Process acts as scaffolding, especially for novice practitioners, to ensure that they are on track to a quality product and on the path to becoming experts. Process acts as a checklist for experts to make sure they do not miss any important aspects of the problem in the heat of productivity. A process helps designers answer questions such as "Where are we now?" and "What can/should we do next?"
A process brings to the table organizational memory from similar previous efforts by incorporating lessons learned in the past. In other words, process
provides a repeatable formula to create a quality product. Process also alleviates risk by externalizing the state of development for observation, measurement, analysis, and control-otherwise, communication among the project roles about what they are doing is difficult because they do not have a shared concept of what they should be doing.
Why do we need a process? Following a process is the solution recognized by software engineering folks long ago and something in which they invest enormous resources: defining, verifying, and following it (Paulk et al., 1993). On the UX side, Wixon and Whiteside were way ahead of their time while at Digital Equipment Corp in the 1980s and put it this way (Wixon & Whiteside, 1985), as quoted in Macleod et al. (1997):
Building usability into a system requires more than knowledge of what is good. It requires more than an empirical method for discovering problems and solutions. It requires more than support from upper management and an openness on the part of all system developers. It even requires more than money and time. Building
usability into a product requires an explicit engineering process. That engineering process is not logically different than any other engineering process. It involves empirical definition, specification of levels to be achieved, appropriate methods,
early delivery of a functional system, and the willingness to change that system. Together these principles convert usability from a "last minute add on" to an integral part of product development. Only when usability engineering is as much part of software development as scheduling can we expect to regularly produce products in which usability is more than an advertising claim.
Without guidance from an interaction design process, practitioners are forced to make it up as they go along. If this sounds familiar to you, you are not alone. An approach without a process will be idiosyncratic; practitioners will
emphasize their own favorite process activities while other important process
activities fall through the cracks. What they do is dictated and limited by their own experience. They will try to apply the activities and techniques they know as much as possible; they have hammers and everything looks like nails.
As Holtzblatt (1999) puts it, following a process for product development can work against "the relentless drive of an organization to ship 'something' by a given date." Other excuses for not following a proven approach include "we do not have time to do the whole method, so we do not do any of it," "it does not fit well with the existing methods we are used to," "can our designers really be trained to do this?," and "do these methods transfer well to real-world project groups?" In this and the coming chapters, we hope to shed some light on answers.
A process is not necessarily rigid
Remember that a process does not necessarily imply a rigid structure or even a linear one. A process can be as lightweight or heavyweight as appropriate. In other words, even an incremental and iterative lifecycle approach in the software engineering world (such as an Agile methodology) is still a process.
Lest it still sounds inflexible, we should add that experts with lots of experience can interpret a process and take appropriate shortcuts and other creative liberties with it-and we encourage that throughout the book.
2.1.3 Influences on Our Lifecycle Process
The lifecycle process described in this book is based on insight that grew out of the adaptation and extension of several existing UX and software methodologies over many years. The methods that most significantly guided our creation of our own lifecycle template are:
• the Waterfall (Royce, 1970) software engineering lifecycle
• the Spiral Model (Boehm, 1988) of software engineering
• Mayhew's usability engineering lifecycle (Mayhew, 1999b)
• the Star lifecycle of usability engineering (Hartson & Hix, 1989)
• the Wheel (Helms et al., 2006) lifecycle concept
• the LUCID framework of interaction design (Kreitzberg, 2008)
Web User Experience Design within the Usability Engineering Lifecycle¹
Dr. Deborah J. Mayhew, Consultant, Deborah J. Mayhew & Associates²
CEO, The Online User eXperience Institute³
Within the software usability lifecycle I describe in my book The Usability Engineering Lifecycle (Morgan Kaufmann Publishers, 1999) is a phase consisting of a structured top-down iterative approach to software user interface design. Design is driven by requirements data from a requirements analysis phase. The overall design phase is divided into three levels of design as follows, with slight wording changes made here to reflect Website user experience (UX) design in particular. Each level includes an iterative process of design, mock-up, and evaluation, which is not addressed here.
Level 1
• Information architecture
• Conceptual model design

Level 2
• Page design standards

Level 3
• Detailed UX design
The rationale behind a top-down approach to UX design is that it is more efficient and effective to address distinct sets of design issues independently of one another, and in a specific order, that is, from the highest level to the most detailed level. Because the design tasks address issues that are fairly independent of one another, focusing on one level of design at a time forces designers to address all UX design issues explicitly and consciously. It ensures efficiency in that lower level details are not revisited and reworked constantly as higher level design issues are addressed and reworked randomly. Each level of design builds on the design decisions at higher levels, which may have already been validated through iterative evaluation.

¹This essay is a modified excerpt from a chapter called "The Web UX Design Process-A Case Study" that I have written for the forthcoming book Handbook of Human Factors in Web Design (2nd ed.) by Kim-Phuong L. Vu and Robert W. Proctor (Eds.), Taylor & Francis, 2011. That chapter includes a rich case study of the top-down design process within the usability engineering lifecycle, which in turn is fully documented in The Usability Engineering Lifecycle by D. Mayhew, Morgan Kaufmann Publishers, 1999.
²http://drdeb.vineyard.net
³http://www.ouxinstitute.com
The top level in the top-down process for Web UX design includes two design foci, the first of which is information architecture design. The information architecture is a specification of the navigational structure of the Website. It does not involve any visual design.
Designers must design information architectures in a way that streamlines site visitor navigation across and within tasks and exploits the capabilities of automation (to enhance ease of use), while at the same time preserving familiar structures that tap into visitors' current mental models of their tasks.
While it may seem difficult at first to separate navigational/structural issues from visual design issues, it is productive to learn to do so for at least three reasons. First, the two really are independent. For example, you can have valid and supportive information architecture and then fail to present it clearly through an effective visual design.
Second, different skill sets are relevant to information architecture design as opposed to visual design. In particular, usability and persuasion skills are paramount to achieving optimal information architecture design, while in addition, graphic design skills are necessary to achieve effective appeal, atmosphere, tone, and branding, as well as help realize and support many usability and persuasion goals.
Third, the navigational structure (i.e., information architecture) is platform independent, whereas visual and behavioral design options will depend very much on the chosen platform. For example, a given information architecture may specify a hierarchical menu structure of categories and subcategories of products. Current Web platforms (i.e., browsers, browser versions, plug-ins) allow drop-down menus much like a traditional "GUI" menu bar structure as an option for presenting second (and even third) level navigational choices, whereas earlier browsers did not, requiring instead that different levels in a menu hierarchy be presented as sequences of pages with embedded links.
In conceptual model design, still within Level 1, the focus is still on navigation, but high-level design standards for presenting the information architecture visually are generated. Neither page content nor page design standards (i.e., visual presentation of page content) are addressed during this design task.
A good conceptual model design eliminates the need for the commonly seen "Site Map" page on Websites; that is, the user interface itself reveals the overall site structure at all times and makes it clear where you currently are in it, how you got there, and where you can go from there. A familiar example of how to achieve this is to provide a left-hand nav bar that displays an expandable/contractible set of hierarchical page links. Within this structure, the link to the current page can be cued by some sort of highlight and be inactive.
Visibility and clarity of the information architecture are large parts of what we want to achieve in Website conceptual model design. However, another key goal in a conceptual model design for a Website is persuasion. Also, we want the graphic design to be aesthetically appealing as well as appropriate to the business, to create a particular atmosphere designed to attract the target audience, and to provide strong branding.
In Level 2, page design standards, a second set of standards for the Website is generated for visually presenting and interacting with page content. This new set of standards is designed in the context of both the information architecture and the conceptual model design standards that have already been generated and (in some cases) validated.
Page design standards for a Website would typically cover the consistent use and presentation of such things as content headers and subheaders, forms design, use of color cues, and the like. They might include a set of templates illustrating content layout standards for different categories of pages (e.g., fill-in forms, information-only pages, product description pages, pop-up dialog boxes). Consistency in the way all these things are applied in the design will again, just as it does in the case of conceptual model design standards, greatly facilitate the process of learning and remembering how to use the site. This is particularly important on Websites that will be used primarily by casual and discretionary users, as is the case with many eCommerce and other types of sites.
The standards documented during the conceptual model design and page design standards tasks, as well as the information architecture design, will dictate the detailed UX design of a large percentage of a site's functionality. Level 3, detailed UX design, is thus largely a matter of correctly and consistently applying standards already defined and validated to the actual detailed design of all pages and pop-up windows across the site.
However, there will always be unique details here and there across pages to which no particular established standard applies. These must still be designed, and designed well. Also, these design decisions should be driven by requirements data and evaluated.
In my 30 years of software user interface design I have found a top-down approach to user interface design to be most effective and efficient as a design process within the overall usability engineering lifecycle.
2.2 A UX PROCESS LIFECYCLE TEMPLATE
In Figure 2-1 we depict a basic abstract picture of activities for almost any kind of design, a cycle of the four elemental UX activities-Analyze, Design, Implement, and Evaluate-that we refer to generically as analysis, design, implementation, and evaluation. These four activities apply whether you are working with an architectural design, a hardware design, or a new car concept.
In the context of interaction design and UX, this abstract cycle translates to our UX lifecycle template of Figure 2-2, which we call the Wheel.
In our lifecycle concept, specific to a UX process, analysis translates to understanding user work and needs. Design translates to creating conceptual design and determining interaction behavior and look and feel. Implementation translates to prototyping, and evaluation translates to ways to see if our design is on track to meet user needs and requirements.

Figure 2-1: Universal abstract activity cycle of Analyze, Design, Implement, and Evaluate.

Figure 2-2: The Wheel: A lifecycle template illustrating the process part of this book.

In a larger system view, implementation includes a final production of hardware and software, including the user interface. However, in our UX lifecycle template, implementation is limited to the interaction design component, and prototyping is the design manifestation we use for evaluation before it is finalized for production.
The evaluation activity shown in Figure 2-2 includes both rigorous and rapid evaluation methods for refining interaction designs. Beyond that evaluation activity, the entire lifecycle is evaluation centered in the sense that the results of potentially every activity in the lifecycle are evaluated in some way, by testing, inspecting, analyzing, and taking it back to the customers and users.
The entire lifecycle, especially the prototyping and evaluation activities, is supplemented and guided by UX goals, metrics, and targets, as described in Chapter 10.
As you will see, this is not a lifecycle that must be followed rigidly, nor must any particular activity, sub-activity, or iteration be performed-this is just a template showing all the possibilities. Each of these activities, and many of the more specific sub-activities, corresponds to one or more process-oriented chapters, among Chapters 3 through 19, of this book.
2.2.1 Lifecycle Terminology
Each of the four UX process activities in Figure 2-2 can have sub-activities, which are the major ways of carrying out the basic activities. As an example, for the analysis activity, possible sub-activities include contextual inquiry (Chapter 3), contextual analysis (Chapter 4), requirements extraction (Chapter 5), and contextual data modeling (Chapter 6).
A method is a general approach to carrying out an activity or sub-activity.
For example, lab-based evaluation (Chapters 12 and 14 through 17) is a method for the evaluation activity. A technique is a specific practice applied within a
method. For example, the "think-aloud" technique is a data collection technique that can be used within the lab-based evaluation method for the evaluation activity.
2.2.2 UX Process Activities
Analyze: Understanding the business domain, user work, and user needs
The left-most of the four basic activity boxes in Figure 2-2 represents the analysis process activity. Among the many possible sub-activities to support analysis are contextual inquiry (Chapter 3) and contextual analysis (Chapter 4) for studying customer and user work practice in situ, from which we can infer user needs for a new system design.
Extracting requirements (Chapter 5) from contextual data is another analysis sub-activity. The requirements, if you choose to use them, are interaction design requirements, inputs driving the design process and helping to determine its features and the look, feel, and behavior of the interaction design. These requirements are used as a checklist to ensure that they are covered in the design, even before any UX evaluation.
Finally, synthesizing design-informing models is yet another possible analysis sub-activity. Design-informing models (Chapter 6) are abstractions of different dimensions of the work activity and design space. If you choose to use them, these include models describing how work gets done, how different roles in the work domain interact, the artifacts that are created, and so on.
Design: Creating conceptual design, interaction behavior, and look and feel
The upper-most box in Figure 2-2 represents the process activity for design, including redesign for the next version. Among the possible sub-activities to support design are design ideation and sketching (Chapter 7), where the team does creative design thinking, brainstorming, and sketching of new design ideas. Design ideation leads to the representation of mental models, conceptual design, and design storyboards. During the exploration of large numbers of design candidates, it can include physical mockups of product design ideas.
Design production is a design sub-activity involving the details of applying requirements, design-informing models, and envisioned design-informing models to drive and inform the emerging interaction design. Design production entails prototyping and iteration of the conceptual design, intermediate designs, and detailed designs.
Prototype: Realizing design alternatives
The right-most of the four basic activity boxes in Figure 2-2 represents the prototyping process activity. Prototype building is often done in parallel with, and in conjunction with, design. As designs evolve in designers' minds, they produce various kinds of prototypes as external design representations. Because prototypes are made for many different purposes, there are many kinds of prototypes, including horizontal, vertical, T, and local. Prototypes are made
at many different levels of fidelity, including low fidelity (especially paper prototypes), medium fidelity, and high fidelity (programmed functional prototypes), and "visual comps" for pixel-perfect look and feel.
Evaluate: Verifying and refining the interaction design
The process activity box at the bottom of Figure 2-2 represents the UX evaluation to refine an interaction design. For evaluation to refine, you can employ rapid evaluation methods (Chapter 13) or fully rigorous methods (Chapters 12 and 14 through 17). This evaluation is where we see if we achieved the UX targets and metrics to ensure that the design "meets usability and business goals" (ISO 13407, 1999).
2.2.3 Flow among UX Process Activities
Flow not always orderly
The depiction of UX process activities in distinct boxes, as in Figure 2-2, is a convenient way to highlight each activity for discussion and for mapping to chapters in this book. These process activities, however, do not in practice have such clear-cut boundaries; there can be significant overlap. For example, most of the boxes have
their own kind of evaluation, if only to evaluate the transition criterion at the exit point of each activity in the decision whether to iterate or move on.
Similarly, prototyping appears in many forms in other boxes, too. For example, the design activity entails lots of different kinds of prototypes, including sketches, which can be thought of as a kind of quick and dirty prototype to support rapid and frequent design idea exploration. In this same vein there can be a little design occurring within the analysis activity, and so on.
Managing the process with activity transition criteria
The primary objective of the overall lifecycle process is to keep moving forward and eventually to complete the design process and make the transition to production. However, for the work in a project to flow among the UX process activities, the team must be able to decide:
� when to leave an activity
� where to go after any given activity
� when to revisit a previous process activity
� when to stop making transitions and proceed to production
The answers depend on the transition criterion at the end of each process activity. There is no formula for determining transition criteria; they are generally based on whether designers have met the goals and objectives for the current iteration of that activity. Therefore, it is the job of the team, especially the project manager, to articulate those goals as transition criteria for each process activity and to decide when they are met.
For example, in the analysis activity, designers must ask themselves if they have acquired enough understanding of the work domain and user needs, usage context, workflow, and so on. Another component of any transition criterion is based on whether you have adequate resources remaining to continue.
Resource limits, especially time and budget, can trump any other criteria for stopping a process activity or terminating the whole process, regardless of meeting goals and objectives.
Note in Figure 2-2 that the transition criterion coming out of each UX process activity box is a multipath exit point with three options: move forward to the next process activity, iterate some more within the current activity, or move back to a previous process activity.
The decision of where to go next after a given process activity depends on the assessed quality of the product and/or work products of the current activity and a determination of what next activity is most appropriate. For example, after an initial prototyping activity, a usability inspection might indicate that the design is
ready for prototyping at a higher fidelity or that it is necessary to go back to design to fix discovered problems.
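This three-option exit point can be sketched as a simple decision rule. The following is only an illustrative abstraction; the parameter names and the ordering of the checks are our own, not part of the book's process:

```python
def transition_decision(goals_met, inputs_ok, resources_remaining):
    """Illustrative three-way exit decision for a UX process activity.

    goals_met: did this iteration meet the activity's goals and objectives?
    inputs_ok: are the inputs from earlier activities adequate?
    resources_remaining: is there budget/schedule left to keep working?
    Returns one of 'forward', 'iterate', or 'back'.
    """
    if not resources_remaining:
        return "forward"   # resource limits trump all other criteria
    if not inputs_ok:
        return "back"      # revisit an earlier activity to repair inputs
    if not goals_met:
        return "iterate"   # more work within the current activity
    return "forward"       # goals met: move on to the next activity
```

The ordering encodes the point made above: running out of resources forces the process forward no matter what, missing inputs send you back, and unmet goals keep you iterating in place.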
Knowing when you need inter-activity iteration depends on whether you need to pick up more information to drive or inform the design. When some of your inputs are missing or not quite right, you must revisit the corresponding process activity. However, this kind of inter-activity iteration does not mean you have to redo the whole activity; you just need to do a little additional work to get what you need.
Knowing when to stop iterating and proceed to production relies on a key process management mechanism. When UX targets (Chapter 10), often based on evaluation of user performance or satisfaction, have been employed in your process, project managers can compare evaluation results with target values and decide when to stop iterating (Genov, 2005).
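A minimal sketch of this management mechanism follows; the metric names and target values are hypothetical examples, not figures from the book:

```python
def met_all_targets(observed, targets):
    """Return True when every observed UX metric meets or beats its target.

    For metrics such as task time or error count, lower is better, so
    'meets the target' means observed value <= target value.
    """
    return all(observed[metric] <= target for metric, target in targets.items())

# Hypothetical UX targets in the spirit of Chapter 10
targets = {"mean_task_time_sec": 120, "errors_per_task": 1}

# Hypothetical evaluation results from two successive iterations
iteration_1 = {"mean_task_time_sec": 150, "errors_per_task": 2}  # keep iterating
iteration_2 = {"mean_task_time_sec": 110, "errors_per_task": 1}  # stop; targets met
```

In practice, satisfaction-style metrics run the other way (higher is better), so a real comparison would carry a direction flag per metric; the sketch keeps only the lower-is-better case for brevity.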
Why do we even need iteration?
Iteration is a little like the doctrine of original sin in interaction design: Most interaction designs are born bad and the design teams spend the rest of their lifecycles in an iterative struggle for redemption.
- Ford Prefect
Figure 2-3
Iteration: Ready, fire, aim.
Some people may question the need for iteration. Is that not just for novice designers who cannot get it right the first time? What about expert designers carefully applying complete knowledge of design guidelines and style standards? For any nontrivial interaction design, the UX process must be, and always will need to be, iterative. The design domain is so vast and complex that there are essentially infinite design choices along many dimensions, affected by large numbers of contextual variables.
To be sure, expert designers can create a good starting point, but because it is fundamentally impossible to get it all just right the first time, we need to use the artillery approach (Figure 2-3): Ready, Fire, Aim. We need to fire off our best shot, see how it missed the mark, and make corrections to home in on the target.
Iteration is not enough
The road to wisdom? Well, it's plain and simple to express: Err and err and err again but less and less and less.
- Piet Hein, Danish poet
So, if we must always iterate, is there any motivation for trying hard to get the first design right? Why not avoid the effort upfront and let this marvel of iteration evolve it into perfection? Again, the answer is easy. You cannot just test your way to a quality user experience; you have to design for it. Iterative testing and redesign alone will not necessarily get you to a good design at the end of the day.
As Wyatt Earp once said, "Take an extra second to aim." Large interactive systems take a lot of time and money to develop; you might as well put
a little more into it up front to make it right. Without an honest and earnest upfront analysis and design effort, the process tilts too heavily toward just evaluation and becomes a unidimensional diagnostic-oriented process.
To use a truly geeky example, consider a program traversing an n-dimensional surface, seeking a solution to a numerical analysis problem. If the search starts with the wrong "seed," or initial point (i.e., an initial solution too far from the actual solution), the algorithm can stop at a local optimum, such as a saddle point, in a part of the search space so remote from the globally optimal solution (if there is one) that no amount of iteration can migrate out to reach it. Similarly, in iterative interaction design, you can home in on the best details of a less-than-best design: honing a paring knife when you really need a butcher knife. Fixing the details of a bad design may never reveal the path to a completely new and better overall design.
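The seed sensitivity in this analogy can be demonstrated with a toy one-dimensional gradient descent. The function and step size below are our own illustrative choices: a poor starting point converges to a worse local minimum that no amount of further iteration can escape.

```python
def f(x):
    # A curve with a global minimum near x = -1.06
    # and a worse local minimum near x = +0.93
    return x**4 - 2 * x**2 + 0.5 * x

def df(x):
    # Derivative of f, used for gradient descent
    return 4 * x**3 - 4 * x + 0.5

def descend(x, lr=0.01, steps=5000):
    """Plain gradient descent: iterate toward the nearest minimum."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

good_seed = descend(-2.0)  # settles near the global minimum
bad_seed = descend(2.0)    # stuck in the worse local minimum
```

No matter how many more steps you run from the bad seed, the iteration only polishes the inferior solution, which is exactly the paring-knife-versus-butcher-knife point.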
So, the answer is about balance among all four process activities of Figure 2-1 (analyze, design, implement, and evaluate) for a given amount of resources.
Start iteration early
The earlier the interaction design iteration begins, the better; there is no time to steer the ship when it is close to the dock. But the software implementation does not have to keep up with this iteration; instead we use interaction design prototypes, and there is no reason any production code should be committed to the interaction design until late in its lifecycle. Nevertheless, because the two roles cannot work in isolation, the software engineering people should be aware of the progression of the interaction design to ensure that their software architecture and design can support the interaction features on the user interface when it does come time to implement.
Typically, early cycles of iteration are devoted to establishing the basic underlying essentials of the design, including look and feel, and behavior, before getting into design details and their refinement. Project managers need to allow time for this explicitly in the schedule. It is an investment that pays generous dividends on everything that happens after that.
The rest of the process-related part of this book is mainly about iterating the activities in the diagram of Figure 2-2, plus a few other things in important supporting roles.
2.2.4 Lifecycle Streams
We mostly talk about complete lifecycles, where there is a clear-cut start and end to the project and where the design ideas are hatched creatively out of the imaginations of the designers. In reality that is often not the case. Often the "lifecycle" for a product never really starts or stops; it just goes on forever (or at least seems to) through multiple versions. Operating systems, such as Mac OS X and Microsoft Windows, are good examples.
The lifecycle is more of a continuous stream of reusing and, hopefully, improving ideas, designs, and deliverables or work products. In such cases the project can be heavily constrained by previously existing versions, code, and documentation. The need for stability and an orderly progression across versions makes it almost impossible to avoid the kind of inertia that works against new designs and radical rethinking. It is important for UX practitioners to make the case for at least the most important changes, changes that contribute to an eventual design evolution toward user experience improvement.
2.3 CHOOSING A PROCESS INSTANCE FOR YOUR PROJECT
Increasingly, the need to rush products to market to beat the competition is shortening development schedules and increasing the number of product versions and updates. Web applications must be developed in "Internet time." Ponderous processes and methods are abandoned in favor of lightweight, agile, and flexible approaches intended to be more responsive to the market-driven need for short product versioning cycles. Abridged methods notwithstanding, however, knowledge of the rigorous UX process is an essential foundation for all UX practitioners and it is important for understanding what is being abridged or made agile in choices for the shorter methods.
The lifecycle process diagram in Figure 2-2 is responsive to the need for many different kinds of UX processes. Because it is a template, you must instantiate it for each project by choosing the parts that best suit your project parameters. To support each of these activities, the team can pick from a variety of sub-activities, methods, techniques, and the level of rigor and completeness with which these activities are carried out. The resulting instantiation can be a heavyweight, rigorous, and complete process or a lightweight, rapid, and "just enough" process.
That choice of process can always be a point of contention-between academics and practitioners, between sponsor/customer and project team, and
among team members within a project. Some say "we always do contextual inquiry" (substitute any UX process activity); they value a thorough process, even if it can sometimes be costly and impractical. Others say "we never do contextual inquiry (or whatever process activity); we just do not have the time"; they value doing it all as fast as possible, even if it can sometimes result in a lower quality product, with the idea of improving the quality in later production releases.
Much has been written about powerful and thorough processes and much has been written about their lightweight and agile counterparts. So how do we talk about UX design processes and make any sense?
2.3.1 Project Parameters: Inputs to Process Choices
In reality there are as many variations of processes as there are projects. How do you decide how much process is right for you? How do you decide the kinds of process to choose to match your project conditions? What guidance is there to help you decide? There are no set rules for making these choices. Each factor is an influence and they all come together to contribute to the choice. The lifecycle template in this chapter and the guidelines for its instantiation are a framework within which to choose the process best for you.
Among the many possible factors you could consider in choosing a process to instantiate the lifecycle template are:
� risk tolerance
� project goals
� project resources
� type of system being designed
� development organizational culture
� stage of progress within project
One of the biggest goal-related factors is risk and the level of aversion to risk in a given project. The less tolerance there is for risk (things going wrong, missing features or requirements, failing to meet the needs of users), the greater the need for rigor and completeness in the process.
Budget and schedule are obvious examples of the kinds of resource limitations that could constrain your process choices. Another important kind of resource is person power. How many people do you have, what project team roles can they fill, and what skills do they bring to the project? Are the people you have, and the strengths they bring, a good match for this type of project?
Practitioners with extensive experience and maturity are likely to need less of some formal aspects of the rigorous process, such as thorough contextual inquiry or detailed UX goals and targets. For these experienced practitioners, following the process in detail does not add much to what they can accomplish using their already internalized knowledge and honed intuition.
For example, an expert chef has much of his process internalized in his head and does not need to follow a recipe (a kind of process). But even an expert chef needs a recipe for an unfamiliar dish. The recipe helps off-load cognitive complexity so that the chef can focus on the cooking task, one step at a time.
Another project parameter has to do with the demands due to the type of system being designed. Clearly you would not use anything like the same lifecycle to design a personal mp3 music player as you would for a new air traffic control system for the FAA.
Sometimes the organization self-selects the kind of processes it will use based on its own tradition and culture, including how they have operated in the past. For example, the organization's market position and the urgency to rush a product to market can dictate the kind of process they must use.
Also, certain kinds of organizations have their culture so deeply built in that it pre-determines the kinds of projects they can take on. For example, if your organization is an innovation consulting firm such as IDEO, your natural process tools will be predisposed toward ideation and sketching. If your organization is a government contractor, such as Northrop Grumman, your natural process tools will lean more toward a rigorous lifecycle.
Somewhat orthogonal to and overlaid upon the other project parameters is the current stage of progress within the project for which you must choose activities, methods, and techniques. All projects will go through different stages over time. Regardless of process choices based on other project parameters, the appropriateness of a level of rigor and various choices of UX methods and techniques for process activities will change as a project evolves through various stages.
For example, early stages might demand a strong focus on contextual inquiry and analysis but very little on evaluation. Later stages will have an emphasis on evaluation for design refinement. As the stage of progress keeps changing over time, it means that the need to choose a level of rigor and the methods and techniques based on the stage of product evolution is ongoing. As an example, to evaluate an early conceptual design you might choose a quick design review using a walkthrough and later you might choose UX inspection of a low-fidelity prototype or lab-based testing to evaluate a high-fidelity prototype.
2.3.2 Process Parameters: Outputs of Process Choices
Process parameters or process choices include a spectrum from fully rigorous UX processes (Chapters 3 through 17) through rapid and so-called discount methods. Choices also can be made from among a large variety of data collection techniques. Finally, an agile UX process is available as an alternative choice for the entire lifecycle process, a process in which you do a little of each activity at a time in a kind of spiral approach.
2.3.3 Mapping Project Parameters to Process Choices
To summarize, in Figure 2-4 we show the mapping from project parameters to process parameter choices. While there are some general guidelines for making these mapping choices, fine-tuning is the job of project teams, especially the project manager. Much of it is intuitive and straightforward.
In the process chapters of this book, we present a set of rather rigorous process activities, but we want the reader to understand that we know about
Figure 2-4 Mapping project parameters to process parameter choices.
real-world constraints within tight development schedules. So, everywhere in this book, it should be understood that we encourage you to tailor your own process to each new project, picking and choosing process activities and techniques for doing them, fitting the process to your needs and constraints.
2.3.4 Choose Wisely
A real-world Web-based B2B software product company in San Francisco had a well-established customer base for their large complex suite of tools. At some point they made major revisions to the product design as part of normal growth of functionality and market focus. Operating under at least what they perceived as extreme pressure to get it to the market in "Internet time," they released the new version too fast.
The concept was sound, but the design was not well thought through and the resulting poor usability led to a very bad user experience. Because their customers had invested heavily in their original product, they had a somewhat captive market. By and large, users were resilient and grumbled but adapted. However, their reputation for user experience with the product was changing for the worse and new customer business was lagging, finally forcing the company to go back and completely change the design for improved user experience. The immediate reaction from established customers and users was one of betrayal. They had invested the time and energy in adapting to the bad design and now the company changed it on them-again.
Although the new design was better, existing users were mostly concerned at this point about having a new learning curve blocking their productivity once again. This was definitely a defining case of taking longer to do it right vs. taking less time to do it wrong and then taking even longer to fix it. By not using an effective UX process, the company had quickly managed to alienate both their existing and future customer bases. The lesson: If you live by Internet time, you can also crash and burn in Internet time!
2.4 THE SYSTEM COMPLEXITY SPACE
One of the things that makes it difficult to define a process for system design is that there is a spectrum of types of systems or products to be developed, distinguished mainly by complexity, each needing a somewhat different process and approach. In the next few sections we look at what is entailed in understanding this spectrum of system types.
Some systems are a combination of types and some are borderline cases. System or product types overlap and have fuzzy boundaries within the system complexity space. While there undoubtedly are other different ways to partition the space, this approach serves our purpose.
In Figure 2-5 we show such a "system complexity space" defined by the dimensions of interaction complexity and domain complexity. Interaction complexity, represented on the vertical axis, is about the intricacy or elaborateness of user actions, including cognitive density, necessary to accomplish tasks with the system.
Low interaction complexity usually corresponds to smaller tasks that are generally easy to do on the system, such as ordering flowers from a Website. High interaction complexity is usually associated with larger and more difficult tasks, often requiring special skills or training, such as manipulating a color image with Adobe Photoshop.
On the horizontal axis in Figure 2-5 we show work domain complexity, which is about the degree of intricacy and the technical nature of the corresponding field of work. Convoluted and elaborate mechanisms for how parts of the system work and communicate within the ecology of the system contribute to domain complexity.
Figure 2-5 Example systems within the system complexity space (interaction complexity vs. domain complexity).
The work in domain-complex systems is often mediated and collaborative, with numerous "hand-offs" in a complicated workflow containing multiple dependencies and communication channels, along with compliance rules, regulations, and exceptions in the way work cases are handled. Examples of complex work domains include management of arcane financial instruments such as credit default swaps, geological fault analysis for earthquake prediction, and healthcare systems.
Low work domain complexity means that the way the system works within its ecology is relatively simple. Examples of work domains with low complexity include that same Website for buying flowers and a simple calendar management application.
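The two axes suggest a simple quadrant classification. The threshold and the normalized scores below are hypothetical illustrations of how a team might place a product in this space, not values from the book:

```python
def quadrant(interaction_complexity, domain_complexity, threshold=0.5):
    """Classify a product in the system complexity space of Figure 2-5.

    Both inputs are normalized 0..1 scores (a hypothetical scale);
    anything above the threshold counts as 'complex' on that axis.
    """
    i = "complex" if interaction_complexity > threshold else "simple"
    d = "complex" if domain_complexity > threshold else "simple"
    return f"{i} interaction, {d} work domain"

# Hypothetical placements of the chapter's running examples
flower_website = quadrant(0.2, 0.1)       # buying flowers on a Website
tax_software = quadrant(0.3, 0.8)         # household tax preparation
air_traffic_control = quadrant(0.9, 0.9)  # air traffic control system
```

Real placements are a judgment call on fuzzy, overlapping regions rather than a crisp threshold, but the sketch captures how the two dimensions jointly determine the quadrant, and thus the level of process rigor discussed next.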
2.4.1 The Influence of System Type on Process Choice
The location of the system or product you are designing within the system complexity space can have a major influence on process choices about the right level of rigor and the right techniques to apply. To describe the criteria UX designers can use to make the call, we look at characteristics of the four quadrants of the system complexity space in Figure 2-5.
As we move along the diagonal through this space from lower left to upper right, going from simple systems to complex systems, there is (as a generalization) a gradation of required rigor and fidelity in the corresponding processes. The quadrants are discussed in the following subsections.
Complex interaction, complex work domain
In the upper right-hand quadrant of Figure 2-5 we show the interaction-complex and domain-complex systems, which are usually large and complicated. An example of a complex interaction is an air traffic controller deciding the landing orders for an incoming airliner. An air traffic control system also has enormous domain complexity, with workflow and collaboration among a large number of work roles and user types. Another defining example for this quadrant is a large system for the Social Security Administration.
Systems appearing in this quadrant are often associated with the greatest need to manage risk. Such projects will usually entail doing all the process activity boxes in detail, along with lots of iteration. These are the development projects with the greatest compliance requirements, the most weight given to traceability, and the highest importance of error avoidance.
For example, in mission-critical systems, such as for air traffic control or for military weapons control, there is great user pressure for error avoidance. When you cannot get these things wrong and the cost of failure is unacceptable, you need the most rigorous process, the full process spelled out in great detail in Chapters 3 through 18.
Because of their sheer size and need for rigorous processes, domain-complex and interaction-complex systems are typically among the most difficult and expensive to design and develop. A decidedly engineering approach to formal requirements can be very important to help designers touch all the bases and fill in all the blanks so that no functions or features are forgotten.
This is the kind of system for which design is most likely to need full lab-based user experience evaluation and iteration to produce a well-considered and coherent overall design. This is about the design of serious systems; this sector within the system complexity space has little, if anything, to do with emotional impact factors such as aesthetics, fun, or joy of use.
For large domain-complex systems, such as military weapons systems, you are most likely to encounter resistance to innovation. Radical designs are not always welcome; conformity can be considered more important. Users and operators, in some cases, commit operations to habit and perform tasks with learned behavior even when there are better ways. This might be an opportunity for you to champion change and fight against the "this is not how we do it" syndrome, but you must approach this campaign with caution.
Gaming applications can be in this quadrant but they also can span both axes throughout the space.
Usability Engineering for Bioinformatics: Decoding Biologists' and Bioinformaticians' Work Sequences
Deborah Hix and Joe Gabbard, Virginia Bioinformatics Institute and Department of Computer Science, Virginia Tech
Over a collective four decades in usability engineering (UE), we have worked in a broad variety of application
domains including military (e.g., decision support systems, situational awareness applications), government (e.g., Social Security Administration), and commercial (e.g., software and hardware companies). The realm of bioinformatics
is as complicated as any other single domain we have encountered. This is at least in part because of its fast-changing nature, the current explosion of genomic and related data, the complexity of the field itself, and the technology backgrounds and attitudes of biologists and bioinformaticians.
When we began working in the Virginia Bioinformatics Institute (VBI) at Virginia Tech, approximately 8 years ago, there was almost no knowledge of the existence of usability engineering, never mind any structured use of it in developing complex bioinformatics applications. During this time, we have seen a slight increase in UE efforts in this field, but many Web-based interfaces (with the exception of large government-funded ones) still look like they were created by graduate students! This is a nonoptimal situation in a world of increasingly interactive and sophisticated Web interfaces and apps.
Designing and evaluating user interfaces for biologists and bioinformaticians are challenging in part due to the increasing availability of inexpensive genome sequencing technology, resulting in an explosion of data-in volume, complexity, and heterogeneity. Today at the lab workbench, biologists have access to a staggering flow of data of unprecedented breadth, depth, and quantity.
Further, biologists rarely use a single tool to accomplish a given task; they frequently move data across applications and tools using, for example, desktop-based applications (such as Excel) as well as Web-based resources (such as NCBI's BLAST). So, by necessity, a single technology source or tool or app cannot support their workflow, as their workflow is typically accomplished across multiple applications, Websites, and/or tools. This situation emphasizes the importance of good contextual/domain analysis and design in the UE process.
We have also seen that applications and Websites for biologists and bioinformaticians often need to support a broad variety of multiple work threads for an extensive variety of user classes. That is, the bioinformatics field intersects many specialized disciplines, and as a result, there are numerous types of user classes, each performing varied and mutually exclusive tasks. Moreover, users in this field often solve the same problem using different approaches, increasing the number of possible workflows (including specific data and tools needed) for each task. A single huge online data repository could have more than half a dozen (or even many more) very different user classes, all with different use cases and specific work flows. This situation emphasizes the importance of good user profiles in the UE process.
Finally, biologists are not necessarily early adopters of information technology. They are well versed in cutting-edge biology, but not cutting-edge computer technology. Many have, of necessity, done their own specialized application or Website development, becoming "expert enough" in tools such as scripting and Perl. This is also changing; biologists are relying less on programming- or scripting-savvy approaches. The more advanced their tools and analysis needs get, the more biologists rely on someone else's bioinformatics or software development skills to meet their needs. In today's Web 2.0 application space, most biologists want Web-based applications that support performance of very complicated user tasks without having to do (or oversee) scripting or programming themselves.
When we began in this field all those years ago, we had several approaches to introducing and promoting acceptance of UE activities into a VBI world devoid-and unaware-of them. These included immersion, "starting small," and education.
We made sure our offices were colocated with the group (of biologists and software developers) with which we were working so that we could immerse ourselves and be ever present with them. Otherwise, we might have been
viewed as "a priest with a parachute," flying in to "bless" what they had done, but having little or no substantive input to either process or product. We carefully chose a small part of the UE process to perform on a small part of our product, a Web repository named Pathosystems Resource Integration Center (PATRIC), funded by the National Institutes of Health (patric.vbi.vt.edu). Choosing what part of the product with which to begin UE should be based on a feature or function that is very important, of high visibility, and/or of high utility to users; preferably something with a "wow" factor that will make a splash. Choosing what small part of the process with which to begin should also be based on factors such as availability of appropriate users with whom to work (these may be very difficult to
come by early on in an environment that has little or no UE in place, such as VBI) and current state of development of the product.
Our first substantive small UE activity was an expert evaluation (or expert inspection) of an existing in-house product that was being used to inform development of PATRIC. We chose this knowing we did not have a readily available pool of users for either domain analysis activities or a lab-based formative evaluation and that an expert evaluation did not need them. We were extremely careful in how we wrote our expert evaluation report so as not to alienate software engineers, who, to date, had designed all VBI user interfaces, with little or no interaction with users. During this time, we began to cultivate a PATRIC user group of appropriate biologists and bioinformaticians, and moved on to structured interviews and focus group-like sessions that would lead to domain analysis and user profiles. In addition to getting us much-needed information for UE, these sessions also helped expose users and developers to the UE process in a nonthreatening way. After several months, we were able to develop wireframe mockups and present them to some of our initial users, plus other stakeholders who had not been involved in domain analysis. For these earliest formative evaluations, we engaged both in-house users and remote users; for remote users, we used desktop-sharing software to present wireframes and semiworking prototypes to elicit feedback. In addition to this carefully chosen progression of UE activities, we had cooperative management who agreed to provide education; every member of the PATRIC team was required to take a 3-day intensive short course on UE.
Finally, we found that patience and persistence were nontechnical but key ingredients in this progression! It took many months to slowly and carefully insert UE practices into the PATRIC software development environment. When we encountered roadblocks, both passive aggressive and outright aggressive, we would regroup, figure out a different way to proceed, and continue moving forward. We promoted our "success stories" among the group and tried to make everyone feel continually and substantively involved in the process. We had a major breakthrough when, one day, our meeting discussion turned to some topic specifically related to user interface design, and the lead software engineer looked directly at us and announced, "That is Debby and Joe's problem!" They finally got it!
Simple interaction, complex work domain
In the lower right-hand quadrant of Figure 2-5 we show interaction-simple and domain-complex systems. In this quadrant, user tasks are relatively simple and easy to understand. The key effort for users in this quadrant is understanding the domain and its often esoteric work practice. Once that is understood, the interaction is relatively straightforward for users. Tax preparation software for
average households is a good example because the underlying domain is complex but the data entry into forms can be simplified to a step-by-step process.
In the UX process, interaction simplicity means that less attention to task descriptions is needed, but the domain complexity calls for more attention to contextual inquiry and analysis, modeling, and requirements for insight into internal system complexity and workflow among multiple work roles. Physical modeling and the social model of Chapter 6 become more important to gain access to the essentials of how people and information interact within the system.
Simple interaction, simple work domain
The simplest quadrant is in the lower left-hand corner of Figure 2-5, where both interaction and work domain are simplest. This quadrant contains smaller Websites, certain interactive applications, and commercial products. Just because this is the simple-simple quadrant, however, does not mean that the products are simple; the products of this quadrant can be very sophisticated.
Although emotional impact factors do not apply to every system or product in this quadrant, this sector within the system complexity space has the most to do with emotional impact factors such as aesthetics or fun or joy of use. This quadrant also represents projects that are design driven, where the UX process is all about design rather than user research or user models.
There is an abundance of relatively simple systems in the world. Some, but not all, commercial software products are domain-simple and interaction- simple, at least relative to large systems of other types. An example, shown in Figure 2-5, is a Website for ordering flowers. Interaction with this Website is very simple; just one main task involving a few choices and the job is done.
Work domain complexity of a Website for buying flowers is also relatively simple because it involves only one user at a time and the workflow is almost trivial.
Because of the simplicity in the work domain and interaction in this quadrant, good choices for a UX process lean toward agile approaches with a focus on design and special rapid methods for evaluation. That translates to a low level of rigor: leaving out some process activities altogether and using lightweight or specialized techniques for others.
The best designers for expert users in this case might be "dual experts," experts in HCI/UX and in the work domain. An example is a designer of Adobe Lightroom who is also deeply involved in photography as a personal hobby.
This quadrant is also where you will see innovative commercial product development, such as for an iPhone or a personal mp3 music player, and corresponding emotional impact issues and, where appropriate (e.g., for an
mp3 personal music player but not for a florist's Website), phenomenological aspects of interaction.
These products represent the least need for a complete rigorous lifecycle process. Designers of systems in this quadrant need not expend resources on upfront user research and analysis or requirements gathering. They can forego most of the modeling of Chapter 6 except, perhaps, specific inquiry about users and their activities, with a special interest in user personas.
Although commercial product design certainly can benefit from thorough contextual inquiry, for example, some successful products were essentially "invented" first and then marketed. The Apple iPad is a good example; the designers did not begin with a field study of existing usage patterns. They dreamed up a product that was so good that people who thought they would never be interested in such a product ended up fervently coveting one.
Projects in this quadrant are far less engineering oriented; design will be based almost entirely on a design-thinking approach. Designers are free to focus on imaginative design thinking, ideation, and sketching to make the user experience the best it can be. Processes for this type of system usually face low risks, which means designers can put innovation over conformity (for example, the iPod over previous music players) and are free to envision radically new design ideas.
Early prototyping will center on multiple and disposable sketches for exploring design ideas. Later, low-fidelity prototypes will include paper prototypes and physical mockups. Subsequent evaluation will be about using rapid methods to get the conceptual design right and not being very concerned with user performance or usability problems.
Complex interaction, simple work domain
In the upper left-hand quadrant of Figure 2-5 we show interaction-complex and domain-simple systems. It is typical of an interaction-complex system to have a large volume of functionality resulting in a large number and broad scope of complex user tasks. A digital watch is an example. Its interaction complexity stems from a large variety of modal settings using overloaded and unlabeled push buttons. The domain, however, is still simple, being about "what time is it?" Workflow is trivial; there is one work role and a simple system ecology.
Attention in this quadrant is needed for interaction design: myriad tasks, screen layouts, user actions, even metaphors. Rigorous formative evaluation is needed for conceptual design and detailed interaction. The focus of modeling will be on tasks (task structure and task interaction models) and perhaps the
artifact model, but not much attention will be given to work roles, workflow, or most of the other models of Chapter 6.
For simple work domains, regardless of interaction complexity, contextual inquiry and contextual analysis rarely result in learning something totally new that can make a difference in informing design. Rather, even more than for a simple interaction case, complex interaction requires a focus on ideation and sketching, as well as attention to emotional impact factors.
The commercial product perspective within the system complexity space
"Commercial products" is a good label for the area that spans the left-hand side of the system complexity space diagram in Figure 2-5, where you find relatively low domain complexity but variable interaction complexity. The more interaction complexity, the more sophisticated users can be.
Gradations within the system complexity space
Many systems and design projects fall across quadrants within the system complexity space. Websites, for example, can belong to multiple quadrants, depending on whether they are for an intranet system for a large organization, a very large e-commerce site, or just a small site for sharing photographs. Products such as a printer or a camera are low in domain complexity but can have medium interaction complexity.
One good illustration of complexity vs. process rigor is seen in systems for managing libraries, shown in the middle of the work domain complexity scale of Figure 2-5, near the bottom. Typical library systems have low interaction complexity because the scope of tasks and activities for any one user is fairly circumscribed and straightforward and the complexity of any one user task is low. Therefore, for a library system, you do not need to model tasks too much.
However, a full library system has considerable domain complexity. The work practice of library systems can be esoteric and most UX designers will not be knowledgeable in this work domain. For example, special training is needed to handle the surprisingly important small details in cataloging procedures. Therefore, a rigorous approach to contextual inquiry and analysis may be warranted.
Because of the high work domain complexity, there is a need for thorough contextual data modeling to explain how things work in that domain. As an example, the overall workflow entails book records connected in a network, including cataloging, circulation tracking, searching, and physical shelf location. A full flow model may be necessary to understand the flow of information among the subsystems.
Healthcare systems are another example of projects that cross system complexity space quadrants. Large healthcare systems that integrate medical instrumentation, health record databases, and patient accounting have somewhat complex work domains.
The healthcare domain is also saddled with more than its share of regulation, paperwork, and compliance issues, plus legal and ethical requirements, all of which lead to high work domain complexity, but not as high as air traffic control, for example. Machines in a patient's room have a fairly broad scope of tasks and activities, giving them relatively high interaction complexity.
We refer to the system complexity space throughout the rest of the process chapters in discussions about how much process is needed. For simplicity we will often state it as a tradeoff between systems with complex work domains, which need the full rigorous UX process, and systems with relatively simple work domains, which need less rigor but perhaps more attention to design thinking and emotional impact.
Since simple work domains correspond roughly to the left-hand side of the system complexity space of Figure 2-5, where most commercial products are found, we will often use the term "commercial products" as a contrast to the complex domain systems, even though it is sometimes possible for a commercial product to have some complexity in the work or play domain.
2.5 MEET THE USER INTERFACE TEAM
Whatever you are, be a good one.
- Abraham Lincoln
One early stage activity in all interactive software projects is building the UX team. Someone, usually the project manager, must identify the necessary roles and match them up with available individuals. Especially in small projects, the different roles are not necessarily filled with different people; you just need to maintain the distinction and remember which role is involved in which context and discussion.
In addition to the software engineering roles, here we are mainly concerned with roles on the UX team. Roles we can envision include the following:
• User researcher: involved with contextual inquiry and other work domain analysis activities. You may also need other, even more specialized roles, such as a social anthropologist to perform in-depth ethnographic field studies.
• Users, user representatives, customers, and subject matter experts: used as information sources in contextual inquiry and throughout the lifecycle.
• User interaction designer: involved with ideation and sketching, conceptual and detailed design, and low-fidelity prototyping activities.
• UX analyst or evaluator: involved in planning and performing UX evaluations, analyzing UX problems, and suggesting redesign solutions.
• Visual/graphic designer: involved in designing look and feel and branding and helping interaction designers with visual aspects of designs.
• Technical writer: involved in documentation, help system design, and language aspects of interaction designs.
• Interactive prototype programmer: involved in programming interactive high-fidelity UX design prototypes.
• UX manager: someone with overall responsibility for the UX process.
Figure 2-6
Example UX team roles in the context of the Wheel lifecycle template.
Some of these roles are shown with respect to the lifecycle activities in Figure 2-6.
Often terms for team roles are used loosely and with overlap. For example, "UX engineer" or "UX practitioner" are catch-all terms for someone who does contextual analysis, design, and evaluation on the UX side.
As a further consideration, in many projects, team composition is not static over the whole project. For example, people may come and go when their special talents are required, and it is not unusual for the team to get smaller near the end of the lifecycle. Often near the end of the version or release cycle, much of the project team gets reassigned and disappears and you get a possibly new and much smaller one, with a much shorter remaining lifecycle.
2.6 SCOPE OF UX PRESENCE WITHIN THE TEAM
In the early days of usability it was often assumed that a usability practitioner was needed only in small doses and only at certain crossroads within the project schedule, resulting in a rough and frustrating life for the usability person in the trenches. In project environments, they were treated as temp workers with narrow purviews and meager responsibilities, getting no real authority or respect.
Software developers grudgingly let the usability practitioner, who was probably a human factors engineer, look at their designs more or less after they were done. Because they were not a bona fide part of the project, they played a secondary role, something like a "priest in a parachute": The human factors engineer dropped down into the middle of a project and stayed just long enough to give it a blessing. Anything more than a few minor changes and a blessing was, of course, unacceptable at this point because the design had progressed too far for significant changes.
2.7 MORE ABOUT UX LIFECYCLES
Just as a lifecycle concept did not always exist in the software development world, the need for a separate development lifecycle for the interaction design has not always been recognized. Moreover, once a lifecycle concept was introduced, it took time for the idea to be accepted, as it had done for software in prior decades.
The Hix and Hartson book (1993) was one of the first to emphasize a separate lifecycle concept for interaction design. Among early calls to arms in this evolutionary struggle to establish acceptance of a disciplined usability process were pleas by Curtis and Hefley (1992). They argued that "interface engineering," as they called it, required an engineering discipline just like any
other: "All engineering disciplines, including interface engineering, require the definition of a rigorous development process."
Hefley and friends followed this up with a CHI '96 workshop asking the question, "User-centered design principles: How far have they been industrialized?" (McClelland, Taylor, & Hefley, 1996). They concluded that the field was, indeed, evolving toward acceptance, but that there was still a lack of understanding of the interaction design process and a shortage of skills to carry it out. Raising awareness within management and marketing roles in the software world was a priority. Mayhew (1999b) helped solidify the concept with practitioners through a pioneering tour de force handbook-style codification of lifecycle activities and deliverables.
Usability engineering as a term and as a concept was coming into existence in the early 1990s. In his celebratory 1996 retrospective, Butler (1996) attributed the actual coining of the term "usability engineering" to John Bennett in the 1980s. Here, Butler provided a review of the discipline's state of the art as it began to mature after the first 10 years and argued for a need to integrate usability engineering using a "comprehensive integrated approach to application development."
Nielsen (1992b) had already been talking about the increasing importance of computer-user interfaces and the need to make them usable by using "a systematic usability effort using established methods." He proposed a usability engineering model that included fundamental usability tenets such as "know thy user" and advocated an iterative refinement of the interaction design.
This model proposed different phases of the UX lifecycle: pre-design, design, and post-design with corresponding activities such as understanding overall work context, understanding intended users, setting usability goals, and undertaking iterative testing. Nielsen (1993) later elaborated these ideas into one of the first usability engineering textbooks.
Whitney Quesenbery (2005) describes how the ISO 13407 standard (1999) reflected the "general industry approach to UCD" at the time. It describes four principles of user-centered design, including "active involvement of customers (or those who speak for them)," but apparently did not speak for the users directly.
This standard also made a strong point in favor of not just the principle of using an iterative cycle, but of the need to plan to allow time for iteration in practice. In its central focus on process, the standard prescribes five process activities, starting with planning for UCD, followed by an iterative cycle of specifying context of use, specifying requirements, producing design solutions, and evaluating designs, as seen in Figure 2-7.
Despite the name user-centered design, this cycle does not give much focus to design as a separate activity, but rolls it in with implementation in the "produce design solutions" box. Nonetheless, the ISO standards were timely and gave a real boost to the credentials of UCD processes to follow.
2.7.1 Much More Than Usability Testing: The Need for a Broad Lifecycle Process
As usability slowly emerged as a goal, thinking about methods to achieve it was at first slow to follow. Everyone vaguely knew you had to involve users somehow, that it helped to follow a style guide, and that you definitely had to do usability testing. Armed with just enough of this knowledge to be dangerous, the budding new usability specialists plunged in, not knowing enough to know what they did not know. But to be effective, especially cost-effective, our heroes needed help in using the right technique at the right time and place.
Without an established lifecycle concept to follow, those concerned with user experience were coming up with their own, often narrow, views of user experience methods: "silver bullet" theories that declare all you have to do is contextual inquiry, just test a lot, do everything to "empower users," be object oriented, and so on.
Figure 2-7
Lifecycle diagram from the ISO 13407 standard, adapted with permission.
The most broadly fashionable of these uni-dimensional schemes was to equate the entire process with testing, setting usability in a purely diagnostic frame of reference. In response, participants in the CHI '96 workshop mentioned in the previous section felt it important to make the point: "Usability testing and evaluation make contributions to product quality, but testing alone does not guarantee quality." They contended that approaches using only post hoc testing should be expanded to incorporate other UCD activities
into earlier parts of the UX process.
Outside the usability world of the time, acceptance was even more sluggish.
It took time for interaction design to be recognized by others as a full discipline with its own rigorous lifecycle process. And it was often the software engineering people who were most resistant; oh, how soon we forget our own history! In the days when "structured programming" was just becoming the fashion (Stevens, Myers, & Constantine, 1974), development groups (often one or two programmers) without a process were often suspicious about the value added by a "process" that deflected some of the time and effort from pure programming to design and testing, etc.
And so it is with interaction design, and this time it is often the software engineers and project managers who are resisting more structure (and, therefore, more perceived overhead and cost) in parts of the overall interactive system development process not thought to contribute directly to the output of program code.
2.7.2 Fundamental Activities Involved in Building Anything
In the simplest sense, the two fundamental activities involved in (i.e., a process for) creating and building something, be it a house or a software product, are almost always the same: design and implementation. If the task at hand is simple, say building a sand castle at the beach, it is possible to undertake design and implementation simultaneously, with minimal process, on the fly and in the head.
However, as complexity increases, each of these activities needs explicit attention and thought, leading to a more defined process. For example, in remodeling one's kitchen, some "design" activities, such as sketches for the new layout and configurations of appliances and countertops, are required before "implementing" the new kitchen.
While you have to do requirements and needs analyses for your own kitchen remodeling so that you do not end up with bells and whistles that you do not really need or use, it is even more important if you are remodeling a kitchen for
someone else. You need this added process step to make sure what is being built matches the requirements.
As complexity of the target system or product increases, so does the need for additional steps in your process to manage that complexity. If we are, say, building a house instead of a kitchen, more steps are needed in the process, including consideration of "platform constraints" such as municipal regulations, geographical constraints such as the location of water lines, and, perhaps more importantly, a defined process to manage the complexity of multiple roles involved in the whole undertaking.
2.7.3 Parallel Streams of Software and Interaction Process Activities
To begin on common ground, we start with a reference point within the discipline of software engineering. Just as we discussed in the previous section, perhaps one of the most fundamental software engineering principles is the distinction between software design and software implementation, as shown in Figure 2-8.
Instead of having programmers "designing" on the fly during implementation, proper software engineering methods require software design first (after capturing requirements, of course), and the resulting design specifications are given to the programmers for implementation. Then programmers, possibly a different person or group, follow the design as documented in the specifications to implement the software.
The programmer who creates the software code to implement the design is in the best position to spot incorrect or missing parts of the specification.
For example, while coding a "case statement," the programmer may notice if the specification for one of the cases is missing. At this point, the programmer has two choices: (1) save time by filling in missing parts or correcting erroneous parts of the specifications by using best judgment and experience or (2) take the extra time to send the specifications back to designers for amendments.
The first choice is tempting, especially if the schedule is tight, but the implementer has not necessarily been privy to all the
prior meetings of designers about rationale, goals, design principles, and so on and may not get it right. In addition, design additions or changes made by the implementer are usually undocumented. The code written to correct the design becomes a software time bomb, later leading to a bug that can be almost impossible to find. As a result, conventional software engineering wisdom requires feeding back the faulty specifications to the designers for correction and iteration back to the implementers.
Figure 2-8
Distinction between software design and implementation.
Figure 2-9
Software development workflow diagram.
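The "case statement" dilemma above can be sketched in a few lines of code. Everything here is invented for illustration (the function name, the statuses, and the messages are hypothetical, not from the book or any real system); the point is only that the implementer can make the specification gap explicit instead of quietly guessing:

```python
# Hypothetical sketch of the "case statement" dilemma: the design
# specification covers three order statuses but is silent on a fourth.
def shipping_message(status: str) -> str:
    # Cases covered by the (imaginary) design specification:
    if status == "pending":
        return "Hold for processing"
    elif status == "shipped":
        return "In transit"
    elif status == "delivered":
        return "Complete"
    else:
        # Choice (1) would be to guess a message here using best judgment;
        # choice (2), the conventional wisdom, is to flag the gap and send
        # the specification back to the designers for amendment.
        raise NotImplementedError(f"Case not in specification: {status}")
```

Raising an explicit error documents the gap rather than burying an implementer's undocumented guess in the code, which is exactly the "software time bomb" the text warns about.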
Adding inquiry, requirements, and modeling plus functionality design at the beginning and testing at the end to the boxes of Figure 2-8 gives the picture of software development workflow shown in Figure 2-9.
Systems analysis involves a high-level study of the intended system, including concerns from all disciplines associated with the product. For example, if the project is to design software to manage a nuclear power plant, the systems analysis activity will include study of all component subsystems ranging from safety to software to physical plant to environmental impact.
At this stage, the key subsystems are identified and their high-level interactions specified. In the remainder of this chapter we focus on interactive software systems only and limit the discussion to creation and refinement of interaction design and the development of its software.
Design in the work domain, or application domain, in the second box from the left (Figure 2-9), is the place where the real contents of the system are crafted. If the program is a software support tool for bridge building, for example, this is where all the specialized subject matter knowledge about civil engineering, over-constrained pin joints, strength of materials, and so on is brought to bear. The software design is where algorithms,
data structures, calling structures, and so on are created to represent the work design in software.
The analogous activities for user interface (this time, including the user interface software) development are shown in Figure 2-10.
Connecting the processes together and adding rapid prototyping, to get the big picture, we get the overall development workflow diagram of Figure 2-11.
Immediately noticeable is the lack of vertical connections, which points out the need for improved communication between the lifecycles for functional software and for the user interface component of the overall system. There
is an absolute lack of formal methods to integrate these two lifecycles. This is a big hole in the practice of both sides of the picture. In practice, this communication is important to project success and all parties do their best to carry it out, relying mainly on informal channels.
The means for achieving this communication vary widely, depending on project management abilities, the size of the project, and so on. For small projects, a capable manager with a hands-on management style can function effectively as a conduit of communication between the two work domains.
Larger projects, where it is impossible for one person to keep it all in his or her head, need a more structured inter-domain communication mechanism (Chapter 23).
2.7.4 Iteration for Interaction Design Refinement Can Be Very Lightweight
Figure 2-11 offers a good backdrop to the discussion of iteration within the UX lifecycle for interaction design. Management and software people often strongly resist the general idea of iteration, repetitively going back over process activities. Some team members worry that they can barely afford the time and resources to produce a system once, let alone iterate the process multiple times. This fear is due to a misconception about the nature of iteration in the overall diagram of Figure 2-11, probably because the concept has not been well explained.
In fact, if everything in the diagram of Figure 2-11 were iterated, it would be prohibitively burdensome and laborious. The key to understanding this kind of iteration needed for design refinement is in realizing that it does not mean iterating the whole process, doing everything all over again. Instead it is about only a selective part (see Figure 2-12) of the overall process, just enough to identify and fix the major UX problems.
Figure 2-10
Analogous user interface development workflow.
Figure 2-11
Overall interactive system development workflow diagram.
Iterating this small sub-process is far from ponderous and costly; in fact, it:
• is only a very small and very lightweight iteration
• does not have to be expensive because it involves only a very small part of the overall process
• can occur early in the overall lifecycle when design changes cost little
• can have minimal impact on schedule because it can be done in parallel with many other parts (especially the software engineering parts) of the overall project lifecycle
These are strong reasons why iteration to refine interaction designs can be cost-effective and can lead to a high-quality user experience without being a burden to the overall software and system development budget and schedule.
The perceptive reader will see that we have come full circle; the process in Figure 2-12 is a variation of the Wheel lifecycle template of Figure 2-2.
You will know more about what goes on in each part of this diagram as you go through the rest of the process part of this book (Chapters 3 through 19).
Figure 2-12
The small lightweight sub-process to be iterated for the interaction design.
The Pre-Design Part of the UX Lifecycle
Here is an overview of how contextual inquiry, contextual analysis, needs and requirements extraction, and modeling lead up to design:
• Contextual inquiry (Chapter 3) is an empirical process to elicit and gather user work activity data.
• Contextual analysis (Chapter 4) is an inductive (bottom-up) process to organize, consolidate, and interpret the user work activity data gathered in contextual inquiry.
• Chapter 5 is about a deductive analytic process for extracting needs and requirements.
• Chapter 6 is about a synthesis of various design-informing models, such as task descriptions, scenarios, and work models.
• Chapters 7, 8, and 9 are about design, an integrative process aided by the contextual data and their offspring: needs, requirements, and models.
The parts of the figure are not completely separable like this but, for the book, we break it up a bit to "chunk" it for easier digestion.
Contextual Inquiry: Eliciting Work Activity Data
I don't build a house without predicting the end of the present social order. Every building is a missionary1... It's their duty to understand, to appreciate, and conform insofar as possible to the idea of the house. (Lubow, 2009)
- Frank Lloyd Wright, 1938
3.1 INTRODUCTION
3.1.1 You Are Here
We begin each process chapter with a "you are here" picture of the chapter topic in the context of our overall Wheel lifecycle template; see Figure 3-1. The process begins with understanding user work and needs by "getting your nose in the customer's tent." To understand the users' activities in the context of their current work practice (or play practice), using any currently existing system or product, we do contextual inquiry (this chapter) and contextual analysis (Chapter 4). Sometimes contextual inquiry and contextual analysis are collectively called contextual studies or "user research."
1. The term "missionary" referred to his commitment to educate his customers about their own needs. While he aimed to serve his clients' needs, he felt he was the only authority on determining those needs.
Figure 3-1
You are here; in the contextual inquiry chapter, within understanding user work and needs in the context of the overall Wheel lifecycle template.
3.1.2 A True Story
In southwest Virginia, somewhat remote from urban centers, when computer-based touchscreen voting machines were used for the first time, we heard that quite a few voters had difficulty in using them. Although an official gave instructions as people entered one particular voting area, a school gymnasium, he did it in a confusing way.
One of the voters in line was an elderly woman with poor eyesight, obvious from her thick eyeglasses. As she entered the voting booth, one could just imagine her leaning her head down very close to the screen, struggling to read the words, even though the font was reasonably large.
Her voice was heard floating above her voting booth as she gave some unsolicited user feedback. She was saying that she had trouble distinguishing the colors (the screen was in primary colors: red, green, and blue). A member of another major gender nearby said aloud to himself, as if to answer the woman, that he thought there was an option to set the screen to black and white. But oddly, no one actually passed this information, if it was true, along to the woman.
In time, the woman emerged with a huge smile, proclaiming victory over the evil machine. She then immediately wanted to tell everyone how the design should be improved. Remember, this is an elderly woman who probably knew nothing about technology or user experience but who is quite naturally willing to offer valuable user feedback.
It was easy to imagine a scenario in which the supervisors of the voting process quickly flocked about the voter and duly took notes, pledging to pass this important information on to the higher-ups who could influence the next design. But as you might guess, she was roundly humored and ignored. Call us superstitious, but that is just bad UX juju.
There are a few things to note in this story. First, the feedback was rich, detailed, and informative about design. This level of feedback was possible only because it occurred in real usage and real context.
Second, this woman represented a particular type of user belonging to a specific age group and had some associated visual limitations. She was also naturally articulate in describing her usage experience, which is somewhat uncommon in public situations.
So what does this have to do with contextual inquiry? If you do contextual inquiry in a real environment like this, you might get lucky and find rich user data. It is certain, however, that if you do not do contextual inquiry, you will never get this kind of information about situated usage.
3.1.3 Understanding Other People's Work Practice
This chapter is where you collect data about the work domain and users' work activities. This is not about "requirements" in the traditional sense but is about the difficult task of understanding users' work in context and understanding what it would take in a system design to support and improve the users' work practice and work effectiveness.
Why should a team whose goal is to design a new system for a customer be all that interested in work practice? The answer is that you want to be able to create a design that is a fit for the work process, which may not be the same as what the designers think will fit.
So, if you must understand something about what the users do, why not just ask them? Who knows their work better than the users themselves? Many customers, including those in large and complex organizations, may wonder why you want to look at their work. "Just ask us anything about it; we have been doing it for years and know it like the back of our hands."
The answer is that what they "know" about their work practice is often biased by their own assumptions about existing tools and systems and is mostly shaped by the limitations and idiosyncrasies of those tools and practices. It is not easy for users to consciously describe what they do, especially in work that has been internalized. Humans are notoriously unreliable about this.
Also, each user has a different perspective of how the broader work domain functions. Knowledge of the full picture is distributed over numerous people. What they know about their work is like what the seven blind men "know" about an elephant.
Why not just gather requirements from multiple users and build a design solution to fit them all? You want an integrated design that fits into the "fabric" of your customer's operations, not just "point solutions" to specific problems of individual users. This can only be achieved by a design driven by contextual data, not just opinions or negotiation of a list of features.
That is why contextual inquiry has taken on importance in the UX process. It takes real effort to learn about other people's work, which is usually unfamiliar, especially the details. It can be difficult to untangle the web of clues revealed by observation of work.
Even surface observables can be complex, and the most important details that drive the work are usually hidden beneath the surface: the intentions, strategies, motivations, and policies. People creatively solve and work around their problems, making their barriers and problems less visible to them and to outsiders studying the work.
Because it is so difficult to understand user needs, much upfront time is wasted in typical projects in arguments, discussions, and conjectures about what the user wants or needs based on anecdotes, opinions, experience, etc. The processes of contextual inquiry and analysis remove the necessity for these discussions because the team ends up knowing exactly what users do, need, and think.
3.1.4 Not the Same as Task Analysis or a Marketing Survey
Oftentimes people might say, "We already do that. We do task analysis and marketing surveys." While task analysis does identify tasks, it does not give enough insight into situations where tasks are interwoven or where users need to move seamlessly from one task to another within the work context.
Task analyses also do not work well in discovering or describing opportunistic or situated task performance. Paying attention to context in task analysis is what led us to contextual inquiry and analysis.
Similarly, you cannot substitute market research for contextual inquiry. They are just two different kinds of analysis, and you may need both. Marketing data are about sales and can identify the kinds of products and even features customers want, but they do not lead to any understanding of how people work or how to design for them. Customer/user data about work in context are what lead to design.
3.1.5 The Concepts of Work, Work Practice, and Work Domain
We use the term "work" to refer to the usage activities (including play) undertaken to achieve goals within a given domain. It is what people do to accomplish these goals. In most cases, use of the term "work" will be obvious, for example, using a CAD/CAM application to design an automobile.
"Work practice" is how people do their work. Work practice includes all activities, procedures, traditions, customs, and protocols associated with doing the work, usually as a result of the organizational goals, user skills, knowledge, and social interaction on the job. The context of this kind of work often includes some manual activities in association with some interactive activities.
If we are talking about the context of using a product, such as a consumer software product, then the "work" and "work activities" include all activities users are involved in while using that product. If the product is, say, a word processor, it is easy to see its usage to compose a document as work.
If the product is something like a game or a portable music player, we still refer to all activities a user undertakes while playing games or being entertained with music as "work" and "work activities." Even though the usage activities are play rather than work, we have to design for them in essentially the same way, albeit with different objectives.
Similarly, we call the complete context of the work practice, including the usage context of an associated system or product, the work activity domain or simply the work domain. The work domain contains the all-important context, without which you cannot understand the work.
3.1.6 Observing and Interviewing in Situ: What They Say vs. What They Do
Okay, so we agree that we have to learn about customer/user work, but why not stay in our own offices, where we have a good working environment and lots of logistical support, such as secretaries for note-taking and transcription, and spacious comfortable conference rooms? The answer is that you cannot get all the information you need by talking with users outside their work context, which only accesses domain knowledge "in the head." Observing users and asking users to talk about their work activities as they are doing them in their own work context get them to speak from what they are doing, accessing domain knowledge situated "in the world" (see Figure 3-2).

Figure 3-2
Observation and interviewing for contextual data collection.
Even when occurring in situ, in the user's own work environment, asking or interviewing alone is not enough. When gathering data in contextual inquiry, be sure to look beyond descriptions of how things work, what is commonly believed, and what you are told about the work. Observe the "ground truth": the actual work practice, problems, issues, and work context. It is especially important to notice workarounds created by users when the intended system support or work practice does not support real needs.
Contextual inquiry in human-computer interaction (HCI) derives from ethnography, a branch of anthropology that focuses on the study and systematic description of various human cultures. In an article describing the transition from ethnography to contextual design, Simonsen and Kensing (1997) explain why interviews as an exclusive data-gathering technique are insufficient:
"A major point in ethnographically-inspired approaches is that work is a socially organized activity where the actual behavior differs from how it is described by those who do it." You need to observe and determine for yourself how the work in question is actually done.
Just as interviewing users is not enough to uncover their unmet needs, observation without interviewing also has its potential downsides. First, if you use observation as an exclusive data-gathering technique, you could miss some important points. For example, an important problem or issue simply might not come up during any given period of observation (Dearden & Wright, 1997).
Second, observation itself can affect user behavior. This is the famous "measurement effect"² adapted to the observation of people. The very act of observation can cause people to change the behavior being observed.

For example, when a person is subjected to new or increased attention, such as being observed during task performance, the "Hawthorne effect" (Dickson & Roethlisberger, 1966) can produce a temporary increase in performance due to awareness of being observed and perceived expectations of high performance. Diaper (1989) points this out as a natural human reaction. Simply put, when users are being observed, they tend to act the way they think you want them to. When we are observed at work, we all want to do our best and be appreciated.

²Study of the problem of measurement in quantum mechanics has shown that measurement of any object involves interactions between the measuring apparatus and that object that inevitably affect it in some way. (Wikipedia.com)
3.1.7 The SnakeLight, an Example of How Understanding Work Practice Paid Off
Here is an anecdotal example about why it helps to understand how your users do their activities and how they use products and systems. This example of the effectiveness of in situ contextual inquiry comes to us from the seemingly mundane arena of consumer flashlights. In the mid-1990s, Black & Decker was considering getting into handheld lighting devices, but did not want to join the crowded field of ordinary consumer flashlights.
So, to get new ideas, some designers followed real flashlight users around. They observed people using flashlights in real usage situations and discovered the need for a feature that was never mentioned during the usual brainstorming among engineers and designers or in focus groups of consumers. Over half of the people they observed during actual usage under car hoods, under kitchen sinks, and in closets and attics said that some kind of hands-free usage would be desirable.
They made a flashlight that could be bent and formed and that could stand up by itself. Overnight the "SnakeLight" became the product with the largest production volume in Black & Decker history, despite being larger, heavier, and more expensive than other flashlights on the market (Giesecke et al., 2011).
3.1.8 Are We Gathering Data on an Existing System or a New System?
When gathering data and thinking about designs for a new system, analysts and designers can be strongly biased toward thinking about only the new system.
Students sometimes ask, "Should we be modeling the existing way they do it or as it would be done with the new system?" This is asking whether to do modeling in the problem domain or the solution domain, the work domain or the design domain. At the end of the day, the answer might well be "both," but the point of this particular discussion is that it must start with the existing way. Everything we do in contextual inquiry and contextual analysis in this chapter and the next is about the existing way, the existing system, and the existing work practice. Often team members get to thinking about design too early, and the whole thing becomes about the new system before they have learned what they need to about work practice using the existing system.
In order for all this to work, then, there must be an existing system (automated, manual, or in-between), and the proposed new system would then somehow be an improvement. But what about brand new ideas, you ask, innovations so new that no such system yet exists? Our answer may be surprising: that situation happens so rarely that we are going to go out on a limb and say that there is always some existing system in place. Maybe it is just a manual system, but there must be an existing system or there cannot be existing work practice.
For example, many people consider the iPod to be a really innovative invention, but (thinking about its usage context) it is (mainly) a system for playing music (and/or videos). Looking at work activities and not devices, we see that people have been playing music for a long time. The iPod is another in a series of progressively sophisticated devices for doing that "work" activity, starting with the phonograph invented by Thomas Edison, or even possibly earlier ways to reproduce "recorded" sound.
If no one had ever recorded sound in any way prior to the first phonograph, then there could not have been an "existing system" on which to conduct contextual inquiry. But this kind of invention is extremely rare, a pure innovative moment. In any case, anything that happens in sound reproduction after that can be considered follow-on development and its use can be studied in contextual inquiry.
3.1.9 Introducing an Application for Examples
As a running example to illustrate the ideas in the text, we use a public ticket sales system for selling tickets for entertainment and other events. Occasionally, when necessary, we will provide other specific examples.
The existing system: The Middleburg University Ticket Transaction Service
Middleburg, a small town in middle America, is home to Middleburg University, a state university that operates a service called the Middleburg University Ticket Transaction Service (MUTTS). MUTTS has operated successfully for several years as a central campus ticket office where people buy tickets from a ticket seller for entertainment events, including concerts, plays, and special presentations by public speakers. Through this office MUTTS makes arrangements with event sponsors and sells tickets to various customers.
The current business process suffers from numerous drawbacks:
• All customers have to go to one location to buy tickets in person.
• MUTTS has partnered with Tickets4ever.com as a national online ticket distribution platform. However, Tickets4ever.com suffers from low reliability and has a reputation for poor user experience.
• Current operation of MUTTS involves multiple systems that do not work together very well.
• The rapid hiring of ticket sellers to meet periodic high demand is hampered by university and state hiring policies.
Organizational context of the existing system
The desire to expand the business coincides with a number of other dynamics currently affecting MUTTS and Middleburg University.
• The supervisor of MUTTS wishes to expand revenue-generating activities.
• To leverage its increasing national academic and athletic prominence, the university is seeking a comprehensive customized solution that includes integration of tickets for athletic events (currently, tickets to athletic events are managed by an entirely different department).
• By including tickets for athletic events that generate significant revenue, MUTTS will have access to resources to support its expansion.
• The university is undergoing a strategic initiative for unified branding across all its departments and activities. The university administration is receptive to creative design solutions for MUTTS to support this branding effort.
The proposed new system: The Ticket Kiosk System
The Middleburg University Ticket Transaction Service (MUTTS) wants to expand its scope and move into more locations, but it is expensive to rent space in business buildings around town, and the kind of very small space it needs is rarely available. Therefore, the administrators of MUTTS and the Middleburg University administration have decided to switch the business from the ticket window to kiosks, which can be placed in many more locations across campus and around town.
Middleburg is home to a large public university and has reliable and well-used public transportation provided by its bus system operated by Middleburg Bus, Inc. There are several bus stops, including the library and the shopping mall, where there is space to add a kiosk for a reasonable leasing fee to the bus company.
A number of these bus stops seem good locations for kiosks; buses come and go every few minutes. Some of the major stops are almost like small bus stations with good-sized crowds getting on and off buses.
In addition to an expected increase in sales, there will be cost savings in that a kiosk requires no personnel at the sales outlets. The working title for the new system is Ticket Kiosk System, pending recommendations from our design team. The Ticket Kiosk System will have a completely new business model for the retail ticket operation.
3.2 THE SYSTEM CONCEPT STATEMENT
A system concept statement is a concise descriptive summary of the envisioned system or product stating an initial system vision or mandate; in short, it is a mission statement for the project. A system (or product) concept statement is where it all starts, even before contextual inquiry. We include it in this chapter because it describes an initial system vision or mandate that will drive and guide contextual inquiry. Before a UX team can conduct contextual inquiry, which will lead to requirements and design for the envisioned system, there has to be a system concept.
Rarely does a project team conceptualize a new system, except possibly in a "skunk-works" kind of project or within a small invention-oriented organization. The system concept is usually well established before it gets to the user experience people or the software engineering people, usually by upper management and/or marketing people. A clear statement of this concept is important because it acts as a baseline for reality checks and product scope and as something to point to in the middle of later heated design discussions.
• A system concept statement is typically 100 to 150 words in length.
• It is a mission statement for a system, intended to explain the system to outsiders and to help set focus and scope for system development internally.
• Writing a good system concept statement is not easy.
• The amount of attention given per word is high. A system concept statement is not just written; it is iterated and refined to make it as clear and specific as possible.
An effective system concept statement answers at least the following questions:
• What is the system name?
• Who are the system users?
• What will the system do?
• What problem(s) will the system solve? (You need to be broad here to include business objectives.)
• What is the design vision and what are the emotional impact goals? In other words, what experience will the system provide to the user? This factor is especially important if the system is a commercial product.
The audience for the system concept statement is broader than that of most other deliverables in our process and includes high-level management, marketing, the board of directors, stockholders, and even the general public.
Example: System Concept Statement for the Ticket Kiosk System
Here is an example of a system concept statement that we wrote for the Ticket Kiosk System.
The Ticket Kiosk System will replace the old ticket retail system, the Middleburg University Ticket Transaction Service, by providing 24-hour-a-day distributed kiosk service to the general public. This service includes access to comprehensive event information and the capability to rapidly purchase tickets for local events such as concerts, movies, and the performing arts.
The new system includes a significant expansion of scope to include ticket distribution for the entire MU athletic program. Transportation tickets will also be available, along with directions and parking information for specific venues. Compared to conventional ticket outlets, the Ticket Kiosk System will reduce waiting time and offer far more extensive information about events. A focus on innovative design will enhance the MU public profile while fostering the spirit of being part of the MU community and offering the customer a beaming interaction experience. (139 words)
This statement can surely be tightened up and will evolve as we proceed with the project. For example, "far more extensive information about events" can be made more specific by saying "extensive information including images, movie clips, and reviews about events." Also, at this time we did not mention security and privacy, important concerns that are later pointed out by potential users. Similarly, the point about "focus on innovative design" can be made more specific by saying "the goal of innovative design is to reinvent the experience of interacting with a kiosk by providing an engaging and enjoyable transaction experience."
Usually a system concept statement will be accompanied by a broader system vision statement from marketing to help get a project started in the right direction. None of this yet has the benefit of information from customers or potential users. However, we do envision the customer being able to find event information, select events to buy tickets for, select seats, purchase tickets, print tickets, and get information and tickets for transportation while enjoying the overall experience of interacting with the kiosk. As we learn more through contextual inquiry and analysis, details in this system concept statement will be adjusted and assumptions corrected.
NB: All exercises are in Appendix E, near the end of the book.
3.3 USER WORK ACTIVITY DATA GATHERING
Much of the material in this chapter comes from the contextual design material existing in the literature. We do not try to reproduce these entire processes in this book, as those topics already appear in books of their own, with credit to their respective authors. What we do here is draw on these processes, adapting them to establish our own frame of reference and integrating them into the context of other requirements-related activities.
We gratefully acknowledge the sources from which we have adapted this material, mainly Contextual Design (Beyer & Holtzblatt, 1998) and Rapid Contextual Design (Holtzblatt, Wendell, & Wood, 2005). Other work we have drawn upon and acknowledge includes Constantine and Lockwood (1999). A CHI Conference paper by Hewlett-Packard people (Curtis et al., 1999) contributed to our understanding by spelling out an excellent large-scale example of the application of contextual design.
To do your user work activity data gathering you will:
• prepare and conduct field visits to the customer/user work environment, where the system being designed will be used
• observe and interview users while they work
• inquire into the structure of the users' own work practice
• learn about how people do the work your system is being designed to support
• take copious, detailed notes (raw user work activity data) on the observations and interviews
In these early chapters we are generally taking the perspective of domain-complex systems because it is the more "general" case. We will describe several methods and techniques that have proven successful, but you should be creative and open to including whatever techniques suit the needs of the moment.
This means that you might want to use focus groups, for example, if you think they will be useful in eliciting a revealing conversation about more complex issues.
The goals of contextual inquiry are the same in both perspectives (domain-complex systems vs. interaction-complex consumer products), and
most of the steps we describe apply to, or can easily be adapted for, the product perspective. Where appropriate, we will offer descriptions of how the process might differ for the product user perspective.
3.3.1 Before the Visit: Preparation for the Domain-Complex System Perspective
Learn about your customer organization before the visit
Preparation for the visit means doing upfront planning, including addressing issues such as these about the customer:
• For work activities situated in the context of a system with a complex work domain, get a feel for the customer's organizational policies and ethos by looking at their online presence, for example, their Website and participation in social networks.
• Know and understand the vocabulary and technical terms of the work domain and the users.
• Learn about the competition.
• Learn about the culture of the work domain in general, for example, a conservative financial domain vs. a laid-back art domain.
• Be prepared to realize that there will be differences in perspectives between managers and users.
• Investigate the current system (or practices) and its history by looking at the company's existing and previous products. If they are software products, it is often possible to download trial versions of the software from the company's Website to get familiar with design history and themes.
Learn about the domain
When designing for complex and esoteric domains, working first with subject matter experts helps shorten the actual contextual inquiry process by giving you a deeper understanding of the domain, albeit from a non-user perspective. Your contextual inquiry process can then include validating this understanding. In cases where time and resources are at a premium (not an insignificant portion of projects in the real world), you may have to make do with interviewing a few subject matter experts instead of observing real users in context.
Issues about your team
In addition, there are issues to address about your team:
• Decide how many people to send on the visits.
• Decide who should go on each visit, for example, user experience people, other team members, documentation folks.
• Set your own limits on the number of visits and the number of team members involved, depending on your budget and schedule.
• Plan the interview and observation strategy (who on the team does what).
Your visit-group size can depend on how many are on your initial project team, the number of different user roles you can identify, the size of the project overall, the budget, and even your project management style. Practitioners report taking anywhere from two to eight or more people on visits, but three to four seems to be typical.

A multidisciplinary team is more likely to capture all necessary data and more likely to make the best sense of the data during subsequent analysis. We have found using two people per interview appealing: one to talk and one to take notes.
Lining up the right customer and user people
Among the things to do to prepare for a site visit for contextual inquiry, you should:
• Select and contact appropriate users or customer management and administrative people to:
  - explain the need for a visit
  - explain the purpose of the visit (to learn about their work activities)
  - explain your approach (for them actually to do the work while you are there to observe)
  - obtain permission to observe and/or interview users at work
  - build rapport and trust, for example, promise personal and corporate confidentiality
  - discuss timing: which kinds of users are doing what, and when?
  - set scope: explain that you want to see the broadest representation of users and work activities, focusing on the most important and most representative tasks they do
  - establish or negotiate various parameters, such as how long you will/can be there (it can be up to several intense weeks for data gathering), how often to visit (it can be up to every other day), how long for the average interview (a couple of hours maximum), and the maximum number of interviews per visit (as an example, four to six)
• Select and contact appropriate support people (determined by the management people you talk with) within the customer organization to arrange logistics for the visits.
• Select and contact appropriate people to meet, observe, and interview: customers, users (who do the work in question), especially frequent users, and managers; aim for the broadest variety, cover as many usage roles as possible, and plan visits to multiple sites if they exist.
This latter item, selecting the people to meet, observe, and interview, is especially important. Your fieldwork should include all work roles, selected other stakeholders who impact work directly or indirectly, and (depending on the project) possibly grand-customers (customers of the customer) outside the user's organization. You want the broadest possible sources to build a holistic and multi-perspective picture of the contextual data.
Get access to "key" people
For projects in a domain-complex system context, you might also be told by your customer that users of the system in question are scarce and generally unavailable. For example, management might resist giving access to key people because they are busy and "bothering" them would cost the organization time and money.
If you sense reluctance to give access to users, you need to step up and make the case: establish the necessity of gathering requirements that will work and of firmly basing those requirements on an understanding of existing work activities. Then explain how this extra work upfront will reduce the long-term costs of reworking everything if analysts do not get the right requirements.
Ask for just a couple of hours with key users. Persevere.
At the other end of the spectrum, for consumer software, such as shrink-wrap word processors, users abound and you can recruit users to interview via a "help wanted" ad posted in the local grocery store.
Do not interview only direct users. Find out about the needs and frustrations of indirect users served by agents or intermediaries. And do not forget managers. Here is a quote from a team that we worked with on a project, "It was eye-opening to talk with the managers. Managers are really demanding and they have different kinds of requirements from those of the users, and they see things from a totally different viewpoint than the other users."
Sometimes you may have access to the users for only a small period of time and therefore cannot observe them doing work. In such cases, you can ask them to walk you through their typical day. You must work extra hard to ask about exceptions, special cases, and so on. This approach suffers from many of the problems we described earlier regarding not observing users in context, but it at least provides some insights into users' work.
What if you cannot find real users?
In the worst case, that is, when you have no access to real users (this has happened in our consulting and work experience), the last resort is to talk to user proxies. User proxies can be business experts or consultants who are familiar with the user's work.
This approach suffers from many disadvantages and often results in hearing about high-level functional needs and crude approximations of what a broad class of users need in the system. The accounts of such proxies are often tainted by their own opinions and views of the work domain. They also suffer from serious omissions and simplifications of often nuanced and complex user work activities.
Setting up the right conditions
The environment, the people, and the context of the interview should be as close a match to the usual working location and working conditions as possible. We once found ourselves being ushered into a conference room for an interview because, as the employer put it, "it is much quieter and less distracting here."
The employer had even arranged for time off from work for the worker so that he could focus his complete attention on the interview. But, of course, the conference room was not anything like the real work context and could not possibly have served as a useful source of information about the work context. We had to convince them to move the whole thing back into the active workplace.
Make sure that the observations and interviews are conducted without undue political and managerial influences. You want to create the right conditions for observation and interviews, conditions in which users feel comfortable in telling the "real" story of the everyday work practice. We once had to deal with the supervisor of a person we wanted to interview because the supervisor insisted on being present during the interview. His reason was that it was a rare opportunity to learn more about what his workers did and how.
However, we also suspected that the supervisor did not want the employee complaining to strangers about working conditions or the organization.
From the worker's view, having a supervisor present looked a lot like an opportunity for the supervisor to evaluate the user's job performance. It meant not being able to be open and candid about much of anything. Instead, the employee would have to pay very close attention to doing the job and to not saying anything that could be interpreted in a way that could be used against him.
It would be anything but a sample of everyday work practice.
How many interviewees at a time?
It might work out that, via a group interview, multiple users can work together and produce data not accessible through a single user. However, group interviews can also mask individual thoughts. Each user may have a very different view of how things work and what the problems are, but these differences can be sublimated in an unconscious effort to reach "consensus."
Additionally, group dynamics may be influenced by hidden agendas and turf battles.
Preparing your initial questions
Script your initial interview questions to get you off to a good start. There is no real secret to the initial questions; you ask them to tell you and to show you how they do their work. What actions do they take, with whom do they interact, and with what do they interact? Ask them to demonstrate what they do and to narrate it with stories of what works, what does not work, how things can go wrong, and so on.
We found that instead of asking them generally "What do you do here?" it is sometimes more helpful to ask them to walk us through what their work specifically entailed the day before and whether that was typical. This kind of specific probing gives them an easy point of reference for making their descriptions concrete.
Before the visit: Preparation for the product perspective
While the aforementioned guidelines for preparing a visit in a domain-complex system context generally also apply to a product perspective, there are a few differences. For one, the context of work in a product design perspective is usually much narrower and simpler than that of an entire organization.
This is primarily because organizations contain numerous and often widely different roles, each contributing to a part of the overall work that gets accomplished.
In contrast, the work activities within a product design context are usually centered on a single user in a single role. To observe the full range of usage patterns by a single user of a product, you usually have to observe their usage over a long time. In other words, to do this kind of contextual inquiry, instead of observing several users in a work role for a relatively short time, you have to "shadow" single users over a longer time.
For example, the work, or play, activities associated with a portable music player system sometimes include searching for and listening to music. At other times the same user is trying to manage music collections. Even in cases where the design needs to support multiple users, say the user's family, the complexity of the interaction among different roles is usually much lower in the product perspective, and often more homogeneous than in a domain-complex system perspective.
Where do we start with our contextual inquiry process for such products? The best place to start is by understanding the complete usage context of this kind of product, including desirable features and limitations.
We also have to ask about things such as branding, reputation, and competition in this product segment. To find unbiased information about these issues, instead of looking online for the customer's organizational policies and culture, we need to look for user groups or blogs about using this kind of product and check out reviews for similar products.
Do some initial brainstorming to see what kinds of user communities are primary targets for this product segment. College students? Soccer moms? Amateur photographers? Then think of good places to meet people in these user classes. If necessary, use marketing firms that specialize in recruiting specific target populations.
Aaron Marcus, President and Principal Designer/Analyst, Aaron Marcus and Associates, Inc. (AM+A)
Modern technology and commerce permit global distribution of products and services to increasingly diverse users who exist within different cultures. Culture affects every aspect of tool and sign making. Culture-centered design of user experiences seems "inevitable." Designers/analysts are aware of culture, but may not be informed of specific dimensions by which cultures can be described and measured.
Websites are one set of examples; they are immediately accessible by people worldwide and offer design challenges of "localization" that go beyond translation (Marcus and Gould, 2000). Some years ago, Jordanian Website Arabia.On.Line used English for North American and European visitors, but the layout read right to left as in Arabic because the local designers were too influenced by their own culture.
Localization goes beyond languages and translation. If one were to examine the home page of Yahoo.com in English and Maktoob.com, one of the Arabic world's most popular portals in Arabic, one would find not only language differences, but differences in color, imagery, organization, and topics of primary interest. There may be geographic, historical, political, aesthetic, and language differences.
Small-scale communities with preferred jargon, signs, and rituals can constitute a "cultural group." This definition is different from traditional definitions of culture that more typically refer to longstanding historical differences established over many generations and centuries. The modern cultural group may be considered more a social group or "lifestyle group," including affinity groups, social groups, and geographically dispersed groups communicating through the Internet. Today, "digital natives" vs "digital immigrants" may constitute significant differences in "culture."
The challenge for business is how to account for different user experiences that are culturally differentiated in a cost-effective manner. Developers may need to rely on market and user data to achieve short-term success and to avoid wasting money and time on too many variations. Paying attention to culture models and culture dimensions can assist.
CULTURE MODELS AND CULTURE DIMENSIONS
Analysts have written about culture for many decades. Geert Hofstede's (1997) cultural dimensions are well known and well established, although controversial for some anthropologists and ethnographers. Hofstede examined IBM employees in more than 50 countries during 1978-1983 and was able to gather data and analyze them in a statistically valid manner. His definition of culture (in his model, each country has one dominant culture) concerns patterns of thinking, feeling, and acting that a particular group "programs" into its children by the time they reach puberty. The differences of culture manifest themselves in specific rituals, symbols, heroes/heroines, and values. Hofstede's five dimensions of culture are the following:
■ Power distance: high vs. low; the differences between powerful people in the society and others
■ Collectivism vs. individualism: being raised in a group and owing allegiance to it, or not
■ Femininity vs. masculinity: the roles that different sexes play within the society
■ Uncertainty avoidance: high vs. low; the degree of discomfort about things not known
■ Long-term vs. short-term orientation: Confucian values of perseverance, virtue, etc., or other values
For each culture dimension, Hofstede noted differences of attitudes toward work, family, and education.
CAUTIONS, CONSIDERATIONS, AND FUTURE DEVELOPMENTS
Although Hofstede's model is well established, and many studies have been based on it, there are also criticisms of the model:
■ Old data, pre-postmodern (no emphasis on media, sociology of culture, politics of culture)
■ Corporate subjects only, not farmers or other laborers
■ Assumes one culture per country
■ Assumes fixed, unchanging relationships
■ Gender roles and definitions debatable
■ Seems too general, stereotypical
Studies have shown that even the concept of usability may be biased. A study published in the CHI 2009 proceedings (Frandsen-Thorlacius et al., 2009) showed that Chinese users found fun and visual appeal to be related more closely to usability than Danish users did.
At the very least, awareness of culture models and culture dimensions enlarges the scope of issues. For example, these models challenge the professions of UI development to think about appropriate metaphors for different cultures; culture-sensitive user profiles; differing mental models and their influence on performance, not only preference; alternate navigation strategies; evaluation techniques; attitudes toward emotions; etc. An additional challenge is introducing culture considerations into corporate and organization frameworks for product/service development and into centers of user-centered design. There are additional sources of insight into UX and culture, each of which has formulated models and seven plus or minus two dimensions. Each of these gives rise to further issues and interactions with culture: persuasion, trust, intelligence, personality, emotion, and cognition.
With the rise of India and China as sources of software and hardware production, innovation, and consumption, it becomes more obvious that computer-mediated communication and interaction occur in a context of culture. It is inevitable that user-experience development must account for cultural differences and similarities. Models, methods, and tools do exist, but many research issues lie ahead. Future development of tools, templates, and treasure chests of patterns will provide a body of knowledge in the future for more humane, cultured design of computer-based artifacts.
References
Hofstede, G. (1997). Cultures and Organizations: Software of the Mind. New York: McGraw-Hill.
Frandsen-Thorlacius, O., Hornbæk, K., Hertzum, M., & Clemmensen, T. (2009). Non-Universal Usability? A Survey of How Usability Is Understood by Chinese and Danish Users. In Proc. CHI 2009 (pp. 41-50). Boston, MA: ACM.
Marcus, A., & Gould, E. W. (2000). Crosscurrents: Cultural Dimensions and Global Web User-Interface Design. Interactions, 7(4), 32-46. New York: ACM.
Anticipating modeling needs in contextual inquiry: Create contextual data "bins"
There is a spectrum of approaches to contextual data collection, from data driven to model driven. We draw on the best of both but lean toward the model-driven approach. A data-driven approach operates without any presuppositions about what data will be observed. There are no predefined data categories to give hints about what kind of data to expect. The data-driven approach simply relies on data encountered to guide the process of data gathering and subsequent analysis. Whatever arises in contextual inquiry observations and interviews will define the whole process.
Alternatively, a model-driven contextual inquiry process means that instead of just gathering relevant data as you encounter it in observations and interviews, you use your experience to help guide your data collection. In particular, you use the known general categories of design-informing models (Chapter 6) as a guide for the kinds of data to watch for, looking forward to the data needs of modeling so that at least some of your data collection in contextual inquiry can be guided by these future needs.
From your knowledge you will have a good idea of which models will be needed for your project and what kind of data will be needed for your models. Using this knowledge, you create some initial "bins" for data categories and models into which you can start putting your contextual data, in whatever form it has at this point. A bin is a temporary place to hold data in a given category. As you collect data, you will think of other categories to add more bins.
For example, we will cover construction of what we call a physical model (Chapter 6), which includes a diagram of the physical layout of the working environment. So, if a physical model is relevant to your project, then you will need to make a sketch and/or take photos of the physical layout, where equipment is located, and so on while you are still on-site doing contextual inquiry. In order to meet those modeling needs later, you will also need to take notes about the physical layout and any problems or barriers it imposes on the work flow and work practice.
In the next chapter we will extend and complete the creation of bins for sorting and organizing your data in contextual analysis.
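The bin idea can be sketched as a simple data structure. What follows is an illustrative sketch only, not from the book; the bin names, note IDs, and note texts are hypothetical:

```python
from collections import defaultdict

# A minimal "bin" structure for sorting raw contextual data notes by
# anticipated design-informing model. Each bin maps a category name to
# a list of notes; each note keeps its unique ID so it can be traced
# back to its raw-data source during analysis.
bins = defaultdict(list)

def file_note(bin_name, note_id, text):
    """Place one raw data note into a named bin, preserving its ID."""
    bins[bin_name].append({"id": note_id, "text": text})

file_note("physical model", "U3-017",
          "Printer is two rooms away from the order desk.")
file_note("task interaction", "U3-018",
          '"I re-enter the order number on every screen."')

# New categories can be added on the fly simply by filing into them.
file_note("barriers", "U5-002",
          "(Hesitated: could not find the reprint button.)")

for name, notes in bins.items():
    print(name, "->", len(notes), "note(s)")
```

Because `defaultdict` creates an empty list for any new key, adding a bin mid-visit costs nothing, which matches the advice above that categories will keep emerging as you collect data.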
3.3.2 During the Visit: Collecting User Work Activity Data in the Domain-Complex System Perspective
When you first arrive
Begin your first field visit by meeting the manager, administrator, or supervisor through whom you arranged the visit. Continue with the building of trust and rapport that you started previously. Make it clear that you are doing this for the purpose of helping make a better design. It is a big advantage if, at the beginning, you can briefly meet all customer personnel who will be involved so that you can get everyone off to the same start by giving the overview of your goals and approach, explaining what will happen during your visits and why.
Remember the goal
Often in field visits for "talking with our users," people ask users what they want or need. In a contextual inquiry, we do not ask users what they want or need. We observe and interview users in their own work context about how they do their work as they are doing the work that later will be supported by the system you hope to design.
And, by all means, do not forget that we are on a quest for a quality user experience. The techniques of contextual inquiry and contextual analysis will not necessarily take care of searching out the special aspects of user experience; you have to be tuned to it. So be especially aware of user stories about particularly good or bad usage experiences and ferret out the reasons, both in design and in usage context, that contribute to those experiences.
Establish trust and rapport
The interviews with users should also start with trust building and establishing rapport. Help them understand that you have to ask many questions and "get in their face" about the work. Interviewing skills are learned. Observe users doing their work; ask many questions about why they do something, how they do certain things, and how they handle certain cases; and get them to tell specific stories about their work and how they feel about the work and the way it is done in the existing environment. Follow them around; do not miss a part of the work activity by staying behind if they have to move as part of their work.
Form partnerships with users
In most consulting situations the person who comes in from outside the customer organization is considered the "expert," but it is quite the opposite in contextual inquiry. The user is the expert in the domain of work practice and you are the observer trying to understand what is happening.
The process works best if you can form a partnership with each user in which you are co-investigators. Invite the user into the inquiry process where you can work together in close cooperation. You need to establish an equitable relationship with the user to support information exchange through a dialog.
As the observations and interviews proceed, you can feed the partnership by sharing control of the process, using open-ended questions that invite users to talk, for example, "What are you doing?" "Is that what you expect?" "Why are you doing that?" Be a good listener and let the user lead the conversation. Pay attention to nonverbal communication.
Task data from observation and interview
One of the most important kinds of contextual data to collect is task data. You will need this to build your task structure models and task interaction models in Chapter 6. This is where a combination of observation and interview can work especially well. Get task-related data by observing actual sessions of users doing their own work in their own work context.
At the same time, you will interview the users, but asking only about the task they are doing, not about possible tasks or tasks that other users do.
The interview component is used to interpret and assign meaning to what is observed. To have necessary data for task models later, ask questions to clarify anything not obvious in what you observe. Ask questions about the purposes and rationale for each task and each important step; why do they do certain actions?
On the observation side of things, be sure to notice the triggers for tasks and steps; what happens to cause them to initiate each task or step? For example, an incoming phone call leads to filling out an order form.
Learn about your users' task barriers by observing tasks being performed and by think-aloud verbal explanation of underlying information about the tasks, such as task goals. Notice hesitations, problems, and errors that get in the way of successful, easy, and satisfying task or step completion. Probe for what was expected and reasons why it did not turn out well. You will need these answers to model barriers in the upcoming analysis and modeling.
It takes a certain skill to key in on occurrences of critical information in the flow of observation and interviews. With practice, you will acquire an awareness and ability to detect, discern, and discriminate the wheat from the chaff of the flow.
The output of this process is raw user work activity data in the form of lengthy observation and interview notes or transcripts of recorded sessions.
Recording video
Video recording is an effective way of comprehensively capturing raw contextual data where conditions and resources permit. Video recording can help you capture important nonverbal communication cues in contextual data.
However, factors such as the time and effort to produce and review the recordings can weigh against the use of video recording in contextual inquiry data collection. Confidentiality, privacy, and other such concerns can also preclude the use of video.
In addition, video recording can interfere with a close user relationship. The feeling that what they say is permanently captured may prevent them from being forthcoming. They may not be too willing to say something negative about existing work practice or complain about policies at work. Informal note taking, however, can provide a more intimate conversational experience that may encourage honest expression. Despite all these possible barriers, video clips can provide convincing evidence, when linked to the contextual notes, that an issue is real.
Note taking
Regardless of whether you use video or audio recordings of observation and interview sessions, you should consider note taking your primary source of usable raw data. Manual paper note taking may be the most common contextual inquiry data collection technique in the real world. It is unobtrusive, not intimidating to the user, and fits most naturally into what should be a more or less low-key interaction with the user. Alternatively, a laptop is acceptable for note taking, if you can do it inconspicuously.
When taking notes, you must incorporate some scheme that preserves information about the source of each note. We recommend that you use:
■ quotation marks to denote what a user says
■ plain text to describe observations of what users do
■ parentheses to delimit your own interpretations
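A marking scheme like the one above also lends itself to mechanical checking when notes are typed up later. Here is a hypothetical sketch (the function name and sample notes are our own, not from the book):

```python
# Classify a raw note by its source-marking convention: quotation marks
# for user speech, parentheses for analyst interpretation, plain text
# for direct observation.
def note_source(note: str) -> str:
    note = note.strip()
    if note.startswith('"') and note.endswith('"'):
        return "user quote"
    if note.startswith("(") and note.endswith(")"):
        return "analyst interpretation"
    return "observation"

assert note_source('"I like to add an extra egg"') == "user quote"
assert note_source("(Seems frustrated by the form)") == "analyst interpretation"
assert note_source("User flips to the peg board for keys") == "observation"
```

A small check like this can flag mismatched quotes or parentheses before the notes enter analysis, where the source of each point matters.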
A small handheld digital audio recorder used inconspicuously, but not trying to be covert, might be beneficial to augment note taking, especially when things are moving fast. One way to use audio recording is as the audio equivalent of written notes.
In this mode, after hearing user comments or observing user behavior, you dictate a short summary into the recorder, much as a medical doctor dictates summaries for patient charts during daily rounds. This mode of operation has the additional benefit that if the user can hear you paraphrase and summarize the situation, it is a chance to correct any misconceptions.
Use a numbering system to identify each point in data
It is important to use a numbering system to uniquely identify each note, each point in the raw data, or each sequence in a video or audio recording or transcript. This numbering is necessary to provide a way to reference each note. Later, in analysis, each conclusion must be linked to the associated raw data note or else it cannot be treated as authentic. Some of the ways to tag your raw data for reference in analysis include the following:
■ If you record sessions, you can use video frame numbers or time codes on the recording as identifiers of sequences and points in raw data.
■ If you record sessions, you definitely should assign line numbers to the transcripts, just as is done for legal documents.
■ If you take manual notes, each note should be tagged with a note identification number tied to the person or persons who are the data source.
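One possible way to implement such a numbering scheme, sketched here purely for illustration (the `NoteTagger` class and the "U3"/"U7" source codes are hypothetical, not from the book):

```python
# Generate a unique note ID tied to the interviewee it came from, so
# every later conclusion can be traced back to its raw-data source.
class NoteTagger:
    def __init__(self):
        self._counters = {}  # per-source running note counts

    def tag(self, source: str) -> str:
        """Return the next ID for this source, e.g., 'U3-001'."""
        n = self._counters.get(source, 0) + 1
        self._counters[source] = n
        return f"{source}-{n:03d}"

tagger = NoteTagger()
print(tagger.tag("U3"))  # U3-001
print(tagger.tag("U3"))  # U3-002
print(tagger.tag("U7"))  # U7-001
```

Keeping a separate counter per source means IDs stay compact while still encoding who the data came from, which is the traceability the bullets above call for.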
How to proceed
Record raw data by expressing it in the user's voice. Because these data are raw, it makes sense to express points in the interview transcripts generally as they occurred during the interview, which would usually be by using the words of the user. For example, the statement: "I like to add an extra egg when I make a cake" naturally reflects the fact that a user is speaking. If you record your interviews, the transcripts will mostly appear as the exact words of the user anyway.
Switching to an expression such as "the user likes to add an extra egg when baking a cake" unnecessarily introduces an analyst flavor that is not useful this early in the process. Moreover, the user's voice much more closely describes the user's experience, and subtle use of adjectives and expressions can provide clues for designing to enhance that experience.
It is your job to capture data about the user's work. Do not expect users necessarily to tell you what they want or need; this is just about how they work and how they feel about it. Your team will deduce needs later, after they understand the work. Also, do not expect users to do design, although they might occasionally suggest something they would like to see in the system.
■ Be a listener; in most cases you should not offer your opinions about what users might need.
■ Do not lead the user or introduce your own perspectives.
■ Do not expect every user to have the same view of the work domain and the work; ask questions about the differences and find ways to combine the views to get at the "truth."
■ Capture the details as they occur; do not wait and try to remember them later.
■ Be an effective data ferret or detective. Follow leads and discover, extract, "tease out," and collect "clues." Be ready to adapt, modify, explore, and branch out.
Part of being a good detective, per the last point above, is being willing to deviate from a script when appropriate. Be prepared to follow leads and clues and take the interview and observations where you need to go, tailoring questions to meet the goal of learning all you can about the users' work practice, work environment, and work issues and concerns.
As an example of following leads, this is a real story told by a team doing a project for one of our classes. The client was in retail sales and the conversation of the interview had centered on that concept, including serving their customers, making the sale transaction, and recording it.
However, during this conversation the word "inventory" was mentioned once, in the context of point-of-sale data capture. No one had asked about inventory, so no one had mentioned it until now.
Our good ethnographic detectives, recognizing an entree to another area of work activities, pounced on that word and pursued a new train of thought. What about inventory? What role does it play in your point-of-sale data capture?
Where does it go from there? How is it used and for what? How do you use inventory data to keep from running out of stock on items in demand? Who orders new stock and how? Once an order is sent, how do you keep track of it so it does not fall through the cracks? What happens when the new stock is delivered? How do you know when it arrives? Who works in receiving and what do they do? How do you handle partial shipments?
As an example of dialogue that violates the point above about not introducing your own perspectives, consider this user comment: "I want privacy when I am buying tickets." You might be tempted to say: "You mean, when you are looking for events and buying tickets, you do not want other people in line to know what you are doing?" To which the user might respond: "Yes, that is what I mean." A better way to handle the user's comment here would have been with a follow-up question such as "Can you elaborate on what you mean by wanting privacy?"
Pay attention to information needs of users
As you talk with users in the work roles, try to identify their information needs in the context of the work activities and tasks, as they do their jobs in the work domain. Do the current work practices and the current software systems provide information needed by users to do their jobs? Is the needed information provided at the time it is needed and in the form it is needed? And beware of "information-flooding screens."
When designers do not know what users need, they often fall back on the unjustifiable excuse that the users themselves will know what they need. These designers then create designs that display all information available or all the information users might need, in an "information flooding screen," and let the users sort it out. The designer's assumption is that all the information needed is presented-the "it is all there" syndrome-and the users are in the best position to know which parts are needed for which functions/tasks and what format is best for the job. This is a thinly veiled copout for not doing the necessary upfront analysis to inform the design.
What about design ideas that crop up? Contextual inquiry is not about design, but you do not want to lose any good ideas, so you should make note of design ideas from users as they come up and then get back to asking about work practice. It is normal for users to suggest design ideas, often very specific and sometimes not very practical. It is the interviewer's responsibility to take note of these suggestions, but to ask more questions to connect them back to work practice. Ask "why?" How does that suggestion fit into your workflow? What part of your work leads to a need for this?
What about analyst and designer ideas that crop up?
Similarly, make note of design ideas from your own team and tag them as such. Just as with users, it is normal for analysts to get design ideas during interviews or during subsequent analysis activities.
Because such suggestions can introduce analyst bias into what is supposed to be all about user data, "righteous" analysts may want to ignore them. But even analyst ideas generated in the action of contextual inquiry are real data and it would be a shame to lose them. So to include analyst and designer data in contextual inquiry, we suggest getting user confirmation by asking about these ideas and keeping clear the source; be sure to label or tag such notes as analyst ideas.
Questions not to ask
Do not expect you can ask customers and users for direct answers to the questions that will lead you straight to design. Remember that contextual inquiry is often called the process for discovering what users cannot tell you. In his "column" on the User Interface Engineering Website, Jared Spool (2010) advises us about three specific questions not to ask customers or users during field visits. We summarize the three no-no questions here:
■ Do not ask about the future; do not ask users what they would do in a given circumstance. The answer will probably not reflect the reality of what they might do if in the same situation but all alone at work or at home.
■ Do not ask for design advice, such as how they would design a given feature. Users are not designers and do not usually have a design-thinking mind-set. You are likely to get off-the-wall answers that will not fit in with the rest of your design; although their idea might work in the present situation, it might not fit other usage conditions.
■ Do not ask a question by trying to state what you think is their rationale. You just put ideas in their heads, and they might give answers they think you want. Users often do not think about their usage in terms of a logical rationale for each action.
Collect work artifacts
During site visits collect as many samples of work artifacts, such as paper forms, templates, work orders, and other paperwork, as you can. Work artifacts include not just paperwork, but all items used in the work practice and photos of the same.
For example, consider the keys to a customer's car in an auto repair facility. First, they may be put in an envelope with the work order, so the mechanic has the keys when needed. After repairs, the keys are hung on a peg board, separate from the repair order until the invoice is presented to the customer and the bill is paid. Artifacts include physical or electronic entities that users create, retrieve, use or reference within a task, and/or pass on to another person in the work domain. This passing of artifacts should also show up in the flow model.
Example: Work Artifacts from a Local Restaurant
One of the project teams in our user experience class designed a system to support a more efficient workflow for taking and filling food orders in a local restaurant, part of a regional chain. As part of their contextual inquiry, they gathered a set of paper work artifacts, including manually created order forms and "guest checks," shown in Figure 3-3.
These artifacts are great conversational props as we interview the different roles that use them. They provide avenues for discussion given the fact that almost every restaurant uses these artifacts over and over again. What are things that work with this kind of artifact for order taking? What are some breakdowns? How does a person's handwriting impact this part of the work activity? What is the interaction like between the wait staff and the restaurant's guests?
Other forms of data collection
Other kinds of contextual data are also essential in representing work context, including:
■ Copious digital pictures of the physical environment, devices, people at work, and anything else to convey work activities and context visually. Respect the privacy of the people and ask for permission when appropriate.
■ On-the-fly diagrams of workflow, roles, and relationships; have people there check them for agreement.
■ On-the-fly sketches of the physical layout, floor plans (not necessarily to scale), locations of people, furniture, equipment, communications connections, etc.
■ Quantitative data—for example, how many people do this job, how long do they typically work before getting a break, or how many widgets per hour do they assemble on average?
Wrap it up
Do not overstay your welcome. Be efficient, get what you need, and get out of their way. Limit interviews to no more than two hours each; everyone is tired after that much concentrated work. At the end, you may wish to give interviewees something as a thank you. Although cash is always welcome, employers sometimes will not want you to pay their employees, since in principle they are already being paid to be there. In these cases a "premium gift" is appropriate, such as a T-shirt or coffee mug inscribed with something catchy about the situation.
3.3.3 During the Visit: Collecting User Work Activity Data in the Product Perspective
Roles of users will be different with commercial products. In most cases, work in a domain-complex system context is performed by people in roles that make up the organization, which we will be calling "work roles." In the setting of a system with a complex work domain, a work role is defined and distinguished by a corresponding job title or work assignment representing an area of work responsibility. For a commercial product, a work role may just be the user.
Figure 3-3: Examples of work artifacts gathered from a local restaurant.
Usage location will also be different for commercial products. The work or play by individual users of commercial products is not usually connected to an organization. This kind of work or play happens wherever the product is used. For example, if the product is a camera, the work happens pretty much anywhere.
The challenge therefore is being able to collect work activity data as it happens, in the context and location in which it happens, without influencing the user's behavior. What are the things users do when taking a photograph? With whom do they interact? What do they think about? What concerns and challenges do they have while taking pictures? What are the barriers to, or inconveniences in, doing it the way they want to?
Emotional impact and phenomenological aspects are more prominent with commercial products. A product such as a digital camera is much more likely to generate a strong emotional component within the user experience and even an emotional attachment to the device. What does it mean to the user emotionally to have a compact camera handy at all times?
A product like a digital camera also has more of a chance to be the object of long-term phenomenological acceptance into one's life and lifestyle. The more people carry the camera with them everywhere they go, the stronger the phenomenological aspects of their usage.
What does the camera's brand mean to people who carry it? How about the style and form of the device and how it intersects with the user's personality and attire? What emotions do the scratches and wearing of edges in an old camera invoke? What memories do they bring to mind? Does the user associate the camera with good times and vacations, being out taking photos with all his or her worries left behind? What does it mean to get one as a gift from someone? What about reuse and sustainability? How can we design the camera to facilitate sharing among friends and social networks?
You may have to observe longer-term usage. It usually takes longer to address these emotional and phenomenological factors in contextual inquiry because you cannot just visit once and ask some questions. You must look at long-term usage patterns, where people learn new ways of usage over time.
Example: User Data Gathering for MUTTS
We performed contextual inquiry sessions, interviewing MUTTS employees and customers. We had three analysts separately interviewing several groups of one or two users at a time and came up with a fairly rich set of raw data transcripts.
At the end, we also expanded the inquiry by asking customers about experience with other kiosks they might have used.
In most examples throughout this book, we cannot include all the details, and you would not want us to. We therefore ask the reader for a kind of dramatic suspension of disbelief. The point of these examples is not the content, and especially not completeness, which we deliberately abstract away to reduce the clutter of details; the point is a simple illustration of the process.
For simplicity, in most of our examples we will focus on MUTTS customers, whom we interviewed in the context of using the ticket office. Here are paraphrased excerpts from a typical session with a MUTTS customer:
Q: We want to begin with some questions about your usage of the ticket service, MUTTS. What do you do for a living? Tell us about your typical day.
A: I have a 9 to 5 job as a lab technician in Smyth Hall. However, I often have to work later than 5PM to get the job done.
Q: So do you use MUTTS to buy tickets for entertainment?
A: I work long hours and, at the end of the day, I usually do not have the energy to go to MUTTS for entertainment tickets. Because this is the only MUTTS location, I cannot buy tickets during normal working hours, but the MUTTS window is not open after 7PM.
Q: How often and for what have you used the MUTTS service?
A: I use MUTTS about once a month for tickets, usually for events on the same weekend.
Q: What kinds of events do you buy tickets for?
A: Mostly concerts and movies.
Q: Describe the ticket buying experience you just had here at the MUTTS ticket office.
A: It went well except that I was a little bit frustrated because I could not do the search myself for the events I might like.
Q: Can you please elaborate about that?
A: My search for something for this weekend was slow and awkward because every step had to be a series of questions and answers through the ticket seller. If I could have used her computer to browse and search, I could have found what I wanted much sooner. Also, it works better if I can see the screens myself and read the event descriptions. And I also felt I need to answer quickly because I was holding up the line.
Q: Did you know you could search for some of these events on Tickets4ever.com?
A: No, I did not know they had local events.
Q: While you were looking at the seating chart, you seemed unsure about what the ticket seller was expecting you to do with it. Can you please walk us through what you were thinking and how that fit in with the way the seating chart was laid out?
A: Yeah, that was a problem. I could see it was a seating chart but I did not understand what seats were still available and could not quite put the layout of the seats in perspective. I had to ask her what the colors meant on the chart, and what the price difference was for each of those colored regions.
Q: Walk us through a couple of other experiences you have had at the ticket office and do not skip any details.
A: Last week I bought two movie tickets and that was very smooth because I knew what movie I wanted to see and they are usually the same price. Generally, buying movie tickets is very easy and quick. It is only with concerts and special events that things get somewhat complicated. For example, a couple of months ago, I wanted to get tickets to a concert and I could not get to this office for a couple of days because I was working late.
When I eventually got here, the tickets were sold out. I had to fill out a form over there to get added to a waitlist. I do not know how the waitlist works, and that form was very confusing. Here, let me show you ...
Q: What do you like most about MUTTS?
A: Because I am an MU employee, I get a discount on tickets. I also like that they feature the most popular and most current local events.
Q: What do you like least about MUTTS and what concerns do you have about using MUTTS to buy tickets?
A: MUTTS seems to have a limited repertoire of tickets. Beyond the most popular events they do not seem to handle the smaller events outside the mainstream.
Q: What improvements, if any, would you like to see in MUTTS?
A: It would help me if they were open later at night. It would be great if I could get football tickets here, too!
Q: Do you buy football tickets regularly?
A: Yes, I go to about four to five games every season.
Q: Do you buy tickets to any other athletic events? Can you describe a typical transaction?
A: Yes, I also get MU basketball tickets for at least a few games every season. For athletic tickets I have to be on the lookout for the dates when the lottery opens for the games I care about. I sign up for the lottery during the three days it is open and, if I win, I have to go all the way to the other side of campus to the MU Athletics Tickets Office. When I am looking to buy tickets to MU basketball, I like to look at different seating options versus prices; I sometimes look for an option allowing several friends to sit together. But that process is very complicated because I have to coordinate lottery signup with some friends. We get to buy only two guest tickets if we win the lottery.
Q: What difficulties do you experience in using MUTTS as the main source of tickets for events?
A: The main problem is that it is too far away from where I live and work.
Because the envisioned kiosk-based ticket system is so different from the existing MUTTS ticket window, we also wanted to get their thoughts on the proposed kiosk system.
Q: Now we want you to imagine a new service where you can buy tickets at public kiosks located across campus and the town. In particular we are planning to have ticket kiosks conveniently located at major bus stops in Middleburg. Have you had any experience with ticket kiosks in other places, other towns?
A: That is interesting! I never bought tickets at a kiosk before.
Q: Have you had any experience with other kinds of ticket kiosks at places like bus stops or in Metro-type commuter train stations in any big city?
A: Yes, I lived in New York for a couple of years and I used the MTA kiosks to buy metro cards all the time.
Q: If we were to put kiosks at places such as university parking lots, the university mall, and other public locations across campus to sell tickets that you get at this office, would you use them?
A: I would be willing to at least try a ticket kiosk located at the Burruss Hall bus stop because I take the bus there every day. I would also try one near the University Mall because I live near there.
Most of my free time is outside normal business hours, after many businesses are closed, so a kiosk might be convenient.
Q: What type of information would you like to see in such a kiosk?
A: When I look for entertainment options, I want to see the most current events (top picks for today and tomorrow) on the first screen so I can avoid searching and browsing for those.
Q: In your transaction here at the MUTTS office today, you asked if Unspoken Verses is like the Middleburg Poet Boys band. How do you envision getting information like that at a kiosk?
A: That is a good question! I am not sure. I guess the kiosk should have some sort of related items and good description of events. Perhaps even recommendations of some sort.
Q: Can you envision yourself using a kiosk to do what you did today at this office?
A: Yes, definitely. I guess I would expect some form of detailed description of the events.
I should be able to look for different types of events. If there are pictures, that would help. I should be able to see a seating chart.
3.4 LOOK FOR EMOTIONAL ASPECTS OF WORK PRACTICE
Look for the impact of aesthetics and fun in work practice, and look for opportunities for more of the same. When you are visiting workplaces, observing work practice, and interviewing people in work roles, you may find that customers and users are reluctant to mention emotional aspects of their work practice because they see these as personal feelings, which they may consider inappropriate in the context of technology and functional requirements.
As a result, you must try harder to uncover an understanding about emotional and social aspects of work practice. For each different work or other role studied in contextual inquiry, try to get at how emotion might play a part. You have to be diligent and observant of little details in this regard.
Look for ways to fight job boredom. Does anyone intimate, even obliquely, that they would like their job to be less boring? What about the work is boring? Where and when are people having fun? What are they doing when they have fun? Where do people need to have fun but are not having any?
Where is there stress and pressure? Where can job stress be relieved with aesthetics and fun? Where would it be distracting, dangerous, or otherwise inappropriate to try to inject fun or surprise?
What are the long-term phenomenological aspects of usage? What parts of usage are learned over longer times? Where is it appropriate for users to give the system or product "presence" in their lives?
3.5 ABRIDGED CONTEXTUAL INQUIRY PROCESS
The full, rigorous process for contextual inquiry and analysis is appropriate for domain-complex systems, but it is not always necessary. Contextual inquiry calls for using good sense, not slavishly following a predefined process. Minimize overlap in raw data collection across interviews. Use your experience to focus on just the essentials.
Another way to abridge your contextual inquiry is to limit your scope and rigor. As an example, we were part of one small project where less than a day's worth of talking to users about their work practice made a big difference in our understanding of the work domain to inform the design.
One of the most obvious and direct ways to abridge the full contextual inquiry process to save resources is to not make audio or video recordings of the user interview sessions. This also saves resources later in contextual analysis because you do not have to transcribe the recordings.
3.6 DATA-DRIVEN VS. MODEL-DRIVEN INQUIRY
Beyer and Holtzblatt (1998) take an approach to contextual inquiry and analysis for HCI based on pure ethnographic field research. That is, their process is led entirely by work activity data. Simply stated, letting data do the driving means that if you encounter any information that seems relevant to the work practice and its milieu, collect it. This approach means forestalling any influence from your own knowledge, experience, or expectations and just gathering data as they present themselves.
Data-driven contextual inquiry results in voluminous raw data describing a wide variety of topics. To digest this mass of disparate data points, make sense of them, and put these data to work in informing design, practitioners must apply contextual analysis to extract the concise and meaningful points and issues and then sort and organize them into piles or affinity diagrams. Then the sorted categories must be converted into design-informing models such as flow models, user models, and task models. In the purely data-driven approach, these categories and models are dictated by the data content.
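The bookkeeping behind this sorting step can be sketched in a few lines of code. This is only an illustration of the many-notes-to-few-themes structure that contextual analysis produces, not part of the contextual design method itself; the raw notes and theme labels below are invented for the example:

```python
from collections import defaultdict

# Each raw note is a short, single-point excerpt from an interview
# transcript, tagged with the role it came from. All content here is
# illustrative, loosely based on the MUTTS interview excerpts.
raw_notes = [
    ("customer", "Could not browse events myself; had to go through the seller"),
    ("customer", "Ticket window closes at 7 PM, after my work hours"),
    ("seller", "Seating-chart colors confuse buyers"),
    ("customer", "Waitlist form is confusing"),
]

# In a real affinity diagram, themes emerge bottom-up as analysts group
# notes; here the analyst's assignments are simply hard-coded by index.
theme_of = {
    0: "Lack of self-service access",
    1: "Limited hours of operation",
    2: "Unclear seating-chart presentation",
    3: "Confusing paperwork",
}

# Group the notes under their themes.
affinity = defaultdict(list)
for i, (role, note) in enumerate(raw_notes):
    affinity[theme_of[i]].append((role, note))

for theme, notes in affinity.items():
    print(theme)
    for role, note in notes:
        print(f"  [{role}] {note}")
```

The point of the sketch is only the shape of the output: a small set of named categories, each backed by the raw notes that justify it, which then feed the flow, user, and task models.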
In effect, Beyer and Holtzblatt (1998) recommend not thinking of data categories in advance, but letting data suggest the categories and subsequent models. This will help avoid biasing the process by replacing data from users with analysts' hunches and opinions. Their "contextual design" approach to contextual inquiry and contextual analysis has proven itself effective.
However, Constantine and Lockwood (1999) show that there is more than one effective approach to gathering contextual data to inform design. They promote a method they call model-driven, which is in important ways the reverse of the Beyer and Holtzblatt data-driven approach. In their "use what you know" approach, Constantine and Lockwood advocate drawing on experience, intelligent conjecture, knowledge of similar systems and situations, marketing analyses, mission statements, and preliminary requirements to focus your contextual inquiry: to anticipate data categories, to target the most useful data, and to get a head start on its organization and analysis.
From this experience, most practitioners know what kinds of models they will be making and what kinds of data feed each of these models.
This knowledge helps in two ways: it guides data collection to help ensure that you get the kinds of contextual data you need, but at the risk of analyst bias in those data. It also helps with analysis by giving you a head start on data categories and models.
Certainly not all of this anticipatory information will be correct for a given work practice or system, but it can provide an advantageous starting point.
Experienced professional practitioners, having gone through the contextual inquiry process and having done similar analyses in other work contexts, will learn to get through the chaff efficiently and directly to the wheat.
Although their process might seem that it is about modeling and then finding just the data to support the predefined models, it really is about starting with some initial "exploratory" models to guide data collection and then focused data collection to find answers to questions and outstanding issues, to refine, redirect, and complete the models. This "model-driven inquiry" approach also has a solid real-world track record of effectiveness.
The Beyer and Holtzblatt contextual design approach works because, in the end, data will determine the truth about work practice in any specific real-world customer or user organization. However, the Constantine and Lockwood approach works because it encourages you to use your experience and what you know to anticipate data needs in contextual inquiry and contextual analysis. While data-driven inquiry assumes a "blank slate" and a completely open mind, model-driven inquiry acknowledges the reality that there is no such thing as a blank slate (Constantine & Lockwood, 1999).
The Beyer and Holtzblatt approach is rooted in real data untainted by guesswork or analyst biases. But the Constantine and Lockwood approach claims advantages in lower cost and higher efficiency; they claim the search for data is easier if you know something about what you are looking for. To them, it is about pragmatically reducing the ratio of data volume to insight yielded.
In order to inform the design and represent the user, the usability engineer needs to understand not only the user requirements, but also the business requirements and process. Also, it is very possible that neither the business requirements nor the current process is as well known as it should be. This case study depicts such a situation.
THE PROBLEM
I was working with an insurance claims processing company as a consultant. Their problem was that there was a high turnover of adjudicators. The adjudicators had the responsibility of reviewing all claims that the automatic processing had rejected and making a final determination on rejection or payment. The adjudicators were skilled workers and required about 6 weeks of training followed by several months of experience to get to the level of performance that the company required of them.
However, the work was both tedious and demanding, and the turnover of adjudicators was relatively high, as was the case for most of their clerical staff. The company asked me if I could redesign the user interface to make the process easier to learn so that new adjudicators could be brought on in less time.
UNDERSTANDING THE PROBLEM
As is usually the case for consultants, you work for a variety of clients in a variety of business sectors. The insurance business was completely new to me and there was a lot to learn. Management was able to explain to me what the responsibilities of the adjudicators were and what the management problem was. The company's business analysts provided me with an overview of the adjudication issues and pointed me to the policy manual, an online reference document with much more information than I could possibly absorb in a reasonable amount of time. The policy manual was the adjudicator's "bible" and becoming familiar with it was essential for them to do their work.
Management thought that something might be done to make it easier to find the desired information in the bible, which would help the adjudicators. But discussions with the adjudicators did not reveal any significant problems in finding the information they needed; they did not mention it as a problem either for doing their work or for becoming proficient at it. At this point I had no idea of what could be designed that would reduce the amount of training required.
There were two things still to do. One was to talk with the trainers to find out what they perceived to be the reason it took new adjudicators 6 months to become proficient. The other was to spend some time observing the adjudicators doing their work. Discussions with the trainers provided the first indication of what the problem really was and where the solution might lie, and the observations confirmed it.
The trainers stated that the actual task of adjudicating the claims was something that was learned in a couple of weeks. The remainder of training time was spent learning where to get the information relative to the claim that would support a decision on whether to allow the claim. Watching the adjudicators doing their work showed that they were constantly pulling up new data screens, switching back and forth among the screens, and making notes on scraps of paper.
After spending some time observing the adjudicators at work, the real problem became evident. The current user interface was based on the structure of the underlying database of claims, referrals, subscribers, and providers.
To resolve an issue, adjudicators needed to immerse themselves in the database, searching for information about past claims ("encounters"), referrals, provider/clinic associations, and other pieces of data that would allow them to determine if the claim was covered or not. To do this, they were constantly navigating a complex database, pulling up screens full of data to find one or two items of interest.
It was not the training that was a problem or difficulty with the policy manual. The root problem was that they were doing the work of the computer system, sifting through large amounts of data to find those items that were pertinent to resolving the claim. Contextual research showed that the information needed to resolve a claim could be diagrammed as an object model. This model showed the needed information as well-defined objects and what the data relationships were from the perspective of the adjudicators.
I was also able to determine that the process of adjudicating a claim had three basic activities:
i. determining if a referral is needed
ii. matching a referral to a claim
iii. paying, denying, or forwarding the claim
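As a rough illustration of how these three activities and the underlying object model fit together, here is a hypothetical sketch. The class names, fields, and decision rules are invented from the case-study description, not taken from the actual system:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

# Hypothetical object model reconstructed from the case study: claims,
# referrals, and providers modeled from the adjudicator's perspective
# rather than the database's. Names and rules are illustrative only.

class Decision(Enum):
    PAY = auto()
    DENY = auto()
    FORWARD = auto()

@dataclass
class Referral:
    provider: str
    approved: bool

@dataclass
class Claim:
    subscriber: str
    provider: str
    needs_referral: bool
    referrals: list = field(default_factory=list)

def adjudicate(claim: Claim) -> Decision:
    # Activity (i): determine if a referral is needed
    if not claim.needs_referral:
        return Decision.PAY
    # Activity (ii): match a referral to the claim
    matched = [r for r in claim.referrals
               if r.provider == claim.provider and r.approved]
    # Activity (iii): pay, deny, or forward the claim
    if matched:
        return Decision.PAY
    return Decision.FORWARD if claim.referrals else Decision.DENY
```

The point of the sketch is the shape of the work, not the business rules: once the needed information is modeled as objects with explicit relationships, the interface can surface exactly the items each activity requires, instead of forcing the adjudicator to navigate raw database screens.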
DESIGN AND ITERATION
Although the process was usually fairly linear, the adjudicator would sometimes need to switch from one activity to another to check up on an item or resolve a minor issue. However, the recognition of these activities as constituting the process allowed for development of a simple conceptual model:
[Conceptual model sketch: an "Encounter" workspace with tabs in the upper right for Referrals, Provider, Encounter History, and Resolve]
where the information the users were previously writing down as notes was consolidated and kept visible as "Encounter Data." Selecting the tabs in the upper right would bring up tools and data needed for the specific activity the adjudicator was currently engaged in.
The conceptual model, above, was validated ("tested") by several adjudicators and adjusted to make access to data being sought during the referral matching activity easier and more straightforward.
It was at this point that we entered the agile phase of development. We developed an initial working prototype that fleshed out what data should be presented, along with where and how they were presented, and then went through several iterations of programming and designing the prototype, changing the data that were presented and adjusting their placement and the mechanisms used to present them. These intermediate prototypes were reviewed with a select group of adjudicators until we had a final version that almost everyone was satisfied with. At that point, we let the graphic designer clean it up and make it more attractive. Being an in-house application, our graphic design goals were aesthetic: to provide a display that was clean in appearance and comfortable to view and work with.
SUCCESS MEASURES
The final check was to validate the design with measures on time to train and productivity. We checked expected training time by simply allowing novice adjudicators to use the new design to adjudicate a number of claims with only a simple introduction to it. We first measured their performance using the current system and then measured their performance with the new system. During the first 30 minutes of using the new system, claim resolution time was approximately 20% longer than their performance with the old system. During the second 30 minutes with the new system, they were averaging 20% less time than with the old system. By the end of 90 minutes use of the new system, adjudicators were resolving claims in about one-third of the time that they did with the old system.
Since it was the task of finding the information needed to resolve a claim that required 6 months of experience to become proficient, we were comfortable that the new system would not only improve productivity but reduce the time it took to train adjudicators and bring them to an acceptable level of proficiency.
3.7 HISTORY
3.7.1 Roots in Activity Theory
First of all, we owe a great acknowledgment to those who pioneered, developed, and promoted the concepts and techniques of contextual design. Early foundations go back to Scandinavian work activity theory (Bjerknes, Ehn, & Kyng, 1987; Bødker, 1991; Ehn, 1988). The activity theory work was conducted for quite some time in Scandinavia, in parallel with the task analysis work in Europe and the United Kingdom. More recent conferences and special issues have been devoted to the topic (Lantz & Gulliksen, 2003). Much of the initial work in this "school" was directed at the impact of computer-based systems on human labor and democracy within the organizations of the affected workers. This singular focus on human work activities shines through into contextual inquiry and analysis.
126 THE UX BOOK: PROCESS AND GUIDELINES FOR ENSURING A QUALITY USER EXPERIENCE
3.7.2 Roots in Ethnography
A second essential foundation for contextual inquiry is ethnography, an investigative field rooted in anthropology (LeCompte & Preissle, 1993). Anthropologists spend significant amounts of time living with and studying a particular group of humans or other possibly more intelligent animals, usually in social settings of primitive cultures. The goal is to study and document details of their daily lives and existence.
In a trend toward design driven by work practice in context, quick-and-dirty varieties of ethnography, along with other hermeneutic approaches (concerned with ways to explain, translate, and interpret perceived reality) (Carroll, Mack, & Kellogg, 1988), have been adapted into HCI practice as qualitative tools for understanding design requirements. Contextual inquiry and analysis are examples of an adaptation of this kind of approach as part of the evolution of requirements elicitation techniques.
The characteristics that define ethnography in anthropology are what make it just right for adaptation in HCI: it takes place in the natural setting of the people being studied; it involves observation of user activities, listening to what users say, asking questions, and discussing the work with the people who do it; and it is based on a holistic view of understanding behavior in its context.
In contrast to long-term field studies of "pure" ethnography, with its cultural, anthropological, and social perspectives, the "quick and dirty" version of ethnography has been adapted for HCI. Although involving significantly shorter time with subjects and correspondingly less depth of analysis, this version still requires observation of subjects in their own environment and still requires attending to the sociality of the subjects in their work context (Hughes et al., 1995). For example, Hughes et al. (1994) describe application of ethnography in the area of computer-supported cooperative work (CSCW), a sub-area of HCI.
Lewis et al. (1996) describe an ethnography-based approach to system requirements and design that parallels much of the contextual inquiry process described here. Rogers and Bellotti (1997) tell how they harnessed ethnography as a research tool to serve as a practical requirements and design process.
Blythin, Rouncefield, and Hughes (1997) address the adaptation of ethnography from research to commercial system development.
3.7.3 Getting Contextual Studies into HCI
The foundations for contextual design in HCI were laid by researchers at Digital Equipment Corporation (Whiteside & Wixon, 1987; Wixon, 1995; Wixon, Holtzblatt, & Knox, 1990). By 1988, several groups in academia and industry were already reporting on early contextual field studies (Good, 1989) in the United States and the United Kingdom (notably the work of Andrew Monk). Similar trends were also beginning in the software world (Suchman, 1987). Almost a decade later, Wixon and Ramey (1996) produced an edited collection of much more in-depth reports on case studies of real application of contextual studies in the field. Whiteside, Bennett, and Holtzblatt (1988) helped integrate the concept of contextual studies into the UX process.
3.7.4 Connections to Participatory Design
Contextual inquiry and analysis are part of a collection of collaborative and participatory methods that evolved in parallel courses over the past couple of decades. These methods share the characteristic that they directly involve users not trained in specialized methods, such as task analysis. Among these are participatory design and collaborative analysis of requirements and design, developed by Muller and associates (1993a, 1993b), and collaborative users' task analysis (Lafrenière, 1996).
Contextual Analysis: Consolidating and Interpreting Work Activity Data
4.1 INTRODUCTION