Designing Computers with People in Mind
Anne Garrison, S. Joy Mountford, Greg Thomas
Advanced Technology Group
Research Note #29
Abstract: Well aware that a good interface means more than good code, Apple Design Group manager Joy Mountford initiated the Interface Design Project to encourage university students from different disciplines--primarily cognitive psychology, visual design, computer science, and industrial design--to collaborate on interface designs. The students were required to submit not only their finished prototype designs, but also an account of their design process and iterations, including user studies and user testing. The sponsors wanted them to learn "user-centered design": designing products with real people and real tasks in mind.
The Interface Design Project was itself an interdisciplinary effort involving Joy Mountford's Human Interface Group in Advanced Technology, Bob Brunner's Industrial Design Department, and the Higher Education Marketing departments in the universities' local regions.
The project brief also required the students to design interfaces that could adapt to environments beyond their desktops and to individuals with unique skills, goals, and work habits. "The computer I use today knows me no more than it did yesterday or last year," says Mountford. "We asked the students to show us how a computer might adapt to me as I work with it, or how I might adapt it myself."
After a semester's work on Apple-donated equipment, punctuated by intermittent visits from Apple liaisons--Apple's own interface designers--students from nine universities submitted twenty-five projects. These were evaluated by Apple reviewers as well as Austin Henderson (Xerox PARC), Mike Nutall (IDEO), and Kristee Rosedahl (independent designer).
Six months later, the most outstanding student teams presented their designs. Their prototype products included: a personal organizer, shopping aids, educational toys, new desktop computer interfaces, a personal digital assistant for exploring and studying nature, and a computer aid for the handicapped. Among the best projects, all for different reasons, were GloBall, HandiBoard, and Rosetta.
GloBall, a toy that adapts as children grow.
At the University of Toronto, an interdisciplinary team began by considering a young user, Iggy, whose pushy Uncle Byron had presented him with a computer for his first birthday. Iggy tried it for a while, but when all was said and done, he really preferred his other toys. Was there no computer interface more suitable for Iggy or other small children? The students decided that no one needed an adaptable interface more than small children like Iggy, who explore more of the physical world and learn new words and numbers every day. They designed a prototype project called GloBall, a ball that rolls on its own power, chasing or running away from a child, and plays arithmetic and language skill games.
Iggy could play games by pressing colors, shapes, and icons on GloBall's touch-sensitive LCD screen, which would actually be a collection of LCD rectangles. The LCD screen would surround a foamy section protecting the ball's internal components: a CPU, a rechargeable battery, and a motion mechanism that would shift the ball's center of gravity, allowing it to move on its own. When not rolling about, GloBall would sit in a base with a battery charger and a disk drive or PCMCIA slot. Iggy's parents would insert diskettes into the drive when Iggy wanted to play games with GloBall.
What sort of game could GloBall play with Iggy while he was still a crawling infant? The very simplest. Rolling toward and then away from him, it would adapt to his position and his increasing speed and coordination. As Iggy grew bigger and brighter, he could press on GloBall's surface, matching colors and shapes and pictures of animals in response to both visual and audio cues.
When Iggy's parents pressed on GloBall's surface to input his name and age, GloBall would create a user profile, which would soon include his ability to differentiate colors, his response time to different stimuli, and the number of images he could handle at once. In the color matching game, red, blue, and yellow squares would glow on GloBall's surface, and GloBall would ask him to press on each color in turn. Once he was able to identify all three, GloBall would automatically display a more challenging array of colors. In the shape matching game, GloBall would challenge him to press rectangles and triangles only after he had learned circles and squares.
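The adaptive color game described above might be sketched like this. (The class name, color list, and mastery rule are illustrative assumptions, not the students' actual design.)

```python
# Hypothetical sketch of GloBall's adaptive color-matching profile.
# All names and thresholds here are assumptions for illustration.
class ColorGameProfile:
    """Tracks which colors a child can identify and widens the set as they master it."""
    ALL_COLORS = ["red", "blue", "yellow", "green", "orange", "purple"]

    def __init__(self, name, age, start=3):
        self.name = name
        self.age = age
        self.active = list(self.ALL_COLORS[:start])  # colors currently in play
        self.identified = set()                      # colors matched this round

    def record_press(self, asked, pressed):
        """Record one press; return True if the child matched the asked color."""
        if pressed == asked:
            self.identified.add(asked)
            # Once every active color is identified, display a more challenging array.
            if self.identified >= set(self.active) and len(self.active) < len(self.ALL_COLORS):
                self.active.append(self.ALL_COLORS[len(self.active)])
                self.identified.clear()
            return True
        return False

profile = ColorGameProfile("Iggy", 2)
for color in ["red", "blue", "yellow"]:
    profile.record_press(color, color)
print(profile.active)  # the active array now includes a fourth color
```

The same pattern--store a per-child profile, then widen the challenge only after demonstrated mastery--would apply equally to the shape-matching game.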
Once they'd created the GloBall design, the students considered the challenge of user testing. Since producing a real GloBall prototype wasn't economically feasible, they built two different balls for testing on little users. One, made of papier-mâché, contained a remotely controlled car that made the ball seem to move on its own. The other contained an intercom system that made it seem to speak.
Though the students had worried that children might be frightened or confused when GloBall talked to them, most children liked it and many even seemed eager to engage GloBall in conversation. A young user named Jacob spelled his name "Jacba" on the ball, and then promptly chastised the ball for mispronouncing it.
It also became clear that children might not always interpret language as adults do. When a GloBall prototype challenged one child to press on blue and he pressed on yellow, GloBall said "try again"--so he pressed yellow again and again, and each time GloBall said "try again." A stronger case for knowing your users and user-testing your designs could hardly be made.
HandiBoard, a computer aid for the physically handicapped.
Five computer science and engineering students from the Royal Institute of Technology in Stockholm, Sweden, joined a visual artist from Sweden's Royal College of Arts, Crafts, and Design to design the HandiBoard prototype after interviewing handicapped people in hospitals, workplaces, and homes. Their user interviews had led them to two conclusions: first, that many types of mechanical aids for handicapped people exist, but most aren't easy to use; and second, that most tools for the handicapped assist with work rather than home life and communication. They decided that handicapped people needed a simple computer to help them in their daily lives and living environments. They also identified the four tasks handicapped users most often need help with:
--calling for emergency assistance,
--communicating with others,
--moving themselves, as well as objects, and
--remembering to keep appointments, take medications, and perform tasks.
After studying the varied mobility and motor control of handicapped people, the students decided that HandiBoard should be a thin computer screen: attached to the arm of a wheelchair, picked up and carried from place to place by individuals with control of their hands and arms, or suspended above the head or before the eyes of a totally paralyzed person. With current technology, those paralyzed from the neck down could wear a headset whose transmitter sends signals to a receiver connected to the HandiBoard. A headset transmitting sucking and blowing signals is also possible, and technology now being developed holds the promise of voice recognition for those who can speak and eye tracking for those able to move only their eyes.
What would handicapped people do if violently ill and unable to cry out vocally for assistance? Click with a mouse, move their head, or focus their eyes to select an icon on the HandiBoard screen, triggering a digitized cry for help. What would they do to turn on the lights, draw the curtains, or open the door for a friend? Select more HandiBoard icons. To communicate without speaking, call a cab or a friend? Select again. What would they do to remember to take their medication on time or keep a five o'clock appointment? Select icons and buttons to program the HandiBoard to give them audio or visual reminders.
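Whatever the input method--mouse click, head movement, or eye tracking--every interaction above reduces to selecting an icon that triggers one environment action. That selection model might be sketched as follows (the icon names, actions, and phone number are illustrative assumptions, not the students' implementation):

```python
# Hypothetical sketch of HandiBoard's icon-selection model.
# Icon names and bound actions are assumptions for illustration.
def call_for_help():
    return "playing digitized cry for help"

def toggle_lights():
    return "lights toggled"

def dial(number):
    return "dialing " + number

# Each on-screen icon maps to one environment action, so any input device
# that can select an icon can drive the whole system.
ICON_ACTIONS = {
    "emergency": call_for_help,
    "lights": toggle_lights,
    "taxi": lambda: dial("555-0199"),  # fictional number
}

def select(icon):
    """Trigger the action bound to the selected icon."""
    action = ICON_ACTIONS.get(icon)
    return action() if action else "unknown icon"

print(select("emergency"))  # -> playing digitized cry for help
```

Keeping the dispatch table separate from the input hardware is what lets the same screen serve wheelchair users, headset users, and eye-tracker users alike.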
When designing the HandiBoard's graphic interface, the students sought to offer users customizable, readily understandable everyday metaphors--icons of rooms, lights, doors, telephones, people, and other recognizable images which they could select to control their environments. To make automatic telephone calls, users could select from images including digitized photos of friends and icons representing services such as taxi-cabs or hospitals. They could dial other numbers by selecting from a numeric keypad laid out like a touch-tone phone.
Once they'd designed it, the students realized that non-handicapped computer users could also use the HandiBoard to control their home environments. This broader home market could make the HandiBoard more commercially viable, lowering its price for handicapped users. Their conclusion demonstrated that designs for the handicapped, or for other small groups, can have an impact in everyday situations, and that designers should always consider a wider market, even when designing for a small one.
Rosetta, a nutrition manager for food and medication.
At the Art Center College of Design in Pasadena, California, students set out to design a portable, adaptable nutrition manager that would also help people predict and avoid the often unpleasant interactions of food and medications. They began by interviewing elderly people, who commonly use one or more medications, and soon learned that they tend to dislike change, especially changing technology. The students concluded that their nutrition manager would need to be not only very useful but also un-intimidating. After user testing a series of prototypes, they also came to the following conclusions:
-- A nutrition manager appealed to people 50 years of age and older, and to families.
-- Elderly people and people taking medications often have shaky hands, so fine motor skills like using a pen should not be required.
-- The nutrition manager needed to fit the shape of the user's hand.
-- Users were confused and intimidated by a collection of icon buttons on a prototype's screen, but not when the same data was presented as a list they could scroll through in outline form.
The final prototype, Rosetta, was a pair of handheld devices: the Living Kitchen, for use in the home, and the Shopper, a smaller device for use while shopping for groceries. Though the Living Kitchen was larger, the two devices shared the same simple interface: a screen, a "rocker switch" for navigating up and down lists, a volume control, and tangible "keys" inserted into the side of the device.
The keys were a clever hardware solution that replaced the software--the icon buttons and fast keys--which had confused users. Test subjects readily understood the key metaphor, especially because the keys were real, tangible objects shaped just like the keys they used every day. They were quick to comprehend that the keys unlocked information, just as they unlocked doors or mailboxes. Though a button that launched an application had been hard to understand, a key that launched an application seemed much like a key that started a car. In the students' design, users would be able to customize Rosetta by purchasing different keys to unlock the types of information most helpful to them.
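The key metaphor--a physical object that unlocks a body of information--might be sketched like this. (The key IDs and module names are illustrative assumptions; the students' design specified only that purchased keys would unlock different types of information.)

```python
# Hypothetical sketch of Rosetta's tangible keys.
# Key IDs and module names are assumptions for illustration.
INSTALLED_MODULES = {}

# Each purchasable physical key unlocks one information module.
KEY_MODULES = {
    "key-nutrition": "Nutrition Guide",
    "key-interactions": "Food & Drug Interactions",
    "key-recipes": "Low-Sodium Recipes",
}

def insert_key(key_id):
    """Inserting a key into the device 'unlocks' (loads) its information module."""
    module = KEY_MODULES.get(key_id)
    if module:
        INSTALLED_MODULES[key_id] = module
        return "unlocked: " + module
    return "unrecognized key"

def remove_key(key_id):
    """Removing the key locks its module away again."""
    INSTALLED_MODULES.pop(key_id, None)

print(insert_key("key-interactions"))  # -> unlocked: Food & Drug Interactions
```

Because customization happens by buying and inserting keys rather than by configuring software, the interface stays free of the icon buttons that had confused test users.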