Wednesday, February 18, 2009

Mockups & Prototypes

Feedback on Scenarios
•The “Goldilocks” Issue
–Too Detailed:
“Susie uses the mouse to move her cursor into the address bar. Once she sees the blinking cursor in the address bar, she uses the keyboard to type the URL and hits ‘Enter’ to bring up the site.”
–Too High-Level
“Susie opens the website to search for lesson plans.”
–Juuuuust right:
“Susie decides to go find lesson plans using uen.org. She opens a web browser and types the URL (www.uen.org) into the address bar.”

•How to decide on the right level of detail?
–Could someone who has never seen your system imagine your persona using it?
–You can assume that you don’t need to describe very common knowledge
•E.g., what it means to “click” or “type” or change text
–Does your scenario inform your design?

Mockups
•Definitions
–Pictures of screens as they would appear at various stages of user interaction
–Very early prototypes made of low-fidelity materials
–Prototypes with limited interactions and content
–Typically paper (or cardboard)
•Drawn by hand or created with software tools
(use something simple that you are most comfortable using)
–PowerPoint, Photoshop, Pencil, SmartDraw
–OmniGraffle is popular for Macs
•Use different pages for new windows
Mockups = Low-Fidelity Prototypes = Paper Prototypes

Example Mockups
PDA travel application
http://www.youtube.com/watch?v=Bq1rkVTZLtU&NR=1
Website design (Not in English …)
http://www.youtube.com/watch?v=4WbxisFDYk4&feature=related
http://www.youtube.com/watch?v=AtfWM2jRS2w
Google Reader (Demonstration Prototype)
http://www.youtube.com/watch?v=VSPZ2Uu_X3Y

•Purpose
–Facilitate design evaluation BEFORE spending lots of time and money on a high-fidelity design
•Reduces programming time
•Can save money by identifying changes early
–Concrete design improves feedback/evaluation
•Prototyping: Same quality with 40% less code & 45% less effort (Boehm, Gray, Seewaldt, 1984)

•Use whatever is fast and easy for *you*
–Hand drawn?
–PowerPoint?
–Photoshop?
–Pencil (an add-on for Firefox)?
•Supports rapid paper prototyping for web/software interfaces
•https://addons.mozilla.org/en-US/firefox/addon/8487

Local vs. Global Mockups
•Local
–Target selected aspects of the system
–Identify KEY issues. What is most tricky?!
•What terms to use in menus
•How to provide feedback, etc.
•Global
–Overall interaction with your system
–More holistic in nature
–Covers numerous tasks with the same prototype

Vertical vs. Horizontal
•Vertical <-> Local
–Implement one section of your system in detail
–Leave the rest as low-fidelity
•Horizontal <-> Global
–Implement all of your system, but only at a high level
–Make all high-level features interactive
–Leave in-depth content unspecified
•E.g., actual description of grants, actual help files

High-fidelity Prototypes
•Also known as
–Prototypes
–Version 0 systems
•Use after you have clarified your design requirements
•A working release of your system
•Developed with same tools as final system
•May lack some functionality/content

Wednesday, February 11, 2009

Design & Evaluation Plans

What is a Design Brief?
•Written plan
–Focuses on the design you want to achieve
•Captures goals you want to meet
•Identifies key components of success
•Explores challenges you’ll need to meet/resolve
–Ensures communication about priorities, milestones, and criteria for success
–Synthesizes important data & input
–Acts as a point of reference during the design process
•Does the solution fit the problem?
•How to prioritize issues in the design?

Components: Design Briefs
•Team Members
•Statement of Problem/Challenge
•Activities to Date
•Summary of User Interviews & Participant Profiles
•Design Priorities and Issues
•Quick Review of Related Products/Systems
•Persona(s)
•Task Statements
•Evaluation Plan
•Exposures
•Schedule of Design & Evaluation Activities
(Examples in blue are from the very first design brief I ever wrote.)

Team Members
•Specify the names and email addresses of all team members
•(If working alone, just name yourself!)

Statement of Problem/Challenge
•Should be one paragraph or less (short & sweet)
–What is the design challenge?
–What are you trying to accomplish?
–What will be the end result of your efforts?

The Center for Spoken Language Research (CSLR) has developed a series of interactive books and tutors that are designed to support the development of reading processes in children. Although there are a large number of existing, pre-defined tutors that ask the child to perform various tasks, it is not clear how to represent student performance in a clear and functional way. We will design the data display screens for the CSLR software so that they are useful and intuitive for literacy teachers working with these students.

Activities to Date
•Meetings with Project Sponsor/Stakeholders?
–Summary
•User Interviews
–Summarize the number of interviews and the general user profile (e.g., To date, we have conducted 5 interviews with technology teachers who have varied levels of experience in schools)
–Interview Questions: OK to provide the most current set of questions.

Summary of User Interviews
•Provide a summary of your user’s activities, needs, and concerns.
•These draw upon your “key themes”, but are described in detail.

All the teachers we spoke to tend to work with small groups of children at a time (between 1 and 8) and keep written records on their students. By law, the teachers are required to set yearly goals for the students, which are formalized in either an IEP (Individual Education Plan: developed for all Special Education students, and reading is one component of this plan) and/or in an ILP (Individual Literacy Plan: developed for students who are specifically behind in reading; these students may or may not be in special ed). All teachers expressed a definite need for representing growth over time. The times in the year at which teachers tested their students’ progress differed, but all were concerned with understanding a child’s growth, and being able to communicate that to others (e.g., parents or the child’s regular teacher)… (ONE OF FOUR PARAGRAPHS)

Summary of Profiles
•OK to summarize in a table
•Always anonymized
–no names or identifying information
•Focus on important characteristics for your user group (these likely will vary by project)
–Age?
–Gender?
–Experience?
•Career
•Computer

Design Priorities and Issues
•What will a successful design need to achieve?
–What is necessary to meet the needs of your users?
–What will be key components of your design?
•What are likely design problems or challenges that you’ll face?
–What do you foresee as barriers to success?
–Where is your design likely to break down?

Design Priorities and Issues
Priorities
•Allow teachers to compare growth over time.
•Break down data into useful categories: skills, types of errors, etc.
•Support comparison of performance to specific goals for the student.
•Support materials that can be used in conferences with parents.
•Grade level assessment broken down into component skills.
•Selected information from teacher report summarized for parents.
•Support comparison to benchmarks, rubrics, grade-level standards, etc.
Issues
•The CSLR wants a basic design template for data design, but diverse data will be produced from the interactive books and tutors. Can a single format accommodate diverse types of data? These include:
•Comprehension data from interactive books
•Skill data from interactive tutors
•Performance data (such as fluency speed)
•Teachers not only want summary data on student performance, but also strongly desire a categorical breakdown of errors and performance. This categorization may be variable based on student performance and may require “smarter” technology than currently exists.
•One teacher has mentioned that the use of “Excent”, a database program to track special education students’ progress, is being introduced in the school district. If the use of this program becomes standardized, reports from the interactive books and tutors should export easily to this program.

Persona(s)
•You may include multiple personas, but you should identify your primary user(s).

Task Statements
•Task Statements
–Not a task analysis
–Choose 3-5 tasks that your users will need to accomplish with the system. Use your persona(s) to describe those tasks in a rich, narrative manner. (In your final report, these will become full scenarios).

Dorothy is a reading specialist who works primarily with small groups of reading disabled children outside of their regular classroom. She is concerned about the progress of Brett, a grade 2 student. Brett seemed to be making steady progress when she started working with him, but she feels as though he may not be showing the same level of growth in his phonological awareness this past month. Dorothy wonders if there is one particular skill, like the pronunciation of long vowels, that is causing his slower progress or if the problem is more general. Dorothy would like to compare Brett's recent progress with his long-term growth and to find out if Brett is performing at the same level for all the component reading skills in the suite of interactive books and tutors he has been using.

Review of Related Products/Systems
•Identify 2+ systems that can inform aspects of your design
•Review these systems based upon your design priorities & issues
•Hint: Screen shots often will be helpful in your analysis

Although many existing software programs advertise that they provide “progress reports” on children’s performance (e.g., Reader Rabbit, Reading Blaster, Accelerated Reader), the form of these progress reports does not meet the needs expressed by literacy teachers. For example, in “Read, Write, and Type,” if the child chooses to complete the auxiliary skills section (which is optional), he or she receives a score for typing accuracy (the number of letters typed correctly divided by the total number of letters typed) and a score for speed. However, there are no means of specifically evaluating areas where the child may be experiencing difficulties. The only way to discover where the child might be having trouble is to sit with the child and watch his or her progress as he or she plays with the program. This type of assessment clearly does not meet the needs of reading teachers, who express a desire for specific skill-based assessment as well as time-saving features such as summarizing and listing errors.

Evaluation Plan
•What will you do to test your design?
–High- or low-fidelity prototypes?
–E.g., Cognitive Walkthrough or Heuristic Analysis?
–Testing with users?
•Who?
•Where?
•When?
–OK if this changes. Create a general plan for now and revise as needed.

Exposures
•What problems will you need to solve before/during system testing?
–Provide an anticipated solution or fall-back plan

•Seeding the prototype with realistic data (and data that can be interpreted independently of teacher observations of a student).
•One teacher has volunteered to share a “running record” with us. This may give us a good base for realistic and appropriate student data. However, it may be difficult to translate this qualitative record into quantitative data for testing.
•Meeting teachers at their schools for prototype evaluation may cause problems with platforms, software, & computer access. Teachers may not be able to come to campus for testing.
•One group member has a laptop available for use. We may use the laptop to provide a consistent computer environment even if we need to travel to the teacher’s location.

Schedule of Design Activities
•What are your key HCI milestones?
–User Interviews
–Prototype (low or high fidelity)
–Internal Evaluation (e.g., Cognitive Walkthrough)
–User Testing
–Final Report
•What are the tasks for each milestone?
•If in a team, who are the responsible members for each task?

Schedule of Design Activities Example
Deliverables | Completion Date | Responsible Member(s)
User Interviews | 3/07/2009 |
  Create List of Interviewees | 2/12/2009 | Butcher, Baker
  Finalize List of Interview Questions | 2/16/2009 | Candlestickmaker
  Schedule Interviews | 2/20/2009 | Candlestickmaker
  Conduct All Interviews | 2/27/2009 | Butcher, Baker
  Develop Key Themes, Summarize for Final Report | 3/07/2009 | Butcher, Baker, Candlestickmaker

Work on your design brief!
–Design briefs are due February 25th.

Wednesday, February 4, 2009

Interviewing Follow-Up, Scenario-Based Evaluation

Persona Exercise
Interviews:
Hardest Parts:
  • designing interview questions
  • writing non-leading questions
Suggestions:
  • open-ended questions to just get them talking
  • ex. How do you usually perform this task?
  • If there is an existing product like the one you are creating, get them to use it and watch them. How are they using it? What problems do they encounter?
  • If the interview goes in an unexpected direction, go with it for a while and find out why.
  • It's okay to ask more specific questions that are related to your project, just avoid leading them and making them feel like they are supposed to give a specific answer.
  • It really helps to record your interviews, but also take notes so you can have a quick record if you don't have time to listen to the entire interview again.
  • If working as a team, one person can interview and one can record. It allows the interviewer to be more focused on good follow-up questions and the person rather than writing.
  • Interview 6 to 10 people for your capstone project.
Deriving the Persona:
  • If you interview enough people, the key themes tend to come together.
  • You may well find that there are a couple of personas, but you will need to decide on one primary persona for your design, at least to begin with.
How did you decide who to interview?
  • Develop a profile: who are the general users? What user characteristics are relevant to design and testing? What characteristics should all users have in common? (ex. Do they all have to be able to use e-mail?) What characteristics will vary? How will you define them?
  • Try to interview between 6 and 10 people. Fewer makes it hard to identify patterns; more adds little new information.
  • Having trouble finding people? Ask the first few users to suggest others.
  • Do contextual interviews. Go to the users' relevant environment. (home, office, classroom)
How did you decide what to ask?
  • Interview as a team. One interviewer, one scribe
  • Explain why you are talking to them, and make clear that they can end the interview at any time for any reason
  • Ask open-ended questions that allow elaboration
  • Look for opportunities to probe interesting responses
  • Be specific in learning how people already do tasks: verbal ("Tell me about the last time you...") or observational ("Walk me through how you'd...")
  • You can change/add questions as needed
  • Ask about contrasts: best experience? worst? likes? dislikes?
  • Last question: Can I contact you for a follow up interview? (It's okay if they say no)
  • Don't use yes/no questions
  • Be wary of leading questions (see the "Yes, Prime Minister" YouTube clip)
  • Avoid speculative questions: (ex. Would you use an electronic calendar?) If you must ask, do so at the very end of the interview. Don't put too much stake in this answer.
  • Avoid specific design questions. Look for data to inform your design. Don't expect your interviewee to be able to solve your design problems.
What lessons learned for Capstone project?
  • It will be beneficial to interview 6-10 people.
  • Develop a user profile to help you decide who to interview and what questions to ask.
  • You may have to narrow your project to one persona to be your primary persona.
  • Are you going to focus on novice, intermediate, or advanced users? If the system is optional, you can focus on whichever group you want. But if your system is required of a larger group of varied users, you may need to focus on the novice and just put in some shortcuts to help the expert move through more quickly. (For me: links to get to what they want quickly, but detailed instructions for those who need them.)
  • Encourage and reassure your interviewees (and, later, your usability-testing participants) so they don't get nervous and feel like they're not doing a good job.
  • Adjust your questions for future interviews to avoid problems encountered in earlier interviews.
Analyzing your interview data:
  • Flesh out notes from interview IMMEDIATELY.
  • In teams, have each member start by analyzing separately. Look for patterns that concern design priorities, design issues, and deal-breakers (show-stoppers). Reconcile the key themes identified by each team member.
  • Use data to identify priorities
  • Then derive tasks
Now you have personas & tasks … now what?!

Scenario-Based Evaluation
What is a scenario?
•Task + Persona = Scenario
•Idealized but detailed
•Describes a specific instance of interaction
•Tries to balance generality & accuracy
–Use persona
–Use common, general tasks
–Situate use in your design
•Scenario-Based Development
–Scenario-Based Evaluation

How to write a scenario
•Describe the situation of use that people (e.g., your persona) would experience
•Write it for what your persona would want (or need) to do
–Several scenarios for common tasks
•Include specific, realistic details from your data collection
•Scenarios tell a short story
–Represent the conditions, situations, and people that you’ll be designing for

*It's okay, initially, to have your scenario be idealistic because you will analyze problems with the system and where reality deviates from the scenario later during the analysis phase.

Scenarios for Design
•Usually, a collection of scenarios is used
–Should represent key priorities of your design
•Scenarios help you perform evaluations without the users
–Cognitive Walkthroughs
•Scenarios help justify & communicate decisions

HCI Exercise #3: Scenarios
•Due next week
•May work in teams
•Steps
–Use persona you created for HCI Ex #2
–Develop two common tasks
•Use an educational technology of your choice
•If possible, use your Capstone project
–Write two scenarios of use (one for each task), describing how your persona would engage with the technology to perform the task(s).

My Group's Practice Scenario:
Striving Stuart is currently enrolled in the Earth, Space and Geophysical Science course. His teacher, Ms. Rawlings, has just finished presenting the material on the forces that cause erosion to the class and assigned groups of students to work on a related project activity. Stuart needs to work with two other students, Bitter Betsy and Enigmatic Emily, to create a slide presentation reviewing the forces learned in class. Stuart is responsible for creating slides about wind, Betsy will create the slides about water, and Emily is in charge of slides about temperature. Stuart secretly has a crush on Emily and plans on sneaking in comments about how hot she is on her temperature slides. Betsy is very jealous of Emily. She plans on throwing a bucket of water on Emily. Since all three students have different schedules and they have existing Gmail accounts, the group decides to create their presentation on Google Docs. Stuart creates a new presentation on Google Docs. He invites his group members to join the presentation via email. Using the information from his class notes, Stuart creates his slides about wind while the girls create their own. After every member has submitted their information, they add photos, proofread each other’s slides, and edit. Stuart watches to see when Emily signs on so he can daydream about her typing. He knows she is there because her name comes up on the screen. He waits until she is finished with her slides before trying to edit them, since two people can’t edit the same slide at the same time. In class, the group simply pulls up their presentation on the web and each member takes turns speaking about their slides. The class enjoys the slide show, particularly because Stuart started the presentation with an “Earth, Wind and Fire” music video. After the presentation, Stuart asks Emily if she wants to ditch 8th period to go surfing with him.

Focus on key points of the system. Don't go to the level of a task analysis, but do mention the general things your persona will do to accomplish the task.