Wednesday, April 29, 2009

Final Presentations

Rossi - Psychology project instructional website

Leilani - Citizenship class instructional website

Shannon & Camille - Science With Tom

Me - 30-Minute Computer Lab Lessons
  • perhaps use "ScreenFlow" to capture movies more professionally

JoAnna - "Team Explorers" student make-up work page

Matt - Big Brothers & Big Sisters mentor support and training site

Jenny - Granite District Teacher Tech Help

Kevin, Scott, June - Utah Center for Reading and Literacy, The University of Utah

Ross & Randy - Center for the Advancement of Technology in Education (CATE), The University of Utah

Wednesday, April 22, 2009

Final Reports and Presentations

You’ve Done Design Briefs
•Team Members
•Statement of Problem/Challenge
•Activities to Date
•Summary of User Interviews & Participant Profiles
•Design Priorities and Issues
•Quick Review of Related Products/Systems
•Persona(s)
•Task Statements
•Evaluation Plan
•Exposures
•Schedule of Design & Evaluation Activities

Final Reports Goals
•Present a problem/challenge
•Show how you used HCI methods to arrive at a solution
–Discuss design alternatives
–Motivate design decisions
•Choices were not arbitrary
•Choices move design closer to “optimal”

Final Report
Required Sections: Part 1
•Team Members*
•Statement of Problem/Challenge*
•Review of Related Products/Systems*
•Summary of Interviews and Participant Profiles*
•Key Design Issues and Priorities*
•Personas* (may be primary and secondary)
•Task Statements* (OK if revised)
*Included in Design Brief. Revise as necessary for Final Report

Final Report
Required Sections: Part 2
•Selection Among Design Alternatives - How did you decide on design elements early on?
–Mockups - Show examples, e.g., scan in your paper mock-ups
–Cognitive Walkthrough
–Heuristic Analysis
What were the key findings for each method?
How did the findings change your design?
Screen shots are helpful, if available!
•User Testing
–Design for User Testing
What was the design you used for user testing?
–Experimental Protocol
What was the procedure you used during testing? This includes instructions, practice tasks, etc.
–Tasks
What tasks did participants complete?
–Participants
Summarize the general characteristics of participants.
–Summary of Results/Overview of Findings
Can use a table and chart issues/results for each user
Can organize by aspects of the system and problems with each (rather than organizing by user).
Give data to support your claims. Give quotes, comments, screen shots, etc. of problematic elements that back up your summary.
If you don't have screen shots of the original design, use the current ones and explain how the design changed from the earlier version.
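The "organize by aspects of the system" suggestion above can be sketched in a few lines. This is a minimal illustration with an invented issue log (the participants and issues are hypothetical, not from any actual study):

```python
from collections import defaultdict

# Hypothetical issue log: each record is (participant, system aspect, observed issue).
observations = [
    ("P1", "Navigation", "Could not find the Assignments link"),
    ("P2", "Navigation", "Could not find the Assignments link"),
    ("P1", "Video player", "Expected the video to start automatically"),
    ("P3", "Instructions", "Reread the task text three times"),
]

# Group findings by aspect of the system rather than reporting user by user.
by_aspect = defaultdict(list)
for participant, aspect, issue in observations:
    by_aspect[aspect].append((participant, issue))

for aspect, issues in sorted(by_aspect.items()):
    print(f"{aspect}: {len(issues)} report(s)")
    for participant, issue in issues:
        print(f"  {participant}: {issue}")
```

An aspect reported by several participants (here, Navigation) surfaces immediately as a pattern, which is exactly the kind of claim the summary table should support with quotes and screen shots.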

Final Report
Required Sections: Part 3
•Recommendations/Revisions from User Testing
–High Priority - Deal breakers - what had to be changed or else the system would not work and goals would not be achieved
–Moderate Priority - things that don't make or break your system, but may help the users and meet the requests/comments
Explain your prioritization of design issues and describe the design changes you made as a result.
•Final Design
•Future Issues
What was your final design? How does it resolve issues identified during user testing? How does your design meet the needs of your users? Finally, how well does the final design resolve the challenge/problem you identified at the beginning of the report?
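The High/Moderate prioritization described above amounts to a simple sort. A minimal sketch, with invented example recommendations (none of these items come from the lecture):

```python
# Rank table for the two-level priority scheme: deal-breakers first.
PRIORITY_RANK = {"High": 0, "Moderate": 1}

# Hypothetical recommendations from a round of user testing.
recommendations = [
    ("Moderate", "Shorten the instruction text on the landing page"),
    ("High", "Rename 'Resources' to 'Homework'; no user found the assignments"),
    ("High", "Fix the video link that failed for every participant"),
]

# Deal-breakers first, nice-to-haves after.
ordered = sorted(recommendations, key=lambda rec: PRIORITY_RANK[rec[0]])
for priority, change in ordered:
    print(f"[{priority}] {change}")
```

Listing recommendations in this order in the report makes it obvious which changes were required for the system to work at all and which were refinements.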

Final Report Questions
•Does your report clearly show the evolution of your design from beginning to end?
•Do you show how HCI methods informed your initial design and subsequent changes?
•Have you made it clear how your user testing informed your final design?
•Is it clear how your final design will meet your users’ needs and resolve the challenge/problem you identified?

Due by 5:00 PM on May 6th

Presentations
•20 minutes per project
–We have 9 projects total
–Teams: Each person should present a portion of your project
•15-17 minutes for presentation
•3-5 minutes for answering questions
–Audience members: Be active & ask questions!

Presentation Goals
•Communicate the problem/challenge you were trying to solve
–Convince your audience that it’s important!
•User Needs
–What were the user needs that you identified?
–What was the initial design?
•HCI Methods --> Final Design
–How did HCI methods inform design decisions? - Choose a few things that really helped you. Choose the most important design decisions and changes you made and how they happened.
–What were the major design changes and why did you make them (e.g., what HCI results convinced you?)
–How well does your final design meet users’ needs?

Presentation Questions
•Did you clearly state the problem/challenge and its importance?
•Is it clear what your users’ needs are?
•Did you clearly connect the outcomes of HCI methods (e.g., Cognitive Walkthrough, Heuristic Analysis) to major design changes/decisions?
•Did you make a case for your final design meeting users’ needs?

Present on April 29 - Bring Napkins

Wednesday, March 25, 2009

Think-Aloud Protocol II

Class Objectives
By the end of this class you should be able to:
•Articulate the difference between think-aloud and self-explanation protocols
•Describe basic categories of analysis for educational technology protocols
•Classify participant utterances into prescribed categories

Think-Aloud Protocol vs. Self Explanation
•Think Aloud
–Purported not to change user’s cognition
–Description (not explanation) is the target
–User’s process is the target
•Self Explanation
–Improves learning and understanding
–Explanation and reflection is the target
–User’s understanding is the target

Think Aloud vs. Self Explanation
•Purely HCI issues? Think-aloud
•Educational effectiveness? Self-Explanation
•Use a combination to evaluate your system’s educational content as well as the design
–Why students learn differently
–How students learn with your system (redesign)

Protocol Analysis for Experimental Studies
•Theory guides initial ideas for categories
•Comparison will be across groups
–What utterances will show learning differences?
•Errors
•Paraphrases vs. inferences
–What utterances will show processing differences?
•Shallow approaches (what kinds of statements?)
•Deep approaches (what kinds of statements?)

HCI Studies: Ed Tech
Why do we need to go beyond the think-aloud to self-explanations?
•Where do learners go wrong and why?
•Do learners understand the content?
•How do they work with the system and the content?
•Goals:
–Improve depth of processing
–Support metacognition
–Minimize errors

Useful Basic Categories
Metacognitive Monitoring (thinking about your own thinking)
–Occurs when the learner/user is monitoring the accuracy of their thinking
–Positive: Identifies thinking as correct
•“Yeah, that’s what I thought.”
–Negative: Identifies thinking as incorrect or confused
•“Um, I totally don’t get this.”
•“I thought I should click on Assignments, but looks like not.”
Paraphrasing
–Occurs when learner reads, paraphrases, or summarizes system text
–Shows engagement, but not deep processing
–Repeated rereading or paraphrasing may indicate confusing instructions/text, especially if intermixed with errors
Goal-Directed Inferences
–Content: Learner uses goal to make an inference about the educational content.
•“I’m trying to find information on reading strategies, so I’ll watch this video to see what it tells me.”
•“So … if matter is neither lost nor gained … it must mean that the water just changed states here.”
–System Behaviors: Learner uses goal to make an inference about what the system will do.
•“I think this link will open the video for me.”
–Navigation: Learner uses goal to make an inference about where to go.
•“I’m looking for homework, so I think I need the calendar.”
Errors
–Inferences can be correct or incorrect
–Often occur in context
•Not an error in system design (e.g., a broken or misdirected link)
•Error or problem is tied to learner’s intended goal, or their interpretation of system behavior or educational content.
–“I can’t find the homework link.”
–“The video didn’t start. It must be broken.”
–“So, matter usually is conserved but sometimes it can be lost.”

Other Useful Categories
Negative vs. Positive Affect
“Ugh. I hate that picture!”
“Wow. This is such a cool activity”
•Help use
–Do users seek help? Is it available when they do?
•Strategies for use
–What will they try to do with the system?
–Is it what you intended?

Segmenting
•Look for fairly broad statements
Idea units. An utterance that expresses one thought, idea, or confusion.
•May consist of a partial sentence
•May consist of several sentences
•Fits into a single category
–Look for pauses – they often indicate a shift
–Turn-taking often results in new idea units
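A first pass over segmented idea units can be automated as a rough triage before manual coding. This is a toy sketch: the category names follow the notes above, but the cue phrases are illustrative assumptions, and real protocol coding requires human judgment of context and intonation.

```python
# Keyword cues per category. These phrases are invented examples,
# not a validated coding scheme.
CUES = {
    "Metacognitive Monitoring": ["i thought", "don't get", "that's what i thought"],
    "Goal-Directed Inference": ["so i'll", "i think this", "it must mean", "i need the"],
    "Error": ["can't find", "didn't start", "must be broken"],
}

def code_utterance(utterance):
    """Return every category whose cue phrase appears in the utterance."""
    text = utterance.lower()
    hits = [category for category, cues in CUES.items()
            if any(cue in text for cue in cues)]
    return hits or ["Uncoded"]  # flag for manual review

print(code_utterance("I thought I should click on Assignments, but looks like not."))
```

Anything tagged "Uncoded" (or tagged with more than one category, since each idea unit should fit a single category) goes back to a human coder.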

Example Transcript
Participant:
“When assembling the door, Nichols accidentally nails the transversal plank at a 36° angle. The measure of angle ARN equals 36°. What is the measure of angle BAR?”
Reading

ARN is 36° because that’s given. And BAR is going to be 36° also, because of alternate interior angles. That’s because of measure angle ARN.
Content Inferences

“In the diagram below line AB is parallel to DC. If the measure of angle ACD is equal to 66.1° and the measure of angle ACB equals 44°, find the measure of angle ABC.”
Reading
Reading

Experimenter:
You seem a little unsure. What were you thinking?
Prompt

Participant:
I couldn’t find the question mark for where ACD was.
Error?

Tips for Coding
•Video/Audio
–Listen at LEAST 2 times before coding.
•Text transcripts
–Listen to audio while reading the transcript - this will allow you to hear the inflection of phrases that could mean a variety of things
•Context and intonation are important
–Then read at LEAST 2 times before coding

Videos to Critique
Usability Test: Would benefit from more verbalization
http://www.youtube.com/watch?v=SFwU_rvMBaE
Usability Test: Watch out for creating a conversation!
http://www.youtube.com/watch?v=Y_rKE0O7tek
Good information (but don’t get user to design!!)
http://www.youtube.com/watch?v=Jpt3qz1gtXI

Wednesday, March 11, 2009

Think-Aloud Protocol

Think Aloud Protocols (Methods)

What is a Think-Aloud Protocol?
•Participants report (verbally) all task-relevant thoughts as they complete the task
–I prefer concurrent report
–Interviews: long-term version of retrospective protocols
–If you must use retrospective, use small time intervals! - You don't want them to have to remember what they thought about things from the very beginning of the task.
•Experimenters prompt users to produce (continuous) verbalizations, in a neutral manner
–“Don’t forget to keep talking”
–“What are you thinking?”
- But don't try to get them to say what you want them to say. Be neutral.

Getting Users Talking
•Tell them what they are going to be asked to do, and why
–We’re going to be asking you to “think-aloud,” which just means that we want you to say anything that’s in your head as you’re doing the task. This includes ideas, questions, frustrations, confusions, or comments as you work. Basically, you’ll be giving us your “stream of consciousness” thoughts as you work on each task.
Some people find this easy, but a lot of people find it weird –especially at first. But it’s the best way for us to get really good data on what is helping you and what is causing you problems as you work with the system. We’ll practice a little to help you get started.
•Always model the behavior FIRST.
–Users feel ridiculous
–Users need to hear what good thinking-aloud sounds like
–Need to model the level of detail required
•Practice! Need 2 practices (MINIMUM).
–Can use 1 practice, and provide feedback in the first few “throw-away” screens if you have them. Do not model or practice on your actual task. Use something else.
Model at the most detailed level possible. People will gravitate to less detail on their own, so model with a lot of detail. Give them feedback when they practice encouraging them to give you plenty of detail.

Practice Tasks: Option 1
•Mental Arithmetic
•Describe your thinking as you mentally solve an addition (or subtraction*) problem.
*Some users will find you unusually cruel and heartless, especially if the problems are too difficult.
49+56=

Practice Tasks: Option 2
•The Windows Walk
•Prompt: Imagine walking through your house or apartment. Go through each room, describing and counting the number of windows that you find.

Setting Up for Think-Aloud
•Position yourself behind and to the side of the user
–Peripherally visible, but not “in the action”
–Users are supposed to generate a verbal stream of data, not communicate with you
–If users ask you questions, praise them and encourage them to keep voicing those questions (even though you can’t answer them)
•Have them read their task scenario first
–E.g., “You are a 4th grade teacher who is trying to set up the initial gradebook for your class. You want to …”
–Make sure the task is available on paper, for constant reference
•Remind the user of the instructions:
–Now we’ll start working with the system. Remember, just say whatever comes into your head, no matter how silly it seems to you. All that data is really useful in helping to improve the system.

Prompting the User
•Try to stay neutral
–Don’t ask ‘why’
–Don’t react to errors or successes
–Try to get them to forget about you!
•Prompt as needed (but keep it easy and breezy)
–“Don’t forget to keep talking”
–“What are you thinking?”
–“Can you say more about that?”
•Especially when they start to say something interesting but stop! This happens a lot…
•You can also just repeat the last bit of what they said as a question. Participant “So…. [trails off]” Experimenter: “So?”

Kids vs. Adults
•Both
–Concerned about looking “stupid”
–Vary widely in how naturally they keep talking
•Kids
–Generally more reluctant to talk aloud (prompt more)
–Mumble
–May need more reassurance

*If a user really can't complete the task, set a time limit, then just have them stop.
You can stop the entire thing and be done, but if the parts of your system are separate enough, get them to the next task and start there.

Praise, Praise, Praise
•Uncomfortable users = QUIET users
•Use praise for process liberally during practice and early in the task/study
•Sneak it in while system loads, or there are natural pauses
–“You’re doing a great job of thinking-aloud! Keep up the good work!”
–“You’re a natural at this” or “That’s great! Just what we need. I’ll keep prompting you to help.”

Example Task Scenario
Jack is a 10th grade science teacher who has volunteered to fill in for Jan, a 7th grade science teacher, while she’s out sick. Jan was supposed to teach her class about changes in the Earth’s surface this week. She suggests Jack come up with a classroom activity based on changes in the Earth’s surface.
One of the topics that Jack teaches in his 10th grade class relates to earthquakes. He wants to teach the 7th graders something related to this topic. Jack often uses DLESE in order to find activities and detailed text on material he teaches in his 10th grade class. He decides to check out what DLESE has to offer. He wants to find out which concepts he needs to teach the 7th graders, in addition to a classroom activity that supports these concepts.
(Have www.dlese.org open to begin the task.)

*If there is a site where there is going to be a lot of text, prompt them to read aloud so you can follow where they are on the site.

Other Tips
•Have a bottle of water for your participant
•If you have a long session planned, give them breaks
•Keep things upbeat and friendly
–If the user gets down, tell them “We are learning so much from your data! You are doing a great job for us in this study!”

Record the Session
•I like Camtasia Studio
–Screen-capture + Voice recording (synched)
–Free trial for 30 days at:
http://www.techsmith.com/camtasia.asp
–1 year license through U of U Software Licensing
•$25 download, $30 for CD
https://software.utah.edu/osl/detail.shop?productId=1308

Create Videos
•Anonymity largely protected (voice is only identifying info)
•Powerful in highlighting problems, processes

Wednesday, February 18, 2009

Mockups & Prototypes

Feedback on Scenarios
•The “Goldilocks” Issue
–Too Detailed:
“Susie uses the mouse to move her cursor into the address bar. Once she sees the blinking bar in the address bar, she uses the keyboard to type the URL and hits ‘Enter’ to bring up the site.”
–Too High-Level
“Susie opens the website to search for lesson plans.”
–Juuuuust right:
“Susie decides to go find lesson plans using uen.org. She opens a web browser and types the URL (www.uen.org) into the address bar.”

•How to decide on the right level of detail?
–Could someone who has never seen your system imagine your persona using it?
–You can assume that you don’t need to describe very common knowledge
•E.g., what it means to “click” or “type” or change text
–Does your scenario inform your design?

Mockups
•Definitions
–Pictures of screens as they would appear at various stages of user interaction
–Very early prototypes made of low-fidelity materials
–Prototypes with limited interactions and content
–Typically Paper (or Cardboard)
•Drawn or Created with software tools
(use something simple that you are most comfortable using)
–PowerPoint, Photoshop, Pencil, SmartDraw
OmniGraffle is popular for Macs
•Use different pages for new windows
Mockups = Low-Fidelity Prototypes = Paper Prototypes

Example Mockups
PDA travel application
http://www.youtube.com/watch?v=Bq1rkVTZLtU&NR=1
Website design (Not in English …)
http://www.youtube.com/watch?v=4WbxisFDYk4&feature=related
http://www.youtube.com/watch?v=AtfWM2jRS2w
Google Reader (Demonstration Prototype)
http://www.youtube.com/watch?v=VSPZ2Uu_X3Y

•Purpose
–Facilitate design evaluation BEFORE spending lots of time and money on a high-fidelity design
•Reduces programming time
•Can save money by identifying changes early
–Concrete design improves feedback/evaluation
•Prototyping: Same quality with 40% less code & 45% less effort (Boehm, Gray, & Seewaldt, 1984)

•Use whatever is fast and easy for *you*
–Hand drawn?
–PowerPoint?
–Photoshop?
–Pencil (add-on to Firefox)
–Supports rapid paper prototyping for web/software interfaces
https://addons.mozilla.org/en-US/firefox/addon/8487

Local vs. Global Mockups
•Local
–Target selected aspects of the system
–Identify KEY issues. What is most tricky?!
•What terms to use in menus
•How to provide feedback, etc.
•Global
–Overall interaction with your system
–More holistic in nature
–Covers numerous tasks with the same prototype

Vertical vs. Horizontal
•Vertical <-> Local
–Implement one section of your system in detail
–Leave the rest as low-fidelity
•Horizontal <-> Global
–Implement the full breadth of your system at a high level
–Make all high-level features interactive
–Leave in-depth content unspecified
•E.g., actual description of grants, actual help files

High-fidelity Prototypes
•Also known as
–Prototypes
–Version 0 systems
•Use after you have clarified your design requirements
•A working release of your system
•Developed with same tools as final system
•May lack some functionality/content

Wednesday, February 11, 2009

Design & Evaluation Plans

What is a Design Brief?
•Written plan
–Focuses on the design you want to achieve
•Captures goals you want to meet
•Identifies key components of success
•Explores challenges you’ll need to meet/resolve
–Ensures communication about priorities, milestones, and criteria for success
–Synthesizes important data & input
–Acts as a point of reference during the design process
•Does the solution fit the problem?
•How to prioritize issues in the design?

Components: Design Briefs
•Team Members
•Statement of Problem/Challenge
•Activities to Date
•Summary of User Interviews & Participant Profiles
•Design Priorities and Issues
•Quick Review of Related Products/Systems
•Persona(s)
•Task Statements
•Evaluation Plan
•Exposures
•Schedule of Design & Evaluation Activities
(Examples in blue are from the very first design brief I ever wrote.)

Team Members
•Specify the names and email addresses of all team members
•(If working alone, just name yourself!)

Statement of Problem/Challenge
•Should be one paragraph or less (short & sweet)
–What is the design challenge?
–What are you trying to accomplish?
–What will be the end result of your efforts?

The Center for Spoken Language Research (CSLR) has developed a series of interactive books and tutors that are designed to support the development of reading processes in children. Although there are a large number of existing, pre-defined tutors that ask the child to perform various tasks, it is not clear how to represent student performance in a clear and functional way. We will design the data display screens for the CSLR software so that it is useful and intuitive for literacy teachers working with these students.

Activities to Date
•Meetings with Project Sponsor/Stakeholders?
–Summary
•User Interviews
–Summarize the number of interviews and the general user profile (e.g., To date, we have conducted 5 interviews with technology teachers who have varied levels of experience in schools)
–Interview Questions: OK to provide the most current set of questions.

Summary of User Interviews
•Provide a summary of your user’s activities, needs, and concerns.
•These draw upon your “key themes”, but are described in detail.

All the teachers we spoke to tend to work with small groups of children at a time (between 1 and 8) and keep written records on their students. By law, the teachers are required to set yearly goals for the students, which are formalized in either an IEP (Individual Education Plan: developed for all Special Education students, and reading is one component of this plan) and/or in an ILP (Individual Literacy Plan: developed for students who are specifically behind in reading; these students may or may not be in special ed). All teachers expressed a definite need for representing growth over time. The times in the year at which teachers tested their students’ progress differed, but all were concerned with understanding a child’s growth, and being able to communicate that to others (e.g., parents or the child’s regular teacher)… (ONE OF FOUR PARAGRAPHS)

Summary of Profiles
•OK to summarize in a table
•Always is anonymized
–no names or identifying information
•Focus on important characteristics for your user group (these likely will vary by project)
–Age?
–Gender?
–Experience?
•Career
•Computer

Design Priorities and Issues
•What will a successful design need to achieve?
–What is necessary to meet the needs of your users?
–What will be key components of your design?
•What are likely design problems or challenges that you’ll face?
–What do you foresee as barriers to success?
–Where is your design likely to break down?

Design Priorities and Issues
Priorities
•Allow teachers to compare growth over time.
•Break down data into useful categories: skills, types of errors, etc.
•Support comparison of performance to specific goals for the student.
•Support materials that can be used in conferences with parents.
•Grade level assessment broken down into component skills.
•Selected information from teacher report summarized for parents.
•Support comparison to benchmarks, rubrics, grade-level standards, etc.
Issues
•The CSLR wants a basic design template for data design, but diverse data will be produced from the interactive books and tutors. Can a single format accommodate diverse types of data? These include:
•Comprehension data from interactive books
•Skill data from interactive tutors
•Performance data (such as fluency speed)
•Teachers not only want summary data on student performance, but also strongly desire a categorical breakdown of errors and performance. This categorization may be variable based on student performance and may require “smarter” technology than currently exists.
•One teacher has mentioned that the use of “Excent” –a database program to track special education students’ progress –is being introduced in the school district. If the use of this program becomes standardized, reports from the interactive books and tutors should easily export to this program.

Persona(s)
•You may include multiple personas, but you should identify your primary user(s).

Task Statements
•Task Statements
–Not a task analysis
–Choose 3-5 tasks that your users will need to accomplish with the system. Use your persona(s) to describe those tasks in a rich, narrative manner. (In your final report, these will become full scenarios).

Dorothy is a reading specialist who works primarily with small groups of reading disabled children outside of their regular classroom. She is concerned about the progress of Brett, a grade 2 student. Brett seemed to be making steady progress when she started working with him, but she feels as though he may not be showing the same level of growth in his phonological awareness this past month. Dorothy wonders if there is one particular skill, like the pronunciation of long vowels, that is causing his slower progress or if the problem is more general. Dorothy would like to compare Brett's recent progress with his long-term growth and to find out if Brett is performing at the same level for all the component reading skills in the suite of interactive books and tutors he has been using.

Review of Related Products/Systems
•Identify 2+ systems that can inform aspects of your design
•Review these systems based upon your design priorities & issues
•Hint: Screen shots often will be helpful in your analysis

Although many existing software programs advertise that they provide “progress reports” on children’s performance (e.g., Reader Rabbit, Reading Blaster, Accelerated Reader), the form of these progress reports does not meet the needs expressed by literacy teachers. For example, in “Read, Write, and Type,” if the child selects to complete the auxiliary skills section (which is optional) they receive a score for typing accuracy (a score of the number of letters the child typed correctly/ the total number of letters typed) and a score for speed. However, there are no means of specifically evaluating areas where the child may be experiencing difficulties. The only way to discover where the child might be having trouble is to sit with the child and watch his or her progress as he or she plays with the program. This type of assessment clearly does not meet the needs of reading teachers, who express a desire for specific skill-based assessment as well as time saving features such as summarizing and listing errors.

Evaluation Plan
•What will you do to test your design?
–High or low-fidelity prototypes?
–E.g., Cognitive Walkthrough or Heuristic Analysis?
–Testing with users?
•Who?
•Where?
•When?
–OK if this changes. Create a general plan for now and revise as needed.

Exposures
•What problems will you need to solve before/during system testing?
–Provide an anticipated solution, fall-back plan

•Seeding the prototype with realistic data (and data that can be interpreted independently of teacher observations of a student).
•One teacher has volunteered to share a “running record” with us. This may give us a good base for realistic and appropriate student data. However, it may be difficult to translate this qualitative record into quantitative data for testing.
•Meeting teachers at their schools for prototype evaluation may cause problems with platforms, software, & computer access. Teachers may not be able to come to campus for testing.
•One group member has a laptop available for use. We may use the laptop to provide a consistent computer environment even if we need to travel to the teacher’s location.

Schedule of Design Activities
•What are your key HCI milestones?
–User Interviews
–Prototype (low or high fidelity)
–Internal Evaluation (e.g., Cognitive Walkthrough)
–User Testing
–Final Report
•What are the tasks for each milestone?
•If in a team, who are the responsible members for each task?

Schedule of Design Activities Example
Deliverables                                     | Completion Date | Responsible Member(s)
User Interviews                                  | 3/07/2009       |
  Create List of Interviewees                    | 2/12/2009       | Butcher, Baker
  Finalize List of Interview Questions           | 2/16/2009       | Candlestick Maker
  Schedule Interviews                            | 2/20/2009       | Candlestick Maker
  Conduct All Interviews                         | 2/27/2009       | Butcher, Baker
  Develop Key Themes, Summarize for Final Report | 3/07/2009       | Butcher, Baker, Candlestick Maker

Work on your design brief!
–Design briefs are due February 25th.

Wednesday, February 4, 2009

Interviewing Follow-Up, Scenario-Based Evaluation

Persona Exercise
Interviews:
Hardest Parts:
  • designing interview questions
  • non-leading questions
Suggestions:
  • open-ended questions to just get them talking
  • ex. How do you usually perform this task?
  • If there is an existing product like the one you are creating, get them to use it and watch them. How are they using it? What problems do they encounter?
  • If the interview goes in an unexpected direction, go with it for a while and find out why.
  • It's okay to ask more specific questions that are related to your project, just avoid leading them and making them feel like they are supposed to give a specific answer.
  • It really helps to record your interviews, but also take notes so you can have a quick record if you don't have time to listen to the entire interview again.
  • If working as a team, one person can interview and one can record. It allows the interviewer to be more focused on good follow-up questions and the person rather than writing.
  • Interview 6 to 10 people for your capstone project.
Deriving the Persona:
  • If you interview enough people, the key themes tend to come together.
  • You definitely may find that there are a couple personas, but you are going to need to decide on one persona for your design at least to begin with.
How did you decide who to interview?
  • Develop a profile: who are the general users? What user characteristics are relevant to design and testing? What characteristics should all users have in common? (ex. Do they all have to be able to use e-mail?) What characteristics will vary? How will you define them?
  • Try to interview between 6 and 10 people. Fewer makes it hard to identify patterns; more adds effort without yielding much new information.
  • Having trouble finding people? Ask the first few users to suggest others.
  • Do contextual interviews. Go to the users' relevant environment. (home, office, classroom)
How did you decide what to ask?
  • Interview as a team. One interviewer, one scribe
  • Explain why you are talking to them and they can end at any time for any reason
  • Ask open-ended questions that allow elaboration
  • Look for opportunities to probe interesting responses
  • Be specific in getting how people already do tasks: verbal - tell me about the last time you... observational - walk me through how you'd....
  • You can change/add questions as needed
  • Ask about contrasts: best experience? worst? likes? dislikes?
  • Last question: Can I contact you for a follow up interview? (It's okay if they say no)
  • Don't use yes/no questions
  • Be wary of leading questions -- Yes, Prime Minister YouTube clip
  • Avoid speculative questions: (ex. Would you use an electronic calendar?) If you must ask, do so at the very end of the interview. Don't put too much stake in this answer.
  • Avoid specific design questions. Look for data to inform your design. Don't expect your interviewee to be able to solve your design problems.
What lessons learned for Capstone project?
  • It will be beneficial to interview 6-10 people.
  • Develop a user profile to help you decide who to interview and what questions to ask.
  • You may have to narrow your project to one persona to be your primary persona.
  • Are you going to focus on novice, intermediate, or advanced users? If the system is optional, you can focus on whichever group you want. But, if your system is required of a larger group of varied users, you may need to focus on the novice and just put in some short cuts to help the expert move through more quickly. (For me - links to get to what they want quickly, but detailed instructions for those who need it.)
  • Encourage and reassure your interviewees (and later your usability-test participants) so they don't get nervous and feel like they're not doing a good job.
  • Adjust your questions for future interviews to avoid problems encountered in earlier interviews.
Analyzing your interview data:
  • Flesh out notes from interview IMMEDIATELY.
  • In teams, have each member start by analyzing separately. Look for patterns that concern design priorities, design issues, and deal-breakers (show-stoppers). Then reconcile the key themes identified by each team member.
  • Use data to identify priorities
  • Then derive tasks
Now you have personas & tasks … now what?!

Scenario-Based Evaluation
What is a scenario?
•Task + Persona = Scenario
•Idealized but detailed
•Describes a specific instance of interaction
•Tries to balance generality & accuracy
–Use persona
–Use common, general tasks
–Situate use in your design
•Scenario-Based Development
–Scenario-Based Evaluation

How to write a scenario
•Describe the situation of use that people (e.g., your persona) would experience
•Write it for what your persona would want (or need) to do
–Several scenarios for common tasks
•Include specific, realistic details from your data collection
•Scenarios tell a short story
–Represent the conditions, situations, and people that you’ll be designing for

*It's okay, initially, for your scenario to be idealistic; during the analysis phase you will examine where reality deviates from it and identify problems with the system.

Scenarios for Design
•Usually, a collection of scenarios is used
–Should represent key priorities of your design
•Scenarios help you perform evaluations without the users
–Cognitive Walkthroughs
•Scenarios help justify & communicate decisions

HCI Exercise #3: Scenarios
•Due next week
•May work in teams
•Steps
–Use persona you created for HCI Ex #2
–Develop two common tasks
•Use an educational technology of your choice
•If possible, use your Capstone project
–Write two scenarios of use (one for each task), describing how your persona would engage with the technology to perform the task(s).

My Group's Practice Scenario:
Striving Stuart is currently enrolled in the Earth, Space and Geophysical Science course. His teacher, Ms. Rawlings, has just finished presenting the material on the forces that cause erosion and has assigned groups of students to work on a related project activity. Stuart needs to work with two other students, Bitter Betsy and Enigmatic Emily, to create a slide presentation reviewing the forces learned in class. Stuart is responsible for creating slides about wind, Betsy will create the slides about water, and Emily is in charge of slides about temperature. Stuart secretly has a crush on Emily and plans on sneaking comments about how hot she is into her temperature slides. Betsy is very jealous of Emily and plans on throwing a bucket of water on her. Since all three students have different schedules and existing Gmail accounts, the group decides to create their presentation on Google Docs. Stuart creates a new presentation on Google Docs and invites his group members to join it via email. Using the information from his class notes, Stuart creates his slides about wind while the girls create their own. After every member has submitted their information, they add photos and proofread and edit each other's slides. Stuart watches to see when Emily signs on so he can daydream about her typing; he knows she is there because her name comes up on the screen. He waits until she is finished with her slides before trying to edit them, since two people can't edit the same slide at the same time. In class, the group simply pulls up their presentation on the web and each member takes turns speaking about their slides. The class enjoys the slide show, particularly because Stuart started the presentation with an "Earth, Wind and Fire" music video. After the presentation, Stuart asks Emily if she wants to ditch 8th period to go surfing with him.

Focus on the key points of the system. Don't go to the level of a task analysis, but do mention the general things the persona will do to accomplish the task.