Wednesday, April 29, 2009

Final Presentations

Rossi - Psychology project instructional website

Leilani - Citizenship class instructional website

Shannon & Camille - Science With Tom

Me - 30-Minute Computer Lab Lessons
  • perhaps use "ScreenFlow" to capture movies more professionally

JoAnna - "Team Explorers" student make-up work page

Matt - Big Brothers & Big Sisters mentor support and training site

Jenny - Granite District Teacher Tech Help

Kevin, Scott, June - Utah Center for Reading and Literacy, The University of Utah

Ross & Randy - Center for the Advancement of Technology in Education (CATE), The University of Utah

Wednesday, April 22, 2009

Final Reports and Presentations

You’ve Done Design Briefs
•Team Members
•Statement of Problem/Challenge
•Activities to Date
•Summary of User Interviews & Participant Profiles
•Design Priorities and Issues
•Quick Review of Related Products/Systems
•Persona(s)
•Task Statements
•Evaluation Plan
•Exposures
•Schedule of Design & Evaluation Activities

Final Reports Goals
•Present a problem/challenge
•Show how you used HCI methods to arrive at a solution
–Discuss design alternatives
–Motivate design decisions
•Choices were not arbitrary
•Choices move design closer to “optimal”

Final Report
Required Sections: Part 1
•Team Members*
•Statement of Problem/Challenge*
•Review of Related Products/Systems*
•Summary of Interviews and Participant Profiles*
•Key Design Issues and Priorities*
•Personas* (may be primary and secondary)
•Task Statements* (OK if revised)
*Included in Design Brief. Revise as necessary for Final Report

Final Report
Required Sections: Part 2
•Selection Among Design Alternatives - How did you decide on design elements early on?
–Mockups - Show examples, e.g., scan in your paper mock-ups
–Cognitive Walkthrough
–Heuristic Analysis
What were the key findings for each method?
How did the findings change your design?
Screen shots are helpful, if available!
•User Testing
–Design for User Testing
What was the design you used for user testing?
–Experimental Protocol
What was the procedure you used during testing? This includes instructions, practice tasks, etc.
–Tasks
What tasks did participants complete?
–Participants
Summarize the general characteristics of participants.
–Summary of Results/Overview of Findings
Can use a table and chart issues/results for each user
Can organize by aspects of the system and problems with each (rather than organizing by user).
Give data to support your claims. Give quotes, comments, screen shots, etc. of problematic elements that back up your summary.
If you don't have screen shots of the original design, use the current ones and describe how they changed from the earlier version.

Final Report
Required Sections: Part 3
•Recommendations/Revisions from User Testing
–High Priority - Deal breakers - what had to be changed or else the system would not work and goals would not be achieved
–Moderate Priority - things that don't make or break your system, but may help the users and meet the requests/comments
Explain your prioritization of design issues and the design changes you made as a result.
•Final Design
•Future Issues
What was your final design? How does it resolve issues identified during user testing? How does your design meet the needs of your users? Finally, how well does the final design resolve the challenge/problem you identified at the beginning of the report?

Final Report Questions
•Does your report clearly show the evolution of your design from beginning to end?
•Do you show how HCI methods informed your initial design and subsequent changes?
•Have you made it clear how your user testing informed your final design?
•Is it clear how your final design will meet your users’ needs and resolve the challenge/problem you identified?

Due by 5:00 PM on May 6th

Presentations
•20 minutes per project
–We have 9 projects total
–Teams: Each person should present a portion of your project
•15-17 minutes for presentation
•3-5 minutes for answering questions
–Audience members: Be active & ask questions!

Presentation Goals
•Communicate the problem/challenge you were trying to solve
–Convince your audience that it’s important!
•User Needs
–What were the user needs that you identified?
–What was the initial design?
•HCI Methods --> Final Design
–How did HCI methods inform design decisions? Choose a few things that really helped you: the most important design decisions and changes you made, and how they came about.
–What were the major design changes and why did you make them (e.g., what HCI results convinced you?)
–How well does your final design meet users’ needs?

Presentation Questions
•Did you clearly state the problem/challenge and its importance?
•Is it clear what your users’ needs are?
•Did you clearly connect the outcomes of HCI methods (e.g., Cognitive Walkthrough, Heuristic Analysis) to major design changes/decisions?
•Did you make a case for your final design meeting users’ needs?

Present on April 29 - Bring Napkins

Wednesday, March 25, 2009

Think-Aloud Protocol II

Class Objectives
By the end of this class you should be able to:
•Articulate the difference between think-aloud and self-explanation protocols
•Describe basic categories of analysis for educational technology protocols
•Classify participant utterances into prescribed categories

Think-Aloud Protocol vs. Self Explanation
•Think Aloud
–Purported not to change user’s cognition
–Description (not explanation) is the target
–User’s process is the target
•Self Explanation
–Improves learning and understanding
–Explanation and reflection is the target
–User’s understanding is the target

Think Aloud vs. Self Explanation
•Purely HCI issues? Think-aloud
•Educational effectiveness? Self-Explanation
•Use a combination to evaluate your system’s educational content as well as the design
–Why students learn differently
–How students learn with your system (redesign)

Protocol Analysis for Experimental Studies
•Theory guides initial ideas for categories
•Comparison will be across groups
–What utterances will show learning differences?
•Errors
•Paraphrases vs. inferences
–What utterances will show processing differences?
•Shallow approaches (what kinds of statements?)
•Deep approaches (what kinds of statements?)

HCI Studies: Ed Tech
Why do we need to go beyond the think-aloud to self-explanations?
•Where do learners go wrong and why?
•Do learners understand the content?
•How do they work with the system and the content?
•Goals:
–Improve depth of processing
–Support metacognition
–Minimize errors

Useful Basic Categories
Metacognitive Monitoring (thinking about your own thinking)
–Occurs when the learner/user is monitoring the accuracy of their thinking
–Positive: Identifies thinking as correct
•“Yeah, that’s what I thought.”
–Negative: Identifies thinking as incorrect or confused
•“Um, I totally don’t get this.”
•“I thought I should click on Assignments, but looks like not.”
Paraphrasing
–Occurs when learner reads, paraphrases, or summarizes system text
–Shows engagement, but not deep processing
–Repeated rereading or paraphrasing may indicate confusing instructions/text, especially if intermixed with errors
Goal-Directed Inferences
–Content: Learner uses goal to make an inference about the educational content.
•“I’m trying to find information on reading strategies, so I’ll watch this video to see what it tells me.”
•“So … if matter is neither lost nor gained … it must mean that the water just changed states here.”
–System Behaviors: Learner uses goal to make an inference about what the system will do.
•“I think this link will open the video for me.”
–Navigation: Learner uses goal to make an inference about where to go.
•“I’m looking for homework, so I think I need the calendar.”
Errors
–Inferences can be correct or incorrect
–Often occur in context
•Not an error in system design (e.g., a broken or misdirected link)
•Error or problem is tied to learner’s intended goal, or their interpretation of system behavior or educational content.
–“I can’t find the homework link.”
–“The video didn’t start. It must be broken.”
–“So, matter usually is conserved but sometimes it can be lost.”
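
The categories above can serve as a first-pass coding scheme. As a rough illustration only (the keyword cues and function names below are assumptions, not part of these notes, and real coding requires human judgment on full idea units), a keyword-based pre-coder might look like:

```python
# Illustrative first-pass coder for think-aloud utterances.
# Category names follow the notes above; the keyword cues are rough
# assumptions and would need refinement against real transcripts.

CATEGORIES = {
    "metacognitive_positive": ["that's what i thought", "i was right"],
    "metacognitive_negative": ["don't get", "i'm confused", "looks like not"],
    "paraphrase": ["so it says", "this says"],
    "goal_inference": ["i'm trying to", "so i'll", "i think this", "i need the"],
    "error": ["can't find", "must be broken"],
}

def code_utterance(utterance: str) -> str:
    """Return the first matching category, or 'uncoded' for human review."""
    text = utterance.lower()
    for category, cues in CATEGORIES.items():
        if any(cue in text for cue in cues):
            return category
    return "uncoded"

print(code_utterance("Um, I totally don't get this."))    # metacognitive_negative
print(code_utterance("I can't find the homework link."))  # error
```

A pre-coder like this only flags candidates; every utterance still needs a human pass, since context and intonation determine the final category.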

Other Useful Categories
Negative vs. Positive Affect
“Ugh. I hate that picture!”
“Wow. This is such a cool activity!”
•Help use
–Do users seek help? Is it available when they do?
•Strategies for use
–What will they try to do with the system?
–Is it what you intended?

Segmenting
•Look for fairly broad statements
–Idea units: an utterance that expresses one thought, idea, or confusion.
•May consist of a partial sentence
•May consist of several sentences
•Fits into a single category
–Look for pauses – they often indicate a shift
–Turn-taking often results in new idea units
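
The segmenting heuristics above (pauses and turn-taking as boundaries) can likewise be sketched in code. This is a hedged illustration only: the `[pause]` marker and the speaker-prefix convention are assumptions for the sketch, not a transcription standard from these notes.

```python
import re

def segment_idea_units(transcript: str) -> list[str]:
    """Split a transcript into candidate idea units at pause markers and
    at speaker turns. Each candidate still needs a human check that it
    expresses a single thought and fits exactly one coding category."""
    # Split on explicit pause markers, and before speaker labels
    # (turn-taking often results in new idea units).
    parts = re.split(r"\[pause\]|(?=\b(?:Participant|Experimenter):)", transcript)
    return [part.strip() for part in parts if part and part.strip()]

units = segment_idea_units(
    "Participant: ARN is 36 degrees because that's given. [pause] "
    "And BAR is going to be 36 degrees also."
)
print(len(units))  # 2
```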

Example Transcript
Participant:
“When assembling the door, Nichols accidentally nails the transversal plank at a 36º angle. The measure of angle ARN equals 36º. What is the measure of angle BAR?”
Reading

ARN is 36º because that’s given. And BAR is going to be 36º also, because of alternate interior angles. That’s because of measure angle ARN.
Content Inferences

“In the diagram below line AB is parallel to DC. If the measure of angle ACD is equal to 66.1º and the measure of angle ACB equals 44º, find the measure of angle ABC.”
Reading

Experimenter:
You seem a little unsure. What were you thinking?
Prompt

Participant:
I couldn’t find the question mark for where ACD was.
Error?

Tips for Coding
•Video/Audio
–Listen at LEAST 2 times before coding.
•Text transcripts
–Listen to audio while reading the transcript - this will allow you to hear the inflection of phrases that could mean a variety of things
•Context and intonation are important
–Then read at LEAST 2 times before coding

Videos to Critique
Usability Test: Would benefit from more verbalization
http://www.youtube.com/watch?v=SFwU_rvMBaE
Usability Test: Watch out for creating a conversation!
http://www.youtube.com/watch?v=Y_rKE0O7tek
Good information (but don’t get user to design!!)
http://www.youtube.com/watch?v=Jpt3qz1gtXI

Wednesday, March 11, 2009

Think-Aloud Protocol

Think Aloud Protocols (Methods)

What is a Think-Aloud Protocol?
•Participants report (verbally) all task-relevant thoughts as they complete the task
–I prefer concurrent report
–Interviews: long-term version of retrospective protocols
–If you must use retrospective reports, use small time intervals! You don't want them to have to remember what they thought about things from the very beginning of the task.
•Experimenters prompt users to produce (continuous) verbalizations, in a neutral manner
–“Don’t forget to keep talking”
–“What are you thinking?”
- But don't try to get them to say what you want them to say. Be neutral.

Getting Users Talking
•Tell them what they are going to be asked to do, and why
–We’re going to be asking you to “think-aloud,” which just means that we want you to say anything that’s in your head as you’re doing the task. This includes ideas, questions, frustrations, confusions, or comments as you work. Basically, you’ll be giving us your “stream of consciousness” thoughts as you work on each task.
Some people find this easy, but a lot of people find it weird –especially at first. But it’s the best way for us to get really good data on what is helping you and what is causing you problems as you work with the system. We’ll practice a little to help you get started.
•Always model the behavior FIRST.
–Users feel ridiculous
–Users need to hear what good thinking-aloud sounds like
–Need to model the level of detail required
•Practice! Need 2 practices (MINIMUM).
–Can use 1 practice, and provide feedback in the first few “throw-away” screens if you have them. Do not model or practice on your actual task. Use something else.
Model at the most detailed level possible. People will gravitate to less detail on their own, so model with a lot of detail. Give them feedback when they practice, encouraging them to give you plenty of detail.

Practice Tasks: Option 1
•Mental Arithmetic
•Describe your thinking as you mentally solve an addition (or subtraction*) problem.
*Some users will find you unusually cruel and heartless, especially if the problems are too difficult.
49+56=

Practice Tasks: Option 2
•The Windows Walk
•Prompt: Imagine walking through your house or apartment. Go through each room, describing and counting the number of windows that you find.

Setting Up for Think-Aloud
•Position yourself behind and to the side of the user
–Peripherally visible, but not “in the action”
–Users are supposed to generate a verbal stream of data, not communicate with you
–If users ask you questions, praise them and encourage them to keep voicing those questions (even though you can’t answer them)
•Have them read their task scenario first
–E.g., “You are a 4th grade teacher who is trying to set up the initial gradebook for your class. You want to …”
–Make sure the task is available on paper, for constant reference
•Remind the user of the instructions:
–Now we’ll start working with the system. Remember, just say whatever comes into your head, no matter how silly it seems to you. All that data is really useful in helping to improve the system.

Prompting the User
•Try to stay neutral
–Don’t ask ‘why’
–Don’t react to errors or successes
–Try to get them to forget about you!
•Prompt as needed (but keep it easy and breezy)
–“Don’t forget to keep talking”
–“What are you thinking?”
–“Can you say more about that?”
•Especially when they start to say something interesting but stop! This happens a lot…
•You can also just repeat the last bit of what they said as a question. Participant: “So… [trails off]” Experimenter: “So?”

Kids vs. Adults
•Both
–Concerned about looking “stupid”
–Vary widely in how naturally they keep talking
•Kids
–Generally more reluctant to talk aloud (prompt more)
–Mumble
–May need more reassurance

*If a user really can't complete the task, set a time limit, then just have them stop.
You can stop the entire thing and be done, but if the parts of your system are separate enough, get them to the next task and start there.

Praise, Praise, Praise
•Uncomfortable users = QUIET users
•Use praise for process liberally during practice and early in the task/study
•Sneak it in while system loads, or there are natural pauses
–“You’re doing a great job of thinking-aloud! Keep up the good work!”
–“You’re a natural at this” or “That’s great! Just what we need. I’ll keep prompting you to help.”

Example Task Scenario
Jack is a 10th grade science teacher who has volunteered to fill in for Jan, a 7th grade science teacher, while she’s out sick. Jan was supposed to teach her class about changes in the Earth’s surface this week. She suggests Jack come up with a classroom activity based on changes in the Earth’s surface.
One of the topics that Jack teaches in his 10th grade class relates to earthquakes. He wants to teach the 7th graders something related to this topic. Jack often uses DLESE in order to find activities and detailed text on material he teaches in his 10th grade class. He decides to check out what DLESE has to offer. He wants to find out which concepts he needs to teach the 7th graders, in addition to a classroom activity that supports these concepts.
(Have www.dlese.org open to begin the task.)

*If a site has a lot of text, prompt participants to read aloud so you can follow where they are on the site.

Other Tips
•Have a bottle of water for your participant
•If you have a long session planned, give them breaks
•Keep things upbeat and friendly
–If the user gets down, tell them “We are learning so much from your data! You are doing a great job for us in this study!”

Record the Session
•I like Camtasia Studio
–Screen-capture + Voice recording (synched)
–Free trial for 30 days at:
http://www.techsmith.com/camtasia.asp
–1 year license through U of U Software Licensing
•$25 download, $30 for CD
https://software.utah.edu/osl/detail.shop?productId=1308

Create Videos
•Anonymity largely protected (voice is the only identifying info)
•Powerful in highlighting problems, processes

Wednesday, February 18, 2009

Mockups & Prototypes

Feedback on Scenarios
•The “Goldilocks” Issue
–Too Detailed:
“Susie uses the mouse to move her cursor into the address bar. Once she sees the blinking bar in the address bar, she uses keyboard to type the URL and hits ‘Enter’ to bring up the site.”
–Too High-Level
“Susie opens the website to search for lesson plans.”
–Juuuuust right:
“Susie decides to go find lesson plans using uen.org. She opens a web browser and types the URL (www.uen.org) into the address bar.”

•How to decide on the right level of detail?
–Could someone who has never seen your system imagine your persona using it?
–You can assume that you don’t need to describe very common knowledge
•E.g., what it means to “click” or “type” or change text
–Does your scenario inform your design?

Mockups
•Definitions
–Pictures of screens as they would appear at various stages of user interaction
–Very early prototypes made of low-fidelity materials
–Prototypes with limited interactions and content
–Typically Paper (or Cardboard)
•Drawn or Created with software tools
(use something simple that you are most comfortable using)
–PowerPoint, Photoshop, Pencil, SmartDraw
OmniGraffle is popular for Macs
•Use different pages for new windows
Mockups = Low-Fidelity Prototypes = Paper Prototypes

Example Mockups
PDA travel application
http://www.youtube.com/watch?v=Bq1rkVTZLtU&NR=1
Website design (Not in English …)
http://www.youtube.com/watch?v=4WbxisFDYk4&feature=related
http://www.youtube.com/watch?v=AtfWM2jRS2w
Google Reader (Demonstration Prototype)
http://www.youtube.com/watch?v=VSPZ2Uu_X3Y

•Purpose
–Facilitate design evaluation BEFORE spending lots of time and money on a high-fidelity design
•Reduces programming time
•Can save money by identifying changes early
–Concrete design improves feedback/evaluation
•Prototyping: Same quality with 40% less code & 45% less effort (Boehm, Gray, Seewaldt, 1984)

•Use whatever is fast and easy for *you*
–Hand drawn?
–PowerPoint?
–Photoshop?
–Pencil (add-on to Firefox)
–Supports rapid paper prototyping for web/software interfaces
https://addons.mozilla.org/en-US/firefox/addon/8487

Local vs. Global Mockups
•Local
–Target selected aspects of the system
–Identify KEY issues. What is most tricky?!
•What terms to use in menus
•How to provide feedback, etc.
•Global
–Overall interaction with your system
–More holistic in nature
–Covers numerous tasks with the same prototype

Vertical vs. Horizontal
•Vertical <-> Local
–Implement one section of your system in detail
–Leave the rest as low-fidelity
•Horizontal <-> Global
–Implement the high levels of your entire system
–Make all high-level features interactive
–Leave in-depth content unspecified
•E.g., actual description of grants, actual help files

High-fidelity Prototypes
•Also known as
–Prototypes
–Version 0 systems
•Use after you have clarified your design requirements
•A working release of your system
•Developed with same tools as final system
•May lack some functionality/content

Wednesday, February 11, 2009

Design & Evaluation Plans

What is a Design Brief?
•Written plan
–Focuses on the design you want to achieve
•Captures goals you want to meet
•Identifies key components of success
•Explores challenges you’ll need to meet/resolve
–Ensures communication about priorities, milestones, and criteria for success
–Synthesizes important data & input
–Acts as a point of reference during the design process
•Does the solution fit the problem?
•How to prioritize issues in the design?

Components: Design Briefs
•Team Members
•Statement of Problem/Challenge
•Activities to Date
•Summary of User Interviews & Participant Profiles
•Design Priorities and Issues
•Quick Review of Related Products/Systems
•Persona(s)
•Task Statements
•Evaluation Plan
•Exposures
•Schedule of Design & Evaluation Activities
(Examples in blue are from the very first design brief I ever wrote.)

Team Members
•Specify the names and email addresses of all team members
•(If working alone, just name yourself!)

Statement of Problem/Challenge
•Should be one paragraph or less (short & sweet)
–What is the design challenge?
–What are you trying to accomplish?
–What will be the end result of your efforts?

The Center for Spoken Language Research (CSLR) has developed a series of interactive books and tutors that are designed to support the development of reading processes in children. Although there are a large number of existing, pre-defined tutors that ask the child to perform various tasks, it is not clear how to represent student performance in a clear and functional way. We will design the data display screens for the CSLR software so that it is useful and intuitive for literacy teachers working with these students.

Activities to Date
•Meetings with Project Sponsor/Stakeholders?
–Summary
•User Interviews
–Summarize the number of interviews and the general user profile (e.g., To date, we have conducted 5 interviews with technology teachers who have varied levels of experience in schools)
–Interview Questions: OK to provide the most current set of questions.

Summary of User Interviews
•Provide a summary of your user’s activities, needs, and concerns.
•These draw upon your “key themes”, but are described in detail.

All the teachers we spoke to tend to work with small groups of children at a time (between 1 and 8) and keep written records on their students. By law, the teachers are required to set yearly goals for the students, which are formalized in an IEP (Individual Education Plan: developed for all Special Education students, and reading is one component of this plan) and/or an ILP (Individual Literacy Plan: developed for students who are specifically behind in reading; these students may or may not be in special ed). All teachers expressed a definite need for representing growth over time. The times in the year at which teachers tested their students’ progress differed, but all were concerned with understanding a child’s growth, and being able to communicate that to others (e.g., parents or the child’s regular teacher)… (ONE OF FOUR PARAGRAPHS)

Summary of Profiles
•OK to summarize in a table
•Always anonymized
–no names or identifying information
•Focus on important characteristics for your user group (these likely will vary by project)
–Age?
–Gender?
–Experience?
•Career
•Computer

Design Priorities and Issues
•What will a successful design need to achieve?
–What is necessary to meet the needs of your users?
–What will be key components of your design?
•What are likely design problems or challenges that you’ll face?
–What do you foresee as barriers to success?
–Where is your design likely to break down?

Design Priorities and Issues
Priorities
•Allow teachers to compare growth over time.
•Break down data into useful categories: skills, types of errors, etc.
•Support comparison of performance to specific goals for the student.
•Support materials that can be used in conferences with parents.
•Grade level assessment broken down into component skills.
•Selected information from teacher report summarized for parents.
•Support comparison to benchmarks, rubrics, grade-level standards, etc.
Issues
•The CSLR wants a basic design template for data design, but diverse data will be produced from the interactive books and tutors. Can a single format accommodate diverse types of data? These include:
•Comprehension data from interactive books
•Skill data from interactive tutors
•Performance data (such as fluency speed)
•Teachers not only want summary data on student performance, but also strongly desire a categorical breakdown of errors and performance. This categorization may be variable based on student performance and may require “smarter” technology than currently exists.
•One teacher has mentioned that the use of “Excent” – a database program to track special education students’ progress – is being introduced in the school district. If the use of this program becomes standardized, reports from the interactive books and tutors should easily export to this program.

Persona(s)
•You may include multiple personas, but you should identify your primary user(s).

Task Statements
•Task Statements
–Not a task analysis
–Choose 3-5 tasks that your users will need to accomplish with the system. Use your persona(s) to describe those tasks in a rich, narrative manner. (In your final report, these will become full scenarios).

Dorothy is a reading specialist who works primarily with small groups of reading-disabled children outside of their regular classroom. She is concerned about the progress of Brett, a grade 2 student. Brett seemed to be making steady progress when she started working with him, but she feels as though he may not be showing the same level of growth in his phonological awareness this past month. Dorothy wonders if there is one particular skill, like the pronunciation of long vowels, that is causing his slower progress or if the problem is more general. Dorothy would like to compare Brett's recent progress with his long-term growth and to find out if Brett is performing at the same level for all the component reading skills in the suite of interactive books and tutors he has been using.

Review of Related Products/Systems
•Identify 2+ systems that can inform aspects of your design
•Review these systems based upon your design priorities & issues
•Hint: Screen shots often will be helpful in your analysis

Although many existing software programs advertise that they provide “progress reports” on children’s performance (e.g., Reader Rabbit, Reading Blaster, Accelerated Reader), the form of these progress reports does not meet the needs expressed by literacy teachers. For example, in “Read, Write, and Type,” if the child selects to complete the auxiliary skills section (which is optional) they receive a score for typing accuracy (the number of letters the child typed correctly / the total number of letters typed) and a score for speed. However, there are no means of specifically evaluating areas where the child may be experiencing difficulties. The only way to discover where the child might be having trouble is to sit with the child and watch his or her progress as he or she plays with the program. This type of assessment clearly does not meet the needs of reading teachers, who express a desire for specific skill-based assessment as well as time-saving features such as summarizing and listing errors.

Evaluation Plan
•What will you do to test your design?
–High or low-fidelity prototypes?
–E.g., Cognitive Walkthrough or Heuristic Analysis?
–Testing with users?
•Who?
•Where?
•When?
–OK if this changes. Create a general plan for now and revise as needed.

Exposures
•What problems will you need to solve before/during system testing?
–Provide an anticipated solution or fall-back plan

•Seeding the prototype with realistic data (and data that can be interpreted independently of teacher observations of a student).
•One teacher has volunteered to share a “running record” with us. This may give us a good base for realistic and appropriate student data. However, it may be difficult to translate this qualitative record into quantitative data for testing.
•Meeting teachers at their schools for prototype evaluation may cause problems with platforms, software, & computer access. Teachers may not be able to come to campus for testing.
•One group member has a laptop available for use. We may use the laptop to provide a consistent computer environment even if we need to travel to the teacher’s location.

Schedule of Design Activities
•What are your key HCI milestones?
–User Interviews
–Prototype (low or high fidelity)
–Internal Evaluation (e.g., Cognitive Walkthrough)
–User Testing
–Final Report
•What are the tasks for each milestone?
•If in a team, who are the responsible members for each task?

Schedule of Design Activities Example
Deliverables                                      Completion Date   Responsible Member(s)
User Interviews                                   3/07/2009
  Create List of Interviewees                     2/12/2009         Butcher, Baker
  Finalize List of Interview Questions            2/16/2009         Candlestick Maker
  Schedule Interviews                             2/20/2009         Candlestick Maker
  Conduct All Interviews                          2/27/2009         Butcher, Baker
  Develop Key Themes, Summarize for Final Report  3/07/2009         Butcher, Baker, Candlestick Maker

Work on your design brief!
–Design briefs are due February 25th.

Wednesday, February 4, 2009

Interviewing Follow-Up, Scenario-Based Evaluation

Persona Exercise
Interviews:
Hardest Parts:
  • designing interview questions
  • non-leading questions
Suggestions:
  • open-ended questions to just get them talking
  • ex. How do you usually perform this task?
  • If there is an existing product like the one you are creating, get them to use it and watch them. How are they using it? What problems do they encounter?
  • If the interview goes in an unexpected direction, go with it for a while and find out why.
  • It's okay to ask more specific questions that are related to your project, just avoid leading them and making them feel like they are supposed to give a specific answer.
  • It really helps to record your interviews, but also take notes so you can have a quick record if you don't have time to listen to the entire interview again.
  • If working as a team, one person can interview and one can record. It allows the interviewer to be more focused on good follow-up questions and the person rather than writing.
  • Interview 6 to 10 people for your capstone project.
Deriving the Persona:
  • If you interview enough people, the key themes tend to come together.
  • You definitely may find that there are a couple personas, but you are going to need to decide on one persona for your design at least to begin with.
How did you decide who to interview?
  • Develop a profile: who are the general users? What user characteristics are relevant to design and testing? What characteristics should all users have in common? (ex. Do they all have to be able to use e-mail?) What characteristics will vary? How will you define them?
  • Try to interview between 6 and 10 people. Fewer makes it hard to identify patterns; more makes the information less useful.
  • Having trouble finding people? Ask the first few users to suggest others.
  • Do contextual interviews. Go to the users' relevant environment. (home, office, classroom)
How did you decide what to ask?
  • Interview as a team. One interviewer, one scribe
  • Explain why you are talking to them and they can end at any time for any reason
  • Ask open-ended questions that allow elaboration
  • Look for opportunities to probe interesting responses
  • Be specific in getting how people already do tasks: verbal - tell me about the last time you... observational - walk me through how you'd....
  • You can change/add questions as needed
  • Ask about contrasts: best experience? worst? likes? dislikes?
  • Last question: Can I contact you for a follow up interview? (It's okay if they say no)
  • Don't use yes/no questions
  • Be wary of leading questions -- see the Yes, Prime Minister YouTube clip
  • Avoid speculative questions: (ex. Would you use an electronic calendar?) If you must ask, do so at the very end of the interview. Don't put too much stake in this answer.
  • Avoid specific design questions. Look for data to inform your design. Don't expect your interviewee to be able to solve your design problems.
What lessons learned for Capstone project?
  • It will be beneficial to interview 6-10 people.
  • Develop a user profile to help you decide who to interview and what questions to ask.
  • You may have to narrow your project to one persona to be your primary persona.
  • Are you going to focus on novice, intermediate, or advanced users? If the system is optional, you can focus on whichever group you want. But, if your system is required of a larger group of varied users, you may need to focus on the novice and just put in some short cuts to help the expert move through more quickly. (For me - links to get to what they want quickly, but detailed instructions for those who need it.)
  • Encourage and reassure your interviewees (and later your usability-test participants) so they don't get nervous and feel like they're not doing a good job.
  • Adjust your questions for future interviews to avoid problems encountered in earlier interviews.
Analyzing your interview data:
  • Flesh out notes from interview IMMEDIATELY.
  • In teams, have each member start by analyzing separately. Look for patterns that concern design priorities, design issues, and deal-breakers (show-stoppers). Reconcile key themes identified by each team member.
  • Use data to identify priorities
  • Then derive tasks
Now you have personas & tasks … now what?!

Scenario-Based Evaluation
What is a scenario?
•Task + Persona = Scenario
•Idealized but detailed
•Describes a specific instance of interaction
•Tries to balance generality & accuracy
–Use persona
–Use common, general tasks
–Situate use in your design
•Scenario-Based Development
–Scenario-Based Evaluation
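The "Task + Persona = Scenario" idea above can be sketched in code. This is a minimal sketch: the persona and task values are placeholders loosely echoing the class example, and the function name is invented.

```python
# Hypothetical persona and task; the point is that a scenario
# situates a common, general task in one specific persona's context.
persona = {"name": "Striving Stuart", "role": "middle-school student"}
task = "create the wind slides for a group presentation"

def make_scenario(persona, task):
    """Combine a persona with a task to seed a scenario narrative."""
    return f"{persona['name']}, a {persona['role']}, needs to {task}."

scenario = make_scenario(persona, task)
```

The one-sentence output is only the seed; a real scenario fleshes it out with realistic details from data collection, as described below.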

How to write a scenario
•Describe the situation of use that people (e.g., your persona) would experience
•Write it for what your persona would want (or need) to do
–Several scenarios for common tasks
•Include specific, realistic details from your data collection
•Scenarios tell a short story
–Represent the conditions, situations, and people that you’ll be designing for

*It's okay, initially, to have your scenario be idealistic because you will analyze problems with the system and where reality deviates from the scenario later during the analysis phase.

Scenarios for Design
•Usually, a collection of scenarios are used
–Should represent key priorities of your design
•Scenarios help you perform evaluations without the users
–Cognitive Walkthroughs
•Scenarios help justify & communicate decisions

HCI Exercise #3: Scenarios
•Due next week
•May work in teams
•Steps
–Use persona you created for HCI Ex #2
–Develop two common tasks
•Use an educational technology of your choice
•If possible, use your Capstone project
–Write two scenarios of use (one for each task), describing how your persona would engage with the technology to perform the task(s).

My Group's Practice Scenario:
Striving Stuart is currently enrolled in the Earth, Space and Geophysical Science course. His teacher, Ms. Rawlings, has just finished presenting the material on the forces that cause erosion to the class and assigned groups of students to work on a related project activity. Stuart needs to work with two other students, Bitter Betsy and Enigmatic Emily, to create a slide presentation reviewing the forces learned in class. Stuart is responsible for creating slides about wind, Betsy will create the slides about water, and Emily is in charge of slides about temperature. Stuart secretly has a crush on Emily and plans on sneaking in comments about how hot she is on her temperature slides. Betsy is very jealous of Emily. She plans on throwing a bucket of water on Emily. Since all three students have different schedules and they have existing Gmail accounts, the group decides to create their presentation on Google Docs. Stuart creates a new presentation on Google Docs. He invites his group members to join the presentation via email. Using the information from his class notes, Stuart creates his slides about wind while the girls create their own. After every member has submitted their information, they add photos, proofread each other’s slides, and edit. Stuart watches to see when Emily signs on so he can daydream about her typing. He knows she is there because her name comes up on the screen. He waits until she is finished with her slides before trying to edit them since two people can’t edit the same slide at the same time. In class, the group simply pulls up their presentation on the web and each member takes turns speaking about their slides. The class enjoys the slide show, particularly because Stuart started the presentation with an “Earth, Wind and Fire” music video. After the presentation, Stuart asks Emily if she wants to ditch 8th period to go surfing with him.

Focus on key points of the system. Not to the point of task analysis, but do mention the general things he will do to accomplish the task.

Wednesday, January 28, 2009

Personas

Personas: From Theory to Practice
Question: The article mentions that some practitioners go to great lengths to represent their user as accurately and in as much detail as possible during their design process. Some even create "posters, websites, and real-size cardboards" of their persona. Do we need to go that far in creating our personas? What is the benefit of going to such lengths?

Discussion: The more detail and effort you put in, the more concrete your persona will become. This will help you design better, especially if you are on a design team.
You could overdo it and waste time, money, and resources that aren't necessary.
What about recording the persona on a video? Either just describe the persona or hire an actor to portray the persona?
The more real it feels to you, the more detailed your design will be. Maybe not a life-size cardboard cutout, but detailed.

Q: The section "What Shapes a Persona?" quotes a designer saying, "Every product we build is a product we build for ourselves to solve our own problems." The figure (Figure 2) shown in this section also shows that, in practice, the designer's ideas play a large role in shaping a persona. To what extent do designers put themselves into a shaped persona?

D: Can you entirely separate yourself from a persona? Maybe not entirely.
It's hard to pull yourself out of it.
You are not the user.

Q: Do you get better Instructional Design when your group has concrete, defined personas, or can the design be just as good, or better, when given more creativity using no personas, (or unspoken personas)?

D: How do you know when you have the right persona? Your user analysis and research should lead you to the right persona.
If you only have one persona, are you leaving out a lot of your audience? Using multiple personas can help. You also have to realize that a business model is ok with leaving a few users behind if they don't fit your personas. This is contrary to what we do in education.

Q: Is there a time when we would choose not to use a persona?

D: Maybe not, but there are certainly times when it is difficult to pin down who your persona should be, such as when a large, diverse group of people is required to use a system. Mass use of a product creates a wide range of users.

Child Personas: Fact or Fiction?
Q: Can a ten-year-old child explain how they think to an adult? If not, are personae an adequate substitute? If not, what other methods could be brought into the process in order to improve the project?

D: You can have very good results, but there are some ways to make it more successful. If the kids have a relationship with the interviewer and are comfortable giving their opinions, you will get better responses. If you didn't know the kids already, you would benefit from warming them up to you.
An unbiased, uninvolved interviewer (or at least someone the kids think doesn't care whether they liked it or not) might help because the kids would be willing to say what they really felt without worrying about offending the creator.
It seems more important to observe kids because they may not have the words to express their thoughts and they may have more surprising interactions with the program.

Q: Do you think we need to be more careful about putting our own views into child-personas than we need to be with adults?

D: I think so. I think if "you are not the user" as an adult, you are REALLY not the user when the user is a child. They are different than you were as a kid. You don't remember things as well as you may think. Designers may not be used to working with children.
I think it is a mistake not to use a child-persona. I think a child-persona will really help and can be created.

The Origin of Personas
Alan Cooper (1999, 2004) "The Inmates are Running the Asylum"
  • Personas Goal: "Develop a precise description of our user and what he wishes to accomplish"
  • User is a resource, but won't know the solution!
  • "Make up pretend users and design for them"
  • Failure = design for broad, abstract groups
  • Success = design for a single (archetype) user
Personas: Definition
Description of a specific, fictitious person
-Written in the form of a narrative
-Represents gathered info about a group of users with common characteristics (single users are too quirky!)
  • Usually given both a name and a face
  • May contain personal information (family members, friends, occupations, possessions) to make the persona more "real"
  • Focuses on the goals, needs, and frustrations of the persona when using the product being designed
-3 to 7 personas usually created for a project (three is probably good for our capstone project)
  • Some advocate using one primary persona
Personas: Key Considerations
"Pretend" but not "made up"
-Based on data with users
  • Interviews - phone or face-to-face interviews can be better than email surveys because people self-correct in writing, while talking keeps their responses free-flowing
  • Observations - watch them use the system
Presented as a story about a believable person
-Project team should refer to the persona by name
  • Stop talking about abstract "users"!
Focused on enabling effective design decisions
-Should explicitly define the needs, goals, and frustrations of the persona
  • Designers should be able to infer what features are needed and how they should be designed
What are personas good for?
Assisting communication
-Easier to talk about "James" and his needs
-User is too abstract -> doesn't drive decisions

Informs design decisions
-What does James need to do with the new system?
-How do you meet James's goals?
-How do you resolve James's frustrations?

Supports design evaluation
-Where will you trip up James?
-Will he know what to do? How to interact with the system?
-Will he even use the system?

Personas: Drawbacks?
Bad personas won't help you

Some consider them too "artsy"

User interviews can be costly
-Recruiting users
-Conducting interviews
-Transcribing protocols
-Time to analyze data, extract themes
-Some estimates (Forrester Research): $47,000 for commercial apps

Creating Personas
Interview potential users. Take good notes! Or record it.

Identify key observations ("factoids")
-10-12 per interviewee is typical

Sort individuals into groups based on observations

Cluster key observations from multiple interviewees
-Look for patterns/themes
-Typically, 3-4 characteristics from each person are relevant to the group
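The clustering steps above can be sketched as a small script. This is only an illustration: the interviewees, theme labels, and factoids are all invented.

```python
from collections import defaultdict

# Hypothetical factoids, each tagged with an interviewee and a theme.
factoids = [
    ("Ann",  "frustration", "Can't find the gradebook export button"),
    ("Ben",  "frustration", "Gets lost navigating between lessons"),
    ("Ann",  "goal",        "Wants to finish grading in under an hour"),
    ("Cara", "goal",        "Wants students to work without her help"),
    ("Ben",  "frustration", "Forgets where settings live"),
]

def cluster_by_theme(factoids):
    """Group observations so patterns across interviewees stand out."""
    clusters = defaultdict(list)
    for person, theme, note in factoids:
        clusters[theme].append((person, note))
    return clusters

clusters = cluster_by_theme(factoids)

# A theme mentioned by more than one interviewee is a candidate pattern.
patterns = {theme for theme, notes in clusters.items()
            if len({person for person, _ in notes}) > 1}
```

In practice this sorting is usually done with sticky notes or index cards rather than code, but the logic is the same: cluster, then look for themes shared across people.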

Interview Data
-Look for common goals
-Look for common frustrations
-Look for common perspectives, approaches
  • Technophile vs. Technophobe?!
Observational Data
-How do users interact with existing technology?
-Do they take shortcuts?
-Frustrations? How quickly do they opt-out?
-Do they know how they use things? (Are they actually doing what they said they do when you interviewed them?)

Persona Template
Name
Role
Daily Tasks/Relevant Experience
Likes
Dislikes
Goals? Needs?
Frustrations?
What interview questions should we ask?
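One way to keep the template concrete is to record it as a data structure. This is a minimal sketch; the field names mirror the template above, and every sample value is invented ("James" echoes the example name used in the notes).

```python
from dataclasses import dataclass

@dataclass
class Persona:
    """Persona record following the class template."""
    name: str
    role: str
    daily_tasks: list   # daily tasks / relevant experience
    likes: list
    dislikes: list
    goals: list
    frustrations: list

# Invented sample values for illustration only.
james = Persona(
    name="James",
    role="Middle-school science teacher",
    daily_tasks=["Posts assignments online", "Answers student email"],
    likes=["Tools that save time"],
    dislikes=["Long sign-up forms"],
    goals=["Get make-up work to absent students quickly"],
    frustrations=["Re-entering the same info in multiple systems"],
)
```

Writing the persona down in a fixed structure like this makes it easy to check that every persona explicitly states goals, needs, and frustrations before design begins.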

Wednesday, January 21, 2009

Usability, HCI, Task Analysis

Usability

1. (Jennifer Hogle) According to Smith and Ragan (2005), the goal of instructional design is to create instruction that is "effective", "efficient", and "appealing". On page 27 of the Leventhal & Barnes chapter, usability is defined as, "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use." What comparisons can be made between usability and instructional design? How can understanding the process of instructional design help us in understanding usability?

*If you have bad ID, there probably wouldn't be good usability.
*If there is good ID, there is probably going to be good usability.
*Usability is a subset of ID. ID is the bigger umbrella. It's possible that in your ID process you would decide not to use computers at all. (Of course, usability can refer to non-computer-based tools too.) But usability, although part of the whole ID process in some aspects, is more focused on the final product. It seems like a smaller part of the entire ID.
*Is there ever a time when good ID will violate good usability?
Maybe when it is done on purpose -- making something hard so that the user has to struggle a little to learn more. (Cognitive Load) But this would be intrinsic cognitive load, not extrinsic.

2. (Joanna Gibb) Nielsen & Shackel are similar in that, by their definitions of 'usability', a system is valid or usable if it is found useful. How many systems are usable and easy to learn, but have no use once they have been learned? How useful will the final project be once it's completed? Easy to access only, or worthwhile and applied once it's been presented?

*There are times when it seems a system has been created that really is of no use to the user. There was no task match. There wasn't the need for it. They didn't use participatory design.

HCI

1. (Kevin Dolan) The study and research of HCI is evident as far back as the 1980s - even as a multidisciplinary science and investment topic. Can we determine that there is measurable progress in the evolution of HCI over the years; i.e., is the software and computer development world improving the effort to connect humans and computers, or are we just getting better at paying lip service to this area of interaction?

*Companies are having to pay attention to what customers want and need. If they don't pay attention to HCI, they don't sell as much and people will go to another product.
*Web 2.0 has made user wants and needs more obvious. People are "talking" online about their opinions, wants, needs, etc. and they can't be as easily ignored or unknown.
*It seems that education is a niche that gets ignored. HCI principles aren't used as well because of limited funding and because designers don't feel like they have to. Things are also designed top-down and then mandated to the users. They aren't driven by marketing, but the irony is they are in fact wasting money if they make a system that people won't really use.

2. (W Scott Slade) If there is too much fragmentation, too many theories, too many methods, too many application domains and too many systems, is simplification possible?

3. (Shannon Ririe) Is there/should there be a committee or organization that defines what HCI is or isn't?


*Remember: You are not the user! You can't ever think like a novice again. It is really important to involve your user in the design process.


Tasks

Usability for Ed Tech
•Not just whether system is used (cf., Eason)
•System Functions should include
–Outcome of use(!)
•Task Characteristics should include
–Desirable difficulty
•Task match will need to include relevance to mental models/cognitive processes

Usability Engineering
•Works best early in design/development
•Good interface won’t solve all your problems
•HCI Usability Engineering builds on foundation of:
–Task
–User
–Context of use

Developing Useful Systems
•Slightly more than 30% of the code developed in application software ever gets used as intended
•Likely because developers do not understand what users need

Maybe also because the basic tasks were made usable and the rest was not

Maybe because the designer got excited after the basic user needs were designed and added a whole bunch of other stuff

Maybe it was too difficult to find users to test the extra/advanced stuff

The question is: do you want it this way so that as users become more proficient they can move on? Or does the extra stuff get in the way?

Useful Systems Support Tasks
•Task ≠ User can use the system
•Tasks:
–Specific
–Observable
•Will you know if you’ve successfully accomplished it?
–Reflects end-goal of a user session
–Relate to key aspects of system components
•What do you think are the key parts of the system?
•What will be frequently used?

–Hugh Beyer and Karen Holtzblatt, "Contextual Design: A Customer-Centric Approach to Systems Design," ACM Interactions, Sep+Oct, 1997, iv.5, p. 62.

What is a Task Analysis?
•Analysis of how a task is performed
–Detailed description of behaviors in interface
•Highly detailed
•Step-by-Step
•Procedural (ignore mental processes for now)

How is Task Analysis Useful?
•Specify problems/gaps in process
•Highlight unnecessary or inconsistent steps
•Specifies procedural aspects of key tasks
•Requires concrete analysis of user actions
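A task analysis is, at bottom, an ordered list of observable, step-by-step actions. The sketch below records a hypothetical walkthrough of the keyboarding exercise mentioned below ("log in and practice a lesson") and shows one way a written-out analysis exposes gaps; every step and keyword here is invented for illustration.

```python
# Hypothetical step-by-step record of one task in a keyboarding program.
task = "Log in and practice a keyboarding lesson"
steps = [
    "Open the program from the desktop icon",
    "Click the 'Log In' button",
    "Type user name into the name field",
    "Type password into the password field",
    "Click 'OK' to submit",
    "Select 'Lesson 3' from the lesson menu",
    "Click 'Start' to begin the drill",
    "Type the on-screen prompts until the drill ends",
]

def find_gaps(steps, required_keywords):
    """Flag required interface actions missing from the recorded steps."""
    return [kw for kw in required_keywords
            if not any(kw in s.lower() for s in steps)]

missing = find_gaps(steps, ["log", "lesson", "save"])
# 'save' is missing: no recorded step saves the student's score,
# which is exactly the kind of gap a task analysis should surface.
```

Laying the steps out this concretely is what lets you spot unnecessary, inconsistent, or missing steps before evaluating the design with users.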


For next week:
•DUE: HCI Exercise: Task & Task Analysis Exercise - Something in education - Keyboarding for kids? - goal is to log in and practice a lesson
•Read: 2 articles on personas
•Post: 2 questions per article on WebCT discussion area
–Due by 12 noon on day of class

Wednesday, January 14, 2009

First Day of Class

Class Objectives
Define human-computer interaction
Define usability and its relation to HCI
Identify important considerations for education technology

What is Human-Computer Interaction?
What you're doing with the system
Your experience with how those interactions are going
How you are going to accomplish your goals with the software
Depends on your user goals
Visual appeal, layout, design, perceptual aspects
Bulk and speed of the program
Design so it runs the way we want it to, so it is functional and quick enough
The use of the user's time
Does the interface match what the user is used to and wants and can utilize?
Usability

What is usability? Is it the same thing as HCI?
Usability is only part of the entire picture of HCI
It's how we access a tech tool in the first place so we can interact with it at all
Our standards of usability are rising. We are no longer willing to use things that aren't user-friendly.

Educational Technology: Do our definitions need revising? (If so, how?)
Your target audience might be a lot more specific (i.e., are you trying to reach teachers, students, what age or level?)
Motivation and attention-getting becomes more necessary
Training people to use the technology - how quickly can they learn and be trained?

Formal Definition of HCI:
A discipline concerned with the design, evaluation, and implementation of interactive computing systems for human use and with the study of the major phenomena surrounding them

HCI vs. CHI
Some think we should put the Human first, not the computer, but sometimes you'll still see CHI. :)

Usability:
Usability is typically the goal of human-computer interaction methods
Technology is usable if:
  • appropriate for the target users
  • allows users to accomplish their goals
Usability ≠ user friendly

HCI & Educational Technology
•Usability not necessarily end-goal
–Goals may be educator’s (not student’s)
–Learning processes?
–Learning outcomes?
•Learner Characteristics Complicate Usability
–Prior Knowledge
–Personalized technology

Desirable Difficulty
•Learning can be triggered by impasses (VanLehn, 1991; 1995)
–The “Assistance Dilemma” (Koedinger & Aleven, 2007)
•Key problem for interactive educational technology.
•When to let students struggle, when to provide support
•How much struggle is productive may be personal

Prior Knowledge
•Expertise Reversal (Kalyuga et al., 2003)
–Experts: Visual representations
–Novices: Need plenty of textual instruction
•Expert knowledge structures (e.g., Chi et al, 1981)
–Experts have conceptual knowledge organization
–Integrate incoming information
•Self-Regulated Learners
–Hypermedia requires self-regulation (e.g., Azevedo et al., 2004)

Personalization
•Increases HCI & usability demands
•Intelligent tutors
–immediate feedback
–custom selection of content
–customized hints and help messages
•Automated knowledge analysis
–recommended materials
–provide customized prompts

HCI & Ed Tech: What Doesn't Change?
•Importance of
–task
–users
–scenarios of use
•Useful methods for analysis
–cognitive walkthroughs
–heuristic analyses
–learner interviews & tests

For Next Week:
•2 chapters (one is short)
•Post questions on WebCT discussion area
–Due by 12 noon on day of class