Wednesday, February 11, 2009

Design & Evaluation Plans

What is a Design Brief?
•Written plan
–Focuses on the design you want to achieve
•Captures goals you want to meet
•Identifies key components of success
•Explores challenges you’ll need to meet/resolve
–Ensures communication about priorities, milestones, and criteria for success
–Synthesizes important data & input
–Acts as a point of reference during the design process
•Does the solution fit the problem?
•How to prioritize issues in the design?

Components: Design Briefs
•Team Members
•Statement of Problem/Challenge
•Activities to Date
•Summary of User Interviews & Participant Profiles
•Design Priorities and Issues
•Quick Review of Related Products/Systems
•Persona(s)
•Task Statements
•Evaluation Plan
•Exposures
•Schedule of Design & Evaluation Activities
(Examples in blue are from the very first design brief I ever wrote.)

Team Members
•Specify the names and email addresses of all team members
•(If working alone, just name yourself!)

Statement of Problem/Challenge
•Should be one paragraph or less (short & sweet)
–What is the design challenge?
–What are you trying to accomplish?
–What will be the end result of your efforts?

The Center for Spoken Language Research (CSLR) has developed a series of interactive books and tutors that are designed to support the development of reading processes in children. Although there are a large number of existing, pre-defined tutors that ask the child to perform various tasks, it is not clear how to represent student performance in a clear and functional way. We will design the data display screens for the CSLR software so that they are useful and intuitive for literacy teachers working with these students.

Activities to Date
•Meetings with Project Sponsor/Stakeholders?
–Summary
•User Interviews
–Summarize the number of interviews and the general user profile (e.g., To date, we have conducted 5 interviews with technology teachers who have varied levels of experience in schools)
–Interview Questions: OK to provide the most current set of questions.

Summary of User Interviews
•Provide a summary of your user’s activities, needs, and concerns.
•These draw upon your “key themes”, but are described in detail.

All the teachers we spoke to tend to work with small groups of children at a time (between 1 and 8) and keep written records on their students. By law, the teachers are required to set yearly goals for the students, which are formalized in an IEP (Individual Education Plan: developed for all Special Education students; reading is one component of this plan) and/or an ILP (Individual Literacy Plan: developed for students who are specifically behind in reading; these students may or may not be in special ed). All teachers expressed a definite need for representing growth over time. The times in the year at which teachers tested their students’ progress differed, but all were concerned with understanding a child’s growth, and being able to communicate that to others (e.g., parents or the child’s regular teacher)… (ONE OF FOUR PARAGRAPHS)

Summary of Profiles
•OK to summarize in a table
•Always anonymize
–No names or identifying information
•Focus on important characteristics for your user group (these likely will vary by project)
–Age?
–Gender?
–Experience?
•Career
•Computer

Design Priorities and Issues
•What will a successful design need to achieve?
–What is necessary to meet the needs of your users?
–What will be key components of your design?
•What are likely design problems or challenges that you’ll face?
–What do you foresee as barriers to success?
–Where is your design likely to break down?

Design Priorities and Issues
Priorities
•Allow teachers to compare growth over time.
•Break down data into useful categories: skills, types of errors, etc.
•Support comparison of performance to specific goals for the student.
•Support materials that can be used in conferences with parents.
•Grade level assessment broken down into component skills.
•Selected information from teacher report summarized for parents.
•Support comparison to benchmarks, rubrics, grade-level standards, etc.
Issues
•The CSLR wants a basic design template for data design, but diverse data will be produced from the interactive books and tutors. Can a single format accommodate diverse types of data? These include:
–Comprehension data from interactive books
–Skill data from interactive tutors
–Performance data (such as fluency speed)
•Teachers not only want summary data on student performance, but also strongly desire a categorical breakdown of errors and performance. This categorization may be variable based on student performance and may require “smarter” technology than currently exists.
•One teacher has mentioned that the use of “Excent” – a database program to track special education students’ progress – is being introduced in the school district. If the use of this program becomes standardized, reports from the interactive books and tutors should easily export to this program.

Persona(s)
•You may include multiple personas, but you should identify your primary user(s).

Task Statements
•Task Statements
–Not a task analysis
–Choose 3-5 tasks that your users will need to accomplish with the system. Use your persona(s) to describe those tasks in a rich, narrative manner. (In your final report, these will become full scenarios).

Dorothy is a reading specialist who works primarily with small groups of reading-disabled children outside of their regular classroom. She is concerned about the progress of Brett, a grade 2 student. Although Brett seemed to be making steady progress when she started working with him, she feels as though he may not be showing the same level of growth in his phonological awareness this past month. Dorothy wonders if there is one particular skill, like the pronunciation of long vowels, that is causing his slower progress or if the problem is more general. Dorothy would like to compare Brett's recent progress with his long-term growth and to find out if Brett is performing at the same level for all the component reading skills in the suite of interactive books and tutors he has been using.

Review of Related Products/Systems
•Identify 2+ systems that can inform aspects of your design
•Review these systems based upon your design priorities & issues
•Hint: Screen shots often will be helpful in your analysis

Although many existing software programs advertise that they provide “progress reports” on children’s performance (e.g., Reader Rabbit, Reading Blaster, Accelerated Reader), the form of these progress reports does not meet the needs expressed by literacy teachers. For example, in “Read, Write, and Type,” if the child chooses to complete the auxiliary skills section (which is optional), they receive a score for typing accuracy (the number of letters the child typed correctly divided by the total number of letters typed) and a score for speed. However, there is no means of specifically evaluating areas where the child may be experiencing difficulties. The only way to discover where the child might be having trouble is to sit with the child and watch his or her progress as he or she plays with the program. This type of assessment clearly does not meet the needs of reading teachers, who express a desire for specific skill-based assessment as well as time-saving features such as summarizing and listing errors.
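As a worked example, the accuracy score described above is simple arithmetic: letters typed correctly divided by total letters typed. A minimal sketch (the function name and values are hypothetical, not taken from the actual software):

```python
def typing_accuracy(correct_letters, total_letters):
    """Fraction of letters typed correctly, from 0.0 to 1.0."""
    if total_letters == 0:
        return 0.0  # avoid division by zero before the child has typed anything
    return correct_letters / total_letters

# A child who typed 45 of 50 letters correctly:
print(typing_accuracy(45, 50))  # → 0.9
```

Note that this single number is exactly the problem the teachers identified: it says nothing about which letters or skills caused the errors.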

Evaluation Plan
•What will you do to test your design?
–High- or low-fidelity prototypes?
–E.g., Cognitive Walkthrough or Heuristic Analysis?
–Testing with users?
•Who?
•Where?
•When?
–OK if this changes. Create a general plan for now and revise as needed.

Exposures
•What problems will you need to solve before/during system testing?
–Provide an anticipated solution or fall-back plan

•Seeding the prototype with realistic data (and data that can be interpreted independently of teacher observations of a student).
–One teacher has volunteered to share a “running record” with us. This may give us a good base for realistic and appropriate student data. However, it may be difficult to translate this qualitative record into quantitative data for testing.
•Meeting teachers at their schools for prototype evaluation may cause problems with platforms, software, & computer access. Teachers may not be able to come to campus for testing.
–One group member has a laptop available for use. We may use the laptop to provide a consistent computer environment even if we need to travel to the teacher’s location.

Schedule of Design Activities
•What are your key HCI milestones?
–User Interviews
–Prototype (low or high fidelity)
–Internal Evaluation (e.g., Cognitive Walkthrough)
–User Testing
–Final Report
•What are the tasks for each milestone?
•If in a team, who are the responsible members for each task?

Schedule of Design Activities Example
Deliverables | Completion Date | Responsible Member(s)
User Interviews | 3/07/2009 |
Create List of Interviewees | 2/12/2009 | Butcher, Baker
Finalize List of Interview Questions | 2/16/2009 | Candlestickmaker
Schedule Interviews | 2/20/2009 | Candlestickmaker
Conduct All Interviews | 2/27/2009 | Butcher, Baker
Develop Key Themes, Summarize for Final Report | 3/07/2009 | Butcher, Baker, Candlestickmaker

Work on your design brief!
–Design briefs are due February 25th.
