Users and ethics
Interface Hall of Fame or Shame?
Submit your examples to the Canvas Discussion Board “Interface Hall of Fame or Shame” and I’ll pick a couple to discuss in the lecture.
Today’s learning objectives
• Go through the Assignment 1 specification
• 1) Understand the importance of identifying the correct users.
• 2) Learn to identify users and work with them towards a design objective.
• 3) Understand ethics for research involving human participants.
• 4) Learn to identify and define tasks.
Assignment 1: Diving into the jungle
Assignment 1
★ A group of 3-4
★ Group assignment mark: 15%
★ Peer assessment to moderate the mark based on contributions
★ Individual component: 5% (online assignment interview via Canvas)
Step 1: Choose a platform/product
Pick a project:
★ Choose three similar apps for the chosen project
★ Try not to choose one that is too broad, like Google Maps
★ If you are adamant about choosing a huge platform, narrow your research down to specific elements, e.g. Facebook reactions
★ Pick a direction for your survey.
★ Must be commercial, and have a reasonable number of users.
Step 2: Analyse the product
★ Who are the target users? How many user groups are there? What is its business model?
★ What are the main tasks to be completed? Can these tasks be completed easily and intuitively?
★ Are there any pitfalls? Are these pitfalls due to the violation of usability principles or because of technological reasons?
Step 3: Create your survey
★ Identify the questions: the issues or tasks you want to find out about.
★ Identify your audience: whom you want to recruit to complete your survey.
★ Identify where you can reach your users.
➔ Do not inject biases into your survey.
➔ Do not ask questions just because you want to.
➔ Do not ask for personal information.
Step 4: Online user interview
Along with your survey, you should also conduct online user interviews: each group member interviews one potential user.
Discuss the results of this activity during the online individual assignment interview (run through a Canvas quiz).
You are not doing a product/market survey
★ Your questions should always tie back to the interface and user experience.
★ You are not studying whether features are useful, but how easy it is to understand and access those features.
★ Competitor analysis is OK as long as it’s user experience related.
Step 5: Analyse your findings
★ Collate your findings into common themes.
★ Compare your findings with your analysis from Step 2.
★ Synthesise your findings to answer the research questions you set out with.
★ Compile a report.
Quiz time
• Go to Canvas -> Quizzes -> Quiz 1
Identifying and Working with Users
Why?
• Imagine you’ve been asked to
• study an existing product or interface
• maybe to re-design the interface
• research a product/interface soon to be created
• You need to answer the following questions:
–Who are your users?
–What might they do with your product?
–What do they think is important?
Why? Think of this analogy
Satisfy the cat. i.e. User-centred design
Defining users
What is a User?
• What problem will your product solve?
• Whose problems are those?
Identifying a User
• How do you know who will be using your product?
• Remember that users are:
–Anybody who will gain value from your product by completing a task within it.
Identifying a User
• How many groups of users will your product have?
• Determining this could be more difficult than you think.
Consider – auspost.com.au
• Who are the user groups?
• How would you classify them?
• By type of business? Value of business? By tasks?
(Screenshot: auspost.com.au, 2015 web site)
Let’s begin with: user characteristics
• Instead of trying to encompass each and every possible user, let us paint a picture by using attributes.
• Each user has attributes and there is a spectrum for each attribute.
Personal attributes
• Age
• Gender
• Ethnicity
• Others?
• It is your call to decide if these matter, or if they are set in stone.
• However, they can help when you create personas.
Other attributes – skills
• What sort of skills are required when interacting with your product?
• If you are building a website, your user should perhaps have basic knowledge of how to use a browser.
• What sort of specialised skills are required beyond that?
• Playing a video?
Other attributes – skills
• In fact, what sort of prior experience is expected of your users?
• Back to auspost.com.au
–It assumes that users will be familiar with drop-down menus, and able to learn a simple navigational pattern.
• Manage your expectations.
Other attributes – domain experience
• Are your users expected to have domain experience?
• What is domain experience?
• For example, if you are designing a food finder website, do your users know words like “Japanese cuisine” or “Tapas”?
Other attributes – domain experience
• In addition, your users may have prior experience with your competitors’ products.
• This may shape their expectations of your product.
Finding users
Users, finding them
• Questionnaires
• Interviews
• Observations
• Competitive evaluation
–Identify product
–Ascertain users
–Get access to them
Why are we looking for users?
• Because that is what UCD is about.
• We get the users involved from the start.
• We may not make design decisions based on their input but it is better to be informed than to make assumptions.
Finding users
• If we are already giving out surveys, haven’t we found them?
• Often not.
• A huge reason for the surveys, interviews and observations is to find the right people.
• And finding the right users could either make or break your design.
Finding users
• Can sometimes be difficult to pinpoint if your base is too broad, or inaccessible.
• For example, a very casual mobile game application.
• Who exactly will play it?
• The key is to involve your users early, and to define them.
Finding users
• “Oh, everybody could play my game.”
• Is this assumption correct?
• How does that help design the interface for your game?
Lessons from academic study
Kujala, S., & Kauppinen, M. (2004). Identifying and selecting users for user-centered design. In Proceedings of the third Nordic conference on Human-computer interaction (pp. 297–303). ACM.
Steps to identify and select users:
1. Brainstorm a preliminary list of users.
2. Describe the main user characteristics (including market size).
3. Describe the main user groups and prioritize them.
4. Select typical and representative users from the groups.
5. Gather information from the users and redesign the user group descriptions according to the new information gathered.
Lessons from academic study
–Describes seven case studies
–Very readable (find it on Canvas)
• Lessons learned
–Iterative process
–Diversity of users wider than developers assume
Identifying and Defining Tasks
What is a Task?
• A task is work that needs to be done.
• Cleaning your room, for example.
• But what does it mean in UCD?
• How do we break it down?
What is a Task?
• First, what is the goal of the task?
• In our previous example, it is obviously to have a clean room.
• Next, how do you arrive at the goal? Are there multiple ways to do it?
• Do you hire someone to clean it for you? Or do you design a robot slave to do it?
• Cleaning it yourself is obviously too old school.
What is a Task?
• OK, so what are the steps or subtasks involved?
• Assuming you did it the “normal” way, you would check to see if you have cleaning materials and tools.
• This is a precondition for the task. It needs to be met before proceeding to the next step.
So for a more relevant example,
• Say you are designing email software.
• A precondition to sending an email is for the user to have already logged in.
• A subtask could be for the user to key in the recipient’s email address (see the sketch below).
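As a rough illustration only (my own Python sketch, not part of the assignment or the lecture), the goal/precondition/subtask idea can be written down as a small data structure. The Task class, the user_logged_in check and the subtask names are hypothetical:

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Task:
    """A unit of work with a goal, preconditions and ordered subtasks."""
    goal: str
    preconditions: List[Callable[[], bool]] = field(default_factory=list)
    subtasks: List[str] = field(default_factory=list)

    def ready(self) -> bool:
        # Every precondition must hold before the task can begin.
        return all(check() for check in self.preconditions)

def user_logged_in() -> bool:
    # Hypothetical session check; assume the user has already logged in.
    return True

send_email = Task(
    goal="Send an email",
    preconditions=[user_logged_in],
    subtasks=[
        "Key in the recipient's email address",
        "Write the subject and body",
        "Press Send",
    ],
)

if send_email.ready():
    for step in send_email.subtasks:
        print(step)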
Defining Tasks
• Next you will need to define it.
–Where exactly would the task be performed?
–How often do you think it would be expected to be performed?
–How long should it take on average?
Defining Tasks
• But more importantly…
–How is the task learned?
–How would a fresh user understand your design?
–Is it intuitive? Refer back to identifying your users and their expectations.
Defining Tasks
• Also, what could go wrong?
• This is a loaded question, because it is very easy to make assumptions all over the place.
• For example, what happens if the user spills mac and cheese onto his keyboard while trying to upload a video?
• Relevance.
Defining Tasks
• Who else is involved?
• Does that task require the involvement of another user?
• Or another piece of software?
• What are the risks?
Task study in Microsoft-RMIT Cortana Intelligence Institute
•https://www.microsoft.com/en-us/research/blog/new-institute-explores-future-cortana/
User Research Methods
User experience research methods
What UX methods to use and when?
Requirement stage
• Who are the Users and Customers?
–Survey
–Persona
–Contextual Inquiry
–Stakeholder Interviews
–Competitive Analysis
–Quality Function Deployment
• What are users trying to do?
–Task analysis
–Top tasks analysis
• Reference: http://www.measuringu.com/blog/method-when.php
3-dimensional framework
• Attitudinal vs. Behavioral
• Qualitative (formative) vs. Quantitative (summative)
• Context of Use
Surveys: Challenges
Surveys
• You fill so many out.
• Tempting to think you know how to write them
• Creating a useful and reliable survey is an art form.
• You not only have to know which questions to ask, but how to frame the questions using concise and impartial words.
Survey errors
• Coverage error
–Do you get to everyone using your system?
• Sampling error
–If you only talk to a sample, there is some error (see the sketch below).
• Nonresponse error
–You survey the correct people, but what if not all answer?
• Measurement error
–People don’t answer accurately.
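To make the sampling-error point concrete, here is a rough sketch of my own (not from the slides) of the usual 95% margin-of-error formula for a sample proportion, assuming simple random sampling; the example numbers are invented:

import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    # Approximate 95% margin of error for a sample proportion p from n responses.
    return z * math.sqrt(p * (1 - p) / n)

# Example: 60% of 100 respondents agree -> roughly +/- 9.6 percentage points.
print(round(margin_of_error(0.6, 100) * 100, 1))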
Measurement error
• Ask factual questions that people often answer themselves?
–Generally no problem.
• Ask questions people need to think about?
–The way you ask the question could influence the answer.
Measurement error
• Social desirability (answering so as to be viewed favourably)
–Self-administered
–Interviewer-administered
• Acquiescence (bending to the “view” of the question)
–“Individuals are more to blame than social conditions for crime and lawlessness in this country”: 60% agreed
–“Social conditions are more to blame than individuals for crime and lawlessness in this country”: 57% agreed
–The two statements contradict each other, so both figures cannot reflect true opinion; the overlap is acquiescence at work.
Surveys: Recommendations
Questions
• Preference for multi-choice over free text
–Why?
• 15 examples of useful user feedback questions: http://www.uxforthemasses.com/online-survey-questions/
The Likert Scale
• Features a range of responses, for example (a scoring sketch follows below):
• “The Learning Hub is easy to use”
–Strongly agree
–Agree
–Neither agree nor disagree
–Disagree
–Strongly disagree
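A minimal sketch of my own (not from the lecture) of how Likert responses are often coded numerically for analysis; the 1–5 mapping and the sample responses are invented:

# Map each Likert response to a number; the 1-5 convention is an assumption.
LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

# Invented responses to "The Learning Hub is easy to use".
responses = ["Agree", "Strongly agree", "Neither agree nor disagree", "Agree"]
scores = [LIKERT[r] for r in responses]

# Mean score on the 1-5 scale (here 4.0).
print(sum(scores) / len(scores))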
NASA Task Load Index (NASA-TLX)
• Rating scale widely used for assessing workload
• Commonly used in aeronautical interfaces
• Links
– http://en.wikipedia.org/wiki/NASA-TLX
– http://humanfactors.arc.nasa.gov/groups/TLX/downloads/NASA-TLXChapter.pdf
– http://humansystems.arc.nasa.gov/groups/TLX/downloads/HFES_2006_Paper.pdf
“NasaTLX” by Hart and Staveland – http://humansystems.arc.nasa.gov/groups/TLX/downloads/TLXScale.pdf. Licensed under Public Domain via Wikimedia Commons
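For a feel of how the weighted NASA-TLX score is commonly computed, here is a sketch of my own with invented ratings and weights (see the linked chapter for the official procedure): each of the six subscales is rated 0–100, weights come from 15 pairwise comparisons, and the overall score is the weighted mean.

# Sketch of the weighted NASA-TLX ("overall workload") score; numbers are invented.
ratings = {   # each subscale rated 0-100
    "Mental demand": 70, "Physical demand": 20, "Temporal demand": 55,
    "Performance": 40, "Effort": 65, "Frustration": 50,
}
weights = {   # from 15 pairwise comparisons, so the weights sum to 15
    "Mental demand": 4, "Physical demand": 1, "Temporal demand": 3,
    "Performance": 2, "Effort": 3, "Frustration": 2,
}

overall = sum(ratings[k] * weights[k] for k in ratings) / sum(weights.values())
print(round(overall, 1))  # weighted workload on a 0-100 scale (56.0 here)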
Interviews
Interviews
• Interviews are more expensive than surveys, as they require a higher time commitment from both the developer and the user.
• Compensation for users is sometimes given, which adds to the cost.
• Some interviews are blind, in that random samples are chosen, while a targeted interview will first use a survey to identify target users.
Why Interviews then?
• Offers a more detailed response.
• Able to pick up body language and tone.
• Able to dig a little deeper if you encounter an interesting response.
Types of Interviews
• Formal/Structured Interviews
• Informal/Unstructured Interviews
• Contextual Inquiries
Informal Interviews
• Ideal when you are still exploring the problem space.
• Leaves you space to change and alter questions on the fly.
• A good idea when you have no clear directions as to what to ask.
• Have a conversation with the user.
Formal Interviews.
• Directed approach, when you know exactly what you would like to uncover.
• Has specific questions, e.g. you wish to know whether the navigation panels actually work and are intuitive.
• Scripted and can be quicker. Avoid deviations.
Measurement error
• Interviewer bias – often incorporated in leading questions
• The use of leading questions can lead to skewed opinion surveys
• Leading questions lead to bad data
• Yes, Prime Minister!
• A clip from a British satirical comedy
• How to ask a question? https://www.youtube.com/watch?v=8tiuWYs5Z-A
Contextual Inquiries
• The main difference from the previous two is that you engage users in the environment where they use your product.
• This is especially important for applications in static environments – e.g. kiosks.
• Even if your product is a mobile application, it would be beneficial to interview users when they are most likely to use it, e.g. on the street or in a tram.
Research ethics
Ethics in user tests?
• Pressures on a user
–Performance anxiety
–Feels like an intelligence test
–Comparing self with other participants
–Feeling stupid in front of observers
–Competing with other participants
What to do?
• Does your experiment involve human participants (directly or indirectly)?
–Research must be approved by an ethics board
• RMIT
–CHEAN – College Human Ethics Advisory Network
–E.g. http://www1.rmit.edu.au/seh/ethics
Boards will ask
• Are participants identifiable or re-identifiable?
• Is some form of deception involved?
• Are participants aged less than 18 years?
• Are participants cognitively or emotionally impaired?
• Do participants belong to a cultural/minority group?
–Do participants consider themselves to be Aboriginal or Torres Strait Islander people?
Boards will ask
• Does the procedure used in the research involve any experimental manipulation or include the presentation of any stimulus other than question-asking?
• Do the questions asked include personally sensitive and/or culturally sensitive issues?
• Is there a power-dependency relationship between researcher(s) and participant(s) e.g. the doctor/patient or teacher/student relationship?
Boards will ask
• Selection of tasks and participants
• Time and location of test
• Use of participants’ personal information
• Presentation of results
• Data: will
–it be stored in a secure location?
–it be stored for 5 years after publication of research findings?
–only the researchers have access to it?
Treat users with respect
• Time
–Don’t waste it
• Comfort
–Make the user comfortable
• Informed consent
–Inform the user as fully as possible
• Privacy
–Preserve the user’s privacy
• Control
–The user can stop at any time
In Canvas
• See example Participant Information Sheet
Suggested text reading
• User Centered Design, Chapter 3: Working with Users?
Possible books
• Smashing UX Design: Foundations for Designing Online User Experiences by Jesmond Allen & James Chudley
• Simple and Usable Web, Mobile, and Interaction Design (Voices That Matter) by Giles Colborne
• Interaction Design: beyond human-computer interaction by Yvonne Rogers, Helen Sharp, Jenny Preece
• And now for something completely different…
–Make It So – Interaction Design Lessons from Science Fiction by Nathan Shedroff & Christopher Noessel