
USABLE SECURITY
7CCSMSEM Security Management
Dr. Jose M. Such
Based on Lorrie Cranor’s materials (CMU)

Learning Outcomes
• Understand why we should make secure systems more usable
• Understand how we can make secure systems more usable

Unusable Security
• Unpatched Windows machines compromised in minutes
• Phishing web sites costing £billions
• Most PCs and devices infected with spyware
• Users have more passwords than they can remember and practice poor password security
• Enterprises store confidential information on laptops and mobile devices that are frequently lost or stolen

Video
• https://www.youtube.com/watch?v=GVT5IgmA6WE

Usable Security Challenge
• “Give end-users security controls they can understand and privacy they can control for the dynamic, pervasive computing environments of the future.” Computing Research Association

Security is a Secondary Task

• “Users do not want to be responsible for, nor concern themselves with, their own security.”
Blake Ross, co-creator of Mozilla Firefox

Concerns may not be aligned
Security expert: “Keep the bad guys out!”
User: “Don’t lock me out!”

Example: Grey
§ Smartphone-based access-control system
§ Used to open doors in the Carnegie Mellon CIC building
§ Allows users to grant access to their doors remotely
L. Bauer, L.F. Cranor, R.W. Reeder, M.K. Reiter, and K. Vaniea. A User Study of Policy Creation in a Flexible Access-Control System. CHI 2008. http://www.robreeder.com/pubs/greyCHI2008.pdf
L. Bauer, L. F. Cranor, M. K. Reiter, and K. Vaniea. Lessons Learned from the Deployment of a Smartphone-Based Access-Control System. SOUPS 2007. http://cups.cs.cmu.edu/soups/2007/proceedings/p64_bauer.pdf

Grey Experiment
§ Year-long interview study
§ Recorded 30 hours of interviews with Grey users
§ System was actively used: 29 users × 12 accesses per week

Grey Experiment: Speed complaints
§ Users said Grey was slow
§ But Grey was as fast as keys
§ Videotaped a door to better understand how doors are opened differently with Grey and keys

Average Access Time
Stage                     Keys                  Grey (phone)
Getting keys / phone      3.6 sec (σ = 3.1)     8.4 sec (σ = 2.8)
Stop in front of door     5.4 sec (σ = 3.1)     2.9 sec (σ = 1.5)
Door opened → closed      5.7 sec (σ = 3.6)     3.8 sec (σ = 1.1)
Total                     14.7 sec (σ = 5.6)    15.1 sec (σ = 3.9)

“I find myself standing outside and everybody inside is looking at me standing outside while I am trying to futz with my phone and open the stupid door.”

Convenience always wins

Unusable Security Frustrates Users

Typical password advice
§ Pick a hard-to-guess password
§ Don’t use it anywhere else
§ Change it often
§ Don’t write it down

What do users do when every web site wants a password?

Bank = b3aYZ
Amazon = aa66x!
Phonebill = p$2$ta1
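Per-site passwords like these usually come from a mnemonic rule (a base secret plus a site-derived mutation). A toy sketch of why that fails, with a hypothetical rule and function names of my own invention: one leaked password reveals the pattern for every site, whereas a password manager generates unrelated random strings.

```python
import secrets

def user_scheme(base: str, site: str) -> str:
    # Hypothetical mnemonic rule many users invent:
    # first letter of the site + base secret + site-name length
    return f"{site[0]}{base}{len(site)}"

def manager_password(length: int = 16) -> str:
    # What a password manager does instead: unrelated random strings
    return secrets.token_urlsafe(length)

# One leaked password exposes the whole pattern:
leak = user_scheme("Secret", "amazon")   # "aSecret6" leaks in a breach...
guess = user_scheme("Secret", "bank")    # ...and the attacker derives "bSecret4"
```

The asymmetry is the point: the mnemonic rule trades security for memorability, while random passwords shift the memory burden onto software.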

Encryption Usability
• Encryption is rarely configured by default
• You need a good password
– And you can’t lose it or forget it
• Public/private key encryption
– How do you get someone’s public key?
– How do I make it work on my phone?
• “Only paranoid people use encryption”
• Seminal work, recommended reading:
– A. Whitten and J.D. Tygar. Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0. USENIX Security 1999.
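The machinery users are being asked to operate here is not inherently complicated to a computer. A toy Diffie-Hellman exchange (deliberately tiny, insecure parameters, illustration only) shows what “getting someone’s public key” buys: each party publishes one value, and the shared secret itself never travels the wire.

```python
import secrets

# Toy Diffie-Hellman key agreement. The parameters are far too small to be
# secure; a real deployment uses standardized groups of 2048+ bits.
p, g = 2087, 5                      # public modulus and generator (toy-sized)

a = secrets.randbelow(p - 2) + 1    # Alice's private key (never shared)
b = secrets.randbelow(p - 2) + 1    # Bob's private key (never shared)
A = pow(g, a, p)                    # Alice's public value -- the "public key"
B = pow(g, b, p)                    # Bob's public value

k_alice = pow(B, a, p)              # both sides compute the same secret
k_bob = pow(A, b, p)
```

The usability problem is not the math; it is everything around it: verifying that `A` really belongs to Alice, storing `a` safely, and doing all of this across devices.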

Why Glenn couldn’t encrypt
• Imagine that Ed wants to send a message to Glenn and worries that others might want to intercept his messages
• Ed asked Glenn for his PGP key
• “And yet, Greenwald still didn’t bother learning security protocols. ‘The more he sent me, the more difficult it seemed,’ he says. ‘I mean, now I had to watch a f***ing video . . . ?’”
• http://vimeo.com/56881481
• Snowden ended up reaching out to Laura Poitras instead
• http://www.rollingstone.com/politics/news/snowden-and-greenwald-the-men-who-leaked-the-secrets-20131204 http://www.dailydot.com/politics/edward-snowden-gpg-for-journalists-video-nsa-glenn-greenwald/

How can we make secure systems more usable?

How can we make secure systems more usable?
§ Make it “just work”
– Invisible security
§ Make security understandable
– Make it visible
– Make it intuitive
– Use metaphors that users can relate to
§ Train the user

Make security just work

Technology Should be Smarter
Move from explicit to implicit authentication:
1. Proximity sensors
2. Behavioral biometrics: zero-effort, one-step, two-factor authentication
3. Exploit interaction style: use video-based authentication in video, audio in audio, etc.
4. Web fingerprinting
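Behavioral biometrics (item 2) can be sketched as a toy check: compare a sample of inter-keystroke timings against a stored profile and accept if the deviation is small. The function, profile values, and 50 ms tolerance are all illustrative assumptions, not a real scheme.

```python
def keystroke_match(profile, sample, tol=0.05):
    """Toy behavioral-biometric check: mean absolute deviation between a
    stored profile of inter-keystroke timings (seconds) and a fresh sample.
    The tolerance is an arbitrary illustrative threshold."""
    dev = sum(abs(p - s) for p, s in zip(profile, sample)) / len(profile)
    return dev <= tol

stored = [0.12, 0.30, 0.18]   # owner's typical timings between key presses
owner = keystroke_match(stored, [0.13, 0.28, 0.19])   # close: authenticates
impostor = keystroke_match(stored, [0.30, 0.10, 0.40])  # far off: rejected
```

The “zero-effort” appeal is visible in the sketch: the user produces the sample just by typing, so the second factor costs no extra interaction.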

Make security understandable

Connection security indicators
• Browsers use distinct indicator symbols for HTTPS, HTTP, and invalid-HTTPS connections
• What do proposed symbols indicate?
• What are the security properties of HTTPS?
– Secrecy: message is encrypted
– Authenticity: message has a valid certificate
– Integrity: message has not been tampered with
A.P. Felt, R.W. Reeder, A. Ainslie, H. Harris, M. Walker, C. Thompson, M.E. Acer, E. Morant, and S. Consolvo. Rethinking Connection Security Indicators. SOUPS 2016.
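The authenticity property maps directly onto what a TLS client library enforces before any data flows. A short sketch using Python’s standard `ssl` module shows that certificate validation is the default, not an add-on:

```python
import ssl

# A default TLS client context already encodes the "authenticity" property:
ctx = ssl.create_default_context()

# The peer must present a certificate chaining to a trusted CA...
requires_cert = (ctx.verify_mode == ssl.CERT_REQUIRED)
# ...and that certificate must be valid for the hostname we asked for.
checks_name = ctx.check_hostname
# Secrecy and integrity are provided by the TLS record layer itself
# (authenticated encryption), not by anything the caller configures.
```

If either check fails mid-handshake, the connection never opens, which is exactly the situation the “invalid HTTPS” indicator is trying to communicate to users.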

What can still go wrong at secure site?
• Malware on site
• Key logger on user’s computer
• Malicious third-party ads or trackers
• Site is not the site you think it is – certificate is valid but not for company you are expecting (phishing)
• Certificate authority was compromised and issued a fraudulent certificate

Support user decision
• High probability of danger → block
• Might be dangerous → user must decide
– Improve warnings
– Help the user decide by asking a question the user is qualified to answer
• Very low probability of danger → don’t bother the user
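This triage policy fits in a few lines. A sketch with hypothetical thresholds makes the key design point explicit: only the middle band ever interrupts the user.

```python
def triage(p_danger: float, block_at: float = 0.9, warn_at: float = 0.1) -> str:
    """Triage a connection/download by estimated danger probability.
    The two thresholds are hypothetical values for illustration."""
    if p_danger >= block_at:
        return "block"   # high probability of danger: block outright
    if p_danger >= warn_at:
        return "warn"    # might be dangerous: ask a question the
                         # user is qualified to answer
    return "allow"       # very low probability: don't bother the user
```

Tuning `block_at` and `warn_at` is the whole game: widening the middle band produces warning fatigue, while narrowing it silently takes decisions away from users.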

Train the user

Why do humans fall for phish?
§ Not motivated to pay attention to training
– “Security is not my problem”
§ Mental models inconsistent with reality
– “If a site looks professional it must be legitimate”
§ Need actionable advice they can understand
– Difficult to be alert if you don’t know what you’re looking for

PhishGuru embedded training
§ Send email that looks like phish
§ If recipient falls for it, train in a succinct and engaging format
§ Study demonstrated effectiveness of PhishGuru and found that the same training was not effective when sent as regular email
§ Approach: learning science principles + teachable moments + fun
P. Kumaraguru, J. Cranshaw, A. Acquisti, L. Cranor, J. Hong, M.A. Blair, and T. Pham. School of Phish: A Real-World Evaluation of Anti-Phishing Training. SOUPS 2009. http://www.cylab.cmu.edu/research/techreports/tr_cylab09002.html

PhishGuru experiment
§ 28-day study
§ 515 CMU students, faculty, and staff
§ Conditions: no training, 1 training message, 2 training messages
§ 7 simulated phishing emails and 3 legitimate emails sent to each participant
P. Kumaraguru, J. Cranshaw, A. Acquisti, L. Cranor, J. Hong, M.A. Blair, and T. Pham. School of Phish: A Real-World Evaluation of Anti-Phishing Training. SOUPS 2009. http://www.cylab.cmu.edu/research/techreports/tr_cylab09002.html

Experiment Simulated E-mail
Plain text email without graphics
URL is not hidden
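The check users are being trained to do by eye — does the link’s real host belong to a domain I trust? — can be sketched in code. The function name and trusted-set interface are illustrative, not from the study; real browsers use the Public Suffix List rather than a hand-rolled suffix match.

```python
from urllib.parse import urlparse

def looks_like_phish(url: str, trusted: set[str]) -> bool:
    """Sketch of a lure check: flag the URL unless its hostname is a
    trusted domain or a subdomain of one."""
    host = (urlparse(url).hostname or "").lower()
    return not any(host == d or host.endswith("." + d) for d in trusted)

# Classic lure: the familiar brand sits at the FRONT of the hostname,
# but the registered domain that matters is at the END.
lure = looks_like_phish("http://paypal.com.evil.example/login", {"paypal.com"})
real = looks_like_phish("https://www.paypal.com/signin", {"paypal.com"})
```

Reading hostnames right-to-left is exactly the kind of concrete, actionable rule the training materials aim to convey.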

Experiment Simulated Website

Experiment Results
§ PhishGuru training taught people to distinguish phishing and legitimate emails
– Those trained with PhishGuru still clicked on legitimate links
– But those trained with PhishGuru were less likely to click on phishing links, even 28 days after training

Conclusion
• “Security vs. usability” is a myth!
• Security and usability need to go hand in hand