
POWER, TRUST,
AND MACHINES
School of Computing and Information Systems
Co-Director, Centre for AI & Digital Ethics
The University of Melbourne
@tmiller_unimelb


This material has been reproduced and communicated to you by or on behalf of the University of Melbourne pursuant to Part VB of the Copyright Act 1968 (the Act).
The material in this communication may be subject to copyright under the Act.
Any further copying or communication of this material by you may be the subject of copyright protection under the Act.
Do not remove this notice

LEARNING OUTCOMES
Define trust and trustworthiness with respect to artificial intelligence
Discuss the effects of use, misuse, abuse, and disuse of machines when trust is not properly calibrated.
Apply models of trust and power to evaluate trustworthiness and potential ethics concerns of digital applications
Discuss the relationship between power, ethics, and trust

WHAT IS TRUST AND WHY IS IT IMPORTANT?

GOALS OF TRUST
Trust between people: predictability, enabling collaboration
Human-machine trust: predictability, enabling "collaboration"
TRUST IS NOT THE END GOAL

TRUST: THE VIEW FROM SOCIOLOGY
Interpersonal trust = humans trusting humans
A trusts B if:
1 A believes that B will act in A’s best interests; and
2 A accepts vulnerability to B’s actions;
so that A can:
3 Anticipate the impact of B’s actions, enabling collaboration

HUMAN-MACHINE TRUST
Human-machine trust = one-way, interpersonal-style trust in a machine
H trusts M if:
1 H believes that M will act in H’s best interests; and
2 H accepts vulnerability to M’s actions;
so that H can:
3 Anticipate the impact of M's decisions, enabling collaboration

DISTRUST vs LACK OF TRUST
H distrusts M if:
1 H believes that M will act against H’s best interests.
Lack of trust = absence of trust:
1 H does NOT believe that M will act in H’s best interests; OR
2 H does NOT accept vulnerability to M's actions.
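These definitions are easy to state precisely. Below is a minimal sketch in Python (the names Stance, Attitude, and classify are illustrative assumptions, not from the slides): trust needs both the belief and the accepted vulnerability; distrust needs a positive belief of adversity; everything else is mere lack of trust.

from dataclasses import dataclass
from enum import Enum, auto


class Stance(Enum):
    TRUST = auto()          # believes benevolence AND accepts vulnerability
    DISTRUST = auto()       # positively believes M will act against H's interests
    LACK_OF_TRUST = auto()  # neither: a mere absence of trust


@dataclass
class Attitude:
    """H's attitude toward machine M (illustrative)."""
    believes_acts_in_my_interest: bool       # condition 1
    believes_acts_against_my_interest: bool  # grounds for distrust
    accepts_vulnerability: bool              # condition 2


def classify(h: Attitude) -> Stance:
    # Trust: both conditions hold, so H can anticipate M's impact (condition 3).
    if h.believes_acts_in_my_interest and h.accepts_vulnerability:
        return Stance.TRUST
    # Distrust: a positive belief that M acts against H's interests.
    if h.believes_acts_against_my_interest:
        return Stance.DISTRUST
    # Lack of trust: absence of trust without a belief of adversity.
    return Stance.LACK_OF_TRUST


# An operator who finds the system benevolent but refuses to rely on it:
print(classify(Attitude(True, False, False)))  # Stance.LACK_OF_TRUST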

CONTRACTUAL TRUST: THE VIEW FROM SOCIOLOGY
A social or normative contract, as well as a legal one
Contractual trust = humans trusting humans to fulfil a contract in a particular context

CONTRACTUAL TRUST: HUMANS TRUSTING MACHINES
Contractual trust = humans trusting a machine to fulfil a contract in a particular context
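A minimal sketch of the idea, with illustrative names (Contract, trusts_for) and an assumed radiology example: trust attaches to a (task, context) pair, not to the machine as a whole.

from dataclasses import dataclass


@dataclass(frozen=True)
class Contract:
    task: str     # what the machine is trusted to do
    context: str  # the conditions under which it is trusted to do it


# The same machine can be trusted for one contract and not another.
trusted_contracts = {
    Contract("classify chest X-rays", "adult patients, hospital-grade scans"),
}


def trusts_for(contract: Contract) -> bool:
    """Contractual trust holds only for contracts H has accepted."""
    return contract in trusted_contracts


print(trusts_for(Contract("classify chest X-rays",
                          "adult patients, hospital-grade scans")))  # True
print(trusts_for(Contract("classify chest X-rays",
                          "paediatric patients")))                   # False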

CONTRACTS FOR ARTIFICIAL INTELLIGENCE
The European Guidelines for Trustworthy AI set out seven key requirements, each with sub-requirements:

Human agency and oversight
· Foster fundamental human rights
· Support users' agency
· Enable human oversight

Technical robustness and safety
· Resilience to attack and security
· Fallback plan and general safety
· A high level of accuracy
· Reliability
· Reproducibility

Privacy and data governance
· Ensure privacy and data protection
· Ensure quality and integrity of data
· Establish data access protocols

Transparency
· High-standard documentation
· Technical explainability
· Adaptable, user-centered explainability
· Make AI systems identifiable as non-human

Diversity, non-discrimination, fairness
· Avoid unfair bias
· Encourage accessibility and universal design
· Solicit regular feedback from stakeholders

Societal and environmental well-being
· Encourage sustainable and eco-friendly AI
· Assess the impact on individuals
· Assess the impact on society and democracy

Accountability
· Auditability of algorithms/data/design
· Minimize and report negative impacts
· Acknowledge and evaluate trade-offs
· Ensure redress

The source table maps these sub-requirements to supporting documentation, including datasheets/statements, model cards (metrics), factsheets (security, explainability, concept drift, lineage), fairness checklists, and reproducibility checklists.

It also maps them to explanatory methods and analyses, including user-centered explanations [62], explanations in recommender systems [42], adversarial attacks and defenses [21], contrast sets [17], behavioral testing [61], "show your work" [14], removal of protected attributes [60], detecting data artifacts [24], saliency maps [65], self-attention patterns [41], influence functions [39], probing [16], counterfactual [22], contrastive [54], free-text [28,51], by-example [39], and concept-level [20] explanations, debiasing using data manipulation [70], analyzing individual neurons [10], bias exposure [69], explanations designed for applications such as fact checking [3] or fake news detection [48], and reporting the robustness-accuracy trade-off [1] or the simplicity-equity trade-off [38].

Source: Table 1 from "Formalizing trust in artificial intelligence: Prerequisites, causes and goals of human trust in AI", Alon Jacovi, Ana Marasović, Tim Miller, and Yoav Goldberg. In Proceedings of the ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT 2021), 2021.
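One way to be explicit about these contracts is to treat the seven requirements as a machine-readable checklist. The sketch below encodes the list above; the dict layout and the review() helper are illustrative assumptions, not part of the guidelines.

EU_TRUSTWORTHY_AI_REQUIREMENTS: dict[str, list[str]] = {
    "Human agency and oversight": [
        "Foster fundamental human rights",
        "Support users' agency",
        "Enable human oversight",
    ],
    "Technical robustness and safety": [
        "Resilience to attack and security",
        "Fallback plan and general safety",
        "A high level of accuracy",
        "Reliability",
        "Reproducibility",
    ],
    "Privacy and data governance": [
        "Ensure privacy and data protection",
        "Ensure quality and integrity of data",
        "Establish data access protocols",
    ],
    "Transparency": [
        "High-standard documentation",
        "Technical explainability",
        "Adaptable, user-centered explainability",
        "Make AI systems identifiable as non-human",
    ],
    "Diversity, non-discrimination, fairness": [
        "Avoid unfair bias",
        "Encourage accessibility and universal design",
        "Solicit regular feedback from stakeholders",
    ],
    "Societal and environmental well-being": [
        "Encourage sustainable and eco-friendly AI",
        "Assess the impact on individuals",
        "Assess the impact on society and democracy",
    ],
    "Accountability": [
        "Auditability of algorithms/data/design",
        "Minimize and report negative impacts",
        "Acknowledge and evaluate trade-offs",
        "Ensure redress",
    ],
}


def review(evidence: dict[str, set[str]]) -> list[str]:
    """Return sub-requirements for which no evidence has been recorded."""
    gaps = []
    for requirement, items in EU_TRUSTWORTHY_AI_REQUIREMENTS.items():
        for item in items:
            if item not in evidence.get(requirement, set()):
                gaps.append(f"{requirement}: {item}")
    return gaps


# Example: a system documented only with a fairness checklist
evidence = {"Diversity, non-discrimination, fairness": {"Avoid unfair bias"}}
print(len(review(evidence)), "sub-requirements still need evidence")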

TRUST AND TRUSTWORTHINESS

TRUSTWORTHY MACHINES
A machine is trustworthy if:
1 It can fulfil its set of contracts
Trust does not imply trustworthiness.
Trustworthiness does not imply trust.

WARRANTED AND UNWARRANTED TRUST
           | Trustworthy          | Not trustworthy
Trusted    | Warranted trust      | Unwarranted trust
Distrusted | Unwarranted distrust | Warranted distrust
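The 2x2 above reduces to a two-argument function. A small sketch (the name calibration is an assumption):

def calibration(trusted: bool, trustworthy: bool) -> str:
    """Map H's attitude and M's actual trustworthiness to the quadrants above."""
    if trusted and trustworthy:
        return "warranted trust"
    if trusted and not trustworthy:
        return "unwarranted trust"      # over-trust: the dangerous quadrant
    if not trusted and not trustworthy:
        return "warranted distrust"
    return "unwarranted distrust"       # under-trust: useful capability wasted


for trusted in (True, False):
    for trustworthy in (True, False):
        print(f"trusted={trusted}, trustworthy={trustworthy} -> "
              f"{calibration(trusted, trustworthy)}")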

USE, MISUSE, DISUSE, AND ABUSE:
UNWARRANTED TRUST AND DISTRUST

FACTORS THAT DETERMINE USE OF AUTOMATION (PARASURAMAN AND RILEY, 1997)
Mental workload
Cognitive overload

MISUSE OF AUTOMATION
Definition: Using automation when it should not be used
Cause: Unwarranted trust
· Over-reliance on automation
· Decision biases and automation bias
· Machine monitoring errors
Impacts: Complacency

DISUSE OF AUTOMATION
Definition: Not using automation when it should be used
Cause: Unwarranted distrust
· Human monitoring errors (e.g., from high false alarm rates)
· Machine monitoring errors
· Human bias
Impacts: Disabling/ignoring alarms

ABUSE OF AUTOMATION
Definition: Deploying automation when it should not be deployed
Cause: Unwarranted distrust from designers
· Distrust in human operators
· Automation bias
Impacts: Mismatch in the human-automation interface
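The three failure modes above can be summarised as two small decision functions, one for operators and one for deployers. The names and boolean framing are illustrative assumptions.

def operator_outcome(should_rely: bool, does_rely: bool) -> str:
    """Operator-side calibration failures: misuse and disuse (illustrative)."""
    if does_rely and not should_rely:
        return "misuse (unwarranted trust)"
    if should_rely and not does_rely:
        return "disuse (unwarranted distrust)"
    return "appropriate use"


def deployment_outcome(should_deploy: bool, does_deploy: bool) -> str:
    """Deployer-side decision: abuse (illustrative)."""
    if does_deploy and not should_deploy:
        return "abuse (automation imposed where it should not be)"
    return "appropriate deployment decision"


print(operator_outcome(should_rely=False, does_rely=True))        # misuse
print(operator_outcome(should_rely=True, does_rely=False))        # disuse
print(deployment_outcome(should_deploy=False, does_deploy=True))  # abuse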

EXAMPLE: THERAC-25
Therac-25: a software-controlled radiation therapy machine
Outcome: Six patients received massive radiation overdoses; three died
Causes: Software errors, compounded by
· Misuse: unwarranted trust from radiographers
· Disuse: hardware safety interlocks removed
· Abuse: minimal input from radiographers during design

POWER, TRUST, AND MACHINES

WHAT IS POWER?
The ability to control our circumstances
POWER TO DO … POWER OVER …

POWER, TRUST, AND ETHICS
Trust ≠ Ethics ≠ Power
But! They are closely related and cannot be separated.

USER TRUST
[Diagram: an in-control user trusting a machine]

USER TRUST
[Diagram: decision maker, decision subject, and machine — the decision maker's trust carries no vulnerability; the decision subject bears the impact of the decisions]

POWER, TRUST, AND MACHINES: SUMMARY
TRUST AND POWER
Belief in acting 'in my interests'
Accepting vulnerability
Anticipating impact of decisions
Contractual trust
Warranted and unwarranted trust and distrust
Use, misuse, disuse, and abuse of technology
Power is ability to control circumstances
KEY TAKEAWAYS
Be explicit about which contracts hold for your systems/applications
Ethically, trust is only desirable if it is warranted
Distrust is desirable if it is warranted
Incorrectly calibrated trust leads to real problems
Ethical issues emerge from (real or perceived) power imbalances between groups with different interests

School of Computing and Information Systems Co-Director, Centre for AI & Digital Ethics
The University of Melbourne
@tmiller_unimelb
