COMP90087 – Semester 1, 2022 – © University of Melbourne 2022

Photo: https://www.createdigital.org.au/human-like-robot-aged-care-homes/
Week 10/S1/2022
AI in Care


School of Computing and Information Systems The University of Melbourne
jwaycott [at] unimelb.edu.au

Learning Outcomes
1. Describe the role of AI in supporting different kinds of care, including in sensitive and complex care settings (e.g., aged care).
2. Apply care ethics frameworks (e.g., Tronto) to define and analyse care and caregiving.
3. Critique the design and use of AI for care, and discuss the possible unintended consequences of using AI in care settings.
4. Apply concepts from care ethics and value-sensitive design to discuss the future appropriate design of AI for care.

Related Reading
This module has two readings about ethical issues in the design and use of robots in aged care (our main case study for exploring ethical challenges for AI in care):
van Wynsberghe, A. (2013). Designing Robots for Care: Care Centered Value-Sensitive Design. Science and Engineering Ethics, 19, 407-433.
• Draws on care ethics and value sensitive design to consider how robots can be ethically designed for use in care settings and to propose a method for evaluating ethical issues arising from the use of robots in aged care.
Vandemeulebroucke, T., et al (2018). The Use of Care Robots in Aged Care: A Systematic Review of Argument-Based Ethics Literature, Archives of Gerontology and Geriatrics, 74, 15-25.
• Summarises arguments about ethical issues associated with the use of robots in aged care through a systematic literature review study.

Introduction – my background and current research
What is care?
Who cares?
Designing AI for care: care ethics and value sensitive design
Joan Tronto’s four phases of care
Can AI care?

Introduction

About me…
Then: Bachelor of Arts (Psychology) @ UniMelb
Now: Associate Professor, Computing & Information Systems, Faculty of Engineering and Information Technology
https://findanexpert.unimelb.edu.au/profile/52243-jenny-waycott

https://cis.unimelb.edu.au/hci

2000 – 2011:
Educational technology (mobile technologies and social technologies in higher education)

Now: Emerging Technologies for Enrichment in Later Life
Photo: https://www.createdigital.org.au/human-like-robot-aged-care-homes/

Imagine you work for a robotics company whose motto is: “robots for social good”
The company is designing a companion robot to support people like Donald -> older adults who are socially isolated

DISCUSSION
• What functions should the robot perform?
• What functions should the robot NOT perform?
• Is there anything that might go wrong?
• Are there any issues the company should be concerned about?

What is care?
Image: Fang cuddling Mr Potato Head

Care: Some examples

Care: Some examples
Photo from https://integrisok.com/

Care: Some examples
Photo from Unsplash

Care: Some examples
Image from: https://www.news.com.au/

Care: Tronto’s definition
“Care is a common word deeply embedded in our everyday language. On the most general level care connotes some kind of engagement; this point is most easily demonstrated by considering the negative claim: ‘I don’t care’”
Care carries two important aspects:
“First, care implies a reaching out to something other than the self: it is neither self-referring nor self-absorbing.
Second, care implicitly suggests that it will lead to some type of action.”
Joan Tronto (1993). Moral Boundaries: A Political Argument for an Ethic of Care, Routledge (p. 102)

Care: Tronto’s definition
“On the most general level, we suggest that caring be viewed as a species activity that includes everything that we do to maintain, continue, and repair our ‘world’ so that we can live in it as well as possible. That world includes our bodies, our selves, and our environment, all of which we seek to interweave in a complex, life-sustaining web.”
Joan Tronto (1993). Moral Boundaries: A Political Argument for an Ethic of Care, Routledge (p. 103)

Tronto’s Four Phases of Care
Caring about: noticing the need for care. Requires attentiveness.
Taking care of: “assuming some responsibility for the identified need and determining how to respond to it.”
Care giving: requires competence. The need for care has only been met if good care has been provided.
Care receiving: “we need to know what has happened, how the cared-for people or things responded to this care.”

Elements of an Ethic of Care (Tronto)
Attentiveness: “If we are not attentive to the needs of others, then we cannot possibly address those needs.” Or: ignoring others is “a form of moral evil”
Responsibility: “Ultimately, responsibility to care might rest on a number of factors; something we did or did not do has contributed to the needs for care, and so we must care.”
Joan Tronto (1993). Moral Boundaries: A Political Argument for an Ethic of Care, Routledge (pp. 127-132)

Elements of an Ethic of Care (Tronto)
Competence: “Intending to provide care, even accepting responsibility for it, but then failing to provide good care, means that in the end the need for care is not met. Sometimes care will be inadequate because the resources available to provide for care are inadequate. But short of such resource problems, how could it not be necessary that the caring work be competently performed in order to demonstrate that one cares?” (p. 133)
Responsiveness: “the responsiveness of the care-receiver to the care… By its nature, care is concerned with conditions of vulnerability and inequality… The moral precept of responsiveness requires that we remain alert to the possibilities for abuse that arise with vulnerability.” (pp. 134-135)

Elements of an Ethic of Care (Tronto)
“Care as a practice involves more than simply good intentions. It requires a deep and thoughtful knowledge of the situation, and of all of the actors’ situations, needs and competencies. To use the care ethic requires a knowledge of the context of the care process. Those who engage in a care process must make judgements: judgements about needs, conflicting needs, strategies for achieving ends, the responsiveness of care-receivers, and so forth.
[Care requires] an assessment of needs in a social and political, as well as a personal, context.” (p. 137)
Joan Tronto (1993). Moral Boundaries: A Political Argument for an Ethic of Care, Routledge (pp. 127-132)

You still work for a robotics company whose motto is: “robots for social good”!
The company has developed a robot and is ready to trial it with isolated older adults (in partnership with care providers).
What do you need to consider when preparing the trial?
Who will use the robot?
What are their care needs?

THE VIRTUAL ASSISTANT: ELLIQ
https://elliq.com/

https://elliq.com/pages/features

INTERVIEW STUDY
• 16 older adults living independently (aged 65 to 89)
• Interviews conducted in participants’ homes (pre-Covid – Jan 2019)
• Interviews focused on:
• Companionship preferences and social circumstances
• Responses to videos of three different kinds of virtual assistant/robots: an assistant, a toy, and a pet

ELLIQ: COMPANY OR INTRUSION?
Beth (who longed for human conversation) thought ElliQ and her chatter would be comforting in a quiet and lonely household:
“It breaks the silence of the day”
Sarah (who liked human company) found the idea of ElliQ’s conversation appealing:
“like having a person in the house”
Stephanie (who shunned human company): ElliQ would be like having another person in the house – “No thanks!”
“I don’t know whether that would drive me mental if it kept interrupting me and telling me what to do… I might want to get an axe and cut it up.” (Brianna)

Who cares? Can AI care?

Who cares?
“Care seems to be the province of women… The largest tasks of caring, those of tending to children, and caring for the infirm and elderly, have been almost exclusively relegated to women” (Tronto, p. 112)
“Care is fundamental to the human condition and necessary both to survival and flourishing… In people’s everyday lives care is an essential part of how they relate to others” (Barnes, 2016, p. 1) -> everyone engages in care-giving and care-receiving.

Who cares?
• Medical professionals
• Care professionals (social work, childcare, aged care, etc.)
• Parents, children, family
• Government / organisations (caring about and taking care of)
• Machines / AI?

Discussion
What are some other examples of machines/AI supporting or providing care?
What are the benefits of using AI in care? What are the challenges?

AI in parenting
https://www.theguardian.com/media/2022/may/01/honey-lets-track-the-kids-phone-apps-now-allow-parents-to-track-their-children

Can AI support care?
Parents using tech to monitor children’s location:
✓ Safety – can locate the child if there is something wrong: “When I think about it, it makes me feel safe, because I know that Mum or Dad knows where I am” (Lola, aged 17)
✓ Peace of mind – children “don’t answer their phone to their parents or text them back… I tend to catastrophise” (Alicia, parent)
❖Invasion of privacy? “At that point in my life, I wasn’t necessarily that happy about Mum knowing where I was all the time. I was sneaking out to smoke, so I didn’t want Mum to see that I was leaving school” (Ben)
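As a concrete illustration of the monitoring described above, the sketch below (in Python) checks whether a phone's reported position falls inside a parent-defined "safe zone" and flags an alert if it does not. It is a minimal, hypothetical example: the zone names, coordinates, and function names are invented and do not come from any real tracking app.

import math

# Hypothetical parent-defined zones: (latitude, longitude, radius in metres).
SAFE_ZONES = {
    "home":   (-37.8136, 144.9631, 500),
    "school": (-37.7963, 144.9614, 300),
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6_371_000  # approximate Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_location(lat, lon):
    """Return the safe zone the phone is in, or None (which would trigger a parent alert)."""
    for name, (zlat, zlon, radius) in SAFE_ZONES.items():
        if haversine_m(lat, lon, zlat, zlon) <= radius:
            return name
    return None  # outside every zone -> notify the parent

print(check_location(-37.8137, 144.9630))  # "home"
print(check_location(-37.7500, 144.9000))  # None -> alert

Every such check also produces a stored location record, which is exactly the kind of "detailed digital footprint" criticised on the next slide.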

Can AI support care?
Parents using tech to monitor children’s location:
❖Digital footprint:
“The idea that children are getting a detailed digital footprint not of their own making that tracks everywhere they go, and that’s being used to sell advertising to them now or later, is reprehensible”
(Prof Sonia Livingstone)
https://www.theguardian.com/media/2022/may/01/honey-lets-track-the-kids-phone-apps-now-allow-parents-to-track-their-children

Can AI support care?
https://www.weforum.org/agenda/2020/10/ai-artificial-intelligence-canada-homelessness-coronavirus-covid-19


Are there any other risks involved in using AI to predict homelessness?
https://www.weforum.org/agenda/2020/10/ai-artificial-intelligence-canada-homelessness-coronavirus-covid-19
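To make this discussion concrete, here is a minimal, hypothetical sketch (in Python) of the kind of risk score such a system might compute from shelter-usage data. The feature names, weights, and threshold are invented for illustration; this is not the model described in the article.

import math

# Invented weights for illustration only (NOT the actual Canadian model).
WEIGHTS = {
    "shelter_stays_past_year": 0.08,        # per recorded stay
    "months_since_stable_housing": 0.05,
    "has_income_support": -0.60,            # treated here as a protective factor
}
BIAS = -2.0
THRESHOLD = 0.5  # above this, the person is flagged for outreach

def risk_of_chronic_homelessness(person):
    """Logistic score in [0, 1] from a dict of feature values."""
    z = BIAS + sum(WEIGHTS[k] * person.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

person = {"shelter_stays_past_year": 20, "months_since_stable_housing": 18, "has_income_support": 0}
score = risk_of_chronic_homelessness(person)
print(round(score, 2), score >= THRESHOLD)  # 0.62 True

Even this toy version surfaces the risks raised above: people just below the threshold receive no help, people wrongly flagged carry a label they never consented to, and the weights reflect whatever data happened to be collected about them.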

Tackling rough sleeping: An example

https://www.theguardian.com/cities/2014/jun/12/anti-homeless-spikes-latest-defensive-urban-architecture

Can AI support social welfare?
Social welfare = societal and government responsibility to care for vulnerable citizens
Can AI be used to determine who needs financial support?
Can AI be used to determine who has received financial support in error?
Robodebt scandal: the automated process of matching the Australian Taxation Office’s income data with social welfare recipients’ reports of income to Centrelink. Many people received debt notices in error.
-> scheme criticised for inaccurate assessment, illegality, shifting the onus of proof of debt onto welfare recipients, poor support and communication, and coercive debt collection.
(Braithwaite, 2020)
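A key criticism of the scheme was its income averaging. The simplified sketch below (in Python, not the actual Centrelink/ATO implementation) shows how spreading an annual income figure evenly across 26 fortnights can make a casual worker who reported every dollar correctly appear to have under-reported for half the year.

FORTNIGHTS = 26
annual_ato_income = 26_000          # total employment income reported to the ATO
reported = [2_000] * 13 + [0] * 13  # what the person actually earned and reported each fortnight

# The criticised approach: smear the annual total evenly across the year...
averaged = annual_ato_income / FORTNIGHTS   # 1,000 per fortnight

# ...then compare the average, rather than the real earnings, with each fortnight's report.
flagged = [f for f, actual in enumerate(reported) if averaged > actual]
print(f"Averaged fortnightly income: {averaged:.0f}")
print(f"Fortnights flagged as under-reported: {len(flagged)}")  # 13

Thirteen fortnights are flagged even though the person reported accurately, because the average hides the fact that the income was earned in only half the year while benefits were legitimately received in the other half.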

Can AI support self-care?

(Some) ethical challenges of AI in care
Cold technologies vs warm care? (see Pols and Moser, 2009)
-> instead of positioning tech as cold in contrast to warm care, the authors argue that it is better to ask “what kind of affective and social relations are enabled” by care technologies
Privacy vs care/monitoring -> how are these balanced?
Different goals and values for different stakeholders -> do care recipients value and want care?

Designing AI for care

Care Centred Value-Sensitive Design

What is value-sensitive design?
“Value Sensitive Design is a theoretically grounded approach to the design of technology that accounts for human values in a principled and comprehensive manner throughout the design process.”
(Friedman, Kahn, & Borning, 2008)
Value-sensitive design “takes as its starting point the belief that technologies embody values” (van Wynsberghe, 2013)

What is a value?
Value = “economic worth of an object”
Value = “what a person or group of people consider important in life.”
➢Can be anything people consider important, from profound to mundane, e.g.:
➢Family, friendship, morning tea, education, art, a walk in the woods, nice manners, good science, a wise leader, clean air, healthy planet

Discussion
What are your values?
What values do you see embedded in the technologies you use?

How is value-sensitive design practiced?
1. Conceptual investigations:
• Who are the direct and indirect stakeholders affected by the design at hand?
• How are the stakeholders affected?
• What values are implicated?
• How should we engage in trade-offs among competing values (e.g., autonomy vs. security, or anonymity vs. trust)?
• Should moral values have greater weight than nonmoral values (e.g., aesthetic values)?
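As a small worked example of the conceptual questions above, the sketch below (in Python) records a deliberately incomplete stakeholder and value analysis for the companion-robot scenario used earlier in the lecture. The class names and entries are illustrative and are not part of Friedman and colleagues' method.

from dataclasses import dataclass, field

@dataclass
class ValueTradeOff:
    value_a: str
    value_b: str
    note: str

@dataclass
class ConceptualInvestigation:
    direct_stakeholders: list = field(default_factory=list)
    indirect_stakeholders: list = field(default_factory=list)
    implicated_values: list = field(default_factory=list)
    trade_offs: list = field(default_factory=list)

companion_robot = ConceptualInvestigation(
    direct_stakeholders=["socially isolated older adults", "family carers", "care staff"],
    indirect_stakeholders=["care providers", "the robotics company", "regulators"],
    implicated_values=["companionship", "autonomy", "privacy", "dignity", "safety"],
    trade_offs=[
        ValueTradeOff("privacy", "safety",
                      "monitoring features help carers respond, but record daily life"),
        ValueTradeOff("autonomy", "care",
                      "proactive reminders can feel like being told what to do"),
    ],
)

for t in companion_robot.trade_offs:
    print(f"{t.value_a} vs {t.value_b}: {t.note}")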

How is value-sensitive design practiced?
2. Empirical investigations:
• What are the features of the human context in which the technical artifact is [or will be] situated?
• Evaluations to assess the success of a particular design, including observations, interviews, surveys, experimental manipulations, collection of documents, and measurements of user behaviour.
• How do stakeholders view values in the context in which the technology is being deployed/used?
• How do they prioritize individual values?
• What are organisational values, motivations, methods of training, reward structures, etc.?

How is value-sensitive design practiced?
3. Technical investigations:
• How do existing technological properties and underlying mechanisms support or hinder human values?
• Proactive design of systems to support values identified.
• Focus on the technology itself (empirical investigations focus on the individuals, groups, or larger social systems that use or are affected by the technology)
(Friedman, Kahn, and Borning, 2008)

How is value sensitive design practiced?
“Unanticipated values and value conflicts often emerge after a system is developed and deployed. Thus, when possible, design flexibility into the underlying technical architecture so that it can be responsive to such emergent concerns.”
(Friedman, Kahn, and Borning, 2008, p. 93)

Care Centred Value-Sensitive Design
Care is a core value that can be embedded in care technologies
Care values will manifest differently across contexts and people:
“A framework for the ethical evaluation of care robots requires recognition of the specific context of use, the unique needs of users, the tasks for which the robot will be used, as well as the technical capabilities of the robot” (van Wynsberghe, 2013, p. 408)
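One way to operationalise the dimensions named in that quote, together with Tronto's elements of care, is as a set of prompting questions for a design review. The sketch below (in Python) is an illustration of such a checklist, not van Wynsberghe's published framework.

CARE_ELEMENTS = ["attentiveness", "responsibility", "competence", "responsiveness"]

def ccvsd_review(context, users, tasks, capabilities):
    """Return prompting questions for an ethical review of a proposed care robot."""
    questions = [
        f"Context of use: how is care currently practised in {context}?",
        f"Users: what are the particular needs of {users}?",
        f"Tasks: which parts of {tasks} should remain with human carers?",
        f"Capabilities: can the robot actually support this, given {capabilities}?",
    ]
    questions += [f"How does delegating this task affect {element}?" for element in CARE_ELEMENTS]
    return questions

for q in ccvsd_review(
    context="residential aged care",
    users="socially isolated residents",
    tasks="companionship and daily check-ins",
    capabilities="speech, reminders, video calls",
):
    print("-", q)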

References (1)
Tronto, J., (1993). Moral boundaries: A political argument for an ethic of care. Routledge.
Coghlan, S., et al (2021). Dignity, Autonomy, and Style of Company: Dimensions Older Adults Consider for Robot Companions. Proceedings of the ACM on Human-Computer Interaction, 5, CSCW, 1-25.
Lewis, T. (2022). Honey, let’s track the kids: the rise of parental surveillance. The Observer, 1st May. https://www.theguardian.com/media/2022/may/01/honey-lets-track-the-kids-phone-apps-now-allow-parents-to-track-their-children
Arsenault, C. (2020). How AI is helping combat homelessness in Canada during COVID-19. World Economic Forum. https://www.weforum.org/agenda/2020/10/ai-artificial-intelligence-canada-homelessness-coronavirus-covid-19

References (2)
Braithwaite, V. (2020). Beyond the bubble that is Robodebt: How governments that lose integrity threaten democracy. Australian Journal of Social Issues, 55, 242-259.
Pols, J., and Moser, I. (2009). Cold technologies versus warm care? On affective and social relations with and through care technologies. ALTER, European Journal of Disability Research, 3, 159-178.
van Wynsberghe, A. (2013). Designing Robots for Care: Care Centered Value-Sensitive Design. Science and Engineering Ethics, 19, 407-433.
Friedman, B., Kahn, P. H., & Borning, A. (2008). Value Sensitive Design and Information Systems. In K. Himma and H. Tavani (eds), The Handbook of Information and Computer Ethics. John Wiley & Sons. [available as e-book in university library]
