2.8.1 Automation, Undermining Face-to-Face Care, and the Risk of Depersonalisation
One uniformly acknowledged risk is that digitising crisis support may reduce the type of human interaction and compassion that is indispensable to providing and experiencing care, support and healing. In 2018, Christopher Hollis and colleagues conducted what appears to be the largest participatory study in the world concerned with charting a research agenda about digital technologies in mental healthcare.415 The study consulted 664 ‘people with lived experience of mental health problems and use of mental health services, their carers, and health-care practitioners’ in the UK. The number one research priority for participants was determining ‘the benefits and risks of delivering mental health care through technology instead of face-to-face’ and considering the impact of removing ‘face-to-face human interaction’.416 Participants concluded that, above all, technologies that emphasised connection should be prioritised. They warned against technologies, including well-intentioned ones, that would exacerbate isolation, loneliness and alienation.
The impact of new technologies on dynamics of care will be a key discussion in coming decades, both in terms of care in health care facilities and residential homes, and in home-based care. One common aim of new technology, such as AI, is to break down tasks into individual components which can be repetitively undertaken. Yet care is not just tasks; it is also emotion, a fundamental part of human relationships, and a highly complex social interaction.417 Some have claimed that technology will supersede human care. For example, a major suicide hotline in Australia claimed that a ‘virtual or robotic counsellor’ could ‘speak to lots of people and provide support to people immediately’ whereas ‘phone counselors can only speak to one person at a time.’418 Others have derided the benefit of human interaction in healthcare encounters, dismissing it as not strictly clinical in nature and therefore a waste of healthcare resources. Nick Weston, the chief commercial officer at Lilli, a UK company behind biometric monitoring technology that was installed in the homes of older social care recipients, rejected claims that the monitoring could exacerbate the loneliness of older people who would otherwise have received care visits.419 He stated that ‘We shouldn’t be relying on home care agency staff to provide the social interaction for somebody’.420 Such claims rest on the narrowest conceptions of human care, flattening its complexity in the extreme.
Care has been persistently devalued in many societies. This devaluation is often based on the sexist premise that care is ‘women’s work’, given that care for older persons, persons with disabilities and children has largely been performed by women, paid and unpaid.421 Simplistic efforts to automate care can perpetuate this devaluation. To paraphrase Fiona Jenkins, relations of care are adequately recognised neither in many paradigms of digitised mental health care and the language of market value, nor in our inherited traditions of contemplating the values and virtues that make for a truly human life.422 Such concerns highlight the potential rise of cheap (if limited) software to replace more expensive, expert, and empathetic professionals, and the displacement of public care services and public assistance by privately provided mental health care.423
The potential for care technologies to dehumanise and objectify care-recipients was raised by the UN Independent Expert on the enjoyment of all human rights by older persons, Rosa Kornfeld-Matte.424 The risk of ‘automating care’, she wrote, includes ‘losing one’s sense of identity, self-esteem and control over one’s life’.425 Kornfeld-Matte argued that human dignity must be ‘integrated from the conception to the application of assistive devices and robotics’.426 Even on a purely practical level, new technologies may also place increased pressure on mental health and other service providers, multiplying tasks and workloads in generating, inputting, organising, and constantly updating data records. Paradoxically, this extra work may reduce care workers’ time for face-to-face engagement and collaborative work with care recipients and other care workers.
- 410 McQuillan (n 150).
- 411 Pasquale (n 360).
- 412 Fjeld et al (n 190) fn 305.
- 413 IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems (n 287) pp. 21-22 (see Principle 2).
- 414 World Health Organisation, Guidance on Community Mental Health Services: Promoting Person-Centred and Rights-Based Approaches (World Health Organization, 2021) https://www.who.int/publications-detail-redirect/9789240025707.
- 415 Hollis et al (n 42).
- 416 Hollis et al (n 197) p. 7.
- 417 Eva Feder Kittay, ‘The Ethics of Care, Dependence, and Disability’ (2011) 24(1) Ratio Juris 49.
- 418 Maggie Coggan, ‘Virtual Counsellor Steps in to Help out on Suicide Hotline’, Pro Bono Australia https://probonoaustralia.com.au/news/2019/04/virtual-counsellor-steps-in-to-help-out-on-suicide-hotline/.
- 419 Chris Baraniuk, ‘Sensors and AI to Monitor Dorset Social Care Patients’, BBC News (online, 24 August 2021) https://www.bbc.com/news/technology-58317106.
- 420 Ibid.
- 421 Fiona Jenkins, ‘The Ethics of Care: Valuing or Essentialising Women’s Work?’ in Marian Sawer, Fiona Jenkins and Karen Downing (eds), How Gender Can Transform the Social Sciences: Innovation and Impact (Springer International Publishing, 2020) 19 https://doi.org/10.1007/978-3-030-43236-2_2; Yvette Maker, Care and Support Rights After Neoliberalism: Balancing Competing Claims Through Policy and Law (Cambridge University Press, in press 2021).
- 422 Ibid.
- 423 Pasquale (n 6).
- 424 Human Rights Council, Report of the Independent Expert on the Enjoyment of All Human Rights by Older Persons (UN Doc A/HRC/36/48, 21 July 2017) paras [46]-[49].
- 425 Ibid.
- 426 Ibid.