2.10 Future Efforts
This report could have explored much more. Issues requiring further research include digitisation initiatives in low- and middle-income countries, the growth of ‘emotion or affect recognition’ technologies, digital care labour platforms, automated resource allocation for social security and healthcare, and human rights by design methodologies.
Digital mental health technology in low- and middle-resource settings and countries is growing and is often framed as allowing intervention in the lives of large populations. Emerging practices include online training for professionals and laypersons, digitisation of population-level information and electronic health records, and ‘scalable’ digital therapies such as automated cognitive behavioural therapy.456 China Mills and Eva Hilberg have argued that any analysis of risks and benefits must include questioning assumptions about digital empowerment and – as is often the case in both information technology and mental health practice – about top-down imposition by high-income (and typically ‘Western’) countries.457 Advocacy groups like TCI-Asia have warned against perpetuating legacies of colonialism in responses to disabled people in low- and middle-income countries; instead, they promote the importance of involving affected populations and carefully considering the unique historical, social, cultural and economic factors of different settings.458 Failed digital health interventions in low- and middle-income countries suggest that technologies must be effectively localised in order to confer power on the communities they intend to serve—which raises tensions with ‘scalability’ as an aspiration.459 Much more work is needed on the potential and pitfalls of using algorithmic and data-driven technology to address distress and disability in low- and middle-income countries.460
Computational emotion or affect recognition was discussed briefly in this report, but its role in mental health services and research requires further attention. Some speculators project the market value of ‘emotional AI’ at $91.67 billion by 2024, and it is being deployed by tech vendors, criminal justice agencies, advertisers, car manufacturers, and others.461 Despite the rapid growth of such technologies, a growing body of research argues that emotion recognition technology is founded on pseudoscientific claims.462 Indeed, traditional research on emotion recognition based on facial features has been heavily criticised as lacking evidence.463 How these broader debates about affect recognition will interact with biometric monitoring work conducted in the mental health sciences remains uncertain and requires attention, particularly as the mental health sciences risk lending a veneer of legitimacy to otherwise pseudoscientific claims.
Digital labour platforms are transforming care labour for people with disabilities who access home-based, community-based and other forms of support. This has been called the ‘Uberisation’ of care and therapy. Digital labour platforms can be designed in ways that institutionalise the exploitation of care and support labour, turning the interests of recipients against those of (generally low-paid) staff.464 Even where such platforms might equitably distribute care and support labour, they raise legitimate concerns, particularly in relation to health and safety, insurance, unpaid work and the long-term training needs of the workforce. These platforms will raise issues of platform regulation, but also of accreditation and professional regulation, given that therapy platform business models often depend on shrinking payments to therapists, increasing their caseloads, and endeavouring to de-medicalise therapy in order to hire cheaper, non-accredited counsellors.465
Algorithmic resource allocation for determining who receives state resources for healthcare or other forms of social security, including disability-based and mental healthcare support, is an area with serious implications. Lucy Series and Luke Clements reported on automated ‘personal budget decisions’ in the UK, suggesting that algorithmic resource allocation could be used as a mechanism for implementing spending cuts.466 Further, the budget allocations did not always respond to people’s needs, and the algorithmic nature of the system led to a lack of transparency. The advocacy organisation AlgorithmWatch cited the research, stating that it ‘serves not only to illustrate how flawed [automated] decisions can adversely impact people’s lives, but also how [automated decision-making] systems might be scrutinised and what obstacles are sure to arise in other domains of [automated decision-making] accountability research.’467
More broadly, research is required on the impact of the politics and ideology that shape the administration of mental health services, and their merging with the politics and ideology that drive the information economy.
- 456 John A Naslund et al, ‘Digital Technology for Treating and Preventing Mental Disorders in Low-Income and Middle-Income Countries: A Narrative Review of the Literature’ (2017) 4(6) The Lancet. Psychiatry 486. John Naslund and colleagues reviewed the clinical effectiveness of digital mental health interventions in diverse low- and middle-income countries and argued that there is reasonable evidence for their feasibility, acceptability, and initial clinical effectiveness, although they noted that most studies in the field are preliminary evaluations.
- 457 Mills and Hilberg (n 146).
- 458 Transforming Communities for Inclusion – Asia, ‘Submission to the UNCRPD Monitoring Committee, Day of General Discussion, Article 19’ http://www.ohchr.org (accessed 6 February 2018).
- 459 Varoon Mathur, Saptarshi Purkayastha and Judy Wawira Gichoya, ‘Artificial Intelligence for Global Health: Learning From a Decade of Digital Transformation in Health Care’ [2020] arXiv:2005.12378 [cs] http://arxiv.org/abs/2005.12378.
- 460 See also, Capella (n 439).
- 461 Evan Selinger, ‘A.I. Can’t Detect Our Emotions’, OneZero (6 April 2021) https://onezero.medium.com/a-i-cant-detect-our-emotions-3c1f6fce2539.
- 462 Article 19 (n 327) 19.
- 463 For overview of criticisms, see: Barrett (n 331) 12–24.
- 464 International Labour Office, World Employment and Social Outlook 2021: The Role of Digital Labour Platforms in Transforming the World of Work (ILO, 2021).
- 465 See generally Zeavin (n 47) 205–215.
- 466 Lucy Series and Luke Clements, ‘Putting the Cart before the Horse: Resource Allocation Systems and Community Care’ (2013) 35(2) Journal of Social Welfare and Family Law 207.
- 467 ‘Automating Society 2019’, AlgorithmWatch https://algorithmwatch.org/en/automating-society-2019/ 171.