0.5 Minding Language about Mental Health and Technology
Finding appropriate terminology in global discussions about mental health is also challenging. There is no single set of definitions to describe people’s experiences of mental health; indeed, notions of ‘mental health’ are contested. Different terms may be preferred according to national and cultural norms, professional conventions, and so on.
The term ‘people with lived experience and psychosocial disability’ will be used in this report to describe people with firsthand experience of mental health services, mental health crises, extreme distress, psychosis, and so on. We have sought terminology that conveys our intent to a diverse audience, though we acknowledge that language around mental health is often contested.17
People with lived experience may understand in varying ways the experiences that are often called ‘mental health conditions’, ‘mental health challenges’ or ‘mental illness’. We acknowledge that mental health can be described using terms such as emotional distress, trauma, mental health crisis and neurodiversity; that people may describe themselves as ‘service users’, ‘consumers’, ‘psychiatric survivors’, ‘disabled’, ‘ex-patients’ and so on; and that others may reject designations of their experiences in the terms of psychiatry and psychology.
The term ‘people with psychosocial disability’ may not be familiar to readers. It is simply used to refer to people with disability related to mental health. The term has become prominent since the UN Convention on the Rights of Persons with Disabilities (‘CRPD’) came into force in 2008, and its definition is crucial for this report. The CRPD establishes that ‘disability’ includes ‘mental impairment’ (Article 1). Importantly, the CRPD also covers people who experience harms due to imputed impairment or disability—that is, where a person is perceived by others to be impaired or disabled. This is important here because some algorithmic technologies purportedly function to deduce the inner states of human beings, including inferring mental health conditions and cognitive impairments. For example, at the time of writing, Apple is working with the multinational biotechnology company Biogen and UCLA to explore using sensor data (such as mobility, sleep patterns, swiping patterns and more) to infer mental health and cognitive decline.18 Whatever the accuracy of such predictions, there remains a very real possibility of harm to people on the basis of such data, even where those data are false, misleading or inaccurate.
The wide definition of psychosocial disability also highlights that all persons may interact with systems that generate data concerning their ‘mental health’. If you carry a smartphone into a counselling service, visit a depression information website, write about distress on a social media platform, or even simply type and scroll on a mobile device, then various ‘digital trails’ will be generated that could be used to infer particular mental states—including mental health conditions, distress and cognitive impairment.19 This is not to endorse or accept such claims but to reiterate that the rise of Big Data and AI has increased the likelihood of inferences and predictions being drawn from the behaviours, preferences, and private lives of individuals. These data have been described variously as ‘emergent health data’20 or ‘indirect, inferred, and invisible health data’.21 These forms of sensitive personal data create new opportunities for discriminatory, biased, and invasive decision-making about individuals and populations.22 High-risk applications of technologies are likely to have the greatest impact on people who are traditionally marginalised, such as those who are using mental health services, those who are subject to involuntary mental health interventions, and those with intersecting forms of marginalisation, but the range of technologies discussed in this report raises issues that affect everyone.23
The term ‘data concerning mental health’ is therefore used throughout this report to mean personal data related to the mental health of a person, including the provision of health care services, which reveal information about their mental health status.24 Questions remain about what exactly constitutes ‘data concerning mental health’. The rising use of indirect, inferred, and ‘invisible’ health data is blurring the boundaries between ‘patient’, ‘service user’, ‘consumer’ and ‘citizen’, creating multiple issues around the commodification and commercialisation of health, the rise of ‘bio-surveillance’, and other issues with profound implications in the mental health context and beyond.
This report occasionally departs from these key terms when referring to specific data sources, describing research, or quoting an individual or organisation, in order to accurately reflect the views presented in those materials.
Stop the Algorithm by Stephanie Kneissl and Max Lackner in Science Gallery Melbourne’s MENTAL. Photo by Alan Weedon.
- 16 Claude Castelluccia et al, Understanding Algorithmic Decision-Making: Opportunities and Challenges (2019) http://www.europarl.europa.eu/RegData/etudes/STUD/2019/624261/EPRS_STU(2019)624261_EN.pdf.
- 17 LD Green and Kelechi Ubozoh, We’ve Been Too Patient: Voices from Radical Mental Health--Stories and Research Challenging the Biomedical Model (North Atlantic Books, 2019).
- 18 Rolfe Winkler, ‘Apple Is Working on iPhone Features to Help Detect Depression, Cognitive Decline’, Wall Street Journal (online, 21 September 2021) https://www.wsj.com/articles/apple-wants-iphones-to-help-detect-depression-cognitive-decline-sources-say-11632216601.
- 19 Rachel Metz, ‘The Smartphone App That Can Tell You’re Depressed before You Know It Yourself’ (15 October 2018) MIT Technology Review https://www.technologyreview.com/s/612266/the-smartphone-app-that-can-tell-youre-depressed-before-you-know-it-yourself; Christophe Olivier Schneble, Bernice Simone Elger and David Martin Shaw, ‘All Our Data Will Be Health Data One Day: The Need for Universal Data Protection and Comprehensive Consent’ (2020) 22(5) Journal of Medical Internet Research e16879; Rui Wang, Andrew T Campbell and Xia Zhou, ‘Using Opportunistic Face Logging from Smartphone to Infer Mental Health: Challenges and Future Directions’ in Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers (Association for Computing Machinery, 2015) 683 https://doi.org/10.1145/2800835.2804391.
- 20 Mason Marks, Emergent Medical Data: Health Information Inferred by Artificial Intelligence (SSRN Scholarly Paper No ID 3554118, Social Science Research Network, 14 March 2020) https://papers.ssrn.com/abstract=3554118.
- 21 Schneble, Elger and Shaw (n 19).
- 22 Sandra Wachter and Brent Mittelstadt, ‘A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI’ (2019) 2019(2) Columbia Business Law Review 494.
- 23 This point was borrowed from the excellent report on new technologies of migration management by Petra Molnar. See Petra Molnar, ‘Technological Testing Grounds: Migration Management Experiments and Reflections from the Ground Up’ (EDRi, 2019) 9 https://edri.org/wp-content/uploads/2020/11/Technological-Testing-Grounds.pdf.
- 24 This definition draws on the GDPR definition of ‘data concerning health’, which is defined as ‘personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about his or her health status’: EU GDPR, Article 4(15).