2.4.3 Equality
Equality is another generally accepted aspiration, centred on the goal of ensuring the same opportunities and protections for people interacting with algorithmic systems. Some have taken this aim further in seeking to use algorithmic systems to ‘eliminate relationships of domination between groups and people based on differences of power, wealth, or knowledge’ and ‘produce social and economic benefits for all by reducing social inequalities and vulnerabilities.’340
Any discussion of equality in the mental health context must acknowledge existing inequalities in how mental health issues play out. Psychological distress does not occur equally across society: people who are poorer, or who belong to disadvantaged, marginalised and oppressed groups, are more likely to experience distress, psychosis, trauma, mental health conditions and psychosocial disabilities.
Inequalities also appear within mental health services themselves. There is inequality of access; for example, in the UK, older people are underrepresented in talking therapies341 and Black British men are overrepresented in involuntary psychiatric interventions.342 Inequalities of experience also arise. In Aotearoa New Zealand, people experiencing higher economic deprivation report lower satisfaction with health services than others, and this group disproportionately includes people of Māori, Pacific or Asian ethnicity.343 In high-income countries, Black people receive psychiatric diagnoses at disproportionately high rates, and there is much research and debate about why this is the case.344
Some would argue that algorithmic and data-driven technologies could be used to address such inequalities, for example, by helping to identify inequities in service systems or by analysing the complex socio-economic dimensions of mental health problems. Yet there is also potential for such technologies to replicate and even exacerbate inequalities. Sarah Carr, speaking from a UK perspective, has pointed out that because Black British men are already more likely to be subject to involuntary psychiatric intervention, algorithmic approaches to service provision could exacerbate patterns of coercive intervention against racialised minorities.345 Psychiatric diagnoses are already skewed with respect to race. In the US, for example, Black and minority ethnic groups tend to receive more ‘severe’ diagnostic categories (e.g. schizophrenia rather than schizo-affective disorder), for diverse reasons, including mental health practitioners seeking to ensure a low-income person can qualify for housing or social security, which a more ‘severe’ diagnosis might afford.346
The ‘Serenity Integrated Monitoring’ program in the UK, noted earlier ( X), which involves analysing health authority data to identify people repeatedly subject to forced psychiatric treatment and referring them to a police monitoring program, has been criticised for the likelihood that it will have ‘violent consequences [that] disproportionately impact Black, Asian and minority ethnic communities’.347 Sage Stephanou, founder of the Radical Therapist Network (RTN), stated:
- SIM perpetuates the prison industrial complex by monitoring and gatekeeping healthcare support and ultimately criminalises people who experience significant mental illness and trauma, often exasperated by systematic racism, oppression and adverse experiences. … SIM will exasperate the very real and legitimate fear that if racialised individuals access mental health support, or report abuse, they are at risk of systemic violence under the guise of care. Police involvement often escalates risk, creating dangerous situations through the use of physical restraint, coercive, unethical forms of treatment, detainment, and higher chances of Black and brown people dying whilst in police custody.348
Again, the SIM program used data-driven technology at only a small but crucial point in the program. Yet the example suggests that efforts to prevent harm, and to distribute equally the benefits of using algorithmic and data-driven technology to address distress and healing, must move beyond vague notions of equality and fairness.
As a final point on equality for this section, broader structural questions may be asked about growing inequality facilitated, and indeed accelerated, by the algorithmic and data-driven systems that power the information economy. The impact of social, political, and economic structures on mental health is well established. Drilling down into how a particular algorithmic system promotes or threatens equal opportunities for individuals with a mental health diagnosis under current conditions may distract from the way human distress arises primarily as a form of social suffering, a consequence of poverty, precarity, violence, and trauma, to which growing inequality in many countries will continue to be a major contributor. Technological change plays a crucial role in these broader trends, and that role must inform discussions about new and emerging technology aimed at ameliorating distress. These issues will be discussed later in the report, in the section concerning public interest and societal wellbeing (page 83).
- 334 Marks, ‘Algorithmic Disability Discrimination’ (n 13).
- 335 Ibid.
- 336 Ibid.
- 337 Dana Pessach and Erez Shmueli, ‘Algorithmic Fairness’ [2020] arXiv:2001.09784 [cs, stat] http://arxiv.org/abs/2001.09784.
- 338 Goldenfein (n 75), p.130.
- 339 Fjeld et al (n 190).
- 340 University of Montreal, ‘Montreal Declaration for a Responsible Development of Artificial Intelligence’ (2018) 13 https://www.montrealdeclaration-responsibleai.com/the-declaration.
- 341 Rob Saunders et al, ‘Older Adults Respond Better to Psychological Therapy than Working-Age Adults: Evidence from a Large Sample of Mental Health Service Attendees.’ (2021) 294(1) Journal of Affective Disorders 85.
- 342 V Lawrence et al, ‘Ethnicity and Power in the Mental Health System: Experiences of White British and Black Caribbean People with Psychosis’ [2021] Epidemiology and Psychiatric Sciences.
- 343 Carol HJ Lee and Chris G Sibley, ‘Demographic and Psychological Correlates of Satisfaction with Healthcare Access in New Zealand’ (2017) 130(1459) New Zealand Medical Journal 14.
- 344 Mary O’Hara, ‘Black and Minority Ethnic Mental Health Patients “marginalised” under Coalition’, The Guardian (online, 17 April 2012) https://www.theguardian.com/society/2012/apr/17/bme-mental-health-patients-marginalised.
- 345 Carr (n 53).
- 346 Richard Sears, ‘Combatting Structural Racism and Classism in Psychiatry: An Interview with Helena Hansen’, Mad in America (13 October 2021) https://www.madinamerica.com/2021/10/interview-helena-hansen/.
- 347 Jameela (n 80).
- 348 Ibid.