1.3 Digitising Involuntary Psychiatric Intervention and Other Coercive Measures
- Sarah Carr51
Unlike data concerning physical health, data concerning mental health can be used to initiate state-authorised coercive interventions in certain cases. This possibility adds important legal, social and political dynamics to this discussion. Although the majority of people who access some kind of service for mental health reasons will access those services voluntarily, a small but significant minority of people will be subject to involuntary psychiatric intervention – typically involving the person being detained in hospital and treated against her/his/their wishes – under existing mental health-related legislation.
1.3.1 AI-based Suicide Alerts and Self-harm Surveillance
Government agencies, social media companies, not-for-profits, health services, and others have begun using machine learning and artificial intelligence in suicide prevention, including in efforts to pre-emptively identify people who may self-harm.52 In some cases, these technologies appear to have been used to activate police powers to detain people for the purposes of involuntary psychiatric intervention.
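To make the general technique concrete, the sketch below (in Python, using scikit-learn) shows the basic shape of a text-classification pipeline of the kind such systems are commonly built around. It is purely illustrative: the posts, labels, threshold and escalation step are hypothetical, and the proprietary systems discussed in this section are far more elaborate and are trained on very different data.

```python
# Illustrative sketch only: a minimal text classifier of the general kind
# that automated self-harm "alert" systems are built around. The posts,
# labels and threshold below are hypothetical; deployed platform systems
# are proprietary, far more complex, and trained on very different data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Hypothetical labelled posts (1 = flagged by human reviewers, 0 = not flagged).
posts = [
    "I can't see any way out of this anymore",
    "Great day at the beach with friends",
    "Nobody would notice if I was gone",
    "Looking forward to the concert next week",
]
labels = [1, 0, 1, 0]

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),  # word and bigram features
    ("clf", LogisticRegression()),
])
model.fit(posts, labels)

# A score above some operational threshold might trigger escalation, for
# example surfacing support resources or, as discussed below, a 'wellness check'.
new_post = ["I just want everything to stop"]
risk_score = model.predict_proba(new_post)[0, 1]
if risk_score > 0.5:  # arbitrary threshold, for illustration only
    print(f"flagged for human review (score={risk_score:.2f})")
```

As the remainder of this section discusses, the consequential design choices lie less in the classifier itself than in what happens after a post is flagged.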
CASE STUDY: AI-Based Suicide Alerts at Facebook/Meta
In November 2018, a Facebook employee in Texas reportedly alerted police in the Indian state of Maharashtra about a 21-year-old man who had posted a suicide note on his profile. The intervention came after Facebook expanded its pattern recognition software to detect users expressing suicidal intent. Mumbai police reportedly attended the young man’s home;53 police hold powers to authorise involuntary psychiatric intervention under the Mental Healthcare Act 2017 (India). In 2018, Facebook reported that it had conducted over 1000 ‘wellness checks’ involving the dispatch of first responders.54
Facebook/Meta’s algorithmic responses also encourage peer responses from within the person’s network of contacts by drawing those contacts’ attention to the person’s apparent distress.55 These measures were developed after some form of consultation with suicide attempt survivors and experts on suicide prevention (though few details are available).56 Facebook/Meta provides some information about the algorithmic process behind the interventions,57 and has described the ethical issues with which its programmers grappled.58
However, there remains little information about what precisely is meant by a ‘wellness check’ (including whether location data are shared with first-responders). Nor is there publicly available research as to the accuracy, scale or effectiveness of the initiative. What Facebook/Meta does with the information following each apparent crisis is also unclear.
Police appear to be the first responders undertaking ‘wellness checks’. Facebook/Meta has therefore drawn criticism for failing to grapple with the reality of anti-Black racism in the US and the prevalence of police violence in encounters with distressed individuals, particularly Black and Indigenous people and other people of colour. For example, Joshua Skorburg and Phoebe Friesen write:
- While [Facebook/Meta’s wellness checks] may seem like a positive contribution to public health on Facebook’s behalf, it is becoming increasingly clear that police wellness checks can do more harm than good. Between 2015 and August 5, 2020, 1,362 people who were experiencing mental health issues were killed by police in the United States. This remarkable number constitutes 23 percent of police fatalities in that time.59
The US is by no means alone in such patterns of police violence.60
From a legal and regulatory perspective, suicide prediction in medical systems is governed by health information laws, medical practice and clinical governance regimes, and research regulations that require transparency and peer review. Flawed as these frameworks may be, AI-based suicide prediction on social media platforms, as Mason Marks points out, ‘typically occurs outside the healthcare system where it is almost completely unregulated, and corporations often maintain their prediction methods as proprietary trade secrets’.61 To remedy this, Marks recommends several steps to improve people’s safety, privacy and autonomy, including:62
- making prediction methods more transparent, and giving users unambiguous opportunities to opt-out and delete prediction information;
- protecting consumer privacy and minimising the risk of exploitation, by ensuring suicide predictions cannot be used for advertising or be shared with third parties (such as insurance companies, employers or immigration authorities); and
- ensuring that ongoing prediction programs are monitored for safety and efficacy by independent data monitoring committees.
The use of individual and population-level monitoring in efforts to prevent suicide, and advocacy for such monitoring, are likely to increase in coming years. In December 2020, the US National Suicide Prevention Lifeline administrator recommended that the US Government authorise geo-location systems to pin-point the exact location of all callers by 2022.63 Leah Harris has criticised this recommendation, warning that ‘Mad and disabled advocates who have experienced mental health crisis intervention, and even some crisis service providers, worry that geolocation would serve to further entrench coercion in mental health and crisis response systems, replicating problematic aspects of [the US emergency services line] 911’.64 The impact of automated surveillance of callers on rates of involuntary psychiatric interventions, police involvement in crises, citizens’ willingness to report to such services, and so on, remains unknown.
1.3.2 Digitising Mental Health Law
Some governments have sought to digitise processes of involuntary psychiatric intervention.65
CASE STUDY: Electronic Forms and Mobile Technology in Involuntary Psychiatric Interventions
In the UK in 2020, regulations were amended to speed up applications for compulsory psychiatric intervention orders by providing an online communication platform for the mental health professionals involved in involuntary interventions. This web-based interface allows social workers, nurses, psychologists and others who are interacting with a person in crisis to locate medical practitioners and communicate with them via videocall; those practitioners may then assess the person and authorise involuntary intervention. One online platform to emerge with government support is reportedly used by over 70% of National Health Service Trusts at the time of writing.66 David Bradley, the Chief Executive of South London and Maudsley NHS Foundation Trust, strongly endorses the practice, describing it as ‘[t]he Uber of finding doctors for the health service’.67 Relevant doctors can enter their availability on a personal calendar and ‘build a profile containing their location, specialities and languages spoken, and monitor their activity via a dashboard’.68
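As a purely illustrative sketch, the matching step described above (filtering available doctors by time, language and proximity) might look something like the following Python outline. The data model, field names and crude straight-line distance measure are assumptions for illustration only and do not represent the vendor’s actual implementation.

```python
# Hypothetical sketch of the kind of matching logic such a platform might use.
# Field names and the (crude, straight-line) distance measure are assumptions
# for illustration; this is not the actual platform's implementation.
from dataclasses import dataclass, field
from datetime import datetime
from math import dist
from typing import Optional

@dataclass
class DoctorProfile:
    name: str
    location: tuple[float, float]  # (latitude, longitude) entered in the profile
    specialities: list[str] = field(default_factory=list)
    languages: list[str] = field(default_factory=list)
    availability: list[tuple[datetime, datetime]] = field(default_factory=list)

    def is_available(self, when: datetime) -> bool:
        """True if the requested time falls within any calendar slot the doctor entered."""
        return any(start <= when <= end for start, end in self.availability)

def find_doctors(doctors: list[DoctorProfile], when: datetime,
                 assessment_location: tuple[float, float],
                 language: Optional[str] = None) -> list[DoctorProfile]:
    """Return available doctors, nearest first, optionally filtered by language spoken."""
    candidates = [
        d for d in doctors
        if d.is_available(when) and (language is None or language in d.languages)
    ]
    return sorted(candidates, key=lambda d: dist(d.location, assessment_location))
```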
Proponents suggest that electronic forms and digital platforms will improve access to care, reduce errors, and improve information sharing, ultimately reducing the distress of the individuals concerned and preventing delays in the provision of healthcare.69 However, some mental health service users have raised concerns about the unknown impact of the digitised process on people subject to orders; these concerns are potentially serious and warrant closer attention.70
In recent years, the processing of data about people subject to involuntary psychiatric intervention through electronic records systems has, in some cases, harmed people with lived experience. Concerns about police agencies sharing data concerning self-harm were raised in Canada, where municipal police collated non-criminal information about individuals who had self-injured or attempted suicide.71 The information was then circulated to US border authorities, who used it to deny several Canadians entry into the US. (This example will be discussed at page 52). The ease with which people’s sensitive data concerning involuntary treatment can be accessed by various government departments has raised concerns about potentially unlawful uses of that data.
CASE STUDY: ‘Serenity Integrated Monitoring’ – Sharing Sensitive Information and Flagging ‘High Intensity Users’ of Mental Health Services
The Serenity Integrated Monitoring (SIM) program, which operated within National Health Service mental health trusts in England, flagged people identified through local Mental Health Act data as ‘high intensity users’ of emergency mental health services.72 The program involved police officers, described as ‘High Intensity Officers’, regularly contacting flagged individuals to dissuade them from ‘unnecessary’ interactions with emergency health services, and to instead arrange more ‘appropriate’ support.73 Major concerns with the program were reported in May 2021:
- [w]hen tagged under the system, patients can be denied care, prevented from seeing doctors or psychiatrists, and sent home. An NHS doctor told [journalists] that he had to turn away a woman who had attempted suicide on multiple occasions because she had been assigned to the SIM scheme. He considered resigning as a result.74
The Royal College of Psychiatrists reported that where a person ‘remained unwell and continued to self-harm, attempt suicide or report suicidality, in some cases they were prosecuted and imprisoned or community protection notices were applied which required them to stop self-harming or calling for help, with imprisonment as a potential sanction if they breached the notice’.75
StopSIM Coalition, a ‘grassroots network of service users and allies’, raised concerns that the program ‘allows “sensitive data” (information like medical records, ethnicity, religion, sexuality, gender reassignment and financial information) to be shared between services without the subject’s consent … (for example, as a consequence of calling [emergency services] when feeling suicidal)’.76
The SIM program is being reviewed by the National Health Service at the time of writing, although it reportedly remains in place in 23 National Health Service mental health trusts in England77 and is being trialled in three US states.78
The SIM program will be discussed later in the report in sections on accountability and privatisation (page 58). SIM also appeared to have disproportionate impacts along lines of race and class (discussed at page 69).
1.3.3 Power and Coercion in Mental Health
Other areas of law around the world overtly discriminate against people with lived experience and psychosocial disability, which adds to the sensitivity of data concerning mental health. Discrimination in law can include preventing a person with a mental health diagnosis from holding public office, migrating to particular countries, or working in particular professions.79 Indeed, some countries continue to criminalise suicide attempts. For example, Section 226 of Kenya’s penal code states that ‘any person who attempts to kill himself [sic] is guilty of a misdemeanour’.80 Around 20 countries still criminalise suicide attempts, according to a 2021 report by the International Association for Suicide Prevention and United for Global Mental Health.81 Automated suicide alert programs must therefore be applied with extreme caution (see the section on Non-Discrimination and Equity below).
There are also well-documented instances in which people with psychosocial disabilities and mental health diagnoses have been subject to political scapegoating and public scare campaigns that attract intrusive and discriminatory proposals for state intervention.
CASE STUDY: ‘SAFEHOME for Stopping Aberrant Fatal Events by Helping Overcome Mental Extremes’ – Proposed Behavioural Monitoring and Preventive Policing
In 2019, the Washington Post reported that a prominent US businessman briefed top officials of the Trump administration, including the then president and vice president, on a proposal ‘to create a new research arm called the Health Advanced Research Projects Agency’.82 The businessman promoted a program titled ‘SAFEHOME for Stopping Aberrant Fatal Events by Helping Overcome Mental Extremes’, which called for experimentation to explore whether ‘technology including phones and smartwatches can be used to detect when mentally ill people are about to turn violent’.83 The proposal had not been pursued by the time Donald Trump left office in January 2021.
One broader category of biometric monitoring technology, known as ‘anomaly detection’, may have repercussions for people with psychosocial disabilities and lived experience if used in public surveillance. According to one report, automated surveillance systems are designed to undertake ‘automatic detection and tracking of unusual objects and people’.84 The literature on anomaly detection, according to a report for the ACLU, is ‘full of discussion of algorithms that can detect people or behaviours that are “unusual,” “abnormal,” “deviant,” or “atypical”’.85 (See above the discussion about the politics of terminology in the mental health context, including characterisations of ‘deviance’ and ‘abnormality’, page 13).86 The authors warn that identifying statistical deviance is not a negative thing per se, but that when it ‘shades into finding deviant people it should raise alarms’.87 Given the historical and ongoing exclusion of, and social hostility toward, people with lived experience and psychosocial disability, as well as people with other disabilities, such as intellectual and cognitive disabilities, this possibility is particularly troubling.
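To illustrate what ‘statistical deviance’ means in practice, the following Python sketch applies a standard outlier-detection model (scikit-learn’s IsolationForest) to hypothetical behavioural features. It is illustrative only: the features, data and thresholds are invented, and real video-analytics systems extract far richer signals. The point is that the model flags whatever is statistically atypical; it encodes no notion of risk, intent or wrongdoing.

```python
# Illustrative sketch only: "anomaly detection" in the statistical sense is
# outlier scoring over feature vectors. The behavioural features and data here
# are invented; real video-analytics systems extract far richer signals.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical per-person features derived from video, e.g.
# [walking_speed_m_per_s, seconds_stationary, direction_changes_per_minute].
typical_behaviour = rng.normal(loc=[1.4, 10.0, 2.0], scale=[0.2, 5.0, 1.0], size=(500, 3))

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(typical_behaviour)

# Someone moving slowly, pausing often and changing direction is scored as
# "anomalous" simply because they are statistically atypical; the model itself
# encodes no notion of risk, intent or wrongdoing.
observation = np.array([[0.4, 120.0, 8.0]])
print(detector.predict(observation))        # -1 means flagged as anomalous
print(detector.score_samples(observation))  # lower scores are more anomalous
```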
A less experimental form of electronic monitoring at the intersection of mental health and criminal justice is global positioning system (GPS) monitoring of people in forensic mental health services. Electronic monitoring devices fixed to a person’s body record and regularly transmit data on her or his location. Some GPS devices, such as those affixed to a person’s wrist or ankle, can be linked to blood-alcohol monitors.88
CASE STUDY: GPS Surveillance of Forensic Psychiatric Patients in Three Jurisdictions
Two jurisdictions in Australia authorise health services to impose involuntary ‘monitoring conditions’ on people detained in forensic psychiatric settings using electronic GPS devices, typically in the form of electronic ankle bracelets.89 In one jurisdiction, the program was advanced by government against the submissions and evidence of medical practitioners.90 In an appeal brought by a man subject to the surveillance regime,91 a treating psychiatrist submitted that ‘[n]ot only did [the] device add nothing to his clinical management or risk reduction, it had the effect of hindering his rehabilitation’.92
In England and Wales, GPS surveillance of people in forensic mental health settings is only possible if they consent to it.93 In Nova Scotia, Canada, legislators have prohibited GPS surveillance of forensic mental health patients in any form, with lawmakers citing concerns that it violates human rights.94 The province commissioned three reports into the clinical and legal issues, and each study indicated that ‘there was no support or even speculative support that electronic monitoring would enhance public safety’.95
More issues concerning involuntary psychiatric interventions and computer technology will emerge in coming years, raising pressing questions. Will monitoring devices be imposed in involuntary psychiatric interventions in the civil context, such as ‘community treatment orders’? Should algorithmic technologies be used at all in coercive crisis responses? How might these concerns relate to broader efforts in recent years to reduce and eliminate coercion in mental health settings, and to debates about ‘abolishing versus reforming’ involuntary psychiatric interventions?96
The future of algorithmic and data-driven technologies in coercive state interventions remains uncertain—but imagined futures are guiding activity today. As one industry publication promoting technology in healthcare stated:
- In the future, patients might go to the hospital with a broken arm and leave the facility with a cast and a note with a compulsory psychiatry session due to flagged suicide risk. That’s what some scientists aim for with their A.I. system developed to catch depressive behavior early on and help reduce the emergence of severe mental illnesses.97
This imagined future is one possibility. Others will reject this vision of expanded risk prediction and technology-facilitated coercion, and instead promote the development of open and co-operative crisis support relationships that are enhanced by the selective use of digital technology. These contested futures suggest that the power dynamics created by coercion in mental health services must be part of the discussion concerning ‘digital mental health’ measures today.
- 51 Sarah Carr, ‘“AI Gone Mental”: Engagement and Ethics in Data-Driven Technology for Mental Health’ (2020) 0(0) Journal of Mental Health 1.
- 52 Marks, Artificial Intelligence Based Suicide Prediction (n 45).
- 53 Vijay K Yadav, ‘Mumbai Cyber Cops Log into Facebook to Curb Suicides’, Hindustan Times (online, 5 November 2018) https://www.hindustantimes.com/mumbai-news/mumbai-cyber-cops-log-into-facebook-to-curb-suicides/story-SMd03alcW0SUBzRJlmdDZJ.html.
- 54 Norberto Nuno Gomes de Andrade et al, ‘Ethics and Artificial Intelligence: Suicide Prevention on Facebook’ (2018) 31(4) Philosophy & Technology 669.
- 55 Catherine Card, ‘How Facebook AI Helps Suicide Prevention’, Facebook Newsroom (10 September 2018) https://newsroom.fb.com/news/2018/09/inside-feed-suicide-prevention-and-ai/.
- 56 Gomes de Andrade et al (n 54).
- 57 Card (n 55).
- 58 Gomes de Andrade et al (n 54).
- 59 Joshua August Skorburg and Phoebe Friesen, ‘Mind the Gaps: Ethical and Epistemic Issues in the Digital Mental Health Response to Covid-19’ (2021) 51(6) Hastings Center Report 23.
- 60 See eg. Piers Gooding, ‘“The government is the cause of the disease and we are stuck with the symptoms”: deinstitutionalisation, mental health advocacy and police shootings in 1990s Victoria’ in G Goggin, L Steele, and R Cadwallader (Eds.) Normality and Disability: Intersections among Norms, Law, and Culture (Routledge, 2018) 100-110; Anthony J O’Brien et al, ‘The Nature of Police Shootings in New Zealand: A Comparison of Mental Health and Non-Mental Health Events’ (2021) 74 International Journal of Law and Psychiatry 101648.
- 61 Marks, Artificial Intelligence Based Suicide Prediction (n 45).
- 62 Ibid.
- 63 Vibrant Emotional Health, 988 Serviceable Populations and Contact Volume Projections (Vibrant, December 2020) https://www.vibrant.org/wp-content/uploads/2020/12/Vibrant-988-Projections-Report.pdf.
- 64 L Harris, ‘The New National Mental Health Crisis Line Wants to Track Your Location’, Disability Visibility Project (19 April 2021) https://disabilityvisibilityproject.com/2021/04/19/the-new-national-mental-health-crisis-line-wants-to-track-your-location.
- 65 In the UK, for example, a large-scale government review of mental health legislation recommended the ‘digitising of the Mental Health Act’: HM Government, Modernising the Mental Health Act: Increasing Choice, Reducing Compulsion – Final Report of the Independent Review of the Mental Health Act 1983 (Crown, December 2018) https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/778897/Modernising_the_Mental_Health_Act_-_increasing_choice__reducing_compulsion.pdf.
- 66 This platform emerged from a public-private partnership, with funding from the NHS Innovation Accelerator (NIA), NHS England’s Innovation and Technology Payment Evidence Generation Fund, NHS England’s Clinical Entrepreneur programme and DigitalHealth.London’s Accelerator. S12 Solutions Website, www.s12solutions.com [accessed 3/3/2021].
- 67 S12 Solutions, ‘What is S12 Solutions?’ Twitter (21 Jan 2020) https://twitter.com/S12Solutions/status/1219262300667961349 [accessed 19/05/2021].
- 68 Doctors may also use the app to register assessments that were undertaken, provide supporting evidence for professional development, and complete payment claims for their assessments. The app also generates data about the Mental Health Act assessment process.
- 69 Thalamos, ‘Mental Health Act Forms: The Benefits of Going Digital’, Thalamos.co.uk (10 November 2020) https://www.thalamos.co.uk/2020/11/10/mental-health-act-forms-the-benefits-of-going-digital/. Small pilot evaluations appear to support this view. S12 Solutions (2017d). Pilot Evaluation. NHS Innovations Accelerator. Available from: https://nhsaccelerator.com/wp-content/uploads/2019/05/S12-Solutions-pilot-evaluation1.pdf (accessed 13/07/2021).
- 70 M. Stevens, et al. The availability of section 12 doctors for Mental Health Act assessments - a scoping review of the literature. NIHR Policy Research Unit in Health and Social Care Workforce, The Policy Institute, King’s College London, p.16; Mental Elf, Digitising the Mental Health Act: A Public Debate #DigitalMHA (26 June 2020) https://www.youtube.com/watch?v=Yuzkctpv1dA.
- 71 Office of the Privacy Commissioner of Canada, ‘Disclosure of Information about Complainant’s Attempted Suicide to US Customs and Border Protection Not Authorized under the Privacy Act’ (21 September 2017) https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-federal-institutions/2016-17/pa_20170419_rcmp/.
- 72 The individuals were chosen based on local health authority ‘Mental Health Act data for the previous year to define which borough/geographical area had the highest proportion of high intensity users of [Section] 136’. Aileen Jackson and Josh Brewster, The Implementation of SIM London: Sharing Best Practice for Spread and Adoption (June 2018) 6 https://healthinnovationnetwork.com/wp-content/uploads/2018/11/The-Implementation-of-SIM-London-Report.pdf.
- 73 Royal College of Psychiatrists (UK), ‘RCPsych Calls for Urgent and Transparent Investigation into NHS Innovation Accelerator and AHSN Following HIN Suspension’, www.rcpsych.ac.uk (14 June 2021) https://www.rcpsych.ac.uk/news-and-features/latest-news/detail/2021/06/14/rcpsych-calls-for-urgent-and-transparent-investigation-into-nhs-innovation-accelerator-and-ahsn-following-hin-suspension (accessed 9/9/21). Those flagged in annual Mental Health Act data tend to be very unwell and regularly phone emergency services or arrive at hospitals having self-harmed, attempted suicide, or threatened to take their own life.
- 74 Patrick Strudwick, ‘Campaigners Call for Inquiry after Mental Health Patients Turned Away by NHS under Controversial Scheme’, i (online, 16 June 2021) https://inews.co.uk/news/nhs-mental-health-stop-sim-inquiry-1056296.
- 75 Royal College of Psychiatrists (UK) (n 73).
- 76 StopSIM Coalition, ‘STOPSIM’, STOPSIM (n.d.) https://stopsim.co.uk/.
- 77 An NHS Trust is an organisational unit of the National Health Service that generally serves either a geographical area or a specialised function.
- 78 Maryam Jameela, ‘Outrage Grows as Police Embed Themselves in Mental Health Services’, The Canary (online, 22 May 2021) https://www.thecanary.co/investigations/2021/05/22/outrage-grows-as-police-embed-themselves-in-mental-health-services/.
- 79 Pūras and Gooding (n 27).
- 80 Laws of Kenya, The Penal Code, Chapter 63, Revised Edition 2009 (2008) s 226.
- 81 United for Global Mental Health, Decriminalising Suicide: SAVING LIVES, REDUCING STIGMA (International Association for Suicide Prevention, 2021) https://unitedgmh.org/sites/default/files/2021-09/UNITEDGMH%20Suicide%20Report%202021%C6%92.pdf.
- 82 William Wan, ‘White House Weighs Controversial Plan on Mental Illness and Mass Shootings’, Washington Post (9 September 2019) https://www.washingtonpost.com/health/white-house-considers-controversial-plan-on-mental-illness-and-mass-shooting/2019/09/09/eb58b6f6-ce72-11e9-87fa-8501a456c003_story.html.
- 83 Ibid.
- 84 Wallace Lawson, Laura Hiatt and Keith Sullivan, ‘Detecting Anomalous Objects on Mobile Platforms’ in 2016 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) (IEEE, 2016) 1426 https://ieeexplore.ieee.org/document/7789669
- 85 Jay Stanley, The Dawn of Robot Surveillance: AI, Video Analytics, and Privacy (American Civil Liberties Union, 2019).
- 86 One research group, for example, proposed that ‘computer vision’ designed to detect violence could be ‘extremely useful in some video surveillance scenarios like in prisons, psychiatric or elderly centers’. Enrique Bermejo Nievas et al, ‘Violence Detection in Video Using Computer Vision Techniques’ in Pedro Real et al (eds), Computer Analysis of Images and Patterns (Springer Berlin Heidelberg, 2011) 332 http://link.springer.com/10.1007/978-3-642-23678-5_39.
- 87 Stanley (n 85).
- 88 A Board-Certified Physician, ‘SCRAM Ankle Bracelet Measures Alcohol Consumption’, Verywell Mind https://www.verywellmind.com/scram-ankle-bracelet-measures-blood-alcohol-247-67122.
- 89 Stephanie Miller, ‘The Use of Monitoring Conditions (GPS Tracking Devices) Re CMX [2014] QMHC 4’ (2015) 22(3) Psychiatry, Psychology and Law 321.
- 91 Re CMX [2014] QMHC 4 (Australia).
- 92 Ibid [42]-[43].
- 93 John Tully et al, ‘Service Evaluation of Electronic Monitoring (GPS Tracking) in a Medium Secure Forensic Psychiatry Setting’ (2016) 27(2) The Journal of Forensic Psychiatry & Psychology 169. Informed consent, it should be noted, is profoundly affected by the power asymmetry inherent in forensic mental health services; nevertheless, the contrast between the approaches of Queensland and of England and Wales is significant. Regarding empirical evidence in support of the scheme’s ‘efficacy’ in reducing adverse events, John Tully and his group of UK researchers reported a major reduction in ‘[e]pisodes of leave violation… which suggest potential benefits for speed of patient recovery, reduced length of stay, reduced costs and public safety’. Ibid p.169.
- 94 Donalee Moulton, ‘Nova Scotia Sets Direction on GPS Monitoring of Patients’ (2015) 187(8) Canadian Medical Association Journal E232.
- 95 Ibid.
- 96 Committee on the Rights of Persons with Disabilities, ‘General Comment No 1: Article 12 – Equal Recognition before the Law’, 11th Sess, UN Doc CRPD/C/GC/1; Tina Minkowitz, ‘The United Nations Convention on the Rights of Persons with Disabilities and the Right to Be Free from Nonconsensual Psychiatric Interventions’ (2007) 34(2) Syracuse Journal of International Law and Commerce 505; Kay Wilson, Mental Health Law: Abolish or Reform? (Oxford University Press, 2021).