1.4.2 Governing the Future of Biometric Monitoring in Mental Health Settings
Biometrics more generally are the subject of a growing field of research, practice, advocacy, activism, and law reform.158 In healthcare, the COVID-19 pandemic has accelerated the international uptake of biomonitoring, surveillance, and other public health monitoring and security technologies, whether adopted by states, private entities, or individuals.159
In the mental health context, scholarship exploring the legal, ethical, social, and political concerns raised by biometric technologies is emerging.160 More work is clearly required. Later themes discussed in this report will engage with some of the questions directly relevant to biometrics. Such questions include whether people deemed through biometric monitoring to be ‘cognitively impaired’, ‘mentally disordered’, ‘suicidal’, or likely to become any of those things, will be informed that such attributions have been made. Will they be able to opt out of the monitoring process in the first place? Will they be able to contest such labels before data are transferred to others? Given the purported ease with which mobile phone data-points can be used for automated profiling to determine cognitive impairment,161 are there sufficient safeguards to govern whether or how this should occur? More pointedly, should moratoria apply to some forms of biometric monitoring and surveillance in the mental health and disability context on the basis that they are fundamentally harmful or inconsistent with human rights? How would such a decision be made? What role is currently being played by the psychiatric and psychological sciences in advancing such technologies? What role should they play?
This is a critical moment to reflect on how the choices currently being made in various institutions concerning ‘digital mental health’ – across research, services, policies and programming – might shape future approaches to distress, anguish, mental crises and so on. To conclude Part 1, we turn to a glaring omission from these choices: the very people for whom the technologies are purportedly designed.
- 156 Schulich Law, Algorithmic Racism, Healthcare & The Law: ‘Race Based’ Data Another Trojan Horse? (19 September 2020) https://www.youtube.com/watch?v=PveOVJYIu3I.
- 157 Scully and Van Toorn (n 155).
- 158 Kak (n 100).
- 159 ‘Covid-19 Is Accelerating the Surveillance State’, The Strategist (17 November 2020) https://www.aspistrategist.org.au/covid-19-is-accelerating-the-surveillance-state/; ‘Homo Deus Author Yuval Harari Shares Pandemic Lessons from Past and Warnings for Future’, South China Morning Post (online, 1 April 2020) https://www.scmp.com/news/china/article/3077960/homo-deus-author-yuval-harari-shares-pandemic-lessons-past-and-warnings?fbclid=IwAR2b6pMEt1Gj4mpsBjSapqwL79e_tg_76eL4MLL788WYGDgTGRDbkM1H8y8.
- 160 See e.g. Cosgrove et al (n 98); Bossewitch, ‘The Rise of Surveillance Psychiatry and the Mad Underground’ (n 133); Harris, ‘The Rise of the Digital Asylum’ (n 119).
- 161 Jonas Rauber, Emily B Fox and Leon A Gatys, ‘Modeling Patterns of Smartphone Usage and Their Relationship to Cognitive Health’ [2019] arXiv:1911.05683 [cs, stat] http://arxiv.org/abs/1911.05683.