2.1.6 Informed Consent
Rights of autonomy and decision-making have been a crucial concern in traditions of service user and survivor advocacy, activism and research. Informed consent is a key component of upholding the right to privacy, but it also has far broader importance: it is central to rights of autonomy and decision-making, as reflected in human rights instruments such as the Convention on the Rights of Persons with Disabilities (see articles 3 (general principles) and 12 (equal recognition before the law)). As UN Special Rapporteur on the rights of persons with disabilities Gerard Quinn points out, like other human rights instruments the ‘Convention requires that consent should be informed, real, transparent, effective and never assumed’, and this is certainly the case in algorithmic and data-driven developments.246 According to Quinn, autonomy is implicated ‘where machine learning uses profiling and other decisions affecting persons with disabilities without their knowledge’.247
Informed consent is particularly important for digital forms of diagnosis or proxy-diagnosis. The consequences of being diagnosed and pathologised in the mental health context, whether accurately or not, are often profound. Indeed, algorithmic and data-driven technological interventions in mental health services, or in commercialised products that have a significant impact on individuals, should never occur without those individuals’ free and informed consent. All informed consent processes in the digital context should provide sufficient detail of safety and security measures, including information about the person or entity that monitors compliance (see also Recommendation 5).
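To make this concrete, the minimal Python sketch below models a digital consent record for a hypothetical mental health service. The class and field names are assumptions introduced for illustration and are not drawn from any standard or existing system; they simply mirror what the text says a digital consent process should capture: consent that is free and informed rather than assumed, details of safety and security measures, and the person or entity that monitors compliance.

```python
# Hypothetical sketch only: a structured record of digital informed consent.
# Class and field names are illustrative, not drawn from any standard.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    subject_id: str               # the person giving (or withholding) consent
    purpose: str                  # the specific use, e.g. "mood-tracking analytics"
    safety_measures: list[str]    # plain-language description of safeguards
    security_measures: list[str]  # e.g. "data encrypted at rest and in transit"
    compliance_monitor: str       # person or entity that monitors compliance
    freely_given: bool = False    # never assumed; must be explicitly affirmed
    informed: bool = False        # the person received and understood the details
    granted_at: datetime | None = None

    def grant(self) -> None:
        """Record consent only if it is both free and informed."""
        if not (self.freely_given and self.informed):
            raise ValueError("Consent must be free and informed, never assumed.")
        self.granted_at = datetime.now(timezone.utc)

    def withdraw(self) -> None:
        """Consent is revocable at any time."""
        self.granted_at = None
```

On this model, consent defaults to not granted: the record must be affirmatively completed before grant() succeeds, reflecting the ‘never assumed’ standard quoted above.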
Overall, privacy is probably the most prominent theme in public discussion of the ethical and legal issues surrounding data concerning people’s mental health,248 though this does not mean it is the most important. It could reasonably be asked whether privacy should dominate such discussion in comparison to other concerns, as it tends to reduce the conversation to the level of the individual (rather than, say, the social and economic underpinnings of distress, or collective claims to data as a democratic resource rather than an individually owned artefact). Nevertheless, much work remains in applying principles of privacy to the mental health context in the digital era. This includes consideration of:249
- Control over the use of data;
- Ability to restrict processing (the power of data subjects to have their data restricted from use in connection with algorithmic technologies);
- The right to rectification (the power of a person to modify or amend information held by a data controller if it is incomplete or incorrect);
- The right to erasure (a person’s enforceable right to the removal of their data); and
- The threat that market dominance by tech platforms poses to privacy in general (where the more market power a technology firm commands, the more people must trade away their privacy to participate in social relations, civic life, wellbeing, and so on). A schematic sketch of how several of these rights might be expressed in a data controller’s systems follows this list.
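The sketch below is a minimal, hypothetical illustration rather than an implementation of any particular statute or vendor API: it expresses the ability to restrict processing, the right to rectification and the right to erasure as operations a data controller’s service could be obliged to expose and audit. All names (DataSubjectRights, restrict_processing, and so on) are assumptions introduced for illustration.

```python
# Hypothetical interface: the data-subject rights above as controller-side
# operations. Names are illustrative, not drawn from any real statute or API.
from abc import ABC, abstractmethod


class DataSubjectRights(ABC):
    @abstractmethod
    def restrict_processing(self, subject_id: str, scope: str) -> None:
        """Bar the subject's data from a named use,
        e.g. scope='algorithmic profiling'."""

    @abstractmethod
    def rectify(self, subject_id: str, field_name: str, corrected_value: str) -> None:
        """Amend information held by the controller that is
        incomplete or incorrect."""

    @abstractmethod
    def erase(self, subject_id: str) -> None:
        """Remove the subject's data entirely (the right to erasure)."""
```

Framing the rights as enforceable operations rather than policy statements is the design point: each call gives regulators and data subjects something concrete against which to test compliance, and a real system would also need to propagate restriction and erasure to downstream processors.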
Any major effort to unpack these issues requires the active involvement of those most affected. It is also now unavoidable that new government regulation and robust enforcement are needed to protect privacy in the face of algorithmic and data-driven technologies. As the advocacy organisation Access Now notes, ‘data protection legislation can anticipate and mitigate many of the human rights risks posed by AI [and other algorithmic technologies]’.250 The Access Now position echoes a growing demand by some advocates for new data laws, enforceable penalties, and resources for affected communities to contribute proactively to enforcement.251 The need for law reform remains a subject of expanding scholarship that should continue to inform, and be informed by, developments that particularly impact people with lived experience and psychosocial disability.
- 242 Privacy International, Mental Health Websites Don’t Have to Sell Your Data. Most Still Do. (n 220).
- 243 Pasquale (n 6).
- 244 J Braithwaite, ‘Relational republican regulation’ (2013) 7(1) Regulation & Governance 124-144.
- 245 Nicole Martinez-Martin, Henry T Greely and Mildred K Cho, ‘Ethical Development of Digital Phenotyping Tools for Mental Health Applications: Delphi Study’ (2021) 9(7) JMIR mHealth and uHealth e27343.
- 246 Human Rights Council, Report of the Special Rapporteur on the Rights of Persons with Disabilities (UN Doc A/HRC/49/52, 28 December 2021) https://undocs.org/pdf?symbol=en/A/HRC/49/52 [para 43].
- 247 Ibid.
- 248 Gooding and Kariotis (n 43).
- 249 Fjeld et al (n 190).
- 250 Access Now, Human Rights in the Age of Artificial Intelligence (2018) https://www.accessnow.org/cms/assets/uploads/2018/11/AI-and-Human-Rights.pdf.
- 251 James (n 157).