2.1.5 Data Protection Law
It is generally agreed that robust data protection laws can provide a more comprehensive framework than privacy law for protecting the many forms of personal data, and can include additional rules for categories such as health and research data. The EU’s GDPR is an influential example of data protection rules designed to remedy gaps caused by fuzzy definitions of what constitutes personal data; it specifies the steps any organisation or agency handling ‘personal data’ must take to uphold the right to privacy. This includes ‘sensitive personal data’, which extends to ‘data concerning health’. In the US, the California Consumer Privacy Act takes a similar direction, though the scope of the GDPR is broader.240
New Approaches to Data Protection Law
Two contrasting examples highlight flaws in legacy regulation of data concerning mental health, and the importance of robust data protection law that covers data in the current communications ecosystem.
LEGACY EXAMPLE: Food and Drug Administration (FDA) (US)
In the US, apps that collect health-related data and pose a high risk to the public fall within the scope of the FDA only if they transform a mobile phone or other electronic device into a medical device. This is often referred to as ‘Software as a Medical Device’. As Schneble, Elger and Shaw point out, the FDA’s scope ‘does not address a substantial number of health data collectors, such as wellbeing apps; websites, especially patient-centered portals… and social networks, and thus, it excludes most indirect, inferred, and invisible health data, which subsequently are subject to the US Federal Trade Commission guidance, resulting in lower safeguards of potentially highly personal data’.241
‘NEW GENERATION’ DATA PROTECTION LAW EXAMPLE: GDPR (EU)
In contrast, the EU’s GDPR covers any kind of personal data regardless of the context in which it is collected, with additional rules then applied to health or research data. Health data, for example, is treated as a special category of data that is sensitive by its nature. Article 9(1) states that:
Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited.
The breadth of this provision matters because data generated on social media or by connected devices could reveal any of these sensitive categories of data, bringing such processing within the scope of the prohibition.
Notably, the GDPR does not use the term ‘health data’, instead adopting the broader phrase ‘data concerning health’. Schneble and colleagues argue that this important distinction ‘opens the door to indirect and inferred health data falling within the scope of the GDPR’, and therefore strengthens its application outside as well as within the formal healthcare system. It is too early to determine the extent to which Schneble and colleagues are correct.
Others remain sceptical that even leading data protection laws like the GDPR can sufficiently protect people in the mental health context against the full range of harms that may arise. Nicole Martinez-Martin and colleagues, for example, refer to the risk of misuse of data used to draw inferences about an individual’s health, and state that:
existing regulations do not address or sufficiently protect individuals from companies and institutions drawing health inferences from that personal data. Furthermore, these data or health inferences may be used in ways that have negative ramifications for people, such as higher insurance rates or employment discrimination. Adding further concern, some consumer digital mental health services also have been found to employ misleading or false claims regarding their collection and use of sensitive personal information. Against this backdrop, even clinical, “regulated” applications … present significant concerns regarding transparency, consent and the distribution of risks and benefits for patients and users regarding how their data may be shared and used.
Even where harmful or potentially harmful practices are identified and found to violate data protection laws, the success of those laws depends on the capacity of authorities to enforce compliance, which remains an issue with the GDPR.242 In any case, the GDPR is complex, and its protections reach only organisations established in the EU or those processing the personal data of people in the EU. More conceptual and regulatory work is required to better define and regulate ‘data concerning health, mental health and disability’ and their use in automated profiling, and to address issues of ‘indirect, inferred, and invisible health data’.
Regarding law more generally, just as there is a risk of idealising technology’s promise, so too is there a risk of idealising law: vigilance is required as to whether law reinforces unjust power relations. For example, if regulatory regimes to protect privacy are characterised by light-touch, pro-industry approaches designed to ease market authorisation of digital mental health services and products, this may promote the spread of cheap (if limited) software to replace more expensive, expert, and empathetic professional support, and disrupt care service provision.243 Regulation should aim to reduce all forms of domination, but there is always a risk that it will fail or even reinforce domination.244 Some legal scholars have argued that laws governing privacy, data protection, and consumer protection have failed to govern the platform dominance of major technology corporations. Further, such laws have contributed to the massive expansion of big technology corporations into market-like structures that distort social relations and convert individuals into ‘users’: a resource to be mined for data and attention.245
More work is required to bring together those working on algorithmic and data-driven technology in response to disability and distress with those pursuing broader alternative arrangements for the governance of our digitally mediated lives and economies. Possible alternatives include collective approaches to governing data and platforms, and community-produced data resources.
- 237 Office of the Privacy Commissioner of Canada, ‘Disclosure of Information about Complainant’s Attempted Suicide to US Customs and Border Protection Not Authorized under the Privacy Act’ (21 September 2017) para 107 https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-federal-institutions/2016-17/pa_20170419_rcmp/.
- 238 Ibid.
- 239 Ibid, para [6].
- 240 Laura Jehl, Alan Friel and BakerHostetler LLP, ‘CCPA and GDPR Comparison Chart’ 9.
- 241 Schneble, Elger and Shaw (n 19).