Machine learning applications in healthcare and the role of informed consent: Ethical and practical considerations
Giorgia Lorenzini, David Martin Shaw, Laura Arbelaez Ossa, Bernice Simone Elger
Clinical Ethics, 24 April 2022
Informed consent is at the core of the clinical relationship. With the introduction of machine learning (ML) in healthcare, the role of informed consent is challenged. This paper addresses the issue of whether patients must be informed about medical ML applications and asked for consent. It aims to expose the discrepancy between ethical and practical considerations, while arguing that this polarization is a false dichotomy: in reality, ethics is applied to specific contexts and situations. Bridging this gap and considering the whole picture is essential for advancing the debate. In light of possible future developments of both the situation and the technologies, as well as the benefits that informed consent for ML can bring to shared decision-making, the present analysis concludes that it is necessary to prepare the ground for a possible future requirement of informed consent for medical ML.
Public opinion on sharing data from health services for clinical and research purposes without explicit consent: an anonymous online survey in the UK
BMJ Open, 7 April 2022
UK National Health Service (NHS/HSC) data is variably shared between healthcare organizations for direct care, and increasingly de-identified for research. Few large-scale studies have examined public opinion on sharing, including of mental health (MH) versus physical health (PH) data. We measured data sharing preferences.
Pre-registered anonymous online survey, measuring expressed preferences, recruiting Feb–Sep 2020. Participants were randomized to one of three framing statements regarding MH versus PH data.
Open to all UK residents. There were 29,275 participants; 40% had experienced a MH condition.
Most (76%) supported identifiable data sharing for direct clinical care without explicit consent, but 20% opposed this. Preference for clinical/identifiable sharing decreased with geographical distance and was slightly less for MH than PH data, with small framing effects. Preference for research/de-identified data sharing without explicit consent showed the same small PH/MH and framing effects, plus greater preference for sharing structured data than de-identified free text. There was net support for research sharing to the NHS, academic institutions, and national research charities, net ambivalence about sharing to profit-making companies researching treatments, and net opposition to sharing to other companies (similar to sharing publicly). De-identified linkage to non-health data was generally supported, except to data held by private companies. We report demographic influences on preference. A majority (89%) supported a single NHS mechanism to choose uses of their data. Support for data sharing increased during COVID-19.
Support for healthcare data sharing for direct care without explicit consent is broad but not universal. There is net support for the sharing of de-identified data for research to the NHS, academia, and the charitable sector, but not the commercial sector. A single national NHS-hosted system for patients to control the use of their NHS data for clinical purposes and for research would have broad support.
Foundations for Meaningful Consent in Canada’s Digital Health Ecosystem: Retrospective Study
Nelson Shen, Iman Kassam, Haoyu Zhao, Sheng Chen, Wei Wang, Sarah Wickham, Gillian Strudwick, Abigail Carter Langford
JMIR Medical Informatics, 31 March 2022; 10(3)
Canadians are increasingly gaining web-based access to digital health services, and they expect to access their data from these services through a central patient access channel. Implementing data sharing between these services will require patient trust that is fostered through meaningful consent and consent management. Understanding user consent requirements and information needs is necessary for developing a trustworthy and transparent consent management system.
The objective of this study is to explore consent management preferences and information needs to support meaningful consent.
A secondary analysis of a national survey was conducted using a retrospective descriptive study design. The 2019 cross-sectional survey used a series of vignettes and consent scenarios to explore Canadians’ privacy perspectives and preferences regarding consent management. Nonparametric tests and logistic regression analyses were conducted to identify the differences and associations between various factors.
Of the 1017 total responses, 716 (70.4%) participants self-identified as potential users. Of the potential users, almost all (672/716, 93.8%) felt that the ability to control their data was important, whereas some (385/716, 53.8%) believed that an all or none control at the data source level was adequate. Most potential users preferred new data sources to be accessible by health care providers (546/716, 76.3%) and delegated parties (389/716, 54.3%) by default. Prior digital health use was associated with greater odds of granting default access when compared with no prior use, with the greatest odds of granting default access to digital health service providers (odds ratio 2.17, 95% CI 1.36-3.46). From a list of 9 information elements found in consent forms, potential users selected an average of 5.64 (SD 2.68) and 5.54 (SD 2.85) items to feel informed in consenting to data access by care partners and commercial digital health service providers, respectively. There was no significant difference in the number of items selected between the 2 scenarios (P>.05); however, there were significant differences (P<.05) in information types that were selected between the scenarios.
A majority of survey participants reported that they would register and use a patient access channel and believed that the ability to control data access was important, especially as it pertains to access by those outside their care. These findings suggest that a broad all or none approach based on data source may be accepted; however, approximately one-fifth of potential users were unable to decide. Although vignettes were used to introduce the questions, this study showed that more context is required for potential users to make informed consent decisions. Understanding their information needs will be critical, as these needs vary with the use case, highlighting the importance of prioritizing and tailoring information to enable meaningful consent.
A Big Data Framework for Consent
Wei Yap, Muhammad Rizwan Asghar
Trust, Security and Privacy for Big Data, 2022 (Taylor & Francis)
Privacy is a vast and vital area of law, with possibly diverse interpretations, legislation, and standards worldwide that aim to protect data. Consent plays a vital role in preserving privacy, as it ensures that all involved parties understand the reason for the collection and use of data. Many organisations still have lengthy guidelines that cause legibility and usability issues. This makes it difficult for a data subject to understand what they are consenting to and creates a restrictive environment for consent. Unfortunately, existing works do not provide any solution for implementing a dynamic privacy consent framework. In this book chapter, we aim to present a dynamic consent framework for big data to ensure that each privacy consent policy is legible, understandable, usable, and customisable. We propose a new method to communicate, analyse, and request consent from a data subject in a way that is simple and understandable. We also aim to ensure that this framework does not increase the burden on data subjects to provide consent while implementing an ability to simplify and audit the consent process.
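The core of any dynamic consent framework of the kind the chapter describes is a per-purpose, revisable, auditable consent record. The sketch below is illustrative only (the class and field names are not from the chapter); it shows how a legible purpose statement, a granular decision, and an audit trail might be combined in one structure:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentPolicy:
    """One legible, per-purpose consent statement shown to a data subject."""
    purpose: str            # why the data is requested, in plain language
    data_categories: list   # which data items the request covers
    granted: bool = False
    history: list = field(default_factory=list)  # audit trail of decisions

    def decide(self, granted: bool) -> None:
        """Record a (possibly revised) decision, keeping an audit trail."""
        self.granted = granted
        self.history.append((datetime.now(timezone.utc).isoformat(), granted))

# A subject can grant, later withdraw, and audit each purpose separately.
policy = ConsentPolicy("Share step counts with my clinic", ["steps"])
policy.decide(True)    # initial grant
policy.decide(False)   # later withdrawal; both events remain in the history
```

Keeping each purpose as its own small, plainly worded record is one way to address the legibility and auditability goals the chapter raises, without forcing the subject through a monolithic consent form.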
Trust and digital privacy in healthcare: a cross-sectional descriptive study of trust and attitudes towards uses of electronic health data among the general public in Sweden
Sara Belfrage, Gert Helgesson, Niels Lynøe
BMC Medical Ethics, 4 March 2022; 23(19)
The ability of healthcare to protect sensitive personal data in medical records and registers might influence public trust, which in turn might influence willingness to allow healthcare to use such data. The aim of this study was to examine how the general public’s trust relates to their attitudes towards uses of health data.
A stratified sample from the general Swedish population received a questionnaire about their willingness to share health data. Respondents were also asked about their trust in the management and protection of electronic health data.
A large majority (81.9%) of respondents revealed high levels of trust in the ability of healthcare to protect electronic patient data. Good health was associated with significantly higher levels of trust compared to bad health. Respondents with low levels of trust were significantly less willing to allow personal data to be used for different purposes and were more inclined to insist on being asked for permission beforehand. Those with low levels of trust also perceived risks of unauthorized access to personal data to be higher and the likely damage of such unauthorized access worse, compared to those with high levels of trust.
Trust in the ability of healthcare to protect electronic health data is generally high in Sweden. Those with higher levels of trust are more willing to let their data be used, including without informed consent. It thus seems crucial to promote trust in order to reap the benefits that digitalization makes possible through increased access to and use of data in healthcare.
Informed consent and algorithmic discrimination – is giving away your data the new vulnerable?
Hauke Behrendt, Wulf Loh
Review of Social Economy, 1 March 2022
This paper discusses various forms and sources of algorithmic discrimination. In particular, we explore the connection between – at first glance – ‘voluntary’ sharing or selling of one’s data on the one hand and potential risks of automated decision-making based on big data and artificial intelligence on the other. We argue that the implementation of algorithm-driven profiling or decision-making mechanisms will, in many cases, disproportionately disadvantage certain vulnerable groups that are already disadvantaged by many existing datafication practices. We call into question the voluntariness of these mechanisms, especially for certain vulnerable groups, and claim that members of such groups are oftentimes more likely to give away their data. If these existing datafication practices exacerbate prior disadvantages, they ‘compound historical injustices’ (Hellman, 2018) and thereby constitute forms of morally wrong discrimination. To make matters worse, they are even more prone to further algorithmic discriminations based on the additional data collected from them.
A Blockchain-Based Consent Mechanism for Access to Fitness Data in the Healthcare Context
May Alhajri, Carsten Rudolph, Ahmad Salehi Shahraki
IEEE Access, 24 February 2022; 10, pp 22960–22979
Wearable fitness devices are widely used to track an individual’s health and physical activities to improve the quality of health services. These devices sense a considerable amount of sensitive data processed by a centralized third party. While many researchers have thoroughly evaluated privacy issues surrounding wearable fitness trackers, no study has addressed privacy issues in trackers by giving control of the data to the user. Blockchain is an emerging technology with outstanding advantages in resolving consent management privacy concerns. As there are no fully transparent, legally compliant solutions for sharing personal fitness data, this study introduces an architecture for a human-centric, legally compliant, decentralized and dynamic consent system based on blockchain and smart contracts. Algorithms and sequence diagrams of the proposed system’s activities show consent-related data flow among various agents, which are used later to prove the system’s trustworthiness by formalizing the security requirements. The security properties of the proposed system were evaluated using the formal security modeling framework SeMF, which demonstrates the feasibility of the solution at an abstract level based on formal language theory. As a result, we have shown that blockchain technology is suitable for mitigating the privacy issues of fitness providers by recording individuals’ consent using blockchain and smart contracts.
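The paper's central claim is that recording consent events on a blockchain makes the consent history tamper-evident and auditable. A minimal sketch of that property, independent of any particular blockchain platform, is a hash-chained log: each consent event embeds the hash of its predecessor, so altering any past entry breaks every later link. All names below are illustrative, not from the paper:

```python
import hashlib
import json

def append_consent_event(chain, event):
    """Append a consent event, linking it to the previous entry by hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
    chain.append({**payload, "hash": digest})
    return chain

def verify(chain):
    """Recompute every link; any tampering breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        payload = {"event": entry["event"], "prev": entry["prev"]}
        expected = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

log = []
append_consent_event(log, {"subject": "user-1", "grant": "heart-rate -> clinic"})
append_consent_event(log, {"subject": "user-1", "revoke": "heart-rate -> clinic"})
assert verify(log)
log[0]["event"]["grant"] = "all data -> insurer"  # tampering with history...
assert not verify(log)                            # ...is detectable
```

A real deployment, as in the paper, would add smart contracts to enforce the recorded decisions and distribute the log across nodes; the hash chain alone only conveys why an append-only ledger helps with consent auditing.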
Value-based Consent Model: A Design Thinking Approach for Enabling Informed Consent in Medical Data Research
Simon Geller, Sebastian Müller, Simon Scheider, Christiane Woopen, Sven Meister
Proceedings of the 15th International Joint Conference on Biomedical Engineering Systems and Technologies – HEALTHINF, 2022; pp 81-92
Due to new technological innovations, the increase in lifestyle products, and the digitalisation of healthcare, the volume of personal health data is constantly growing. However, in order to use, re-use, and link personalised health data and thus unlock its potential benefits in health research, the authors of the data need to voluntarily give their informed consent. This is a major challenge for health data research, because the classic informed consent process imposes an immense administrative burden: consent must be sought every time personal health data is accessed. In this paper we argue that the alternative consent models developed to tackle this problem either do not reduce administrative burdens significantly or do not conform to the informed consent ideal. We therefore used a design thinking approach to develop an alternative consent model that we call the value-based consent model. This model has the potential to reduce administrative burdens while empowering research subjects to autonomously translate their values into consent decisions.
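The idea behind a value-based consent model is that a subject states their values once, and incoming data requests are then decided automatically against those values rather than triggering a fresh consent dialogue each time. A hypothetical sketch (the value tags and request format are invented for illustration, not taken from the paper):

```python
# A subject states their values once; later requests are decided against them.
values = {
    "commercial_use": False,   # never share with for-profit research
    "public_health": True,     # always support public-health studies
}

def decide(request):
    """Map a research request onto the subject's stated values.

    Requests matching no stated value fall back to asking the subject,
    preserving the informed consent ideal for unanticipated uses.
    """
    for tag, allowed in values.items():
        if tag in request["tags"]:
            return "grant" if allowed else "deny"
    return "ask subject"

print(decide({"study": "flu surveillance", "tags": ["public_health"]}))   # grant
print(decide({"study": "ad targeting", "tags": ["commercial_use"]}))      # deny
print(decide({"study": "rare disease registry", "tags": ["registry"]}))   # ask subject
```

The explicit "ask subject" fallback is what would distinguish such a model from blanket broad consent: only requests the subject's stated values already cover are decided automatically.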
Toward an architecture to improve privacy and informational self-determination through informed consent
Information and Computer Security, 23 February 2022
Most developed countries have enacted privacy laws to govern the collection and use of personal information (PI) in response to its increased misuse. Yet these laws rely heavily on the concept of informational self-determination through the “notice” and “consent” models, which is deeply flawed. This study aims to tackle these flaws and achieve the full potential of these privacy laws.
The author critically reviews the concept of informational self-determination through the “notice” and “consent” model identifying its main flaws and how they can be tackled.
Existing approaches present interesting ideas and useful techniques that focus on tackling some specific problems of informational self-determination but fall short in proposing a comprehensive solution that tackles the essence of the overall problem.
This study introduces a model for informed consent, a proposed architecture that aims at empowering individuals (data subjects) to take an active role in the protection of their PI by simplifying the informed consent transaction without reducing its effectiveness, and an ontology that can partially realize the proposed architecture.
Sovereign Digital Consent through Privacy Impact Quantification and Dynamic Consent
Arno Appenzeller, Marina Hornung, Thomas Kadow, Erik Krempel, Jürgen Beyerer
Technologies, 21 February 2022; 10(35)
Digitization is becoming more and more important in the medical sector. With electronic health records and the growing amount of digital patient data available, big data research is finding an increasing number of use cases. The rising amount of data and the attendant privacy risks can be overwhelming for patients, who may feel they have lost control of their data. Several previous studies on digital consent have tried to solve this problem and empower the patient; however, there is as yet no complete solution to these questions. This paper presents the concept of Sovereign Digital Consent, combining a consent privacy impact quantification with a technology for proactive sovereign consent. The privacy impact quantification helps the patient comprehend the potential risk of sharing data and takes into account personal preferences regarding acceptance of a research project. The proactive dynamic consent implementation provides fine-grained digital consent using medical data categorization terminology. This gives patients the ability to control their consent decisions dynamically and is research-friendly through automatic enforcement of the patient’s consent decisions. Both technologies were evaluated and implemented in a prototypical application. Combining these technologies is a promising step towards patient empowerment through Sovereign Digital Consent.
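A privacy impact quantification of the kind the paper describes reduces a data request to a score the patient can compare against their own tolerance. The sketch below is purely illustrative: the category weights, the doubling for identifiable data, and the threshold are invented for the example, not the paper's actual model:

```python
# Hypothetical sensitivity weights per medical data category (illustrative).
SENSITIVITY = {"demographics": 1, "lab_results": 3,
               "genomics": 5, "mental_health": 5}

def privacy_impact(requested_categories, identifiable=False):
    """Score a research request: higher means more privacy risk."""
    score = sum(SENSITIVITY.get(c, 2) for c in requested_categories)
    return score * (2 if identifiable else 1)   # identifiable data weighs double

def advise(requested_categories, patient_threshold, identifiable=False):
    """Compare the impact score against the patient's own risk tolerance."""
    impact = privacy_impact(requested_categories, identifiable)
    return "review carefully" if impact > patient_threshold else "low concern"

print(advise(["demographics", "lab_results"], patient_threshold=6))  # low concern
print(advise(["genomics", "mental_health"], patient_threshold=6))    # review carefully
```

Even this crude scoring shows the intended division of labour: the quantification summarizes the request's risk, while the patient's threshold encodes their personal preferences, so the final decision remains the patient's.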