Ethical Challenges of Human Research with Neural Devices


In a recent paper published in JAMA Neurology, a collaborative team of ethics and neuroscience experts highlights key ethical issues surrounding neural device research and provides points to consider for researchers and others working in this space.

 

Human research is fundamental to the goals of the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative®: understanding the brain and developing treatments for brain disorders. Scientists supported by the BRAIN Initiative are rapidly developing new tools to monitor and regulate brain activity. This research involves the new and expanded use of invasive and noninvasive neural devices in humans, bringing important ethical considerations to light.

To ensure that BRAIN-funded neural device research is done ethically, one goal of the BRAIN Initiative’s Neuroethics Working Group (NEWG) – a group of experts in neuroethics and neuroscience – is to anticipate ethical questions that arise as technology advances. For example, the NEWG recently published eight guiding neuroethical principles for BRAIN Initiative research. Building upon these overarching principles, in October 2017 the NIH brought together neuroethics and bioethics experts, neuroscientists, and clinicians to discuss and offer input on the ethical issues surrounding neural device research in humans. For details on this discussion, please view the videocast of the NEWG workshop on this topic.

In their paper published in JAMA Neurology, the authors emphasize three main areas of ethical challenge in neural device research: (1) analysis of risk, (2) informed consent, and (3) posttrial responsibilities to research participants.

Analysis of risk

Human research with invasive or noninvasive neural devices is rarely free of risk. Before using devices for research purposes, the authors encourage scientists to analyze risk by determining the type and extent of risk for each proposed study (DHEW, 1979). In the paper, the authors recommend that scientists assess risk from six sources: (1) surgery, (2) device hardware, (3) stimulation, (4) the nature of the research, (5) privacy and security, and (6) financial burden. Importantly, although surgical and hardware risks are unique to invasive neural devices, invasiveness alone is not sufficient to determine risk.

The authors highlight that both invasive and noninvasive neural devices carry other, less emphasized risks. Because brain circuit activity forms the basis of everyday human experience (e.g., perception, thought, emotion, action), neural devices may pose unique risks related to mental states and personal identity. For instance, deep brain stimulation (DBS) may negatively affect cognition and involve 'atypical' risks, such as effects on personality, mood, and behavior, and altered perceptions of identity, authenticity, privacy, and agency (Klein et al., 2016; Schupbach et al., 2006). According to the authors, these 'atypical' risks require special attention because they are poorly understood, variable, and unpredictable. Overall, they recommend that the risks of a study be justified by the possible therapeutic benefit to the participant and the importance of the knowledge to be gained.

Informed consent

Informed consent is essential to protecting the rights of human research participants. Because neural devices can affect the brain in both predictable and unpredictable ways, the informed consent challenges typical of clinical research can be exacerbated in neural device research.

Researchers using neural devices must inform participants about "reasonably foreseeable" emerging or atypical risks associated with a neural device. This is particularly difficult because individuals may have diverse preferences and value systems, the authors explain. For example, some participants may perceive neural stimulation as enhancing their sense of empowerment, while others may see stimulation as undermining their level of control (Gilbert, O'Brien, & Cook, 2018; Klein et al., 2016). What can be done to ensure that these risks are appropriately disclosed? The authors encourage researchers to draw upon experience in disclosing adverse effects from neuropharmacological studies and to work with a multidisciplinary team to develop appropriate consent language.

Further, the authors discuss how informed consent can be impeded by the link between brain disorders and difficulties in making or communicating decisions. Complex experimental information and disorders that impair cognition may hinder a participant's capacity to make an informed choice. Participants may also feel pressured to participate in research studies; for example, they may already have a clinical relationship with their neurosurgeon, who is also the study investigator, and find it difficult to decline. The authors propose that researchers and institutional review boards (IRBs) can alleviate these pressures by providing patients with alternative communication tools (e.g., written communication or pictures), ensuring that patients understand that research participation is voluntary and will not jeopardize clinical care, and providing a different investigator (someone other than their surgeon) with whom to discuss the study.

Posttrial responsibilities to research participants

Once a trial ends, participants may leave a study with specific posttrial needs related to trial participation (see Figure below). This ethical issue can be complicated in neural device research. Individuals who participate in a neural device study may experience lasting, sometimes lifelong changes, such as a permanent brain implant, that affect their future. They may need medical care and equipment for device maintenance (e.g., battery replacement) long after study participation. Costs of maintenance, repairs, and device removal are also an issue: the sponsors and funders of clinical trials involving invasive devices do not always cover these costs, and health insurance plans often deny coverage for experimental devices. In fact, the authors note, no definitive ethical or regulatory frameworks, nor standard practices, exist for posttrial responsibilities in neural device research (Lázaro-Muñoz, Yoshor, Beauchamp, Goodman, & McGuire, 2018).

The authors suggest that researchers, funders, and device manufacturers anticipate and plan for participants' posttrial needs. Before a study begins, IRBs and participants should be well informed about, and consent to, the potential needs, risks, complexities, and costs of posttrial care. Complex neural devices, such as DBS systems, may leave participants particularly vulnerable, so researchers should connect participants with experts who can assist with device technicalities. The authors also propose creating a registry to track the long-term outcomes of neural devices, such as adverse effects and costs. Overall, they recommend that researchers, device manufacturers, funders, and health care institutions share responsibility for posttrial care. Neural devices will continue to improve for years to come; ongoing efforts to clarify researcher and funder obligations are therefore essential to ensure that posttrial responsibilities rest with the appropriate entities.

Figure: The light blue timeline shows the development of a neural device, and the brown timeline depicts scenarios in which participants may have posttrial needs. Recommendations for researchers and funders are shown in dark blue boxes.
Advancements in neurotechnology have the potential to outpace existing ethics guidelines. Therefore, it is critical that current ethical frameworks for neural device research evolve with advancements in neurotechnology. With this publication, authors hope to help scientists, clinicians, IRBs, and funders involved in research with neural devices navigate the novel ethical concerns raised by using these emerging neurotechnologies in humans.

References

DHEW. (1979). The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. Retrieved from https://videocast.nih.gov/pdf/ohrp_appendix_belmont_report_vol_2.pdf

Gilbert, F., O'Brien, T., & Cook, M. (2018). The effects of closed-loop brain implants on autonomy and deliberation: What are the risks of being kept in the loop? Cambridge Quarterly of Healthcare Ethics, 27(2), 316-325. doi:10.1017/s0963180117000640

Klein, E., Goering, S., Gagne, J., Shea, C. V., Franklin, R., Zorowitz, S., . . . Widge, A. S. (2016). Brain-computer interface-based control of closed-loop brain stimulation: Attitudes and ethical considerations. Brain-Computer Interfaces, 3(3), 140-148. doi:10.1080/2326263X.2016.1207497

Lázaro-Muñoz, G., Yoshor, D., Beauchamp, M. S., Goodman, W. K., & McGuire, A. L. (2018). Continued access to investigational brain implants. Nature Reviews Neuroscience, 19(6), 317-318. doi:10.1038/s41583-018-0004-5

Schupbach, M., Gargiulo, M., Welter, M. L., Mallet, L., Behar, C., Houeto, J. L., . . . Agid, Y. (2006). Neurosurgery in Parkinson disease: A distressed mind in a repaired body? Neurology, 66(12), 1811-1816. doi:10.1212/01.wnl.0000234880.51322.16

