This case discusses issues of informed consent and conflict of interest as they relate to database development in a research environment.
From: Graduate Research Ethics: Cases and Commentaries - Volume 6, 2002
edited by Brian Schrag
Dr. Edwards is very excited about the database he has developed. He has been a clinical psychologist and researcher for fifteen years and has strongly advocated for the use of computers in his field since his med school days. Several years ago, looking at the shelves and drawers filled with patient files, he began to think about data management. He came up with the idea of developing a multimedia, fully integrated database that would allow physicians and researchers to store, analyze and query patient/subject information quickly and easily. Through such a resource, he thought, he could maximize the utility of large collections of data, allowing researchers to ask questions currently impossible to address, due to the logistics of handling such large, heterogeneous data sets. Edwards envisioned a database that would allow researchers to record and track all aspects of an experiment including patient samples and records (everything from name and address to CAT scans), experimental reagents, protocols, raw data and primary analysis. It is a fantastically powerful tool with tremendous potential.
The Medusa database (Multimedia Data Storage and Analysis) was developed in Edwards's lab, at a prominent teaching hospital, using data from his own studies of ADHD and bipolar disorder. Medusa is fairly robust for his data set, but needs to be beta-tested with a range of data types and formats that are not commonly encountered in clinical psychology (e.g., DNA sequence data or results from animal breeding experiments). He meets with several labs on the medical campus and gives presentations on Medusa in an effort not only to advertise, but also to recruit high-profile labs in which to beta-test his product. During demonstrations, all patient names are encrypted and family relationships obscured. Edwards navigates through Medusa, showing off the ease with which one can toggle between a patient's blood chemistries and the results of behavioral tests. He convinces three labs with large ongoing projects to import their data into Medusa, helping him work out bugs and continue to develop the design and utility of his database. He knows that if all goes well, his name and that of Medusa will be mentioned in future publications out of the beta-labs, which will be important when he takes his database to market.
When Edwards began developing Medusa, he did not inform his patients or ask their permission to be included. He believes that storing the data in Medusa is equivalent (if not superior) to storing it in folders in file cabinets and is simply the best way for him to provide care to his patients.
Currently, the database is located in Edwards's lab on only one computer, which is accessible over the web. The bioinformatics staff of the respective beta-labs must learn the data structures and file formats used in Medusa, as well as how to operate the encryption utility. That will enable them to devise ways to import their own labs' data, which may be markedly different from the data in Edwards's lab. In order to import the data, a few individuals must have full access to the database, which means that they also have full access to Edwards's data set. To avoid this exposure, Edwards would have had to set up a complete duplicate database, which would be onerous and time-consuming. Edwards provides each beta-lab programmer with the encryption key; Amy is one such database programmer. While learning Medusa, Amy has access to complete patient files and experimental data stored in the database, although she has no need to look through these files. Periodically, her supervisor asks her to update the rest of the lab on her progress. During her lab presentations, it is easier to demonstrate much of Medusa's functionality without the encryption in place; although Edwards is working on it, the key currently must be entered each time a query is submitted, which is cumbersome and slow for demonstration purposes.
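The workflow described above, in which identifying fields are stored encrypted and the key must be presented with every query, can be sketched as follows. This is a toy illustration only: the class and field names (`PatientStore`, `query_names`, etc.) are hypothetical, and the XOR keystream stands in for a real cipher such as AES; it must not be used for actual patient data.

```python
# Toy sketch of per-query field encryption. Identifying fields are kept
# encrypted at rest; every query that needs them must present the key,
# which illustrates why unencrypted demonstrations are tempting.
import hashlib


def _keystream(key: bytes, length: int) -> bytes:
    # Derive a deterministic byte stream from the key (illustrative only).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: str) -> bytes:
    data = plaintext.encode()
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))


def decrypt(key: bytes, ciphertext: bytes) -> str:
    # XOR is its own inverse, so decryption mirrors encryption.
    stream = _keystream(key, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream)).decode()


class PatientStore:
    """Stores patient names encrypted; clinical values in the clear."""

    def __init__(self):
        self._rows = []

    def add(self, key: bytes, name: str, blood_chemistry: dict):
        self._rows.append({"name": encrypt(key, name),
                           "blood_chemistry": blood_chemistry})

    def query_names(self, key: bytes):
        # The key is required on each call -- the "cumbersome and slow"
        # step the case says Edwards has not yet engineered away.
        return [decrypt(key, row["name"]) for row in self._rows]


key = b"beta-lab-shared-secret"
store = PatientStore()
store.add(key, "Jane Doe", {"glucose": 5.4})
assert store.query_names(key) == ["Jane Doe"]   # readable only with the key
assert store._rows[0]["name"] != b"Jane Doe"    # not stored in the clear
```

The design choice the case turns on is visible here: because the key travels with every query, anyone given the key (like Amy) can read every identifying field, and skipping the key step entirely is the path of least resistance during a demo.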
Deborah G. Johnson, University of Virginia
In the initial description of the case, none of the behavior of the people involved seems grossly ethically problematic. When patient records move from paper to electronic form, important changes occur in the accessibility of the data. That alone is not problematic, however, as long as steps are taken to restrict access to the data and protect the identity of the individuals.
As the case proceeds, protecting the identity of the patient/subject records comes into focus as an important issue. Before addressing this issue, however, there is another very subtle issue here. Dr. Edwards is soliciting data to help beta test Medusa. Presumably the data he is soliciting have been collected for research that meets the consent requirement for research with human subjects. However, the subjects agreed to participate in research and did not agree to have their data used to beta test a database management system. The lack of consent here becomes even more important as the case unfolds and we learn that the privacy of individual participants will be exposed to increased risk because the data are being used to test Medusa.
As the case continues, we have a straightforward case of wrongly privileging convenience over risk to subjects. Amy has access to the complete patient files, and she should know (and should have been instructed about) the importance of confidentiality of medical/research records. Indeed, the system has an encryption function that conceals the identity of patients. However, the encryption function is cumbersome and slow, and Amy doesn't use it when she demonstrates her progress on the database to members of the lab.
Amy's behavior violates the trust patients/subjects place in researchers when they agree to participate in research. The central importance of trust to the research endeavor should be clear. If trust in researchers is violated, over time individuals will begin to refuse to participate in research. In fact, this case illustrates a number of factors in achieving trust. For one thing, it illustrates that trust is a function of multiple actors. Achieving trust involves more than a single researcher or team of researchers. All who handle research data must behave properly. The researchers who collected the data allowed the data to move out of their hands and into a database that they no longer controlled. To ensure the privacy of their subjects, they should have asked for assurance that the data would be treated confidentially and without revealing identity.
Technology also plays a role, which takes us back to the difference between paper and electronic records. The change from paper to electronic storage of records is what allows and facilitates movement of data from one place to the next. Thus, the technology calls upon us to create appropriate norms of behavior around it - norms that protect the trust of research participants.
Scientists are always searching for better ways to test their hypotheses, to ask smarter questions and to get the most out of their data. Today, not only does the science drive the technology, but the technology drives the science, as well. That is especially true in the case of computers, with their increasing speed and sophistication, which allow us to ask bigger, more complex questions about the data we generate and collect. In this case, Edwards tries to take advantage of the technology for the benefit of his patients and colleagues.
Since the advent of computers, questions and fears have been raised about their role in our lives and their potential for harm. The line between benefit and harm can be blurry, and a balance must be struck. Edwards, with his development of Medusa, must make a host of decisions in which the "right" answer is not necessarily clear.
Is there a substantive difference between paper records and Medusa?
Concerns have long been voiced about the ill uses to which computers might be adapted. Concerns about privacy and security continue. The main issue in this case, I think, is not so much the security of medical information in isolation - in my experience, paper records are often no more secure than electronic ones - but rather the power of medical information when used in the context of a broader scheme of information collection or database mining (DG Johnson 2001). Presumably, Edwards will ensure that his patients' data are adequately protected from unauthorized access over the network, as well as unauthorized direct terminal access.
Is Edwards justified in his use of patient information for database development and promotion? Why or why not?
Two major issues must be addressed here: informed consent and conflict of interest. The issue of informed consent is a tricky one because it is understood that one's physician will keep a thorough and accurate medical history for each patient. While historically such records have been kept in manila folders in file cabinets, it does not follow that technological advances ought to be ignored. The question, then, is whether database development constitutes "research." It may seem clear that if Edwards were to begin to mine the database in an attempt to ask questions about his patient population as a whole, it would be appropriate and required that he formally enroll his patients as participants in a study, with appropriate consent. However, what if he wants to ask whether patients with different income levels have significantly different health outcomes? Is it clear that this question goes outside the bounds of his duty as a physician to provide the best possible care to his patients? Where does one draw the line? Does the mere development of the capacity for such queries create a need for consent?
In regard to the potential conflict of interest, it cannot be assumed that Edwards's apparent conflict of interest is, in fact, a real conflict of interest about which his patients ought to be concerned. It may be reasonable to give him the benefit of the doubt, to assume that he is driven by a genuine desire to help present and future patients through the development of his database. Medusa certainly has the potential to enable elaborate analyses that would otherwise be unwieldy and to lead to valuable findings that significantly impact science and medicine. That is not to say that Edwards ought to be absolved of responsibility, only that we cannot assume that he is acting out of pure self-interest; the situation is likely far more complex than that.
Given this additional information, do you feel differently about Edwards's use of patient information in the development and promotion of Medusa? Why or why not?
Edwards has clearly failed to think through the implications of his actions in regard to the distribution of the encryption key. As their physician, it is his responsibility to protect the confidentiality of his patients' information. He might have preempted this sort of behavior by having all persons receiving the encryption key read and sign a statement of confidentiality and/or participate in some sort of training related to working with sensitive information.
Some people may suggest that despite the difficulties, Edwards ought to have created a mirror database, with fake names and information; however, it is important to keep in mind that Edwards's primary interest is in getting Medusa out there, in use and serving patients, physicians and scientists. Again, that is not to absolve Edwards of responsibility, but only to show that this case is much messier than one might think at first blush.