Author / Contributor
Brian Schrag
But For the Fear of What You Might Find Out

Added 04/14/2006

Updated 10/20/2015

Authoring Institution Association for Practical and Professional Ethics (APPE)
Contributor(s) Brian Schrag
Notes Brian Schrag, ed., Research Ethics: Cases and Commentaries, Volume 5, Bloomington, Indiana: Association for Practical and Professional Ethics, 2001
Format Text
Share with EEL Yes
Rights The Association for Practical and Professional Ethics (APPE) grants permission to use this case and commentary material with the citation indicated above.
Publisher Association for Practical and Professional Ethics
Language English
  • Deborah G. Johnson


    Deborah G. Johnson 

    Georgia Institute of Technology


    This case illustrates an extremely complex and difficult issue for researchers involved with the development of new technologies. At the heart of the case is uncertainty and the role uncertainty plays both in technological development and in ethics. Uncertainty makes for difficult decision making.


    In one of the first textbooks on engineering ethics, Martin and Schinzinger [1] suggested that engineering should be understood as social experimentation. They argued that engineering should be seen on the model of medical experimentation, since engineering always involves some degree of risk and uncertainty. Even if engineers are building something that has been built before, the new undertaking will involve differences that may affect the outcome: a different environment, different materials, a different scale, and so on. Martin and Schinzinger seemed to believe that the risk and uncertainty of engineering undertakings had not been sufficiently recognized. Consequently, those who are put at risk by an engineering endeavor are rarely involved in the decision making or given an opportunity to consent or withhold consent. In this case, engineering and medical experimentation are fused; there is no distinction. Nevertheless, the fact that the engineering endeavor is framed as medical experimentation does not seem to make the ethical issue any clearer or easier.

    The powerful role played by uncertainty is quickly brought into focus when we compare this case to a hypothetical situation in which researchers use standard imaging modalities to test some other aspect of the machinery. Suppose, for example, that researchers are testing a new, ergonomic design for a machine that deploys standard imaging modalities, and they discover an anomaly in the breast of a research participant. I believe the researchers would not hesitate to inform the patient and her doctor; they would be confident about the significance of the finding.


    The researchers hesitate in this case because they are uncertain of the meaning of their finding and they do not want to cause unnecessary stress to the participant. This response is understandable given that the engineers are so unsure about the validity of the imaging modalities.


    The situation is actually not uncommon in engineering. Engineers and scientists often have evidence, but the evidence is limited and doesn't give them the certainty they need to make a decision. This parallels the situation in which Roger Boisjoly found himself with regard to the launch of the Challenger [2]. Boisjoly had some evidence that the O-rings behaved differently in extremely cold temperatures, but he had not had time to do further testing to establish how the O-rings would function. He had evidence, but he was unsure of its meaning and strength. Was it strong enough to justify stopping the launch of the Challenger? Was it weak enough to be ignored? It just wasn't clear.


    The parallel with this case should be obvious. Is the evidence strong enough to contact the participant or her physician? Weak enough to be ignored? It just isn't clear.


    In situations of this kind, many factors come into play: the severity of the risk involved, the time before outcomes become apparent, the details of the domain (spacecraft, breast cancer, etc.), the possibility of gathering further evidence, and so on. In the case at hand, the severity of the risk of saying or doing nothing is high in the sense that a woman's life is at stake.


    The engineers are reluctant to inform the woman for fear of causing her unnecessary stress. While this attitude is understandable, it also hints at paternalism. Their hesitation presumes that the woman is not capable of understanding the uncertainty of the data and the risks at stake. Thus, I believe the researchers did the right thing by telling the woman and her physician about their discovery, and I am inclined to think they should have done so earlier. Nevertheless, I admit this case is difficult because of the uncertainty of the data.


    Footnotes



    • [1] Martin, Mike W., and Roland Schinzinger. Ethics in Engineering. New York: McGraw-Hill, 1983; 2nd ed., 1989.

    • [2] Boisjoly, Roger. "The Challenger Disaster: Moral Responsibility and the Working Engineer." In D. G. Johnson, ed., Ethical Issues in Engineering. Englewood Cliffs, N.J.: Prentice Hall, 1991.


    From: Graduate Research Ethics: Cases and Commentaries - Volume 5, 2001 

    edited by Brian Schrag

  • Anonymous Participant


    From: Graduate Research Ethics: Cases and Commentaries - Volume 5, 2001 

    edited by Brian Schrag


    This case examines the negative outcomes that can occur when the full consequences of one's actions are not taken into account. Here, perhaps out of ignorance, the engineers who were developing early prototypes of various medical imaging methods failed to appreciate the potential impact of these untested images on their volunteers. The heart of the problem lies in the cloudy statement of the experiment's intentions. Because this was not a traditional scientific experiment in the sense of conducting a large number of trials with careful prescreening and a detailed statistical analysis of the results, the "human subjects research" aspects of the examinations are not necessarily clear.


    The NIH's "Guidelines for the Conduct of Research Involving Human Subjects" defines "research" as "any systematic investigation designed to develop or contribute to generalizable knowledge." The "try it out and see if it works" type of protocol in place during the tests described in this case study may defy the designation "systematic investigation." In any case, even if it had been clear to the investigators from the start that they should take precautions because human subjects were involved, it is not clear that those precautions would have prevented this situation. Really, the only type of reasoning that could have prevented, or at least predicted, the situation described in this case is careful forethought about the full impact of the imaging tests.


    This discussion raises two interesting points. First, is it enough merely to "predict" a situation such as the one described in this case? If so, at what point does it become necessary to "prevent" a situation rather than just "predict" it? Is it too much to ask that a woman face the idea that someone may have detected cancer in her breast but can't be sure? These issues must be weighed against the fact that at some point a new medical device will have to be tested if it is ever going to come into regular clinical use.

    Second, how can we ensure that adequate forethought precedes every experiment without slowing the research process to a halt? Any given action has an uncountable number of potential effects; admittedly, most have very low chances of actually occurring. At what probability of occurrence can one stop worrying about potential experimental side effects? How does that calculus change with the severity of the side effect? A related issue concerns the danger that guidelines governing research will become too detailed to be of practical value. While it may not be common practice among the designers of noninvasive medical instruments, the type of forethought this case calls for is certainly not crippling. In fact, it bears a close resemblance to the scientific design process used in developing such devices in the first place. The fundamental question one is really asking is, "What would happen if . . . ?", the same type of thought experiment that appears throughout the engineering design process. The only difference is that the "what" is an ethical concept rather than a scientific one.


    That means the person asking the "what if" questions must be versed in issues of ethical importance. While that may require additional training for members of scientific or engineering design teams, or even the addition of special ethics consultants or overseers on certain projects, there is a significant benefit to this type of ethical thought experiment, just as there is to those of a scientific nature. When asked by knowledgeable individuals, such questions can provide a great deal of insight into how to steer a project's development to avoid serious ethical problems. In these days of detailed lines of accountability and the threat of serious financial repercussions for poor ethical decisions, the extra cost of such ethical training or expertise is easily repaid by the avoidance of even a single potential crisis. This case could be seen as an argument for applying the scientific model to the practice of research ethics.


Cite this page: "But For the Fear of What You Might Find Out," Online Ethics Center for Engineering, 4/14/2006. <www.onlineethics.org/Resources/gradres/gradresv5/fear.aspx>