This material is designed to provide assistance to those involved in ethics education in physics. It is not intended to be a complete discussion of all topics in ethics relevant to the physics community. Rather, it is designed to give the reader some feel for the breadth of relevant topics, to point the reader towards useful resources, and to suggest ways in which this material could be addressed in a classroom setting.
The underlying premise of this work is that much has already been written about ethics in physics, but most of this existing material is not readily located by searching on the terms “ethics” and “physics”. These chapters will not describe ethical issues and case studies in detail but instead will point the reader to sources that supply the more detailed perspective. The intent is to identify resources that can conveniently be used as reading assignments in undergraduate or graduate level physics classes. Part of the challenge in making ethical decisions is dealing with the complexity that real-world situations introduce. For that reason, where possible, this guide draws on sources in which physicists describe cases from their own personal experience.
Incorporated into the description of each resource will be suggestions on how to run a class discussion based on the material. It is hard to over-emphasize the usefulness of guided classroom discussion as a means for providing multiple perspectives and further insight into ethical issues. It is helpful to ground these discussions in the professional codes discussed in Chapter 1.
Chapter 0: Introduction: Pedagogy and Assessment
Using case studies
Managing class discussions
Other activities to engage the mind
About this guide
Chapter 1: Ethical Codes
Section 1.1: Introduction
Section 1.2: The American Physical Society Guidelines on Ethics
Section 1.3: Other American Institute of Physics codes
Section 1.4: Physics codes outside of the United States
Section 1.5: Codes from other fields
Section 1.6: Ethical standards implied by institutional policies
Section 1.7: Human subjects research issues: sometimes overlooked in physics
Chapter 2: Laboratory Practices
Section 2.1: Introduction
Section 2.2: Research misconduct and how it harms the scientific community
Section 2.3: Carelessness and how it harms the scientific community
Section 2.4: Computational physics
Section 2.5: Laboratory safety
Section 2.6: How common is research misconduct in physics?
Chapter 3: Data: Recording, Managing, and Reporting
Section 3.1: Introduction
Section 3.2: The lab notebook
Section 3.3: Data management and archiving
Section 3.4: Digital images
Section 3.5: Reporting results
Section 3.6: Case studies
Chapter 4: Publication Practices
Section 4.1: Introduction
Section 4.2: Authorship
Section 4.3: Citations
Section 4.4: Plagiarism
Section 4.5: Self-plagiarism, dual submission, and fragmented publication
Section 4.6: Errata and retractions
Section 4.7: Conflicts of interest
Section 4.8: Publication metrics
Section 4.9: Journal quality
Section 4.10: Publication in the electronic age
Chapter 5: Peer Review
Section 5.1: Introduction
Section 5.2: Fairness
Section 5.3: Participation
Section 5.4: Timeliness
Section 5.5: Confidentiality
Section 5.6: Conflicts of interest
Section 5.7: Career advancement
Section 5.8: Textbooks
Chapter 6: Underrepresented Groups in Physics
Section 6.1: Introduction—The need for diversity
Section 6.2: Statistics
Section 6.3: APS policy statements
Section 6.4: Explicit bias
Section 6.5: Systemic bias
Section 6.6: Implicit bias
Section 6.7: Programs of the American Physical Society and other organizations
Section 6.8: Role models
Chapter 7: Physics and Military Research
Section 7.1: Introduction
Section 7.2: The Manhattan Project
Section 7.3: The Strategic Defense Initiative
Section 7.4: Arms control in the age of nuclear weapons
Section 7.5: Dual-use technology
Section 7.6: General discussion prompts for the entire chapter
Chapter 8: Climate Change
Section 8.1: Introduction
Section 8.2: Observational data
Section 8.3: Some elements in a climate model
Section 8.4: Global Climate Models
Section 8.5: Focused action
Section 8.6: Broader action on climate change
Chapter 9: Communicating Science to the General Public
Section 9.1: Introduction
Section 9.2: Communicating about climate change
Section 9.3: Communicating with the media
Section 9.4: Communicating with political leaders
This chapter addresses ethical issues related to how experiments are performed. A number of well-documented cases can be used to provide students with insight into why carefully designed procedures are important in the responsible conduct of research. Such procedures can both reduce the opportunity for research misconduct and increase the level of confidence others have in the research. For several decades, the computer has been an essential part of physics research, so issues in computational physics will be addressed as well. While the dividing line between laboratory procedures and data management is fuzzy, the latter will be addressed in the next chapter.
It is noteworthy that a clear connection exists between the elements of many scientific codes of ethics and a variety of descriptions of the nature of science as a discipline. In Elements of Ethics for Physical Scientists, Sandra Greer lays the foundation for subsequent discussions by asking the question, “What is Science?”. Instructors may want to consider assigning that chapter early in a course on ethics. Alternatively, the American Association for the Advancement of Science has an online book called Science for All Americans. Its first chapter, The Nature of Science, concisely describes what defines science as a discipline. While this information will seem straightforward to the seasoned scientist, it can be helpful to use a reading like this to present all of these ideas to students in an organized way. Both of these sources can help students address questions like, “What are some of the considerations in designing a good experiment?”
Most definitions of research misconduct in science refer explicitly to fabrication and falsification. The Office of Research Integrity, whose charge is to direct U.S. Public Health Service research integrity activities, defines fabrication as “making up data or results and recording or reporting them” and falsification as “manipulating research materials, equipment, or processes, or changing or omitting data or results such that the research is not accurately represented in the research record.” Fabrication and falsification are considered research misconduct for several reasons, among them the corruption of the research record and the erosion of trust within the scientific community.
In the early 2000s, there were two highly publicized cases in the physics community resulting in findings of research misconduct, the cases of Victor Ninov and Hendrik Schön.
Very concisely, Victor Ninov was part of a collaboration at Lawrence Berkeley National Laboratory that in 1999 reported having discovered two new elements, 116 and 118. When other labs were unable to reproduce these results, LBNL re-examined their original data and tried rerunning their experiment. These efforts triggered an internal investigation that found that Ninov had created data for events that did not in fact exist.
Many accounts of the Ninov case are available. David Goodstein has a brief discussion in chapter 6 of his book On Fact and Fraud: Cautionary tales from the front lines of science. While Goodstein’s discussion of the incident is short, it is worth pointing out that he did have access to the investigation report, most of which has remained confidential. Physics Today has a concise but fairly comprehensive treatment. Students reading the New York Times article on this incident will be able to see how technical issues are covered in the popular press and gain insight into how research misconduct may impact public trust in science.
The ethical breach in this case is obvious: falsification of the research record. One aspect of the case worth exploring in a classroom discussion is what it was about the research environment that might have led Ninov to act as he did. It is likely that Ninov fully expected elements 116 and 118 would eventually be discovered. He presumably also recognized that in the first experiment, he was the only one who had direct contact with the data and with the data analysis program. In the follow-up analysis and experiments, the fact that others had direct contact with the data was integral to their arriving at the conclusion that the research record had been falsified. Thus the Ninov case can be used as a springboard to discuss group dynamics in research collaborations. Collaborations are usually put together because no one person has both the time and expertise to complete all aspects of a project on their own. When trying to maintain the integrity of a project, what are the tradeoffs between relying primarily on trust that each collaborator will act both responsibly and competently, as opposed to building a robust system of independent checks within the group? Some students may be tempted to propose extensive independent checking; they could be reminded that science operates with limited resources, so efficiency is nearly always a concern. Students might also be encouraged to reflect on their personal experiences either in research or working with a partner in an advanced lab course.
A second aspect of this case that can be probed is the response of the research group to concerns that there might be a problem with the experiment. Often, ethics case studies focus on situations involving ethical ambiguity and/or unethical behavior. However, we should not overlook discussions of situations that demonstrate proper actions in challenging settings. For instance, asking the students to discuss how the evidence of falsification was obtained will give them insight into how scientists can police themselves. In addition, the prompt action of Ninov’s collaborators in retracting the original publication once they realized the data were problematic shows respect for the publication process.
Hendrik Schön was a researcher at Bell Labs, focusing primarily on the electrical properties of materials. He appeared to have an extraordinarily productive period there, publishing papers at a very high rate, including several in Nature and Science. There were, however, reproducibility problems with his work—not only were others outside of Bell Labs unable to reproduce much of what he reported, but even colleagues inside Bell Labs were having difficulty getting sufficient cooperation from him to attempt to reproduce his results or to perform independent tests to further study his samples. Concerns were also raised about his results being improbable and about the fact that it appeared he represented a single data set as arising from two different experiments. In 2002, Bell Labs commissioned an external investigation of his work, leading to findings of research misconduct and to his termination.
Three different approaches can be taken to studying this case. The quickest way to get a handle on it is to read two news articles from Physics Today. Both include links to earlier articles in Physics Today that reported on Schön’s research in a favorable light. A second approach is to read the report issued by the investigation committee. While the entire report is 129 pages long, the main body is only 19 pages, with the rest being appendices, most of which would probably be optional reading. The report describes not only findings but also methods, so it would give students some insight into how research misconduct investigations are conducted.
An in-depth discussion of the Schön case is found in Eugenie Samuel Reich’s book Plastic Fantastic: How the biggest fraud in physics shook the scientific world. Several chapters in this book can be understood independently and hence can be used as moderate-length reading assignments; a number of them are particularly relevant to the topic of research misconduct.
The Schön case is particularly well documented. It is common for students to ask how someone could expect to get away with the type of fabrication and falsification evident in Schön’s work. In addressing this, students can be directed to Reich’s book, which describes how Schön carefully assessed expectations in the field before he produced his results. In this regard, he was not that different from Ninov (who fully expected elements 116 and 118 to be discovered eventually).
Another issue that commonly comes up in discussing this case is the length of time Schön worked at Bell Labs before the problems with his data were brought to light. When we rely on reproducibility checks to weed out bad science (whether based on research misconduct or not), there is a limit on how fast the community can respond. It is rare that a lab can check the work of someone else with just a few days of effort. More commonly, equipment must be reconfigured (or even purchased), samples acquired, and time in a busy lab schedule blocked out in order to make the experiment happen. Moreover, there is little incentive merely to reproduce someone else’s work because such reproduction is not, in most cases, considered publishable. For an instructor planning on covering the Ninov, Schön, and cold fusion cases, it could be interesting to compare the response times of the community in all three cases.
While the responsibilities of coauthors are a publication issue, they are relevant to discuss at this point as well. It is noteworthy that Schön had a large number of coauthors, all of whom were apparently unaware of his fabrication and falsification of data. There are some challenging issues here of how to balance working with colleagues in an atmosphere of trust with the need to make sure that trust is not misplaced. As students explore this challenge, you can ask them to imagine themselves working in a collaboration. One way of helping to strike the balance is to be very open with your collaborators about your methods, your data, your analysis, and your level of confidence in your results. This can help set the standards for sharing in the collaboration.
Carefulness is not usually identified as a principle in ethics codes associated with the physical sciences. However, in professions such as medicine and engineering, most would agree that being careful is an ethical requirement. A doctor or an engineer may have lives in their hands, so the requirement for carefulness is obvious. Put another way, the harm carelessness can do to others is readily apparent in those professions.
How about in physics? Is there harm done to others by a careless physicist? As the cases discussed in this section illustrate, the answer can easily be yes. A physicist who performs research carelessly and then disseminates the results wastes not only the resources that were provided to perform the experiments but also the time and resources of other physicists who try to reproduce or extend flawed lines of research.
The scientific community has several mechanisms designed to detect careless research. One line of defense is peer review, both the formal peer review associated with the publication process as well as the informal peer review that takes place in seminars, conference talks, and casual conversations. While peer review may catch some carelessness, it will not catch all of it. For instance, a reviewer of a manuscript might easily detect a careless experimental design but is less likely to detect a carelessly implemented experimental procedure. A second line of defense is the principle of reproducibility: if no one else can reproduce the results of an experiment, then the original results are called into question. While this can be very effective at identifying careless research, it can also require a lot of resources. Inefficient use of resources has an impact both on the physics community and on society at large.
To be clear, everyone makes mistakes, sometimes due to lack of expertise and sometimes just because humans are fallible. However, when the mistakes could have been easily prevented by more careful attention to experimental design and execution, that is an indication of carelessness. Bezuidenhout wrote an article on the importance of routine procedures in developing useful experimental data. The context of the article is a discussion of the Open Data movement—the effort to get scientists to freely share their raw data. The point she makes is that sharing raw data is of little value unless we fully understand the procedures used to generate that data. The article is written from a life sciences perspective and it could have used one more round of editing for style, but it can still make a valuable addition to an ethics discussion, especially for students just beginning to do research.
One consequence of carelessness can be self-deception. Irving Langmuir gave a talk on what he termed “pathological science” in 1953. A transcription of his talk appeared in Physics Today. Langmuir discussed the Davis and Barnes experiment, N rays, mitogenetic rays, the Allison Effect, Joseph Rhine’s ESP experiments, and UFO sightings. These examples illustrate the importance of designing an experimental procedure that minimizes the chance of self-deception. While the focus is more on experimental procedures, to some extent the same principles apply to the procedures one might choose to employ in data analysis. The history of N rays can be found in several sources, but perhaps the most relevant one is the paper R. W. Wood submitted to Nature documenting the way in which poorly designed experiments left the researchers open to self-deception.
To illustrate debate in the physics community, one could ask students to read a set of letters to the editor in response to the “Pathological Science” article. Clusters of letters in response to articles often appear in Physics Today. Usually they are followed by a response from the article’s author, but that was not possible in this case since Langmuir was no longer alive when “Pathological Science” was published.
Since the experiments described by Langmuir took place well before his talk in 1953, the equipment and related procedures are dated. Some of the terminology may therefore be a bit challenging for students. For the most part, this issue will not prevent students from realizing that most of these experiments are flawed due to their reliance on humans to detect weak signals directly. When students make this connection, they may tend to dismiss these cases as completely irrelevant since in the modern lab such data would be acquired by instrumentation, not by eye. There are two ways in which these cases have continuing relevance, however. First, they serve as readily understood reminders of the need for vigilance in avoiding self-deception. Second, they remind us that in less formal settings (i.e., outside of the laboratory) it is common for scientists and non-scientists alike to fall victim to self-deception.
On March 23, 1989, University of Utah chemist Stanley Pons announced that he and Martin Fleischmann had observed room temperature fusion involving deuterium nuclei in a tabletop experiment. They further claimed to have seen significant excess heat generated by their experiment. Soon after that, Steven Jones of Brigham Young University announced that he, too, had observed evidence for fusion, but in a somewhat different tabletop experiment. These were the first of what came to be known as the cold fusion experiments. A concise overview of some essentials of the cold fusion saga can be found in two Physics Today news articles. The article by Levi covers the initial announcement, the inconsistency between the reported excess heat flows and the reported rate of neutrons observed, attempts by groups at other universities to replicate the experiment, and theoretical considerations for the experiment. The subsequent article by Goodwin focuses on the difficulties that arose due to most information coming through press conferences and newspaper reports rather than through refereed journals. This article then moves on to recount requests by Pons, Fleischmann, and the University of Utah for federal funding to pursue this research. Glenn Seaborg suggested to Energy Secretary James Watkins that a committee be formed to study the validity of the cold fusion claims. The resulting DOE report noted that tens of millions of dollars had been spent on cold fusion research in the short time since Pons and Fleischmann’s announcement. The report concluded that there was no convincing evidence that excess heat generation would be sufficient for commercial applications or that excess heat generation was linked to a nuclear process. Additional discussion can be found in letters to the editor.
Chapter 5 of Goodstein’s book On Fact and Fraud provides a different perspective on the story. In addition to covering much of what is covered in the Physics Today articles, Goodstein goes into some detail describing the activities of a Caltech research group, members of which he knew personally, as they prepared for and presented their conclusions at a May 1, 1989 session of an American Physical Society meeting. They reported that the original experiments could not be replicated and that the claims of Pons and Fleischmann were inconsistent with well-established nuclear theory. Interestingly, Goodstein then discusses at length the work of an Italian group led by Franco Scaramuzzi that seemed to find evidence for some form of cold fusion, although not necessarily by the mechanism that Pons and Fleischmann originally reported. In a classroom discussion of this chapter, it can be useful to compare the way in which Pons and Fleischmann approached cold fusion research with the way Scaramuzzi’s group approached it.
A review of Goodstein’s book appeared in Physics Today. This review was followed a few issues later by two letters to the editor and a response by the author of the review, with the focus being on reproducibility.
John Huizenga, co-chair of a Department of Energy committee tasked with investigating reports of cold fusion, wrote a book on the cold fusion saga entitled, Cold Fusion: The Scientific Fiasco of the Century. As a physicist, he is able to provide relevant technical background in a way that an undergraduate physics student can follow. While the entire book is worth reading, it is possible to lead discussions based on selected chapters. Most of the chapters can be understood and put into context based on Chapter I.
There are two other noteworthy books on cold fusion that might be relevant to a discussion of ethics, Frank Close’s Too Hot to Handle: The Race for Cold Fusion, and Gary Taubes’s Bad Science: The Short Life and Weird Times of Cold Fusion. The book by Close is a good choice for an instructional setting where you can assign the entire book. Its technical detail is at a level similar to Huizenga’s book. The book by Taubes is written from a journalist’s perspective, so students who read it may not connect as deeply with the science behind the experiments; on the other hand, it does provide a valuable perspective. In an ideal situation, a classroom discussion could be based on each student having read one of these three books so that all three perspectives are present.
Computational physics often takes the form of developing a model of a physical system, reducing that model to a set of mathematical equations, and employing a computer to find numerical solutions to those equations. In the field of computational physics, one might consider the computer(s) to be analogous to the lab and the computer coding analogous to an experimental procedure. The endpoint of carrying out the experimental procedure or running the computer code is a body of data, which then must be interpreted. Just as it is possible to falsify results in experiments, it is also possible to falsify results in computations. Just as it is possible to intentionally design an experimental procedure to generate misleading results, it is possible to intentionally construct a poor model or mathematical rendering of the model to get misleading computational results. Careless computer coding can cause the same inaccuracies as careless experimental execution.
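The point about careless coding can be made concrete with a classic numerical pitfall (a generic illustration chosen for this guide, not an example drawn from the sources cited here): the textbook quadratic formula loses most of its precision in one root when b² is much larger than 4ac, while an algebraically equivalent rearrangement does not.

```python
import math

def roots_naive(a, b, c):
    """Textbook quadratic formula; suffers catastrophic cancellation
    in one root when b*b >> 4*a*c."""
    d = math.sqrt(b * b - 4 * a * c)
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

def roots_stable(a, b, c):
    """Numerically stable variant: compute the large-magnitude root
    without cancellation, then recover the small root from the
    product of the roots (x1 * x2 = c / a)."""
    d = math.sqrt(b * b - 4 * a * c)
    x1 = (-b - math.copysign(d, b)) / (2 * a)
    x2 = c / (a * x1)
    return x1, x2

# With b*b = 1e16 and 4*a*c = 4, the true small root is very nearly -1e-8.
a, b, c = 1.0, 1e8, 1.0
small_naive = roots_naive(a, b, c)[0]
small_stable = roots_stable(a, b, c)[1]
rel_err_naive = abs(small_naive + 1e-8) / 1e-8
rel_err_stable = abs(small_stable + 1e-8) / 1e-8
```

The two functions are algebraically identical; only the order of floating-point operations differs. That is exactly the kind of detail a careless coder overlooks and a careful one documents and tests.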
Since computational physics is a much newer field than experimental physics, a lot less is written about the field and ethical codes tend to overlook it. There is, however, an interesting article about computational modeling in mechanical engineering that covers much the same territory that one on computational physics might. The authors develop ethical standards for developing and using computational models and simulations, and they base their standards in part on the results of consulting experts in computation in various fields of engineering and science. The authors illustrate the importance of some of their proposed standards by briefly summarizing real cases in which developers or users failed to follow the standards with disastrous consequences. The paper at times assumes that the code developer and the end user are two different people, which is often not true in computational physics. Even so, some of the considerations raised in the paper are helpful in cases where a computational physicist later passes their own code on to someone else, such as a new member of their research group. Issues addressed by the authors include care in model development, choosing appropriate algorithms to solve the model equations, documenting the code properly, validating the code thoroughly, and representing the results fairly.
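One of the standards just mentioned, thorough validation, can be sketched in a few lines. In this minimal example (the toy problem and function name are illustrative assumptions, not taken from the article), a forward-Euler solver for exponential decay is checked against the known analytic solution, and the error is confirmed to fall at the expected first-order rate when the step size is halved.

```python
import math

def euler_decay(y0, k, t_end, n_steps):
    """Integrate dy/dt = -k*y with the forward-Euler method
    and return the solution at t_end."""
    dt = t_end / n_steps
    y = y0
    for _ in range(n_steps):
        y += dt * (-k * y)
    return y

# Validate against the known analytic solution y(t) = y0 * exp(-k*t).
y0, k, t_end = 1.0, 2.0, 1.0
exact = y0 * math.exp(-k * t_end)
err_coarse = abs(euler_decay(y0, k, t_end, 100) - exact)
err_fine = abs(euler_decay(y0, k, t_end, 200) - exact)

# Forward Euler is first-order accurate, so halving the step size
# should roughly halve the error.
ratio = err_coarse / err_fine
```

Checks like this one, comparing against analytic limits and confirming convergence behavior, are cheap to write and catch many of the coding errors that informal inspection misses.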
When one looks at the broad spectrum of physics experiments, many hazards may be present: high voltage, lasers or other intense light sources, radioactive sources, strong magnets, etc. Adherence to appropriate safety standards, setting a good example, teaching others about safety standards as appropriate, and preventing injury to those who are not a part of the research group are among the ethical principles associated with experimentation.
The consequences of giving insufficient attention to safety can be severe. A letter to the editor of Physics Today describes an accident involving shop equipment that led to the death of a high school physics student. In July of 2014, an intern at Los Alamos National Laboratory was exposed to a brief, intense flash of light. The exposure resulted in a small hole being burned into her retina and a consequent degradation of her eyesight. A lengthy investigative report brought up numerous problems with safety procedures in the lab. The report may be a bit much for some undergraduate students to digest, but its critique of the laboratory in question will provide a sense of what a robust safety program should look like. Two employees lost their jobs as a result of this incident.
Richard Feynman relates a story about enriched uranium during the Manhattan Project. Oak Ridge was designing a plant for the enrichment, and some people involved in the project did not fully appreciate the importance of not allowing the accumulation of too much enriched uranium in a small space. Despite the fact that the Army wanted to keep this highly classified information compartmentalized, Feynman convinced the officer in charge that safety procedures would be followed more closely if the people working on the plant were given enough background about the nature of nuclear reactions to understand why this aspect of safety was so important. Once the plant workers understood why these procedures were needed, compliance improved significantly.
As far as resources are concerned, research-oriented universities generally have formal safety policies and procedures and one or more safety offices. There are numerous safety manuals available online. For instance, as of this writing, Princeton has a publicly available, comprehensive manual. Due to its length, it would not be feasible to have students read the entire manual. Having the students read one or two sections (and perhaps having different students read different sections) will be enough to promote a discussion on general safety considerations.
There are two well-known cases from the field of physics in which findings of research misconduct have been documented: the Schön and Ninov cases, both of which were discussed earlier in this chapter. When addressing well-documented instances of misconduct, it is helpful to keep the perspective that these cases are outliers, both in terms of the impact they have had and the extent to which they are documented. Even though they are more dramatic than most cases of misconduct, they are worth studying because we know so much about them.
To get a feel for how common research misconduct is in physics, it is helpful to examine survey data. A survey of physicists in the United States was taken in 2004 in order to learn about perceptions related to ethics in the community. The response rate among early-career physicists (PhD students and those having recently received a PhD) was higher than in other categories, and 39% of them reported having direct knowledge of ethical transgressions related to research. They also reported serious concerns over mistreatment of subordinates and how this mistreatment could, at times, foster research misconduct. The article about the survey includes both quantitative and qualitative results from the research. A more recent survey took a different approach, using a much smaller number of subjects but involving in-depth interviews. This survey of physicists in the United States and United Kingdom focused on whether there is a gender difference in the perception of ethical issues. Among the findings were that some physicists view women in physics as being more ethical and men as being more competitive. The interviews indicated a tendency to perceive competitive behavior as having an inherently unethical component. These researchers also found that physicists were encountering a wide range of ethically ambiguous situations. The detailed results appear in Science and Engineering Ethics, while a brief version is found in a Physics Today commentary.
The author is grateful for the time and effort of the anonymous reviewers of this work, and for their numerous helpful suggestions.
 Sandra C. Greer, Elements of Ethics for Physical Scientists (MIT Press, Cambridge, MA, 2017).
 American Association for the Advancement of Science, Science for All Americans. Chapter 1: The Nature of Science. (American Association for the Advancement of Science, Washington, DC, 1989). http://www.project2061.org/publications/sfaa/online/chap1.htm (accessed September 20, 2019).
The Definition of Research Misconduct, The Office of Research Integrity, https://ori.hhs.gov/definition-misconduct (accessed September 20, 2019).
 Jonathan D. Quick and Heidi Larson, “The Vaccine-Autism Myth Started 20 Years Ago. Here’s Why it Still Endures Today,” Time February 18, 2018. https://time.com/5175704/andrew-wakefield-vaccine-autism/ (accessed September 18, 2019).
 David Goodstein, On Fact and Fraud: Cautionary tales from the front lines of science, (Princeton University Press. Princeton, NJ, 2010).
Bertram Schwarzschild, “Lawrence Berkeley Lab Concludes that Evidence of Element 118 Was a Fabrication,” Physics Today 55 (9) 15 (2002). https://physicstoday.scitation.org/doi/full/10.1063/1.1522199
 George Johnson, “At Lawrence Berkeley, Physicists Say a Colleague Took Them for a Ride,” New York Times October 10, 2002. https://www.nytimes.com/2002/10/15/science/at-lawrence-berkeley-physicists-say-a-colleague-took-them-for-a-ride.html
 Barbara Gross Levi, “Bell Labs Convenes Committee to Investigate Questions of Scientific Misconduct,” Physics Today 55 (7) 15-16 (2002). https://doi.org/10.1063/1.1506737
 Barbara Gross Levi, “Investigation Finds that One Lucent Physicist Engaged in Scientific Misconduct,” Physics Today 55 (11) 15-17 (2002). https://doi.org/10.1063/1.1534995
 Lucent Technologies, “Report of the Investigation Committee on the Possibility of Scientific Misconduct in the Work of Hendrik Schön and Coauthors, September 2002,” https://media-bell-labs-com.s3.amazonaws.com/pages/20170403_1709/misconduct-revew-report-lucent.pdf (accessed September 28, 2019).
 Eugenie Samuel Reich, Plastic Fantastic: How the Biggest Fraud in Physics Shook the Scientific World (Palgrave Macmillan, New York, NY, 2009).
 Louise Bezuidenhout, “Variations in Scientific Production: What Can We Learn from #Overlyhonestmethods?,” Science and Engineering Ethics 21 (6) 1509-1523 (2015). https://doi.org/10.1007/s11948-014-9618-9
 Irving Langmuir and Robert N. Hall, “Pathological Science,” Physics Today 42 (10) 36 (1989). https://doi.org/10.1063/1.881205
 R. W. Wood, “The n-Rays,” Nature 70 530-531 (1904). https://www.nature.com/articles/070530a0.pdf
 Christopher Cooper et al., “Second Opinions on ‘Pathological Science’,” Physics Today 43 (3) 13-14 and 105-112 (1990). https://doi.org/10.1063/1.2810480
 Barbara G. Levi, “Doubts Grow as Many Attempts at Cold Fusion Fail,” Physics Today 42 (6) 17-19 (1989). https://doi.org/10.1063/1.2811042
 Irwin Goodwin, “Fusion in a Flask: Expert DOE Panel Throws Cold Water on Utah ‘Discovery’,” Physics Today 42 (12) 43-45 (1989). https://doi.org/10.1063/1.2811241
 W. Peter Trower, “Cold Fusion as Seen with X-Ray Vision,” Physics Today 43 (7) 13 (1990). https://doi.org/10.1063/1.2811074
 Leaf Turner et al., “Thoughts Unbottled by Cold Fusion,” Physics Today 42 (9) 140-144 (1989). https://doi.org/10.1063/1.2811168
 Bernard J. Feldman, “On Fact and Fraud: Cautionary Tales from the Front Lines of Science” (book review). Physics Today 63 (7) 50 (2010). https://doi.org/10.1063/1.3463629
 Fred McGalliard, Scott R. Chubb, and Bernard J. Feldman, “Cold Fusion and Reproducibility,” Physics Today 63 (11) 11 (2010). https://doi.org/10.1063/1.3518197
 John R. Huizenga, Cold Fusion: The Scientific Fiasco of the Century, (University of Rochester Press, Rochester, NY, 1992).
 Frank Close, Too Hot to Handle: The Race for Cold Fusion (Princeton University Press, Princeton, NJ, 1991).
 Gary Taubes, Bad Science: The Short Life and Weird Times of Cold Fusion (Random House, New York, NY, 1993).
 David J. Kijowski, Harry Dankowicz, and Michael C. Loui, “Observations on the Responsible Development and Use of Computational Models and Simulations,” Science and Engineering Ethics 19 (1) 63-81 (2013). https://doi.org/10.1007/s11948-011-9291-1
 Irving E. Dayton, “Student lab safety emphasized,” Physics Today 64 (8) 9 (2011). https://doi.org/10.1063/PT.3.1196
 Connon R. Odom, “Los Alamos Laser Eye Injury Investigation,” LA-UR-05-0282 (2004). https://permalink.lanl.gov/object/tr?what=info:lanl-repo/lareport/LA-UR-05-0282
 Rebecca Trounson, “Safety and Security Breaches at Los Alamos Nuclear Lab Cost 5 their Jobs,” Los Angeles Times, September 16, 2004. https://www.latimes.com/archives/la-xpm-2004-sep-16-na-alamos16-story.html
 Richard P. Feynman, “Surely You’re Joking, Mr. Feynman!”: Adventures of a curious character (W. W. Norton & Company, New York, NY, 1985), pp. 120-125.
 Princeton University Office of Environmental Health and Safety. Laboratory Safety Manual. https://ehs.princeton.edu/laboratory-research/laboratory-safety/laboratory-safety-manual (accessed September 2, 2019).
 Kate Kirby and Frances A. Houle, “Ethics and the Welfare of the Physics Profession,” Physics Today 57 (11) 42-46 (2004). https://doi.org/10.1063/1.1839376
 Elaine Howard Ecklund, “A Gendered Approach to Science Ethics for US and UK Physicists,” Science and Engineering Ethics 23 (1) 183-201 (2017). https://doi.org/10.1007/s11948-016-9751-8
 David R. Johnson and Elaine Howard Ecklund, “Ethical Ambiguity in Science,” Science and Engineering Ethics 22 989-1005 (2016). https://doi.org/10.1007/s11948-015-9682-9
 Elaine Howard Ecklund, David R. Johnson, and Kristin R. W. Matthews, “Commentary: Study highlights ethical ambiguity in physics,” Physics Today 68 (6) 8 (2015). https://doi.org/10.1063/PT.3.2796