This material is designed to provide assistance to those involved in ethics education in physics. It is not intended to be a complete discussion of all topics in ethics relevant to the physics community. Rather, it is designed to give the reader some feel for the breadth of relevant topics, to point the reader towards useful resources, and to suggest ways in which this material could be addressed in a classroom setting.
The underlying premise of this work is that much has already been written about ethics in physics, but most of this existing material is not readily located by searching on the terms “ethics” and “physics”. These chapters will not describe ethical issues and case studies in detail but instead will point the reader to sources that do supply the more detailed perspective. The intent is to identify resources that can conveniently be used as reading assignments in undergraduate or graduate level physics classes. Part of the challenge in making ethical decisions is dealing with the complexity that real-world situations introduce. For that reason, where possible sources in which physicists describe cases they have had personal experience with will be used.
Incorporated into the description of each resource will be suggestions on how to run a class discussion based on the material. It is hard to over-emphasize the usefulness of guided classroom discussion as a means for providing multiple perspectives and further insight into ethical issues. It is helpful to ground these discussions in the professional codes discussed in Chapter 1.
Chapter 0: Introduction: Pedagogy and Assessment
Using case studies
Managing class discussions
Other activities to engage the mind
About this guide
Chapter 1: Ethical Codes
Section 1.1: Introduction
Section 1.2: The American Physical Society Guidelines on Ethics
Section 1.3: Other American Institute of Physics codes
Section 1.4: Physics codes outside of the United States
Section 1.5: Codes from other fields
Section 1.6: Ethical standards implied by institutional policies
Section 1.7: Human subjects research issues: sometimes overlooked in physics
Chapter 2: Laboratory Practices
Section 2.1: Introduction
Section 2.2: Research misconduct and how it harms the scientific community
Section 2.3: Carelessness and how it harms the scientific community
Section 2.4: Computational physics
Section 2.5: Laboratory safety
Section 2.6: How common is research misconduct in physics?
Chapter 3: Data: Recording, Managing, and Reporting
Section 3.1: Introduction
Section 3.2: The lab notebook
Section 3.3: Data management and archiving
Section 3.4: Digital images
Section 3.5: Reporting results
Section 3.6: Case studies
Chapter 4: Publication Practices
Section 4.1: Introduction
Section 4.2: Authorship
Section 4.3: Citations
Section 4.4: Plagiarism
Section 4.5: Self-plagiarism, dual submission, and fragmented publication
Section 4.6: Errata and retractions
Section 4.7: Conflicts of interest
Section 4.8: Publication metrics
Section 4.9: Journal quality
Section 4.10: Publication in the electronic age
Chapter 5: Peer Review
Section 5.1: Introduction
Section 5.2: Fairness
Section 5.3: Participation
Section 5.4: Timeliness
Section 5.5: Confidentiality
Section 5.6: Conflicts of interest
Section 5.7: Career advancement
Section 5.8: Textbooks
Chapter 6: Underrepresented Groups in Physics
Section 6.1: Introduction—The need for diversity
Section 6.2: Statistics
Section 6.3: APS policy statements
Section 6.4: Explicit bias
Section 6.5: Systemic bias
Section 6.6: Implicit bias
Section 6.7: Programs of the American Physical Society and other organizations
Section 6.8: Role models
Chapter 7: Physics and Military Research
Section 7.1: Introduction
Section 7.2: The Manhattan Project
Section 7.3: The Strategic Defense Initiative
Section 7.4: Arms control in the age of nuclear weapons
Section 7.5: Dual-use technology
Section 7.6: General discussion prompts for the entire chapter
Chapter 8: Climate Change
Section 8.1: Introduction
Section 8.2: Observational data
Section 8.3: Some elements in a climate model
Section 8.4: Global Climate Models
Section 8.5: Focused action
Section 8.6: Broader action on climate change
Chapter 9: Communicating Science to the General Public
Section 9.1: Introduction
Section 9.2: Communicating about climate change
Section 9.3: Communicating with the media
Section 9.4: Communicating with political leaders
This chapter looks at several different aspects of recording, managing and reporting data. In many cases, concrete, specific standards are lacking even though there are widely accepted general principles. To provide more insight into the general principles, the chapter concludes with a discussion of three well documented case studies. Instructors planning to address any of the issues outlined in Sections 3.1 – 3.5 may wish to look ahead to the case studies to see if any of these cases will support their intended focus.
The National Science Foundation has a policy on Data Management Plans that provides a useful starting point for discussion of data-related issues. The policy indicates that prompt dissemination of results is an expectation of all grant awardees. Investigators are also expected to share their primary data with others, although there are a few exceptions to this part of the policy. There must also be a plan for retaining data for an appropriate length of time. One of the goals of this policy is to ensure that investigators have the information needed to answer questions about their disseminated research. The APS Guidelines on Ethics cover similar issues in Section I: The Research Record and Publication Results, Subsection: Research Results.
What are the areas of overlap between the NSF Data Management Policy and the APS Guidelines on Ethics?
There was a time when details about experimental procedures and results were all recorded in a single place, the lab notebook. A variety of guidelines, such as numbering your pages and writing only in ink, were designed to provide a lasting record of the experimental details so that they could be studied later both by the investigators and by others outside the research group. If someone were to question the legitimacy of a published result, the lab notebook could be referred to as the best available documentation for what happened during the experiment. The traditional lab notebook was an essential element in maintaining trust within the scientific community. As electronic instruments began taking over the duties of recording data, it became possible to accumulate larger quantities of data. This development in turn led to the accumulation of bodies of data so large that they would not easily fit into a lab notebook. Now it is common for most data to reside in computer memory while other information about the underlying experiments may be found either in a traditional handwritten lab notebook or in a file, either on a lab-based computer or on a remote server in the cloud. Currently, no single approach is used throughout all of physics for collecting and archiving data.
Similarly, those in the physics community who continue to use lab notebooks do not appear to have a formal consensus on how they should be structured. An article in the American Journal of Physics identifies a fair amount of diversity in the structure of lab notebooks. Some standard procedures for hardcopy lab notebooks include using a bound book with numbered pages, writing in ink, deleting information only by crossing out (not by erasing), dating entries, and writing out complete descriptions of experiments. These procedures are designed to maintain the integrity of the research record and to make it easy for others to review that record. Recognizing that no system is foolproof, a reasonable goal for electronic forms of lab notebooks is that they be as difficult to tamper with as their hard copy predecessors.
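One simple way to make electronic records tamper-evident is to record a cryptographic fingerprint of each raw data file in the notebook at the time of collection. The Python sketch below illustrates the idea under the assumption that raw data can be serialized to bytes; the measurement strings are purely hypothetical.

```python
import hashlib

def file_fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest that changes if even one byte of the data changes."""
    return hashlib.sha256(data).hexdigest()

# Record the fingerprint of a day's raw data alongside the notebook entry.
raw_data = b"drop 17: rise time 12.45 s, fall time 11.80 s"
original = file_fingerprint(raw_data)

# Later, anyone can check the archived file against the recorded fingerprint.
tampered = file_fingerprint(b"drop 17: rise time 12.45 s, fall time 11.20 s")
print(original == file_fingerprint(raw_data))  # True: data unchanged
print(original == tampered)                    # False: the edit is detectable
```

Like writing in ink, a fingerprint recorded at collection time does not prevent alteration of the record, but it makes alteration detectable, which is the standard the hard copy conventions were designed to meet.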
While students are used to owning the notebook they keep for lab-based course work, it is important for them to recognize that the research lab notebook is often considered to be the property of the lab, not of the individual who makes entries into it. As such, it should be understandable to others working in that lab, and it should remain in the lab, even after that individual leaves the institution.
As noted in Chapter 2, expanding the frontiers of scientific knowledge is a community activity. This is an important principle to keep in mind when acquiring and archiving data. Data should be recorded in a way that is not only understandable to the individual, but also to that person’s collaborators and to others outside the immediate circle. Some inexperienced researchers, in their haste to push through experiments, may make some of the following mistakes:
Not much has been written in peer-reviewed journals on good practices for data management and retention, particularly in the physical sciences. Large research collaborations typically have carefully designed data management plans, and the size of their groups affords them the opportunity to have people who devote significant effort to maintaining the integrity of the data. Smaller collaborations, by contrast, may lack those resources, and limited guidance is available to them. Two open access articles provide helpful advice about issues to consider when deciding how to manage data. While they are written from the perspective of the life sciences, much of the content is relevant to all data-intensive fields of science. The first article provides tips for constructing the type of data management plans required by many funding agencies. The second offers considerations for deciding how to archive data. Both articles are relatively short and can easily be combined into a single reading assignment. They also provide a starting point for discussing the data management and retention issues that arise in the Schön and Ninov cases discussed below.
The ease with which digital images can be inappropriately manipulated has become a concern among publishers of scientific journals. An article by Parrish and Noonan looks at case histories of research misconduct involving image manipulation. While the focus is on the life sciences, many of the issues raised are relevant to fields of physics where digital images are used.
The Physical Review Letters Information for Authors states, “Figures should accurately present the scientific results. If adjustments to images, such as changing its brightness, are made, state the adjustment in the figure caption.”
Science magazine advises, “Science does not allow certain electronic enhancements or manipulations of micrographs, gels, or other digital images. Figures assembled from multiple photographs or images, or non-concurrent portions of the same image, must indicate the separate parts with lines between them. Linear adjustment of contrast, brightness, or color must be applied to an entire image or plate equally. Nonlinear adjustments must be specified in the figure legend. Selective enhancement or alteration of one part of an image is not acceptable. In addition, Science may ask authors of papers returned for revision to provide additional documentation of their primary data.”
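The distinction such policies draw between a uniform linear adjustment and selective enhancement can be illustrated with a toy NumPy sketch; the tiny array standing in for a micrograph, and the particular gain and offset values, are entirely hypothetical.

```python
import numpy as np

# A toy 4x4 grayscale "micrograph" (pixel values 0-255), purely illustrative.
image = np.array([[10, 20, 30, 40]] * 4, dtype=float)

# Generally acceptable (if disclosed): a linear brightness/contrast change
# applied uniformly to the entire image.
uniform = np.clip(1.5 * image + 10, 0, 255)

# Not acceptable: the same adjustment applied only to one region,
# selectively enhancing part of the image.
selective = image.copy()
selective[:2, :2] = np.clip(1.5 * selective[:2, :2] + 10, 0, 255)

# A uniform linear map preserves the relative intensity ordering of pixels;
# the selective version breaks it where the enhanced region meets the rest.
print(np.all(np.diff(uniform, axis=1) > 0))    # True
print(np.all(np.diff(selective, axis=1) > 0))  # False
```

The point of the sketch is that a uniform linear map leaves every comparison between pixels intact, whereas a regional adjustment changes what the image appears to show, which is why journals treat the two so differently.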
The Office of Research Integrity (ORI), whose charge is to direct U.S. Public Health Service research integrity activities, has a comprehensive set of recommendations for the treatment of digital images in its Guidelines for Best Practices in Image Processing. The main page for these recommendations has individual boxes for each of the twelve guidelines. Note that clicking on any box leads to a new page with more details about that guideline, making this a more extensive reading assignment than it initially appears.
An instructor wishing to address the topic of digital images might want to have their students read the ORI guidelines as well as the brief statements from Physical Review Letters and Science. The instructor could then draw from the Parrish and Noonan paper some examples of inappropriate digital image manipulation.
One of the most challenging areas to explore from an ethical perspective is that of reporting scientific results. Large grey areas exist when it comes to deciding what can be reported versus what must be reported, as well as how to present results to highlight patterns without suggesting trends for which insufficient evidence exists. In a commencement address, Richard Feynman set a high standard in this regard:
It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty—a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things that you thought of that you’ve eliminated by some other experiment, and how they worked—to make sure the other fellow can tell if they have been eliminated….
In summary, the idea is to try to give all of the information to help others to judge the value of your contribution, not just the information that leads to judgment in one particular direction or another.
It may be valuable for the instructor to acknowledge that Feynman’s suggestion is easier to follow if one’s scientific reputation is already firmly established. Someone trying to make a mark in the field of physics may be hesitant to express any uncertainty in the ideas they put forth. Nevertheless, this standard of scientific honesty benefits the scientific community and, in the long run, the individual scientist.
It is also important to remind students that misleading presentation of research results can have a negative impact on how the physics community relates to society at large, for instance by leading society to make bad choices. Pickett and Roche report results from a survey of the general public indicating a desire for harsh consequences not only for those who commit fraud in research but also for those who mislead the public through selective reporting of results. While the Pickett and Roche paper does bring valuable information to light, aspects of how it is written (such as overgeneralizations in the introduction) might not be helpful for students, so it is probably best used as background reading for the instructor.
Finally, one of the foundational principles of science is that results of experiments should be reproducible. It is generally expected that results of laboratory-based experiments are not reported until they have been checked for reproducibility by that group. Moreover, one of the primary reasons that research groups are expected to report fully on their experimental methods and to maintain good records of their data is to allow other research groups to try to reproduce their results.
At this time, few articles have been written about ethical representation of data in physics. Instructors wishing to address the issue may find the Millikan oil drop experiment, discussed below, a good way to launch discussion.
As discussed in Chapter 2, many accounts of the Victor Ninov case are available. David Goodstein’s On Fact and Fraud: Cautionary tales from the front lines of science covers the case briefly, based in part on his access to the investigation report, most of which has remained confidential. For a more comprehensive treatment, see Physics Today. One issue that could be discussed in this case is whether enough safeguards were in place to protect the integrity of the research data. Students may not have enough information to answer that question, however, so posing a hypothetical situation may help: Having read about the Ninov case, imagine you are involved in a collaboration of 5-10 people. The experiment generates large quantities of data that are stored electronically. What steps could your group take to reduce the chances of one group member altering the raw data in an undetectable way?
The essentials of the case of Hendrik Schön are covered in two news articles from Physics Today. Schön was employed by Bell Labs, doing research on the electrical properties of materials. Of particular relevance to this chapter is the admission by Schön that he substituted an analytically calculated curve for experimental data. Another good reference is the report issued by the committee set up by Bell Labs to investigate Schön’s research. While the entire report is 129 pages long, the main body is only nineteen pages, with the rest being appendices, most of which would probably be optional reading. Among the issues worth bringing out in discussion of the importance of proper recordkeeping are:
There are a number of sources one can consult on ethical issues related to the Millikan oil drop experiment. Two key concerns have arisen as a result of historians studying his lab notebooks. First, in his lab notebooks he had recorded measurements on a larger number of drops than were reported in his 1913 paper. Did he select data to report in an appropriate way, or was his data selection based on displaying only what would support his hypothesis? Even if one concludes that his data selection was appropriate, a second issue arises when Millikan’s paper states, “It is to be remarked, too, that this is not a selected group of drops but represents all of the drops experimented upon during 60 consecutive days….”. In light of the evidence in Millikan’s lab notebooks, is there any way in which that statement can be viewed as truthful?
A paper by Richard C. Jennings examines some of these issues. While some of the conclusions drawn in the paper do not necessarily seem consistent with the factual information introduced, the paper nevertheless serves as a good springboard for discussion. That said, many physicists may be bothered by the way Jennings treats fractional charge. A few people (mostly non-physicists) have suggested that some of Millikan’s omitted data might have contained evidence for fractional charge, and thus that by not publishing this data Millikan may have hindered the development of quark theory, which involves charges of e/3 and 2e/3. However, current quark theory indicates that it would not be possible to observe fractional charge in the oil drop experiment. Moreover, if it were possible, there would almost certainly have been copious experimental confirmation of the observation during the century that has elapsed since Millikan’s paper was published.
Allan Franklin has written extensively about the Millikan oil drop experiment. In one of Franklin’s works, he introduces the experiment with some basic physics, accessible to first or second year physics students, to help the reader understand how the experiment worked. He then discusses issues related to how Millikan analyzed his data. Franklin concludes that Millikan’s exclusion of five drops as well as his selective use of analytical techniques could not be justified. He points out, though, that the effect of these actions was not to change significantly the proposed value of the charge of the electron but rather to reduce the apparent uncertainty in that proposed value.
One of the questions a discussion based on this case can address is when it is acceptable to exclude some measurements from reported results. Some students may take the position that all measured values should be reported. To help the discussion along, one might ask what is meant by “data.” To help clarify what must be reported, data can usefully be defined as measured values or other information acquired by following a well-defined experimental protocol. As an extreme case, if a student timing the period of a pendulum dozes off during one trial, causing the measured time to be much too large, that measured value would not satisfy the definition of a data point. Likewise, if a student records a series of measurements before realizing that a critical instrument has not yet been calibrated, those measured values would not be considered data. This definition allows one to exclude measured values that can easily be dismissed as irrelevant.
There remains, however, the case of measured values that fall far outside the trend apparent in the other data but for which there is no obvious breach of experimental protocol. One approach to handling such apparent outliers is to report them and then state explicitly that they will be excluded from subsequent analysis. This clarifies exactly how the data set has been narrowed and gives the reader the tools to examine the impact of neglecting those data points.
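As a minimal sketch of this report-then-exclude approach, the hypothetical pendulum-timing numbers and the crude one-second outlier cut below are invented for illustration; the point is that every measured value is reported, the exclusion is stated explicitly, and the analysis then proceeds on the retained subset.

```python
import statistics

# Hypothetical pendulum-period measurements in seconds; 9.90 s is an
# apparent outlier with no known breach of protocol.
measurements = [2.01, 1.98, 2.03, 1.99, 9.90, 2.00]

# Report every value, flag apparent outliers explicitly, and exclude
# them from subsequent analysis (the 1.0 s cut is an arbitrary choice
# for this illustration).
median = statistics.median(measurements)
outliers = [m for m in measurements if abs(m - median) > 1.0]
retained = [m for m in measurements if m not in outliers]

print("all values:", measurements)
print("excluded as outliers:", outliers)      # [9.9]
print("mean of retained data:", round(statistics.mean(retained), 3))  # 2.002
```

A reader given all three lines of output can reconstruct the full data set, see exactly how it was narrowed, and judge for themselves whether the exclusion was reasonable.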
Three other considerations may also be helpful to bring out in a discussion of the oil drop experiment:
The author is grateful for the time and effort of the anonymous reviewers of this work, and for their numerous helpful suggestions.
 National Science Foundation, Directorate of Mathematical and Physical Sciences, Division of Physics, “Advice to PIs on Data Management Plans”, January 2, 2018. https://www.nsf.gov/bfa/dias/policy/dmpdocs/phy.pdf (accessed September 18, 2019).
 American Physical Society Guidelines on Ethics (19.1) (2019). https://www.aps.org/policy/statements/guidlinesethics.cfm (accessed September 18, 2019)
 Jacob T. Stanley and H. J. Lewandowski, “Recommendations for the use of notebooks in upper-division physics lab courses,” American Journal of Physics 86 (1) 45-53 (2018). https://doi.org/10.1119/1.5001933
 William K. Michener, “Ten Simple Rules for Creating a Good Data Management Plan”, PLOS Computational Biology 11 (10): e1004525 (2015). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4619636/ doi: 10.1371/journal.pcbi.1004525
 Edmund M. Hart, et al., “Ten Simple Rules for Digital Data Storage,” PLOS Computational Biology 12 (10): e1005097 (2016). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5072699/ doi: 10.1371/journal.pcbi.1005097
 Debra Parrish and Bridget Noonan, “Image Manipulation as Research Misconduct,” Science and Engineering Ethics 15 (2) 161-167 (2009). https://doi.org/10.1007/s11948-008-9108-z
 Physical Review Letters, “Information for Authors,” https://journals.aps.org/prl/authors (accessed September 23, 2019)
 Science, “Instructions for preparing an initial manuscript,” https://www.sciencemag.org/authors/instructions-preparing-initial-manuscript (accessed September 23, 2019)
 Office of Research Integrity, “Guidelines for Best Practices in Image Processing,” https://ori.hhs.gov/education/products/RIandImages/guidelines/list.html (accessed September 23, 2019).
 Richard P. Feynman, “Surely You’re Joking, Mr. Feynman!”, (W. W. Norton & Company, New York, NY, 1985), p. 341.
 Justin T. Pickett and Sean Patrick Roche, “Questionable, Objectionable or Criminal? Public Opinion on Data Fraud and Selective Reporting in Science,” Science and Engineering Ethics 24 (1) 151-171 (2018). https://doi.org/10.1007/s11948-014-9618-9
 David Goodstein, On Fact and Fraud: Cautionary tales from the front lines of science, (Princeton University Press. Princeton, NJ, 2010).
Bertram Schwarzschild, “Lawrence Berkeley Lab Concludes that Evidence of Element 118 Was a Fabrication,” Physics Today 55 (9) 15 (2002). https://physicstoday.scitation.org/doi/full/10.1063/1.1522199
 Barbara Gross Levi, “Bell Labs Convenes Committee to Investigate Questions of Scientific Misconduct,” Physics Today 55 (7) 15-16 (2002). https://doi.org/10.1063/1.1506737
 Barbara Gross Levi, “Investigation Finds that One Lucent Physicist Engaged in Scientific Misconduct,” Physics Today 55 (11) 15-17 (2002). https://doi.org/10.1063/1.1534995
 Lucent Technologies, “Report of the Investigation Committee on the Possibility of Scientific Misconduct in the Work of Hendrik Schön and Coauthors, September 2002,” https://media-bell-labs-com.s3.amazonaws.com/pages/20170403_1709/misconduct-revew-report-lucent.pdf (accessed September 28, 2019).
 R. A. Millikan, “On the Elementary Electrical Charge and the Avogadro Constant,” Physical Review 2 (2) 109-143 (1913), see especially p. 138. https://doi.org/10.1103/PhysRev.2.109
Richard C. Jennings, “Data Selection and Responsible Conduct: Was Millikan a Fraud?” Science and Engineering Ethics 10 (4) 639-653 (2004). https://link.springer.com/article/10.1007/s11948-004-0044-2
 Allan Franklin, “Selectivity and the Production of Experimental Results,” Archives for History of Exact Sciences 53 (5) 399-485, see especially pp. 422-431 (1998). https://link.springer.com/article/10.1007/s004070050031