Privacy and Confidentiality: As Related to Human Research in Social and Behavioral Science (Research Involving Human Participants V2)

Author(s): Joan E. Sieber

Abstract

As social and behavioral research expands to involve diverse populations, contexts, and sensitive topics, it raises many complex issues of privacy and confidentiality. These issues go beyond what is contained in textbooks or known to most researchers and Institutional Review Board (IRB) members. The literature on these issues is found in a variety of applied research publications in applied statistics, program evaluation, criminology, education, economics, and demography, but not in the mainstream social science literature. Understanding and solving some of these problems requires knowledge and resources that can be found via major research libraries, but only if one knows what to look for. IRBs cannot be expected to generate and disseminate the needed resources and body of knowledge on their own, though many have made a remarkable start in this direction.

The key recommendations of this report are that more useful definitions of privacy and confidentiality be given in the Common Rule and that two educational web sites be developed, structured similarly to the Help menu of Microsoft Word. The small web site would guide IRBs in locating and structuring helpful materials and resources tailored to their specific location. The major web site would provide the education and information needed by IRBs, researchers, and students, as well as by teachers of research methodology who wish to use these materials for instructional purposes.

Only minor modifications of the Common Rule are needed, under definitions and informed consent requirements. No additional regulations or surveillance are needed except that IRBs would be required to take the steps specified in the small IRB web site to tailor the larger general web site to their specific setting. Enforcement of intelligent use of the web sites by researchers would be the role of IRBs when dealing with protocols, just as IRBs seek to foster intelligent ethical problem solving now.

The problem that needs to be solved is not lack of rules or lack of ethical concern on the part of researchers. The problem is lack of education - that is, lack of knowledge, problem solving skills, and resources to interpret the existing rules. Additional rules or greater specificity of rules would raise three serious problems (Evers, 2000):

  1. Acceptance of many more detailed or specific rules across 17 agencies and diverse research contexts would be limited.
  2. Opportunities for intelligent interpretation, and deciding between principles or values in conflict would be diminished.
  3. Efforts required to follow a given rule may be disproportionately great relative to the expected gain or results.

An intelligently developed and managed user-friendly web site in the hands of a capable scientific workforce and its managers creates a culture in which ignorance is no excuse and learning is easy. Even the occasional desperate individual, eager for a quick publication, would find it more difficult to skirt the rules. Three basic recommendations are proposed to the Commission:

  1. Change the definitions of privacy and confidentiality contained in the Common Rule so that they are more generic and reflect a broader understanding of these two concepts, in contrast to the current definitions, which seem oriented to biomedical research and medical records.
  2. Commission the development of educational information for the two web sites. Establish a small oversight committee that would edit the educational material, oversee the work of a web manager, and consider new ideas, submissions, or criticisms from users of the web site. Above all, this material and the roles connected with its development and management should be treated as educational and not as a regulatory requirement.
  3. Refer all readers of the Common Rule to the major web site for assistance with solving problems of privacy and confidentiality. Refer IRBs to the small web site for guidance in tailoring educational material to their context and location.

Many specific recommendations are offered concerning the content of the two web sites. Most of these web-content recommendations concern concepts of privacy and confidentiality, relevant laws and regulations, approaches to learning what is private to individuals, and approaches to assuring privacy and confidentiality.

Introduction

The National Bioethics Advisory Commission (NBAC) has requested analysis and recommendations concerning issues of privacy and confidentiality that arise in social and behavioral research. In this response to NBAC's request, relevant issues are raised and explored, and recommendations are offered. This introductory section provides an overview of the problem that is addressed and the general direction that is taken throughout this paper.

The Common Rule governing human research discusses privacy and confidentiality in ways more suited to biomedical research than to social and behavioral research (hereinafter referred to as social research). More useful definitions of privacy and confidentiality are offered herein. Even with more useful definitions, however, the translation of these ideas into an effective protocol sometimes requires kinds of knowledge, experience, and problem-solving skills that are absent from the training of most scientists, students, and IRB members. Hence, it is primarily education and easily accessed information, and not more rules or enforcement, that are needed by IRBs and their clientele.

The meaning of privacy and confidentiality in social research inheres in the culture and particular circumstances of the individual subject,1 the nature and context of the research, and the particular social and political environment in which the research and use of the data occur. Consequently, their definition, as well as the interpretation of requirements to respect privacy and assure confidentiality, is not a trivial or simple matter.

Informed consent is the mechanism through which subjects decide whether or not to allow access to themselves, and through which agreements are made concerning the handling of identifiable data. However, the regulations of human research, as currently written, give little hint of how finely the protocol and informed consent relationship must be crafted in response to the manifold aspects of privacy and confidentiality in social research. Worse, they do not allude to the background of concepts and plans that would underlie such an effective protocol and relationship. Remarkably, some IRBs function quite effectively despite these ambiguities, through wise interpretation by IRB members who are well schooled in ethical problem solving and whose scientific training has provided relevant research competencies. Such a fortunate confluence of education, competency, and effort is not the norm, however. Nor can such outstanding performance reasonably be expected of most IRBs, which are woefully overworked and under-budgeted. An educational program is recommended herein that would provide a foundation for effective ethical problem solving by IRBs and researchers with respect to privacy and confidentiality.

At the core of any ethical problem solving in research is the meshing of a) valid scientific methods with b) relevant ethical considerations such as respect for privacy and assurance of confidentiality, in pursuit of answers to c) nontrivial research questions - with as little compromise as possible in each of these three dimensions. Skilled IRBs can help guide this process.

For example, in the interest of confidentiality and informed consent, parents recruited for a study of child-rearing practices should be warned of the limits of confidentiality (e.g., mandatory reporting of evidence of child abuse). While this might distort the researcher's random sampling scheme and jeopardize generalizability by eliminating those who decline to participate, it provides a higher level of confidence in the candor of those who choose to participate and suggests conducting a parallel study of parents who have been convicted of child abuse.

In response to a problem such as this, the IRB can put an inexperienced researcher in touch with local experts who can counsel the researcher about issues of mandatory reporting. The IRB can also remind the researcher to use appropriate skills of rapport and effective communication to ensure that the limits of confidentiality are clearly conveyed to potential subjects. A range of research approaches can enable the project to "surround" the research question despite distortion of the random sampling scheme. In short, many institutional resources can be focused on solving this set of confidentiality-related problems. Some IRBs operate at this level of sophistication.

Unfortunately, issues of privacy and confidentiality may also be handled in naïve or bureaucratic ways due to lack of knowledge and relevant problem solving skills. It is not a required role of IRBs to rehabilitate deficient protocols, nor is the knowledge required to do so necessarily inherent in the repertoire of skills of most scientists. Research methodology textbooks remain woefully deficient in this area. This places the IRB in an awkward leadership position and strains the constructive and collegial atmosphere that an effective research organization requires.

Whether because of a particular interpretation by the Office for Human Research Protections (OHRP) or because of their own inexperience, IRBs may act to avoid one risk and unwittingly create another. For example, one irate parent objected to an extremely imprudent survey question about parents (administered to college students). In response, OHRP (then the Office for Protection from Research Risks, or OPRR) announced an interpretation to the effect that when questions are asked of a subject about another person, the consent of that other person must also be obtained (OPRR, 1999). This unfortunate general interpretation led an IRB to require a researcher interviewing parents about their childrearing practices to obtain the assent of their four-year-old children - a meaningless and developmentally inappropriate request. If taken seriously, this interpretation would also lead to the demise of much epidemiological research. An on-line educational program, shared nationally, would enable IRBs and researchers to engage in more effective ethical problem solving and to authoritatively question an OHRP interpretation that did not fit their specific circumstances.

However, even the most capable, competent, and energetic IRB finds it difficult to educate every researcher and student within its domain. Individual researchers (especially students) may not know that their work falls within IRB purview, or may decide to avoid their IRB in the belief that it will impose requirements that would render the research impossible. Some privacy-related beliefs on the part of individual researchers (e.g., "You always have to get parental permission") become urban myths that get disseminated among uninformed researchers and students. A user-friendly, web-based education within institutions would help prevent or dispel such urban myths. It would also create a culture in which ethical problem solving becomes a more integral part of the research curriculum and in which irresponsible faculty members find it more difficult to transmit their scofflaw attitudes to their students.

Who needs to be reached through such an educational resource? The social sciences are usually considered to include psychology, sociology, anthropology, political science, and economics. However, for IRB purposes, the point is not which are and are not social sciences but rather who uses social science methods. One must look virtually everywhere within a university, clinic, prison, school, or hospital to find such researchers. An internet information and education approach, indexed and patterned somewhat after the Help feature of word-processing software, would make it easier for administrators to guide researchers and students to the information they need, with little ambiguity or inefficiency. That information could also be incorporated by individual instructors into the curriculum of courses on research methodology.

An adequate resource would introduce all major aspects of respecting privacy and assuring confidentiality and would explain how each is operationalized. It would provide information about all major social research methods and present some illustrative cases to show how methods interact with privacy and confidentiality considerations. Each method brings with it specific problems of privacy and confidentiality depending on other related aspects of the research. Each method may be used in a range of contexts, e.g., survey research can be conducted in person, or via mail, phone, or internet. The data or research site and subjects may be shared with other scientists subsequently; the data may be used in secondary analysis and meta-analysis. The subjects may be anyone - students, gang members, prostitutes, business persons, people in therapy, families, infants, professionals, recovering patients, runaway children, kindergartners, or representatives of whatever population the research is focused on. Each method, context, and subject population has many variations, and each brings with it a range of privacy/confidentiality issues to be resolved. Consequently, an awareness of alternative methods and their scientific and ethical implications is vital to effective ethical problem solving.

An important aspect of respect for privacy involves giving people reasonable choices about participation, which is sometimes precluded under traditional ways of designing experiments with random assignment. One alternative is to use a "wait list" design in which the control group also receives the desired treatment - later. People who want to participate in an experiment because they consider the treatment highly desirable can choose to participate with the understanding that they might get the treatment they desire right away or later.

For example, in a randomized trial on the effects of monetary scholarships paid to high-achieving students from impoverished families (Spencer et al., 1999), the monetary payments were delayed for one year for equally eligible students in the control group. As this example illustrates, developing a protocol that respects subjects' privacy and autonomy requires selecting the method(s), procedures, and context(s) that best meet both scientific and ethical criteria.

No IRB can be expected to embody all of the scientific competencies needed to help craft the best protocols for all research projects. However, an educational resource, including a list of consultants by research topic, can be developed that would make it relatively easy for IRBs and investigators to locate the information they need in order to develop effective protocols. This would remove the IRB from the hostile role of "ethics police" and place it in a more professional and constructive role.

The Difficulty of Defining Privacy and Confidentiality

With the rise of concern for personal privacy, many confusing definitions have emerged, adding to the difficulty of specifying what it means to respect privacy or to avoid invading subjects' privacy. The difficulty of defining invasion of one's own privacy is compellingly expressed by Melton (1992, p. 66):

"I know it when I feel it." A gut sense of personal violation may be the tie that binds such disparate events as being subjected to a body search, being the subject of gossip, having one's mail read, being asked one's income, or having one's house entered without permission. It should come as no surprise that such an intensely personal construct is difficult to define.

Even more difficult to fathom, define, understand, or respect is the privacy of other persons situated differently from ourselves with respect to age, ethnicity, locale, socioeconomic status, gender, or the context in which the issue of privacy arises. Privacy is an aspect of respect for persons that can be difficult to translate into respectful behavior in cultures and contexts in which one does not understand the relevant norms and beliefs.

Without a useful definition or theory of privacy to guide them, or at least guidelines to remind them of possible threats to privacy, researchers and IRBs must depend on their own culture-bound notions of what people consider as private. They are left to invoke their personal and idiosyncratic definitions, resulting in a capricious standard of protection. Examples of inappropriate notions of privacy are provided herein, and guidelines are proposed for discovering what is private to others.

There are many definitions and concepts of privacy and confidentiality, some of which use the two words interchangeably. For purposes of guiding social research, it is important to distinguish clearly between these two concepts in precise terms that a researcher can act upon intelligently in any research context (e.g., internet, workplace, schools, hospitals, families, neighborhoods) and in the diverse ethnic and social cultures where social research is performed.

Privacy

One useful, though simple definition of privacy, following Boruch and Cecil (1979), is as follows:

Privacy refers to persons and to their interest in controlling the access of others to themselves. (Confidentiality refers to data, as discussed subsequently.)

This definition of privacy recognizes that control and autonomy, rather than isolation, are at issue. As Beauchamp and Childress (1994) explain, rights of privacy are valid claims against unauthorized access; such claims have their basis in the right to authorize or refuse access. Accordingly, the above definition recognizes the vital role of informed consent (properly formulated and administered) in giving subjects control over whether they will allow the researcher access to themselves and to their attitudes, behavior, beliefs, and opinions. It alludes to the two directions of access: a) information that is given to one or rejected by one, e.g., pornography that a researcher wishes to show male subjects to study its effect on subsequent response to scenarios of violence towards women, and b) information one reveals to, or withholds from, others, e.g., a subject's willingness or unwillingness to disclose personal details about his or her own life.

Informed Consent and Privacy

There are many verbal and nonverbal dimensions of informed consent that impact a subject's decision of whether or how much to reveal to the researcher. The informed consent statement that fulfills the elements required by the Common Rule provides the objective factual information needed to control access satisfactorily. But there is a nonverbal dimension that is at least as important to individuals as they seek to manage their privacy.

Suppose, for example, that a disheveled researcher administers informed consent without establishing rapport and with body language that is "closed" and hostile. He or she would be regarded as having invaded the subject's privacy before even uttering a word. Such a researcher's manner would belie any "respectful" informed consent language. An unconcerned or ritualistic recitation of the informed consent conveys that it is merely a legal maneuver designed to protect the institution - that whatever happens to the hapless subject is of no personal concern to the researcher.

The nature, timing, and delivery of informed consent have important implications for subjects' sense of privacy. For example, a longitudinal or ethnographic study, or any research that involves repeated contact between researcher and subject, should treat informed consent as an ongoing process of communication in which the nature of the relationship is repeatedly clarified and discussed in an informal, friendly way. Research in which it is easy for the subject to end the relationship (e.g., a survey that is conducted by mail or phone) should not require a signed consent, as this is unnecessary, inconvenient, and regarded by some as an irrevocable commitment to participate. Moreover, it reduces the response rate and distorts random sampling by eliminating the least literate of potential subjects. Worst of all, it risks, unnecessarily, a breach of confidentiality and harm to subjects if a signature is somehow attached to sensitive data that could as well have been gathered anonymously.

Cultural and Developmental Determiners of Privacy

The boundaries between wanted and unwanted information proffered to us by others, as well as between information we will and will not share, are partially defined by sociocultural values. For example, current efforts to survey people's safe-sex practices are made difficult because the very mention of some specific sexual practices (quite apart from responding to the survey) is taboo or offensive in some subcultures in the United States, but acceptable in others. On the response side, providing information about one's sexual practices is acceptable to some respondents and highly offensive to others.

Some ethnic differences in the sense of personal privacy are counter-intuitive to most Americans. For example, most Americans consider eye contact a sign of respect and honesty, but ethnic Japanese (especially in Hawaii) may regard a researcher who establishes eye contact as disrespectful. Guessing at cultural nuances of privacy is a poor practice for social researchers.

How can researchers learn what is private in cultures and contexts that are foreign to their own personal experience? One solution is to employ research assistants from the same cultural community as the subjects. However, this sometimes risks a breach of confidentiality (disclosure of personal secrets) within a close-knit community. Correspondingly, subjects sometimes disclose more sensitive information to a stranger who has established appropriate rapport and promised confidentiality than to a member of their own group. The optimal solution to this problem rests on judicious selection and training of research assistants - training that is designed to produce the most culturally sensitive behavior possible without any actual or apparent risk to confidentiality within the community.

Everyone wants to have some choice about those with whom they will interact and what the conditions of that interaction will be. Early in life, people learn a variety of ways to avoid or evade unwanted interactions. The researcher who is not mindful of the privacy interests of subjects will be lied to, stood up, or complained about. In contrast, the researcher who understands or respects the privacy interests of subjects may find them overwhelmingly forthcoming, and may even find it difficult to end the session.

The researcher who understands the sociocultural aspects of privacy looks beyond the immediate scientific need to "get data." He or she does some ethnographic work before designing recruitment and research approaches. By learning about the norms, beliefs, and culture of the subjects, the researcher can then appropriately express respect, establish trust, and create rapport that makes interaction acceptable.

The recruitment of subjects and design of research should be culturally appropriate and should instill sufficient trust that subjects will want to participate candidly. Much of the local ethnography needed to conduct social research effectively cannot be found in textbooks. However, networks of local researchers, educators, and outreach workers can share valuable information about the most appropriate ways to approach members of various cultures. Individual IRBs would do well to add to their educational web site suggestions of useful contact persons (e.g., AIDS outreach workers, social workers, farm agents, public health nurses) and to even sponsor local conferences on culturally sensitive approaches to research and develop proceedings for future use in designing appropriate protocols.

To respect privacy is to let subjects control the access of others to themselves: to provide the conditions under which the researcher's inquiries are welcome, and to provide adequate opportunity for people to decline to participate. To breach privacy is to violate people's space, to intrude where one is not welcome or not trusted, or to seek access to people against their wishes - for example, organizational research on employees to which only management has consented, and deceptive research in which the researcher induces the subject to behave in a way the subject would not wish to be observed, are invasions of privacy. An educational resource for IRBs and researchers should discuss practices of recruitment, consent, timing of procedures, research methods, and debriefing in terms of whether they respect privacy or are likely to constitute breaches of privacy.

The definition of privacy developed so far works well with theories borrowed from the social sciences that describe the conditions under which access by others is welcome or aversive. Such theories describe the controls people use to limit the access of others to themselves and the conditions under which those controls are available and likely to be employed.

Theories of Privacy

Various theories of privacy are instructive to researchers and need to be incorporated into the education of researchers and IRB members. The following two are offered as examples.

Laufer and Wolfe (1977) describe how self-ego, environmental, interpersonal, and control-choice factors operate to regulate the experience of privacy. This elegant analytical framework indicates virtually every element one must consider to understand the privacy of another.

The self-ego dimension of privacy refers to the development of autonomy and personal dignity. For young children, being alone is aversive. By middle childhood, time alone is sought to establish a sense of self and autonomy and to nurture new ideas, creating a basis for self-esteem, personal strength, and dignity. Thus, children in middle childhood have a need and right to privacy not found in infants and younger children. Adults continue to need time alone and develop many means of protecting that privacy.

The environmental dimension includes cultural, sociophysical, and life-cycle elements. Cultural elements include norms for achieving privacy, e.g., one culture may permit lying while another may permit persons to have private rooms. Sociophysical elements refer to physical settings that offer privacy (e.g., indoor bathrooms, tree houses, automobiles). Life-cycle elements vary with age, occupation, available technology, and changing sociocultural patterns. The kinds of privacy one establishes at one age, under one set of responsibilities, constraints, and technological aids, may be unsatisfactory or unavailable at another stage of one's life.

The interpersonal dimension refers to how social interaction and information are managed. One's social setting and its physical characteristics provide options for managing social interaction; physical and social boundaries can be used to control people's access to one another.

The control/choice dimension develops out of one's dimensions of self-ego, culture, and environment. Young children have no control over their privacy, except through hiding. Later, they learn to use personal, cultural, and physical resources to control their privacy. Events that would threaten one's privacy early in the development of these mechanisms are later so easy to control that they are no longer considered a threat to privacy.

Thompson (1982) presents a developmental theory describing how the sense of privacy changes from early childhood through late adolescence, and how youngsters learn to control access by others. Thompson shows that popular ideas about vulnerability decreasing linearly with age are inaccurate: older children are more easily embarrassed, more concerned about personal and informational privacy, and more likely to feel upset if they reveal more than they intended. Moreover, younger children's sense of privacy is enhanced when a parent is present during a study, while older children are confident of their ability to manage access with the researcher but want the parent out of the room.

Theories of this nature would have an important role in any effort to foster greater skill in discovering and understanding the privacy interests and privacy management approaches of a given research population.

Places and Privacy

The concept of privacy is related to the notion of private places that can be used to control access. There are degrees of privacy in relation to places:

  • Public behavior in public places.
  • Private behavior in public places, e.g., internet chat rooms, airports, restaurants, etc., where people may exchange deeply private communication, oblivious of others. Persons in these settings are likely to consider a researcher as an eavesdropper who is invading their privacy, though the researcher might argue otherwise.
  • Private behavior in private places, e.g., bathrooms and bedrooms, where certain behaviors are surmised by others but observing or recording them without informed consent would be considered a serious breach of privacy.
  • Secret behavior (not surmised by others, and not to be observed). Studies of illegal behavior, secret rituals, and personal documents such as diaries pose special problems as they clearly breach privacy unless consent is obtained and/or special precautions are taken to assure anonymity. Practices of ubiquitous surveillance (e.g., continuous video monitoring or other uses of modern technology to monitor all events within a particular domain) pose special problems since people may be observed and videotaped while engaging in secret behavior. Even with warning and informed consent, persons may readily forget that they are being observed via a camera. Hence warning signs and very obvious recording equipment might be used.

In research, only the first category is clearly without ethical concerns. Observational research or participant observation in private places is especially problematic.

Summary

Individuals, organizations, and cultures all have their own ways of wishing to be perceived, and one's sense of privacy is intimately related to controlling the way one is perceived. However, researchers are often so focused on getting data that they forget even the norms of their own culture and are still more oblivious to the norms of other cultures. Moreover, researchers typically examine subjects from a critical perspective that may differ from the way their subjects wish to be perceived.

Researchers should be well informed and mindful of the ethnography of their subject population so that their approach is tactful and their critical analysis is objective. Yet, they will still function as outsiders, almost voyeurs, and should compensate for this by carefully masking the identity and location of those whose behavior they critically analyze in scientific publications. In summary, the concepts of privacy offered here do not presume that:

  • Privacy is an objectively defined phenomenon;
  • Privacy means being left alone;
  • One can readily guess what others will consider as private or how they will regulate their privacy; or that
  • Notions of privacy are universal across cultural or demographic subgroups.

Rather, these concepts lead the researcher to look to theory, methods, and ethnographic information to learn how to respect the interests of a given subject population in controlling the access of others to themselves.

Should IRBs vary the degree of protection of privacy depending on the ethnographic standards of the research population? Clearly, some subgroups have particular concerns about privacy that are not shared by others, and that must be respected for scientific as well as ethical reasons. That is, respect for privacy is not only ethically required, it also encourages subjects to give truthful and unguarded responses. But what about populations that have diminished expectation of privacy in their everyday life, such as celebrities, prisoners, or citizens in a totalitarian state where privacy protections are few and confidentiality is always in doubt? Respect for privacy means giving people the privacy they would like to have, not the invasions of privacy that are regularly imposed on them. For example, a psychologist studying prisoners or workers in a company where the employer disrespects privacy should respect subjects' privacy even if others do not. The IRB or researcher in doubt about the sense of privacy of a particular population can remove that doubt by asking surrogate subjects drawn from the target population. For more autonomous populations such as celebrities, the matter may be settled through pilot research and informed consent in which persons can participate or decline based on an accurate understanding of what will be asked of them.

Confidentiality

A useful, though simple, definition of confidentiality, adapted from and similar to that developed by Boruch and Cecil (1979), is as follows:

Confidentiality is an extension of the concept of privacy; it refers to data (some identifiable information about a person, such as notes or a videotape of the person) and to agreements about how data are to be handled in keeping with subjects' interest in controlling the access of others to information about themselves.

This definition provides a clear distinction from privacy, which is vital. For in addition to understanding what is private to given subjects and how that privacy may be respected, the researcher must be able to assure subjects that the access of others to information about themselves will be controlled in a way that is acceptable to them. Every detail of subsequent data management need not be communicated to subjects, though it should be worked out clearly in the protocol. Some matters such as required disclosure of child or elder abuse, plans for possibly sharing identified data with other scientists, or sharing archives of videotaped data should be part of the informed consent, if relevant.

This definition of confidentiality does not presume that:

  • Data can necessarily be protected from the scrutiny of others;
  • Confidentiality means simply a promise not to disclose; or
  • Researchers can safely assume that their intention not to disclose identifiable data means that it will not be disclosed somehow.

This definition helps researchers and IRBs to recognize that there are various risks of unintended disclosure, e.g., through snooping by hackers or research assistants, theft by blackmailers, legally mandated disclosure, or careless construction of data tables that permit some readers to deduce the identity of individual subjects. It also reminds one that confidentiality is whatever arrangement about disclosure the researcher and subject agree upon, within the constraints of law and ethics. Confidentiality is more than just a promise or an intention on the part of the researcher. It is an arrangement to use certain techniques that are available for controlling the disclosure of identifiable information. There may be limits to what can be promised or guaranteed, and these must be discovered and stated at the outset.
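
One risk named above, careless construction of data tables, can be screened for before release. The following is a minimal sketch in Python, not drawn from this paper or from any regulation: it flags any cell of a planned cross-tabulation that describes fewer than a chosen minimum number of respondents, since such small cells are the ones from which a knowledgeable reader might deduce an individual's identity. The threshold and the field names are illustrative assumptions.

    from collections import Counter

    K = 5  # illustrative minimum cell size; actual release thresholds vary

    # Hypothetical de-identified survey records: (county, occupation, sensitive answer)
    records = [
        ("Alder", "teacher", "yes"),
        ("Alder", "teacher", "no"),
        ("Alder", "nurse", "yes"),
        ("Birch", "teacher", "no"),
        ("Birch", "farmer", "yes"),
    ]

    # Count how many respondents fall into each cell of the planned table.
    cells = Counter((county, occupation) for county, occupation, _ in records)

    # A cell with very few respondents lets a reader who already knows, say,
    # the only nurse in Alder county deduce that person's answers.
    for cell, n in sorted(cells.items()):
        if n < K:
            print("Suppress or merge cell", cell, "- only", n, "respondent(s)")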

Kinds of Confidentiality-Assuring Techniques

This definition of confidentiality leads naturally to the immense and growing literature on procedural, methodological, statistical, and legal approaches to assuring the confidentiality of research data. These methods are developed and described in various applied research literatures (e.g., Boruch and Cecil, 1979; Campbell et al., 1972; Jaro, 1989) and discussed subsequently. This definition also leads to recognition of the immense advantages of rendering data anonymous, where feasible.

The intelligent use of confidentiality-assuring techniques depends upon understanding what threats to confidentiality may exist or be perceived by subjects, what legitimate uses of the data may be anticipated (including sharing with other scientists and agency audit of the data), and what costs and analytic disadvantages may accompany some of these techniques. There is considerable and ever-growing depth to this body of knowledge. When planning to gather any data, and especially sensitive data, it is important to a) make early plans concerning the confidentiality-assuring techniques that will be used, b) incorporate these appropriately into any consent agreements with subjects and contractual arrangements with subsequent users (including funders who wish to audit the data), and c) include these details in the IRB protocol.
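
As a concrete illustration of one procedural technique of this kind, direct identifiers can be stripped from the analysis file, with the only bridge back to identities kept in a separately secured link file. The following minimal sketch is illustrative only; the names, fields, and storage arrangements are assumptions rather than a prescription from the literature cited above.

    import secrets

    participants = [
        {"name": "R. Alvarez", "phone": "555-0101", "response": 3},
        {"name": "K. Osei", "phone": "555-0102", "response": 5},
    ]

    link_file = {}      # code -> identifiers; store separately, under restricted access
    analysis_file = []  # de-identified records released to analysts

    for person in participants:
        code = secrets.token_hex(4)  # random, non-meaningful subject code
        link_file[code] = {"name": person["name"], "phone": person["phone"]}
        analysis_file.append({"id": code, "response": person["response"]})

    # The analysis file alone identifies no one; re-contact for follow-up waves
    # requires authorized access to the separately secured link file.
    print(analysis_file)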

Emerging Issues of Privacy and Confidentiality

Within the last two decades, data collection and storage practices have been altered radically. New digital media support a wide range of social relationships such that social scientists, their colleagues, and their subjects need not meet face-to-face and may even reside in different parts of the world. The issues of confidentiality that are emerging are more varied and dangerous than current policy makers can easily anticipate. However, there are also hopeful solutions on the horizon. Within another decade or two, issues of confidentiality will be transformed in ways we cannot imagine today. There are now digital communication networks on a global scale and the possibility that hackers with a laptop computer and internet technology could download any electronic data stored on any server anywhere in the world. There are also emerging technologies for protecting communication and personal identity, and there is a whole new cohort of technology-sophisticated privacy activists. New laws that protect data are being developed and tested, and globalization of culture and policy processes is occurring.
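
As one small illustration of the protective technologies alluded to above, identifiable records can be encrypted before being stored on a networked machine, so that a stolen file is unreadable without the key. The sketch below is an assumption-laden illustration rather than a recommendation of this paper; it uses the third-party Python package cryptography, and the record and file name are hypothetical.

    from cryptography.fernet import Fernet  # pip install cryptography

    key = Fernet.generate_key()  # keep the key offline, separate from the data
    cipher = Fernet(key)

    record = b"subject 0417; reports needle sharing; contact 555-0199"  # hypothetical
    token = cipher.encrypt(record)  # the ciphertext is safe to store on a server

    with open("responses.enc", "wb") as f:  # hypothetical storage location
        f.write(token)

    # Only someone holding the key can recover the plaintext.
    assert cipher.decrypt(token) == record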

These major shifts in technology have already begun to produce an immense literature on the accompanying threats and safeguards to privacy and confidentiality (e.g., Agre and Rotenberg, 1998). Much of this literature is esoteric. It needs to be tracked by specialists who are concerned with the confidentiality of social research data and translated into a body of knowledge that is useful to researchers, research administrators, and IRBs.

One relatively old, low-technology form of data collection with which most researchers are familiar illustrates a few of the kinds of problems that will arise with the use of high-technology data gathering. When data are gathered on videotape, the distinction between persons and their data is blurred. If one observes another on a video monitor, in real time, is this direct observation or data? If the material that is seen in real time on the monitor is taped, does the same image then become data? Can anonymity be complete if there is any visually recognizable aspect of the person's identity? Risk of deductive disclosure (the possibility that someone who already knows the individual will see the tape and correctly deduce that he or she is the subject depicted) takes on new meaning with videotaped data. When data are in videotaped form, special attention must be given to consent and avoidance of incriminating or unnecessarily sensitive material. The researcher should disclose the project's plans for storage, sharing, and disposal of the tapes. Subjects should have the option of requesting tape erasure if they decide that they have just said or done something they do not want retained on tape.

Summary

The proposed definitions emphasize that privacy and confidentiality are quintessentially behavioral and social phenomena and appropriate topics of scholarship and investigation. Curiously, however, the federal regulations of human research and requirements of IRB review have tended to be treated by researchers as leading to last-minute paper work rather than to the creative use of current privacy and confidentiality-assuring techniques, or the development of new theory or methodology that would make for more ethical human research. Thus, a main principle developed in the Belmont Report, that research should be performed by competent scientists who apply ethical principles intelligently based on sound scientific knowledge and research skills, seems to be lost on many scientists and IRBs. Part of this problem may be due to the failure to frame the concepts of privacy and confidentiality in ways that would invite innovation and investigation.

Regulations and Statutes

A wide range of federal and state laws, as well as the Common Rule and other federal regulations, concern privacy and confidentiality in social research.

The Common Rule (Subpart A of 45 CFR 46)

The federal policy for the protection of human subjects, which formerly pertained only to research funded by the Department of Health and Human Services (DHHS) (45 CFR 46 Subpart A), has now become the Common Rule and has been incorporated into the regulatory structure of 17 federal agencies. Eight of these agencies have additional human subject protections beyond the Common Rule, most of which do not relate directly to privacy and confidentiality.

These agencies, their regulations that contain the Common Rule, and their additional regulations related to privacy and confidentiality are as follows: Housing and Urban Development (24 CFR 60), Justice (28 CFR 46, with additional protections in 28 CFR 512), Transportation (49 CFR 11), Veterans Affairs (38 CFR 16, with additional protections in 38 CFR 17.85, M-3, Part 1, Chapters 9 and 15), Consumer Product Safety (16 CFR 1028), Environmental Protection (40 CFR 26), International Development (22 CFR 225), NASA (14 CFR 1230), NSF (45 CFR 690), Agriculture (7 CFR 1c), Commerce (15 CFR 27), Defense (32 CFR 219, plus 12 additional regulatory protections), Education (34 CFR 97, with extensive additional protections of privacy and confidentiality as noted below), Energy (10 CFR 745), Health and Human Services (45 CFR 46 Subpart A), Social Security (P.L. 103-296), and CIA (Executive Order 12333); the last three agencies also employ Subparts B, C, and D of 45 CFR 46.

The Common Rule specifically requires that informed consent include a statement about how confidentiality will be maintained, but it leaves to the IRB and the researcher the subtle matter of understanding what confidentiality is and how it relates to privacy. Moreover, the Common Rule does not define privacy, but defines private information as follows:

Private information includes information about behavior that occurs in a context in which an individual can reasonably expect that no observation or recording is taking place, and information which has been provided for specific purposes by an individual and which the individual can reasonably expect will not be made public (for example, a medical record) (45 CFR 46.102(f)(2)).

This definition of private information confuses it with confidentiality and fails to convey the notions of personal privacy that are important to ethical research. It also implies that everyone has the same concerns about others' access to themselves and to identifiable data about themselves.

Based upon this confusing set of definitions, 45 CFR 46.111 (Subpart A), the criteria for IRB approval of research, states:

(7) When appropriate, there are adequate provisions to protect the privacy of subjects and to maintain the confidentiality of data.

Unfortunately, this requirement assumes a level of sophisticated knowledge concerning privacy and confidentiality that many IRBs and researchers do not possess.

Subparts B, C, and D of 45 CFR 46

Subpart B contains protections in research, development, and related activities involving fetuses, pregnant women, and human in vitro fertilization. Its relationship to privacy arises in connection with 46.209(d) which concerns consent of the mother and conditions under which the father's consent is not required. Being biomedical in focus, it is not within the purview of this paper.

Subpart C pertains to research on prisoners that is conducted or supported by DHHS. It defines "prisoners" quite broadly to refer to any individual involuntarily confined or detained in a penal institution. This encompasses those sentenced under a criminal or civil statute, those detained in other facilities by virtue of statutes or commitment procedures that provide alternatives to criminal prosecution or incarceration in a penal institution, and any individual detained pending arraignment, trial, or sentencing. (In contrast, the Department of Justice regulations provide a more comprehensive set of protections, as outlined below, but pertain only to research conducted within the Bureau of Prisons.) Subpart C of the DHHS regulations of human research is responsive to privacy/confidentiality issues in that it recognizes the limits to prisoners' voluntary and uncoerced decision whether to participate in research. It limits the amount of risk to which prisoners may be exposed. It requires an IRB containing at least one prisoner or prisoner representative and a majority of members who have no association with the prison apart from their membership on the IRB.

A shortcoming of Subpart C is its failure to address the special problems concerning research on incarcerated juveniles, who deserve special attention due to issues of distrust, incomprehension, and their often unusual (strained or nonexistent) relationships with their parents. Neither Subpart C nor the Justice Department regulations described below (28 CFR 512.11) considers that issue. Moreover, Subpart D, which deals with research on children, does not offer special guidance regarding research on incarcerated juveniles.

Subpart D discusses additional DHHS protections for children involved as subjects in research. It is sensitive to children's personal privacy interests in requiring the child's active assent when assent would be meaningful, as well as parental permission, with either party having veto power. Unlike research on adults, research on children requires IRB approval when it involves surveys, interviews, or observation of public behavior in which the investigator participates in the activities being observed. Subpart D sets forth limits to the amount of risk to which children may be exposed in research, and places strict requirements on consent and benefit when more than minimal risk is involved, except where there is no other way to benefit the subject or to study a serious problem affecting children in general. It recognizes that there are contexts or circumstances in which parental permission is not a reasonable requirement to protect the subjects (e.g., in the case of neglected or abused children), and that the IRB needs to designate other ways of protecting the interests of such children. Procedures for selecting guardians or advocates who function in loco parentis are discussed.

Additional Protections: Department of Education

The Department of Education provides additional protections regarding respect for privacy and assurance of confidentiality under 34 CFR 99. Part 99 concerns "Family Educational Rights and Privacy" and sets out requirements for the protection of privacy of parents and students under section 444 of the General Education Provisions Act, known as the Family Educational Rights and Privacy Act of 1974 (FERPA). FERPA and other statutes governing research in educational settings are discussed below under "Federal Statutes."

Additional Protections: Department of Justice

The Department of Justice has developed an excellent set of additional protections of privacy and confidentiality pertaining to prisoners (28 CFR 512). These are most noteworthy in their emphasis on a) creating an impermeable firewall between research data and prison administration, b) requiring an IRB membership of at least one prisoner and a majority who are not prison personnel, and c) giving prisoners a high degree of control over their identifiable data.

Protections specifically pertaining to privacy and confidentiality are many: The researcher must not provide research information which identifies a subject to any person without that subject's prior written consent to release of the information. Identifiable data cannot be admitted as evidence or used for any purpose in any action without the written consent of the person to whom the data pertains. Except for computerized data kept at a Department of Justice site, records containing identifiable data may not be stored in, or introduced into, an electronic retrieval system.

Access to Bureau of Prison records by researchers is limited by law. No one conducting research may have access to records relating to the subject which are necessary to the purpose of the research without the subject's consent. A nonemployee of the Bureau of Prisons is limited in access to information available under the Freedom of Information Act (5 USC 552). A nonemployee of the Bureau may receive records in a form not individually identifiable when advance adequate written assurance is given that the record will be used solely as a statistical research or reporting record.

Federal Statutes

The rise of concern about the amount of governmental record keeping and the potential for abuse through inappropriate and unauthorized use of such records led to enactment of the Privacy Act of 1974. This omnibus law governs record keeping and record use in all federal agencies. (See the paper by Goldman and Choy in this volume for discussion of the Privacy Act of 1974.) The Privacy Act authorized the examination of privacy issues in diverse environments via the Privacy Protection Study Commission, whose findings have continued to inform efforts to balance the interests of individuals, record keeping institutions, and society as a whole. One outcome has been increased attention to the individual's role in controlling information about himself or herself. In keeping with the overall philosophy of the Privacy Act of 1974, and as an alternative to the omnibus approach to regulation of record keeping taken by that Act, various laws have been passed that are tailored specifically to privacy in educational settings and to research in educational settings. The Family Educational Rights and Privacy Act of 1974 (FERPA) is designed to protect educational records from disclosure without the consent of parents or of students over 18. The Protection of Pupil Rights Amendment (PPRA) gives parents the right to inspect innovative curriculum materials and requires parental consent for students to participate in sensitive research. An interesting aspect of FERPA and PPRA is that, like the federal regulations of human research, they allow participating institutions to develop and implement local substantive and procedural requirements that meet minimum requirements established by law. We turn now to FERPA and PPRA, and then to the National Center for Educational Statistics (NCES) Confidentiality Statute, which protects the confidentiality of identifiable data collected by the NCES. Finally, we discuss legislation that protects identifiable data from subpoena.

The Family Educational Rights and Privacy Act of 1974 (FERPA)

FERPA (also commonly known as the Buckley Amendment) states that "An educational agency or institution shall obtain the written consent of the parent of a student or of the eligible student (if 18 or older) before disclosing personally identifiable information from educational records of a student, other than the directory information." Directory information includes information contained in an educational record of a student which would not generally be considered harmful or an invasion of privacy if disclosed. It includes, but is not limited to, the student's name, address, telephone listing, date and place of birth, major field of study, participation in officially recognized activities and sports, weight and height of members of athletic teams, dates of attendance, degrees and awards received, and the most recent previous educational agency or institution attended.

Educational records refers to records that are directly related to a student and that are maintained by an educational agency or institution or by a party acting for the agency or institution. Educational records does not refer to records of instructional, supervisory, administrative, and educational personnel that are kept in the sole possession of the maker of the record and are not generally accessible to others. It also does not include information gathered after the student has ceased attending that institution.

FERPA applies to educational agencies or institutions that receive federal funds under any program administered by the Department of Education. This includes all public elementary and secondary schools and virtually all post-secondary institutions.

In contrast to the Common Rule, FERPA provides a post-violation remedy rather than prior approval of a research protocol. However, IRBs typically impose prior approval based on FERPA requirements. Some degree of IRB discretion exists at the fringes of FERPA's definitions, but generally the Buckley Amendment prevents a researcher from inspecting educational records without permission of the parent or of the student (after age 18). The Department of Education reports that while it receives quite a few complaints from parents, most contain no specific allegations of fact that would indicate that a violation had actually occurred. Consequently, most of the effect of FERPA on research arises from restrictions that school administrators and IRBs place on researchers.

The Protection of Pupil Rights Amendment (PPRA)

PPRA is also commonly known as the Hatch Amendment. It affects research in various ways by giving parents certain rights regarding Department of Education-funded activities. When first introduced in 1974, it gave parents the right to inspect instructional material used in connection with any research designed to explore or develop new or unproven teaching methods or techniques. A later addition in 1978 (the Hatch Amendment) requires parental permission for certain types of surveys administered to minor students that seek information about the student's private life, specifically children's attitudes, beliefs, or habits in the following seven areas: 1) political affiliation; 2) mental and psychological problems potentially embarrassing to the student and his/her family; 3) sexual behavior and attitudes; 4) illegal, antisocial, self-incriminating, and demeaning behavior; 5) critical appraisals of other individuals with whom respondents have close family relationships; 6) legally recognized privileged or analogous relationships such as those of lawyers, physicians, and ministers; and 7) income (other than required by law to establish eligibility for a program).

PPRA was further amended in 1994 (Grassley Amendment) to remove the terms "psychiatric and psychological examination, testing and treatment," and to clarify PPRA to refer to any survey, analysis, or evaluation that elicits information in the above seven areas.

Like FERPA, PPRA provides post-violation remedy rather than prior approval of a research protocol. However, IRBs typically impose prior review following PPRA requirements, irrespective of whether Department of Education funding is sought.

Issues Concerning Regulation of Research on School Children

School children are a convenient captive subject pool, especially in the case of schools located near universities. Laws providing special protections for school children have been enacted largely because overzealous, insensitive, or culturally inappropriate researchers have offended parents. These laws make it easy for school administrators and IRBs to rein in intrusive research on children. Thus, they are highly effective, even though the Department of Education rarely acts on them.

Signed Parental Permission

Unfortunately, however, there is a problem with FERPA and PPRA that begs for resolution. Among the children who most need to be served through research and development programs are those whose parents are unlikely to be able to provide their written permission. Parents who are illiterate, do not speak English, are irresponsible or largely absent from the home, or are unfamiliar with American culture are unlikely to respond to a request for written parental permission. The requirement of a parent's signature invites forgery of parents' signatures by youngsters who want to participate with their friends, and who may be accustomed to signing for illiterate or non-English speaking parents in any case. It also invites poor solutions, such as having a youngster translate the consent statement, explaining such matters as confidentiality to a parent who does not understand the concept. Most damaging of all, it results in invalid sampling of the populations most in need of the kinds of understanding and support that research can help to provide.

Parental permission for school-based research often is handled poorly unless the IRB helps by providing useful, well-tested models of good communication - model letters, town meetings, use of adult community members trained to communicate effectively with their peers, tape-recorded messages, and so on. Once the IRB has established such mechanisms, their adaptation to new situations may be relatively easy. However, signed parental permission may remain elusive in some populations even though the parents would not object to their child's participation. It may be appropriate for legislators to consider whether there should be exceptional circumstances (e.g., when parents are functionally illiterate) in which the usual requirements for written parental consent can be modified so that the informed consent communication process can be conducted in a way that is more conducive to comprehension, such as by a community leader or gatekeeper.

What Research Is Covered by FERPA and PPRA?

While the regulations and statutes reviewed above may pertain only to research that is intended to provide generalizable knowledge, intended to be published, or funded by a given agency, in fact most IRBs act on their prerogative to impose more stringent requirements. For example, most IRBs review master's thesis research that is not funded, does not produce generalizable findings (perhaps merely assessing what clients in some kind of treatment program think of the program), and may not be done with publication as a goal. Arguably, some of these projects need not be reviewed at all. However, in most contexts such a student research project that involved school children would be subjected to the same stringent requirements (e.g., FERPA and PPRA) as would a project funded by the Department of Education. The willingness of IRBs to apply FERPA and PPRA requirements, even in cases when not federally mandated, mirrors societal and school concerns for family privacy. It expresses respect for local schools and the standards they would also be likely to apply and teaches researchers to do likewise. Moreover, as a practical matter, it is awkward for an IRB to apply different standards to similar projects that happen to differ only in their source of funding.

Most IRBs act to protect human research subjects, not simply to follow the letter of the law. Some IRBs describe their more stringent requirements in their single or multiple project assurances. Other IRBs, at their lawyers' insistence, describe their requirements as the minimum requirements set forth under the law but, as an internal policy, impose a broader and more stringent set of requirements. However, with ever-increasing workloads, IRBs are increasingly tempted to discontinue such extra responsibilities as reviewing proposals that do not even qualify as "research" and imposing strict standards such as FERPA and PPRA when doing so may make the research much more difficult to carry out and evoke complaints from researchers. Thus, in their laudable attempts to educate researchers and to respect family privacy in research on school children, IRBs are placing themselves in something of a no-win situation by requiring, with some research populations, a standard that may be very difficult to meet and that is not federally mandated. (These observations are based on many years of general observation, as well as on a recent discussion on McWirb.)

The National Center for Educational Statistics (NCES) Confidentiality Statute

The NCES Confidentiality Statute was enacted in 1988 to protect research subjects in several ways, as follows:

  • Individually identifiable data collected by NCES cannot be used for any purpose other than the statistical purpose for which they were collected.
  • Individually identifiable data are immune from the legal process.
  • Without the consent of the individual concerned, the individually identifiable data cannot be admitted as evidence or used for any purpose in any action, suit, or other judicial or administrative proceeding.
  • NCES must strip data of personal identifiers before it releases public use files to researchers for research purposes such as secondary analysis; or, if the data would not be useful to the researchers if stripped of identifiers, NCES must require the researcher to enter into a restricted use data licensing agreement with NCES. That licensing agreement includes safeguards to protect the data, including penalties of up to five-year jail terms and $250,000 fines.
  • These confidentiality requirements apply from time of initial collection until the time the data are destroyed.
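
To make the identifier-stripping requirement concrete, the following is a minimal sketch, not NCES's actual procedure: it assumes a pandas DataFrame with hypothetical column names and simply drops the direct identifiers before a public-use release.

    import pandas as pd

    # Hypothetical student-level microdata (illustration only).
    records = pd.DataFrame({
        "name": ["A. Jones", "B. Smith"],
        "ssn": ["123-45-6789", "987-65-4321"],
        "school_id": [17, 42],
        "test_score": [88, 73],
    })

    DIRECT_IDENTIFIERS = ["name", "ssn"]  # hypothetical identifier columns

    def make_public_use_file(df: pd.DataFrame) -> pd.DataFrame:
        """Drop direct identifiers before a public-use release."""
        return df.drop(columns=DIRECT_IDENTIFIERS)

    public_use = make_public_use_file(records)
    print(public_use)  # school_id and test_score remain; names and SSNs do not

When stripping identifiers in this way would make the data useless to the analyst, the statute's alternative applies instead: a restricted-use licensing agreement, which is a contractual rather than a technical safeguard.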

Legal Protections of Data

Statutory protection of research data enables researchers to protect the confidentiality of research records on identifiable individuals against subpoena. Subpoena of social research data is rare. However, if vulnerable data could not be protected from subpoena, there would be a chilling effect, especially on criminological and delinquency research.

Certificates of Confidentiality

The Public Health Service Act (PHSA) was amended in 1970 to authorize researchers to withhold information concerning the identity of people who are subjects of research on the use and effects of drugs. This authority is exercised through the issuance of certificates of confidentiality by the Secretary of DHHS. A 1988 amendment broadened its scope to include mental health, biomedical, clinical, behavioral, and social research. Under this amendment, the Secretary of DHHS may authorize persons engaged in biomedical, behavioral, clinical, or other research (including research on mental health, including research on the use and effect of alcohol and other psychoactive drugs) to protect the privacy of individuals who are the subject of such research by withholding from all persons not connected with the conduct of such research the names or other identifying characteristics of such individuals. Persons so authorized to protect the privacy of such individuals may not be compelled in any federal, state, or local civil, criminal, administrative, legislative, or other proceedings to identify such individuals (42 USC 242a(b)(1989)).

Various institutes within DHHS are authorized to issue certificates. Since 1993, certificates have become obtainable for research that is not federally funded. DHHS regards a certificate's protection as superseding state law; this position has been challenged and upheld in the New York Court of Appeals (People v. Newman 32 N.Y.2d 379, 298 N.E.2d 651, 345 N.Y.S.2d 502, 1973) (Boikess, 2000).

A certificate does not protect identifiable data of "nonsubjects," that is, other people about whom the subject provides information to the researcher - a point which researchers may fail to clarify in the informed consent. A certificate only protects against compelled disclosure of subjects' names or other identifiers, coupled with their data. It does not protect a subject who voluntarily consents to disclosure of his or her research record, nor does it preclude a researcher from reporting the identity of subjects who disclose intentions to harm themselves or others. Moreover, the language of the PHSA is rather imprecise, which gives rise to uncertainty. It offers protection to "names and other identifying characteristics," but the data of a known subject may not necessarily be protected. Melton (1992, p. 81) provides an example of this possible loophole:

[I]n one of my own studies, all of the children in a particular county who are involved in criminal child abuse prosecutions are invited to participate. Knowing that fact, a defense attorney might seek the data of a particular child (not the names of participants) as a fishing expedition for information intended to impeach the child's testimony. A literal interpretation of the statute would suggest that the subpoena might be enforceable if the data could be shown in some way to be relevant to the proceeding. Although it is also possible - perhaps even probable - that a court would interpret the statute more broadly in keeping with congressional intent, the uncertainty prevents unequivocal offers of confidentiality to participants and, therefore, should be eliminated by a technical amendment.

It is also unclear whether child abuse reporting laws are abrogated by certificates of confidentiality. Is such reporting a "legal proceeding" that cannot be mandated under a certificate of confidentiality? Finally, the certificate of confidentiality must be requested in advance of each research undertaking. Some researchers are unfamiliar with this protection or lack the diligence to make the request. Moreover, subpoenas typically occur for reasons unrelated to the study itself and therefore are not reasonably foreseeable by either the subjects or the investigator. In short, the protections offered by certificates of confidentiality may be unavailable when needed. It is perhaps unrealistic to expect that legislation will be enacted in the near future that would make all research data privileged information. However, it is reasonable to urge that the language of the PHSA be clarified to indicate exactly what is and is not covered. It is also appropriate that IRBs be provided guidance as to when certificates of confidentiality should be considered. Given that the Secretary of DHHS may issue certificates for "biomedical, behavioral, clinical or other research," there apparently are no reasonable grounds for refusing a certificate prior to the time that data collection is initiated, unless perhaps it were obvious that some kind of nonresearch work was being recast as research to obtain a privilege against subpoena.

Placing Data in a Foreign Country and Laws Governing Foreign Discovery2

Many researchers assume that sending confidential data to a foreign country (e.g., to a colleague in Canada) will protect the data from subpoena. However, the relevant laws are complex, offering only a deterrent to subpoena, not a guarantee of protection. The Federal Rules of Civil Procedure govern the procedures for discovery, including foreign discovery, in federal cases. Rule 26(b) states that parties may obtain discovery of anything that is relevant, not privileged, and admissible or "reasonably calculated to lead to the discovery of admissible evidence." Rule 34 states:

(a) Scope. Any party may serve on any other party a request (1) to produce and permit the party making the request (2) to inspect and copy, any designated documents, or (3) to inspect and copy, test or sample any tangible things which constitute or contain matters within the scope of rule 26(b) and which are in the possession, custody, or control of the party upon whom the request is served...

(c) Persons Not Parties. A person not a party to the action may be compelled to produce documents and things or to submit to an inspection.

The courts cannot compel a party to produce data if the party does not have "possession, custody or control" of the documents, but it is unclear what constitutes "control" in the kinds of situations discussed here. If a researcher sends data out of the country for the express purpose of preventing subpoena, would this qualify as loss of control in the eyes of a court? Jason Gilbert (2000), a legal intern at the Federal Judicial Center, offers the following analysis of this question:

While the courts seem to have settled on defining control as when a party has a legal right to obtain something, questions remain for the researcher seeking to give up control of research data to a foreign colleague in an attempt to protect it from being disclosed. Legal rights to possession can come from a variety of sources, particularly when one is considering intellectual property such as research data. If a researcher were to create a set of data, when exactly would he or she no longer have a legal right to that set of data? What if the researcher gave one part of the data to a colleague? What if the researcher only gave up a small "key" to the data that allowed the individuals who participated in the study to be identified? What if the researcher gave part, or even all, of the data to a colleague but still continued to collaborate with that colleague to perform analysis on the data even though it was not in the researcher's possession? Would that researcher still have a legal right to get back what he or she had surrendered? While the concept of giving away the legal right of possession is relatively straightforward, the mechanics of how exactly a researcher can give away the legal right to possess his own data (particularly if one does not allow for a sale or some type of contract) remains unclear.

Gilbert also reminds us of some other implications of "loss of control" of data:

  1. Transfer of data out of the country would mean loss of all electronic or hard copies in the researcher's possession.
  2. Transfer of data must never be done after a subpoena has been received. Even if it is done as a safeguard beforehand, the researcher may still be found to have acted not in good faith and be cited for contempt of court.
  3. If the research is done under a contract requiring that the researcher maintain control of the data, relinquishing control to a foreign colleague would constitute a breach of that contract.
  4. The researcher's professional code of ethics or the requirements of a future journal editor may require that the researcher maintain control of the data.

If the researcher can be said to have "lost control" of data by sending it to a colleague in a foreign country, the researcher then puts that colleague at risk of having to respond to a request for the data and having to seek legal means of protecting confidentiality. However, the rules and procedures of foreign discovery are so complex as to deter discovery. If the person who controls the subpoenaed information resides in a foreign country and is not a national or resident of the United States, the party seeking production must follow certain procedures for foreign production. The United States has ratified various treaties concerning the obtaining of evidence from foreign countries, each having its own procedures. Discovery in a foreign country is a lengthy process. It involves the sending of a formal "letter of request" by the court where the action is pending to a court in the foreign country, requesting that court to take a deposition or request documents of the person in possession of the desired information. There are various diplomatic and legal approaches to delivering such a request and accomplishing the discovery. These are time consuming, difficult, and costly and may make discovery of the information too unattractive to pursue.

A further protection may come from the courts themselves. Over the years judges have shown sensitivity to researchers' need to protect sensitive data from disclosure. Their concern has been not so much to protect individual subjects as it has been to protect researchers who promised confidentiality and to prevent the chilling effects that excessive subpoena power would have on research in general. However, the decision to quash a subpoena that would require disclosure of confidential research data is left to the discretion of a judge who must balance conflicting interests.

Freedom of Information Act (FOIA)

The FOIA concerns the responsibility of government agencies to maintain, organize, and disclose agency records on request by members of the public. There has been much concern about whether this pertains to identifiable research data (see, for example, Morris, Sales and Berman, 1981). Fraud in government-sponsored research has stimulated interest in full disclosure of research data. The Supreme Court, in Forsham v. Harris (445 U.S. 169) (1980), held that data developed by independent researchers funded by a government agency need not be disclosed. However, Congress recently passed the "Shelby Amendment" (Public Law No. 105-277 (1999)) requiring federal agencies to make available via FOIA request at least some raw research data. The Shelby Amendment pertains to researchers in institutions of higher education, hospitals, and other nonprofit organizations. Research data are defined as "the recorded factual material commonly accepted in the scientific community as necessary to validate research findings, but not preliminary analyses, drafts of scientific papers, plans for future research, peer reviews, or communications with colleagues" (Office of Management and Budget, 1999). Moreover, research data do not include materials necessary to be held confidential by a researcher until they are published, similar information which is protected under the law, or information that could be used to identify a particular research subject.

As implemented, the Shelby Amendment appears to protect identifiable research data, but some precautions should be kept in mind. The pressure on the federal government to ensure the integrity of critical research that it sponsors is likely to remain high. Researchers may one day be required to release raw, identifiable data to the sponsoring agency, and those data may be vulnerable to FOIA disclosure. Moreover, the Shelby Amendment pertains only to research grants to nonprofit organizations; the legal standing of research performed by for-profit organizations remains unclear. In the case of research contracts that require access by the sponsoring agency, it is sometimes possible for researchers to guard against such breaches of confidentiality by requesting that audits be performed at the research site and not transferred to a federal agency where they might be obtained by some other party under the FOIA. It seems unlikely that most IRBs are sensitive to these risks of disclosure through FOIA or prepared to advise researchers about possible future FOIA threats to confidentiality.

State Laws

The Common Rule does not diminish any protections offered by state laws, nor may state law diminish any protections of the federal regulations. This interplay of state and federal requirements for human research is set forth elegantly in the commissioned paper "Oversight of Human Subject Research: The Role of the States" (Schwartz, 2000). While the Schwartz paper focuses primarily on medically related research, the sections on the consent of minors to research participation, the certificate of confidentiality procedure, and the conclusion are especially pertinent to this paper as well. Of particular importance are Schwartz' points about the state-to-state variability of laws and the unpredictability of court decisions, the desirability of creating a DHHS clearinghouse on state regulation of research, and the desirability of seeking greater uniformity of laws governing human research. The fact that many multisite research projects cut across state boundaries increases the importance of these issues. In any event, if NBAC recommends the development of a web-based information and education program for IRBs and researchers, such a project would be enhanced if Schwartz' recommendations were also acted upon.

Mandatory Reporting Laws

State laws relevant to privacy and confidentiality in social research include those that mandate reporting and hence require that the informed consent state the limitation to confidentiality that the mandate implies. All states mandate reporting of evidence, obtained from identifiable subjects, of child abuse or neglect, and many require reporting of elder abuse or neglect. These state laws are a response to the federal Child Abuse Prevention and Treatment Act of 1974, which required that child protective services be established and that states enact mandatory reporting laws. By 1978, state laws were in place, along with federally reimbursable child protective services. The history of the literature on child protection and the evolution of these laws may be found in Levine and Levine (1993).

Who must report and to whom? In most states, it is only helping professionals (e.g., teachers, therapists, physicians, nurses, social workers), not necessarily the typical researcher, who must report abuse, neglect, or intention to harm. Researchers might claim that they are not helping professionals, and not bound by the mandate. However, that might not be a winning argument in court. Most researchers are also teachers (university professors) and may be perceived by troubled subjects as an understanding professional to whom one might reach out for help. Moreover, IRBs would not permit researchers to ignore reportable evidence, and hence would require that they include in their informed consent a statement such as the following:

What is discussed during our session will be kept confidential with two exceptions: I am compelled by law to inform an appropriate other person if I hear and believe that you are in danger of hurting yourself or someone else, or if there is reasonable suspicion that a child, elder or dependent adult has been abused. [This statement was adapted from a statement developed by David H. Ruja and is discussed in Gil (1982).]

The same sort of warning must appear in the parental permission for research on one's child. Such a warning is certain to muddle the sampling efforts and reduce candor in research on family processes.

The exact wording of reporting laws varies from state to state, though each state basically requires helping professionals to report any cases in which there is reason to believe or suspect child abuse - past or present.3 The variation in wording does not affect helping professionals, researchers, or IRBs nearly as much as does the vagueness of every state's laws. It is not clear whether "reason to believe" refers to a clinical hunch or to firm evidence, nor do these laws define what constitutes abuse. This leaves researchers to consider cultural differences, e.g., in the harshness of childrearing practices, and to weigh these against the possibility that the legal bureaucracy may be more harmful to the child or elder than are their seemingly abusive relatives.

The difficulties of defining abuse are many: estimates of the amount of child abuse run from 1 percent to 30 percent of the U.S. child population, depending on one's definition (Weis, 1989). How is the act perceived by the child - as done to teach an important lesson (Corbin, 1987), to cure a disease (Gray and Cosgrove, 1985), or out of malice? Thus, added to the costs of breaching confidentiality is the possibility that both the "victim" and the "perpetrators" will be wronged.

Reporting laws also vary from state to state with respect to how the professional learns about the suspected abuse. Some state statutes are limited to the child seen by the professional, while in other states a report is required even if the professional learns of it through a third party. Most statutes require the reporting professional to testify in court proceedings, include a criminal penalty for failure to report, and permit civil action against a professional whose failure to report was followed by injury to the child. However, all statutes provide immunity from suit when a report made in good faith turns out to be unfounded (Levine, 1992).

What is ethically responsible research behavior with respect to reporting? Should the researcher stop and warn the subject who starts to mention abuse? Should the researcher listen to what is reported and then follow the law, or ignore the law? Should the researcher actively seek and report evidence of abuse and neglect? How much discretion should the researcher use in deciding what should trigger reporting, in relation to the likely outcomes of reporting for the researcher, the project, the institutions involved, the child, the parents, and so on? How should the likelihood of inept handling by Child Protective Services influence this decision?

An inexpensive, simple, safe, and legally acceptable way to study child abuse is through retrospective study of reported cases. This allows the researcher to trace abuse backward in time from its discovery to its apparent antecedents without accompanying reporting requirements. However, comparison of retrospective and prospective research approaches to other problems of human behavior shows that this is unlikely to lead to valid and useful findings. By proceeding in the opposite direction (selecting a random sample of children and collecting repeated-measure multivariate data with appropriate controls), one is likely to find a different set of conditions associated with emerging cases of abuse or neglect (Weis, 1989; Sieber, 1994). Given the popular media interest in the topic of child abuse, it is doubly crucial that scientists report valid findings. But the legal barriers to such an approach make it unworkable.

This is an area in which IRBs and researchers need wise guidance. Some IRBs may not recognize when there is risk of uncovering evidence of child abuse. Or, if the risk is recognized, the ambiguity of state laws concerning reporting can lead to capricious IRB decisions, such as rejecting the protocol out of hand or suggesting poor solutions. If the IRB does not have a knowledgeable clinician among its members, it should call upon such a person for advice as needed. Clinically trained practitioners know how to interpret verbal or behavioral communications and are able to determine the appropriate action to take. They probably are acquainted with the Child Protective Services agency in their area and with the strengths and weaknesses of its professional staff. They will know how to report suspected abuse in a way that maximizes the likelihood of a beneficial outcome. Researchers who do not have clinical training and who plan research on subjects who might be at risk of harming themselves or others, or of being harmed, need to plan ahead. They and their IRB should arrange access to a licensed health care practitioner in advance and have a plan for responding to events that suggest harm has occurred or is likely to occur.

Since most IRBs frequently review protocols for research that might happen upon evidence of abuse, they should arrange permanent institutional resources to advise and support researchers' decisionmaking in this area. Without a trained clinician to advise on what constitutes "reasonable evidence," a risk-averse researcher or IRB may over-report to protect themselves from possible prosecution. Both the IRB and the researcher need to be clear that their duty is to make a considered decision in consultation with others qualified to advise. It is not their duty to jump to conclusions and report without consultation or without good advice on the agency to which they should report. IRBs and researchers would also benefit from having carefully developed guidelines concerning the duty to report. The guidelines should be tailored to the specific state and local situation, and to the particular institutional resources available for consultation.

In the short run, it is important that investigators at risk of discovering abuse understand the manifold significance of warning respondents of their duty to report. Federal regulations regarding confidentiality require that subjects be warned of mandatory reporting requirements, and researchers must be ready to respond appropriately to signs of abuse. Realistically, however, this requirement protects researchers primarily, not abused children or elders; worse, it hinders efforts to understand the dynamics of abuse. The message that researchers are required to deliver does not evoke appreciation that society cares about abuse; rather, it may be interpreted as something like: "If I discuss (such and such) they're going to put me in jail and take my kid away from me." Such a message skews the sample by eliminating those subjects who have committed abuse, or by eliminating their candid admission of having done so.

In the long run, it might benefit society if reporting requirements for funded research on family processes deemed of great national importance could be altered. For example, the modified requirement might mandate training in effective parenting or elder care, with follow-up supervision, and the research program might build in provision for such training and supervision. A warning to this effect might produce better outcomes for science, society, and the families involved.

Spreading of Legal Principles from State to State

Beyond these somewhat uniform mandatory reporting state laws, the task of informing oneself about the possibly relevant laws in any given state is daunting. Moreover, some high-profile legal principles "spread" from state to state. The case of Tarasoff v. UC Regents is instructive. A UC Berkeley graduate student, Prosenjit Poddar, revealed to a campus psychologist his pathological intent to kill Tatiana Tarasoff, who had spurned his affections. The psychologist notified the police, who found the man rational. Poddar, understandably, did not return to therapy, and he stabbed Tarasoff to death. Through a series of appeals, the Tarasoff family persuaded the California Supreme Court (1976) that professionals have a duty to intervene effectively in such cases.

Depending upon the case, this might mean warning the intended victim, notifying authorities, or securing an involuntary commitment. Although some therapists and researchers consider this an unacceptable infringement on their duty to hold professional information confidential, the Tarasoff law in California holds that there is a duty to intervene effectively when the subject of therapy (including those in research on the therapeutic process) reveals an intent to harm another. The Tarasoff law is now widely embraced in other states, in one form or another. Even if one does not live in a state that has a "Tarasoff law," it is reasonable to consider whether victims or their families might seek, as the Tarasoff family did successfully, to apply the Tarasoff principle if a subject indicates intent to harm and then commits a violent act.

Threats to Privacy and Confidentiality

Threats to privacy of subjects and confidentiality of research data may be viewed from the perspective of subjects, researchers, or IRBs.

Perspective of Subjects

Viewed from the perspective of subjects, one might begin with the specific harms that may occur to them and work backward to the source and nature of those harms. The basic harms and some examples of their sources are shown below:

Inconvenience
  • A bothersome intrusion, e.g., phone surveys conducted at dinnertime.

Psychological Harm
  • A decision to lie due to mistrust of the researcher's promise of confidentiality.
  • Stress, abhorrence, or embarrassment from invasion of privacy; e.g., subjects view pornographic pictures and suffer self-blame, loss of dignity and self-respect, and loss of their sense of or control over personal boundaries.

Emotional Harm
  • Worry that responses will not be kept confidential.

Note: Psychological or emotional harm may arise even if the researcher has taken appropriate steps to prevent harm but has not adequately assured the subject of this. The researcher has a dual duty - to prevent risk of harm and to effectively assure subjects that this has been done. In cross-cultural contexts this requires extra efforts to communicate in a way that is understandable and believable.

Physical Harm
  • A battered wife is observed by her husband while being interviewed.
  • Identifiable data from research on gay students are stolen by "gay bashers."

Social Harm
  • The above example of research on gay students also illustrates how the mere presence of the identified researcher or the disclosure of identifiable data could lead to stigma, rejection, or other forms of social harm.

Economic Harm
  • There may be significant economic costs to the battered wife or the identified gay students, due either to their efforts to evade their persecutors or to recover from resulting harm or stigma.

Legal Harm
  • The battered wife and the identified gay students again suffice to illustrate the risk of being involved in an arrest and interrogation and of incurring legal costs.

A protocol recently discussed on McWirb illustrates all six of these risks of harm:

In a proposal to study response to homosexual invitation, a researcher would approach a same-sex stranger, express admiration, proposition the person, observe the response, then announce that it was just an experiment. This is not the sort of exposure to others that most people would welcome. This invasion of privacy and concern about what will be done with the findings may cause any of the following harms:

  • Inconvenience: worry, hassle, irritation.
  • Emotional harm: for timid subjects, fear, embarrassment, self-doubt, etc.
  • Psychological harm: for psychologically unstable subjects, worsening of condition.
  • Personal injury and possible legal harm: for those who were angered and fought.
  • Social harm: if research assistants gossip about those who respond positively to a homosexual advance.
  • Financial harm: e.g., through blackmail or unemployment, if one were observed responding positively to a homosexual advance.

Perspective of Researchers

The ideal researcher is mindful of the possible harms to subjects and of the sources of, and solutions to, those harms. That ideal researcher is also mindful of the many procedural, methodological, statistical, and legal approaches to respecting privacy and assuring confidentiality. Unfortunately, that ideal is rarely realized, because most researchers do not have the education, information, or support needed to develop and fully apply this ideal perspective to their research. Even today, most textbooks and teachers of research methodology do not include this material. IRBs cannot be expected to provide the tutelage and knowledge that researchers lack, nor would their efforts be appreciated in many instances. Moreover, most researchers have other perspectives that become countervailing pressures when they lack the resources to function in the ideal mode described above.

Above all, researchers have career concerns. Some researchers are at primarily teaching institutions where they must do research in order to be promoted, but are quite overwhelmed with teaching and student advisement responsibilities. Others are at primarily research institutions and must undertake major research responsibilities and publish extensively. In either case, time management and efficiency are major considerations. However, the way in which each type of career concern is expressed is influenced by whether the researcher has the resources of education, information, and support needed to approach the ideal described above. The following table was developed to suggest extreme opposite ways in which researchers might pursue career concerns, with and without the resources of education, information, and support.

Concern: Obtain Valid, Publishable Scientific Data
  • Resources present: Researcher understands how to achieve rigor while gaining rapport and cooperation through respect for privacy and confidentiality and other aspects of subject autonomy. When in doubt, the researcher knows how to quickly seek relevant information and skills to achieve research goals ethically.
  • Resources absent: Researcher is trained in the tradition that overlooks the interests of subjects and considers subject autonomy a threat to rigor, e.g., to random assignment and response rate. Researcher does not know how to achieve rigorous research goals and follow the regulations. Perceives successful research as incompatible with following the Common Rule.

Concern: Publish as Much Research as Possible
  • Resources present: Researcher gains in-depth knowledge of the culture of the target research population and develops research sites in ways that fully demonstrate respect for members' privacy interests. Community members experience benefits of the research and welcome long-term, multifaceted R and D.
  • Resources absent: Researcher "wastes" no time relating to local gatekeepers and other members of the research population, as they do not understand science and may stand in the way of research if they know what is going to be done. If necessary, the researcher sends others to handle these "public relations issues."

Concern: Avoid Trouble Connected with Harm to Subjects
  • Resources present: Researcher uses many avenues to discover risks to subjects, is open and respectful with subjects about possible risks and risk-prevention measures taken, and reduces risks as much as possible.
  • Resources absent: Researcher perceives no risk, and tells subjects nothing that would make them think about possible risks or assume that they had any say in the matter, as this would interfere with science and with the researcher's career.

Concern: Time Management and Efficiency
  • Resources present: Researcher integrates research, community and university service, teaching, and scholarship under the umbrella of understanding and serving the subject population. Researcher uses the educational and informational tools available and consults with the IRB as research plans develop.
  • Resources absent: Researcher has minimal time to "waste" on ethics in research planning and delegates it to students. Does not get involved in activities that do not immediately produce publishable data. Avoids, ignores, or placates the IRB as much as possible. Sends the IRB a minimal protocol at the last minute; complains if the IRB delays the research.

The ideal researcher must at times be prepared to educate a nonideal IRB. Most of the researchers I spoke with in connection with this paper mentioned at least one instance in which the IRB wanted to take unnecessary precautions that would cause more harm than good. A better education program for the entire institution would reduce this problem. One example of such a complaint came from an eminent survey researcher who has been deeply involved in various aspects of social research ethics:

She and her colleagues proposed to study whether computer-assisted interviewing or audio-computer assisted interviewing yields better data on sensitive topics. Subjects would enter the lab and complete an interview including some sensitive questions and nonsensitive questions, in one mode or the other. In the consent form, they would be told that some of the questions asked would be sensitive, e.g., about their sexual behavior, alcohol usage, and drug usage. The IRB requested that the consent include examples of the questions. These highly experienced survey researchers responded that a "fair sample" is impossible, and that whatever examples were given would bias the respondent's perception of what was to come. The IRB relented, but only after three tries.

Perspective of IRBs

The IRB must take into account the perspectives of subjects and researchers in addition to the perspectives that are solely theirs. This is an onerous responsibility if there is not a good research ethics educational program available at the institution. To move beyond my own quarter century of experience as a researcher and IRB chair and administrator, I discussed with a number of colleagues who have been in roles similar to mine the issues of privacy and confidentiality they had found particularly challenging. The issues they described, and many issues from my own experience, are organized below according to when they arise relative to review of the protocol.

Privacy and confidentiality issues are prevalent at every stage of research, and solutions are wide-ranging. Some of these problems and solutions are ones that the federal regulations do not address. Some of the issues raised were (and remain) so sensitive that the actual cases related to me are not retold here; only the issues involved are described. Moreover, some of the issues raised by IRB members are critical of the decisions of their fellow IRB members.

The reader should bear in mind that most social and behavioral IRBs apply the same standards required of funded research to nonfunded research, unpublishable student "research" (such as evaluations of what clients in some local program think of the program), and research not necessarily intended to produce generalizable findings - kinds of projects that are not considered research under the federal regulations. Their concern is to protect human subjects and to teach their constituency to be ethical. They are also concerned with protecting the institution from lawsuits and poor public relations. Hence, many of the issues raised below may be typical of IRBs even though they do not flow strictly or exactly from government requirements.

Overarching Issues

The following general issues were raised in various forms by many respondents.

Ignorance About "How"

Overwhelmingly, researchers simply do not know how to carry out the regulations. They do not suffer from lack of morality or even lack of familiarity with the regulations, but from lack of the knowledge, preparation, and craftsmanship required to interpret the regulations. For example, how does one know what is private or sensitive from a subject's perspective? How can one anticipate when a subject might reveal sensitive information, e.g., concerning child or elder abuse? Relatedly, how does one recognize when data might be subpoenaed, and how can a subpoena be blocked? How is a child's sense of privacy different from that of an adult?

Details of the Regulations

Some aspects of the Common Rule are like road signs or road maps - extremely easy to follow if you already know how, but quite confusing otherwise. For example, the issue of children's privacy is intrinsically connected to parental permission, which intrudes on the child's privacy in some contexts and protects it in others; there are situations in which parental permission should be waived. However, to the incredulity of those who are well versed in 45 CFR 46 Subpart D, virtually no one else seems to find it easy to locate and understand the material on waiver of parental consent. Subpart D is buried behind Subpart B concerning fetuses, in vitro fertilization, pregnant women, etc., and Subpart C concerning prisoners. Having found Subpart D, one must then interpret the regulations pertaining to waiver of parental permission. Neither researchers nor most IRB members approach this interpretation without some sense of uncertainty about whether they are being perhaps too liberal or too restrictive in their interpretation.

Research the IRB Does Not See

The amount of research activity that is never overseen by an IRB is difficult to estimate, but it is a topic often raised by IRB members who express bewilderment that certain departments in their institution have never submitted a protocol. Institution-wide education and commitment would reduce the incidence of such violations.

Omission of Details to Which the IRB Might Object

Among researchers, "urban myths" grow about what the regulations allow and do not allow, e.g., "The IRB always requires signed parental consent." One researcher sought to undertake important and useful research on inner-city teenagers' drug use and sexual practices. Under the circumstances there was no way to obtain parental consent. The parents were variously absent, high on drugs, or dependent on their sons for drugs and drug money and hence not supportive of research designed to help kids get off of drugs. The researcher did not know about the legal exemptions for parental permission and did not reveal to the IRB that parental approval would not be sought. A researcher who was educated about waiver of parental permission could have worked with the IRB to solve this problem legally.

When Is It Human Subjects Research?

Among IRBs, tales are legion about particular departments that consider themselves not engaged in human subjects research and therefore exempt from IRB review. For example, a professor of marketing believed that marketing research is not about human subjects and hence need never go through the IRB. He undertook research on problems of introducing computer technology into the workplace. He had his students administer surveys to university staff to learn what problems they experienced with their supervisors in connection with computerization of offices. The survey was purportedly anonymous - only job title and department were requested! Needless to say, the IRB chair learned of this from outraged staff, who felt politically vulnerable and could not imagine how the IRB could approve such a dangerous and invasive study. This was the first the IRB knew of the research. The researcher neither understood nor wished to understand the issues.

Late Protocols

Many researchers learn, part-way through their research, that they should have obtained IRB approval. A protocol is submitted as though the study had not begun. The innocent IRB raises issues and the researcher resubmits, correcting the mistakes. But the research already has been done.

Evasion in Response to Prior Rejection of Protocol

While most of the overarching problems of noncompliance are caused by ignorance, some represent blatant skirting of IRB requirements. In response to rejection of an extremely faulty protocol, some researchers find ways to get around IRB review. For example, a researcher wished to survey Chicano fifth graders about safe sex and AIDS. He did not plan to get parental approval as he judged that to be too difficult. The IRB indicated a number of deficiencies in the protocol, including the fact that no competent school administrator would permit such a study to be done, particularly without parental permission. It was later learned that he had assigned his student teacher to administer the questionnaire, as coursework, in a local school.

There may always be persons who would skirt the regulations. However, the more thorough and user-friendly the institution's resources for education concerning human subjects research, the more one's colleagues and students will exert pressure for adherence to the regulations.

Issues Preceding IRB Review

Nonresearch activities that turn into research, requiring special guidance.

Example: A student started an in-class paper based on observation of a family with a recently aphasic member. She met with the extended family regularly, often over dinner, and not always to discuss her project. She gained enormous insight into the family and how it coped with the person's gradual recovery from a stroke. She and they want the case turned into a book. However, family members do not realize that she is getting from each individual information that is devastatingly personal about the other family members also being interviewed. The risk is not that she is asking for personal information, but that they are insisting on telling her personal information (not about the aphasic individual) and that the family members are at risk of learning harmful things about one another. The IRB is working with her on reasonable ways to omit damaging information.

When IRBs are asked to tell thesis students what they need to know.

These meetings are useful, but just a first step. Each thesis is likely to raise different issues of privacy and confidentiality. Too much or too little detail is inappropriate for a one-hour seminar. The main accomplishment of such a meeting is to encourage subsequent consultation with the IRB on specific issues. While it takes time and commitment for an IRB member to meet with individual students who have questions, this is far easier and more satisfactory than dealing with poor protocols. However, the process of educating individual students would be made much easier through use of a comprehensive web site such as that proposed herein.

When approaching the IRB raises confidentiality concerns.

Example: A historian obtained a list of the names and addresses of high-profile, historically important criminals in hiding, whom he planned to interview. The nature of the list and his research plans were so intriguing that he feared one or more IRB members would be unable to resist discussing it with others. He feared that ultimately the FBI might learn of the study, trace his movements, and arrest his subjects. Solution: The IRB chair agreed with this concern about possible breach of confidentiality by IRB members. He and an appropriate dean reviewed the protocol and approved it. An important book ultimately was published based on the interviews.

Issues Encountered in Reviews of Protocols

When research resembling investigative journalism raises special issues.

What rules pertain regarding consent, privacy, confidentiality, anonymity, and risk? Example: A Yugoslavian student disbelieved Western analysis of events in Bosnia. He proposed interviewing survivors from two neighboring cities who had become involved in terrible atrocities against one another. He believed that only they could tell the real story. He planned to ask for the names of perpetrators. What standards of consent or confidentiality should pertain?

Deductive disclosure.

Researchers often design studies that they claim to be anonymous although they plan to request (often unnecessary) information that would make it possible for persons to deduce the identity of individual subjects. Or, they may present tabular data with such small cell sizes that persons who know the people who participated in the research could deduce the identity of certain subjects. For example, suppose people in a convalescent home were interviewed about suicidal ideation and the data were broken down by age. Further, suppose that there were just one person over 90 in the institution and a table showed the number of persons over 90 who had considered suicide (one). Anyone familiar with the institution could deduce the identity of that person. A skilled IRB can suggest changes in design or data display that eliminate the risk of deductive disclosure.
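
The small-cell problem in the convalescent home example can be caught mechanically before tables are released. The following is a minimal sketch, not a prescribed IRB procedure: it assumes pandas, a hypothetical data set, and an illustrative minimum cell size of five (actual disclosure-limitation standards vary).

    import pandas as pd

    # Hypothetical interview data from a small facility (illustration only).
    df = pd.DataFrame({
        "age_group": ["70-79", "80-89", "80-89", "90+"],
        "suicidal_ideation": ["no", "yes", "no", "yes"],
    })

    MIN_CELL = 5  # illustrative threshold; real standards vary by agency and study

    table = pd.crosstab(df["age_group"], df["suicidal_ideation"])

    # Blank out any nonzero cell smaller than the threshold before release.
    safe_table = table.mask((table > 0) & (table < MIN_CELL))
    print(safe_table)

A real disclosure review would go further - for example, collapsing the 90+ row into a broader age category or applying complementary suppression so that a blanked cell cannot be recovered from the row and column totals - but the check above captures the basic idea of screening tables for small cells.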

Promises of confidentiality that preclude referring persons to appropriate help.

Example: Researchers survey teens in school about whether they have contemplated suicide. If the survey is anonymous, to respect privacy and foster candor, then the researchers cannot directly locate and assist respondents who report suicidal ideation. In essence, the usual methods of respecting privacy could result in a subject's death. Solution: The IRB recommended giving each participant a handout listing suicide-prevention services (e.g., counseling, hotlines, web sites). Moreover, surveys were coded by class. If there were one or more suicidal risks in a class, that entire class would receive a curriculum designed to help students cope with causal issues (e.g., stress, rejection, too-high expectations). Thus, privacy and confidentiality were assured, and suicidal students received the needed treatment along with their peers, who received useful information about mental health issues relevant to themselves and their cohort.

Promises of anonymity that preclude treating persons in need of help.

Many master's degree students in counseling psychology have teen clients and want to do theses on how those teens perceive and cope with problems of their generation (e.g., premarital sex, STDs, HIV infection, birth control, pregnancy). They conduct the interviews anonymously. What if a subject reveals a serious problem (e.g., pregnancy, STD or HIV infection, an abusive relationship)? Solution: The interviewer, without knowing the name or other unique identifiers of the subject, simply encourages the subject to seek help that will be given at no cost. The graduate program, working with the IRB, arranged with a campus clinic to provide extensive free support and medical services for any teen who appears with an anonymous referral slip from one of the researchers. The clinic encourages such teens to keep coming back until the problem is handled appropriately.

Culturally insensitive researchers.

Researchers may lack sensitivity to cultural norms of their subject population. They risk communicating in ways that are considered invasive of privacy, insulting, and unacceptable.

Example: A researcher planned to interview ethnic Chinese mothers in Chinatown who had received training in AIDS/STD education, who would then educate teenagers in their community. The researcher was planning to ask some very direct questions about how these women's own children responded to the information, not realizing that one does not directly ask sensitive questions about family members in that subculture. Rather one asks about the responses of "other people's teenagers."

Applying rules overzealously.

Example #1: A student spent the summer at home, about 3,000 miles from college, working in a prison. He chatted with the prisoners, most of whom were uneducated. He soon found the conversations so interesting that he decided to keep a diary of the conversations. That fall, faced with an opportunity to write a senior thesis, he decided to write up these summer experiences and insights as a senior paper (not for publication). He was required to obtain IRB approval. The IRB instructed him to get consent from all of his "subjects." He considered this too impersonal and inappropriate to do by mail, especially given the literacy level of most of the prisoners. It was also too expensive to travel home and administer the consent in person; moreover he judged it too intrusive into the lives of people who had considered him just a pal and who were likely to misunderstand the point of his request. He finally persuaded the IRB to accept the careful use of pseudonyms.

Example #2: A graduate student worked for a State Renal Dialysis Registry and was responsible for the registry data. She had a relationship with the patients and wished to conduct a survey (in her student role) designed to help understand the needs of the renal patients. The project was enthusiastically endorsed by her supervisor at the Registry. The questions were not sensitive, and her access to the data was legitimate. She wished to write to the patients and ask if she might survey them. The IRB where she was in graduate school required that she have each patient's primary physician contact that patient to ask whether she could do so. It is normal and understandable for physicians and their staffs to broker the relationship between their patients and a researcher who is unknown to them and who would not otherwise have legitimate access to their data. In this particular instance, however, the requirement was unnecessary, and the researcher could not afford to pay the costs of hourly employment of physicians' staff members to contact the patients and obtain their permission for her to contact them. Useful research was thus rendered impossible to carry out.

Insensitivity to the stage of intellectual or social development of subjects when trying to preserve privacy or confidentiality.

Example: An autistic woman was to be tested by a clinical psychologist studying autism. Her elderly parents must make provision for her care when they die and need to know the results of the testing. In the usual interests of privacy and confidentiality, the report should go only to the subject. In this case, it was decided that the researcher should inform the subject, in the informed consent, that her results would be shared with her parents.

How subjects are paid: implications for confidentiality.

Example #1: In street-based studies such as interviews of drug addicts, it is not advisable for the researcher to carry cash, but it would not be acceptable to pay via checks, as the subjects are anonymous and probably do not have bank accounts or convenient check cashing arrangements. A voucher payable at the institution's bank (no identification required) solved this problem. A one-use, limited-amount debit card might work as well.

Example #2: Studies that pay subjects more than $600 must comply with IRS requirements and issue an IRS Form 1099 to subjects, thus making it possible to trace subjects through payroll and tax records. Participants in a study that stigmatizes them (e.g., studies that recruit HIV-positive persons or pregnant teens) have concerns about exposure of their identity. Timing the study to spread payments over two calendar years, with half-payment in each, may be the only legal solution to this problem (a simple check of this approach is sketched after Example #3).

Example #3: This problem is even more complicated when the subjects are illegal aliens and do not have Social Security numbers.
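
For readers who want to see the calendar-year logic of Example #2 concretely, the following is a minimal sketch under stated assumptions: the $600 reporting threshold comes from the example above, while the payment dates, amounts, and function name are hypothetical.

    from collections import defaultdict
    from datetime import date

    REPORTING_THRESHOLD = 600  # dollars per calendar year, per the example above

    # Hypothetical schedule for one subject: half-payment in each study year.
    payments = [(date(2000, 12, 15), 350), (date(2001, 1, 15), 350)]

    def totals_by_year(payments):
        """Sum a subject's payments within each calendar year."""
        totals = defaultdict(int)
        for pay_date, amount in payments:
            totals[pay_date.year] += amount
        return dict(totals)

    for year, total in sorted(totals_by_year(payments).items()):
        status = "Form 1099 required" if total > REPORTING_THRESHOLD else "below reporting threshold"
        print(year, total, status)

Splitting the $700 total into $350 in each of two calendar years keeps each year's payment below the threshold, which is the point of the timing strategy described in Example #2; whether that strategy is appropriate for a given study remains a question for the institution's own financial and legal advisors.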

Respecting relationships between gatekeepers and subjects.

Researchers are often dependent on the goodwill of members of communities or organizations who provide access to a particular population of subjects. A breach, or perceived breach, of confidentiality or an invasion of privacy may destroy a gatekeeper's relationship with subjects, and with it important community relationships.

Example #1: A researcher received permission to interview patients at an out-patient drug rehabilitation clinic. The clinic director placed many restrictions on the kinds of questions that could be asked, the times when the researcher could seek appointments with clinic patients, and the amount of time he could spend with them, and also restricted his access to clinic information. Though disappointed, the researcher realized that the clinic had worked hard to win the trust of its clients, and that both the rehabilitation and the research depended on the maintenance of that trust.

Example #2: A researcher interested in the problems and recovery processes of persons who have been sexually abused "hung out" in an internet chat room support group of victims of sexual abuse. The chat room was a public venue even if the subjects tended to forget that others were free to "lurk" there. He took careful notes on what he read, but did not print out messages or write down anything that would identify specific individuals, locations, or the identity of the on-line chat room. He decided not to seek consent from the participants or from the therapist who ran the chat room, as this might have a chilling effect on the therapeutic processes that were taking place. He realized that anything he published based on this observation should carefully disguise the identity of the chat room and its members so that none of the activities or individuals could ever be recognized.

Certificates of Confidentiality.

It is not always obvious when there is a possibility that data might be subpoenaed. Example: Data from a study of employees' occupational exposure to an environmental toxin (including data on their history of alcohol and drug abuse, and neuro-psychological battery test data) were subpoenaed by the manufacturer against which workers had filed a class action suit. The data were coded, but the number of subjects was small enough that individuals could be identified from their demographic and workplace data. The IRB had not required a certificate of confidentiality. The subpoena could not be quashed. The researcher was called as a witness. The study methodology was trashed by expert witnesses for the defense. The case was settled out of court in favor of the plaintiff subjects.

Waiver of parental permission.

The question of when parental permission may be waived is usually cause for concern. Many researchers do not know that provision for this waiver exists (are unfamiliar with Subpart D), and IRBs are often troubled about when to apply it. Example: How does one handle consent with respect to emancipated or run-away teenagers who subsequently return home in the course of the research? Does one abandon the study of these teenagers? Does one disclose to a homophobic parent that his gay son was recruited into a study of safe sex practices while he was a run-away? Solution: Because of potential harm to the teen by the parent, parental permission was waived in this instance.

Issues Arising After IRB Approval

IRB audit of informed consent.

Example: An IRB proposed to audit the signed consents of subjects who were HIV-positive. The subjects were local people, and some were probably known to IRB members who did not know their sero-positive status. Solution: The IRB chair decided to conduct the audit alone and report the findings to the rest of the committee. (Although the possibility of audit could have been mentioned in the informed consent, that might have muddled the researcher's sampling scheme. Perhaps this was the best solution.)

Sharing of videotaped data.

Videotaped data present new problems of privacy and confidentiality. Important studies, e.g., longitudinal studies of family interaction, are worth archiving for other scholars to use, but individuals might be recognized on the tapes. Researchers and archivists are only now developing consent language and other appropriate safeguards or restrictions on the use of such tapes. Pioneers in this area are the Murray Center Video Archive and the Stanford and Silicon Valley Archive Project. Example: The Murray Center now has a videotape archive. Donors of videotape data must submit to the Murray Center the informed consent document showing that what was promised to subjects is compatible with the intended archiving and sharing arrangement. The Center requires that scientists present a research proposal, which must be approved before they may use a videotape. They must indicate that they have never been located in neighborhoods where the subjects might be known to them. They must agree to stop the video immediately and not view footage of anyone they think they recognize. Tapes must be returned to the archive by a specified time.

Site sharing.

Site sharing sometimes occurs when a prior researcher has established a rich archive of information based on an interesting community of persons. For a new researcher to have access to the history and current lives of these people would constitute an invasion of their privacy if not handled courteously, with respect for their feelings and autonomy. Example: A secondary analyst wished to build on longitudinal data collected two years earlier by the data donor. To simply "show up" as though he were entitled to peer into the lives of these people would probably be considered an invasion of privacy and an affront to dignity by at least some of the subjects. The initial researcher agreed to return to the research site and work through key community leaders to report what he had learned in his study and to ask if people would volunteer to participate in a follow-up conducted by the new researcher.

Data management.

This issue is especially important when data are not anonymous or when deductive disclosure is possible. Researchers promise confidentiality in good faith, but often overlook steps that should be taken to assure that theft or snooping cannot occur. Example: A researcher transported identifiable data (including data on sexual practices, diagnoses of STDs, etc.) from the research site to his home overnight before bringing them to the office. The car was broken into at his home and the records stolen. The IRB had failed to require a data management plan. Subjects whose records were stolen were immediately informed and given contact information for a police detective assigned to the case. No harm to subjects occurred, but the data were never recovered, thus damaging the project.

Institutional use of research data.

A firewall between scientific and administrative use must be established and guarded so that subjects remain in control of access to themselves and to their data. Thus, this is both a privacy and a confidentiality issue. Example: A state department of health examined the rate of illness among employees of a local laboratory. A high rate of illness was noted in certain employees, and the lab was instructed to provide better shielding in some areas. Strict confidentiality was maintained so that the credit ratings, insurance rates, and employability of affected employees would not be jeopardized. The laboratory later requested the data (with identifiers), ostensibly to further investigate the effectiveness of the shielding. It proposed to safeguard privacy and confidentiality by getting employees' (subjects') consent to obtain their records from the agency. The IRB rejected the protocol because the employees could not autonomously refuse and might be laid off to protect the lab from lawsuits and insurance claims.

Issues Requiring Special Competence

The survey touched on many issues concerning the need for special informational resources and expertise. Some have already been mentioned. A few deserve special mention here.

Plethora of state laws.

Several IRB chairs mentioned bewilderment at the many state laws that might pertain to privacy and confidentiality in social research. Even an IRB chair who is an attorney felt that the state law situation was hopelessly complex, especially in the case of multi-state research projects. A well-organized web site that could be accessed by state and topic would serve IRBs and researchers well.

Specialists.

Some IRBs have worked hard to provide information to researchers, as needed, on specialists they might consult on various issues. Other IRBs have not recognized how useful a service this might be or what areas of special skill would be useful, periodically, to their clientele. While the particular kinds of specialists would vary by location, IRBs would benefit by having guidelines for developing appropriate informational resources of this kind tailored to their location and situation.

Methodological sophistication.

Several related issues were raised that call for sophistication in statistics and research design. Some IRBs are troubled by research designs that appear to use more or fewer subjects than necessary, or to employ an overly long set of questions. Unfortunately, most investigators and IRB members are fairly parochial in their understanding of what might constitute adequate designs and analyses for given types of research. Does the IRB understand the methods of disciplines other than its members' own? Can it assist researchers in recognizing and rehabilitating poor designs?

  1. Methods such as grounded theory, historiography, use of focus groups, and even interview and survey research are sometimes rejected as inappropriate by some who were trained to regard experimental designs as the only valid approach.
  2. What is the appropriate number of subjects? Single-subject designs and approaches to reducing between-subject variance are important but often poorly understood methods of producing useful data with one subject or a small number of subjects.
  3. There is now a literature describing virtually hundreds of approaches to protecting privacy or assuring confidentiality. This literature is rarely sought out by IRBs, researchers, or teachers of research methods; most are not even aware that it exists. It needs to be available in a user-friendly format.

Federal Interpretations of the Regulations

A startling number of those surveyed spontaneously expressed concern that two recent governmental interpretations of the regulations are unworkable:

Requiring the consent of persons about whom a survey inquires.

This virtually destroys the possibility of epidemiological research. It is also applied unreasonably to contexts such as interviews of parents about their young children. It is an example of a bad rule growing out of an extreme case.

Requiring written consent of participants in Centers for Disease Control and Prevention health surveys.

Concern was expressed that this requirement would decrease response rate, bias samples, increase time and cost required to do the research, and add a risk to confidentiality.


IRB Practices Regarding Privacy and Confidentiality

As the above examples illustrate, the range of privacy and confidentiality issues confronting IRBs is immense, and the regulations are too general to offer useful guidance in dealing with these issues. This is probably as it should be, for if the regulations were too specific they would leave no room for interpretation and flexibility. However, without intelligent interpretation, the regulations tempt IRBs to engage in overzealous, heavy-handed, or capricious enforcement, resulting in more harm than good. Some researchers can anticipate issues inherent in their research plans and argue effectively with IRBs that propose inappropriate applications of the regulations. Other researchers are not so fortunate, and may not know where to turn when faced with requirements they cannot translate into workable research procedures. IRBs may seek to educate their members and clients, but this is not always welcomed or practical and requires much initiative and commitment.

Many IRBs have little sense of the range of privacy issues they should consider or of the range of solutions that might be employed in the service of valid and ethical research. Many IRB chairs, members, and staff persons are not in a position to effectively guide or teach their clientele, or to gain its respect. Other IRBs are able to do a remarkably effective job of teaching and guiding their clientele. This section will be devoted to discussion of the current practices of highly effective IRBs.

Effective IRB Practices

Some IRBs have members and staff persons who have cultivated the knowledge and resources needed to engage effectively in creative ethical problem solving. Moreover, some IRB chairs have shown considerable wisdom and courage in their willingness to bend rules when necessary to protect human subjects. My colleagues, the IRB members who shared relevant experiences with me, have participated regularly as faculty in IRB workshops and other related professional activities at a national level. Their level of awareness and creativity is probably far above average. Several characteristics emerge concerning effective IRB practices.

Proactive Problem Solving

Effective IRBs identify key areas of difficulty and develop a range of resources to handle these issues. For example, some IRBs review many sensitive protocols involving adolescents. A key issue is finding a way to help or treat teens in trouble once their problems have been disclosed to the researcher. Confidentiality and teen/parent conflict are usually at the heart of the problem. A range of ready solutions can be created in consultation with knowledgeable persons within the institution and community. Extensive referral lists can be kept on disk and can be revised as needed for each relevant project. Competent professionals can be identified who can be called upon for advice and assistance. Arrangements can be established with departments that are in a position to help.

Long-Term Professional Membership

Key IRB members have a long-term commitment to keep abreast of the literature and are involved professionally at a national level. Through personal contacts and involvement with IRB organizations, they can seek advice and information as needed. Thus emerging issues having relevance for privacy and confidentiality and the myriad approaches to respecting privacy and assuring confidentiality are quickly integrated into the IRB's collective consciousness, policy manual, evolving guidelines, web site, etc. New members find a treasure trove of experience and resources at hand when they join the IRB. The researchers served by the IRB learn to respect the helpful intelligent human beings who serve on the IRB. Deans and department chairs learn to look to the IRB as a prudent and diplomatic collaborator who can help them deal with the "lost sheep" or "black sheep" in their flock.

Availability as Educators and Advisors

Effective IRB members, and especially chairs and key staff personnel, seek to be user-friendly problem solvers who are readily available by phone, e-mail, and in person. They give or arrange for workshops tailored to the current needs of specific groups of students and researchers. They are available on a consultative basis. They realize that there are as many privacy issues as there are kinds of human interaction, and as many confidentiality issues as there are conduits for data. Hence they see their own role as that of listening and helping researchers think through privacy and confidentiality problems and solutions. They work patiently with students, and regard themselves as teachers. As one IRB chair/researcher said:

I get students to stay in a confidentiality mode by insisting that they use pseudonyms and aliases even with me. I make a big fuss if I accidentally find out who somebody is and insist that the student be much more careful from then on. I ask to look through their field notes, and act really annoyed if I spot real names or identifiers. I ask "What if you lost the notes and the respondent found them. Or, if their worst enemy found them?"

Relevant Competencies

The IRB membership includes persons who have statistical and methodological sophistication, quantitative and qualitative skills, clinical skills, and legal skills. Alternatively, it has ties with key persons within the institution and community so that needed expertise is always at hand and individual members never hesitate to consult with these experts on problems. With this range of competencies they are quick to detect privacy and confidentiality issues and to help devise solutions that respect subjects and the integrity of the research.

Referral of Issues Outside of IRB Purview

One effective IRB has developed a chart listing all the kinds of borderline issues it cannot afford to ignore but that do not fall quite within its purview. The chart gives examples of each kind of issue and indicates the office or individual to whom each such issue might be referred. Thus, when presented with a sticky problem - judging whether a piece of research apparatus is safe, quashing a subpoena of data, or responding to a whistle-blower's allegations of research fraud - the IRB has a ready process to engage and need not waste time and energy re-inventing solutions to each case. For each kind of case, the chart also indicates whether the office to which the matter is referred must report its decision back to the IRB or whether the matter is settled once the referral is made. This IRB devoted much time to developing this ever-evolving document and obtained approval from those who would be involved in the referrals. Thus, those who used to complain that the IRB over-reached now have no complaint!

Web Site and E-Mail Efficiency

IRBs conduct much of their routine communication and education on line. Their web site contains their guidelines, protocol application forms (to be used either as hard copy or transmitted back to the IRB via e-mail), sample materials (e.g., consent forms) for a variety of purposes, etc. Many protocols are transmitted as e-mail attachments. Preliminary issues can be discussed and disposed of by e-mail; expedited reviews can be handled in a day or two. Some materials can be sent out instantaneously.

Time saved in this manner is then available for serious discussion of policy issues and proactive problem solving. Problems of privacy and confidentiality are often procedural ones with tidy solutions. A web-site reference to such problems and solutions coupled with on-line advising of researchers can make for much more efficient solving of privacy and confidentiality problems.


Discovering What Is Private to Research Subjects

The difficulty of understanding what is private to others and how that privacy might be invaded has been the underlying theme of this discussion. We have seen that experience, dedication, and wisdom build something akin to intuition about issues of privacy and confidentiality. How can one directly nurture this ability? There are literatures, exercises, and other sources of information and expertise available to assist researchers, IRBs, and teachers of research methodology. This section discusses concepts and resources that can provide insight into the privacy needs of diverse subject populations.

Privacy as a Developmental Phenomenon

Thompson (1982) and Melton (1992) have summarized literature on child development to show how the sense of privacy grows and changes through the years of childhood and what this implies for research on children. Studies by Wolfe (1978), Rivlin and Wolfe (1985), and Parke and Sawin (1979) indicate that privacy is salient even for elementary school children. Children evaluate the quality of living situations by the degree of invasion of privacy and infringement of liberty present in them. Being alone is an important element of privacy to children. By adolescence, maintenance of control over personal information becomes important to the development of intimate relationships, development of self-esteem, and sense of personhood. Melton (1992) has opined that the range of inherently abhorrent intrusions is broader for children and adolescents than for adults because of the acute significance of privacy to personality development. Thus, the privacy interests of children and families are to be respected even if there is no risk of disclosure of intimate information. Many researchers fail to focus on this fact of human experience. IRBs should enlist members who are competent to advise in this area and be prepared to provide an annotated bibliography on developmental aspects of privacy and confidentiality.


Privacy as Situational

"I don't feel comfortable discussing that here." An important aspect of privacy is the freedom to pick where one says what to whom, what aspects of one's life are observed, what degree of control one has over one's personal space, and how fully one can control access to personal information. However, that generalization is too abstract to guide research practices; researchers situated differently from the subject are not good judges of what subjects might consider private. An important element of planning sensitive research is to ask gatekeepers or surrogate subjects what contexts would be most conducive to candid discussion of the specific topics of the research, what space is considered personal, and what one would not want others to observe - even if the data would be kept confidential.

Privacy as Culturally Determined

As Laufer and Wolfe (1977) have shown, persons' privacy interests are idiosyncratic to their culture. To learn about the privacy interests of one's research population, it is useful to a) ask someone who works with that population regularly (e.g., ask a social worker about the privacy interests of low socioeconomic status parents); b) ask an investigator who has had much experience working with that population, or read ethnographic literature on that population, provided that work was recent and in about the same geographical area; c) ask members of that population what they think other members of their group might consider private in relation to the intended study; or d) employ research associates from the subject population and work with those individuals to develop procedures that are sensitive to the group's privacy interests.

Privacy as Self-Protection

What is it about the subject that might create a special interest in controlling the access of others? Is the subject engaged in illegal activities, lacking in resources or autonomy, visible, public or famous, stigmatized, institutionalized, etc.? Each role carries its own special interests in privacy, which the researcher needs to understand in order to gather valid data and respect subjects. Experience with the target population, use of focus groups and surrogate subjects, knowledge of the ethnographic literature on that population, and consultation with expert professionals who have conducted research similar to one's planned project - all can be useful sources of information on the privacy needs and interests of the given population. There are many national and local sources of this kind of information which a given IRB might help researchers locate and use.

Subject Population Beliefs About Research

Researchers and IRB members who have conducted community consultations have been astonished at the perspectives of some community members and at the obvious impact of those perspectives upon candor and beliefs about researchers' promises of confidentiality. Time spent in "town meetings" or focus groups with members of the target population, parents of potential child subjects, or gatekeepers of the target population is well spent. It invariably yields useful insight regarding the privacy interests of subjects, and the implications for organizing the research to respect those interests and provide appropriate assurances of confidentiality.

Privacy as a Personal Perspective

One's sense of privacy grows out of one's system of values, morals, norms, experiences, beliefs, concepts, and language. The following two-by-two matrix is a useful heuristic for research teams seeking to integrate what they have learned about their subjects into a viewpoint that will generate good solutions to the problem of respecting privacy appropriately:

  • Researcher knows, subject knows: shared culture, or understanding of the subject's culture and beliefs.
  • Researcher knows, subject does not know: what the subject does not understand about the researcher's culture and beliefs.
  • Researcher does not know, subject knows: what the researcher does not understand about the subject's culture and beliefs.
  • Researcher does not know, subject does not know: shared ignorance.

This matrix concerns differences in perception, knowledge, and communication. After consulting with members of the target population, it is useful to fill in this matrix to better understand the perspectives of the researcher and subjects with regard to their respective interests in access to personal information. When there is much shared culture, the researcher can readily negotiate a valid agreement concerning privacy and confidentiality. Otherwise special measures are needed.


Procedures for Assuring Confidentiality

There is a major body of literature on approaches to assuring confidentiality. The most outstanding and comprehensive source is Boruch and Cecil (1979). Approaches to assuring confidentiality fall into nine categories:

  • Kinds of confidentiality assurances and their consequences
  • Procedures for eliminating linkages of data and identifiers in various types of data
  • Intersystem linkage
  • Statistical strategies
  • Secondary analysis or audit of data
  • Legal protections
  • Statistical data: rendering them truly anonymous
  • Qualitative data, case studies, and ethnography: masking identities and rendering descriptions respectful
  • Internet research

Kinds of Confidentiality Assurances and their Consequences

Researchers' pious promises of confidentiality are not always effective in producing trust and candor in research participants. Moreover, they are not always promises that can be kept, owing to faulty data management practices and the possibility of compelled disclosure. What kinds of assurances of confidentiality are there, and what are their consequences? There is considerable research and scholarship on this topic (e.g., National Academy of Sciences, Committee on Federal Statistics, 1979; Singer, Mathiowetz, and Couper, 1993; Singer, VonThurn, and Miller, 1995; Sieber and Saks, 1989).

Procedures that Eliminate Linkage of Data to Unique Identifiers

Anonymity offers the best assurance that disclosure of subjects' responses will not occur. Many dozens of techniques have been developed that are responsive both to the need for anonymity and to other research needs as well. Different kinds of data - cross-sectional, longitudinal, and data from multiple sources - bring with them different research requirements and different ways of meeting these without use of unique identifiers of subjects. The following brief summary is illustrative, not comprehensive. See Boruch and Cecil (1979) for a comprehensive review.

Cross-Sectional Research

Cross-sectional research in its simplest form requires just one data collection session. Anonymity, in which even the researcher is at all times ignorant of the identity of subjects, protects the respondent from legal prosecution, social embarrassment, and concern that the data may fall into corrupt hands. However, it may be desirable to have some form of follow-up to test for sampling validity or response validity, or to do further research on some or all subjects. These refinements are impossible with complete anonymity, but they can be realized through temporarily identified responses with subsequent destruction of identifiers, or through use of brokers who provide anonymous data to the researcher after completing one or more of these refinements.
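To make the first of these refinements concrete, the following minimal Python sketch shows temporarily identified responses being stripped of their identifiers before analysis. The field names and values are hypothetical, not drawn from any actual study; a broker arrangement would work similarly, with the broker rather than the researcher holding the identifier field.

```python
# Illustrative sketch (hypothetical fields and values): responses carry a
# temporary identifier only long enough to permit validity checks or
# follow-up; the identifier is then destroyed, leaving an anonymous file.

raw_responses = [
    {"temp_id": "T001", "zip_code": "94542", "answer": 4},
    {"temp_id": "T002", "zip_code": "94544", "answer": 2},
]

# ... any follow-up or validity checks that need temp_id would happen here ...

# Destroy the identifiers: retain only the substantive fields for analysis.
analysis_file = [
    {key: value for key, value in record.items() if key != "temp_id"}
    for record in raw_responses
]

print(analysis_file)  # [{'zip_code': '94542', 'answer': 4}, ...]
```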

Longitudinal Research

Longitudinal research seeks to track change in individual subjects over time. This cannot be accomplished with strictly anonymous data. However, there are many ways in which aliases or arbitrary identifiers can be used as a basis for linking observations over time while preserving the confidentiality of individual responses. The simplest of these involves having subjects choose an easily remembered alias and using it on repeated occasions. There is a considerable body of literature examining the success of variations of this procedure. Some approaches are quite complex. For example, in research by the American Council on Education (Astin and Boruch, 1970) on political activism among American college students, a three-file linkage system was used as follows:

Initial Data Collection:

  • File A contains each subject's data and an arbitrary account number (X).
  • File B pairs each subject's name with a second arbitrary account number (Y).
  • File C matches the two sets of account numbers, X and Y.
  • File C is shipped to a researcher in a foreign country.

Second Data Collection:

  • A second set of identifiable longitudinal data is gathered.
  • Names are replaced by their Y account numbers; this file is shipped to the foreign researcher.

Data Analysis:

  • Using File C, the foreign researcher replaces each Y account number with its corresponding X account number.
  • The relabeled data file is returned to the data analysts.
  • Data are organized in longitudinal sequences, but the identity of each subject is unknown.
  • The longitudinal data are analyzed.
  • The foreign researcher destroys File C so that the three files can never be merged to learn subject identities.

This procedure renders the data safe from snooping and theft. Conceivably, foreign discovery procedures could be used to obtain some of the identifiable data before File C is destroyed, hence a certificate of confidentiality could be obtained to preclude that unlikely event.
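A minimal Python sketch of the three-file flow described above may help make it concrete. The names, account numbers, and scores are hypothetical, and a real implementation would of course involve separate custodians and physical shipment of the files rather than a single program.

```python
# Hypothetical sketch of the three-file linkage system described above.

# Initial data collection (File C is shipped to the foreign researcher).
file_a = {"X101": {"wave1_score": 3},      # X account number -> data (no names)
          "X102": {"wave1_score": 7}}
file_b = {"Alice": "Y901", "Bob": "Y902"}  # name -> Y account number (no data)
file_c = {"X101": "Y901", "X102": "Y902"}  # X <-> Y crosswalk

# Second data collection: names replaced by Y numbers before shipping abroad.
second_wave = {"Y901": {"wave2_score": 4},
               "Y902": {"wave2_score": 6}}

# The foreign researcher relabels the second wave with X numbers using File C,
# returns the relabeled file, and then destroys File C.
y_to_x = {y: x for x, y in file_c.items()}
second_wave_relabeled = {y_to_x[y]: data for y, data in second_wave.items()}

# The data analysts link the waves by X number without ever learning identities.
for x, wave1 in file_a.items():
    print(x, wave1["wave1_score"], "->", second_wave_relabeled[x]["wave2_score"])
```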

Intersystem Linkage

It is sometimes necessary to link research records on subjects with other, independently stored records on the same individuals. In the case of highly sensitive data such as psychiatric or police records, a linkage strategy may be needed so that the researcher does not have access to any identified records. One such method is as follows:

  1. Researcher wishes to link data on 50 subjects with information from their police records.
  2. Subjects each provide data and an alias (no name) to the researcher.
  3. Each subject provides his or her name and alias to the archive (e.g., the police).
  4. Archive provides the requested police information with the aliases (not the names) attached.
  5. Researcher analyzes the relationship between his research data and the police record data.

This brief summary is merely illustrative of some of the many specific procedures for preserving anonymity or confidentiality and the problems they are intended to solve. The actual literature on this topic is immense.
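As an illustration of steps 1 through 5 (with hypothetical aliases and values), the linkage reduces to a join on the alias, so that neither party ever holds names and research data together:

```python
# Hypothetical sketch of alias-based intersystem linkage: the researcher never
# sees names, and the archive never sees the research data.

researcher_data = {   # alias -> data provided directly by the subject
    "bluejay": {"survey_score": 12},
    "falcon": {"survey_score": 8},
}

archive_release = {   # alias -> record information released by the archive,
    "bluejay": {"prior_arrests": 0},   # which holds the name-alias pairs
    "falcon": {"prior_arrests": 2},
}

# The researcher links the two sources on the alias alone.
linked = {alias: {**researcher_data[alias], **archive_release[alias]}
          for alias in researcher_data}

print(linked)
```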

Statistical Strategies

Various statistical strategies have been developed to eliminate any direct link between the respondent's identity and his true answer. All of these methods involve the injection of a specified amount of random error into the data set so that no individual's true condition can be ascertained but useful statistical analysis of the data is still possible. A very simple example (oversimplified for purposes of this exposition) is the randomized-response method which can be used in direct interview. Suppose the researcher wished to ask whether subjects had struck their child in anger this week, or cheated on their income tax this year - an obvious invasion of privacy. The subject is instructed to roll a die in his cupped hands and observe which side came up without showing it to the researcher. If (say) a "one" came up, the subject was to respond "yes," irrespective of the true answer. By an algebraic removal of the expected number of false "yes" answers from the data, the researcher can determine the true proportion of yes responses. Neither the researcher nor anyone else besides the subject knows who gave a true "yes" response. As with procedural strategies, statistical strategies have been designed for use with longitudinal and multiple source data, as well.
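For the simple die-roll variant just described (a roll of "one" forces a "yes"; any other roll is answered truthfully), the algebraic removal of the expected false "yes" answers is straightforward. The following sketch, with hypothetical numbers, illustrates it:

```python
# Hypothetical illustration of the randomized-response correction for the
# simple variant described above: with probability 1/6 the die forces a "yes";
# otherwise the subject answers truthfully, so
#   observed P(yes) = 1/6 + (5/6) * true_proportion.

def estimate_true_proportion(observed_yes_rate, forced_yes_prob=1 / 6):
    """Solve observed = forced + (1 - forced) * true for the true proportion."""
    return (observed_yes_rate - forced_yes_prob) / (1 - forced_yes_prob)

# Example: if 30% of respondents answered "yes", the estimated true
# proportion of "yes" behavior is about 16%.
print(round(estimate_true_proportion(0.30), 2))  # 0.16
```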

Secondary Analysis or Audit of Data

Concern for the integrity of data and for extending the analyses of important data sets brings with it the need to do so without risk to privacy or confidentiality. The simplest solution is to render the data anonymous. However, anonymity is not always acceptable or useful from the perspective of the secondary user or auditor. There are many procedures that diminish a) outright breach of confidentiality, b) the likelihood of deductive disclosure, c) the sensitivity of the information to which the secondary users have access, or d) the need for the secondary user to actually take possession of the data. Researchers and IRBs who know this literature will know what agreements to make, with funders who require audits, about how confidentiality will be assured. They can also decide what to include in the informed consent so that potential subjects understand what will be done with the data subsequent to the initial project.

Legal Protections of Confidentiality

As discussed in a prior section of this paper, there are few legal protections of research data. While the courts have shown considerable respect for the need to keep promises of confidentiality made to subjects, they must weigh the importance of this against countervailing issues. However, there is growing use of certificates of confidentiality. Researchers and IRBs need to understand the uses and protections these provide, and their limitations.

Descriptive Statistics

Much work has been done by statisticians in governmental agencies in the United States (e.g., Bureau of Census), Great Britain, and Sweden to develop practices of adjusting tabular presentations so that deductive disclosure is not possible. The most common of these practices is to broaden categories so that data from unique individuals (e.g., top income earners) are not apparent.

Various procedures for preventing disclosure from presentations of quantitative data and descriptive statistics have been developed. The issue of deductive disclosure, and the methods that will be employed to prevent published information from disclosing identities, should be considered in the planning stages of research and discussed in the IRB protocol, but this is rarely done, as most researchers and IRBs lack knowledge in this area. A web site describing main approaches to preventing deductive disclosure from statistical presentation, as well as an annotated bibliography of more complex methods, would be useful aids to researchers and to teachers of quantitative methods.
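As a minimal sketch of the most common practice mentioned above, category broadening (here, top-coding of incomes so that a lone top earner does not stand out in a published table), consider the following hypothetical example:

```python
# Hypothetical sketch of category broadening ("top-coding"): all incomes above
# a threshold are collapsed into one open-ended category so that a unique top
# earner cannot be singled out in a published frequency table.

from collections import Counter

incomes = [28_000, 35_000, 41_000, 52_000, 67_000, 74_000, 1_250_000]

def bracket(income, top_code=100_000, width=25_000):
    """Assign an income to a bracket, collapsing everything above top_code."""
    if income >= top_code:
        return f"{top_code:,} and above"
    low = (income // width) * width
    return f"{low:,}-{low + width - 1:,}"

table = Counter(bracket(i) for i in incomes)
for category, count in sorted(table.items()):
    print(category, count)
```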

Qualitative Research

It is sometimes possible to deduce personal information about identifiable individuals from qualitative data, such as a cultural anthropologist's "anonymous" account of life in a community in which persons and places are given fictitious names (Johnson, 1982). The same kinds of problems will probably arise as more studies are conducted in virtual communities of participants in on-line chat rooms (King, 1997). Any clue, such as a description of the research site (e.g., a map or a web address), might permit deductive disclosure by anyone familiar with that territory. Johnson (1982) reviewed cases of well-known publications in cultural anthropology in which the identities of specific rural community members could be deduced. In some of these cases, anthropologists had written detailed accounts of the secret, illegal, immoral, or reprehensible deeds of these people, with no awareness that the actual identities would be discovered.

Those who do qualitative studies of the lives of others cannot ensure confidentiality; the subjects themselves, the research assistants, or even the sponsor of the research may inadvertently leak the identity of the research site. Since total confidentiality or anonymity cannot be guaranteed, the issue becomes one of ongoing communication and agreement with subjects (informed consent) and respectful communication of the findings. There is, by now, a growing literature on this issue (e.g., Cassell, 1982; Gallaher, 1964; Glazer, 1982; Johnson, 1982; Kelman, 1968; King, 1999; Wax, 1982).

For example, Johnson recommends guidelines for "ethical proofreading" of cultural anthropology manuscripts to diminish potential harm to subjects or communities as follows:

  • Assume that the identities of the location and individuals studied will be discovered. What would be the consequences within and outside the community? Will its effect on individuals and relationships be positive or negative? Does the importance of the material warrant any risk of harm?
  • Look at the language. Is it descriptive or judgmental? For example, "Twenty percent of the adults are functionally literate" is less judgmental than, "Most of the people are backward."
  • When describing private or unflattering characteristics, first describe the cultural context, then describe the specific characteristic. This is more informative and does not single out individuals as much.
  • Negative stereotypes may affect other, similar people and communities even if the specific people and communities studied are not identified. Ask yourself how the information might be used in a positive way and how it might be used in a negative way. Are the revelations worth the possible risk?
  • Consider whether the research site will be usable again or whether it will have been destroyed once the residents read what has been written about them.
  • Have some of the subjects proofread the manuscript for accuracy and invite them to provide any general feedback they are inclined to offer. Have some colleagues proofread the manuscript using the above guidelines as criteria for acceptability.

Internet Research

Internet research was mentioned above in relation to participant observation or field data from the virtual environment of chat rooms. However, there is now a rapidly emerging literature on various other kinds of internet research, associated methods of solving problems of privacy and confidentiality, and uncertainties or vulnerabilities connected with these "solutions." Researchers' insouciant claims that internet data are anonymous or that confidentiality will be protected are reminiscent of such promises regarding non-web research of two or three decades ago.

This area of research will grow rapidly, since it enables researchers to reach far-flung subjects quickly, inexpensively, around the clock, and without a research staff. The problems of privacy and confidentiality, and the solutions to them, will change rapidly over the coming years as new technologies render old problems and old solutions obsolete. Some of the rapidly evolving issues include:

  • How to ensure that children are not being studied under rules that pertain to adults;
  • How to ensure anonymity of responses, given that web software records, in header lines, the IP address of the machine from which the respondent accessed the researcher's web page; and
  • How an on-line data file can be stored so that unauthorized persons cannot access it.

Given the uncertainties, especially with regard to assurances of confidentiality, it is reasonable at this stage to recommend that assurances of confidentiality contain appropriate disclaimers. A detailed review of the current literature on this topic is not warranted, since the details will change within a few months. However, for purposes of the recommendations of this paper, sources of current information (typically web sites) should be made available to IRBs and researchers with frequent updates, and assurances of confidentiality should be limited appropriately.


Summary: Key Issues for IRBs and Researchers

A poor and misleading definition of privacy and confidentiality in the regulations.

Specific definitional problems are discussed below, in "Improving the Regulations."

Research topics and methods that do not fit within the federal regulations.

Research that more closely resembles investigative journalism, historical research, or biography is subject to conflicting standards - those of its discipline and those of the federal regulations. If IRB review and ethical guidelines must be applied to such research, a better solution than making it conform to inappropriate standards would be to ensure that subjects understand what standard will apply. For example, instead of trying to impose some inappropriate limit on disclosure of unique identifiers, one IRB simply requires that oral history researchers inform potential subjects of all the kinds of disclosures that will appear in the resulting material that is made public.

Lack of effective, ongoing education of IRBs and their clients.

One of the most serious problems facing the research community is the lack of time and resources to educate IRB members. IRBs are drowning in paperwork. They need resources to foster their own effective learning and problem solving and to create appropriate learning contexts for their clientele.

Lack of time and resources for proactive problem solving.

Proactive problem solving includes creating and continually updating educational materials for members, researchers, and students. IRB members need released time, resources, and administrative support if they are to function effectively as proactive problem solvers.

Lack of relevant research skills among IRB members.

The many contexts and methods of social/behavioral research are mysteries to some IRB members not trained in those disciplines. It is important to have IRB members who view research methods with intelligent skepticism and who can anticipate the concerns of subjects. However, it is not useful to have many IRB members who are ignorant of research methods. Rather, a competent applied statistician, qualitative methodologist, survey researcher, or field experimenter can educate fellow IRB members and their clientele.

Students who are poorly trained in ethics and methodology.

Most research methodology still focuses on the "get data" approach and pays scant attention to learning the culture of subjects in their various contexts and to understanding how to effectively respect the privacy of subjects. The absurd notion that ethics is the enemy of methodology prevails and is reinforced throughout students' training. Students are the scientists of the future. If a user-friendly and science-friendly curriculum is available on the web, this material can be used in the classroom and in anticipation of student research. A recent symposium at the American Association for the Advancement of Science concluded that curriculum on research ethics needs to be taught integrally with methodology curriculum; otherwise, it is seen as the study of the misdeeds of others rather than of problems one might encounter oneself.

Lack of awareness of the range of privacy and confidentiality issues extending from the conception of the project to data management, publication, and ultimate data archiving or sharing.

This is the one problem that appears to lend itself to a checklist. To prevent the checklist from being used in a procrustean fashion, it might be part of an indexed web site that raises the issues and offers solutions.

Too many rules, not enough basis for good judgment.

One last word about rules and checklists: More rules and procedures will only render overworked IRBs less effective at the complex, subtle task of responding to privacy and confidentiality issues. It is tempting to devise a checklist to help identify and solve problems, but that, too, is subject to misuse by poorly educated IRB members. Once an issue is raised in some quasi-official way, there is a tendency to make much ado about nothing, or at least to require researchers to respond to an endless set of questions. At most, a checklist could be part of an on-line protocol, and researchers could simply indicate N/A for items that seemed irrelevant. The web site could include a brief explanation of each checklist item.


Recommendations to the Commission

Two basic recommendations are offered for consideration by the Commission:

  1. Provide, in the Common Rule, clear, separate definitions of privacy and confidentiality that are broad enough that researchers and IRBs can apply them to diverse research activities in different disciplines, and
  2. Recommend to OHRP the development of web-based educational materials, formatted much like Microsoft Word's "Help" menu, using the "Book," "Index," and "Find" methods of retrieval. Two web pages are recommended. A major web page would provide the knowledge needed to design research ethically and prepare an effective protocol. A smaller web page would guide IRBs in locating, organizing, and tailoring information to serve local needs (e.g., state and local laws, local informational resources, helpful professionals who might consult with researchers, useful institutional resources).

Improving the Common Rule by Redefining Privacy and Confidentiality

The complexity of privacy and confidentiality issues that arise in social and behavioral research, especially in field contexts, cannot be directly embraced in the Common Rule. Rather, privacy and confidentiality should be addressed in more comprehensive, useful ways both in the "Definitions" section and in the "Informed Consent" section of the Common Rule. The reader should also be referred to a web site that clarifies what it means to respect privacy and assure confidentiality.

Since the regulations are often the first and only things that new IRB members see, and since most institutional assurances are copied out of the regulations, the actual content of the regulations is significant. Since the current regulations handle the definitions of privacy and confidentiality in a most misleading way, it is essential that these definitions be made more useful and less confusing.

Privacy

The Common Rule does not define privacy per se, but defines private information as follows:

Private information includes information about behavior that occurs in a context in which an individual can reasonably expect that no observation or recording is taking place, and information which has been provided for specific purposes by an individual and which the individual can reasonably expect will not be made public (for example, a medical record) (45 CFR 46.102(f)(2)).

This statement is embedded in a larger context as follows:

46.102 Definitions

(f) Human subject means a living individual about whom an investigator (whether professional or student) conducting research obtains

  1. data through intervention or interaction with the individual, or
  2. identifiable private information. Intervention includes both physical procedures by which data are gathered (for example, venipuncture) and manipulations of the subject or the subject's environment that are performed for research purposes. Interaction includes communication or interpersonal contact between investigator and subject. Private information includes information about behavior that occurs in a context in which an individual can reasonably expect that no observation or recording is taking place, and information which has been provided for specific purposes by an individual and which the individual can reasonably expect will not be made public (for example, a medical record). Private information must be individually identifiable (i.e., the identity of the subject is or may readily be ascertained by the investigator or associated with the information) in order for obtaining the information to constitute research involving human subjects.

This is not a definition of privacy. Privacy should be defined separately, as 45 CFR 46.102(g). A comprehensive and useful definition of privacy would be:

(g) Privacy refers to persons and to their interest in controlling the access of others to themselves (e.g., via informed consent).

It is widely recognized (e.g., Beauchamp and Childress, 1994) that informed consent (and the right to withdraw at any time) is a major way in which research subjects control access to themselves. However, 45 CFR 46.116, General Requirements for Informed Consent, does not mention privacy. Those sensitive to privacy issues would recognize privacy as a concern under 46.116(2), which currently requires:

  • a description of any reasonably foreseeable risks or discomforts to the subject;

However, this concern should be made explicit. 45 CFR 46.116(2) should read:

  • a description of any reasonably foreseeable risks or discomforts to the subject, including possibly unwelcome seeking or presenting of information or experiences, i.e., possible invasions of privacy;

Because people are so accustomed to over-simplified notions of privacy, the recommended web-based educational document that OHRP might provide should include a range of well-chosen examples of privacy, such as the following:

  • A young child would prefer to have a parent present when asked sensitive questions, but a teenager has a different set of control interests and would prefer the parent to be absent.
  • A hidden video camera denies research participants the opportunity to control access to themselves. Subjects should be warned of the existence of the camera in the consent statement.
  • A discussion of childcare issues might be interesting and nonthreatening for most parents but might be deeply embarrassing for a homeless parent, prompting refusal to answer or evasive answers unless the questions and context were developed in consultation with experts in this area (e.g., social workers, persons who were previously homeless).
  • Oglala Sioux Indians consider it an invasion of privacy for one person to ask direct questions of another. They consider it appropriate for persons who have such questions to observe quietly and learn the answer by seeing what the other person does.

The examples offered should convey that subjects' ability to regulate the access of others to themselves depends on such factors as age, station in life, and culture. Populations vary with respect to their need for help in regulating access to themselves, and this should be taken into account in research planning, procedures, and formulation of the informed consent statement.

Many researchers think that privacy and the associated consent requirements are an impediment to research. The kind of static definition found in the current regulations only reinforces this impression. The regulations and accompanying web site examples should convey the relationship between contextual factors and privacy interests, and should convey that respect for privacy is good scientific practice. They should make it obvious that privacy concerns affect subjects' willingness to participate in research and to give honest answers.

Confidentiality

The Common Rule does not define confidentiality and seems to use the term somewhat interchangeably with privacy. This confusing and oversimplified language can be detrimental in research in which issues of personal privacy and access to data are vital to the protection of subjects and to subjects' willingness to participate and to provide candid responses.

A separate definition of confidentiality (as 45 CFR 46.102(h)) should be added. Following Boruch and Cecil (1979), this definition might be as follows:

(h) Confidentiality is an extension of the concept of privacy; it refers to data (some identifiable information about a person, such as notes or a videotape of the person) and to agreements about how data are to be handled in keeping with subjects' interest in controlling the access of others to information about themselves.

This definition should then be reflected in 45 CFR 46.116, the section on informed consent. Currently, in the presentation of the elements of informed consent, 45 CFR 46.116 (5) states:

(5) a statement describing the extent, if any, to which confidentiality of records identifying the subject will be maintained.

45 CFR 46.116 (5) does not distinguish between confidentiality and anonymity. In many cases data gathered in an identifiable form can and should be rendered anonymous. The procedures for doing so should be carefully planned and described, at least briefly, in the informed consent.

45 CFR 46.116 (5) also fails to distinguish between normal and extraordinary confidentiality concerns. With sensitive data, there may be unusual threats to confidentiality (e.g., subpoena, hacker break-in, theft, mandated reporting), and there may or may not be solutions to these problems. 45 CFR 46.116 (5) should be reworded to encompass these distinctions, as follows:

(5) a statement of whether and how the data will be rendered anonymous; or, a statement describing the conditions of confidentiality of identifiable data: who will have access to such information, what safeguards will prevent or reduce the likelihood of unauthorized access, and what unavoidable risks of disclosure may exist.

This more comprehensive informed-consent requirement reminds the researcher and IRB that there are many methods that can be used to prevent or reduce unauthorized access, and that some risks of disclosure are difficult to prevent entirely. It virtually leads the researcher and IRB to seek out some of the voluminous literature on this topic. It informs subjects of what they need to know concerning the confidentiality of their data.

The related web site material should emphasize that confidentiality is not an agreement not to disclose. Rather, it refers to any kind of agreement about disclosure.

Web-Based Educational Resources for IRBs, Researchers, and Teachers

A web-based educational resource is recommended that will guide ethical problem solving in research. It should not be offered as an official regulation or interpretation of regulations, but as a user-friendly educational resource that will challenge IRBs, researchers, teachers, and students to improve their ability to craft solutions to ethical and methodological problems. Much of the success of this resource will depend on the ability of each IRB to tailor it to the particular needs of its institution and to present it effectively. While the general web site would be educational, not regulatory, a smaller IRB web site would contain required actions for the IRB to take, as appropriate to its locale. That second, much smaller web site would guide IRBs in the development of resources for handling issues of privacy and confidentiality.

The main goals of the recommended web resource are: a) to present the most current knowledge concerning protection of privacy and confidentiality, and b) to ensure that the information presented in the general (large) web site is perceived by IRBs and researchers as an educational and informational resource, not as an interpretation or requirement of OHRP. The intelligent interpretation and use of the educational material in the general web site would be required of researchers by their IRB, but the material would be regarded as guidelines and not as rules to be applied slavishly.

To reiterate the argument presented earlier, the reasons for making this an educational and not a regulatory document, and for making sure it is perceived thus by IRBs and researchers are as follows:

  1. Acceptance of many more detailed or specific rules across 17 agencies and diverse research contexts would be limited.
  2. Opportunities for intelligent interpretation and deciding between principles or values in conflict would be diminished.
  3. Efforts required to follow a specific rule may be disproportionally great, costly, or even inappropriate, relative to the expected gain or results.

Administration of the Web Resource

To ensure that the general web site is perceived as educational and not regulatory, its contents (and possibly also the contents of the smaller IRB web site) should be the result of work commissioned from subject matter specialists and overseen by a standing committee including researchers, IRB specialists, and representatives of OHRP. The commissioned work, in turn, should be edited by this committee. The final draft of each set of elements prepared for incorporation into the two web sites should be put out (on the web) for IRBs and researchers to review and critique as they wish. The final edited documents should be designed and developed into two web sites formatted much like the Help menu of Microsoft Word. The web sites should be managed by a professional web master employed by the standing committee of experts.

To function as an evolving resource, responsive to new problems and information, both web sites must be frequently updated by a consultant and reviewed by the standing committee. Moreover, all users should be invited to submit suggested additions and modifications for consideration by the committee.

Some Recommended Elements of the (Small) Web Site for IRBs

  1. How to appraise the IRB's need for expertise in its members and outside consultants.
  2. How to structure workshops and curriculum for the major segments of researchers, tailored to the needs of their institution.
  3. How to select and develop needed institutional resources (e.g., consultants, counselors, health center staff, subject matter specialists) who can satisfy typical needs of some major sectors of the IRB's clientele.
  4. How to locate and enlist the cooperation of local specialists (e.g., school personnel, therapists, social workers, demographers, urban anthropologists, farm agents, etc.) who would be added to a local directory and serve typical needs of the IRB's clientele.
  5. How to help faculty who teach research courses to adapt portions of the web site for instructional purposes.
  6. How to bookmark or extract materials especially pertinent to the institution (e.g., pertinent state and local laws) and add them to an internal IRB web page or document.
  7. How to select and add new materials as they become relevant to the evolving goals of that IRB.
  8. How to communicate with the web master and consultants about issues on which the IRB needs assistance or clarification.

The General (Larger) Web Site

The recommended elements of the general web site would include virtually everything that IRB members, researchers, teachers of methodology, or student researchers might need to know about the nature of privacy and confidentiality, how to respect privacy and assure confidentiality, and how to handle unavoidable risks to confidentiality.

These elements should be presented at about the level of an upper-division methodology text, and should mirror and go beyond the material presented in this paper. An annotated bibliography should accompany each topic, and the cited materials should be available at the institution's library. It is beyond the scope of this paper to provide a detailed outline of all of the specific topics that might appear in the general web site; however, a summary of the main topics appears in Appendix A.

Additional Points to Consider

This paper raised some other kinds of issues that the Commissioners may wish to consider:

Legal Issues

The section on Regulations and Statutes raised some problems that the Commissioners may wish to resolve:

  1. Signed parental permission is difficult to obtain in some populations (e.g., non-English-speaking or illiterate parents), even though the parents would not object to their child's participation in research. It may be appropriate for legislators to consider permitting waiver of parental permission under these conditions, provided other mechanisms are in place to protect the children.

    45 CFR 46.408(c) states that parental permission may be waived for a subject population in which parental or guardian permission is not a reasonable requirement to protect the subjects (for example, neglected or abused children), provided other appropriate mechanisms for protecting the children are in place. To generalize this provision to permit research on children whose parents are functionally illiterate, a community consultation model might be employed to explore the attitudes and concerns of the parent population and to determine what safeguards are deemed appropriate. Particular attention would need to be given to the choice of community gatekeepers or leaders who would participate in this consultation, and who might then be responsible for confirming with parents that they understand what the research will entail and determining whether any of them object to their child's participation. This alternative to written and signed parental permission might be tested in a variety of communities where most of the parents are functionally illiterate, to determine its acceptability before such a provision becomes part of 46.408(c).

  2. The language concerning protections offered by certificates of confidentiality is rather imprecise and should be clarified. Researchers and IRBs need clear guidance and rules regarding the issuance and use of certificates of confidentiality, an explanation of what they do and do not protect, and what the reporting requirements are for child and elder abuse and neglect, given a certificate of confidentiality.
  3. Neither Subpart C nor Subpart D of 45 CFR 46 discusses special protections that should be given to incarcerated youths, many of whom have strained or nonexistent relationships with their parents. IRBs and researchers need to understand this population and the protections it needs. This is a complex topic that should be addressed by an expert in the field of research on juvenile justice.
  4. Because a warning about mandatory reporting of child or elder abuse serves researchers but not children, the Commission may wish to recommend that reporting requirements be altered for funded research on family processes deemed of great national importance. For example, the modified requirement might build into the research a provision for training in effective parenting or elder care, with follow-up supervision; a warning to this effect might produce better outcomes for science, society, and the families involved.

Making the Regulations More Accessible

The Common Rule contains much that is not relevant to most social researchers, e.g., how to organize and conduct an IRB, or how to do research on fetuses, pregnant women, prisoners, and so on. The entire Common Rule should appear somewhere on the general web site. However, of most relevance to social researchers are the elements of a protocol and of informed consent (ideally with web links to explanatory material elsewhere in the general web site). Additionally, Subpart D of 45 CFR 46 is relevant to social research on children (ideally also with links to explanatory material). These should be presented separately along with examples of well-planned, well-designed protocols and consent statements, including well-designed explanations to parents and parental permission forms. Tips on how to plan the protocol should be included, along with discussion of how to make the informed consent process one of effective communication that promotes comprehension, trust, and good decision-making by the prospective subjects.

Acknowledgments

I am grateful to the many colleagues who reviewed earlier drafts of this paper and contributed to the collection of privacy/confidentiality problems and solutions discussed herein. They contributed immeasurably to the quality of this paper; its deficiencies are due entirely to me. Ironically, I cannot name these generous people because I promised confidentiality in return for their disclosure of the IRB foibles, problems, and solutions that enrich this paper. Some readers would inevitably deduce, correctly as well as incorrectly, who said what. Special thanks go to the competent and conscientious staff of NBAC for their guidance throughout the development of this paper.

Appendix A: Some Recommended General Web Site Topics

  1. Explorations of the concepts of privacy, confidentiality, and anonymity and how they pertain to various specific contexts.
  2. Exploration of theories of privacy and how they guide researchers in asking the right questions about subjects' privacy interests and suggest sources of answers.
  3. Exploration of kinds of confidentiality-assuring techniques; details of their use; exercises in tailoring confidentiality assurance techniques to specific problems. Techniques for preventing deductive disclosure from quantitative data (an illustrative sketch follows this list). Techniques for masking identities of persons and places in qualitative data and for "ethical proofreading" of case study or ethnographic material so that accidental disclosure of identities is not damaging.
  4. Current updates on emerging issues of confidentiality, e.g., issues stemming from new modes of electronic communication, new safeguards, new privacy advocacy, policies and laws, and new electronic venues for conducting research.
  5. Research in public venues where people nevertheless interact with one another in an intimate or private way (e.g., some internet chat rooms) and where publication of research identifying the site could prove chilling to interaction or embarrassing to subjects.
  6. Summary and interpretation of regulations and laws (federal, state, and local) governing privacy and confidentiality of social research.
  7. Certificates of confidentiality and other protections of data: what they cover and how to use them (and any new emerging protections that become available).
  8. Responding to legal problems that actually arise in the course of one's research, such as subpoena of one's data or involvement in a situation that may mandate reporting.
  9. Approaches to evaluating what may be private to one's subject population and to assessing actual or imagined threats to confidentiality. Designing research that satisfies subjects' interest in controlling access to themselves and in safeguarding the confidentiality of data.
  10. Tips to researchers on organizing their careers so that they can readily generate the resources (information, networks, community ties, ethnographic knowledge, techniques for assuring confidentiality, etc.) required to conduct research successfully in ways fully respectful of privacy and confidentiality concerns.
  11. Issues of research on children, the elderly, and other vulnerable populations and their special privacy and confidentiality interests.
  12. Frequently asked questions and answers about 45 CFR 46, including Subparts A, B, C, and D, and about FERPA and PPRA.
  13. How to identify appropriate ethical standards for kinds of research that do not fit the usual social/behavioral research paradigms (e.g., oral history, ethnography, market research, research resembling investigative journalism). When does each kind of research require IRB review, and how are conflicts reconciled between the usual standards of research (e.g., anonymity of subjects, masking of location) and the standards of that discipline (e.g., full disclosure of that information)?
  14. How to resolve issues of responsibility to help subjects in dire need, when anonymity or promises of confidentiality constrain the ability of the researcher to locate or help those subjects.
  15. How to resolve issues of paying subjects when these are in conflict with the need to maintain confidentiality or anonymity.
  16. How to build mutually beneficial relationships with gatekeepers that are respectful of the privacy and confidentiality concerns of subjects, the organization, and the gatekeeper.
  17. How to plan for subsequent data sharing and audits of one's data.
  18. Data management issues.
  19. Understanding how many subjects, how many trials, etc., are needed for valid research and how this depends on the particular design and goal of the research (a brief power-analysis sketch also follows this list).
  20. Kinds of research methods, where they typically are used, and associated kinds of data analysis techniques. Ethical and other issues connected with each method and approaches to resolving those issues.
  21. Research on research ethics. When researchers are unsure what procedure best solves their ethical or scientific concerns, they can build a study within a study. In a given research context, they can then examine which of two or more different procedures produce the best results. For example, does anonymity produce more disclosure of undesirable behavior or a higher response rate than promises of confidentiality in a given kind of research project? Suggest topics of research on ethical issues from the perspective of the various relevant scientific disciplines. Mention journals that would be likely to publish these studies.
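
By way of illustration of the kind of hands-on material topic 3 envisions, the sketch below shows one simple way to check tabular data for deductive-disclosure risk by computing its k-anonymity, i.e., the size of the smallest group of respondents who share the same combination of quasi-identifiers. The column names, the toy data, the k = 5 threshold, and the use of the pandas library are assumptions made for this example, not part of this paper's recommendations.

    # Illustrative only: the column names, data, and k = 5 threshold are
    # hypothetical; pandas is assumed to be available.
    import pandas as pd

    def smallest_cell_size(df, quasi_identifiers):
        """Return k, the size of the smallest group of rows sharing the same
        combination of quasi-identifier values (k-anonymity)."""
        return int(df.groupby(quasi_identifiers).size().min())

    # Hypothetical survey extract: ZIP prefix, age band, and gender are the
    # quasi-identifiers an intruder might match against outside records.
    records = pd.DataFrame({
        "zip3":      ["606", "606", "945", "945", "945"],
        "age_band":  ["30-39", "30-39", "40-49", "40-49", "30-39"],
        "gender":    ["F", "F", "M", "M", "F"],
        "sensitive": [1, 0, 1, 1, 0],
    })

    k = smallest_cell_size(records, ["zip3", "age_band", "gender"])
    print("smallest cell size (k) =", k)
    if k < 5:  # a common, though arbitrary, release threshold
        print("Cells this small invite deductive disclosure; coarsen or "
              "suppress quasi-identifiers before release.")

In practice, a data custodian would repeat such a check after each round of coarsening (e.g., broader age bands, truncated geographic codes) or cell suppression until the smallest cell reaches an acceptable size.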
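
Similarly, for topic 19, the brief sketch below illustrates how required sample size depends on the assumed effect size for a simple two-group comparison. The specific values (conventional effect sizes, alpha = 0.05, power = 0.80) and the use of the statsmodels library are assumptions chosen for illustration; other designs (clustered samples, repeated measures, non-normal outcomes) call for different calculations.

    # Illustrative only: the effect sizes, alpha, and power are conventional
    # textbook values; statsmodels is assumed to be available.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    for effect_size in (0.2, 0.5, 0.8):  # small, medium, large (Cohen's d)
        n_per_group = analysis.solve_power(effect_size=effect_size,
                                           alpha=0.05, power=0.80)
        print("d = %.1f: about %d subjects per group" % (effect_size,
                                                         round(n_per_group)))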

Footnotes

  • 1 Many argue that the term research participant is more respectful than the term subject. For some purposes I agree. For the purposes of this paper, I prefer to use a term that reminds the reader that the person being studied typically has less power than the researcher and must be accorded the protections that render this inequality ethically acceptable.
  • 2 I am indebted to Dr. Joe Cecil and Jason Gilbert, Federal Judicial Center, for providing me with their detailed summary and analysis of these issues.
  • 3 A copy of state reporting laws may be obtained by writing to Dr. Seth C. Kalichman, Psychology Department, University of Chicago, 6525 North Sheridan Road, Chicago, IL 60626.
  • 4 I am indebted to Drs. Robert F. Boruch and Joe S. Cecil for their work in this area, and particularly for their seminal work Assuring the Confidentiality of Social Research Data (1979) which has been my main source for this part of the paper.
  • 5 The current items in 45 CFR 46.102(g)-(j) would be moved down and become items (i) through (l).