During the fall of 2013 and spring of 2014, I traveled to numerous universities across the United States and England to conduct in-depth interviews with physicists as part of the Ethics among Scientists in International Context Study, a project led by my colleague Elaine Howard Ecklund at Rice University(1). The study sought to find out how physicists approach ethical issues related to research integrity in their day-to-day work.
My colleagues and I began our interviews with a relatively straightforward question: “What does it mean to you to be a responsible scientist in your role as a researcher?” For many scientists, responsibility in research is a black and white matter: don’t falsify, don’t fabricate, and don’t plagiarize. And if one looks to the literature, scholarship and policy also tend to focus on these black and white instances of misbehavior because they are unambiguous and deserving of stern sanctions.
As our research unfolded, Ecklund and I began to question whether a black and white view of misconduct is overly simplistic. From a sociological perspective, whether scientists reach consensus about the meaning of unethical conduct in science is debatable because the same behavior in a given circumstance may be open to different ethical interpretations based on the statuses of the stakeholders involved and the intended and actual outcomes of the behavior. Our research ultimately demonstrated that the line separating legitimate and illegitimate behavior in science tends to be gray, rather than black and white—a concept we refer to as ethical ambiguity.
For the purpose of illustration, consider a scenario in which a scientist receives funding for one project and then uses a portion of that money to support a graduate student on a study unrelated to the grant. Many scientists would view this practice as a black and white instance of unethical conduct. But some scientists we interviewed view this as an ethically gray scenario, indicating that the use of funds for reasons other than those specified in a grant is justifiable if it means supporting the careers of their students or keeping their lab afloat. In these and other circumstances, scientists cope with ambiguity through decisions that emphasize being good over the “right” way of doing things.
What strategies help resolve these and other ethically ambiguous scenarios?
Frameworks for ethical decision-making offer some, but in my view limited, help. Kantian deontological theories assert that one should follow a priori moral imperatives related to duty or obligation. A deontologist would argue, for example, that a scientist has an obligation to acknowledge the origins of her work. And policies regarding plagiarism have a law-like quality. But how far back in the literature should one cite prior work? Deontology does not help us much in this example. Another framework, consequentialism, would suggest that in an ethically ambiguous scenario, a scientist should select the action that has the best outcomes for the most people. But like other individuals, scientists are limited in their ability to weigh the outcomes of their actions (particularly as it relates to the long-term implications of scientific research).
One ethical decision-making framework, virtue ethics, does offer some help in resolving ambiguity. Virtue ethics recognizes that ethical decision-making requires consideration of circumstances, situational factors, and one’s motivations and reasons for choosing an action, not just the action itself. It poses the question, “what is the ethically good action a practically wise person would take in this circumstance?” For individual scientists, this may mean consulting with senior and trusted colleagues to think through such circumstances, which is always a valuable practice.
A pre-emptive strategy for helping scientists resolve ethically ambiguous scenarios is to create cultures in which ambiguity can be recognized and discussed. For their part, the physicists we spoke with do not view ethics training as an effective way to create such a culture. As one physicist we spoke with explained, “It’s the easy thing to say, oh make a course on it. Taking a physics course doesn’t make me a good physicist. Taking a safety course doesn’t make me safe. Taking an ethics course doesn’t make me ethical.”
There may be merit to this physicist’s point. Nevertheless, junior scientists must learn—likely through the watching, talking, and teaching that accompanies research within a lab—that the ethical questions scientists encounter are more likely to involve ambiguous scenarios where the appropriate action is unclear than scenarios related to fabrication, falsification, and plagiarism.
David R. Johnson, a sociologist, is an assistant professor of higher education at the University of Nevada, Reno, in the United States. His first book, A Fractured Profession: Commercialism and Conflict in Academic Science, is published by Johns Hopkins University Press.
This post may be cited as:
Johnson D. (2017, 21 June) Strategies for resolving ethically ambiguous scenarios. Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/strategies-resolving-ethically-ambiguous-scenarios
(1) (National Science Foundation grant # 1237737, Elaine Howard Ecklund PI, Kirstin RW Matthews and Steven Lewis, Co-PIs)