Research Ethics Monthly (ISSN 2206-2483)

Disaster Research and its Ethical Review

Disaster research ethics is a growing area of interest within the research ethics field. Given the lack of a universal definition of disasters, it should not be a surprise that disaster research ethics is defined in various ways. Early approaches focused on ethical issues in conducting research in the acute phase of disasters (O’Mathúna 2010). Because many of the ethical issues are similar, the field came to include humanitarian crises and emergencies. A recent review combined mental health research in natural disasters, armed conflicts and the associated refugee and internally displaced persons (IDP) settings (Chiumento et al. 2017). Each of these settings raises distinct ethical issues, as well as practical challenges for those ethically reviewing disaster research. The 2016 revision of the Council for International Organizations of Medical Sciences (CIOMS) research ethics guidelines included a section on disaster research (https://cioms.ch/wp-content/uploads/2017/01/WEB-CIOMS-EthicalGuidelines.pdf). This blog will highlight a few of the practical challenges and note some efforts to respond to these.

One issue is how some disasters happen suddenly, while research ethics review takes time. The 2016 CIOMS guidelines call for innovative approaches to research ethics review, including ways to pre-assess protocols so that they can be reviewed rapidly once a relevant disaster occurs. As committees develop ways to adapt to disaster research, other review practices can be examined to identify innovative approaches to the challenges.

A key ethical issue to address with disaster research is whether a particular project should be conducted at this time with these particular participants. In the most immediate phase of an acute disaster, resources and energy should be focused on search and rescue. Researchers could hinder this, or divert scarce resources. At the same time, data should be collected as soon as possible to contribute to the evidence base for first responders. Ethics review committees should ensure justifications are provided for why a project needs to be done during the acute phase. Questions also need to be asked about whether disaster survivors have more important needs than to participate in research. For example, some have questioned whether children who survive war should be asked to participate in research when there are few resources available to help them with the mental health challenges of surviving war (Euwema et al. 2008).

With the move towards a more evidence-based approach to humanitarian work, international and non-governmental organisations (NGOs) are increasingly engaging in research and other evaluation programmes. Some of these organisations may have little experience with research or research ethics, and hence need additional support in developing and conducting projects. Much debate has occurred over what ‘counts’ as research and is therefore required to undergo formal research ethics approval. Rather than asking if a project is research or not, it is more important to identify the ethical issues in the project and ensure they are being addressed as carefully and thoroughly as possible (Chiumento et al. 2017). Needs assessments, projects that monitor or evaluate programmes, public health surveillance, and many other activities raise ethical issues whether or not they are formal academic research studies. At the same time, not every project needs the same sort of detailed research ethics application as a randomised controlled trial of an experimental drug. Some sort of ethical evaluation should be conducted, and here again there is an opportunity to be innovative. Different formal and informal review mechanisms could be developed to support groups conducting different types of projects. The key concern should be that the ethical issues are being examined and addressed.

Also key here is that people in the communities from which participants will be sought are involved from the design stage of the project (O’Mathúna 2018). Too many ‘parachute projects’ have been conducted (some with ethical approval) whereby the project is designed completely by outsiders. Once everything has been decided, the team approaches the community, only to discover a lack of interest in participating or that certain ethical challenges have been overlooked. Research in other cultures, particularly in the midst of armed conflicts, is especially prone to such challenges. Review committees may need to encourage exploratory discussions between researchers and participant communities, or seek evidence of how such discussions have gone.

Unexpected ethical issues often arise in disaster research given the instability and complexity of its settings (O’Mathúna & Siriwardhana 2017). An approach where ethics review bodies give approval to projects and then have little or no engagement other than an annual report is especially inadequate in disasters. Researchers may be forced to make changes in fluid settings, or may encounter unexpected issues. Submitting amendments may not be practical or fast enough, when what is needed is advice and direction from those with research ethics expertise. Thus, initiatives are being developed to provide “on call” ethics advice.

This points to how disaster research often requires more support and protection for researchers than other types of research. Researchers may enter danger zones (natural or violent) and may see or learn of horrors and atrocities. They can be subjected to physical dangers or traumatised psychologically. In addition to the normal stresses of conducting research, these factors can lead to mistakes and even ethical corner-cutting. Therefore, review committees need to carefully investigate how the physical and mental well-being of researchers will be protected and supported.

These are some examples of how research ethics needs to go beyond approval processes to mechanisms that promote ethical decision-making and personal integrity during research. One such project in which I am involved is seeking insight from humanitarian researchers into the ethical issues experienced in the field (http://PREAportal.org). We are also conducting a systematic review of such issues and collecting case studies from researchers. The goal is to produce a practical tool to facilitate learning lessons from disaster researchers and promote ethical decision-making within teams.

The world is increasingly experiencing disasters and conflicts, and huge amounts of resources are put into responses. Some of these resources go towards evaluating disaster responses and developing evidence to support disaster responders. We can expect disaster research to increase, and to come before research ethics committees more often. It is therefore important that ethics committees prepare themselves to respond to the ethical challenges that disaster research raises.

References

Chiumento, A., Rahman, A., Frith, L., Snider, L., & Tol, W. A. (2017). Ethical standards for mental health and psychosocial support research in emergencies: Review of literature and current debates. Globalization and Health 13(8). doi: 10.1186/s12992-017-0231-y

Euwema, M., de Graaff, D., de Jager, A., & Kalksma-Van Lith, B. (2008). Research with children in war-affected areas. In: Research with Children, Perspectives and Practices, 2nd edition. Eds. Christensen, P. & James, A. Abingdon, UK: Routledge; 189-204.

O’Mathúna, D. (2010). Conducting research in the aftermath of disasters: Ethical considerations. Journal of Evidence-Based Medicine 3(2):65-75.

O’Mathúna, D. (2018). The dual imperative in disaster research ethics. In: SAGE Handbook of Qualitative Research Ethics. Eds. Iphofen, R. & Tolich M. London: SAGE; 441-454.

O’Mathúna, D., & Siriwardhana, C. (2017). Research ethics and evidence for humanitarian health. Lancet 390(10109):2228-9.

Declaration of interests

Dónal O’Mathúna has been involved in research ethics for over twenty years. He was chair of the Research Ethics Committee at Dublin City University (DCU) for six years. In addition to his joint position at DCU and The Ohio State University, he is Visiting Professor of Ethics in the European Master in Disaster Medicine, Università del Piemonte Orientale, Italy. His research interests focus on ethical issues in disasters, in particular disaster research ethics. He was Chair of the EU-funded COST Action (2012-2016) on Disaster Bioethics (http://DisasterBioethics.eu) and is the Principal Investigator on the R2HC-funded research project, Post-Research Ethics Analysis (http://PREAportal.org).

Contributor
Dónal O’Mathúna, PhD
Associate Professor, School of Nursing & Human Sciences, Dublin City University, Ireland
Associate Professor, College of Nursing, The Ohio State University, Columbus, Ohio, USA
Dónal’s DCU profile | donal.omathuna@dcu.ie
Twitter: @domathuna
http://BioethicsIreland.ie

This post may be cited as:
O’Mathúna D. (2018, 26 February) ‘Disaster Research and its Ethical Review’. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/disaster-research-ethical-review

How can we get mentors and trainees talking about ethical challenges?

When it comes to research integrity, the international community tends to focus on the incidence of research misconduct and the presumption that the remedy is more training in the responsible conduct of research (RCR). Unfortunately, published evidence largely shows that these perceptions are wrong. Specifically, formal training in courses and workshops is much less likely to be a factor in researcher behavior than what is observed and learned in the context of the research environment (Whitbeck, 2001; Faden et al., 2002; Kalichman, 2014).

These research findings should not be surprising. Most of an academic or research career is defined by actually conducting research and working with research colleagues. The idea that a single course or workshop will somehow insulate a researcher from unethical or questionable behavior, or arm them with the skills to deal with such behavior, would seem to be a hard case to make. That isn’t to say that there is no value in such training, but the possible impact is likely far less than what is conveyed by the research experience itself. With that in mind, the question is how, if at all, can research mentors be encouraged to integrate ethical discussions and reflections into the context of the day-to-day research experience?

With this as a challenge, we have been testing several approaches at UC San Diego in California to move conversations about RCR out of the classroom and into the research environment. With support from the US National Science Foundation, this project began with a 3-day conference of ~20 leaders in the field of research integrity (Plemmons and Kalichman, 2017). Our goal was to develop a curriculum for a workshop in which participating faculty would acquire tools and resources to incorporate RCR conversations into the fabric of the research environment. Based on consensus among the conference participants, a curriculum was drafted, refined with input from experts and potential users, and finalized for pilot testing. Following two successful workshops for faculty at UC San Diego, the curriculum was rolled out for further testing nationally with interested faculty.

The workshop curriculum focused on five strategies participating faculty might use with members of their research groups:

  • discussing a relevant professional code of conduct
  • creating a checklist of topics to be covered at specified times with all trainees
  • discussing real or fictional research cases defined by ethical challenges
  • creating individual development plans that define the roles and responsibilities of the mentor and trainees, and
  • developing a group policy on definitions, roles, and responsibilities for some dimension of practice particularly relevant to the research group.

In all cases, the goal is to create opportunities that make conversations about the responsible conduct of research an intentional part of the normal research environment.

The results of this project were encouraging, but still leave much to be done (Kalichman and Plemmons, 2017). Workshops were provided for over 90 faculty, who were strongly complimentary about the program and the approach. In surveys of the faculty and their trainees after the workshops, there were high levels of agreement that the five proposed strategies were feasible, relevant, and effective. However, while use of all five strategies was high post-workshop, we were surprised to find that trainees reported high levels of use pre-workshop as well. In retrospect, this should have been expected. Since the workshops were voluntary, it is likely that faculty who attended were largely those already positively disposed to discussing responsible conduct with their trainees. One question worth asking is whether repeating workshops for interested faculty only will have a cascading effect over time, drawing in increasing numbers of faculty and serving to shift the culture. It also remains to be tested whether these workshops would be useful if faculty were required to attend.

For those interested in implementing these workshops in their own institutions, the curriculum, template examples and an instructor’s guide are all available on the Resources for Research Ethics Education website at: http://research-ethics.org/educational-settings/research-context.

References

Faden RR, Klag MJ, Kass NE, Krag SS (2002): On the Importance of Research Ethics and Mentoring. American Journal of Bioethics 4(2): 50-51.

Kalichman M (2014): A Modest Proposal to Move RCR Education Out of the Classroom and into Research. Journal of Microbiology & Biology Education 15(2):93–95.

Kalichman MW, Plemmons DK (2017): Intervention to Promote Responsible Conduct of Research Mentoring. Science and Engineering Ethics. doi: 10.1007/s11948-017-9929-8. [Epub ahead of print]

Plemmons DK, Kalichman MW (2017): Mentoring for Responsible Research: The Creation of a Curriculum for Faculty to Teach RCR in the Research Environment. Science and Engineering Ethics. doi: 10.1007/s11948-017-9897-z. [Epub ahead of print]

Whitbeck C (2001): Group mentoring to foster the responsible conduct of research. Science and Engineering Ethics 7(4):541-58.

Contributors
Michael Kalichman – Director, Research Ethics Program, UC San Diego | University bio | mkalichman@ucsd.edu

Dena Plemmons | University of California, Riverside | University page

This post may be cited as:
Kalichman M. and Plemmons D. (2017, 21 December) ‘How can we get mentors and trainees talking about ethical challenges?’ Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/can-get-mentors-trainees-talking-ethical-challenges

Dealing with “normal” misbehavior in science: Is gossip enough?

Posted by Admin in Research Integrity on September 20, 2017

As scientists, whether in the natural or social sciences, we tend to be confident in the self-policing abilities of our disciplines to root out unethical behavior. In many countries, we have institutionalized procedures for dealing with egregious forms of misconduct in the forms of fabrication, falsification, and plagiarism (FFP).

But research is increasingly calling attention to more “everyday” forms of misconduct—modes of irresponsible (if not unethical) behavior, pertaining to how we conduct our research as well as our relationships with colleagues. These include, for example:

  • cutting corners and being sloppy in one’s research (which makes future replication difficult)
  • delaying reviews of a colleague’s work in order to beat them to publication
  • exploiting students
  • unfairly claiming authorship credit
  • misusing research funds
  • sabotaging colleagues, and so on.

Such behaviors don’t violate FFP, but nevertheless fall short of the professional standards we aspire to. They begin to shape the implicit norms we internalize about what it takes to become successful in our fields (i.e., the formal script may be that we are to give others their due credit, but “really” we know that winners need to play dirty). Further, such actions can foster experiences of injustice and exploitation that lead some of us to leave our professions altogether. They thus compromise the integrity of scientific research and can create the climate for more serious violations to occur.

Just because such forms of what De Vries, Anderson, and Martinson call “normal misbehavior” can’t be formally sanctioned, it doesn’t mean they go unnoticed. Rather, in the research that my colleagues and I conducted on scientists in several countries, we found such accounts to be commonplace. Why, then, the confidence in the self-policing abilities of our disciplines? The answer, we were surprised to find, was gossip.

Scientists regularly circulate information in their departments and subfields about those who violate scientific norms. Through such gossip, they try to warn one another about colleagues whose work one ought not to trust, as well as those with whom one should avoid working. The hope here is that the bad reputation generated by such gossip will negatively impact perpetrators and serve as a deterrent to others.

What we found, however, was that the same respondents would admit that many scientists in their fields managed to be quite successful in spite of a negative reputation. Some talked about stars in their disciplines who managed to regularly publish in top journals precisely because they cut corners, or managed to be highly prolific because they exploited students. Others feared that influential perpetrators could retaliate against challengers. Some others complained of “mafias” in their disciplines that controlled access to prestigious journals and grants. Still others didn’t want to develop a reputation as a troublemaker for challenging their colleagues.

Perhaps the strangest case we encountered was of a scientist at a highly reputed institution in India who was notorious for beating students with shoes if they made mistakes in the lab. Former students would try to warn incoming students through posters around campus, but this did little to hinder the flow of new students into the lab.

Our findings overall suggest that such gossip works as an effective deterrent only when targets of gossip are of lower status than perpetrators. For instance, gossip among senior scholars about the irresponsible behavior of a postdoc or junior faculty member can inhibit their hiring and promotion. However, the veracity of such gossip is hard to verify, and false rumors can destroy someone’s career. In one case we encountered, a scientist saw a colleague spread false gossip about a potential hire, but was unable to intervene in a timely manner to correct this rumor. Transgressors may also remain unaware of gossip, and thus may not be able to correct their behaviors. In cases where targets are of higher status, gossip seems little more than a means of venting frustration, with little effect on perpetrators. Overall, as a means of social control in the discipline, gossip is rather ineffective.

So why does all this matter?

The very prevalence of such gossip indicates that scientific communities still need to take more steps to improve the integrity of their organizations and fields, beyond simply sanctions for FFP. The content of such gossip should be important to leaders of scientific institutions because it can provide important insight into rampant forms of irresponsible behavior that erode the integrity of scientific institutions. Obviously, such gossip can’t simply be taken at face value; investigation is needed to weed out false rumors. Institutions need to develop better channels for reporting questionable behavior and need to regularly analyze such reports for patterns that warrant attention.

What’s most crucial is that institutional leaders prioritize creating a climate that fosters prevention and transparency, encourages speaking up about such issues, and provides safety from potential retaliation. These are among the best practices for protecting whistleblowers, as identified by the Whistleblower Protection Advisory Committee (WPAC) of the US Department of Labor. In addition to ethics training on issues related to FFP, the ongoing professionalization of scientists needs to include more overt discussion about

  • the implicit norms of success in the field
  • the prevalence and causes of burnout
  • how to productively address some of the more rampant forms of irresponsible behavior (such as the ones I listed earlier in this post), and
  • systemic issues, such as competitive pressures and structural incentives that enable the rationalization of irresponsible behavior.

If such measures are implemented, we can significantly improve the ethical climates of our institutions and disciplines; reduce some of the attrition caused by institutional climates that tolerate (and even reward) such “normal misbehavior”; and help prevent the more egregious scandals that shake the public’s trust in science.

References

Martinson, B. C., Anderson, M. S., & De Vries, R. (2005). Scientists behaving badly. Nature, 435(7043), 737-738.

Shinbrot, T. (1999). Exploitation of junior scientists must end. Nature, 399(6736), 521.

De Vries, R., Anderson, M. S., & Martinson, B. C. (2006). Normal misbehavior: Scientists talk about the ethics of research. Journal of Empirical Research on Human Research Ethics, 1(1), 43-50.

Vaidyanathan, B., Khalsa, S., & Ecklund, E. H. (2016). Gossip as Social Control: Informal Sanctions on Ethical Violations in Scientific Workplaces. Social Problems, 63(4), 554-572.

Whistleblower Protection Advisory Committee (WPAC). (2015). Best Practices for Protecting Whistleblowers and Preventing and Addressing Retaliation. https://www.whistleblowers.gov/wpac/WPAC_BPR_42115.pdf

Contributor
Dr. Brandon Vaidyanathan is Associate Professor of Sociology | The Catholic University of America | CUA Staff page | brandonv@cua.edu

This post may be cited as:
Vaidyanathan B. (2017, 20 September) ‘Dealing with “normal” misbehavior in science: Is gossip enough?’ Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/dealing-normal-misbehavior-science-gossip-enough

Strategies for resolving ethically ambiguous scenarios

During the fall of 2013 and spring of 2014, I traveled to numerous universities across the United States and England to conduct in-depth interviews with physicists as part of the Ethics among Scientists in International Context Study, a project led by my colleague Elaine Howard Ecklund at Rice University (1). The study sought to find out how physicists approach ethical issues related to research integrity in their day-to-day work.

My colleagues and I began our interviews with a relatively straightforward question: “What does it mean to you to be a responsible scientist in your role as a researcher?” For many scientists, responsibility in research is a relatively black and white question: don’t falsify, don’t fabricate, and don’t plagiarize. And if one looks to the literature, scholarship and policy also tend to focus on these black and white instances of misbehavior because they are unambiguous and deserving of stern sanctions.

As our research unfolded, Ecklund and I began to question whether a black and white view of misconduct is overly simplistic. From a sociological perspective, whether scientists reach consensus about the meaning of unethical conduct in science is debatable because the same behavior in a given circumstance may be open to different ethical interpretations based on the statuses of the stakeholders involved and the intended and actual outcomes of the behavior. Our research ultimately demonstrated that the line separating legitimate and illegitimate behavior in science tends to be gray, rather than black and white—a concept we refer to as ethical ambiguity.

For the purpose of illustration, consider a scenario in which a scientist receives funding for one project and then uses a portion of that money to support a graduate student on a study unrelated to the grant. Many scientists would view this practice as a black and white instance of unethical conduct. But some scientists we interviewed view this as an ethically gray scenario, indicating that the use of funds for reasons other than specified in a grant is justifiable if it means supporting the careers of their students or keeping their lab afloat. In these and other circumstances, scientists cope with ambiguity through decisions that emphasize being good over the “right” way of doing things.

What strategies help resolve these and other ethically ambiguous scenarios?

Frameworks for ethical decision-making offer some, but in my view limited, help. Kantian deontological theories assert that one should follow a priori moral imperatives related to duty or obligation. A deontologist would argue, for example, that a scientist has an obligation to acknowledge the origins of her work. And policies regarding plagiarism have a law-like quality. But how far back in the literature should one cite prior work? Deontology does not help us much in this example. Another framework, consequentialism, would suggest that in an ethically ambiguous scenario, a scientist should select the action that has the best outcomes for the most people. But like other individuals, scientists are limited in their ability to weigh the outcomes of their actions (particularly as it relates to the long-term implications of scientific research).

One ethical decision-making framework, virtue ethics, does offer some help in resolving ambiguity. Virtue ethics recognizes that ethical decision-making requires consideration of circumstances, situational factors, and one’s motivations and reasons for choosing an action, not just the action itself. It poses the question, “what is the ethically good action a practically wise person would take in this circumstance?” For individual scientists, this may mean consulting with senior and trusted colleagues; thinking through such circumstances with others is always a valuable practice.

A pre-emptive strategy for helping scientists resolve ethically ambiguous scenarios is to create cultures in which ambiguity can be recognized and discussed. For their part, the physicists we spoke with do not view ethics training as an effective way to create such a culture. As one physicist we spoke with explained, “It’s the easy thing to say, oh make a course on it. Taking a physics course doesn’t make me a good physicist. Taking a safety course doesn’t make me safe. Taking an ethics course doesn’t make me ethical.”

There may be merit to this physicist’s point. Nevertheless, junior scientists must learn—likely through the watching, talking, and teaching that accompanies research within a lab—that the ethical questions that scientists encounter are more likely to involve ambiguous scenarios where the appropriate action is unclear than scenarios related to fabrication, falsification, and plagiarism.

Contributor
David R. Johnson, a sociologist, is an assistant professor of higher education at the University of Nevada, Reno, in the United States. His first book, A Fractured Profession: Commercialism and Conflict in Academic Science, is published by Johns Hopkins University Press.
davidrjohnson@unr.edu

This post may be cited as:
Johnson D. (2017, 21 June) ‘Strategies for resolving ethically ambiguous scenarios’. Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/strategies-resolving-ethically-ambiguous-scenarios

(1) National Science Foundation grant #1237737; Elaine Howard Ecklund, PI; Kirstin RW Matthews and Steven Lewis, Co-PIs.
