Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS), ACN 101 321 555

‘Except as required by law’: Australian researchers’ legal rights and obligations regarding participant confidentiality

 

Anna Olsen, Research School of Population Health, ANU, and Julie Mooney-Somers, Centre for Values, Ethics and the Law in Medicine, University of Sydney
*Neither of us is a lawyer; our interpretations are those of social scientists and HREC members. Interested lawyers and legal scholars are encouraged to contribute!

Researchers’ promises of confidentiality are often easily and genuinely made. However, our experience in research ethics review (Julie through an NGO-run ethics review committee; Anna through formally constituted university and hospital human research ethics committees), in qualitative research and in teaching qualitative research ethics has led us to think about the limits of these promises.

Australian researchers generally rely on the National Statement (National Health and Medical Research Council, 2015) and Human Research Ethics Committees (HRECs) for guidance around ethical and legal conduct in research. For example, Chapter 4.6 in the National Statement notes that researchers may discover illegal activity and guides researchers and HRECs to consider what researchers might be obliged to disclose in a legal situation and how to best protect (and inform) participants of this threat to confidentiality.

The National Statement is currently under revision (National Health and Medical Research Council, 2016), and the review submitted for public consultation in late 2016 proposes additional information on “Disclosure to third parties of findings or results” in Section 3 of the National Statement. Here the NHMRC explicitly states: “There can be situations where researchers have a legal, contractual or professional obligation to divulge findings or results to third parties”. That is, researchers should concern themselves not only with the legal implications of revealing potentially illegal activity, but with any instance in which they may be asked to break participant confidentiality.

The recent review of the National Statement extends the NHMRC recommendations around potential data disclosure in a number of ways. It makes much more explicit that researchers (as opposed to HRECs or institutions) are responsible for understanding the risks to participant confidentiality: “researchers should be aware of situations where a court, law enforcement agency or regulator may seek to compel the release of findings or results”. Researchers are expected to anticipate legal risks to participant confidentiality by identifying: “(a) whether, to whom and under what circumstances the findings or results will be disclosed; (b) whether potential participants will be forewarned that there may be such a disclosure; (c) the risks associated with such a disclosure and how they will be managed; and (d) the rationale for communicating and/or withholding the findings or results and the relative benefits and/or risks to participants of disclosure/non-disclosure”. Finally, researchers should advise participants of legal risks to confidentiality and how they will be handled: “(a) have a strategy in place to address this possibility; (b) advise participants of the potential for this to occur; and (c) advise participants as to how the situation will be managed”.

For many researchers in health, legal risks are a vague reality and legal intervention a remote threat. They may feel confident that their research does not and will not uncover illegal activity, or that their data would simply be irrelevant to a legal case. Or they may feel confident that, by following guidelines, they have taken sufficient steps to protect their participants’ confidentiality (researchers working in illicit drug use, for example).

Many Australian HRECs articulate the NHMRC guidelines on legal risks of disclosure to third parties by requiring that researchers inform participants that any data collected during research will be kept confidential, “except as required by law”. In keeping with the ethical concept of informed consent, participants are thereby warned that researchers are not able to unconditionally offer confidentiality. It has become clear to us that the intention of this phrase, to flag the legal limits of confidentiality, is not well understood by researchers (Olsen & Mooney-Somers, 2014).

The National Statement details some aspects of human research that are subject to specific statutory regulation; however, it stresses that compliance with legal obligations is not within the scope of the National Statement: “It is the responsibility of institutions and researchers to be aware of both general and specific legal requirements, wherever relevant”. Moreover, the document directs that it is not the role of an HREC to provide legal advice. It is relatively rare for Australian HRECs to provide explicit guidance on the relevant legal obligations for researchers, including how these differ across jurisdictions, what protective strategies researchers could employ to better protect participant confidentiality, or how to best inform participants about the risks of legal action. (Some useful HREC-produced resources are Alfred Hospital Ethics Committee, 2010; QUT Office of Research Ethics and Integrity, 2016.) Criminology scholars have (unsurprisingly) considered these issues in their own field (Chalmers & Israel, 2005; Israel, 2004; Israel & Gelsthorpe, 2017; Palys & Lowman, 2014).

We believe there are real risks to participants, researchers and research institutions.

Recent international cases in which research dealing with illegal activity became subject to legal action include The Belfast Project/The Boston Tapes (BBC News, 2014; Emmerich, 2016; Israel, 2014) and Bradley Garrett’s ethnographic work with urban explorers (Fish, 2014; Times Higher Education, 2014; see also Israel & Gelsthorpe, 2017). On the whole, the legal action in these cases was foreseeable: they involved illicit activities, and the action was driven by law enforcement interest. In some instances, researchers took extensive steps to protect participant confidentiality. In other cases, the promise of absolute confidentiality seems a little naïve (and, in our opinion, perhaps negligent).

Perhaps of more concern are cases in which legal action was instigated by interested others, not law enforcement. Of particular interest to us are recent cases of tobacco companies using Freedom of Information laws in Australia to obtain research data from Cancer Council Victoria on young people’s attitudes to and use of tobacco, and an earlier attempt to seek data on adults from Cancer Council NSW (McKenzie & Baker, 2015; Schetzer & Medew, 2015). As these cases do not involve illegal activity, it is much less likely that researchers could have anticipated the specific legal actions that undermined participant confidentiality. The tobacco industry has taken similar actions in other countries (Hastings, 2015; McMurtrie, 2002).

Our point here is that the promise of confidentiality should never be casually made. Researchers have an ethical obligation to think through what “except as required by law” may mean for each particular research project. Although it has been argued elsewhere that, as professionals, researchers should be afforded the same participant confidentiality protections as doctors and lawyers (Emmerich, 2016), the current state of affairs is that research data are not (necessarily) safe from legal, contractual or professional obligations to divulge findings or results to third parties.

References:

Alfred Hospital Ethics Committee. (2010, Updated September 2016). Alfred Hospital ethics committee guidelines: Research that potentially involves legal risks for participants and researchers. Retrieved from https://www.alfredhealth.org.au/contents/resources/research/Research-involving-legal-risks.pdf

BBC News. (1 May 2014). What are the Boston tapes? Retrieved from http://www.bbc.com/news/uk-northern-ireland-27238797

Chalmers, R., & Israel, M. (2005). Caring for Data: Law, Professional Codes and the Negotiation of Confidentiality in Australian Criminological Research. Retrieved from http://crg.aic.gov.au/reports/200304-09.pdf

Emmerich, N. (9 December 2016). Why researchers should get the same client confidentiality as doctors. Retrieved from https://theconversation.com/why-researchers-should-get-the-same-client-confidentiality-as-doctors-69839

Fish, A. (23 May 2014). Urban geographer’s brush with the law risks sending cold chill through social science. Retrieved from https://theconversation.com/urban-geographers-brush-with-the-law-risks-sending-cold-chill-through-social-science-25961

Hastings, G. (31 August 2015). We got an FOI request from Big Tobacco – here’s how it went. Retrieved from https://theconversation.com/we-got-an-foi-request-from-big-tobacco-heres-how-it-went-46457

Israel, M. (2004). Strictly confidential? Integrity and the disclosure of criminological and socio-legal research. British Journal of Criminology, 44(5), 715-740.

Israel, M. (6 May 2014). Gerry Adams arrest: when is it right for academics to hand over information to the courts? Retrieved from https://theconversation.com/gerry-adams-arrest-when-is-it-right-for-academics-to-hand-over-information-to-the-courts-26209

Israel, M., & Gelsthorpe, L. (2017). Ethics in Criminological Research: A Powerful Force, or a Force for the Powerful? In M. Cowburn, L. Gelsthorpe, & A. Wahidin (Eds.), Research Ethics in Criminology and Criminal Justice: Politics, Dilemmas, Issues and Solutions. London: Routledge.

McKenzie, N., & Baker, R. (15 August 2015). Tobacco company wants schools survey for insights into children and teens. The Age. Retrieved from http://www.theage.com.au/national/tobacco-company-wants-schools-survey-for-insights-into-children-and-teens-20150819-gj2vto.html

McMurtrie, B. (8 February 2002). Tobacco companies seek university documents. Chronicle of Higher Education. Retrieved from http://www.chronicle.com/article/Tobacco-Companies-Seek/6959

National Health and Medical Research Council. (2015). National Statement on Ethical Conduct in Human Research (2007). Retrieved from https://www.nhmrc.gov.au/printpdf/book/export/html/51613

National Health and Medical Research Council. (2016). Public consultation on Section 3 (chapters 3.1 & 3.5), Glossary and Revisions to Section 5: National Statement on Ethical Conduct in Human Research (2007). Retrieved from https://consultations.nhmrc.gov.au/files/consultations/drafts/ns-section3-public-consultation.pdf

Olsen, A., & Mooney-Somers, J. (2014). Is there a problem with the status quo? Debating the need for standalone ethical guidelines for research with people who use alcohol and other drugs. Drug Alcohol Rev, 33(6), 637-642. doi:10.1111/dar.12140

Palys, T., & Lowman, J. (2014). Protecting research confidentiality: What happens when law and ethics collide. Toronto: Lorimer.

QUT Office of Research Ethics and Integrity. (10 November 2016). Participants and illegal activities. Retrieved from http://www.orei.qut.edu.au/human/guidance/illegal.jsp

Schetzer, A., & Medew, J. (20 August 2015). Cancer Council spends thousands fighting big tobacco over children’s survey data. The Sydney Morning Herald. Retrieved from http://www.smh.com.au/national/cancer-council-spends-thousands-fighting-big-tobacco-over-childrens-survey-data-20150820-gj3nh7.html

Times Higher Education. (5 June 2014). Place-hacker Bradley Garrett: research at the edge of the law. Retrieved from https://www.timeshighereducation.com/features/place-hacker-bradley-garrett-research-at-the-edge-of-the-law/2013717.article

Contributors

Anna Olsen is a Senior Lecturer at the Research School of Population Health, Australian National University. She leads a number of qualitative and mixed methods public health research projects, teaches qualitative research methods and supervises post-graduate students. Dr Olsen is an experienced member of formally constituted university and hospital human research ethics committees. https://researchers.anu.edu.au/researchers/olsen-phd-am

Julie Mooney-Somers is a Senior Lecturer in Qualitative Research in the Centre for Values, Ethics and the Law in Medicine, University of Sydney. She is the director of the Masters of Qualitative Health Research at the University of Sydney. An experienced qualitative researcher, teacher and supervisor, she has taught qualitative research ethics and sat on an NGO-run ethics review committee for six years. http://sydney.edu.au/medicine/people/academics/profiles/julie.mooneysomers.php and http://www.juliemooneysomers.com

This post may be cited as:
Olsen A, and Mooney-Somers J. (2017, 24 February) ‘Except as required by law’: Australian researchers’ legal rights and obligations regarding participant confidentiality. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/except-required-law-australian-researchers-legal-rights-obligations-regarding-participant-confidentiality

Review of the Australian Code for the Responsible Conduct of Research

 

The Australian Code for the Responsible Conduct of Research 2007 (the Code) is Australia’s premier research standard. It was developed by the government agencies that fund the majority of research in Australia, namely the National Health and Medical Research Council (NHMRC) and the Australian Research Council (ARC), in collaboration with the peak body representing Australian universities, Universities Australia (UA). The Code guides institutions and researchers in responsible research practices, promotes research integrity, and has broad relevance across all research disciplines.

The Code is currently under review.

A new approach for the Code has been proposed, informed by extensive consultation with the research sector and advice from expert committees. The Code has been streamlined into a principles-based document and will be supported by guides that provide advice about implementation, such as the first Guide to investigating and managing potential breaches of the Code.

NHMRC, ARC and UA recognise the importance of engaging with the Australian community, including research institutions, researchers, other funding bodies, academies and the public, to ensure the principles-based Code and supporting guides are relevant and practical. A public consultation strategy is an important part of any NHMRC recommendation or guideline development process.

As such, NHMRC on behalf of ARC and UA invites all interested persons to provide comments on the review. A webinar was held on 29 November 2016 to explain the new approach to the Code. You are invited to view this webinar (see link below) and can participate in the public consultation process by visiting the NHMRC Public Consultation website. Submissions close on 28 February 2017.

Further information on the review is available from the NHMRC Public Consultation website.

The contributor:

National Health and Medical Research Council (Australia)

This post may be cited as:
NHMRC (2017, 20 January) Review of the Australian Code for the Responsible Conduct of Research. Research Ethics Monthly. Retrieved from:
https://ahrecs.com/research-integrity/review-australian-code-responsible-conduct-research

Ethical use of visual social media content in research publications

 

At a research ethics workshop at the 2015 CSCW conference (Fiesler et al., 2015), researchers in our community respectfully disagreed about using public social media data for research without the consent of those who had posted the material. Some argued that researchers had no obligation to gain consent from each person whose data appeared in a public social media dataset. Others contended that, instead, people should have to explicitly opt in to having their data collected for research purposes. The issue of consent for social media data remains an ongoing debate among researchers. In this blog post, we tackle a much smaller piece of this puzzle, focusing on the research ethics but not the legal aspects of this issue: how should researchers approach consent when including screenshots of user-generated social media posts in research papers? Because analysis of visual social media content is a growing research area, it is important to identify research ethics guidelines.

We first discuss a few approaches to using user-generated social media images ethically in research papers. In a 2016 paper that we co-authored, we used screenshots from Instagram, Tumblr, and Twitter to exemplify our characterizations of eating disorder presentation online (Pater, Haimson, Andalibi, & Mynatt, 2016). Though these images were posted publicly, we felt uncomfortable using them in our research paper without consent from the posters. We used an opt-out strategy, in which we included content in the paper as long as people did not explicitly opt out. We contacted 17 people using the messaging systems on the social media site where the content appeared, gave them a brief description of the research project, and explained that they could opt out of their post being presented in the paper by responding to the message. We sent these messages in May 2015, and intended to remove people’s images from the paper if they responded before the paper’s final submission for publication five months later in October 2015. Of the 17 people we contacted, three gave explicit permission to use their images in the paper, and the remaining 14 did not respond. Though this was sensitive content due to the eating disorder context, it did not include any identifiable pictures (e.g. a poster’s face) or usernames. While we were not entirely comfortable using content from the 14 people who did not give explicit permission, this seemed to be in line with ethical research practices within our research community (e.g. Chancellor, Lin, Goodman, Zerwas, & De Choudhury, 2016, who did not receive users’ consent to use images but did blur any identifiable features). We ultimately decided that including the images did more good than harm, considering that our paper contributed an understanding of online self-presentation for a marginalized population, which could have important clinical and technological implications.

Another paper (Andalibi, Ozturk, & Forte, 2017) took a different approach to publishing user-generated visual content. Because the authors had no way of contacting posters, they instead created a few example posts themselves, with features similar but not identical to the images in the dataset, to communicate the type of images referenced in the paper. This is similar to what Markham (2012) calls “fabrication as ethical practice.”

This opt-out approach is only ethical in certain cases. For instance, it is not in line with the Australian National Statement on Ethical Conduct in Human Research (National Health and Medical Research Council, 2012), which we assume was not written with social media researchers as its primary audience. NHMRC’s Chapter 2.3 states that an opt-out approach is only ethical “if participants receive and read the information provided.” In a social media context, people may not necessarily receive and read information messaged to them. Additionally, researchers and ethics committees may not agree on whether or not these people are “participants” or whether such a study constitutes human subjects research. When using non-identifiable images, as we did in our study described above, and when the study’s benefit outweighs potential harm done to those who posted the social media content, we argue that an opt-out approach is appropriate. However, an opt-out approach becomes unethical when sensitive, personally-identifiable images are included in a research paper, as we discuss next.

While issues of consent when using social media content in research papers remain a thorny, ongoing discussion, in certain instances we believe researchers’ decisions are more clear-cut. If social media content is identifiable – that is, if the poster’s face and/or name appears in the post – researchers should either get explicit consent from that person, de-identify the image (such as by blurring the photo and removing the name), or use ethical fabrication (Markham, 2012). In particular, we argue strongly that in sensitive contexts, such as stigmatized identities or health issues, a person’s face and name should not be used without permission. As an example, suppose a woman posts a picture of herself using the hashtag #IHadAnAbortion in a public Twitter post. A researcher may argue that this photo is publicly available and thus also available to copy and paste into a research paper. However, this ignores the post’s contextual integrity (Nissenbaum, 2009): by taking the post out of its intended context (a particular hashtag on Twitter), the researcher fundamentally changes the presentation and the meaning of the post. Additionally, on Twitter the poster has the agency to delete[1] the post at her discretion, a freedom she loses once it becomes forever embedded in a research paper and all of the digital and physically distributed copies of that paper. Thus, we argue that when including identifiable social media data in papers, researchers should be obligated to receive explicit permission from the person who posted that content before including it.

[1] Though all tweets are archived by the Library of Congress and thus not fully deletable, they are not readily accessible by the public, or even by most researchers. Furthermore, Twitter’s Terms of Service require those who collect data to periodically check for and remove deleted tweets from their datasets, though it is not clear whether this applies to the Library of Congress (Twitter, n.d.).
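As an aside for readers who do publish de-identified screenshots, the redaction principle above can be sketched in a few lines. This is our own minimal illustration, not code from any of the studies discussed: it treats an image as a grid of greyscale values and flattens an identifiable region to its mean, irreversibly destroying detail. A real workflow would use an image library (e.g. Pillow) and would typically pixelate or heavily blur the region instead.

```python
def redact_region(pixels, top, left, height, width):
    """Return a copy of a 2D greyscale image (list of lists of ints)
    with one rectangular region flattened to its mean value,
    removing identifiable detail such as a face or a username."""
    region = [pixels[r][c]
              for r in range(top, top + height)
              for c in range(left, left + width)]
    mean = sum(region) // len(region)
    redacted = [row[:] for row in pixels]  # work on a copy; keep the original intact
    for r in range(top, top + height):
        for c in range(left, left + width):
            redacted[r][c] = mean
    return redacted

# A toy 4x4 "image" whose top-right 2x2 patch is the identifiable region.
img = [
    [10, 10, 200, 250],
    [10, 10, 220, 240],
    [10, 10,  10,  10],
    [10, 10,  10,  10],
]
out = redact_region(img, top=0, left=2, height=2, width=2)
```

The design point is that flat-filling (or strong pixelation) cannot be undone, whereas a light Gaussian blur can sometimes be partially reversed; if the goal is protecting a poster, err on the side of destroying information.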

References:

Andalibi, N., Ozturk, P., & Forte, A. (2017). Sensitive Self-disclosures, Responses, and Social Support on Instagram: The Case of #Depression. In Proceedings of the 20th ACM Conference on Computer-Supported Cooperative Work & Social Computing. New York, NY, USA: ACM. http://dx.doi.org/10.1145/2998181.2998243

Chancellor, S., Lin, Z., Goodman, E. L., Zerwas, S., & De Choudhury, M. (2016). Quantifying and Predicting Mental Illness Severity in Online Pro-Eating Disorder Communities. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (pp. 1171–1184). New York, NY, USA: ACM. https://doi.org/10.1145/2818048.2819973

Fiesler, C., Young, A., Peyton, T., Bruckman, A. S., Gray, M., Hancock, J., & Lutters, W. (2015). Ethics for Studying Online Sociotechnical Systems in a Big Data World. In Proceedings of the 18th ACM Conference Companion on Computer Supported Cooperative Work & Social Computing (pp. 289–292). New York, NY, USA: ACM. https://doi.org/10.1145/2685553.2685558

Markham, A. (2012). Fabrication as Ethical Practice. Information, Communication & Society, 15(3), 334–353. https://doi.org/10.1080/1369118X.2011.641993

National Health and Medical Research Council. (2012, February 10). Chapter 2.3: Qualifying or waiving conditions for consent. Retrieved December 13, 2016, from https://www.nhmrc.gov.au/book/national-statement-ethical-conduct-human-research-2007-updated-december-2013/chapter-2-3-qualif

Nissenbaum, H. (2009). Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press.

Pater, J. A., Haimson, O. L., Andalibi, N., & Mynatt, E. D. (2016). “Hunger Hurts but Starving Works”: Characterizing the Presentation of Eating Disorders Online. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (pp. 1185–1200). New York, NY, USA: ACM. https://doi.org/10.1145/2818048.2820030

Twitter. (n.d.). Developer Agreement & Policy — Twitter Developers. Retrieved December 13, 2016, from https://dev.twitter.com/overview/terms/agreement-and-policy

The contributors:
Oliver L. Haimson (University of California, Irvine)
Nazanin Andalibi (Drexel University)
Jessica Pater (Georgia Institute of Technology)

This post may be cited as:
Haimson O, Andalibi N and Pater J. (2016, 20 December) Ethical use of visual social media content in research publications. Research Ethics Monthly. Retrieved from:
https://ahrecs.com/uncategorized/ethical-use-visual-social-media-content-research-publications

We don’t need a definition of research misconduct

 

Responsibility for ensuring the integrity of the research record rests with a number of players – funding agencies, governments, publishers, journal editors, institutions that conduct research, and researchers themselves. Our responsibility for producing research that is honest and trustworthy exists from the very beginning of a research project and is ever present thereafter. If one of the players in the research ecosystem finds that research isn’t honest, or can’t or shouldn’t be trusted, then we have to take steps to remove it from the research record or stop it from getting there. We don’t need a definition of research misconduct in order to do that.

In fact, there isn’t a single agreed definition of research misconduct, and this is part of the problem. Resnik et al. describe this in their 2015 paper, which reviewed and categorised misconduct definitions from 22 of the top 40 research and development funding countries. They claim that the variation in research misconduct definitions might make it harder for potential complainants to raise a concern, because they can’t work out whether something might be misconduct in any particular jurisdiction. Similar research by Resnik et al. looked at research misconduct definitions in US universities and found that the majority go beyond the definition provided in US law, perhaps indicating that these universities recognise that more than falsification, fabrication and plagiarism can impact on the honesty and trustworthiness of the research record. A ‘back of the envelope’ review of Australian research misconduct policies paints a similar picture, with two broad clades – one that centres on research misconduct as a serious deviation from accepted practice, and another that requires misrepresentation. All of this means that saying Professor Y committed research misconduct doesn’t really mean much: it doesn’t tell us how the research is dishonest or untrustworthy, and it stops us from making our own assessment of the trustworthiness of the research.

Many definitions also require that it can be shown that the researcher responding to the allegation committed the act of research misconduct, however defined, deliberately or intentionally or with recklessness or negligence. This ‘mental fault’ element is used to distinguish those lapses in responsible research that are honest mistakes or accidental from deliberate, mischievous attempts to deceive the users of the research output, whether that is a journal article, lab meeting presentation or grant application. The inclusion of this mental fault element also focusses the attention of those considering complaints or serving on investigation panels on the minds of the ‘accused’ – the investigations very much become concentrated on whether Professor Y was really trying to be evil and not whether the research should be trusted and allowed to have impact.

We believe that this is the fundamental question a research integrity investigation should be considering – can we trust the research and would we be happy for it to have impact?

Consideration of mental fault (mens rea if you’re a lawyer) is important when considering what disciplinary action to take, but is best not part of the rubric when considering trustworthiness, accuracy or honesty of research.

Research conduct occurs on a spectrum, from excellent research conduct at one end to research misconduct at the other. It is not only deliberate or grossly negligent acts that cause us to question the honesty or trustworthiness of research. A range of behaviours impact on the integrity of research, and many of these are neither deliberate nor falsification, fabrication or plagiarism (FFP). Some are described in the seminal paper by Martinson et al., which reports the results of a survey of biomedical researchers. The most frequent ‘questionable research practices’ (QRPs) described in that paper include inadequate record keeping related to research projects (27.5% of researchers), ‘dropping observations or data based on gut feeling’ (15.3%) and ‘using inadequate or inappropriate research designs’ (13.5%). It is clear that these three QRPs will impact on the trustworthiness and accuracy of research findings, and their incidence is much greater than the 0.3% reported for ‘falsifying or cooking research data’. These and other QRPs fall outside many definitions of research misconduct, and so can be overlooked by institutions that are forced to, or choose to, focus on research misconduct as defined. This leaves a broad range of activities potentially unchecked, and research on the record that perhaps really shouldn’t be.

Removing the definition of research misconduct simplifies the landscape. Investigations won’t need to consider the motivation for a departure from accepted practice or breach, only whether the research can be trusted and should be allowed to have impact. Disciplinary action can still happen through other misconduct-related processes, and this is where deliberation and intent can and should be considered. A system like this already exists: the Canadian Tri-agency Framework for Responsible Conduct of Research does not define research misconduct but instead sets out very clearly articulated principles for research integrity. A breach of these principles can trigger an investigation, and consideration of deliberation or intent is not part of the framework. The absence of a definition has not stopped Canadian funding agencies taking appropriate action. Recently, the first disclosure of an investigation was made; it names the researcher responsible and provides detail about the nature of the breach and the action taken by the funding agency involved.

Research misconduct is not a well-defined term, but a better definition is not needed and is not the solution. We need to take action to protect the integrity of the research record and stop untrustworthy or dishonest research from reaching it. We can do that just as well or even better without narrowing the scope of these considerations.

References

David B. Resnik J.D.,Ph.D., Lisa M. Rasmussen Ph.D. & Grace E. Kissling Ph.D. (2015) An International Study of Research Misconduct Policies, Accountability in Research, 22:5, 249-266, DOI: 10.1080/08989621.2014.958218

David B. Resnik J.D., Ph.D., Talicia Neal M.A., Austin Raymond B.A. & Grace E. Kissling Ph.D. (2015) Research Misconduct Definitions Adopted by U.S. Research Institutions, Accountability in Research, 22:1, 14-21, DOI: 10.1080/08989621.2014.891943

Martinson, B. C., Anderson, M. S., & de Vries, R. (2005). Scientists behaving badly. Nature, 435, 737-738. doi:10.1038/435737a

Contributors
Paul M Taylor, RMIT University (bio) – paul.taylor@rmit.edu.au
Daniel P Barr, University of Melbourne (bio)- dpbarr@unimelb.edu.au

This post may be cited as:
Taylor P and Barr DP. (2016, 25 October) We don’t need a definition of research misconduct. Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/dont-need-definition-research-misconduct
