What are questionable research practices as reported by ECRs in STEMM in Australia?

 

Katherine Christian, Carolyn Johnstone, Jo-ann Larkins, Wendy Wright and Michael Doran

Katherine Christian, Federation University Australia
Carolyn Johnstone, Federation University Australia
Jo-ann Larkins, Federation University Australia
Wendy Wright, Federation University Australia
Michael R Doran, Queensland University of Technology

Early-career researchers (ECRs) across the world have long reported significant difficulties caused by lack of funding and consequent job insecurity, gender inequity, work/life imbalance, and poor or insufficient professional development. The overall picture from our research project about ECRs in STEMM fields in Australia is of people who love science but are employed in unsatisfactory workplaces and overwhelmed by job insecurity and its consequences. We investigated the workplace experiences of ECRs working in the sciences in universities and independent research institutes across Australia, collecting data through a national survey (n=658) and through eight interviews with women who had recently left the academic workplace for alternative careers.

As we previously described (Christian et al., 2020), a concerning 38% of ECRs reported questionable research practices by colleagues inside their institution, and 32% by colleagues outside their institution. While “questionable research practices” were not defined within the survey, and respondents were given no opportunity to expand in the context of this question, the term has been used to describe behaviours ranging from fraud to data exclusion and the rounding of p-values (John et al., 2012). Qualitative data collected from other questions provided insights into practices which give cause for concern. These quotes, which speak for themselves, give some indication of what our respondents identified as questionable research practices:

I have also encountered some antisocial behaviour among academics, such as senior staff who have attempted to “steal” work I am doing to present as their own. It’s cutthroat. (ECR A)

My supervisor is unethical and a scoundrel who makes this job terrible. She exists to feather her own nest and ECRs are a commodity to use to this end. (ECR B)

I’ve found that highly respected research groups often have less integrity than you’d initially thing (sic). QRPs [questionable research practices] are worryingly common, and engaged in to chase funding to conduct more QRP studies (ECR C)

Lack of funding and the need to ‘sell’ your research often leads to many researchers fabricating and embellishing data. This leads to the inability of genuine researchers to replicate findings, wasting precious time and resources, giving up and then their contracts not being renewed because the boss doesn’t get the 10 publications per year they demand. (ECR D)

I believe that the whole Academia environment is corrupted and has lost its true vision. The lack of funding is making researchers to sometimes make-up data to get grants or to publish meaningless papers just for the sake of raising the numbers. (ECR E)

In our national survey, 60% of STEMM ECRs reported they had been impacted by lack of support from supervisors, 33% by bullying and harassment based on power position, and 13% said they felt unsafe in the workplace (unexpectedly, 16% of men felt unsafe compared with 11% of women) (Christian et al., 2020). These comments encapsulate many of the issues which point to the poor workplace practices identified by our respondents:

The institutional work culture is a major concern (bullying, academic misconduct, workplace safety etc., which goes un-noticed) (ECR F)

I am currently looking outside academia to get away from the culture of harassment… it takes too much of a toll on my health… but I would stay in academia if I were to find a position that didn’t subject me to harassment by a supervisor. (ECR G)

Being yelled at by my supervisor on a regular basis, being yelled at by his students due to my supervisor lying to the students, being unable to lodge complaints as it’s made clear that I will not have my contract continued and will have difficulty finding another job without references if I lodge a complaint. (ECR H)

The themes which emerged from these data include ECRs feeling the need or wish to leave their jobs because of workplace stress related to job insecurity, poor institutional culture or harassment from supervisors. In parallel, we learnt why ECRs stay and tolerate these conditions: they love their research, their actual work. This puts them in a quandary about whether to stay or go and there is clear uncertainty about what to do next, either because there is nowhere to go or because the options are unpalatable.

If our government is to achieve its stated aim of making Australia one of the best places in the world in which to undertake innovation, science and research, and to maximise the spread of benefits to all Australians (Department of Industry, Innovation and Science, 2018), then we must take better care of the ECRs in STEMM fields who will form this future workforce. We must address a research culture in which questionable research practices, whatever form they take, are so prevalent, and work harder to change that culture and foster the high standards of research integrity called for in the Australian Code for the Responsible Conduct of Research. These practices do not have to be tolerated; instead, our research institutions must provide all staff, particularly ECRs, with safe avenues to report inappropriate behaviours – and follow up, every time, with appropriate action.

Limitations

As participants in the survey self-selected, it is possible we attracted more dissatisfied people to the study than is representative, or only people who had the time available to respond. Also, as the survey was long and conducted only in English, people from culturally and linguistically diverse backgrounds may be under-represented.

It is not possible to know the response rate to the invitations received by potential participants. As a consequence of the approval process required by the HREC, distribution of those invitations was usually not within our direct control and was instead either managed by a third party or achieved through directed social media recruitment. This process was reported briefly in Research Ethics Monthly (Christian et al., 2019).

Acknowledgements

Katherine Christian is supported by an Australian Government Research Training Program (RTP) Fee-Offset Scholarship through Federation University Australia. Michael Doran is supported by an NHMRC Fellowship (APP1130013).

References

Christian, K., Johnstone, C., Larkins, J. and Wright, W. (17 September 2019) The need to seek institutional approval to survey staff – was this a misunderstanding of the purpose of Guideline 2.2.13 in the National Statement on Ethical Conduct in Human Research? Research Ethics Monthly. https://ahrecs.com/human-research-ethics/the-need-to-seek-institutional-approval-to-survey-staff-was-this-a-misunderstanding-of-the-purpose-of-guideline-2-2-13-in-the-national-statement-on-ethical-conduct-in-human-research

Christian, K., Johnstone, C., Larkins, J., Wright, W. and Doran, M. R. (2020). Survey of Australian STEMM Early Career Researchers: Job insecurity and questionable research practices are major structural concerns. BioRxiv, 2020.02.19.955328. https://doi.org/10.1101/2020.02.19.955328

Department of Industry, Innovation and Science. (2018). Australia 2030: Prosperity through Innovation. Australian Government. https://www.industry.gov.au/data-and-publications/australia-2030-prosperity-through-innovation

John, L. K., Loewenstein, G. and Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science. https://doi.org/10.1177/0956797611430953

Contributors

Katherine Christian, Federation University Australia School of Arts, Mt Helen Campus, Ballarat, Victoria

Carolyn Johnstone, Federation University Australia School of Arts, Mt Helen Campus, Ballarat, Victoria

Jo-ann Larkins, Federation University Australia School of Science, Engineering and Information Technology, Gippsland Campus, Churchill, Victoria

Wendy Wright, Federation University Australia School of Health and Life Sciences, Gippsland Campus, Churchill, Victoria

Michael R Doran, Queensland University of Technology.

This post may be cited as:

Christian, K., Johnstone, C., Larkins, J., Wright W. and Doran, M. (29 July 2020) What are questionable research practices as reported by ECRs in STEMM in Australia? Research Ethics Monthly. Retrieved from: https://ahrecs.com/uncategorized/what-are-questionable-research-practices-as-reported-by-ecrs-in-stemm-in-australia/

The ethical petri-dish: recommendations for the design of university science curricula

 

Dr Jo-Anne Kelder, Senior Lecturer, Curriculum Innovation and Development, University of Tasmania, https://www.linkedin.com/in/jokelder/
Professor Sue Jones, Honorary Researcher, School of Natural Sciences, University of Tasmania,
Professor Liz Johnson, DVC of Education, Deakin University, https://www.linkedin.com/in/elizabeth-johnson-24292773/
Associate Professor Tina Acuna, ADL&T College of Sciences and Engineering, University of Tasmania, https://www.linkedin.com/in/tina-acuna-25a35965/

Ethics (thinking and practice) is intrinsic to the nature of science. Ethical practices within science-related professions are mandated by policies, frameworks, standards and cultural norms. A scientist should also consider the broader implications for society when applying scientific knowledge.

Does our laboratory start working to develop a vaccine for Covid-19, or continue working on that potential cure for childhood leukemia? What will happen to the endangered Giant Freshwater Lobster if we remodel the hydrology of that major river so farmers in North-West Tasmania can grow more potatoes? Should we approve the use of GM technology to develop Vitamin A-rich rice?

Science graduates must be equipped to contribute to such complex debates, and empowered to make scientific decisions within a sound ethical framework (Johnson, 2010).

The Science Standards Statement (Jones, Yates and Kelder, 2011), the national benchmark for bachelor-level science degrees in Australia, specifies that graduates will demonstrate a coherent understanding of science and be able to explain the role and relevance of science in society (TLO 1: Jones et al., 2011, p. 12). Furthermore, they will be equipped to understand and work within ethical frameworks, and will “have some understanding of their social and cultural responsibilities as they investigate the natural world” (TLO 5.3: Jones et al., 2011, p. 15).

The argument that there is ‘no space’ for ethics in the science curriculum is no longer valid (Booth and Garrett, 2004; McGowan, 2013). However, there remain significant barriers to the teaching and assessment of ethical knowledge, skills and capabilities in undergraduate science curricula. We summarise these as debate and dissent around what should be taught, who should teach ethical thinking, and how it should be taught and assessed.

It’s not just about plagiarism

Ethics in science falls into two broad categories:

  1. Ethics in the practice of science
  2. Ethics in the application of science.

Ethics in the practice of science relates to integrity in research management (including data collection, analysis and presentation), plagiarism and authorship. Ethics curricula must ensure students’ familiarity with relevant regulatory frameworks such as the National Statement on Ethical Conduct in Human Research. In professionally oriented or applied disciplines such as Agriculture and Environmental Science, students must also be prepared to work ethically in a business environment and to understand their ethical and legal obligations as workplace leaders (Botwright-Acuna and Able, 2016).

Ethics in the application of science requires a broader and deeper perspective: appreciating and accepting responsibility for the impacts of scientific work upon society (Evers, 2001; Schultz, 2014). Graduates need to be aware that the ethical frameworks within which science is practised are not static, but adapt as social norms change. They must understand how their personal ethical perspectives interact with, and may clash with, formally mandated frameworks, and be prepared to engage in debate around the ethical implications of applying discovery science in the real world. They must be prepared to defend ethical decisions and to appreciate that others may hold conflicting views. As Evers puts it: “the study of ethics should therefore be an integral part of the education and training of all scientists with the purpose of increasing future scientists’ ethical competence” (2001, p. 97).

Recommendation – that students are encouraged to debate and discuss ethical questions, and to appreciate that people will hold different points of view on them.

Teachers may need some training

Practising scientists who themselves operate within relevant ethical frameworks are best placed to guide students about ethics in the practice of science (Kabasenche, 2014). However, while some scientists have taken up the teaching challenge of including ethics explicitly in their curriculum, this is not yet mainstream (Booth and Garrett, 2004). Most science academics are not themselves formally trained in ethical thinking (Johansen and Harris, 2000) and may express legitimate concern that they are not best placed to design and teach curricula on ethics (van Leeuwen, Lamberts, Newitt and Errington, 2007).

Recommendation – that science faculties provide professional development and community of practice opportunities to teaching staff to ensure that they have the confidence, skills and knowledge to teach ethical practice within a science curriculum.

There is a strong argument for a collaborative, interdisciplinary approach, with both science academics and philosophically trained ethicists involved in teaching ‘science ethics’ (Kabasenche, 2014). The scientist contributes expertise in the relevant science and their understanding of the ethical practice of science, while the philosopher brings critical thinking skills and decision-making tools that support ethical understandings and analysis of relative consequences. For example, in The Responsible Scientist, Forge (2008) argues that responsibility in scientific work has implications beyond intended outcomes, and includes taking into account foreseen and foreseeable outcomes.

Recommendation – that science faculties pursue opportunities for collaborative, interdisciplinary design and delivery of ‘science ethics’ across the undergraduate science curriculum.

It’s not just for the first year students

Teaching ethics to science students must do more than ensure that first-year students are familiar with university policies on plagiarism and academic integrity (Botwright-Acuna and Able, 2016). Ethics must be an explicitly assessed component of the curriculum at each level of study, and overtly aligned to the core science curriculum. Assessment tasks must distinguish between students’ knowledge of relevant ethical frameworks and their ability to apply those frameworks in practice.

For example, an assessment task for third level Zoology students models an Animal Ethics application: students construct a scientific research question within an ethical framework, and justify that research in language accessible to lay people (Jones and Edwards, 2013). In the undergraduate course ‘Communities of Practice in Biochemistry and Molecular Biology’, students develop research skills alongside their capacity for ethical analysis of the impacts of science on society (Keiler et al., 2017) while in a subject on ‘Energy and Sustainability’, students develop a national energy plan that addresses equity issues as well as technical and political feasibility (McGowan, 2013). Schultz (2014) suggests several strategies for assessing Chemistry students’ knowledge of ethical thinking, such as writing a Code of Conduct for practising chemists.

Recommendation – that ethics is a compulsory and explicitly assessed component of a bachelor-level science curriculum, and that students are exposed to ethical thinking in the context of science from their first year onwards.

It’s everybody’s business

Good practice is a teaching team approach to curriculum design, delivery and scholarly evaluation (Kelder et al., 2017; TEQSA, 2018). A whole-of-curriculum approach involves team members meeting regularly to discuss and coordinate how the ethical implications of the scientific knowledge and practice being taught are connected; to ensure that ethical thinking is embedded at each curriculum level; and to scaffold and develop learning from introductory to assured level. At the broader level, the science curriculum must provide a framework within which students are supported to develop personal and professional responsibility for their learning and later professional life (Loughlin, 2013).

Recommendation – that the degree curriculum is discussed and agreed upon by the whole teaching team before curriculum design begins (and on an ongoing basis as the curriculum matures), to ensure that students’ learning builds coherently and developmentally and is assessed accordingly.

Recommendation – that scholarship which promotes and recommends content, delivery methods and, especially, effective assessment strategies for teaching ethics to science undergraduates is encouraged and rewarded.

References

Booth, J. M. and Garrett, J. M. (2004). Instructors’ practices in and attitudes toward teaching ethics in the genetics classroom. Genetics, 168(3), 1111-1117.

Botwright Acuña, T.L. and Able, A.J. (Eds.). (2016). Good Practice Guide: Threshold Learning Outcomes for Agriculture. Sydney, Australia: Office for Learning and Teaching. https://ltr.edu.au/resources/ID13_2982_Acuna_Guide_2016.pdf

Evers, K. (2001). Standards for ethics and responsibility in science: An analysis and evaluation of their content, background and function. International Council for Science, Paris.

Forge, J. (2008). The Responsible Scientist: A Philosophical Inquiry. University of Pittsburgh Press.

Johnson, J. (2010). Teaching Ethics to Science Students: Challenges and a Strategy. In Rappert, B. (Ed.), Education and Ethics in the Life Sciences (pp. 197–213). ANU E Press.

Jones, S. M. and A. Edwards (2013). Placing ethics within the formal science curriculum: a case study. In: Frielick, S. et al. (Eds.) Research and Development in Higher Education: the place of learning and teaching, 36 (pp 243-252). Auckland, New Zealand, 1-4 July 2013. http://herdsa.org.au/publications/conference-proceedings/research-and-development-higher-education-place-learning-and-21

Jones, S. M., Yates, B. F. and Kelder, J.-A. (2011). Learning and Teaching Academic Standards Project: Science Learning and Teaching Academic Standards Statement. Sydney: Australian Learning and Teaching Council. http://www.acds-tlcc.edu.au/science-threshold-learning-outcomes-tlos/science-tlos/

Kabasenche W. P. (2014). The Ethics of Teaching Science and Ethics: A Collaborative Proposal. Journal of Microbiology & Biology Education, 15(2), 135–138. https://doi.org/10.1128/jmbe.v15i2.841

Kelder, J.-A., Carr, A. R. and Walls, J. (2017). Evidence-based Transformation of Curriculum: a Research and Evaluation Framework. Paper presented at the 40th Annual Conference of the Higher Education Research and Development Society of Australasia (HERDSA), Sydney.

Keiler, K. C., Jackson, K. L., Jaworski, L., Lopatto, D. and Ades, S. E. (2017). Teaching broader impacts of science with undergraduate research. PLoS biology, 15(3), e2001318.

Loughlin, W. (2013). Good Practice Guide (Science) Threshold Learning Outcome 5: Personal and professional responsibility. http://www.acds-tlcc.edu.au/science-threshold-learning-outcomes-tlos/science-threshold-learning-outcomes-tlosscience-tlo-good-practice-guides/

McGowan, A. H. (2013). Teaching Science and Ethics to Undergraduates: A Multidisciplinary Approach. Science and Engineering Ethics, 19, 535–543.

National Health and Medical Research Council. (2007, updated 2018). National Statement on Ethical Conduct in Human Research. https://www.nhmrc.gov.au/about-us/publications/national-statement-ethical-conduct-human-research-2007-updated-2018

TEQSA (12 December 2018). “Guidance Note – Scholarship” Version 2.5. https://www.teqsa.gov.au/latest-news/publications/guidance-note-scholarship

van Leeuwen, B., Lamberts, R., Newitt, P. and Errington, S. (2012, October). Ethics, issues and consequences: conceptual challenges in science education. In Proceedings of The Australian Conference on Science and Mathematics Education.

This post may be cited as:

Kelder, J., Jones, S., Johnson, E. and Botwright-Acuna, T. (18 June 2020) The ethical petri-dish: recommendations for the design of university science curricula. Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/the-ethical-petri-dish-recommendations-for-the-design-of-university-science-curricula

The F-word, or how to fight fires in the research literature

 

Professor Jennifer Byrne | University of Sydney Medical School and Children’s Hospital at Westmead

 

At home, I am constantly fighting the F-word. Channelling my mother, I find myself saying things like ‘don’t use that word’, ‘not here’, ‘not in this house’. As you can probably gather, it’s a losing battle.

Research has its own F-words – ‘falsification’, ‘fabrication’, different colours of the overarching F-word, ‘fraud’. Unlike the regular F-word, most researchers assume that there’s not much need to use the research versions. Research fraud is considered comfortably rare, the actions of a few outliers. This is the ‘bad apple’ view of research fraud – that fraudsters are different, and born, not made. These rare individuals produce papers that eventually act as spot fires, damaging their fields, or even burning them to the ground. However, as most researchers are not affected, the research enterprise tends to just shrug its collective shoulders, and carry on.

But, of course, there’s a second explanation for research fraud – the so-called ‘bad barrel’ hypothesis – that research fraud can be provoked by poorly regulated, extreme pressure environments. This is a less comfortable idea, because this implies that regular people might be tempted to cheat if subjected to the right (or wrong) conditions. Such environments could result in more affected papers, about more topics, published in more journals. This would give rise to more fires within the literature, and more scientific casualties. But again, these types of environments are not considered to be common, or widespread.

But what if the pressure to publish becomes more widely and acutely applied? The use of publication quotas has been described in different settings as being associated with an uptick in numbers of questionable publications (Hvistendahl 2013; Djuric 2015; Tian et al. 2016). When publication expectations harden into quotas, more researchers may feel forced to choose between their principles and their (next) positions.

This issue has been recently discussed in the context of China (Hvistendahl 2013; Tian et al. 2016), a population juggernaut with scientific ambitions to match. China’s research output has risen dramatically over recent years, and at the same time, reports of research integrity problems have also filtered into the literature. In biomedicine, these issues again have been linked with publication quotas in both academia and clinical medicine (Tian et al. 2016). A form of contract cheating has been alleged to exist in the form of paper mills, or for-profit organisations that provide research content for publications (Hvistendahl 2013; Liu and Chen 2018). Paper mill services allegedly extend to providing completed manuscripts to which authors or teams can add their names (Hvistendahl 2013; Liu and Chen 2018).

I fell into thinking about paper mills by accident, as a result of comparing five very similar papers that were found to contain serious errors, errors which called into question whether some of the reported experiments could have been performed (Byrne and Labbé 2017). My colleague Dr Cyril Labbé and I are now knee-deep in analysing papers with similar errors (Byrne and Labbé 2017; Labbé et al. 2019), which suggest that a worrying number of papers may have been produced with some kind of undeclared help.

It is said that to catch a thief, you need to learn to think like one. So if I were running a paper mill, and wanted to hide many questionable papers in the biomedical literature, what would I do? The answer would be to publish papers on many low-profile topics, using many authors, across many low-impact journals, over many years.

In terms of available topics, we believe that the paper mills may have struck gold by mining the contents of the human genome (Byrne et al. 2019). Humans carry 40,000 different genes of two main types, the so-called coding and non-coding genes. Most human genes have not been studied in any detail, so they provide many publication opportunities in fields where there are few experts to pay attention.

Human genes can also be linked to cancer, allowing individual genes to be examined in different cancer types, multiplying the number of papers that can be produced for each gene (Byrne and Labbé 2017). Non-coding genes are known to regulate coding genes, so non-coding and coding genes can also be combined, again in different cancer types.

The resulting repetitive manuscripts can be distributed between many research groups, and then diluted across the many journals that publish papers examining gene function in cancer (Byrne et al. 2019). The lack of content experts for these genes, or poor reviewing standards, may help these manuscripts to pass into the literature (Byrne et al. 2019). And as long as these papers are not detected, and demand continues, such manuscripts can be produced over many years. So rather than having a few isolated fires, we could be witnessing a situation where many parts of the biomedical literature are silently, solidly burning.

When dealing with fires, I have learned a few things from years of mandatory fire training. In the event of a laboratory fire, we are taught to ‘remove’, ‘alert’, ‘contain’, and ‘extinguish’. I believe that these approaches are also needed to fight fires in the research literature.

We can start by ‘alerting’ the research and publishing communities to manuscript and publication features of concern. If manuscripts are produced to a pattern, they should show similarities in formatting, experimental techniques, language and/or figure appearance (Byrne and Labbé 2017). Furthermore, if manuscripts are produced in large numbers, they may appear simplistic, with thin justifications for studying individual genes and almost non-existent links between genes and diseases (Byrne et al. 2019). But most importantly, manuscripts produced en masse will likely contain mistakes, and these may constitute an Achilles heel that enables their detection (Labbé et al. 2019).

Acting on reports of unusual shared features and errors will help to ‘contain’ the numbers and influence of these publications. Detailed, effective screening by publishers and journals may detect more problematic manuscripts before they are published. Dedicated funding would encourage active surveillance of the literature by researchers, leading to more reports of publications of concern. Where these concerns are upheld, individual publications can be contained through published expressions of concern, and/or ‘extinguished’ through retraction.

At the same time, we must identify and ‘remove’ the fuels that drive systematic research fraud. Institutions should remove both unrealistic publication requirements, and monetary incentives to publish. Similarly, research communities and funding bodies need to ask whether neglected fields are being targeted for low value, questionable research. Supporting functional studies of under-studied genes could help to remove this particular type of fuel (Byrne et al. 2019).

And while removing, alerting, containing and extinguishing, we should not shy away from thinking about and using any necessary F-words. Thinking that research fraud shouldn’t be discussed will only help this to continue (Byrne 2019).

The alternative could be using the other F-word in ways that I don’t want to think about.

References

Byrne JA (2019). We need to talk about systematic fraud. Nature. 566: 9.

Byrne JA, Grima N, Capes-Davis A, Labbé C (2019). The possibility of systematic research fraud targeting under-studied human genes: causes, consequences and potential solutions. Biomarker Insights. 14: 1-12.

Byrne JA, Labbé C (2017). Striking similarities between publications from China describing single gene knockdown experiments in human cancer cell lines. Scientometrics. 110: 1471-93.

Djuric D (2015). Penetrating the omerta of predatory publishing: The Romanian connection. Sci Eng Ethics. 21: 183–202.

Hvistendahl M (2013). China’s publication bazaar. Science. 342: 1035–1039.

Labbé C, Grima N, Gautier T, Favier B, Byrne JA (2019). Semi-automated fact-checking of nucleotide sequence reagents in biomedical research publications: the Seek & Blastn tool. PLOS ONE. 14: e0213266.

Liu X, Chen X (2018). Journal retractions: some unique features of research misconduct in China. J Scholar Pub. 49: 305–319.

Tian M, Su Y, Ru X (2016). Perish or publish in China: Pressures on young Chinese scholars to publish in internationally indexed journals. Publications. 4: 9.

This post may be cited as:
Byrne, J. (18 July 2019) The F-word, or how to fight fires in the research literature. Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/the-f-word-or-how-to-fight-fires-in-the-research-literature

“Reminder about service options and an easy way to pay AHRECS,” we say… aware of how corporate sleazy that sounds

 

Dr Gary Allen, Prof. Mark Israel and Prof. Colin Thomson AM
Senior Consultants, AHRECS
  

Just in time for the end of the financial year (though we know many research institutions budget around calendar year), AHRECS has the capacity to receive payments by credit card. We thought this a good time to remind you of those of our services that lend themselves nicely to credit card payment.

In-meeting 30-minute professional development for HREC members ($900) – Workshops, briefings or guided discussion about your selected topic. An easy way to tick the HREC member training box with minimum interruption to the work of a busy committee. An experienced AHRECS team member will provide a PowerPoint with pre-recorded audio that can be played in a meeting (and retained for five years for viewing by absent and new members); the team member will phone or Zoom into the meeting for Q&A/discussion. If you wish, AHRECS can also record that component for your later use.

Access the new subscription area ($360) – Thank you to everyone who expressed interest in, and support for, the new in-house subscribers’ area. This is scheduled to go live in July/August. By subscribing, you will get access to an impressive (and growing) set of HRE and RI resources that are licensed under Creative Commons, so you can use them within your organisation as much as you want.

Bespoke webinar for your research community ($1500) – A one-hour webinar on a human research ethics or research integrity topic of your choice, tailored to your institution. The price allows for up to 200 attendees and provision of a recording for your later use.

3-hour orientation workshop for new RIAs ($2300) – Provide your new Research Integrity Advisers with a practical, topical and engaging orientation through this workshop.

Ten hours of on-call advice ($3400) – On-call advice can be used for both human research ethics and research integrity matters. We can offer advice on everything from review feedback on a difficult application, to commenting on a draft policy, to providing advice on a tricky question with which the committee has been struggling. In the research integrity space, we can suggest an appropriate investigation approach for an alleged breach, comment on an RI resource, or suggest references on a key topic. The purchased time can be used in 15-minute, 30-minute, 1-hour, 4-hour and 8-hour blocks.

Send an email to gary.allen@ahrecs.com if you have any questions.

The prices above exclude GST and a 2% credit card processing fee.
