Can I use your answers anyway?

 

Dr Gary Allen
AHRECS Senior Consultant

Many national frameworks for human research ethics, such as Australia’s National Statement (2007, updated 2018), have respect as a core principle. An essential component of respect is obtaining the prior consent of potential participants.

The role of consent in ethical research can be traced back through the Belmont Report and the Declaration of Helsinki to the Nuremberg Code, and earlier still to philosophical, bioethical and clinical texts.

Recent egregious ethical breaches such as the Cambridge Analytica, OKCupid and Emotional Contagion cases highlight that consent problems in research are not just an issue for biomedical research, and not just a 20th-century concern.

Where national and/or institutional policies discuss consent and questionnaire-based research, they will generally indicate that completion and return of a survey is a valid expression of consent.

This is indeed reasonable, especially when it is important to conceal from the researcher who has participated in their research (e.g. where an academic is surveying their own students).

Most frameworks and guidance documents for the ethical conduct of human research indicate that participants should be able to freely withdraw from research without comment or penalty.

Indeed, this is again quite a reasonable position, given that genuine respect for our participants should include acknowledging that they must ordinarily be able to withdraw their consent without comment or penalty.

Solid ground thus far? Good, because now we’re approaching the conundrum that prompted us to write this post, dear readers. For online questionnaires, does this mean a participant can stop completing the survey whenever they want and simply not click the “submit” button? Will this mean that the data already entered is not collected? What if a participant changes her mind after submitting the data and then wants to withdraw her answers? If the survey is anonymous, consent and the submitted data cannot be withdrawn after submission, because the researchers will not be able to tell which data came from which individual.

The advent of online questionnaires enabled the resolution of some problems (which were largely online issues anyway) while presenting a further practical ethics challenge.

A1        Using cookies to reduce the likelihood that an individual completes a survey more than once.

A2        Enabling an individual to save their progress through the survey and complete it over more than one session.

For A1, researchers should ensure the cookie does not enable them to identify respondents and does not compile any previous or future web activity. Both matters must be explained in the consent material, with assurances provided on each.
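By way of illustration only, a minimal sketch of the A1 idea follows, assuming a Python/Flask survey page (the framework, routes and cookie name are our own hypothetical choices, not part of any guideline). The cookie holds nothing but a constant completion flag, so it cannot identify the respondent and records nothing about their other web activity.

from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/survey")
def survey():
    # The cookie's only content is a constant flag; it cannot identify the
    # respondent and is never linked to their answers.
    if request.cookies.get("survey_completed") == "1":
        return "Our records suggest this browser has already completed the survey."
    return "...render the survey form here..."

@app.route("/survey/submit", methods=["POST"])
def submit():
    # Store the answers with no cookie value attached to them, then set the
    # non-identifying flag so a repeat attempt from this browser is discouraged.
    response = make_response("Thank you for participating.")
    response.set_cookie("survey_completed", "1", max_age=60 * 60 * 24 * 90,
                        httponly=True, samesite="Lax")
    return response

Because cookies can be cleared or blocked, this only reduces the likelihood of duplicate completion (the modest claim made in A1), and the consent material should say as much.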

For A2, the consent material should explain how password information is saved and the degree to which it can be used to identify respondents.

However, as our scenario indicates, the interesting question arises when a respondent does not finish the survey.

What happens to the information already entered into the incomplete survey?

What are the wishes of the respondents as to what should happen with these data?

The answers to these questions also depend on why the survey wasn’t completed.

Was it a combination of the participant forgetting, not having time, losing interest or struggling to log in?

Alternatively, was there a reason the individual no longer wished to participate?

Regardless of the reason, what does the individual want to happen with the use of their data?

Simple answers here are not necessarily helpful. Assuming that respondents simply forgot (or similar) may not be accurate, and using their answers may be absolutely contrary to their wishes. By the same token, discarding potentially useful data merely because participants forgot to finish might be a significant loss – especially if the number of participants is already low.

One approach provided for in the National Statement could be used:

The relevant HREC could be asked to approve a waiver of the consent requirement (as per Chapter 2.3 of the National Statement) so as to enable the use of partially completed surveys, where they are accessible to the researchers; this would need to be made clear in the survey instructions.

While the opt-out approach (also discussed in Chapter 2.3 of the National Statement) might seem a promising strategy, the fact that the researchers cannot link a set of answers to an individual means that an individual’s decision to opt out could not be honoured.

Depending on why the individual did not complete the survey, the waiver-of-consent approach may not be ideal: it involves time and other resources that might be in short supply.

Proposals about the use of partially completed surveys should be discussed in the research ethics review application, the recruitment materials and consent materials.

This raises a related point: the reasons someone withdraws from a project might be of interest/importance to the researcher, their research centre, the research ethics review body or other researchers.

We propose the following strategy for a survey in which participants can be linked to their answers:

  • The consent material should discuss what will be done with the answers if the survey is only partially completed;
  • The revocation of consent process and form should be explained; and
  • The resource material for researchers and research ethics reviewers should provide a matrix that explains the treatment of survey responses (a sketch of such a matrix follows this list).
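As an illustration only, such a matrix might look like the following sketch; the categories and treatments are our invented examples under assumed institutional policy settings, not prescribed wording, and each institution would substitute its own.

# A hypothetical treatment matrix for a survey in which participants can be
# linked to their answers. The keys and outcomes are examples only.
TREATMENT_MATRIX = {
    # (survey submitted?, revocation form received?): treatment of the responses
    (True, False):  "Use all answers as consented",
    (True, True):   "Identify and destroy the withdrawn answers",
    (False, False): "Do not use the partially completed answers",
    (False, True):  "Follow the wishes recorded on the revocation form",
}

def treatment(submitted: bool, revocation_received: bool) -> str:
    """Look up the agreed treatment of a respondent's data."""
    return TREATMENT_MATRIX[(submitted, revocation_received)]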

You will find suggested wording for the consent material and revocation form in the subscribers’ area for institutions and individuals.

This mechanism uses an optional revocation of consent form. It is essential that participants are told the form is optional and that they can stop participating at any time without explanation, but also that completing the form would be very informative to the researchers. The subscribers’ area includes both suggested questions for the revocation form and the associated text for the consent material.

We suggest the revocation form would provide some further clarity about the matters above. Where a revocation form is not provided, it is perhaps prudent to conclude that those individuals do not want their partially completed survey to be used.

The same approach could be used for other kinds of research design where data is collected at more than one sitting, time point or session.

For surveys in which participants cannot be linked to their answers, we propose the following strategy:

  • The consent material should discuss what will be done with the answers if the survey is only partially completed;
  • The consent material could include an optional incomplete submission advice that provides researchers with information about why the survey was not completed, e.g. chosen from a dot-point list; and
  • The consent material should also clearly state that once submitted, data in completed surveys cannot be withdrawn.


This post may be cited as:
Allen, G. (30 March 2020) Can I use your answers anyway? Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/can-i-use-your-answers-anyway

Plain English communications and the PICF – and beyond

 

Bob Milstein
See below

For many of us, preparing the Participant Information and Consent Form (PICF) for a research project is an irksome, time-consuming and unexciting “hoop-jumping” task. Albeit an essential one.

Indeed, the National Statement shows how essential the PICF task is. In particular, the Statement’s guiding principle for researchers is that:

“… a person’s decision to participate in research is to be voluntary, and based on sufficient information and adequate understanding of both the proposed research and the implications of participation in it.” [1]

For the purposes of this blogpost, the emphasis is on the “understanding”.

The PICF provides the key avenue through which research participants are educated and informed ― though oral communications often supplement the document in important ways.

But to educate and inform the research participant, we need to do more than simply give them a lengthy document they find confusing, complex and perhaps impenetrable.

Rather, authors (or teams) who create a PICF need to do more ― they need to:

  • reflect on, and identify, factors that impede clear and concise communication; and
  • create a document that services the information needs (and sometimes the limitations) of the target readers — those readers include the potential research participant as well as the members of the ethics committee who scrutinise (and sometimes criticise) the document to determine its appropriateness for those participants.

Roadblocks to comprehension and ease of use

The roadblocks to generating a clear, concise and easy-to-read PICF are often:

  • the many topics that need to be covered ― as required by the National Statement;
  • the complexities of the project or of the underlying medical, technical, scientific etc issues;
  • the constraints of a – sometimes helpful — template. But even within a template, the writer has an opportunity – and an obligation — to ensure that the text inserted into the template is well-expressed and well-structured — and (most importantly) reader-focused; and
  • the language constraints imposed ― sometimes not so helpfully — by pharmaceutical companies or their legal advisors. Sometimes, that imposed language seems less concerned to inform the reader and more concerned to protect the sponsoring organisation.

For all of these reasons, PICFs can be long, complex, hard to read, and therefore unread.

These challenges are compounded by pressures ― actual and perceived ― that operate on PICF authors. For instance, many scientific writers:

  • under time and performance pressure, seek to cut and paste existing materials in the hope that a cobbled together PICF will do the job;
  • adopt an inflated and excessively formal writing style ― they do this because they wrongly equate formality with professionalism;
  • are concerned that an easier-to-read document might oversimplify (“dumb down”) important information, and generate inaccuracies; and
  • write in a way that works for them and their technical peers, but that ignores or forgets the key reading audience’s needs, priorities and (sometimes exceptionally importantly) limitations.

Reflecting on the key reading audiences, and using the principles of plain language communication to speak to those audiences

The key questions every writer must ask and answer are:

  1. Who am I writing to?
  2. Why am I writing to them? What do I want them to know, do, understand et cetera?

A PICF usually has two key reading audiences:

  1. members of an ethics committee; and
  2. more importantly, potential research participants.

Research in Australia consistently shows adult literacy rates to be low — and even lower when it comes to the issues of health and scientific literacy. These challenges to participant comprehension are even greater for a participant whose thought processes are influenced by fear, false beliefs, denial, anxiety and distress. [2]

Yet unlike the research participant, the writer of the PICF is hyper-literate. And massively informed about the topic ― indeed, they are likely to be as informed about the topic as anyone could be, given the state of the research.

Hyper-literate and highly informed authors struggle to “unburden” themselves of their assumptions around the audience-appropriateness and reader-friendliness of their writing. Most scientists think they are good, or very good, writers. So do most lawyers. Hah!

But unburden themselves PICF authors must. At all times, they need to focus on the information needs — and limitations — of the target reader, so that the participant can, with relative ease, understand:

  • How and why this research is relevant to them or their condition;
  • What problem the research is addressing;
  • What solution the researcher is seeking;
  • What it is they are testing; and
  • How the findings might help the potential participant, or others, with the relevant condition. That is, how the research might improve future care – its cost, complexity, frequency, efficiency et cetera.

Working towards a plain English PICF

For these reasons, we need to reflect on the principles of plain English communication to help readers work their way through the PICF. By doing so, we help satisfy the demands of the National Statement.

When talking about “plain English”, we rely on the internationally accepted definition developed and adopted by the International Plain Language Federation. [3]

“A communication is in plain language if its wording, structure, and design are so clear that the intended readers can easily find what they need, understand what they find, and use that information.”

A starting point: George Orwell

A good starting point on how to achieve a clear and reader-focused document is a famous essay by the novelist George Orwell entitled “Politics and the English Language”. Although he was writing to a general audience, many of Orwell’s observations are directly relevant to the writing of a PICF.

Among his key points:

Never use a metaphor, simile, or other figure of speech which you are used to seeing in print. These days, we say avoid clichés.

Never use a long word where a short one will do. Bernard Dixon, formerly the editor of the New Scientist, tells the story of a manuscript he received containing the following opening sentence:

“The mode of action of anti-lymphocytic serum has not yet been determined by research workers in this country or abroad.”

The author was outraged when he received the following revision from Mr Dixon:

“We don’t know how anti-lymphocytic serum works.”

Dixon says it took him 20 minutes of close textual analysis before he finally persuaded the author that the meaning of the sentence had not been altered, despite the fact that the shorter version was more direct, more readable and one third its original length.

https://www.newscientist.com/article/mg13718654-300-science-and-fiction-plain-words-please/

If it is possible to cut a word out, always cut it out. A first draft is almost never the most concise draft.

Never use the passive where you can use the active. Occasionally, the passive voice has a legitimate — and sometimes an important — role in scientific writing. But it can also be hard work for the target reader: wordy, pompous, unclear, confusing and sometimes deceptive. It is often overused (or, to use the active voice, “we often overuse it”; see, for instance, Passive Voice in Scientific Writing, https://cgi.duke.edu/web/sciwriting/index.php?action=passive_voice). For these reasons, many scientific journals actively encourage authors to use the active voice when submitting articles.

Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent. In a PICF, which often has a necessary and unavoidable degree of scientific/medical technicality, this can be hard to achieve. But sometimes, it might be helpful to supplement the necessarily technical text with additional text that walks the reader through the concept in ways that will work for them. And remember: many research participants might struggle with language that the researcher will take for granted — for example words like “positive”, “negative”, “lateral”, “terminal”, “ante”, “hyper”, “hypo”, “significant”, “natural”, “theory”, “monitor” etc.

Break any of these rules sooner than say anything outright barbarous. As Orwell acknowledges in this, his final, point, the language (and for that matter, structure and design) is there to be used, and the options for generating clear and reader focused text are limitless.

But whatever the approach, and whatever the setting, we must all reflect on the importance of generating text for our target readers that is not only accurate and comprehensive, but is also clear, concise and effective from the reader’s perspective. While these writing principles are clearly important in the writing of a PICF, they are also important in the wide range of settings where  researchers seek to inform, educate, engage and persuade their readers — including the general public, potential funding sources, policymakers and politicians.

Some Further Reading

Australia has for many decades played a leading role in the so-called plain language “movement”, particularly in connection with a number of important law reform initiatives. Currently, Australian plain language practitioner and advocate  Christopher Balmford chairs the Standards Committee of the International Plain Language Federation. In 2019, the Federation proposed to Standards Australia that it in turn propose a plain language standard to ISO. Both proposals were approved. ISO has established a committee, chaired by Balmford, to develop an optional, multi-language, plain language standard.  The first draft is due to be reviewed at a meeting in Bangor, Wales in June 2020.

Although Australia has done a lot of excellent work, some of the key resources around scientific writing come from other countries.

Here is a list of some of the key resources that might help with future PICF writing:

  • Writing about biomedical and health research in plain English; A guide for authors

http://www.access2understanding.org/wp-content/uploads/2014/11/Access-to-Understanding-writing-guidance_v1.pdf     

  • Simply put: a Guide for Creating Easy-to-Understand Materials (Centers for Disease Control and Prevention, United States)

www.cdc.gov/healthcommunication/ToolsTemplates/Simply_Put_082010.pdf

  • Everyday Words for Public Health Communication, May 2016 (USA)

https://www.cdc.gov/other/pdf/everydaywords-060216-final.pdf

Bob Milstein, Words and Beyond

Bob Milstein is a practising health lawyer and a member of an ethics committee.

He is also lead trainer in Words and Beyond, one of Australia’s leading providers of plain-language training, document rewriting, and cultural change (www.wordsandbeyond.com). He can be contacted on milstein@bigpond.net.au

Footnotes

[1] https://ahrecs.com/human-research-ethics/release-of-the-national-statement-on-ethical-conduct-in-human-research-2007-updated-2018-with-interview . See in particular Ch 2.2.1.

[2]  Australian Bureau of Statistics, Adult Literacy and Life Skills Survey 2006  https://www.abs.gov.au/AUSSTATS/abs@.nsf/Previousproducts/4228.0Main%20Features22006%20(Reissue)?opendocument&tabname=Summary&prodno=4228.0&issue=2006%20(Reissue)&num=&view=

[3] http://www.iplfederation.org/plain-language/

This post may be cited as:
Milstein, B. (6 March 2020) Plain English communications and the PICF – and beyond. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/plain-english-communications-and-the-picf-and-beyond

The Ethics and Politics of Qualitative Data Sharing

 

Mark Israel (AHRECS and Murdoch University) and Farida Fozdar (The University of Western Australia).

There is considerable momentum behind the argument that public data is a national asset and should be made more easily available for research purposes. In introducing the Data Sharing and Release Legislative Reforms Discussion Paper in September 2019, the Australian Commonwealth Minister for Government Services argued that proposed changes to data use in the public sector would mean that

Australia’s research sector will be able to use public data to improve the development of solutions to public problems and to test which programs are delivering as intended—and which ones are not.

Data reuse is seen as a cost-efficient use of public funds, reducing the burden on participants and communities. And the argument is not restricted to government. Journals, universities and funding agencies are increasingly requiring social scientists to make their data available to other researchers, and even to the public, in the interests of scientific inquiry, accountability, innovation and progress. For example, the Research Councils United Kingdom (RCUK) takes the benefits associated with data sharing for granted:

Publicly-funded research data are a public good, produced in the public interest; Publicly-funded research data should be openly available to the maximum extent possible.

In Australia, both the National Health and Medical Research Council (NHMRC) and the Australian Research Council (ARC) have adopted open access policies that apply to research funded by those councils. While the ARC policy only refers to research outputs and excludes research data and research data outputs, the NHMRC strongly encourages open access to research data.

And yet, several social researchers have argued that data sharing requirements, developed in the context of medical research using quantitative data, may be inappropriate for qualitative research. Their arguments rest on a mix of ethical, practical and legal grounds.

In an article entitled ‘Whose Data Are They Anyway?’, Parry and Mauthner (2004) recognised unique issues associated with archiving qualitative data. The main considerations are around confidentiality (is it possible to anonymise the data by changing details without losing validity?) and informed consent (can participants know and consent to all potential future uses of their data at a single point in time? And, alternatively, what extra burden do repeated requests for consent place on participants?).

There is also the more philosophical issue of the reconfiguration of the relationship between researchers and participants including moral responsibilities and commitments, potential violations of trust, and the risk of data misrepresentation. There are deeper epistemological issues, including the joint construction of qualitative data, and the reflexivity involved in preparing data for secondary analysis. As a result, Mauthner (2016) critiqued ‘regulation creep’ whereby regulators in the United Kingdom have made data sharing a moral responsibility associated with ethical research, when in fact it may be more ethical not to share data.

In addition, there is a growing movement to recognise the rights of some communities to control their own data. Based on the fundamental principle of self-determination, some Indigenous peoples have claimed sovereignty over their own data: ‘The concept of data sovereignty, … is linked with indigenous peoples’ right to maintain, control, protect and develop their cultural heritage, traditional knowledge and traditional cultural expressions, as well as their right to maintain, control, protect and develop their intellectual property over these.’ (Tauli-Corpuz, in Kukutai and Taylor, 2016:xxii). The goal is that its use should enhance self-determination and development.

To be fair to both the Commonwealth Minister and the RCUK, each recognises that data sharing should only occur prudently and safely, and acknowledges that the benefits of sharing need to be balanced against rights to privacy (the balance proposed in earlier Australian legislative proposals has already been subjected to academic critique). The challenge is to ensure that our understanding of how these competing claims should be assessed is informed by an understanding of the nature of qualitative as well as quantitative data, of how data might be co-constructed or owned, of the cultural sensitivity that might be required to interpret and present it, and of the damage that might be done as a result of misuse or misrepresentation.

Acknowledgements
This article draws on material drafted for Fozdar and Israel (under review).

References:

Fozdar, F. and Israel, M. (under review) Sociological ethics. In Mackay, D. and Iltis, A. (eds) The Oxford Handbook of Research Ethics. Oxford: Oxford University Press.

Kukutai, T. and Taylor, J. (Eds.) (2016) Indigenous data sovereignty: Toward an agenda (Vol. 38). Canberra: ANU Press.

Mauthner, N.S. (2016) Should data sharing be regulated? In van den Hoonard, W. and Hamilton, A. (eds) The Ethics Rupture: Exploring alternatives to formal research-ethics review. University of Toronto Press. pp.206-229.

Parry, O. and Mauthner, N.S. (2004) Whose data are they anyway? Practical, legal and ethical issues in archiving qualitative research data. Sociology, 38(1), 139-152.

This post may be cited as:
Israel, M. & Fozdar, F. (5 February 2020) The Ethics and Politics of Qualitative Data Sharing. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/the-ethics-and-politics-of-qualitative-data-sharing

The research use of online data/web 2.0 comments

 

Does it require research ethics review and specified consent?

Dr Gary Allen
AHRECS Senior Consultant

The internet is a rich source of information for researchers. On Web 2.0 we see extensive commentary on numerous life matters, which may be of interest to researchers in a wide range of (sub)disciplines. Research interest in these matters frequently prompts the following questions: Can I use that in my project? Hasn’t that already been published? Is research ethics review required? Is it necessary to obtain express consent for the research use?

It’s important to recognise that these questions aren’t posed in isolation. Cases like the OkCupid data scraping scandal, the Ashley Madison hack, Emotional Contagion, Cambridge Analytica and others provide a disturbing context. At a time when the use of the internet and social media is startlingly high (Nielsen 2019, Australian Bureau of Statistics 2018, and commentaries such as the WebAlive blog 2019), there is also significant distrust of the platforms people are using. Consequently, there are good reasons for researchers and research ethics reviewers to be cautious about the use of existing material for research, even if the terms and conditions of a site/platform specifically discuss research.

Like many ethics questions, there isn’t a single simple answer that is correct all the time. The use of some kinds of data for research may not meet the National Statement’s definition of human research. Use of other kinds of data may meet that definition but be exempt from review, and so not require explicit consent. Other uses of data that involve no more than low risk can be reviewed outside an HREC meeting, while the remainder will have to be considered at an HREC meeting.

AHRECS proposes a three-part test that can be applied to individual projects to determine whether a proposed use of internet data is human research and needs ethics review; it will also guide whether explicit, project-specific consent is required. If this test were formally adopted by an institution and its research ethics committees, it would provide a transparent, consistent and predictable way to judge these matters.

You can find a Word copy of the questions, as well as PNG and PDF copies of the flow diagram, in our subscribers’ area.

For institutions
https://ahrecs.vip/flow…
$350/year

For individuals
https://www.patreon.com/posts/flow…
USD10/month

 

For any questions email enquiry@ahrecs.com

Part One of this test is whether the content of a site or platform is publicly available. One component of this test is whether the researcher will be using scraping, spoofing or hacking of the site/platform to obtain information.

Part Two of the test relates to whether individuals have consented, whether they will be reasonably identifiable from the data and its proposed research use, and whether there are risks to those individuals. A component of this test is exploring whether an exemption from the consent requirement is necessary (i.e. as provided for by paragraphs 2.3.9–2.3.12 of the National Statement, and lawful under any privacy regulation that applies).

Part Three of the test relates to how the proposed project relates to the national human research ethics guidelines – the National Statement – and whether there are any matters that must be considered by a human research ethics committee.  For example, Section 3 of the National Statement (2007 updated 2018) discusses some methodological matters and Section 4 some potential participant issues that must be considered by an HREC.

Individually, any one of these parts could determine that review and consent are required; a project must satisfy all three parts of the test before it can be exempted from review.
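To make the logic concrete, here is a hypothetical sketch of how the three parts might be combined; the field names and outcome labels are our own shorthand, not the wording of the question set in the subscribers’ area, and an institution would adapt both to its own policy settings.

from dataclasses import dataclass

@dataclass
class ProposedUse:
    publicly_available: bool         # Part One: is the content publicly available?
    scraped_spoofed_or_hacked: bool  # Part One: would scraping, spoofing or hacking be used?
    reasonably_identifiable: bool    # Part Two: are individuals identifiable from the data/use?
    consent_or_lawful_waiver: bool   # Part Two: has consent been given, or does a lawful waiver apply?
    more_than_low_risk: bool         # Part Two: are the risks to individuals more than low?
    ns_section_3_or_4_matters: bool  # Part Three: are there matters an HREC must itself consider?

def review_pathway(use: ProposedUse) -> str:
    """Any one part can trigger review/consent; all three must be satisfied
    before a project could be exempted from review."""
    part_one_ok = use.publicly_available and not use.scraped_spoofed_or_hacked
    part_two_ok = ((not use.reasonably_identifiable or use.consent_or_lawful_waiver)
                   and not use.more_than_low_risk)
    part_three_ok = not use.ns_section_3_or_4_matters

    if part_one_ok and part_two_ok and part_three_ok:
        return "may be exempt from review (subject to institutional policy)"
    if use.ns_section_3_or_4_matters or use.more_than_low_risk:
        return "refer to a full HREC meeting"
    return "requires ethics review (a pathway outside a full HREC meeting may suffice)"

Even expressed this bluntly, the point is simply that the outcome turns on project-specific answers to all three parts rather than on a single blanket rule.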

Even if the tests indicate that review/consent is required, that doesn’t mean the research is ethically problematic, just that the project requires more careful consideration.

The implication of this is that not all research based upon online comments or social media posts can be exempted from review but, conversely, not all such work must be ethically reviewed.  The approach that should be taken depends upon project-specific design matters.  A strong and justifiable institutional process will have nuanced criteria on these matters.  Failing to establish transparent and predictable policies would be a serious lapse in an important area of research.

Booklet 37 of the Griffith University Research Ethics Manual now incorporates this three-part test.

In the subscribers’ area you will find a suggested question set for the three-part test, as well as a graphic overview of the work flow for the questions.

It is recommended that institutions adopt their own version of the test, including policy positions with regard to the use of hacked or scraped data, or the research use of material in a manner at odds with a site/platform’s rules.

References

Australian agency to probe Facebook after shocking revelation – The New Daily. Accessed 16/11/19 from https://thenewdaily.com.au/news/world/2018/04/05/facebook-data-leak-australia/

Australian Bureau of Statistics (2018) 8153.0 – Internet Activity, Australia, June 2018. Retrieved from https://www.abs.gov.au/ausstats/abs@.nsf/mf/8153.0/ (accessed 27 September 2019)

Chambers, C. (2014, 1 July) Facebook fiasco: was Cornell’s study of ‘emotional contagion’ an ethics breach? The Guardian. Accessed 16/11/19 from http://www.theguardian.com/science/head-quarters/2014/jul/01/facebook-cornell-study-emotional-contagion-ethics-breach

Griffith University (Updated 2019) Griffith University Research Ethics Manual (GUREM). Accessed 16/11/19 from https://www.griffith.edu.au/research/research-services/research-ethics-integrity/human/gurem

McCook, A. (2016 16 May) Publicly available data on thousands of OKCupid users pulled over copyright claim.  Retraction Watch. Accessed 16/11/19 from http://retractionwatch.com/2016/05/16/publicly-available-data-on-thousands-of-okcupid-users-pulled-over-copyright-claim/

Nielsen (2019, 26 July) TOTAL CONSUMER REPORT 2019: Navigating the trust economy in CPG. Retrieved from https://www.nielsen.com/us/en/insights/report/2019/total-consumer-report-2019/ (accessed 27 September 2019)

NHMRC (2007 updated 2018) National Statement on Ethical Conduct in Human Research. Accessed 17/11/19 from https://www.nhmrc.gov.au/about-us/publications/national-statement-ethical-conduct-human-research-2007-updated-2018

Satran, J. (2015 02 September) Ashley Madison Hack Creates Ethical Conundrum For Researchers. Huffington Post. Accessed 16/11/19 from http://www.huffingtonpost.com.au/entry/ashley-madison-hack-creates-ethical-conundrum-for-researchers_55e4ac43e4b0b7a96339dfe9?section=australia&adsSiteOverride=au

WebAlive (2019 24 June) The State of Australia’s Ecommerce in 2019 Retrieved from https://www.webalive.com.au/ecommerce-statistics-australia/ (accessed 27 September 2019).

Recommendations for further reading

Editorial (2018 12 March) Cambridge Analytica controversy must spur researchers to update data ethics. Nature. Accessed 16/11/19 from https://www.nature.com/articles/d41586-018-03856-4?utm_source=briefing-dy&utm_medium=email&utm_campaign=briefing&utm_content=20180329

Neuroskeptic (2018 14 July) The Ethics of Research on Leaked Data: Ashley Madison. Discover. Accessed 16/11/19 from http://blogs.discovermagazine.com/neuroskeptic/2018/07/14/ethics-research-leaked-ashley-madison/#.Xc97NC1L0RU

Newman, L. (2017 3 July) WikiLeaks Just Dumped a Mega-Trove of CIA Hacking Secrets. Wired Magazine. Accessed 16/11/19 from https://www.wired.com/2017/03/wikileaks-cia-hacks-dump/

Weaver, M. (2018, 25 April) Cambridge University rejected Facebook study over ‘deceptive’ privacy standards. The Guardian. Accessed 16/11/19 from https://www.theguardian.com/technology/2018/apr/24/cambridge-university-rejected-facebook-study-over-deceptive-privacy-standards

Woodfield, K (ed.) (2017) The Ethics of Online Research. Emerald Publishing. https://doi.org/10.1108/S2398-601820180000002004

Zhang, S. (2016, 20 May) Scientists are just as confused about the ethics of big-data research as you. Wired Magazine. Accessed 16/11/19 from http://www.wired.com/2016/05/scientists-just-confused-ethics-big-data-research/

Competing interests

Gary is the principal author of the Griffith University Research Ethics Manual (GUREM) and receives a proportion of license sales.

This post may be cited as:
Allen, G. (23 November 2019) The research use of online data/web 2.0 comments. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/the-research-use-of-online-data-web-2-0-comments
