ACN - 101321555 Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)

Plain English communications and the PICF – and beyond

Bob Milstein
See below

For many of us, preparing the Participant Information and Consent Form (PICF) for a research project is an irksome, time-consuming and unexciting “hoop-jumping” task, albeit an essential one.

Indeed, the National Statement shows how essential the PICF task is. In particular, the Statement’s guiding principle for researchers is that:

“… a person’s decision to participate in research is to be voluntary, and based on sufficient information and adequate understanding of both the proposed research and the implications of participation in it.” [1]

For the purposes of this blogpost, the emphasis is on “understanding”.

The PICF provides the key avenue through which research participants are educated and informed ― though oral communications often supplement the document in important ways.

But to educate and inform the research participant, we need to do more than simply give them a lengthy document they find confusing, complex and perhaps impenetrable.

Rather, authors (or teams) who create a PICF need to do more ― they need to:

  • reflect on, and identify, factors that impede clear and concise communication; and
  • create a document that services the information needs (and sometimes the limitations) of the target readers — those readers include the potential research participant as well as the members of the ethics committee who scrutinise (and sometimes criticise) the document to determine its appropriateness for those participants.

Roadblocks to comprehension and ease of use

The roadblocks to generating a clear, concise and easy-to-read PICF are often:

  • the many topics that need to be covered ― as required by the National Statement;
  • the complexities of the project, or of the underlying medical, technical and scientific issues;
  • the constraints of a ― sometimes helpful ― template. But even within a template, the writer has an opportunity, and an obligation, to ensure that the text inserted into the template is well-expressed, well-structured and (most importantly) reader-focused; and
  • the language constraints imposed ― sometimes not so helpfully ― by pharmaceutical companies or their legal advisors. Sometimes that imposed language seems less concerned to inform the reader and more concerned to protect the sponsoring organisation.

For all of these reasons, PICFs can be long, complex, hard to read, and therefore unread.

These challenges are compounded by pressures ― actual and perceived ― that operate on PICF authors. For instance, many scientific writers:

  • under time and performance pressure, seek to cut and paste existing materials in the hope that a cobbled together PICF will do the job;
  • adopt an inflated and excessively formal writing style ― they do this because they wrongly equate formality with professionalism;
  • are concerned that an easier-to-read document might oversimplify (“dumb down”) important information, and generate inaccuracies; and
  • write in a way that works for them and their technical peers, but that ignores or forgets the key reading audience’s needs, priorities and (sometimes exceptionally importantly) limitations.

Reflecting on the key reading audience/audiences, and using the principles of plain language communication to speak to those audiences

The key questions every writer must ask and answer are:

  1. Who am I writing to?
  2. Why am I writing to them? What do I want them to know, do, understand et cetera?

A PICF usually has two key reading audiences:

  1. members of an ethics committee; and
  2. more importantly, potential research participants.

Research in Australia consistently shows adult literacy rates to be low — and even lower when it comes to the issues of health and scientific literacy. These challenges to participant comprehension are even greater for a participant whose thought processes are influenced by fear, false beliefs, denial, anxiety and distress. [2]

Yet unlike the research participant, the writer of the PICF is hyper-literate. And massively informed about the topic ― indeed, they are likely to be as informed about the topic as anyone could be, given the state of the research.

Hyper-literate and highly informed authors struggle to “unburden” themselves of their assumptions around the audience-appropriateness and reader-friendliness of their writing. Most scientists think they are good, or very good, writers. So do most lawyers. Hah!

But unburden themselves PICF authors must. At all times, they need to focus on the information needs — and limitations — of the target reader, so that the participant can, with relative ease, understand:

  • How and why this research is relevant to them or their condition;
  • What problem the research is addressing;
  • What solution the researcher is seeking;
  • What it is they are testing; and
  • How the findings might help the potential participant, or others, with the relevant condition. That is, how the research might improve future care – its cost, complexity, frequency, efficiency et cetera.

Working towards a plain English PICF

For these reasons, we need to reflect on the principles of plain English communication to help readers work their way through the PICF. By doing so, we help satisfy the demands of the National Statement.

When talking about “plain English”, we rely on the internationally accepted definition developed and adopted by the International Plain Language Federation. [3]

“A communication is in plain language if its wording, structure, and design are so clear that the intended readers can easily find what they need, understand what they find, and use that information.”

A starting point: George Orwell

A good starting point on how to achieve a clear and reader-focused document is a famous essay by the novelist George Orwell entitled “Politics and the English Language”. Although he was writing to a general audience, many of Orwell’s observations are directly relevant to the writing of a PICF.

Among his key points:

Never use a metaphor, simile, or other figure of speech which you are used to seeing in print. These days, we say avoid clichés.

Never use a long word where a short one will do. Bernard Dixon, formerly the editor of the New Scientist, tells the story of a manuscript he received containing the following opening sentence:

“The mode of action of anti-lymphocytic serum has not yet been determined by research workers in this country or abroad.”

The author was outraged when he received the following revision from Mr Dixon:

“We don’t know how anti-lymphocytic serum works.”

Dixon says it took him 20 minutes of close textual analysis before he finally persuaded the author that the meaning of the sentence had not been altered, even though the shorter version was more direct, more readable and one third the original length.

https://www.newscientist.com/article/mg13718654-300-science-and-fiction-plain-words-please/
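Readability formulas give a rough, automatable sense of the difference an edit like Dixon’s makes. Below is a minimal Python sketch of the Flesch Reading Ease score; the vowel-group syllable counter is a crude stand-in for real syllabification, so treat the numbers as indicative only:

```python
import re

def count_syllables(word):
    # Crude heuristic: each run of consecutive vowels counts as one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch Reading Ease: higher scores are easier to read.
    Scores above roughly 60 approach 'plain English'; dense
    scientific prose often scores far lower."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

dense = ("The mode of action of anti-lymphocytic serum has not yet been "
         "determined by research workers in this country or abroad.")
plain = "We don't know how anti-lymphocytic serum works."

print(round(flesch_reading_ease(dense), 1))
print(round(flesch_reading_ease(plain), 1))
```

On these two sentences the shorter version scores markedly higher. A quick automated check like this can flag PICF passages that need rewriting, though it is no substitute for reading the text through a participant’s eyes.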

If it is possible to cut a word out, always cut it out. A first draft is almost never the most concise draft.

Never use the passive where you can use the active. Occasionally, the passive voice has a legitimate ― and sometimes important ― role in scientific writing. But it can also be hard work for the target reader: wordy, pompous, unclear, confusing and sometimes deceptive. It is often overused (or, to use the active voice, “we often overuse it”; see, for instance, Passive Voice in Scientific Writing, https://cgi.duke.edu/web/sciwriting/index.php?action=passive_voice). For these reasons, many scientific journals actively encourage authors to use the active voice when submitting articles.

Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent. In a PICF, which often has a necessary and unavoidable degree of scientific/medical technicality, this can be hard to achieve. But sometimes, it might be helpful to supplement the necessarily technical text with additional text that walks the reader through the concept in ways that will work for them. And remember: many research participants might struggle with language that the researcher will take for granted — for example words like “positive”, “negative”, “lateral”, “terminal”, “ante”, “hyper”, “hypo”, “significant”, “natural”, “theory”, “monitor” etc.

Break any of these rules sooner than say anything outright barbarous. As Orwell acknowledges in this, his final, point, the language (and for that matter, structure and design) is there to be used, and the options for generating clear and reader focused text are limitless.

But whatever the approach, and whatever the setting, we must all reflect on the importance of generating text for our target readers that is not only accurate and comprehensive, but is also clear, concise and effective from the reader’s perspective. While these writing principles are clearly important in the writing of a PICF, they are also important in the wide range of settings where researchers seek to inform, educate, engage and persuade their readers — including the general public, potential funding sources, policymakers and politicians.

Some Further Reading

Australia has for many decades played a leading role in the so-called plain language “movement”, particularly in connection with a number of important law reform initiatives. Currently, Australian plain language practitioner and advocate Christopher Balmford chairs the Standards Committee of the International Plain Language Federation. In 2019, the Federation proposed to Standards Australia that it in turn propose a plain language standard to ISO. Both proposals were approved. ISO has established a committee, chaired by Balmford, to develop an optional, multi-language, plain language standard. The first draft is due to be reviewed at a meeting in Bangor, Wales in June 2020.

Although Australia has done a lot of excellent work, some of the key resources around scientific writing come from other countries.

Here is a list of some of the key resources that might help with future PICF writing:

  • Writing about biomedical and health research in plain English: A guide for authors

http://www.access2understanding.org/wp-content/uploads/2014/11/Access-to-Understanding-writing-guidance_v1.pdf     

  • Simply put: a Guide for Creating Easy-to-Understand Materials (Centers for Disease Control and Prevention, United States)

www.cdc.gov/healthcommunication/ToolsTemplates/Simply_Put_082010.pdf

  • Everyday Words for Public Health Communication, May 2016 (USA)

https://www.cdc.gov/other/pdf/everydaywords-060216-final.pdf

Bob Milstein, Words and Beyond

Bob Milstein is a practising health lawyer and a member of an ethics committee.

He is also lead trainer in Words and Beyond, one of Australia’s leading providers of plain-language training, document rewriting, and cultural change (www.wordsandbeyond.com). He can be contacted on milstein@bigpond.net.au

Footnotes

[1] https://ahrecs.com/human-research-ethics/release-of-the-national-statement-on-ethical-conduct-in-human-research-2007-updated-2018-with-interview . See in particular Ch 2.2.1.

[2]  Australian Bureau of Statistics, Adult Literacy and Life Skills Survey 2006  https://www.abs.gov.au/AUSSTATS/abs@.nsf/Previousproducts/4228.0Main%20Features22006%20(Reissue)?opendocument&tabname=Summary&prodno=4228.0&issue=2006%20(Reissue)&num=&view=

[3] http://www.iplfederation.org/plain-language/

This post may be cited as:
Milstein, B. (6 March 2020) Plain English communications and the PICF – and beyond. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/plain-english-communications-and-the-picf-and-beyond

The Ethics and Politics of Qualitative Data Sharing

Mark Israel (AHRECS and Murdoch University) and Farida Fozdar (The University of Western Australia).

There is considerable momentum behind the argument that public data is a national asset and should be made more easily available for research purposes. In introducing the Data Sharing and Release Legislative Reforms Discussion Paper in September 2019, the Australian Commonwealth Minister for Government Services argued that proposed changes to data use in the public sector would mean that

Australia’s research sector will be able to use public data to improve the development of solutions to public problems and to test which programs are delivering as intended—and which ones are not.

Data reuse is seen as a cost-efficient use of public funds, reducing the burden on participants and communities. And the argument is not restricted to government. Journals, universities and funding agencies are increasingly requiring social scientists to make their data available to other researchers, and even to the public, in the interests of scientific inquiry, accountability, innovation and progress. For example, the Research Councils United Kingdom (RCUK) takes the benefits associated with data sharing for granted:

Publicly-funded research data are a public good, produced in the public interest; Publicly-funded research data should be openly available to the maximum extent possible.

In Australia, both the National Health and Medical Research Council (NHMRC) and the Australian Research Council (ARC) have adopted open access policies that apply to research funded by those councils. While the ARC policy only refers to research outputs and excludes research data and research data outputs, the NHMRC strongly encourages open access to research data.

And yet, several social researchers have argued that data sharing requirements, developed in the context of medical research using quantitative data, may be inappropriate for qualitative research. Their arguments rest on a mix of ethical, practical and legal grounds.

In an article entitled ‘Whose Data Are They Anyway?’, Parry and Mauthner (2004) recognised unique issues associated with archiving qualitative data. The main considerations are around confidentiality (is it possible to anonymise the data by changing details without losing validity?) and informed consent (can participants know and consent to all potential future uses of their data at a single point in time? and, alternatively, what extra burden do repeated requests for consent place on participants?).

There is also the more philosophical issue of the reconfiguration of the relationship between researchers and participants including moral responsibilities and commitments, potential violations of trust, and the risk of data misrepresentation. There are deeper epistemological issues, including the joint construction of qualitative data, and the reflexivity involved in preparing data for secondary analysis. As a result, Mauthner (2016) critiqued ‘regulation creep’ whereby regulators in the United Kingdom have made data sharing a moral responsibility associated with ethical research, when in fact it may be more ethical not to share data.

In addition, there is a growing movement to recognise the rights of some communities to control their own data. Based on the fundamental principle of self-determination, some Indigenous peoples have claimed sovereignty over their own data: ‘The concept of data sovereignty, … is linked with indigenous peoples’ right to maintain, control, protect and develop their cultural heritage, traditional knowledge and traditional cultural expressions, as well as their right to maintain, control, protect and develop their intellectual property over these.’ (Tauli-Corpuz, in Kukutai and Taylor, 2016:xxii). The goal is that its use should enhance self-determination and development.

To be fair to both the Commonwealth Minister and the RCUK, each recognises that data sharing should only occur prudently and safely, and acknowledges that the benefits of sharing need to be balanced against rights to privacy (the balance proposed in earlier Australian legislative proposals has already been subjected to academic critique). The challenge is to ensure that our understanding of how these competing claims should be assessed is informed by an understanding of the nature of qualitative as well as quantitative data, of how data might be co-constructed or owned, of the cultural sensitivity that might be required to interpret and present it, and of the damage that might be done as a result of misuse or misrepresentation.

Acknowledgements
This article draws on material drafted for Fozdar and Israel (under review).

References:

Fozdar, F. and Israel, M. (under review) Sociological ethics. In Mackay, D. and Iltis, A. (eds) The Oxford Handbook of Research Ethics. Oxford: Oxford University Press.

Kukutai, T. and Taylor, J. (Eds.) (2016) Indigenous data sovereignty: Toward an agenda (Vol. 38). Canberra: ANU Press.

Mauthner, N.S. (2016) Should data sharing be regulated? In van den Hoonaard, W. and Hamilton, A. (eds) The Ethics Rupture: Exploring alternatives to formal research-ethics review. University of Toronto Press. pp. 206-229.

Parry, O. and Mauthner, N.S. (2004) Whose data are they anyway? Practical, legal and ethical issues in archiving qualitative research data. Sociology, 38(1), 139-152.

This post may be cited as:
Israel, M. & Fozdar, F. (5 February 2020) The Ethics and Politics of Qualitative Data Sharing. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/the-ethics-and-politics-of-qualitative-data-sharing

The research use of online data/web 2.0 comments

Does it require research ethics review and specified consent?

Dr Gary Allen
AHRECS Senior Consultant

The internet is a rich source of information for researchers. On Web 2.0 platforms we see extensive commentary on numerous life matters, which may be of interest to researchers in a wide range of (sub)disciplines. That interest frequently prompts the following questions: Can I use that in my project? Hasn’t that already been published? Is research ethics review required? Is it necessary to obtain express consent for the research use?

It’s important to recognise that these questions aren’t posed in isolation. Cases like the OkCupid data scraping scandal, the Ashley Madison hack, the Facebook emotional contagion study, Cambridge Analytica and others provide a disturbing context. At a time when use of the internet and social media is startlingly high (Nielsen 2019; Australian Bureau of Statistics 2018; commentaries such as the WebAlive blog 2019), there is also significant distrust of the platforms people are using. Consequently, there are good reasons for researchers and research ethics reviewers to be cautious about the use of existing material for research, even if the terms and conditions of a site/platform specifically discuss research.

Like many ethics questions, there isn’t a single simple answer that is correct all the time. The use of some kinds of data for research may not meet the National Statement’s definition of human research. Other uses may meet that definition but be exempt from review, and so not require explicit consent. Uses that involve no more than low risk can be reviewed outside an HREC meeting, and the remainder must be considered at an HREC meeting.

AHRECS proposes a three-part test that can be applied to individual projects to determine whether a proposed use of internet data is human research and needs ethics review; it also guides whether explicit, project-specific consent is required. If this test were formally adopted by an institution and its research ethics committees, it would provide a transparent, consistent and predictable way to judge these matters.

You can find a Word copy of the questions, as well as PNG and PDF copies of the flow diagram, in our subscribers’ area.

For institutions
https://ahrecs.vip/flow…
$350/year

For individuals
https://www.patreon.com/posts/flow…
USD10/month

For any questions email enquiry@ahrecs.com

Part One of this test is whether the content of a site or platform is publicly available. One component of this test is whether the researcher will be using scraping, spoofing or hacking of the site/platform to obtain information.

Part Two of the test relates to whether individuals have consented, whether they will be reasonably identifiable from the data and its proposed research use, and whether there are risks to those individuals. A component of this test is exploring whether an exemption from the consent requirement is available (i.e. provided for by paragraphs 2.3.9-2.3.12 of the National Statement and lawful under any privacy regulation that applies).

Part Three of the test relates to how the proposed project relates to the national human research ethics guidelines – the National Statement – and whether there are any matters that must be considered by a human research ethics committee.  For example, Section 3 of the National Statement (2007 updated 2018) discusses some methodological matters and Section 4 some potential participant issues that must be considered by an HREC.

Individually, any one of these parts could determine that review and consent are required; a project must satisfy all three parts before it can be exempted from review.
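As a sketch only, the three parts might be wired together as follows. The question names and ordering are our illustrative reading of the test, not AHRECS’s official question set (which is in the subscribers’ area):

```python
from dataclasses import dataclass

@dataclass
class ProjectAnswers:
    # Part One: public availability and how the data is obtained
    data_publicly_available: bool
    uses_scraping_spoofing_or_hacking: bool
    # Part Two: consent, identifiability and risk to individuals
    individuals_identifiable_without_consent: bool
    more_than_low_risk: bool
    # Part Three: National Statement triggers (e.g. Sections 3 and 4)
    raises_hrec_only_matters: bool

def review_pathway(p):
    """Any single part can require review; all three parts must be
    satisfied before a project can be exempted."""
    if p.raises_hrec_only_matters:
        return "full HREC review"
    if p.uses_scraping_spoofing_or_hacking or p.more_than_low_risk:
        return "full HREC review"
    if not p.data_publicly_available or p.individuals_identifiable_without_consent:
        return "review outside an HREC meeting (low risk)"
    return "may be exempt from review"

# Public data, no scraping, no identifiable individuals, low risk:
print(review_pathway(ProjectAnswers(True, False, False, False, False)))
```

A committee adopting something like this would still need policy positions behind each question (for example, what counts as “publicly available”), which is exactly what the institutional adoption discussed below provides.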

Even if the tests indicate that review or consent is required, that doesn’t mean the research is ethically problematic, just that the project requires more careful consideration.

The implication of this is that not all research based upon online comments or social media posts can be exempted from review but, conversely, not all such work must be ethically reviewed.  The approach that should be taken depends upon project-specific design matters.  A strong and justifiable institutional process will have nuanced criteria on these matters.  Failing to establish transparent and predictable policies would be a serious lapse in an important area of research.

Booklet 37 of the Griffith University Research Ethics Manual now incorporates this three-part test.

In the subscribers’ area you will find a suggested question set for the three-part test, as well as a graphic overview of the work flow for the questions.

It is recommended that institutions adopt their own version of the test, including policy positions on the use of hacked or scraped data, and on the research use of material in a manner at odds with a site/platform’s rules.

References

Australian agency to probe Facebook after shocking revelation – The New Daily. Accessed 16/11/19 from https://thenewdaily.com.au/news/world/2018/04/05/facebook-data-leak-australia/

Australian Bureau of Statistics (2018) 8153.0 – Internet Activity, Australia, June 2018. Retrieved from https://www.abs.gov.au/ausstats/abs@.nsf/mf/8153.0/ (accessed 27 September 2019)

Chambers, C. (2014 01 July) Facebook fiasco: was Cornell’s study of ‘emotional contagion’ an ethics breach? The Guardian. Accessed 16/11/19 from http://www.theguardian.com/science/head-quarters/2014/jul/01/facebook-cornell-study-emotional-contagion-ethics-breach

Griffith University (Updated 2019) Griffith University Research Ethics Manual (GUREM). Accessed 16/11/19 from https://www.griffith.edu.au/research/research-services/research-ethics-integrity/human/gurem

McCook, A. (2016 16 May) Publicly available data on thousands of OKCupid users pulled over copyright claim.  Retraction Watch. Accessed 16/11/19 from http://retractionwatch.com/2016/05/16/publicly-available-data-on-thousands-of-okcupid-users-pulled-over-copyright-claim/

Nielsen (2019, 26 July) TOTAL CONSUMER REPORT 2019: Navigating the trust economy in CPG. Retrieved from https://www.nielsen.com/us/en/insights/report/2019/total-consumer-report-2019/ (accessed 27 September 2019)

NHMRC (2007 updated 2018) National Statement on Ethical Conduct in Human Research. Accessed 17/11/19 from https://www.nhmrc.gov.au/about-us/publications/national-statement-ethical-conduct-human-research-2007-updated-2018

Satran, J. (2015 02 September) Ashley Madison Hack Creates Ethical Conundrum For Researchers. Huffington Post. Accessed 16/11/19 from http://www.huffingtonpost.com.au/entry/ashley-madison-hack-creates-ethical-conundrum-for-researchers_55e4ac43e4b0b7a96339dfe9?section=australia&adsSiteOverride=au

WebAlive (2019 24 June) The State of Australia’s Ecommerce in 2019 Retrieved from https://www.webalive.com.au/ecommerce-statistics-australia/ (accessed 27 September 2019).

Recommendations for further reading

Editorial (2018 12 March) Cambridge Analytica controversy must spur researchers to update data ethics. Nature. Accessed 16/11/19 from https://www.nature.com/articles/d41586-018-03856-4?utm_source=briefing-dy&utm_medium=email&utm_campaign=briefing&utm_content=20180329

Neuroskeptic (2018 14 July) The Ethics of Research on Leaked Data: Ashley Madison. Discover. Accessed 16/11/19 from http://blogs.discovermagazine.com/neuroskeptic/2018/07/14/ethics-research-leaked-ashley-madison/#.Xc97NC1L0RU

Newman, L. (2017 3 July) WikiLeaks Just Dumped a Mega-Trove of CIA Hacking Secrets. Wired Magazine. Accessed 16/11/19 from https://www.wired.com/2017/03/wikileaks-cia-hacks-dump/

Weaver, M. (2018 25 April) Cambridge University rejected Facebook study over ‘deceptive’ privacy standards. The Guardian. Accessed 16/11/19 from https://www.theguardian.com/technology/2018/apr/24/cambridge-university-rejected-facebook-study-over-deceptive-privacy-standards

Woodfield, K (ed.) (2017) The Ethics of Online Research. Emerald Publishing. https://doi.org/10.1108/S2398-601820180000002004

Zhang, S. (2016 20 May) Scientists are just as confused about the ethics of big-data research as you. Wired Magazine. Accessed 16/11/19 from http://www.wired.com/2016/05/scientists-just-confused-ethics-big-data-research/

Competing interests

Gary is the principal author of the Griffith University Research Ethics Manual (GUREM) and receives a proportion of license sales.

This post may be cited as:
Allen, G. (23 November 2019) The research use of online data/web 2.0 comments. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/the-research-use-of-online-data-web-2-0-comments

Ethics, Security and Privacy – the Bermuda Triangle of data management?

Malcolm Wolski and Andrew Bowness
Griffith University

To manage sensitive research data appropriately, ethics, security and privacy requirements need to be considered. Researchers are traditionally familiar with ethics, but often have not considered the privacy and security pieces of the puzzle. Our reasons for making this statement are:

  • IT products used in research change rapidly
  • Legislation changes rapidly and there are jurisdictional issues
  • Most researchers are not legal or IT experts
  • No one teaches them enough basics to know what is risky behaviour

The recent revision to the Australian Code for the Responsible Conduct of Research (2018) on Management of Data and Information in Research highlights that it is not just the responsibility of a university to use best practice, but it is also the responsibility of the researcher. The responsible conduct of research includes within its scope the appropriate generation, collection, access, use, analysis, disclosure, storage, retention, disposal, sharing and re-use of data and information. Researchers have a responsibility to make themselves aware of the requirements of any relevant codes, legislation, regulatory, contractual or consent agreements, and to ensure they comply with them.

It’s a complex world

However, the environment researchers operate in is becoming increasingly complex. First, privacy legislation depends on the jurisdiction of the participants. For example, a research project involving participants in Queensland is affected not only by the Australian Privacy Act but also by the Queensland version (Information Privacy Act 2009 (Qld)) and, if a participant or collaborator is an EU citizen, by the General Data Protection Regulation (EU GDPR).

Secondly, cybersecurity and information security activities in universities have increased dramatically in recent times because of publicised data breaches and the impact of data breach legislation. If your research involves foreign citizens, you may also find foreign legislation impacting the type of response required.

Thirdly, funding agencies, such as government departments, are increasingly specifying security and privacy requirements in tender responses and contracts.

These are having an impact on research project governance and practices, particularly for projects where the researcher has identified they are working with sensitive data. While the conversation typically focuses on data identified under the privacy acts as sensitive (e.g. Personally Identifiable Information (Labelled) under the Australian Privacy Act), researchers handle a range of data they may wish to treat as sensitive, whether for contractual reasons (e.g. participant consent, data sharing agreements) or for other reasons (e.g. ethical or cultural).

We have noticed an increasing trend within institutions where researchers are being required to provide more information on how they manage data as specified in a proposal or in a data sharing agreement. This typically revolves around data privacy and security, which is different from the ethics requirements.

What do “security” and “privacy” mean to the practitioner?

IT security is mostly about minimising attack points through process, or by using IT solutions to prevent or minimise the impact of hostile acts, or alternatively to minimise the impact of misadventure (e.g. leaving a laptop on a bus). Data security sits more in the sphere of IT than of researchers. This is reflected in which software products, systems and storage are “certified” as safe for handling and managing data classified as sensitive. IT usually also provides the identity management systems used to share data.

We have also noticed that researchers are relying on software vendors’ website claims about security and privacy, which is problematic because most cloud software runs from offshore facilities that do not comply with Australian privacy legislation. Unless you are an expert in both Australian legislation and cybersecurity, you need to rely on the expertise of your institutional IT and cybersecurity teams to verify vendors’ claims.

In the current environment, data privacy is more about mandated steps and activities designed to force a minimal set of user behaviours to prevent harm caused through successful attacks or accidental data breaches. It usually involves punishment to force good behaviour (e.g. see Data Breach Legislation for late reporting). Typically, data privacy is more the responsibility of the researcher. It usually involves governance processes (e.g. who has been given access to what data) or practices (e.g. what software products the team actually uses to share and store data).

What we should be worrying about

The Notifiable Data Breaches Statistics Report: 1 April to 30 June 2019 highlighted that only 4% of breaches, out of 254 notifications, were due to system faults, but 34% were due to human error and 62% due to malicious or criminal acts. Based on these statistics, the biggest risk associated with data breaches is where the data is in the hands of the end-user (i.e. the researcher) not with the IT systems themselves.

We argue the risks are also greater in research than the general population because of a number of factors such as the diversity of data held (e.g. data files, images, audio etc), the fluidity of the team membership, teams often being made up of staff across department and institutional boundaries, mobility of staff, data collection activities offsite, and the range of IT products needed in the research process.

For this discussion, the focus is on the governance and practice factor within the research project team and how this relates back to the ethics requirements when it has been highlighted that the project will involve working with sensitive data.

Help!!

We have worked closely with researcher groups for many years and have noticed a common problem. Researchers are confronted with numerous legislative, regulatory, policy and contractual requirements all written in terminology and language that bears little resemblance with what happens in practice. For example, to comply with legislation:

  • What does sending a data file “securely” over the internet actually look like in practice, and which IT products are “safe”?
  • Is your university-provided laptop with the standard institutional image certified as “safe” for data classified as private? How do you know?
  • Is your mobile phone a “safe” technology for recording interviews or images classified as private data? What is a “safe” technology for field work?

Within the university sector a range of institutional business units provide support services. For example, IT may provide advice assessing the security and privacy compliance of software, networked equipment or hardware infrastructure and the library may provide data management advice covering sensitive data. At our institution, Griffith University, the eResearch Services and the Library Research Services teams have been working closely with research groups to navigate their way through this minefield to develop standard practices fit for their purpose.

What we think is the best way forward

Our approach is to follow the Five Safes framework which has also been adopted by the Office of the National Data Commissioner. For example:

  • Safe People: Is the research team member appropriately authorised to access and use specified data? i.e. do you have a documented data access plan against team roles, and a governance/induction process for gaining access to restricted data?
  • Safe Projects: Is the data to be used for an appropriate purpose? i.e. do you have copies of the underlying data sharing/consent agreements, contracts, and documents outlining ownership and licensing rights?
  • Safe Settings: Does the access environment prevent unauthorised use? i.e. do IT systems and processes support this, and are access levels checked regularly?
  • Safe Data: Has appropriate and sufficient protection been applied to the data? i.e. what is it, and is it commensurate with the level of risk involved?
  • Safe Outputs: Are the statistical results non-disclosive, and have you checked rights/licensing issues?
Expect to see a lot more of the Five Safes approach in the coming years.


Contributors

Malcolm Wolski, Director eResearch Services, Griffith University

Andrew Bowness, Manager, Support Services, eResearch Services, Griffith University

This post may be cited as:
Wolski, M. and Bowness, A. (29 September 2019) Ethics, Security and Privacy – the Bermuda Triangle of data management?. Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/ethics-security-and-privacy-the-bermuda-triangle-of-data-management
