To manage sensitive research data appropriately, researchers need to consider ethics, security and privacy requirements. Researchers are traditionally familiar with ethics, but have often not considered the privacy and security pieces of the puzzle. We say this because:
- IT products used in research change rapidly
- Legislation changes rapidly and there are jurisdictional issues
- Most researchers are not legal or IT experts
- Few researchers are taught enough of the basics to recognise risky behaviour
The recent revision to the Australian Code for the Responsible Conduct of Research (2018) on Management of Data and Information in Research highlights that it is not just the responsibility of a university to use best practice, but it is also the responsibility of the researcher. The responsible conduct of research includes within its scope the appropriate generation, collection, access, use, analysis, disclosure, storage, retention, disposal, sharing and re-use of data and information. Researchers have a responsibility to make themselves aware of the requirements of any relevant codes, legislation, regulatory, contractual or consent agreements, and to ensure they comply with them.
It’s a complex world
However, this is becoming an increasingly complex environment for researchers. First, privacy legislation depends on the jurisdiction of the participants. For example, a research project involving participants in Queensland is affected not only by the Australian Privacy Act but also by the Queensland version (Information Privacy Act 2009 Qld) and, if a participant or collaborator is an EU citizen, by the General Data Protection Regulation (EU GDPR).
Secondly, cybersecurity and information security activities in universities have increased dramatically in recent times because of publicised data breaches and the impact of data breach legislation. If your research involves foreign citizens, you may also find foreign legislation impacting the type of response required.
Thirdly, funding agencies, such as government departments are increasingly specifying security and privacy requirements in tender responses and contracts.
These developments are having an impact on research project governance and practices, particularly for projects where the researcher has identified they are working with sensitive data. While the conversation typically focuses on data identified as sensitive under the privacy acts (e.g. Personally Identifiable Information (Labelled) under the Australian Privacy Act), researchers handle a range of data they may wish to treat as sensitive, whether for contractual reasons (e.g. participant consent, data sharing agreements) or for other reasons (e.g. ethical or cultural).
We have noticed an increasing trend within institutions where researchers are being required to provide more information on how they manage data as specified in a proposal or in a data sharing agreement. This typically revolves around data privacy and security, which is different from the ethics requirements.
What do “security” and “privacy” mean to the practitioner?
IT security is largely about minimising attack points, whether through process or by using IT solutions, to prevent or reduce the impact of hostile acts, or alternatively to reduce the impact of misadventure (e.g. leaving a laptop on a bus). Data security sits more in the sphere of IT than of researchers. This is reflected in which software products, systems and storage are “certified” as safe for handling and managing data classified as sensitive. IT usually also provides the identity management systems used to share data.
We have also noticed that researchers rely on software vendors’ website claims about security and privacy. This is problematic because most cloud software runs from offshore facilities that do not comply with Australian privacy legislation. Unless you are an expert in both Australian legislation and cybersecurity, you need to rely on the expertise of your institutional IT and cybersecurity teams to verify vendors’ claims.
In the current environment, data privacy is more about mandated steps and activities designed to enforce a minimal set of user behaviours to prevent harm caused through successful attacks or accidental data breaches. It usually involves penalties to enforce good behaviour (e.g. the penalties for late reporting under data breach legislation). Typically, data privacy is more the responsibility of the researcher. It usually involves governance processes (e.g. who has been given access to what data) or practices (e.g. what software products the team actually uses to share and store data).
What we should be worrying about
The Notifiable Data Breaches Statistics Report: 1 April to 30 June 2019 highlighted that, of 254 notifications, only 4% of breaches were due to system faults, while 34% were due to human error and 62% to malicious or criminal acts. Based on these statistics, the biggest risk associated with data breaches arises where the data is in the hands of the end-user (i.e. the researcher), not with the IT systems themselves.
We argue the risks are also greater in research than in the general population because of a number of factors: the diversity of data held (e.g. data files, images, audio etc.), the fluidity of team membership, teams often being made up of staff across departmental and institutional boundaries, the mobility of staff, data collection activities offsite, and the range of IT products needed in the research process.
For this discussion, the focus is on the governance and practice factors within the research project team, and how these relate back to the ethics requirements once it has been identified that the project will involve working with sensitive data.
We have worked closely with researcher groups for many years and have noticed a common problem. Researchers are confronted with numerous legislative, regulatory, policy and contractual requirements, all written in terminology and language that bears little resemblance to what happens in practice. For example, to comply with legislation:
- What does sending a data file “securely” over the internet actually look like in practice, and which IT products are “safe”?
- Is your university-provided laptop with the standard institutional image certified as “safe” for data classified as private? How do you know?
- Is your mobile phone a “safe” technology to record interviews or images classified as private data? What is a “safe” technology for field work?
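To make the first of these questions concrete: one small, verifiable piece of “sending a file securely” is integrity checking, i.e. confirming the file that arrives is byte-for-byte the file that was sent. The sketch below (a minimal illustration, not an institutionally endorsed procedure; the file name is hypothetical) computes a SHA-256 checksum that the sender records and the receiver recomputes. Encryption in transit and at rest, and the choice of an approved transfer tool, are separate questions for your IT and cybersecurity teams.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks
    so large data files do not need to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # Hypothetical data file; the sender publishes the digest alongside
    # the transfer, and the receiver recomputes it to detect corruption
    # or tampering.
    data = Path("interviews.csv")
    data.write_bytes(b"participant_id,response\n1,yes\n")
    print(sha256_of(data))
```

A mismatch between the sender’s and receiver’s digests tells you the file was altered in transit; a match says nothing about confidentiality, which still depends on the transfer channel itself.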
Within the university sector a range of institutional business units provide support services. For example, IT may provide advice assessing the security and privacy compliance of software, networked equipment or hardware infrastructure and the library may provide data management advice covering sensitive data. At our institution, Griffith University, the eResearch Services and the Library Research Services teams have been working closely with research groups to navigate their way through this minefield to develop standard practices fit for their purpose.
What we think is the best way forward
Our approach is to follow the Five Safes framework which has also been adopted by the Office of the National Data Commissioner. For example:
- Safe People: Is the research team member appropriately authorised to access and use specified data? i.e. do you have a documented data access plan against team roles, and a governance/induction process for gaining access to restricted data?
- Safe Projects: Is the data to be used for an appropriate purpose? i.e. do you have copies of the underlying data sharing/consent agreements, contracts, and documents outlining ownership and licensing rights?
- Safe Settings: Does the access environment prevent unauthorised use? i.e. do IT systems and processes support this, and are access levels checked regularly?
- Safe Data: Has appropriate and sufficient protection been applied to the data? i.e. what is that protection, and is it commensurate with the level of risk involved?
- Safe Outputs: Are the statistical results non-disclosive, and have you checked rights/licensing issues?
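As a hedged illustration of the “Safe People” item, a documented data access plan can be something simple enough to check mechanically: a mapping from project roles to the datasets each role may access. The role and dataset names below are entirely hypothetical; the point is only that writing the plan down in a structured form makes “who has been given access to what data” an answerable question.

```python
# Hypothetical "Safe People" access plan: which datasets each
# project role is documented as being allowed to access.
ACCESS_PLAN = {
    "chief_investigator": {"raw_interviews", "deidentified_survey"},
    "research_assistant": {"deidentified_survey"},
    "external_collaborator": set(),  # access only via a signed agreement
}

def may_access(role: str, dataset: str) -> bool:
    """Return True only if the role's documented plan covers the dataset.
    Unknown roles get no access by default."""
    return dataset in ACCESS_PLAN.get(role, set())

print(may_access("chief_investigator", "raw_interviews"))  # True
print(may_access("research_assistant", "raw_interviews"))  # False
```

Defaulting unknown roles to an empty set mirrors the governance principle that access is granted explicitly, never assumed.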
Expect to see a lot more of the Five Safes approach in the coming years.
Malcolm Wolski, Director eResearch Services, Griffith University
Andrew Bowness, Manager, Support Services, eResearch Services, Griffith University
This post may be cited as:
Wolski, M. and Bowness, A. (29 September 2019) Ethics, Security and Privacy – the Bermuda Triangle of data management?. Research Ethics Monthly. Retrieved from: https://ahrecs.com/research-integrity/ethics-security-and-privacy-the-bermuda-triangle-of-data-management