ACN - 101321555 Australasian Human Research Ethics Consultancy Services Pty Ltd (AHRECS)


Research Ethics Monthly ISSN 2206-2483



Hints for Using Worked Examples in Training Sessions

 

First of all, a frank acknowledgement by the AHRECS team: in the past we’ve merrily used invented applications/vignettes, sometimes with deliberately inserted defects, and de-identified real proposals (with permission) in the professional development activities we’ve facilitated. We did so as a way to help research ethics reviewers and researchers (though reviewers made up the overwhelming majority of these workshops) spot mistakes and, in doing so, demonstrate they understood an ethical principle or a specific provision of a statement/code/policy. At the time we might even have congratulated ourselves on providing a real-world practical activity rather than merely telling attendees what they should do.
A couple of years ago each of us drew the same conclusion and was horrified: the use of ‘can you find the hidden flaw’ exercises was part of the reason for the adversarial climate between researchers and research ethics reviewers. They reinforce the message that the job of a research ethics review body is to find what’s wrong with a project, that members are being effective if they find something other members may have missed, and that they should expect to find ethical defects: that is, an (unwarranted) assumption that participants need to be protected from researchers.
As Jim notes, examples can be used positively in professional development activities for research ethics reviewers and researchers. Such activities can focus on congratulating researchers for novel or elegant solutions to ethics challenges, on facilitating rather than policing research, and on how to achieve best practice in review feedback.
Such examples should be used in all our professional development strategies.
The use of defective examples is dead. Long live the use of positive examples.

Training sessions for new ethics committee members and new researchers frequently use a completed application as a fully-worked example of how to practically implement legislation, codes, and administrative processes.  There is now a solid body of scientific findings that can guide the effective use of worked examples in promoting learning and its generalisation to new situations.1  Based on these findings, here are three evidence-based hints:


(1) Walk trainees through at least two completed ethics applications for related projects.  According to the available research, a single example will most likely cause new committee members to see it as an ideal exemplar that all applications must conform to.  Similarly, new researchers will tend to see a single example as an ideal template.  They may try to squeeze all their information into that template even if it metaphorically means pounding square pegs into round holes.  Enabling trainees to study, compare, and contrast two or more worked examples dramatically increases understanding of the underlying principles and, more importantly, the ability to see analogies between the examples and new applications.2
(2) The initial worked examples should be correct, particularly for new members and researchers who are not yet familiar with the legislation, codes, and administrative processes.  As familiarity increases, test cases with deficiencies can then be introduced for study and facilitated discussion.
(3) The projects described in initial examples should be relatively simple while still being authentic.  Then, as understanding and skill increases, more complex worked examples and test cases can be introduced.4

Given that the time allocated to a training session may be limited to a few hours, readers may wonder how they are going to find the time for extensively using examples while still covering the principles in the legislation, codes, and administrative procedures.  One way to free up time and promote a better linkage of the principles to ethics applications is to convert a lecture-based “just-in-case” approach to learning to an experiential, trainee-centred, “just-in-time” mode.  This conversion can be accomplished by providing a short (5-10 min) introduction that orients the audience to the main points to be covered.  Then, the principles can be brought out in facilitated discussions at relevant points during walk-throughs of the examples and test cases.

  1. Renkl A: Toward an instructionally oriented theory of example-based learning. Cognitive Science 2014;38(1):1-37.
  2. Gentner D, Holyoak KJ: Reasoning and learning by analogy: Introduction. American Psychologist 1997;52:32-4.
  3. Stark R, Kopp V, Fischer MR: Case-based learning with worked examples in complex domains: Two experimental studies in undergraduate medical education. Learning and Instruction 2011;21(1):22-33.
  4. Paas F, Van Merrienboer J, Van Gog T: Designing instruction for the contemporary learning landscape. In: Harris KR, Graham S, Urdan T, editors. APA Educational Psychology Handbook: Vol 3. Application to Learning and Teaching. Washington: American Psychological Association; 2011. pp. 335-57.
    http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1688&context=edupapers

Disclosure of interests

I have no conflict of interest.

Contributor
James Kehoe, PhD FRSN
Jim is a Professor of Psychology, UNSW, where his 49-year research career has spanned many areas of learning, memory, and training.  He has served as chair of the Animal Care and Ethics Committee and convener of the Human Research Ethics Advisory Panel (Behavioural Sciences).
Jim’s UNSW staff profile | ejameskehoe@gmail.com

This post may be cited as:
Kehoe J. (26 March 2018) Hints for Using Worked Examples in Training Sessions. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/hints-for-using-worked-examples-in-training-sessions

“More what you’d call guidelines”

 

In a notorious scene from Pirates of the Caribbean: The Curse of the Black Pearl, Captain Barbossa cynically refers to the Pirate’s Code as ‘what you’d call guidelines’, suggesting that conformity is merely a matter of choice:

Elizabeth: Wait! You have to take me to shore. According to the Code of the Order of the Brethren…

Captain Barbossa: First, your return to shore was not part of our negotiations nor our agreement so I must do nothing. And secondly, you must be a pirate for the Pirate’s Code to apply and you’re not. And thirdly, the Code is more what you’d call ‘guidelines’ than actual rules. Welcome aboard the Black Pearl, Miss Turner.

Recently, some evidence has emerged that the same observation could be made about another set of guidelines, namely, those relating to the ethics review and conduct of human research in Australia: the National Statement on Ethical Conduct in Human Research issued by the National Health and Medical Research Council, the Australian Research Council and Universities Australia in 2007 and modified to the current version of May 2015. These guidelines set out the principles and processes for ethics review by human research ethics committees (HRECs) and conduct of research in which people are participants. The guidelines also set out requirements for the establishment, membership and operation of HRECs and assign obligations to institutions to see that these are followed. Since 2001, the NHMRC has established and maintained a register on which institutions list their HRECs and agree to operate them according to the National Statement.

Annually, these institutions provide to the NHMRC, on request, reports on the conduct of the HRECs they have established. It is these reports, covering 2014, 2015 and 2016, that provide revealing evidence about the extent to which HRECs and institutions in fact conform to the National Statement.


ACTIVITY

‘When an institution has established an HREC, the institution is responsible for ensuring that…

  • review of research proposals is thorough;
  • review processes and procedures are expeditious;
  • the workload of an HREC does not compromise the quality or timeliness of the ethical review;’ (National Statement 5.1.28 (c), (d) & (i))

Some reasons for these guidelines are that an adequate workload maintains review skills and that an excessive workload weakens and prolongs reviews.

Evidence:

In 2014, of the 216 HRECs that reported, 11 did not review any proposals, 59 met between 1 and 5 times, and 41 HRECs considered not more than 10 proposals.

In 2015, of the 212 HRECs that reported, 10 did not review any proposals, 54 met between 1 and 5 times, and 42 HRECs considered not more than 10 new proposals.

In 2016, of the 210 HRECs that reported, 15 did not review any proposals, 50 met between 1 and 5 times, and 42 HRECs considered not more than 10 new proposals.

The published data support the conclusion that in each of the last three years about 5% of the HRECs did not review any proposals and about 20% did not review more than 10 new proposals. What would be the effect of such light workloads on the review expertise of HREC members?
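The quoted percentages can be checked directly from the reported figures. The sketch below is ours, not the NHMRC's: it assumes the "not more than 10 proposals" counts were reported separately from the zero-review counts, and simply restates the numbers given above.

```python
# Back-of-the-envelope check of the ~5% and ~20% figures quoted above.
# Figures restated from the NHMRC annual HREC reports as summarised in the text.
reports = {
    2014: {"total": 216, "zero": 11, "ten_or_fewer": 41},
    2015: {"total": 212, "zero": 10, "ten_or_fewer": 42},
    2016: {"total": 210, "zero": 15, "ten_or_fewer": 42},
}

for year, r in sorted(reports.items()):
    zero_pct = 100 * r["zero"] / r["total"]
    few_pct = 100 * r["ten_or_fewer"] / r["total"]
    print(f"{year}: {zero_pct:.1f}% reviewed no proposals, "
          f"{few_pct:.1f}% reviewed not more than 10")
```

Run year by year, the zero-review share sits between roughly 5% and 7%, and the not-more-than-10 share at roughly 19-20%, consistent with the rounded figures in the paragraph above.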


TRAINING

One other means by which committees can maintain their review skills is to undertake regular continuing education or training. The National Statement contains two relevant guidelines that recognise this:

‘where an institution has established an HREC, the institution is responsible for ensuring that… (b) members undertake: (ii) continuing education;’ (5.1.28 (b)(ii)), and

‘…each member of a review body should: …(c) attend continuing education or training programs in research ethics at least every three years.’ (5.2.3 (c)).

Evidence:

In 2016, 161 of 210 HRECs reported that ‘1 or more’ members had attended training during the year.

In 2015, 185 of the 212 HRECs reported that their institutions made opportunities for training available to members, but only 160 reported that 1 or more members had attended relevant training during the year.

In 2014, although 179 of the 217 HRECs indicated that their responsible institution provided opportunities for members to attend training, only 149 of the committees reported that 1 or more members had attended relevant training during the year.

The data published by the NHMRC do not allow us to identify whether the same HRECs failed to take advantage of training possibilities in different years, nor to work out what kinds of institutions were more or less likely to ensure professional development of committee members. Nevertheless, in each of these years, between 30 and 40 HRECs undertook no training at all, and the data for those that did could mean that no more than one member attended one training opportunity in that year.

While some HRECs may be investing in the professional development of their members, it is difficult to conclude that the requirements are being taken seriously across the sector.

Indeed, it is doubtful that the sector as a whole is engaging with capacity building of HREC members. Since 2008, the NHMRC has not devoted any resources to the training of HRECs, in contrast with its counterparts in the United Kingdom, Canada and the United States. The provision of such professional development as is available has fallen to voluntary national gatherings, such as the Australasian Ethics Network, or to commercial providers.


MEMBERSHIP

There is a minimum membership for HRECs of eight, constituted by specified categories, namely:

  • a chair,
  • at least two community members,
  • a member with experience in counselling or treatment of people,
  • a person who performs a community pastoral role,
  • a lawyer, and
  • at least two researchers. (National Statement 5.1.30)

The specification relates to HREC decision-making, as the guidelines provide that HREC decisions ‘must be informed by an exchange of opinions from each of those who constitute the minimum membership…’

Evidence:

In the 2016 year, 16 of the 210 HRECs reported that they did not have the minimum membership during the year.

In the 2015 year, 42 of the 212 HRECs reported during the year that they made decisions on proposals when there was a vacancy in their membership, conduct that the report noted as being ‘contrary to the National Statement’.

In the 2014 year, 34 of the 216 HRECs reported that they had continued meeting when there was a vacancy in their membership, again noted in the report as being ‘contrary to the National Statement’.

Accordingly, there were numerous decisions – it is impossible to calculate how many – made by HRECs during each of the last three years that lacked the range of input that the guidelines require. Further, as the reports in 2014 and 2015 note, such decision-making is contrary to the National Statement and, in turn, a failure by the responsible institution to fulfil its responsibility to ‘see that any human research for which they are responsible is… ethically reviewed and monitored in accordance with this National Statement’ (National Statement 5.1.1(b)).


GENDER BALANCE

One other guideline about membership provides that, ‘as far as possible, there should be equal numbers of men and women’ (National Statement 5.1.29(a)).

Evidence:

In the 2016 report, it was stated that because ‘It is recognised that this may be difficult to attain’, the ‘NHMRC considered instances in which there was at least an 80:20 gender imbalance as significant and requiring attention.’ Only 5 of the 210 HRECs reported such an imbalance. No data were reported of the number of HRECs in which there was a lesser gender imbalance.

In the 2015 report, the same statements appear and only 3 of the 212 HRECs reported an imbalance of 80:20 or more.

In the 2014 report, that statement does not appear and 24 of the 216 HRECs are reported to have ‘less than a 70:30 gender balance in either direction.’

Accordingly, gender imbalances less extreme than 80:20 (that is, 1 in 5) are not regarded as in need of attention, an interpretation of the guideline that provides very little incentive to correct imbalances that, in many other contexts in 2018, would be regarded as unacceptable. Were the sector taking gender seriously in HREC membership, it would be far better to create more meaningful targets and, if necessary, to phase these in over time. It would also be sensible to track where gender imbalance lies within committee membership and compare that to broader patterns within the host institutions. For example, if the small number of female academics on a committee reflected a broader problem in an institution, the latter may need to be addressed first rather than placing an increased burden on a small number of more senior female academics.

Year   Reviewed applications   Complaints about research   Complaints about review
2007   10777                   138                         49
2008   21087                   96                          19
2009   22306                   100                         11
2010   23696                   121                         21
2011   25022                   n/a                         n/a
2012   26257                   161                         19
2013   24882                   145                         20
2014   20892                   226                         58
2015   18768                   229                         34
2016   18039                   237                         37


IMPLICATIONS

Does any of this matter? If so, why? Do deficiencies in the processes by which HRECs reach their conclusions and decisions contribute to more of those decisions being inappropriate or unacceptable? Trends in complaints data may be some indication, and these data, available from 2007 to the current year (except 2011) and tabulated above, do show some increase in both kinds of complaints. However, it is notoriously difficult to draw reliable conclusions from complaints data.
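One hedged way to read the tabulated complaints figures is to normalise them by review volume. The per-1,000-applications framing below is our own, not the NHMRC's, and 2011 is omitted because its complaint data are unavailable; as noted above, such normalised rates still support only cautious conclusions.

```python
# Complaints per 1,000 reviewed applications, restated from the table above.
# year: (reviewed applications, complaints about research, complaints about review)
data = {
    2007: (10777, 138, 49), 2008: (21087, 96, 19), 2009: (22306, 100, 11),
    2010: (23696, 121, 21), 2012: (26257, 161, 19), 2013: (24882, 145, 20),
    2014: (20892, 226, 58), 2015: (18768, 229, 34), 2016: (18039, 237, 37),
}

for year, (apps, about_research, about_review) in sorted(data.items()):
    print(f"{year}: {1000 * about_research / apps:.1f} research complaints and "
          f"{1000 * about_review / apps:.1f} review complaints per 1,000 applications")
```

On this reading, research-related complaints rise from roughly 4.6 per 1,000 applications in 2008 to roughly 13.1 per 1,000 in 2016, which is the "some increase" the paragraph above describes.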

Implications for NHMRC Administering Institutions

Institutions take responsibility for annual HREC reports, and many of those institutions will be administering institutions with NHMRC grants, the conditions for which are contained in a standard Funding Agreement. Clause 24 of that agreement provides that ‘in carrying out this Agreement, the Administering Institution must comply… with… the NHMRC Approved Standards and Guidelines’ (which are defined to include the National Statement). Clause 30.4 of the same agreement requires Administering Institutions to ‘immediately notify the NHMRC in writing if it ceases fully to comply with… the NHMRC Approved Standards and Guidelines’. Such failures are among the grounds on which the NHMRC can suspend or terminate research funding (Funding Agreement clause 15).

Should reporting of a non-compliant HREC be treated as such a breach? Given that an institution is required to notify the NHMRC when it ceases ‘fully to comply’, any of the deficiencies recorded above from the last three annual reports of HRECs would appear to be sufficient.


Some wider implications and questions

These data raise the question: when is an HREC decision sufficiently defective as not to merit respect or recognition? When there is no input from any one of the minimum members? When the gender balance is lower than 1 in 5 in either direction? When the decision is the only one or only one of 5 that the committee has made in a year? When none of the committee members have attended training in the last three years? When the decision is one of 30 made at the same meeting? When there is no reference in the decision to any one of the four key review criteria: research merit, justice, beneficence or respect?

Will tolerance of nonconforming practice lead to declining support for and recognition of HRECs and, in turn, of ethics review itself? If that recognition declines, will the need for ethics review be questioned? Would recognition that Australian ethics review lacks accountability and conformity to national guidelines threaten the reputation of Australian research, researchers and research institutions?

Doing nothing in the face of this evidence condones the nonconforming practices and risks breeding an indifference to ethics review and, in turn, a view of it as irrelevant and unnecessary.

However, if a response to these data of deficiency is needed, at what level should that response be made: that of the HRECs, institutions or all the sector stakeholders? In short, who should take responsibility for the reliability of ethics review and how should that responsibility be implemented?

Contributors
Colin Thomson – Senior Consultant, AHRECS | AHRECS biocolin.thomson@ahrecs.com

This post may be cited as:
Thomson C. (2018, 22 March 2018) “More what you’d call guidelines”. Research Ethics Monthly. Retrieved from: https://ahrecs.com/human-research-ethics/more-what-youd-call-guidelines

Disaster Research and its Ethical Review

 

Disaster research ethics is a growing area of interest within the research ethics field. Given the lack of a universal definition of disasters, it should not be a surprise that disaster research ethics is defined in various ways. Early approaches focused on ethical issues in conducting research in the acute phase of disasters (O’Mathúna 2010). Given the similarities of some of the ethical issues, it came to include humanitarian crises and emergencies. A recent review combined mental health research in natural disasters, armed conflicts and the associated refugee and internally displaced persons (IDP) settings (Chiumento et al. 2017). Each of these settings raises distinct ethical issues, as well as practical challenges for those ethically reviewing disaster research. The 2016 revision of the Council for International Organizations of Medical Sciences (CIOMS) research ethics guidelines included a section on disaster research (https://cioms.ch/wp-content/uploads/2017/01/WEB-CIOMS-EthicalGuidelines.pdf). This blog will highlight a few of the practical challenges and note some efforts to respond to these.

One issue is how some disasters happen suddenly, while research ethics review takes time. The 2016 CIOMS guidelines call for innovative approaches to research ethics review, including ways to pre-assess protocols so that they can be reviewed rapidly once a relevant disaster occurs. As committees develop ways to adapt to disaster research, other review practices can be examined to identify innovative approaches to the challenges.

A key ethical issue to address with disaster research is whether a particular project should be conducted at this time with these particular participants. In the most immediate phase of an acute disaster, resources and energy should be focused on search and rescue. Researchers could hinder this, or divert scarce resources. At the same time, data should be collected as soon as possible to contribute to the evidence base for first responders. Ethics review committees should ensure justifications are provided for why a project needs to be done during the acute phase. Questions also need to be asked about whether disaster survivors have more important needs than to participate in research. For example, some have questioned whether children who survive war should be asked to participate in research when there are few resources available to help them with the mental health challenges of surviving war (Euwema et al. 2008).

With the move towards a more evidence-based approach to humanitarian work, international and non-governmental organisations (NGOs) are increasingly engaging in research and other evaluation programmes. Some of these organisations may have little experience with research or research ethics, and hence need additional support in developing and conducting projects. Much debate has occurred over what ‘counts’ as research and is therefore required to undergo formal research ethics approval. Rather than asking if a project is research or not, it is more important to identify the ethical issues in the project and ensure they are being addressed as carefully and thoroughly as possible (Chiumento et al. 2017). Needs assessments, projects that monitor or evaluate programmes, public health surveillance, and many other activities raise ethical issues whether or not they are formal academic research studies. At the same time, not every project needs to submit the same sort of detailed research ethics application as a randomised controlled trial of an experimental drug. Some sort of ethical evaluation should be conducted, and here again there is an opportunity to be innovative. Different formal and informal review mechanisms could be developed to support groups conducting different types of projects. The key concern should be that the ethical issues are being examined and addressed.

Also key here is that people in the communities from which participants will be sought are involved from the design stage of the project (O’Mathúna 2018). Too many ‘parachute projects’ have been conducted (some with ethical approval) whereby the project is designed completely by outsiders. Once everything has been decided, the team approaches the community, only to discover a lack of interest in participating or that certain ethical challenges have been overlooked. Research in other cultures, especially in the midst of armed conflicts, is especially prone to such challenges. Review committees may need to encourage exploratory discussions between researchers and participant communities, or seek evidence of how such discussions have gone.

Unexpected ethical issues often arise in disaster research given the instability and complexity of its settings (O’Mathúna & Siriwardhana 2017). An approach where ethics review bodies give approval to projects and then have little or no engagement other than an annual report is especially inadequate in disasters. Researchers may be forced to make changes in fluid settings, or may encounter unexpected issues. Submitting amendments may not be practical or fast enough, when what is needed is advice and direction from those with research ethics expertise. Thus, initiatives are being developed to provide “on call” ethics advice.

This points to how disaster research often requires more support and protection for researchers than other types of research do. Researchers may enter danger zones (natural or violent) and may see or learn of horrors and atrocities. Researchers can be subjected to physical dangers or traumatised psychologically. In addition to the normal stresses of conducting research, these additional factors can lead to mistakes and even ethical corner-cutting. Therefore, review committees need to carefully investigate how the physical and mental well-being of researchers will be protected and supported.

These are some examples of how research ethics needs to go beyond approval processes to mechanisms that promote ethical decision-making and personal integrity during research. One such project in which I am involved is seeking insight from humanitarian researchers into the ethical issues experienced in the field (http://PREAportal.org). We are also conducting a systematic review of such issues and collecting case studies from researchers. The goal is to produce a practical tool to facilitate learning lessons from disaster researchers and promote ethical decision-making within teams.

The world is increasingly experiencing disasters and conflicts, and huge amounts of resources are put into responses. Some of these resources are put towards evaluating disaster responses, and developing evidence to support disaster responders. We can expect disaster research to increase and to be increasingly seen by research ethics committees. It is therefore important that ethics committees prepare themselves to respond to the ethical challenges that disaster research raises.

References

Chiumento, A., Rahman, A., Frith, L., Snider, L., & Tol, W. A. (2017). Ethical standards for mental health and psychosocial support research in emergencies: Review of literature and current debates. Globalization and Health 13(8). doi 10.1186/s12992-017-0231-y

Euwema, M., de Graaff, D., de Jager, A., & Kalksma-Van Lith, B. (2008). Research with children in war-affected areas. In: Research with Children, Perspectives and Practices, 2nd edition. Eds. Christensen, P. & James, A. Abingdon, UK: Routledge; 189-204.

O’Mathúna, D.  (2010). Conducting research in the aftermath of disasters: Ethical considerations. Journal of Evidence-Based Medicine 3(2):65-75.

O’Mathúna, D. (2018). The dual imperative in disaster research ethics. In: SAGE Handbook of Qualitative Research Ethics. Eds. Iphofen, R. & Tolich M. London: SAGE; 441-454.

O’Mathúna, D., & Siriwardhana, C. (2017). Research ethics and evidence for humanitarian health. Lancet 390(10109):2228-9.

Declaration of interests

Dónal O’Mathúna has been involved in research ethics for over twenty years. He was chair of the Research Ethics Committee at Dublin City University (DCU) for six years. In addition to his joint position at DCU and The Ohio State University, he is Visiting Professor of Ethics in the European Master in Disaster Medicine, Università del Piemonte Orientale, Italy. His research interests focus on ethical issues in disasters, in particular disaster research ethics. He was Chair of the EU-funded COST Action (2012-2016) on Disaster Bioethics (http://DisasterBioethics.eu) and is the Principal Investigator on the R2HC-funded research project, Post-Research Ethics Analysis (http://PREAportal.org).

Contributor
Dónal O’Mathúna, PhD
Associate Professor, School of Nursing & Human Sciences, Dublin City University, Ireland
Associate Professor, College of Nursing, The Ohio State University, Columbus, Ohio, USA
Dónal’s DCU profile | donal.omathuna@dcu.ie
Twitter: @domathuna
http://BioethicsIreland.ie

This post may be cited as:
O’Mathúna D. (2018, 26 February 2018) ‘Disaster Research and its Ethical Review’. Research Ethics Monthly. Retrieved from https://ahrecs.com/human-research-ethics/disaster-research-ethical-review

Ethical Use of Student Data in Higher Education – Advancing the conversation

 

In a 2016 conference paper discussing ethical use of student data I noted that there was a ‘disconnect between national and international perspectives of the importance of institutional policy and guidelines regarding ethical use of student data, and the perceptions of academics about these guidelines’ (Jones, 2016, p. 300). I suggested that one strategy for bridging this divide was for conversations to be held both within and between institutions with an aim of informing and enhancing learning and teaching practice and culture. This post provides an overview of some of the conversations that have occurred in this area in the last 12 months in Australasia, particularly through the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE).

First, though, my interpretation of the phrase ‘ethical use of student data’. To me, and I am sure to many others, this is much more than applying for, and being granted, clearance from your institution’s Human Research Ethics Committee. Certainly, that clearance is an important step if you are intending to disseminate your findings as research and publish, and it is a step that academic staff can sometimes overlook if research in their discipline does not normally involve ethics approval, or if they do not consider it because they are not directly researching students, just their data. Ethical use also considers:

  • Protection of student privacy
  • Conversations with students regarding reasons for collection and use of data
  • Ensuring that data is used for informing and enhancing practice and the student experience
  • Obtaining consent from students; or, at least, informing students how and why their data will be used

The ability for students to ‘opt out’ of any data collection is a sensitive issue, as there are some circumstances (for example, research into online discussion forums) where giving students this option could adversely affect the research. This is just one aspect that needs further conversations and the development of policy and guidelines.

ASCILITE is considered a leading organisation in the southern hemisphere for staff working in tertiary education in ‘fields associated with enhancing learning and teaching through the pedagogical use of technologies’ (ASCILITE, 2014) and as such is well placed to be leading the cross-institutional conversation on ethical use of student data. In 2017, some of the ways these conversations were facilitated included:

  • The Learning Analytics Special Interest Group ran a series of webinars, including one facilitated by Paul Prinsloo on the topic of ‘Responsible Learning Analytics: A Tentative Proposal’
  • The 2017 ASCILITE Conference included an Exploratory Panel Session discussing ‘emerging ethical, legal, educational, and technological issues surrounding the collection and use of student data by universities, and the impact these strategies have on student trust and privacy.’
  • The Learning Analytics SIG also held a panel session discussing scenarios for a utopian/dystopian future in regard to Learning Analytics

However, there was only one submitted paper with reference to ethical use of data (Brooker, Corrin, Mirriahi & Fisher, 2017). Similarly, for the upcoming Learning Analytics Knowledge conference (LAK18), only one paper has any reference to ethics in the title, and at the 2017 conference there was one session with 3 papers. This suggests that whilst national and international bodies are promoting the conversations, there is still a way to go before these happen widely within institutions. Are there other organisations that are facilitating similar discussions?

Whilst promoting these conversations is a useful first step, there is also a need to continue to develop guidelines and processes. These will help ensure that staff are submitting ethics applications and their work with student data is conducted in an ethical manner. Additionally, Human Ethics staff need to work alongside academics and Learning & Teaching support staff; journals and conferences need to ensure that appropriate ethics approvals have been obtained and institutions need to involve students in all facets of Learning Analytics. These strategies will promote more widespread adoption of ethical practices in use of student data to inform and enhance learning and teaching practice and culture, and, ultimately, the student experience. Hopefully initiatives such as those outlined in this post will continue to grow and spark the necessary conversations – who will join us?

References

ASCILITE (2014) About ASCILITE. Retrieved from http://ascilite.org/about-ascilite/

Brooker, A., Corrin, L., Mirriahi, N. & Fisher, J. (2017). Defining ‘data’ in conversations with students about the ethical use of learning analytics. In H. Partridge, K. Davis, & J. Thomas. (Eds.), Me, Us, IT! Proceedings ASCILITE2017: 34th International Conference on Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education (pp. 27-31). Retrieved from http://2017conference.ascilite.org/wp-content/uploads/2017/11/Concise-BROOKER.pdf

Jones, H. (2016). Ethical considerations in the use of student data: International perspectives and educators’ perceptions. In S. Barker, S. Dawson, A. Pardo, & C. Colvin (Eds.), Show Me The Learning. Proceedings ASCILITE 2016 Adelaide (pp. 300-304). Retrieved from http://2016conference.ascilite.org/wp-content/uploads/ascilite2016_jonesh_concise.pdf

Declaration of Interests

Hazel Jones is a member of the ASCILITE Executive Committee and one of the facilitators for the Learning Analytics SIG.

Contributor
Hazel Jones
PhD candidate/Educational Designer | University of Southern Queensland | USQ Staff Profile | Hazel.Jones@usq.edu.au

This post may be cited as:
Jones H. (2018, 22 February 2018) ‘Ethical Use of Student Data in Higher Education – Advancing the conversation’. Research Ethics Monthly. Retrieved from https://ahrecs.com/human-research-ethics/ethical-use-student-data-higher-education-advancing-conversation
