From The Embassy of Good Science
<div>
==='''<span lang="EN-GB">What is a Reproducibility Network?</span>''' ===
</div><div><div>
<span lang="EN-GB">A national Reproducibility Network <u>(RN)</u> is a countrywide peer-led consortium that aims to improve research practices by promoting, supporting, and investigating factors contributing to robust research including, but not limited to, reproducibility, replicability, and Open Science. Activities may include promoting training activities, disseminating best practices, supporting research on reproducible research practices, and advocacy for reproducible and open research. </span>
</div><div>
<span lang="EN-GB">An RN typically serves as a hub to connect researchers to exchange ideas and good practices, promoting collaboration among researchers from a range of scientific disciplines. These networks provide infrastructure, facilitate opportunities for researchers and initiatives to support and amplify each other’s efforts, and foster community building as well as shared problem solving. </span>
</div><div>
<span lang="EN-GB">RNs can serve as connectors to other stakeholder groups such as universities, funders, or academic publishers.</span>
</div></div>
===What are the benefits of setting up a Reproducibility Network?===
<span lang="EN-GB">By providing seed funding for the establishment of a new RN, you actively contribute to strengthening reproducibility and Open Science in your local ecosystem. The widespread presence of RNs is crucial, as they function as points of contact for scientific communities that face different challenges and barriers across disciplinary, demographic, and geographic contexts. RNs can provide local, tailored support while remaining mindful of how ready their local communities are to implement reproducible research practices.</span>
==="Lessons learned" from the TIER2 award===
1. '''<span lang="EN-GB">Build strong community links.</span>''' <span lang="EN-GB">Involve already existing and successful RNs in the establishment of new RNs. This ensures that new RNs receive valuable guidance, input and support early in the establishment process.</span>
2. '''<span lang="EN-GB">Expand connections and broaden the reach.</span>''' <span lang="EN-GB">Reach out to researchers and other relevant stakeholders, such as universities, as this is important for local support and the sustainability of the RN. However, identifying and connecting with researchers in Horizon Europe Widening Participation countries (WIDERA countries) who are active in reproducible research and Open Science practices can be challenging.</span>
3. '''<span lang="EN-GB">Facilitate international support.</span>''' <span lang="EN-GB">Several RNs exist across the globe and more are being established. Build strong international connections amongst them to facilitate the sharing of resources and best practices; this will help to coordinate and amplify efforts.</span>
4. '''<span lang="EN-GB">Focus on the local ecosystem.</span>''' <span lang="EN-GB">RNs are national networks that promote transparent and trustworthy practices in their local research ecosystems. Recognize local needs, geopolitical conditions as well as barriers and available resources.</span>
===How has TIER2 supported the awarded networks?===
1. TIER2 members and award organizers have facilitated connections between awardees and existing international Reproducibility and Open Science networks via email as well as through virtual and in-person meetings.
2. <span lang="EN-GB">TIER2 award organizers have added awardees, with their consent, to various mailing lists and newsletters from different international RNs.</span>
3. <span lang="EN-GB">Further, TIER2 award organizers have invited awardees to attend and speak at several Open Science and reproducibility events to meet (steering group) members from other RNs and (inter-)national initiatives.</span>
4. <span lang="EN-GB">TIER2 project members as well as award organizers have provided the awardees with resources and information on relevant topics, including different RN structures, website layout and structure, as well as language.</span>
===Awardees of the TIER2 Reproducibility Network Award===
TIER2 is proud to announce the two awarded consortia based in Ukraine and Georgia who will receive the monetary awards from the Reproducibility Network open call this summer. Multiple scientific consortia from Horizon Europe “[https://www.era-learn.eu/support-for-partnerships/additional-activities/openness-inclusivness-transparency/widening-and-inclusiveness Widening Participation]” countries submitted applications describing their plans and motivations for establishing a Reproducibility Network in their home country which TIER2 would support with a €5000 prize.
[[File:Ukraine and Georgia RN.png|center|thumb]]
====Ukrainian Consortium====
The Ukrainian consortium, from the Institute for Open Science and Innovation ([https://www.facebook.com/inosi.org/ INOSI]), [https://twitter.com/optima_open OPTIMA] Project Consortium & [https://lpnu.ua/en Lviv Polytechnic National University], comprises researchers with a broad scientific background, ranging from informatics to chemistry and ecology. The core of the consortium already has experience of working together to promote Open Science in Ukraine, particularly within the OPTIMA project and within the Working Group on the [https://www.kmu.gov.ua/en/news/ukraina-pryiednalas-do-krain-ies-shcho-maiut-zatverdzhenyi-plan-realizatsii-pryntsypiv-vidkrytoi-nauky National Plan for Open Science development] in Ukraine. Asked what motivated them to participate in the open call, they state: ''“Ukraine needs good science to make good decisions in all spheres. This is particularly relevant during the war and will be needed for the post-war recovery. Reproducibility (as a part of the Open Science concept) can boost the value of academic research in Ukraine making science a real game-changer for progress''”. Regarding their future plans for the Ukrainian Reproducibility Network, they share: “''In the short term, the ambition is to kickstart the network of experts, able to lead the discussion on reproducibility and become a role model on the national level. In the long term, the ambition is, of course, to make reproducibility in research a standard by default. This has to be supported by co-creation and sharing best practices, research on research, and making an impact on national policy. We hope that the network will be viable and ambitious enough to compete for international grant funding to achieve this''”. With regard to the global state of reproducibility & scientific integrity, they say: “''The progress on the global level is visible, but it's only the beginning of a long way forward. 
The key to achieving the goal is a strong research culture that is often missing in many academic communities. Openness and transparency in performing and communicating research are the basic things to be established''.”
====Georgian Consortium====
The consortium from Georgia comprises three researchers from different institutions: the Department of Human Anatomy at Tbilisi State Medical University ([https://tsmu.edu/ts/home TSMU]), the Faculty of Medicine at Tbilisi State University ([https://www.tsu.ge/en TSU]), the Institute of Morphology, and the Scientific Department at Caucasus International University ([https://ciu.edu.ge/?lang=en CIU]). Brought together as team members of a research group, they were drafting a proposal for a Horizon Europe (HE) project when their HE grant coordinator alerted them to the TIER2 open call announcement. The team was immediately drawn to it, sharing that:
“''During our individual and collective research endeavors, we frequently encountered challenges in reproducing experiment results, a phenomenon that was not isolated to our work but across the global research landscape. [...] a consolidated effort was needed to elevate the state of research in our nation. [...] Moreover, the opportunity to foster a Reproducibility Network (RN) in Georgia provided a platform to unite our nation's fragmented research endeavors, drive standards in research methodologies, and integrate with the global scientific community''”.
Regarding their short-term plans after receiving the award, they list the following: “''Organize the foundational meeting, bringing together stakeholders from various Georgian research institutions, to lay down the operational blueprint for the RN; Launch training sessions that cover core skills in reproducibility, data management, and research design; Conduct sessions in universities and community centers to educate and foster trust in scientific research; Set up an official RN website and leverage social media for real-time updates and engagements''”.
In the long term, the team envisions: “''Establish partnerships with International Reproducibility Networks, facilitating knowledge exchange and joint research projects and collaborate with Georgian institutions to advocate for policies emphasizing reproducibility and transparency''”. Their global vision for the state of reproducibility and scientific integrity “''is one where every piece of research, irrespective of its domain or geography, stands the test of time and validation. We envision a scientific landscape where collaboration, transparency, and inclusivity aren't just ideals but are deeply integrated into research methodologies''.”
They share that the way forward is to prioritize
“''1) Education & Training: Equip researchers, especially the younger generation, with the necessary tools and knowledge to ensure reproducibility.''
''2) Open Science: Promote Open Access publications, making research universally accessible and subject to broader scrutiny.''
''3) Interdisciplinary Collaborations: Foster collaborations across disciplines, pooling expertise and resources to tackle complex research challenges.''
''4) Technological Integration: Leverage technology, especially AI and data analytics, to aid in ensuring research consistency and integrity.''
''If we could change one thing, it would be the isolated nature of scientific endeavors prevalent in many regions, like Georgia. We would foster a globally interconnected research network where findings, methodologies, and tools are shared seamlessly, accelerating scientific progress and ensuring its robustness''”.
Lastly, the Georgian consortium highlights what they would change in the global reproducibility landscape if they could:
“''1) Revise Academic Incentives: The current "publish or perish" culture sometimes prioritizes quantity over quality. We'd advocate for a system where researchers are rewarded for the reproducibility and integrity of their work, not just the volume.''
''2) Enhanced Training: Incorporate reproducibility and Open Science training at early academic stages, ensuring that upcoming researchers are well-equipped with the necessary skills and ethos.''
''3) Global Collaboration Platforms: Creation of digital platforms that facilitate global collaboration, data sharing, and mutual validation of research findings, breaking down silos and fostering a truly global scientific community''.”
====Serbian Consortium====
[[File:Serbian RN.png|center|thumb]]
<span lang="EN-US">[https://tier2-project.eu/ TIER2] is excited to announce the winner of the 2024 Open Call aimed at fostering the establishment of a third Reproducibility Network (RN) in “Widening Participation” countries - Serbia. The goal of Reproducibility Networks is to promote rigorous research practices, facilitate interdisciplinary collaborations and discussions, and enhance the trustworthiness of scientific work. The Serbian consortium will thus receive a €5,000 grant to organise an initial meeting, laying the groundwork for establishing an RN in their country.</span>
<span lang="EN-US">The Serbian consortium, consisting of nine organisations – six institutes and three faculties –, brings together diverse academic backgrounds with a shared commitment to improving research culture in Serbia. They focus on integrating Open Science, reproducibility, and inclusive policies into institutions and education. The consortium has previously collaborated on initiatives like the [https://nitra.gov.rs/en/ Team for Open Science in Serbia] and [https://nitra.gov.rs/en/inovacije/projekat-saige the Saige project], organising workshops and training to promote open science practices.</span>
<span lang="EN-US">Motivated by challenges such as low research investment and a scientific system that prioritises quantity over quality, they believe establishing a Reproducibility Network will enhance collaboration and help to address these issues. In the short term, they plan to promote the network through conferences, a kick-off event, and online platforms. Long-term, they aim to integrate Open Science into curricula, incentivise reproducibility, support initiatives beyond major centers, advocate for policy changes, and build international collaborations.</span>
<span lang="EN-US">Their vision for global reproducibility is one where research is transparent, ethical, and rigorous:</span>
<span lang="EN-US">''“In this ideal state, researchers across all disciplines adhere to principles of Open Science, ensuring that their methods, data, and results are accessible and reproducible.”''</span>
<span lang="EN-US">– Matija Zlatar on behalf of the Serbian consortium</span>
<span lang="EN-US">They advocate for integrating these principles into education, establishing incentive systems, and fostering collaboration:</span>
<span lang="EN-US">''“We should integrate reproducibility and Open Science practices into university curricula and professional development programs to equip researchers with the necessary skills and knowledge to conduct reproducible research.”''</span>
<span lang="EN-US">– Matija Zlatar on behalf of the Serbian consortium</span>
===Resources to set up a Reproducibility Network===
*[https://osf.io/ndwsj Application template]
*[https://osf.io/tsmxh Reviewer guidelines]
==='''<span lang="EN-GB">Call to action – what could you do?</span>'''===
<div>
*'''<span lang="EN-GB">Are you a researcher?</span>''' <span lang="EN-GB">Join an existing RN in your country or, if none exist, identify supporters and form your own network.</span>
</div><div>
*'''<span lang="EN-GB">Are you a funder?</span>''' <span lang="EN-GB">Offer your support by providing (additional) funding for personnel costs, events on reproducibility practices and Open Science, or training opportunities. Further, establish your own award calls to support the establishment of more RNs.</span>
</div><div>
*'''<span lang="EN-GB">Are you a publisher?</span>''' <span lang="EN-GB">Support the wide range of outputs generated by RNs, for example via special issues or journals, to help them increase their reach.</span>
</div>
===Not sure if your country has an established Reproducibility Network?===
<span lang="EN-GB">Visit the Global Networks page hosted by the UKRN to find out if a Reproducibility Network already exists in your country: https://www.ukrn.org/global-networks/.</span> <div></div>
Please watch the video carefully! The purpose of this exercise is to assess your understanding of the concept of circularity and its role in addressing today’s environmental challenges. Circularity is essential because it helps reduce resource extraction, waste, and pollution while keeping materials in use for as long as possible. By fostering more sustainable production and consumption patterns, circularity contributes to protecting ecosystems, supporting equitable economies, and achieving the Sustainable Development Goals.
Read the slides carefully and learn about the concept of climate mainstreaming within organisations and the key steps required for its successful implementation.
Doing research with communities affected by climate change: Climate-conscious methodologies matrix (for researchers and ethics reviewers)
Please go through the PowerPoint presentation.
Introduction to the evaluation of the effectiveness of Research Ethics and Integrity (REI) training
What do we know about measuring training effectiveness?
Self-assessment is one of the most prevalent means of measuring the effectiveness of REI training; the second most frequently used method is a moral reasoning test. Most tools were administered before and after the intervention (with or without control groups) and the results compared, though other measures were added to evaluate the learning process or student progress.
The tests designed for ethics training (such as the DIT, DEST, and TESS) cannot be universally applied to all REI training, owing to the very different training formats and/or the limited availability of the tests. There are also qualitative options (such as learning diaries or tasks submitted during other courses) for monitoring learning progress and, through that, assessing the effectiveness of training.
See figure 1 outlining the identified measures and their application scale and feasibility (more details in D4.1):
Figure 1. Measurement tools identified in the literature review (numbers indicate Kirkpatrick’s levels, see below) (tool descriptions in D4.1).
[[File:Screenshot 2025-11-17 205639.png|center|frameless|500x500px]]
As can be seen in figure 1, self-reporting (blue bubbles) is the most feasible measure and can, in most cases, be implemented at large scale. It is no wonder that, based on the literature review, this is also the most used approach. The SPEEES and SOLKA tests also utilise self-reporting.
Most tools measure the content or the learning process (green bubbles): they give information about what was learned during the training. As indicated, the feasibility of these measures is not high: either considerable work is needed to implement the tool, the tools are not openly available, or they are field-specific. Possibilities for measuring behaviour (yellow bubbles) are scarce.
Comparing results collected with various tools is almost impossible because they measure different aspects of training with different analysis instruments. It is not possible to determine whether qualitative (indicated as ‘qual’ in figure 1) or quantitative (indicated as ‘stat’) methods of analysis are more feasible: feasibility depends on a combination of aspects, such as access to the tool, the need for special equipment, and the competence required.
The SOLO taxonomy (Biggs & Collis, 1982; Biggs & Tang, 2007) can be applied in REI teaching and learning to structure, design, and assess learning outcomes.
*It categorizes levels of understanding from simple to complex.
*The taxonomy helps educators identify whether learners are merely recognizing ethical issues or integrating and applying them in sophisticated ways.
*Learning tasks and assignments should be designed to elicit responses that demonstrate understanding at the intended SOLO level.
*Facilitators are responsible for aligning assignments with the learning outcomes they aim to assess.
The taxonomy, therefore, serves as both a framework for assessment and a guide for instructional design in REI contexts.
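For assessment purposes, the five SOLO levels can be treated as an ordinal scale. The following minimal Python sketch illustrates this; the level names are from Biggs & Collis (1982), but the `solo_score` helper and the rubric comments are hypothetical, not an official coding scheme:

```python
# SOLO taxonomy levels as an ordinal scale (Biggs & Collis, 1982).
# The one-line rubric comments are illustrative only.
SOLO_LEVELS = [
    "prestructural",      # 1: response misses the point
    "unistructural",      # 2: one relevant aspect identified
    "multistructural",    # 3: several aspects, but unconnected
    "relational",         # 4: aspects integrated into a coherent whole
    "extended abstract",  # 5: understanding generalised to new contexts
]

def solo_score(level_name: str) -> int:
    """Map a SOLO level name to its ordinal score (1-5)."""
    return SOLO_LEVELS.index(level_name.lower()) + 1

print(solo_score("relational"))  # 4
```

Coding texts to such an ordinal scale is what makes learner responses from different assignments comparable at the level of structural complexity.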
The aim of educating secondary school students is to raise their awareness of the ethical and integrity challenges they may face.
Resources for secondary school students include (please see the last section of the BEYOND trainer guide for an overview of materials divided by topics and target groups):
*The [[Integrity Teacher Guide]] for secondary school students' education.
*The [[Path2integrity Learning cards]] (Path2Integrity learning cards S) focusing on students in high schools, and a dedicated handbook (S-Series handbook).
For training effectiveness measurement facilitators can use the following tools for learning output collection and for analysing collected material:
{| class="wikitable"
|+
!'''Tool for collecting learning outputs'''
!'''Details'''
!'''Analysis instrument **'''
|-
|'''ProLearning app'''
|''ProLearning'': https://www.epfl.ch/labs/chili/dualt/current-projects/realto/
<span lang="EN-GB"></span>
|learning analytics
|-
|'''Engagement app'''
|''ForgetNot'' (by EduLog): https://web.htk.tlu.ee/forgetnot
|learning analytics
|-
|'''Self-Reflection Form/Compass'''
|App under development, [https://docs.google.com/forms/d/17ORaVeaLjBYucufYNGF6TNjgtqqNdlk5BhSp5bfM5eA/copy form] * (for copying and editing)
|SOLO taxonomy, reflection levels, content criteria
|-
|'''Pre-post texts'''
|Collect a short text (e.g. a response to a case or short essay) before the training and after the training
|SOLO taxonomy, reflection levels, content criteria
|-
|'''Learning diaries'''
|Ask learners to keep a diary over a certain period; provide guiding questions or topics for each submission
|SOLO taxonomy, reflection levels, content criteria
|-
|'''Group reports'''
|Ask groups working together to provide a (short) group report (or provide a template with points to work on)
|SOLO taxonomy, content criteria
|-
|'''Group discussions'''
|Monitor the group discussions to evaluate the level of understanding and content discussed (scaffold as appropriate)
|SOLO taxonomy, content criteria
|-
|'''Group dynamics'''
|''CoTrack'' application: https://www.cotrack.website/en/
|learning analytics
|-
|'''Retention check'''
|After a certain time (few weeks/months) ask learners to provide a short text (analysis of a case, short essay on an ethics topic/question). Compare the levels of understanding to another piece collected during or right after the training.
|SOLO taxonomy, content criteria
|}
For instance, to measure participants’ reactions during or right after the training, the ProLearning app or the Self-Reflection Form can be used. In addition, if learners worked in groups and provided a group report, the learning process can be evaluated with the SOLO taxonomy to measure levels of understanding. Moreover, if possible, a couple of months after the training an additional case study could be given to the same learners, and the content of their analysis could again be evaluated with the SOLO taxonomy. This kind of effectiveness measure makes it possible to triangulate measurements at different time points.
There is evidence that such a framework can be used to analyse reflective journals/learning logs (see Bell et al., 2011). We have also tested the feasibility of this framework in the context of REI. Figure 1 illustrates how reflection levels varied over a six-week diary-keeping period related to REI learning. As indicated, some participants (P1–P5) show varying levels, while others remain constant. This exploration suggests that reflective journals/writing in an REI context can be analysed by applying the framework of levels of reflective thinking.
Figure 1. Example of analysis results (reflection levels) of learning diaries by 5 training participants (P1–P5).
[[File:Img7.png|center|frameless|500x500px]]
Ethical principles (Kitchener, 1985) can be used to evaluate the characteristics of the ethical dilemma:
[[File:Img9.png|center|frameless|500x500px]]
Figure 1. Ethical principles (Kitchener, 1985).
In the case introduced in the section ‘What is this about?’, the following response may be given by learners:
Ethical issues that could emerge in this case: underage children, parental consent, procedure of getting the consent, considering the wishes of the child, organising data collection, cooperation with the (pre)school, which is more harmful for the child - being recorded (for the research purposes) or being upset about the procedure of fulfilling the parent’s demands.
Ethical principles that can be at stake in this case:
* respect for autonomy - in this case there is a conflict between the autonomy of the child and the autonomy of the parent - for the researchers there is no simple answer. The procedure of informing should be altered to prevent these situations from happening in the future.
* doing no harm (non-maleficence) - in this case the researchers’ only chance of not harming the children whose parents had not given their consent would have been not proceeding with data collection, but in that case they would be harming the research/society/greater good.
* benefiting others (beneficence) - the research is probably important to the society, but it is not right to harm anyone to benefit others.
* being just (justice) - the researchers must be fair towards the children, their parents, the school, the society - if their rights get into a conflict of interest, new means of data collection must be found.
* being faithful (fidelity) - researchers must respect the wishes of the parents, but also those of the children.
to set up and collect student responses (anonymously). The tools analyse the collected data instantly and provide teachers with an overview of the impact of the training. In addition to collecting learner reactions, the topics indicated by the app bring those aspects of learning into focus and learners start paying greater attention to them – this may have a more long-term impact on their behaviour.
ProLearning collects learner responses to teacher-generated questions (yes/no or scale 0-100), asks teachers to predict the learner responses and write a short description of the learning situation. Only then can the teacher see the graph outlining learner responses (based on groups) in relation to their own predictions. If the teacher’s prediction of the performance of a group of learners is seen as the expected learning outcome, then the use of this tool can provide a measure of the alignment or discrepancy between the expected effectiveness and the learners’ performance.
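The alignment measure described above can be sketched as a simple signed gap between the teacher's prediction and the observed group mean. The function and data below are hypothetical illustrations (ProLearning performs this kind of comparison internally and presents it as a graph):

```python
def prediction_gap(predicted: float, responses: list[float]) -> float:
    """Signed gap between a teacher's predicted group score and the
    observed mean of learner responses, on the same 0-100 scale.
    Positive values mean the group outperformed the prediction."""
    observed = sum(responses) / len(responses)
    return observed - predicted

# Hypothetical session: the teacher predicted a group average of 70;
# four learners responded on a 0-100 scale.
print(prediction_gap(70, [80, 65, 90, 75]))  # 7.5
```

A gap near zero indicates that the expected effectiveness matched the learners' performance; large gaps in either direction signal that the teacher's model of the group needs revisiting.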
In addition, engagement in activities may have an impact on how well people acquire competencies. For measuring engagement, an application ForgetNot (by EduLog) is available. In the application, learners can provide their feedback on three aspects of the training: how they were engaged in the activities (behavioural aspect of learning), how they felt (emotional aspect of learning) and how relevant the knowledge was for them (cognitive aspect of learning). All these aspects are relevant in case of successful learning. The responses accumulate based on the group and provide information to the facilitator how the training format was perceived by learners.
Both applications are suitable for any educational context, including HE. Data collected by the tools provide an overview of group-level progress and are most suitable for evaluating the short-term effects of training at Kirkpatrick level 1 (learner reactions), for example in short trainings or for specific activities during training sessions. Both tools are freely available online; the teacher needs to create an account to set up inquiry sessions and see the results. Students do not need an account: they can join with the session code provided by the teacher.
MMLA tools use statistics to analyse collected information and create graphs to illustrate the results. Teachers need to be able to read the graphs and draw conclusions based on the data. Modifications can be made to the training format and implementation based on the results.
MMLA tools may be most suitable for beginners and short trainings where collecting reactions fast is relevant. But they can also be used with more advanced level learners.
The tools are available at:
ProLearning: [http://www.prolearning.realto.ch www.prolearning.realto.ch]
ForgetNot (by EduLog): https://web.htk.tlu.ee/forgetnot
While pre- and post-tests are a very common training-effectiveness measure, we propose a pre- and post-text measure.
Tests are, of course, easy to implement and analyse (if statistics are used), but an improvement in average scores may not give the entire picture of the learning process. In addition, a further ‘post-post’ text, collected several months after the training, can be used to assess the retention of competencies; this may also provide insights into potential changes in the learner’s behaviour or practices (Kirkpatrick’s levels 2 and 3).
The learner provides a text (either an essay or a short reflection on a case) prior to the training, participates in the training activities, and then submits another text (again, a short essay or a discussion of a case). Optionally, another text can be produced several months after the end of the training. If the same analysis tool is used, the long-term impact can be measured.
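Assuming each text has been coded to a SOLO level (an ordinal 1-5 score), the comparison across time points reduces to simple deltas. The helper below is a hypothetical sketch of that bookkeeping, not part of any of the named tools:

```python
def solo_progress(scores: dict[str, int]) -> dict[str, int]:
    """Given SOLO levels (1-5) coded for a learner's texts at each time
    point, return the change from pre to post and, if a retention text
    was collected, from post to retention."""
    progress = {"pre_to_post": scores["post"] - scores["pre"]}
    if "retention" in scores:
        progress["post_to_retention"] = scores["retention"] - scores["post"]
    return progress

# A learner coded unistructural (2) before training, relational (4) after,
# and multistructural (3) in a retention text months later:
print(solo_progress({"pre": 2, "post": 4, "retention": 3}))
# {'pre_to_post': 2, 'post_to_retention': -1}
```

A negative retention delta, as in the example, is exactly the kind of long-term signal a single pre/post comparison would miss.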
This measure is suitable for the HE context and for all disciplines, and it is simple to implement. Common analysis tools make the work simpler and the progress levels comparable. The texts can be evaluated with the SOLO taxonomy and the reflection levels; content criteria, such as ethical principles, ethical analysis, and ethical approaches, can also be sought in the texts. It may be challenging to use with large groups, as reading and analysis take time, and it may be difficult to reach learners months after the end of the training.
Ethics sections in doctoral dissertations can also be analysed as ‘pre- and post-texts’ if the final product can be compared to earlier drafts.
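To illustrate the content-criteria idea, the sketch below (Python) checks which terms from a criteria list appear in a pre-text and a post-text. The criteria list, example texts, and scoring are illustrative assumptions only, not part of the measure itself; real analysis would rely on careful human reading or an established analysis tool.

```python
# Hypothetical sketch: compare how many pre-defined content criteria
# (e.g. ethical principles) appear in a learner's pre- and post-texts.
# The criteria list and the texts below are invented for illustration.

def criteria_coverage(text, criteria):
    """Return the set of criteria terms that occur in the text."""
    lowered = text.lower()
    return {term for term in criteria if term in lowered}

def progress(pre_text, post_text, criteria):
    """Report which criteria are newly covered in the post-text."""
    before = criteria_coverage(pre_text, criteria)
    after = criteria_coverage(post_text, criteria)
    return {
        "pre_count": len(before),
        "post_count": len(after),
        "newly_covered": sorted(after - before),
    }

# Illustrative criteria drawn from common REI themes (assumed, not prescribed)
CRITERIA = ["informed consent", "confidentiality", "plagiarism", "authorship"]

report = progress(
    "The study should ask for informed consent.",
    "Beyond informed consent, confidentiality and fair authorship matter.",
    CRITERIA,
)
print(report["newly_covered"])  # criteria mentioned only in the post-text
```

A keyword check like this is obviously crude; it only sketches how the same automated comparison could be applied consistently to pre-, post-, and post-post texts.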
The tool is suitable for use in training for all target groups in the HE context.
While national REI surveys (barometers) may not address training directly, they can be used as macro-level, long-term reflections of the state of REI in a given context, and as such they may also indicate whether training efforts, in a broad sense, have been effective (Kirkpatrick’s level 4). As the evaluation at this macro level cannot be tied to a specific training, it instead indicates the extent of challenges that could be, or could have been, alleviated through training, and points to training needs. For instance, surveys usually collect information about participation in trainings and ask how confident researchers feel in dealing with ethical issues during their research (statistical data analysis).
In addition, national as well as institutional surveys may provide an opportunity to collect cases of questionable practices, and future trainings could address those topics. To collect cases that researchers consider confusing or problematic, open-ended questions could be added to the surveys.
In addition, the health of the entire research community can be evaluated by monitoring the leadership aspect in the surveys. For analysing this aspect, the REI Leadership framework (Tammeleht et al., 2022, submitted) can be used. The meta-analysis provides information about the wider impact of research practices in research institutions, and it also helps institutional leaders support everyone in their organisation in adopting ethical research practices.
This tool is suitable for use in training with ECRs and active researchers.
How can researchers reflect on their values, imagine alternative futures, and build solidarity in the face of shared struggles? Listen to Josie Chambers, Rianne Janssen, and Lucy Sabin as they explore transformative research.
'''Greening labs''' means reducing the environmental impact of laboratory settings by implementing sustainable practices. In this regard, several small '''eco-friendly''' actions can be adopted in lab activities to contribute to environmental sustainability.
'''Watch the video on “Green Labs from the Faculty of Science and Engineering of the University of Groningen” and pay attention to the everyday small actions that can be adopted to improve lab efficiency and make lab research more environmentally friendly.'''
We will begin by watching a short video on '''environmental justice'''. After watching the video, you will be asked to complete a brief questionnaire based on the content you’ve just seen.
The purpose of this exercise is to facilitate an understanding of sustainability as a wicked problem. At the end of the video, some questions will help you reflect on what you have seen.
Please <span lang="EN-US">read through the introductory PowerPoint presentation “From disconnection to planetary stewardship”.</span>
Laboratories consume huge amounts of plastic, the majority of which is single-use and not recycled. Green Labs Austria presents the problem of plastic waste from labs and gives guidelines on where to start addressing the problem in a lab (Green Labs Austria, 2024. ''Pioneering sustainability in scientific research.'' ''MIT Science Policy Review''). Through a background study, they evaluate which plastic materials can be recycled, which can be replaced, and how plastic materials can be recycled for greener labs ([https://www.youtube.com/watch?v=aojnkoh4fPA Tackling the plastic problem in the lab]).
'''Watch this video and familiarize yourself with the types of plastic materials used in labs which can be recycled or replaced, as well as the steps involved in setting up a plastic recycling pipeline.'''
<span lang="EN-US">The methodology of BEYOND cases is rooted in the values clarification method. It simultaneously develops discussions on ethics and values-related issues while enhancing competencies necessary for dialogic communication, including: '''1) skills for listening and responding, 2) openness, 3) empathy, and 4) mutuality orientation'''<sup>5</sup>.</span>
<span lang="EN-US">This particular methodology has been developed through various educational games created by the Centre for Ethics at the University of Tartu, with the first game released in 2010 for teachers. Subsequent games have been designed for medical workers, students, researchers and the general public. The training material is intended for use with highly interactive active learning methods, such as group work and group discussions. The method combines individual activities (first taking personal responsibility by choosing one’s own solution) with group activities (discussing the case, the solutions, and their underlying motivations and values, and potentially reaching a consensus).</span>
<span lang="EN-US">The material consists of ethical dilemmas which are developed in accordance with the methodology described by Parder et al. (2024)<sup>6</sup>.</span>
*The narrative is described from the perspective of the protagonist – the protagonist must be someone the trainees find easy to identify with.
*<span lang="EN-US">The characters and the basic relationships between them are described without too much detail, thus leaving room for trainees to fill in the missing information with their own life experiences.</span>
*<span lang="EN-US">The information about the motives of the actors has been kept to a minimum to give the trainees an opportunity to draw from their experiences.</span>
*<span lang="EN-US">The temporal dimension of the narrative is also kept limited – in some cases background information is given, but the pre-given choices are kept within one temporal moment.</span>
*<span lang="EN-US">The dilemma and the pre-given solutions are balanced – the narrative is written from a neutral perspective, and each pre-given solution is morally acceptable from the perspective of at least one ethical theory.</span>
<span lang="EN-US">The drafting of solutions was inspired by four ethical theories: deontology, utilitarianism, care ethics, and virtue ethics. It should be noted that the solutions are not in perfect accordance with these theories, as the aim of this training methodology is not to teach ethical theories to trainees, but rather to provide realistic alternative solutions to choose from.</span>
<span lang="EN-US">Finally, the aim of the methodology is not to teach a “right” answer to the dilemma, as dilemmas often involve conflicts between two or more valuable ethical principles, but to focus reflection on the cases and solutions and to guide participants in moral reasoning, with emphasis on the skills of listening and discussing.</span><div>
----<div>
<span lang="EN-US">[5] Kent, M. L., and Taylor, M. (2002). Toward a Dialogic Theory of Public Relations. ''Public Relations Review,'' ''28''(1), 21–37. https://doi.org/10.1016/S0363-8111(02)00108-X; Taylor, M., and Kent, M. L. (2014). Dialogic Engagement: Clarifying Foundational Concepts. ''Journal of Public Relations Research,'' ''26''(5), 384–398. https://doi.org/10.1080/1062726X.2014.956106; Yang, S.-U., Kang, M., and Cha, H. (2015). A Study on Dialogic Communication, Trust, and Distrust: Testing a Scale for Measuring Organization–Public Dialogic Communication (OPDC). ''Journal of Public Relations Research,'' ''27''(2), 175–192. https://doi.org/10.1080/1062726X.2015.1007998</span>
</div><div>
<span lang="EN-US">[6] Parder, M. L., Tammeleht, A., Juurik, M., Paaver, T., Velbaum, K., and Harro-Loit, H. (2024). Digital Discussion Game on Values: Development, Use and Possibilities for Measuring Its Functionality. In Y. P. Cheng, M. Pedaste, E. Bardone, and Y. M. Huang (eds.), Innovative Technologies and Learning. ICITL 2024. Lecture Notes in Computer Science, 14785. Springer, Cham.</span>
</div></div>
