Text (Instruction Step Text)

From The Embassy of Good Science
Describe the actions the user should take to experience the material (including preparation and follow up if any). Write in an active way.


Users of the Seesaw experience making a pandemic-related decision that involves trade-offs. For instance, approving human challenge trials could accelerate vaccine development and potentially save lives, but healthy participants might accept possibly fatal risks. Users choose between two alternative options (represented as the two extremes of a seesaw) and hear the arguments for each option through prerecorded video clips.
In this lecture, Olivier Le Gall articulates the foundational principles of Open Science. The initial segment of the lecture addresses the rationale for opening science and provides a comprehensive overview of its concept. The subsequent segment delves into the core values and guiding principles underpinning Open Science. Finally, the concluding segment elucidates the anticipated social benefits derived from the implementation of Open Science. '''Watch the lecture and then answer the questions.''' '''Further reading:''' The Embassy of Good Science: “[[Theme:Bbf561cd-7369-4314-ac74-2c870373af9d|Open Science]]” UNESCO Recommendation on Open Science. (2021) [https://doi.org/10.54677/MNMH8546 https://doi.org/10.54677/MNMH8546]
<div> ==='''<span lang="EN-GB">What is a Reproducibility Network?</span>'''  === </div><div><div> <span lang="EN-GB">A national Reproducibility Network <u>(RN)</u> is a countrywide peer-led consortium that aims to improve research practices by promoting, supporting, and investigating factors contributing to robust research including, but not limited to, reproducibility, replicability, and Open Science. Activities may include promoting training, disseminating best practices, supporting research on reproducible research practices, and advocacy for reproducible and open research.  </span> </div><div> <span lang="EN-GB">An RN typically serves as a hub to connect researchers to exchange ideas and good practices, promoting collaboration among researchers from a range of scientific disciplines. These networks provide infrastructure, facilitate opportunities for researchers and initiatives to support and amplify each other’s efforts, and foster community building as well as shared problem solving.  </span> </div><div> <span lang="EN-GB">RNs can serve as connectors to other stakeholder groups such as universities, funders, or academic publishers.</span> </div></div> ===Benefits of setting up a Reproducibility Network=== <span lang="EN-GB">By providing seed funding for the establishment of a new RN, you actively contribute to the strengthening of reproducibility and Open Science in your local ecosystem. The widespread presence of RNs is crucial, as they function as points of contact for scientific communities that, across disciplinary, demographic, and geographic contexts, face different challenges and barriers. RNs can provide local and tailored support and keep in mind the different stages of readiness of their local communities for implementing reproducible research practices.</span> ==="Lessons learned" from the TIER2 award=== 1. 
'''<span lang="EN-GB">Build strong community links.</span>''' <span lang="EN-GB">Involve already existing and successful RNs in the establishment of new RNs. This ensures that new RNs receive valuable guidance, input, and support early in the establishment process.</span> 2. '''<span lang="EN-GB">Expand connections and broaden the reach.</span>''' <span lang="EN-GB">Reach out to researchers and other relevant stakeholders, such as universities, as this is important for local support and the sustainability of the RN. However, identifying and connecting with researchers in Horizon Europe Widening Participation countries (WIDERA countries) who are active in reproducible research and Open Science practices can be challenging.</span> 3. '''<span lang="EN-GB">Facilitate international support.</span>''' <span lang="EN-GB">Several RNs across the globe exist and more are being established. Build strong international connections amongst them to facilitate the sharing of resources and best practices; this will help coordinate and amplify efforts.</span> 4. '''<span lang="EN-GB">Focus on the local ecosystem.</span>''' <span lang="EN-GB">RNs are national networks that promote transparent and trustworthy practices in their local research ecosystems. Recognize local needs, geopolitical conditions, as well as barriers and available resources.</span> ===How has TIER2 supported the awarded networks?=== 1. TIER2 members and award organizers have facilitated connections between awardees and existing international Reproducibility and Open Science networks via email contacts as well as through virtual and in-person meetings. 2. 
<span lang="EN-GB">TIER2 award organizers have added awardees, with their consent, to various mailing lists and newsletters from different international RNs.</span> 3. <span lang="EN-GB">Further, TIER2 award organizers have invited awardees to attend and speak at several Open Science and reproducibility events to meet (steering group) members from other RNs and (inter-)national initiatives.</span> <span lang="EN-GB">4. TIER2 project members as well as award organizers have provided the awardees with resources and information on relevant topics, including different RN structures, website layout and structure, as well as language.</span> ===Awardees of the TIER2 Reproducibility Network Award=== TIER2 is proud to announce the two awarded consortia based in Ukraine and Georgia who will receive the monetary awards from the Reproducibility Network open call this summer. Multiple scientific consortia from Horizon Europe “[https://www.era-learn.eu/support-for-partnerships/additional-activities/openness-inclusivness-transparency/widening-and-inclusiveness Widening Participation]” countries submitted applications describing their plans and motivations for establishing a Reproducibility Network in their home country, which TIER2 would support with a €5,000 prize. [[File:Ukraine and Georgia RN.png|center|thumb]] ====Ukrainian Consortium==== The Ukrainian consortium, from the Institute for Open Science and Innovation ([https://www.facebook.com/inosi.org/ INOSI]), [https://twitter.com/optima_open OPTIMA] Project Consortium & [https://lpnu.ua/en Lviv Polytechnic National University], comprises researchers with a broad scientific background, ranging from informatics to chemistry and ecology. 
The core of the consortium already has experience of working together to promote Open Science in Ukraine, particularly within the OPTIMA project and within the Working Group on the [https://www.kmu.gov.ua/en/news/ukraina-pryiednalas-do-krain-ies-shcho-maiut-zatverdzhenyi-plan-realizatsii-pryntsypiv-vidkrytoi-nauky National Plan for Open Science development] in Ukraine. Asked what motivated them to participate in the open call, they state: ''“Ukraine needs good science to make good decisions in all spheres. This is particularly relevant during the war and will be needed for the post-war recovery. Reproducibility (as a part of the Open Science concept) can boost the value of academic research in Ukraine making science a real game-changer for progress''”. Regarding their future plans for the Ukrainian Reproducibility Network, they share: “''In the short term, the ambition is to kickstart the network of experts, able to lead the discussion on reproducibility and become a role model on the national level. In the long term, the ambition is, of course, to make reproducibility in research a standard by default. This has to be supported by co-creation and sharing best practices, research on research, and making an impact on national policy. We hope that the network will be viable and ambitious enough to compete for international grant funding to achieve this''”. With regard to the global state of reproducibility & scientific integrity, they say: “''The progress on the global level is visible, but it's only the beginning of a long way forward. The key to achieving the goal is a strong research culture that is often missing in many academic communities. 
Openness and transparency in performing and communicating research are the basic things to be established''.” ====Georgian Consortium==== The consortium from Georgia comprises three researchers from different institutions: the Department of Human Anatomy at Tbilisi State Medical University ([https://tsmu.edu/ts/home TSMU]), the Faculty of Medicine at Tbilisi State University ([https://www.tsu.ge/en TSU]), the Institute of Morphology, and the Scientific Department at Caucasus International University ([https://ciu.edu.ge/?lang=en CIU]). Brought together as team members of a research group, they were drafting a proposal for a Horizon Europe (HE) project when their HE grant coordinator alerted them to the TIER2 open call announcement. The team was immediately drawn to it, sharing that: “''During our individual and collective research endeavors, we frequently encountered challenges in reproducing experiment results, a phenomenon that was not isolated to our work but across the global research landscape. [...]  a consolidated effort was needed to elevate the state of research in our nation. [...] Moreover, the opportunity to foster a Reproducibility Network (RN) in Georgia provided a platform to unite our nation's fragmented research endeavors, drive standards in research methodologies, and integrate with the global scientific community''”. 
Regarding their short-term plans after receiving the award, they list the following: “''Organize the foundational meeting, bringing together stakeholders from various Georgian research institutions, to lay down the operational blueprint for the RN; Launch training sessions that cover core skills in reproducibility, data management, and research design; Conduct sessions in universities and community centers to educate and foster trust in scientific research: Set up an official RN website and leverage social media for real-time updates and engagements”.'' ''In the long term, the team aims to “Establish partnerships with International Reproducibility Networks, facilitating knowledge exchange and joint research projects and collaborate with Georgian institutions to advocate for policies emphasizing reproducibility and transparency''”. Their global vision for the state of reproducibility and scientific integrity “''is one where every piece of research, irrespective of its domain or geography, stands the test of time and validation. We envision a scientific landscape where collaboration, transparency, and inclusivity aren't just ideals but are deeply integrated into research methodologies''.” They share that the way forward is to prioritize “''1)Education & Training: Equip researchers, especially the younger generation, with the necessary tools and knowledge to ensure reproducibility.'' ''2)Open Science: Promote Open Access publications, making research universally accessible and subject to broader scrutiny.'' ''3)Interdisciplinary Collaborations: Foster collaborations across disciplines, pooling expertise and resources to tackle complex research challenges.'' ''4)Technological Integration: Leverage technology, especially AI and data analytics, to aid in ensuring research consistency and integrity.'' ''If we could change one thing, it would be the isolated nature of scientific endeavors prevalent in many regions, like Georgia. 
We would foster a globally interconnected research network where findings, methodologies, and tools are shared seamlessly, accelerating scientific progress and ensuring its robustness''”. Lastly, the Georgian consortium highlights what they would change in the global reproducibility landscape if they could: “''1)Revise Academic Incentives: The current "publish or perish" culture sometimes prioritizes quantity over quality. We'd advocate for a system where researchers are rewarded for the reproducibility and integrity of their work, not just the volume. 2)Enhanced Training: Incorporate reproducibility and Open Science training at early academic stages, ensuring that upcoming researchers are well-equipped with the necessary skills and ethos. 3)Global Collaboration Platforms: Creation of digital platforms that facilitate global collaboration, data sharing, and mutual validation of research findings, breaking down silos and fostering a truly global scientific community''.” ====Serbian Consortium==== [[File:Serbian RN.png|center|thumb]] <span lang="EN-US">[https://tier2-project.eu/ TIER2] is excited to announce the winner of the 2024 Open Call aimed at fostering the establishment of a third Reproducibility Network (RN) in “Widening Participation” countries: Serbia. The goal of Reproducibility Networks is to promote rigorous research practices, facilitate interdisciplinary collaborations and discussions, and enhance the trustworthiness of scientific work. The Serbian consortium will thus receive a €5,000 grant to organise an initial meeting, laying the groundwork for establishing an RN in their country.</span> <span lang="EN-US">The Serbian consortium, consisting of nine organisations – six institutes and three faculties – brings together diverse academic backgrounds with a shared commitment to improving research culture in Serbia. They focus on integrating Open Science, reproducibility, and inclusive policies into institutions and education. 
The consortium has previously collaborated on initiatives like the [https://nitra.gov.rs/en/ Team for Open Science in Serbia] and [https://nitra.gov.rs/en/inovacije/projekat-saige the Saige project], organising workshops and training to promote Open Science practices.</span> <span lang="EN-US">Motivated by challenges such as low research investment and a scientific system that prioritises quantity over quality, they believe establishing a Reproducibility Network will enhance collaboration and help to address these issues. In the short term, they plan to promote the network through conferences, a kick-off event, and online platforms. Long-term, they aim to integrate Open Science into curricula, incentivise reproducibility, support initiatives beyond major centres, advocate for policy changes, and build international collaborations.</span> <span lang="EN-US">Their vision for global reproducibility is one where research is transparent, ethical, and rigorous:</span> <span lang="EN-US">''“In this ideal state, researchers across all disciplines adhere to principles of Open Science, ensuring that their methods, data, and results are accessible and reproducible.”''</span> <span lang="EN-US">– Matija Zlatar on behalf of the Serbian consortium</span> <span lang="EN-US">They advocate for integrating these principles into education, establishing incentive systems, and fostering collaboration:</span> <span lang="EN-US">''“We should integrate reproducibility and Open Science practices into university curricula and professional development programs to equip researchers with the necessary skills and knowledge to conduct reproducible research.”''</span> <span lang="EN-US">– Matija Zlatar on behalf of the Serbian consortium</span> ===Resources to set up a Reproducibility Network=== *[https://osf.io/ndwsj Application template] *[https://osf.io/tsmxh Reviewer guidelines] ==='''<span lang="EN-GB">Call to action – what could you do?</span>'''=== <div> *'''<span lang="EN-GB">Are you a 
researcher?</span>''' <span lang="EN-GB">Join an existing RN in your country or, if none exist, identify supporters and form your own network.</span> </div><div> *'''<span lang="EN-GB">Are you a funder?</span>''' <span lang="EN-GB">Offer your support by providing (additional) funding for personnel costs, events on reproducibility practices and Open Science, or training opportunities. Further, establish your own award calls to support the establishment of more RNs.</span> </div><div> *'''<span lang="EN-GB">Are you a publisher?</span>''' <span lang="EN-GB">Support the wide range of outputs generated by RNs, for example via special issues or journals, to help them increase their reach.</span> </div> ===Not sure if your country has an established Reproducibility Network?=== <span lang="EN-GB">Visit the Global Networks page hosted by the UKRN to find out if a Reproducibility Network already exists in your country: https://www.ukrn.org/global-networks/.</span> <div></div>
Please watch the video carefully! The purpose of this exercise is to assess your understanding of the concept of circularity and its role in addressing today’s environmental challenges. Circularity is essential because it helps reduce resource extraction, waste, and pollution while keeping materials in use for as long as possible. By fostering more sustainable production and consumption patterns, circularity contributes to protecting ecosystems, supporting equitable economies, and achieving the Sustainable Development Goals.
Read the slides carefully and learn about the concept of climate mainstreaming within organisations and the key steps required for its successful implementation.
What do we know about measuring training effectiveness? Self-assessment is one of the most prevalent means of measuring the effectiveness of REI training. The second most frequently used method for assessing training effects was a moral reasoning test. While the developed tools were mostly used pre- and post-intervention (with or without control groups) and the results compared, other measures were added to evaluate the learning process or student progress. It also seemed that tests designed for ethics training (like DIT, DEST, TESS) cannot be universally applied to all REI training, due to the very different formats of training and/or the availability of the tests. There are also qualitative possibilities (like learning diaries, tasks submitted during other courses, etc.) to monitor learning progress and, through that, assess the effectiveness of training. See figure 1, which outlines the identified measures and their application scale and feasibility (more details in D4.1): Figure 1. Measurement tools identified in the literature review (numbers indicate Kirkpatrick’s levels, see below) (tool descriptions in D4.1). [[File:Screenshot 2025-11-17 205639.png|center|frameless|500x500px]] As can be seen in figure 1, self-reporting (blue bubbles) is the most feasible measure and can, in most cases, be implemented at large scale. It is no wonder that this is, based on the literature review, also the most used approach. The SPEEES and SOLKA tests also utilise self-reporting. Most tools measure the content or the learning process (green bubbles) – they give information about what was learned during the training. As indicated, the feasibility of these measures is not high: either a lot of work is needed to implement the tool, the tools are not openly available, or they may be field specific. Possibilities for measuring behaviour (yellow bubbles) are scarce. 
Comparing results collected with various tools is almost impossible because they measure different aspects of training with different analysis instruments. It is not possible to determine whether qualitative (indicated as ‘qual’) or quantitative (indicated as ‘stat’ in Fig 1) methods of analysis are more feasible. Feasibility depends on the combination of various aspects, such as access to the tool, the need for special equipment, and the competence required. 
The SOLO taxonomy (Biggs & Collis, 1982; Biggs & Tang, 2007) can be applied in REI teaching and learning to structure, design, and assess learning outcomes. *It categorizes levels of understanding from simple to complex. *The taxonomy helps educators identify whether learners are merely recognizing ethical issues or integrating and applying them in sophisticated ways. *Learning tasks and assignments should be designed to elicit responses that demonstrate understanding at the intended SOLO level. *Facilitators are responsible for aligning assignments with the learning outcomes they aim to assess. The taxonomy, therefore, serves as both a framework for assessment and a guide for instructional design in REI contexts.
Bloom's Taxonomy is a well-known educational framework that offers a methodical way to classify learning objectives according to cognitive difficulty (e.g., Adams, 2015). Benjamin Bloom created this hierarchical framework in the 1950s, and it is now a vital instrument in educational theory and practice. The taxonomy is divided into six levels: remembering, understanding, applying, analysing, evaluating, and creating, arranged from lower- to higher-order cognitive skills. Fundamentally, remembering entails recollecting words, information, and fundamental ideas. Understanding is more than just remembering concepts; it also involves grasping meanings. Applying necessitates using knowledge in novel contexts or problem-solving. Analysing means dissecting data into its constituent elements and identifying connections between them. Evaluating is the process of making decisions based on standards and criteria. Creating, finally, involves coming up with original concepts and/or interpretations. The goal of applying Bloom's Taxonomy to training aims and results is to enhance comprehension by considering the knowledge, skills, and competencies that the specific training programmes were created to impart. The Remembering, Understanding, Applying, Analysing, Evaluating, and Creating domains of Bloom's Taxonomy each reflect a different cognitive process and the depth and complexity of learning. [[File:BloomsTaxonomy.jpg|alt=|center|frame|Fig 21. Bloom’s Taxonomy (taken from the Centre for Teaching, Vanderbilt University)]] All taxonomic levels are relevant irrespective of the study or career level. However, the taxonomic levels may mean different things for different individuals. 
For example, application of knowledge may mean engaging with research designs, but senior researchers often use more complex designs than students still learning how to do research. Nevertheless, it is essential that learning extends beyond remembering and understanding, and that the complexity of activities at all levels gradually grows as the individual gains experience, knowledge, and confidence. *'''Remembering and understanding:''' focus on memorizing key ethics concepts and theories. For example, students should master basic principles and terminology related to ethics and integrity. *'''Applying and analysing:''' engage in practical applications and critical thinking. Apply ethics concepts to real-life scenarios, such as conducting experiments and analysing data. *'''Evaluating and creating:''' evaluate research findings and create new knowledge. Encourage learners to think critically and innovate in ethical dilemmas. 
Bloom's Taxonomy is a well-known educational framework that offers a methodical way to classify learning objectives according to cognitive difficulty (e.g., Adams, 2015). Benjamin Bloom created this hierarchical framework in the 1950s, and it is now a vital instrument in educational theory and practice. The taxonomy is divided into six levels: remembering, understanding, applying, analysing, evaluating, and creating, arranged from lower- to higher-order cognitive skills. Fundamentally, remembering entails recollecting words, information, and fundamental ideas. Understanding is more than just remembering concepts; it also involves grasping meanings. Applying necessitates using knowledge in novel contexts or problem-solving. Analysing means dissecting data into its constituent elements and identifying connections between them. Evaluating is the process of making decisions based on standards and criteria. Creating, finally, involves coming up with original concepts and/or interpretations. The goal of applying Bloom's Taxonomy to training aims and results is to enhance comprehension by considering the knowledge, skills, and competencies that the specific training programmes were created to impart. The Remembering, Understanding, Applying, Analysing, Evaluating, and Creating domains of Bloom's Taxonomy each reflect a different cognitive process and the depth and complexity of learning. [[File:BloomsTaxonomy.jpg|alt=|center|frame|Fig 21. Bloom’s Taxonomy (taken from the Centre for Teaching, Vanderbilt University)]] All taxonomic levels are relevant irrespective of the study or career level. However, the taxonomic levels may mean different things for different individuals. 
For example, application of knowledge may mean engaging with research designs, but senior researchers often use more complex designs than students still learning how to do research. Nevertheless, it is essential that learning extends beyond remembering and understanding, and that the complexity of activities at all levels gradually grows as the individual gains experience, knowledge, and confidence. ''Remembering and understanding:'' Here, the focus is on memorising key facts, concepts and theories relevant to the field of research and innovation. Understanding these foundational elements is critical to moving forward. For example, undergraduate students need to master the basic principles and terminology related to ethics and integrity to navigate more complex topics effectively later. Similarly, individuals pursuing a PhD or who are new to academia need a solid understanding of basic concepts before they can conduct more in-depth analyses and applications, such as mastering the ethics of their own PhD research. Moreover, senior researchers may need to understand the basic concepts of supervision and mentoring practices when supervising a team and PhD candidates. ''Applying and analysing:'' Learning should always be an active endeavour, irrespective of career or study stage. Here the emphasis shifts to practical application and critical thinking. Early career researchers, junior professors and academics need competencies for applying the ethics and integrity concepts they have learnt to real-life scenarios when conducting experiments, collecting data and critically analysing the results to gain meaningful insights. Through these activities, participants develop the skills necessary to contribute to the advancement of their field and address research questions with greater depth and sophistication. 
In terms of research ethics and integrity, this involves applying such knowledge and values to every step of the research. ''Evaluating and creating:'' The highest level in Bloom’s Taxonomy involves evaluating existing knowledge and creating new knowledge. All researchers play a critical role in shaping the direction of research and innovation. They are responsible for assessing the validity and significance of research findings and identifying areas for further investigation and innovation. By synthesising existing knowledge and developing new ideas, theories or methods, researchers move their field forward and inspire the next generation of researchers and innovators. All RE/RI training should include components that encourage learners to extend their thinking to evaluation and creation. In practice, this means having such a robust knowledge base and set of values that, even when encountering a new ethical dilemma or a novel, potentially integrity-threatening situation, learners can rely on having the ‘tools’ to handle it. 
The aim of educating secondary school students is to raise their awareness of the ethical and integrity challenges they may face. Resources for secondary school students include (please see the last section of the BEYOND trainer guide for an overview of materials divided by topics and target groups): *The [[Integrity Teacher Guide]] for secondary school students' education. *The [[Path2integrity Learning cards]] (Path2Integrity learning cards S) focusing on students in high schools, and a dedicated handbook (S-Series handbook). For measuring training effectiveness, facilitators can use the following tools for learning output collection and for analysing collected material: {| class="wikitable" |+ !'''Tool for collecting learning outputs''' !'''Details''' !'''Analysis instrument **''' |- |'''ProLearning app''' |''ProLearning'': https://www.epfl.ch/labs/chili/dualt/current-projects/realto/ <span lang="EN-GB"></span> |learning analytics |- |'''Engagement app''' |''ForgetNot'' (by EduLog): https://web.htk.tlu.ee/forgetnot |learning analytics |- |'''Self-Reflection Form/Compass''' |App under development, [https://docs.google.com/forms/d/17ORaVeaLjBYucufYNGF6TNjgtqqNdlk5BhSp5bfM5eA/copy form] * (for copying and editing) |SOLO taxonomy, reflection levels, content criteria |- |'''Pre-post texts''' |Collect a short text (e.g. a response to a case or short essay) before the training and after the training |SOLO taxonomy, reflection levels, content criteria |- |'''Learning diaries''' |Ask learners to keep a diary over a certain period; for each submission provide some guiding questions or topics |SOLO taxonomy, reflection levels, content criteria |- |'''Group reports''' |Ask groups working together to provide a (short) group report (or provide a template with points to work on) |SOLO taxonomy, content criteria |- |'''Group discussions''' |Monitor the group discussions to evaluate the level of understanding and content discussed (scaffold as appropriate) |SOLO taxonomy, content criteria |- |'''Group dynamics''' |''CoTrack'' application: https://www.cotrack.website/en/ |learning analytics |- |'''Retention check''' |After a certain time (a few weeks/months), ask learners to provide a short text (analysis of a case, short essay on an ethics topic/question). Compare the levels of understanding to another piece collected during or right after the training. |SOLO taxonomy, content criteria |} For instance, to measure participants’ reactions during or right after the training, the ProLearning app or the Self-Reflection Form can be used. In addition, if learners worked in groups and provided a group report, the learning process can be evaluated based on the SOLO taxonomy to measure the levels of understanding. Moreover, if possible, a couple of months after the training an additional case study could be given to the same learners, and the content of their analysis could again be evaluated with the SOLO taxonomy. This kind of effectiveness measure makes it possible to triangulate measurements at different time points. 
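For facilitators who code the collected texts numerically, the pre/post/retention comparison described above can be sketched in a few lines. This is purely illustrative and not part of the BEYOND toolkit: the numeric coding of SOLO levels, the learner identifiers, and the scores are hypothetical.

```python
# Illustrative sketch: comparing SOLO levels coded from texts collected
# before training ("pre"), right after ("post"), and at a retention check.
# Level coding (1 = prestructural ... 5 = extended abstract) and all data
# below are hypothetical examples, not real assessment results.

SOLO_LEVELS = {
    1: "prestructural",
    2: "unistructural",
    3: "multistructural",
    4: "relational",
    5: "extended abstract",
}

# learner -> SOLO level assigned by the facilitator to each collected text
scores = {
    "P1": {"pre": 2, "post": 4, "retention": 4},
    "P2": {"pre": 1, "post": 3, "retention": 2},
}

def progress(record):
    """Return (gain from pre to post, change from post to retention)."""
    return record["post"] - record["pre"], record["retention"] - record["post"]

for learner, record in scores.items():
    gain, retained = progress(record)
    print(f"{learner}: gain {gain:+d}, retention change {retained:+d}")
```

Comparing the same learner's coded level at the three time points is what allows the triangulation mentioned above: a positive gain with no drop at retention suggests the learning persisted, while a drop flags content worth revisiting.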
The aim of educating secondary school students is to raise their awareness of the ethical and integrity challenges they may face. Resources for secondary school students include (please see the last section of the BEYOND trainer guide for an overview of materials divided by topics and target groups): *The [[Integrity Teacher Guide]] for secondary school students' education. *The [[Path2integrity Learning cards]] (Path2Integrity learning cards S) focusing on students in high schools, and a dedicated handbook (S-Series handbook). To measure training effectiveness, facilitators can use the following tools for collecting learning outputs and for analysing the collected material: {| class="wikitable" |+ !'''Tool for collecting learning outputs''' !'''Details''' !'''Analysis instrument **''' |- |'''ProLearning app''' |''ProLearning'': https://www.epfl.ch/labs/chili/dualt/current-projects/realto/ |learning analytics |- |'''Engagement app''' |''ForgetNot'' (by EduLog): https://web.htk.tlu.ee/forgetnot |learning analytics |- |'''Self-Reflection Form/Compass''' |App under development, [https://docs.google.com/forms/d/17ORaVeaLjBYucufYNGF6TNjgtqqNdlk5BhSp5bfM5eA/copy form] * (for copying and editing) |SOLO taxonomy, reflection levels, content criteria |- |'''Pre-post texts''' |Collect a short text (e.g. a response to a case or a short essay) before and after the training |SOLO taxonomy, reflection levels, content criteria |- |'''Learning diaries''' |Ask learners to keep a diary over a certain period; for each submission, provide some guiding questions or topics |SOLO taxonomy, reflection levels, content criteria |- |'''Group reports''' |Ask groups working together to provide a (short) group report (or provide a template with points to work on) |SOLO taxonomy, content criteria |- |'''Group discussions''' |Monitor the group discussions to evaluate the level of understanding and the content discussed (scaffold as appropriate) |SOLO taxonomy, content criteria |- |'''Group dynamics''' |''CoTrack'' application: https://www.cotrack.website/en/ |learning analytics |- |'''Retention check''' |After a certain time (a few weeks or months), ask learners to provide a short text (an analysis of a case, or a short essay on an ethics topic/question). Compare the levels of understanding to another piece collected during or right after the training. |SOLO taxonomy, content criteria |} For instance, to measure participants’ reactions during or right after the training, the ProLearning app or the Self-Reflection Form can be used. In addition, if learners worked in groups and provided a group report, the learning process can be evaluated based on the SOLO taxonomy to measure the levels of understanding. Moreover, where possible, an additional case study could be given to the same learners a couple of months after the training, and the content of their analysis could again be evaluated with the SOLO taxonomy. Measuring effectiveness in this way makes it possible to triangulate the measurement across different time points.
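The triangulation across time points described above can be illustrated with a small script. This is a hedged sketch with hypothetical learner data; the numeric (ordinal) coding of SOLO levels and the `triangulate` helper are our assumptions for illustration, not part of any of the tools named above.

```python
# Illustrative sketch: comparing SOLO taxonomy levels of texts collected
# before training, right after it, and at a retention check months later.
# The ordinal coding of SOLO levels below is an assumption for illustration.
SOLO_LEVELS = {
    "prestructural": 1,
    "unistructural": 2,
    "multistructural": 3,
    "relational": 4,
    "extended abstract": 5,
}

def triangulate(pre, post, retention):
    """Return per-learner SOLO score changes across the three time points."""
    changes = {}
    for learner in pre:
        p = SOLO_LEVELS[pre[learner]]
        q = SOLO_LEVELS[post[learner]]
        r = SOLO_LEVELS[retention[learner]]
        # "gain" = immediate change; "retained" = change still visible months later
        changes[learner] = {"gain": q - p, "retained": r - p}
    return changes

# Hypothetical SOLO ratings of three learners' texts
pre = {"P1": "unistructural", "P2": "multistructural", "P3": "unistructural"}
post = {"P1": "relational", "P2": "relational", "P3": "multistructural"}
retention = {"P1": "multistructural", "P2": "relational", "P3": "multistructural"}

result = triangulate(pre, post, retention)
```

Comparing `gain` against `retained` for each learner shows whether the improvement observed right after the training persisted at the retention check.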
There is evidence that such a framework can be used to analyse reflective journals/learning logs (see Bell et al., 2011). We have also tested the feasibility of this framework in the context of REI. Figure 1 illustrates how reflection levels are displayed over a 6-week diary-keeping period related to REI learning. As indicated, some participants (P1–P5) show varying levels, while others remain at a constant level. The exploration suggests that it is possible to analyse reflective journals/writing in the REI context by applying the framework of levels of reflective thinking. Figure 1. Example of analysis results (reflection levels) of learning diaries by 5 training participants (P1–P5). [[File:Img7.png|center|frameless|500x500px]]
Ethical principles (Kitchener, 1985) can be used to evaluate the characteristics of the ethical dilemma: [[File:Img9.png|center|frameless|500x500px]] Figure 1. Ethical principles (Kitchener, 1985). In the case introduced in the section “What is this about”, learners may give the following response: Ethical issues that could emerge in this case: underage children, parental consent, the procedure for obtaining consent, considering the wishes of the child, organising data collection, cooperation with the (pre)school, and whether it is more harmful for the child to be recorded (for research purposes) or to be upset by the procedure of fulfilling the parents’ demands. Ethical principles that can be at stake in this case: * respect for autonomy - in this case there is a conflict between the autonomy of the child and the autonomy of the parent, and for the researchers there is no simple answer. The informing procedure should be altered to prevent such situations in the future. * doing no harm (non-maleficence) - in this case the researchers’ only way of not harming the children whose parents had not given their consent would have been not to proceed with data collection, but in that case they would be harming the research/society/greater good. * benefiting others (beneficence) - the research is probably important to society, but it is not right to harm anyone in order to benefit others. * being just (justice) - the researchers must be fair towards the children, their parents, the school, and society; if their rights come into conflict, new means of data collection must be found. * being faithful (fidelity) - researchers must respect the wishes of the parents, but also of the children.
to set up and collect student responses (anonymously). The tools analyse the collected data instantly and provide teachers with an overview of the impact of the training. In addition to collecting learner reactions, the topics indicated by the app bring those aspects of learning into focus and learners start paying greater attention to them – this may have a more long-term impact on their behaviour. ProLearning collects learner responses to teacher-generated questions (yes/no or a 0–100 scale), asks teachers to predict the learner responses, and asks them to write a short description of the learning situation. Only then can the teacher see the graph outlining learner responses (based on groups) in relation to their own predictions. If the teacher’s prediction of the performance of a group of learners is taken as the expected learning outcome, then this tool can provide a measure of the alignment or discrepancy between the expected effectiveness and the learners’ actual performance. In addition, engagement in activities may affect how well people acquire competencies. For measuring engagement, the application ForgetNot (by EduLog) is available. In the application, learners can provide feedback on three aspects of the training: how engaged they were in the activities (the behavioural aspect of learning), how they felt (the emotional aspect), and how relevant the knowledge was for them (the cognitive aspect). All these aspects are relevant to successful learning. The responses accumulate per group and inform the facilitator how the training format was perceived by learners. Both applications are suitable for any educational context, including HE. The data collected by these tools provide an overview of group progress and are more suitable for evaluating short-term effects of training at Kirkpatrick level 1 (learner reactions).
It is more suitable to use these tools to evaluate short-term trainings or specific activities during the training sessions. Both tools are freely available online; the teacher needs to create an account to set up inquiry sessions and see the results. Students do not need to create an account; they can join with the session code provided by the teacher. MMLA tools use statistics to analyse the collected information and create graphs to illustrate the results. Teachers need to be able to read the graphs and draw conclusions from the data. Modifications can then be made to the training format and implementation based on the results. MMLA tools may be most suitable for beginners and for short trainings where collecting reactions quickly is relevant, but they can also be used with more advanced learners. The tools are available at: ProLearning: [http://www.prolearning.realto.ch www.prolearning.realto.ch] ForgetNot (by EduLog): https://web.htk.tlu.ee/forgetnot
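The kind of per-group aggregation such learning-analytics tools perform can be sketched as follows. This is an illustrative sketch only, not the actual ForgetNot or ProLearning implementation; the sample responses and the 0–100 scale are assumptions, while the three aspects mirror those named above.

```python
# Sketch: averaging learner feedback on the three aspects of engagement
# (behavioural, emotional, cognitive), as an MMLA-style tool might
# summarise them for a group. The data and 0-100 scale are assumptions.
from statistics import mean

responses = [
    # (behavioural, emotional, cognitive), each on a 0-100 scale
    (80, 70, 90),
    (60, 75, 85),
    (70, 65, 80),
]

aspects = ("behavioural", "emotional", "cognitive")
# One average per aspect across all learners in the group
summary = {name: mean(r[i] for r in responses)
           for i, name in enumerate(aspects)}
print(summary)
```

A facilitator reading such a summary would compare the three averages to see which aspect of the training (engagement, emotional experience, or perceived relevance) needs adjusting.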
While pre- and post-tests are very common as a training effectiveness measure, we are proposing a pre- and post-text measure. Of course, tests are easy to implement and analyse (if statistics are used), but an improvement in average scores may not give the full picture of the learning process. In addition, post-post texts can be used as a measure implemented several months after the training to assess the retention of the competencies – this may also provide insights into potential changes in the learner’s behaviour or practices (Kirkpatrick’s levels 2 and 3). The learner provides a text (either an essay or a short reflection on a case) prior to the training, then participates in the training activities, and afterwards submits another text (again, a short essay or a discussion of a case). Optionally, a further text can be produced several months after the end of the training. If the same analysis tool is used, the long-term impact can be measured. This measure is suitable for the HE context and for all disciplines, and it is simple to implement. Common analysis tools make the work simpler and the progress levels comparable. The texts can be evaluated based on the SOLO taxonomy and the reflection levels. Content criteria, such as ethical principles, ethical analysis, and ethical approaches, can also be sought in the texts. It may be challenging to use this measure with large groups, as reading and analysis take time, and it may be difficult to reach the learners months after the end of the training. Ethics sections in doctoral dissertations can also be analysed as ‘pre- and post-texts’ if the final product can be compared to earlier drafts. The tool is suitable for use in training for all target groups in the HE context.
While national REI surveys (barometers) may not address training directly, they can be used as macro-level, long-term reflections of the state of REI in a given context, and as such they may also reflect whether training efforts, in a broad sense, have been effective (Kirkpatrick’s level 4). As the evaluation of training effects cannot be tied to specific trainings at this macro level, such surveys may instead indicate the extent of challenges that could have been, or can be, alleviated through training, and point to training needs. For instance, surveys usually collect information about participation in trainings and ask how confident researchers feel in dealing with ethical issues during their research (statistical data analysis). In addition, national as well as institutional surveys may provide an opportunity to collect cases of questionable practices, and future trainings could address those topics. For collecting cases that researchers consider confusing or problematic, open-answer questions could be added to the surveys. In addition, the health of the entire research community can be evaluated by monitoring the leadership aspect in the surveys. For analysing this aspect, a REI Leadership framework (Tammeleht et al., 2022, submitted) can be used. The meta-analysis provides information about the wider impact of research practices in research institutions, but also helps institutional leaders support everyone in their organisation in adopting ethical research practices. This tool is suitable for use in training with ECRs and active researchers.
<span lang="EN-US">In this activity, you will first watch the video “5 Ethical Principles”, which introduces core principles of environmental ethics relevant for research and innovation. Afterwards, you can note down which principles are most relevant in your research.</span>
How can researchers reflect on their values, imagine alternative futures, and build solidarity in the face of shared struggles? Listen to Josie Chambers, Rianne Janssen, and Lucy Sabin as they explore transformative research.
'''Greening labs''' involves reducing environmental impact by implementing sustainable practices within laboratory settings. In this regard, several small actions that are '''ecofriendly''' can be considered in lab activities to contribute to environmental sustainability. '''Watch the video on “Green Labs from the Faculty of Science and Engineering of the University of Groningen” and pay attention to the everyday small actions that can be adopted to improve lab efficiency and make lab research more environmentally friendly.'''  +