Search by property
From The Embassy of Good Science
This page provides a simple browsing interface for finding entities described by a property and a named value. Other available search interfaces include the page property search, and the ask query builder.
List of results
- Exploring Training Materials on Open Science + (Utilizing online training resources is another way to put blended learning into practice. This method combines traditional in-person instruction with online learning, letting students shape their own learning experiences. By combining in-person and online training methods, trainees benefit from the guidance of and engagement with a trainer while also having access to flexible and interactive training options outside the classroom. The 'conventional' learning resources produced by the ROSiE project (refer to the [https://rosie-project.eu/rosie-knowledge-hub/ ROSiE Knowledge Hub]) can be used in conjunction with the online training resources for self-directed learning developed by ROSiE to facilitate blended learning in the classroom. </br></br>Training materials on responsible open science are available for the fields of [https://zenodo.org/records/10799656 Social Sciences], [https://zenodo.org/records/10799691 Natural Sciences], [https://zenodo.org/records/10800651 Humanities], [https://zenodo.org/records/10801617 Health and Life Sciences] and [https://zenodo.org/records/10801722 Citizen Science]. In addition, the ROSiE project created a [https://zenodo.org/records/10795319 collection of case studies] that can be used in the traditional and/or online ROSiE training. Moreover, six different modules on responsible open science can be found [[Guide:E525ee0d-0d7e-4ba5-b19b-89e4a5029b2f|here]], on the Embassy of Good Science platform.)
- Sertifika + (To receive the VIRT2UE certificate, you need to train 10 researchers in the VIRT2UE approach. The VIRT2UE programme consists of three online modules and five participatory exercises. It is sufficient for us if you train the 10 researchers in 3 of the 5 exercises – this will provide the foundation needed to learn the virtue-based approach to research integrity. </br></br>The training of the 10 researchers can take place as part of the 'practice' phase of the participatory exercises during the programme, or after the programme has ended. </br></br>You may also give the researchers who attend your training the detailed information they need to become trainers themselves. This is an optional step and is not a condition for obtaining the VIRT2UE certificate. If you do wish to train researchers to become trainers, make sure that they have read the VIRT2UE guide on the Embassy of Good Science website and that they attend an extra session to share their experiences of facilitating the exercises. If face-to-face meetings are not possible (for example, during a pandemic), this exchange of experiences can be replaced by a written exercise, peer coaching and/or video conferencing.)
- Sertifika + (To obtain the VIRT2UE certificate, you need to compile a learning portfolio. The files that must be included in your portfolio are listed below: </br></br>*Table of completed tasks: https://community.embassy.science/uploads/short-url/kDU9xWHeyxdC4ER1toih5KJGE8H.docx</br>*Reflection forms detailing your experiences of facilitating the exercises: [https://www.dropbox.com/s/1fmppqv189jxlqj/Self%20reflection%20form.pdf?dl=0 Reflection form]</br>*Documents showing that you have trained 10 researchers (screenshots from Zoom meetings / attendance sheet / statement of participation)</br></br>In addition, please state the address to which you would like your certificate to be sent (this can be your home or work address)!)
- Erdemler ve Normlar + (Once the case has been clearly understood, ask the participants to put themselves in the position of the person presenting the case and to consider which virtue(s) would play a role in this dilemma if they were in the presenter's place (identifying two virtues is sufficient, although more are possible). To do this, you can ask the participants the following question:</br></br>“If you were in the situation of the person presenting the case and had to decide what to do, which virtue would be important to you when making this decision?”</br></br>Please note that the virtues mentioned do not need to be linked to one of the options in the dilemma. At this stage, participants are expected to reflect on which moral quality (virtue) they would need to enact in order to act with integrity in the given situation. Building on this, ask the participants to reflect on what rule of action (norm) or behaviour would follow from the virtue they have chosen. To do so, they can ask themselves the following questions:</br></br>“What would I need to do in this situation in order to act in accordance with this virtue?”</br></br>“What rule of action would I need to follow in this situation in order to enact this virtue?”</br></br>Please remember that different norms can be associated with the same virtue, and different virtues with the same norm.)
- Erdemler ve Normlar + (Ask the participant who owns the case to present their (pre-selected) case and explain why it is morally problematic. Together with the group, help the presenting participant formulate the dilemma (the options within the dilemma) (i.e. should I do A or B?), but remember that the case owner should be the one who decides on the correct formulation of the dilemma. At this stage, try to focus on two alternative courses of action and avoid looking for a third option or creative solutions. This will help keep the dialogue focused and prevent people from trying to find a way out of the dilemma or a quick solution. Note the keywords describing the dilemma and the case on the flip chart.)
- Measurement tools for collecting learning outcomes: long-term effect + (Vignettes can be integrated in various contexts, such as team training sessions, institutional or national surveys, and so on. While the use of vignettes is quite common in training contexts, collecting comments on vignettes is not very common in non-training contexts (e.g. as part of national REI surveys, team meetings, conferences). Still, the comments collected on vignettes may prove to be a great source of information about the respondents’ attitudes, beliefs, knowledge and ethical sensitivity (Kirkpatrick’s level 3).</br></br>Vignettes describe a situation with one or several ethical aspects, which may or may not have a straightforward solution. There are several measures to gauge ethical sensitivity with vignettes – for example, a Likert scale can be used to indicate how ethical the situation seems to the respondent. An open-answer option can be added, and research indicates (Parder et al., 2024; Tammeleht et al., forthcoming) that open responses reveal more about ethical sensitivity than quantitative data. </br></br>Implementing vignettes in various surveys or team meetings/conferences requires some preparation from the facilitators, but collecting responses and comments is quite simple. Analysing results may take some time, especially if open answers are scrutinised. We recommend using the EASM (Ethical Awareness and Sensitivity Meter) to measure the level of sensitivity in the open answers. Content criteria (ethical principles, ethical analysis, ethical approaches) can also be applied, or the topics present in the vignette can be identified (similar to the domain-specific measure).</br></br>For example, the Estonian national REI survey included four vignettes. The survey asked respondents to indicate the ethicality of the situation on a Likert scale (1-6). The results of the statistical analysis (figure 1) show that the ethicality of the vignettes was evaluated at different levels; some topics were considered more unethical than others. </br>[[File:Img25.png|left|frameless|263x263px]]</br></br>Figure 1. Unethical behaviour identified on a Likert scale (all unethical indications) (from Parder et al., 2024).</br></br>Respondents then had the chance to add a comment – this was optional, but about half of the respondents used the opportunity (which may itself indicate some ethical sensitivity). The open comments were analysed with the EASM, and the picture looked somewhat different (see figure 2). Based on the Likert scale results, vignette 4 was not considered very unethical (or not connected to ethics). The open answers revealed that 70% of respondents actually considered the situation to be ethical in nature and showed understanding of the topic. It also became clear that 30% of respondents had completely missed the topic, meaning they had not understood the situation from an ethical perspective.</br>[[File:Img26.png|center|frameless|500x500px]]</br></br>Figure 2. Analysis of open comments to the vignettes (from Parder et al., 2024).</br></br>Overall, it can be concluded that identified misconceptions and the failure to notice ethical issues (both at the prestructural level of the SOLO taxonomy) may indicate that training is needed to clarify the topics.</br></br>This tool is suitable for use in training for ECRs, supervisors/mentors and expert researchers.)
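As an illustration of the kind of tallying described in the entry above, here is a minimal sketch in Python using entirely hypothetical responses to one vignette; the EASM-style category labels, the "4 or higher" cut-off, and the data are assumptions for demonstration only, not part of the EASM or the Estonian survey.

```python
from collections import Counter

# Hypothetical responses to one vignette: a Likert rating (1-6, higher = more unethical)
# plus an optional open comment already coded by a human rater into an EASM-style category.
responses = [
    {"likert": 5, "easm": "ethical issue recognised"},
    {"likert": 2, "easm": "topic missed"},
    {"likert": 6, "easm": "ethical issue recognised"},
    {"likert": 4, "easm": None},                      # respondent left no comment
    {"likert": 3, "easm": "topic missed"},
]

# Quantitative side: average rating and share of "unethical" ratings (assumed cut-off of 4+).
ratings = [r["likert"] for r in responses]
mean_rating = sum(ratings) / len(ratings)
share_unethical = sum(r >= 4 for r in ratings) / len(ratings)

# Qualitative side: how many respondents commented, and how the comments were coded.
comments = [r["easm"] for r in responses if r["easm"] is not None]
code_counts = Counter(comments)

print(f"Mean Likert rating: {mean_rating:.2f}")
print(f"Share rating the situation 4 or higher: {share_unethical:.0%}")
print(f"Share leaving an open comment: {len(comments) / len(responses):.0%}")
for code, n in code_counts.items():
    print(f"  {code}: {n / len(comments):.0%} of comments")
```

Comparing the two summaries side by side mirrors the contrast described above between the Likert results (figure 1) and the coded open comments (figure 2).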
- THE PREPARED CODE: A Global Code of Conduct for Research during Pandemics + (Vulnerabilities increase during pandemics. Where possible, research approaches should be adapted to ensure the ethical inclusion of persons in vulnerable situations – with adequate protections – rather than adopting patronizing or convenience exclusions.)
- Add and edit Theme pages + (Want to contribute to the Embassy of Good Science? It's simple! All you need is an ORCiD login and you can get started right away. </br></br>See the video below for detailed instructions on how to use your ORCiD to log into the Embassy.</br></br><div class="video-button" data-href="https://www.youtube.com/embed/8s2hroYxT3I"></br><span class="video-button-label">ORCiD Login</span></br><span class="video-button-duration">0:47 min</span></br></div>)
- Ethical and Societal Foundations of Open Science + (Watch the interactive video below and complete the exercises!)
- Protection of Research Participants + (Watch the interactive video below and complete the exercises!)
- Rights of Citizen Scientists + (Watch the interactive video below and complete the exercises!)
- Conflicts of interest in citizen science + (Watch the interactive video below and complete the exercises!)
- Debate and Dialogue + (Watch the video to get an impression of the VIRT2UE 'Debate & Dialogue Exercise'. </br>[[File:D&D2.png |link=https://www.youtube.com/watch?v=249umsbOIG0&list=PLabbUwyulAry4tzZ12eHl5JOJhJGiaE6k&index=4]]</br></br>Debate and dialogue are two different communicative modes. The following video explores the differences between the two and helps the viewer develop a better understanding of their dynamics. These differences are also described in more detail in the theme page [https://embassy.science/wiki/Theme:6217d06b-c907-4b09-af4e-b4c8a17b9847 'dialogue versus debate'].</br></br>[[File:ByVirtueof.png |link=https://www.youtube.com/watch?v=M-nI32JBOyo]])
- Modified Dilemma Game + (Watch the video to get an impression of the 'Modified Dilemma Game' of the VIRT2UE Train-the-Trainer program.</br>[[File: DGE.png|link=https://www.youtube.com/watch?v=Qpq-oWPdvJQ&list=PLabbUwyulAry4tzZ12eHl5JOJhJGiaE6k&index=6]]</br></br>If you are playing the app version of the game, your trainer will ask you to download the Dilemma Game [https://www.eur.nl/en/about-eur/policy-and-regulations/integrity/research-integrity/dilemma-game app] before the session and watch this [https://www.youtube.com/watch?v=SKhT7qHh9T8&t=8s video] for an introduction to using the app.)
- Micromodule care ethics and environmental ethics + (Watch this short video introducing ethics of care (or care ethics). '''Video''')
- Translate the VIRT2UE guide into your own language + (We have made a text version of the training guide with all instructions and both the trainer and trainee perspectives included. This can be used to easily translate the guide.</br></br>[https://public.3.basecamp.com/p/Kfoj9Eo6iDvgjaDj6qBjzdGh Here] you can find the word count for each part of the training guide. This can be used to obtain a quote for the translation from a translator.)
- Technology and sustainability + (We rarely think about the environmental cost of streaming a movie, joining a video call, or downloading a podcast, but the digital world runs on data centers that use huge amounts of energy, water, and land.)
- Introduction to the evaluation of the effectiveness of Research Ethics and Integrity (REI) training + (We realise that this is not a conclusive list, just a toolbox collected for trainers. Still, we have tried to collect measurement tools that evaluate effectiveness on different levels – from self-reactions, to learning content and process, to common practices/behaviour, and to the wider impact on the research community. All the included tools and analysis instruments have been tested in the REI training context.</br></br></br>At best, tools for measuring and assessing REI learning also serve a pedagogical function, that is, their use is part of the teaching activity and the learning process. The choice of the effectiveness measure depends on the training format, learning objectives, pedagogical approach, learning activities, time available for its use, facilitators’ competencies and other factors. Depending on these criteria, a measurement tool can be chosen or different tools can be combined. It should also be remembered that no one size fits all – facilitators should familiarise themselves with different tools and analysis instruments and combine the ones they consider feasible. In addition, large-scale tools can also be used with small groups, but the reverse may not be possible.</br></br>Figure 2 outlines a map of tools based on feasibility and scale of use. Table 1 (see step 5) provides general information about the data collection and analysis tools. </br>[[File:Fg2.png|center|frameless|500x500px]]</br>Figure 2. Map of tools to measure REI training effectiveness.</br></br>To measure short-term training effects (that is, what is happening and what the learners’ reactions are during and right after the learning process) you can use: MMLA tools collecting learner reactions (e.g. ProLearning, ForgetNot), and the Self-Reflection Form/Compass.</br></br>For mid-term training effects (information about the content and learning process) one can use: eye-tracking, pre- and post-texts, the domain-specific, domain-transcending measure, learning diaries/journals, group reports/portfolios, group discussions, monitoring the online learning environment, and group dynamics with CoTrack.</br></br>Long-term effects (effects after the intervention, behaviour and practices) can be measured with: national REI barometers/surveys (through which the national guidelines can subsequently be improved), retention checks of learners’ activities after the training (e.g. monitoring the ethics sections of articles or asking learners to do another task several months after the training to measure retention), and vignettes implemented in surveys/non-training contexts to measure learners’ ethical sensitivity.</br></br>We also outline recommendations for the implementation of the tools and analysis instruments:</br></br></br>*Effectiveness of the training starts from careful planning – the alignment of learning outcomes, training content and evaluation is crucial. Analysis instruments could already be considered when outlining the learning outcomes and content.</br></br>*The measurement tools can be used as pedagogical instruments – e.g. learning diaries can be used as a tool to support the development of learners’ reflection skills as well as to measure whether those skills advance.</br>*Using similar analysis instruments provides an opportunity to compare the results of different training formats (e.g. 
the SOLO taxonomy).</br>*Combining various measurement tools (triangulation) provides a holistic picture of the entire learning process as well as the outcomes (i.e. the effect). Using different measurement tools at various measurement points provides an alternative angle on triangulation.</br>*Measurement at level 4 should be implemented at a national (perhaps also institutional) level.</br>*An implementation example: to measure participants’ reactions during or right after the training, the Self-Reflection Form can be used. In addition, if learners worked in groups, their group discussions can be monitored, and if they provided a group report or pre- and post-texts, the learning process can be evaluated based on the SOLO taxonomy to measure the levels of understanding. Moreover, if possible, a couple of months after the training an additional case study could be given to the same learners, and the content of their analysis could again be evaluated with the SOLO taxonomy. This kind of effectiveness measure makes it possible to triangulate the measurement at different time points. Analysing vignettes and participating in national REI surveys would provide insights into the wider research community.)
- Introduction to environmental justice + (We will begin by watching a short video on '''environmental justice'''. After watching the video, you will be asked to complete a brief questionnaire based on the content you’ve just seen.)
- From waste to wisdom: rethinking plastic waste management in the lab + (Well done! You have now learnt about the various types (recyclable vs. replaceable) of plastics used in a lab setting and the steps towards designing a plastic recycling pipeline for better management of plastic waste in a lab. It is obvious that the recycling pipeline suggested by Green Labs Austria has to be tailored to the specific conditions of each lab. </br></br></br>In sum, here are some guidelines that can be adopted for the successful development of a recycling pipeline: '''''(i) communication is key for enabling easy and sensible sorting of plastic waste; (ii) the recycling pipeline should be tested initially with a smaller group before being rolled out to a much larger group; (iii) strive for adaptability by substituting non-recyclable materials with recyclable alternatives.''''' </br></br></br>Moving forward, please use the questions below as a guide to reflect on your next steps.)
- 01 - Value Analysis: A Method for Analysing Cases in Research Ethics and Research Integrity + (What alternatives exist?)
- 01 - Value Analysis: A Method for Analysing Cases in Research Ethics and Research Integrity + (What could be considered as foreseeable consequences?)
- Introduction to the evaluation of the effectiveness of Research Ethics and Integrity (REI) training + (What do we know about measuring training effectiveness?</br></br>Self-assessment is one of the most prevalent means of measuring the effectiveness of REI training. The second most frequently used method for assessing training effect is a moral reasoning test. While the developed tools were mostly used pre- and post-intervention (with or without control groups) and the results compared, other measures were added to evaluate the learning process or student progress.</br></br></br>It also seems that tests designed for ethics training (like DIT, DEST, TESS) cannot be universally applied to all REI training due to the very different training formats and/or the availability of the tests. There are also qualitative options (like learning diaries, tasks submitted during other courses, etc.) for monitoring learning progress and, through that, assessing the effectiveness of training.</br></br>See figure 1 outlining the identified measures and their application scale and feasibility (more details in D4.1):</br></br>Figure 1. Measurement tools identified in the literature review (numbers indicate Kirkpatrick’s levels, see below) (tool descriptions in D4.1).</br>[[File:Screenshot 2025-11-17 205639.png|center|frameless|500x500px]]</br></br></br>As can be seen in figure 1, self-reporting (blue bubbles) is the most feasible measure and can mostly be implemented at a large scale. It is no wonder that, based on the literature review, this is also the most used approach. The SPEEES and SOLKA tests also utilise self-reporting.</br></br>Most tools measure the content or the learning process (green bubbles) – they give information about what was learned during the training. As indicated, the feasibility of these measures is not high – either a lot of work needs to be put into implementing the tool, they are not openly available, or they may be field-specific. Possibilities for measuring behaviour (yellow bubbles) are scarce.</br></br>Comparing results collected with various tools is almost impossible because they measure different aspects of training with different analysis instruments. It is not possible to determine whether qualitative (indicated as ’qual’) or quantitative (indicated as ‘stat’ in Fig 1) methods of analysis are more feasible. Feasibility depends on a combination of various aspects, such as access to the tool, the need for special equipment, and the competence required.)
- 01 - Value Analysis: A Method for Analysing Cases in Research Ethics and Research Integrity + (What do you think is the best thing for X to do?)
- Nature Relations in Research and Innovation + (What if innovation projects were designed with nature in mind? Multispecies Thinking broadens our perspective, recognizing the interconnectedness of all life forms and the need to include non-human beings in our ethical and research considerations. Designer Liina Lember explores how light pollution might be tackled from the perspective of another species. View the slideshow on “In-Visible Moth Spells” and notice how it makes you feel.)
- 01 - Value Analysis: A Method for Analysing Cases in Research Ethics and Research Integrity + (What is the dilemma?)
- AI In Healthcare: Technology Basics + (What is the primary function of generative AI?)
- 01 - Value Analysis: A Method for Analysing Cases in Research Ethics and Research Integrity + (What might be the consequences of the various alternatives?)
- 01 - Value Analysis: A Method for Analysing Cases in Research Ethics and Research Integrity + (What might be the short- as well as the long-range consequences?)
- THE PREPARED CODE: A Global Code of Conduct for Research during Pandemics + (When research is prioritized during a pandemic, research participants in ongoing studies must not be left worse off than before they joined their original study.)
- Specific Research Ethics and Integrity Considerations for Crisis Research + (When researchers from privileged circumstances conduct ethically questionable studies in lower-income settings, it's known as 'ethics dumping.' Ethics dumping can create significant challenges and is a growing concern. It is therefore essential that researchers work as closely as possible with local collaborators and reflect together on ways to prevent it. </br></br>This video explores six different ways ethics dumping can occur, from intentional disregard to unintentional cultural misunderstandings.)
- Selecting Appropriate Material and Effectiveness Measurement Tools for your Target Audience + (When training professors, trainers, mentors, and supervisors in research integrity, the aim is to equip them with the knowledge, skills, and resources to effectively train, mentor and guide their students and junior colleagues in conducting research with integrity. </br></br>Resources for professors and senior academics include (please see the last section of the BEYOND trainer guide for an overview of materials divided by topics and target groups): </br></br></br></br>*The [[Guide:Bbe860a3-56a9-45f7-b787-031689729e52|VIRT2UE]] train-the-trainer course on research integrity and the training module introducing [[Instruction:D3ee617b-5d9b-4c47-a015-030b0354c9d2|supervision and mentorship practices]].</br>*The training materials to foster mentorship and supervision developed by [https://h2020integrity.eu/toolkit/tools-researchers-supervisors/ INTEGRITY].</br>*The training for supervisors and leaders ([https://www.researchethicstraining.net/leadershiplevel leadership level]) developed by the RID-SSISS project.</br>*The [https://www.academicintegrity.eu/wp/bridge-module-for-supervisors/ module for supervisors] developed by the [https://www.academicintegrity.eu/wp/bridge/ BRIDGE project].</br></br>Trainers can select one or more of the following tools for evaluating training effectiveness for professors and senior academics:</br></br></br>{| class="wikitable"</br>|+Table 7: BEYOND Tools for evaluating training effectiveness for academics and experts</br>!'''Tool for collecting learning outputs'''</br>!'''Details'''</br>!'''Analysis instrument **'''</br>|-</br>|'''Self-Reflection Form/Compass'''</br>|App under development, [https://forms.office.com/Pages/ShareFormPage.aspx?id=WXWumNwQiEKOLkWT5i_j7twYn7PlpvpDlgGDpz2LgIdUMk5XRTVYQTVKRFRDWDlHOUdGU1FHTUlFVi4u&sharetoken=03epmvYBRpmfXvpRg9os form] * (for copying and editing)</br>|SOLO taxonomy, reflection levels, content criteria</br>|-</br>|'''Pre-post texts'''</br>|Collect a short text (e.g. a response to a case or a short essay) before the training and after the training</br>|SOLO taxonomy, reflection levels, content criteria</br>|-</br>|'''Learning diaries'''</br>|Ask learners to keep a diary over a certain period; for each submission provide some guiding questions or topics</br>|SOLO taxonomy, reflection levels, content criteria</br>|-</br>|'''Group reports'''</br>|Ask groups working together to provide a (short) group report (or provide a template with points to work on)</br>|SOLO taxonomy, content criteria</br>|-</br>|'''Group discussions'''</br>|Monitor the group discussions to evaluate the level of understanding and content discussed (scaffold as appropriate)</br>|SOLO taxonomy, content criteria</br>|-</br>|'''Retention check'''</br>|After a certain time (a few weeks/months), ask learners to provide a short text (analysis of a case, short essay on an ethics topic/question). 
Compare the levels of understanding to another piece collected during or right after the training.</br>|SOLO taxonomy, content criteria</br>|-</br>|'''Vignettes'''</br>|Can be used for measuring ethical sensitivity in (non-)training contexts.</br>|statistics, EASM (based on the SOLO taxonomy), content criteria</br>|-</br>|'''National surveys'''</br>|Can be used for analysing training-related content in reports and monitoring the display of REI leadership.</br>|statistics, REI leadership framework</br>|}</br>For instance, to measure participants’ reactions during or right after the training, the Self-Reflection Form can be used. In addition, if learners worked in groups, their group discussions can be monitored, and if they provided a group report or pre- and post-texts, the learning process can be evaluated based on the SOLO taxonomy to measure the levels of understanding. Moreover, if possible, a couple of months after the training an additional case study could be given to the same learners, and the content of their analysis could again be evaluated with the SOLO taxonomy. Analysing vignettes and participating in national REI surveys would be relevant for this target group. This kind of effectiveness measure makes it possible to triangulate the measurement at different time points.</br></br>An example for implementation can be found here: [https://helsinkifi.sharepoint.com/:p:/r/sites/BEYONDHelsinkiteam/Shared%20Documents/ENERI%20CR%20material%20example.pptx?d=w41ec1fe12bbb495781dbe81d55a09d71&csf=1&web=1&e=yxKgSC ENERI CR material example.pptx] </br></br>* The Self-Reflection Form link enables the facilitator to make a copy of the form, which they can then edit, and the data will accumulate on the facilitator’s cloud service (Google or Microsoft).</br></br>** Analysis instruments are described in WP4.2, later available at the Embassy’s website.)
- Recognition and Networking + (When you have become a certified VIRT2UE trainer, you will receive an invitation to join the ENERI e-Community. Fill in this application and return it to the coordinators of the ENERI e-community [https://eneri.eu/e-community/ here])
- THE PREPARED CODE: A Global Code of Conduct for Research during Pandemics + (Where possible, community engagement should be continued or even increased during a pandemic, to address the most pressing needs of communities and to help maintain trust in science.)
- THE PREPARED CODE: A Global Code of Conduct for Research during Pandemics + (Where research participants depend on research studies for access to medication and services, study modifications during pandemics need to be managed responsibly to ensure that their lives and health are not endangered.)
- Measurement tools for collecting learning outcomes: long-term effect + (While national REI surveys (barometers) may not address training directly, they can be used as macro-level, long-term reflections of the state of REI in a given context, and as such they may also reflect whether training efforts, in a broad sense, have been effective (Kirkpatrick’s level 4). As the evaluation of training effects cannot be tied to a specific training at this macro level, such surveys instead indicate the extent of challenges that could have been, or can be, alleviated through training, and point towards training needs. For instance, surveys usually collect information about participation in training and ask how confident researchers feel in dealing with ethical issues during their research (statistical data analysis).</br></br>In addition, national as well as institutional surveys may provide an opportunity to collect cases of questionable practices, and future training could address those topics. To collect cases that researchers consider confusing or problematic, open-answer questions can be added to the surveys.</br></br>Furthermore, the health of the entire research community can be evaluated by monitoring the leadership aspect in the surveys. For analysing this aspect, the REI Leadership framework (Tammeleht et al., 2022, submitted) can be used. The meta-analysis provides information about the wider impact of research practices in research institutions, but also helps institutional leaders support everyone in their organisation in adopting ethical research practices.</br></br>This tool is suitable for use in training with ECRs and active researchers.)
- Measurement tools for collecting learning outputs: medium term effects + (While pre- and post-tests are very common as a training effectiveness measure, we propose a pre- and post-text measure. </br></br>Of course, tests are easy to implement and analyse (if statistical analysis is used), but an improvement in average scores may not provide the entire picture of the learning process. In addition, post-post texts can be used as a measure implemented several months after the training to assess the retention of the competencies – this may also provide insights into potential changes in the learner’s behaviour or practices (Kirkpatrick’s levels 2 and 3).</br></br>The learner provides a text (either an essay or a short reflection on a case) prior to the training, then participates in the training activities and submits another text (again, a short essay or discussion of a case). Optionally, another text can be produced several months after the end of the training. If the same analysis tool is used, the long-term impact can be measured.</br></br>This measure is suitable for the HE context and for all disciplines, and it is simple to implement. Common analysis tools make the work simpler and the progress levels comparable. The texts can be evaluated based on the SOLO taxonomy and the reflection levels. Content criteria, like ethical principles, ethical analysis and ethical approaches, can also be sought in the texts. The measure may be challenging to use with large groups, as reading and analysis take time, and it may be difficult to reach the learners months after the end of the training.</br></br>Ethics sections in doctoral dissertations can also be analysed as ‘pre- and post-texts’ if the final product can be compared to earlier drafts. </br></br>The tool is suitable for use in training for all target groups in the HE context.)
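As an illustration only, the following Python sketch shows how the distribution of SOLO taxonomy levels in pre-, post- and delayed texts from the same group might be compared once a human rater has coded each text with the same analysis instrument; the level labels, the data, and the "mean level index" summary are assumptions for demonstration, not prescribed by the measure.

```python
from collections import Counter

# SOLO taxonomy levels in ascending order of understanding (assumed labels).
SOLO_LEVELS = ["prestructural", "unistructural", "multistructural", "relational", "extended abstract"]

# Hypothetical codes for each learner's text at three measurement points,
# assigned manually by a rater using the same analysis instrument each time.
coded_texts = {
    "pre":     ["prestructural", "unistructural", "unistructural", "multistructural"],
    "post":    ["multistructural", "relational", "unistructural", "relational"],
    "delayed": ["multistructural", "relational", "multistructural", "relational"],
}

for point, levels in coded_texts.items():
    counts = Counter(levels)
    # A rough single-number view of group progress: the mean position on the SOLO scale.
    mean_index = sum(SOLO_LEVELS.index(level) for level in levels) / len(levels)
    summary = ", ".join(f"{lvl}: {counts[lvl]}" for lvl in SOLO_LEVELS if counts[lvl])
    print(f"{point:>7}: {summary} (mean level index {mean_index:.1f})")
```

Keeping the coding scheme identical across the pre-, post- and delayed texts is what makes the comparison, and hence the retention check, meaningful.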
- Der Selbstauskunfts-Ansatz: Eine Reflexion über das Konzept des Guten in der Wissenschaft + (During the exercise, your trainer will guide you through a group reflection on the concept of the good. To carry out this reflection, you will:</br></br>1. learn about the different ways in which research can be ''good'';</br></br>2. discuss your reflections from the self-declaration worksheet in small groups (a list of questions to facilitate this group discussion can be found under “Practical Tips”);</br></br>3. reflect on how the different ways in which research can be ''good'' can be categorised;</br></br>4. reflect on the content of the European Code of Conduct for Research Integrity by discussing how the different kinds of good in research are addressed in the Code.</br></br>The exercise will build on the thoughts and ideas that you and the other participants recorded on the self-declaration worksheet. </br></br>A detailed description of the individual steps of this exercise can be found in the instructions for trainers.)
- Modified Dilemma Game + (You and your group will be asked to fill out tables to identify the principles and practices of the European Code of Conduct for Research Integrity, and the scientific virtues, that are relevant to the dilemma at stake. </br></br>[[File:Modified Dilemma Game Table 1.png|thumb|'''Table 1: Which principles from the European Code for Research Integrity can you identify in each dilemma?''']]</br><br /></br>[[File:Modified Dilemma Game Table 2.png|thumb|'''Table 2. Which research misbehaviors can you identify in this dilemma?''']]</br><br /></br>[[File:Modified Dilemma Game Table 3.jpg|thumb|'''Table 3. Which scientific virtues are important when deciding on a course of action?''']]</br><br />)
- Modified Dilemma Game + (You will be asked to reflect on the process, and to evaluate whether the learning objectives were met. You will be invited to have a brief dialogue on what you might have just learned as a group. You may be asked to seek answers to questions such as the following:</br></br><br /></br></br>*Was it easy or difficult to identify the relevant principles and virtues in the chosen dilemma?</br>*Did this exercise help you with identifying and connecting to formally defined principles, such as those in the European Code of Conduct for Research Integrity?</br>*Did most of the players agree or disagree with the final choice?</br>*What were the main points of contention?</br>*Why did people disagree (e.g. differences in experience, training, background, values, norms…)?</br>*Was any alternative option proposed?</br>*Did anybody change their mind as a result of the discussion?</br>*What is needed in order to do the moral good in your work setting? What were the most convincing arguments used in the discussion?</br>*In which areas do you feel there is insufficient consensus?</br>*How can such future dilemmas best be addressed in your daily work?)
- Modified Dilemma Game + (You will be invited to the plenary. The trainer will ask you to reflect on your individual choices and engage in a dialogue with other participants.)
- Modified Dilemma Game + (You, as a group, will be asked to present a brief summary of what has just been discussed in your group. You may assign a member as a spokesperson to briefly present the results of your discussion.)
- Deneyimler üzerine fikir yürütme ve alıştırmaları uygulama + (Based on your experience and expertise in facilitating and using the face-to-face exercises, discuss with the participants the competence level and learning needs of their own target groups, what these groups would like to discuss in detail, and how these topics can be incorporated into the exercises. Show the participants how they can access the training materials on the Embassy of Good Science website. You can also explain how they can interact with the community on the platform via its discussion pages and how they can request changes to the training materials.)
- Critical Thinking, Standpoint & Ethics + ([[File:A drop of water.png|center|frameless|600x600px]]</br></br></br>Researchers in many fields have long known that the act of looking at something can change it. This holds true for people, for animals, and for particles. Below you will see four well-known examples of how an observer can have an impact on what they are observing. For this drag and drop exercise, match the impact type to the meaning.</br></br></br></br>'''Exercise Feedback'''</br></br>These phenomena are well known in research. For instance, being observed makes psychiatric patients a third less likely to require sedation (Damsa et al., 2006); another example is the famous double-slit experiment in modern physics. But many people believe that what we see is never what ‘really is’, even in the most highly controlled experimental settings. What do you think?)
- Gene Editing Case Study with Human Application + ([[File:A group of different people.jpg|center|frameless|600x600px]]</br></br>This checklist is intended for use as a supplement to the usual ethics review process regarding matters that are mainly specific to gene editing in humans. All usual aspects of research ethics review will also need to be considered, for instance, the appropriate processing of sensitive data or the involvement of vulnerable persons, like young children. Additionally, the checklist is not exhaustive; there may be other issues pertaining to individual studies that are not included here. Nevertheless, alongside general guidelines and processes, it provides a useful starting point for ethics reviewers. </br></br>'''1. Somatic or germline gene editing'''</br></br>a. Does the project aim to involve somatic or germline gene editing or both?</br></br>b. If germline gene editing, does the project comply with national legislation?</br></br>c. If germline gene editing, what steps have been undertaken to ensure societal acceptability?</br></br>d. If somatic gene editing, could the intervention affect the germline accidentally?</br></br>'''2. Novelty of gene editing in the project'''</br></br>a. Does the project use a novel technique, one that has already been tried in humans, or both?</br></br>b. If this is the first time it has been tested in humans, have comprehensive studies been undertaken in vitro and in animals to demonstrate proof of concept and safety?</br></br>c. If the technology has already been tested in humans, what do the findings tell us about potential risks and benefits?</br></br>'''3. Technological and other risks'''</br></br>a. Are risks of on-target effects clearly described and addressed?</br></br>b. Are risks of off-target effects clearly described and addressed?</br></br>c. Are risks of genetic mosaicism clearly described and addressed?</br></br>d. Are risks of immunogenicity clearly described and addressed?</br></br>e. Are risks associated with the treatment process clearly described and addressed?</br></br>f. Are risks of incidental findings clearly described and addressed?</br></br></br>'''4. Enhancement and slippery slope'''</br></br>a. Is the gene editing to be used purely for therapeutic purposes?</br></br>b. If for therapeutic purposes, are there risks that the technology could also be applied for enhancement purposes?</br></br>c. If so, how is this risk addressed?</br></br></br>'''5. Consent'''</br></br>a. How is the consent process being managed?</br></br>b. How is the option to opt out of the procedure being managed?</br></br>c. Is participant information sufficiently comprehensive and comprehensible so that the potential participants (or their legal representatives) will understand enough about the technology to assess the potential for harms and benefits meaningfully?</br></br>d. Are the potential participants being offered adequate support and time to reach a decision?</br></br></br>'''6. Data'''</br></br>a. What measures and protections are in place to prevent the exploitation of genetic and/or other biological data, for example, for profit?</br></br>b. What measures and protections are in place to prevent the misuse or exploitation of genetic and/or other biological data leading to, for example, discrimination, harassment, or marginalisation?</br></br></br>'''7. Equity'''</br></br>a. 
Who are the potential beneficiaries of this study?</br></br>b. Will the resultant therapy or other benefits be broadly accessible?</br></br>c. How are any matters of potential inequity in access addressed and justified?</br></br></br>'''8. Study justification'''</br></br>a. Is there a medical need for this study?</br></br>b. Might the same objectives be achieved via less risky and/or less costly methods?</br></br></br>[https://classroom.eneri.eu/sites/default/files/2024-10/Checklist%20for%20gene%20editing.pdf You can download the checklist here])
- Critical Thinking, Standpoint & Ethics + ([[File:A pair of eyes.png|center|frameless|600x600px]]</br></br></br>For most people, the ultimate proof that something is true is to see it for themselves. But how reliable are your observations? In the following pages, we will consider three potential influencing factors:</br></br></br>*The sense perception of the observer</br>*The impacts of the observer</br>*The viewpoint of the observer)
- Critical Thinking, Standpoint & Ethics + ([[File:A spyglass.png|center|frameless|600x600px]]</br></br></br>Sometimes it's not just the presence but the viewpoint of the observer that changes the interaction with the observed.</br></br>The third influencing factor upon what is observed stems from the viewpoint of the observer. Researchers are not neutral processors of information. As human beings, they bring with them a host of assumptions and preconceptions.</br></br></br>Observation is dependent upon and coloured by our individual senses and our background beliefs and assumptions. In research, many of our background beliefs and assumptions are associated with the paradigm in which we operate, as we consider next.)
- Critical Thinking, Standpoint & Ethics + ([[File:A view of mountains high up on a hill.png|center|frameless|600x600px]]</br></br></br>Which paradigm are you working in? Look at the descriptions at the end points of each question, and try to work out where you and your research project might fall along each continuum. Are you more of a realist or more of a relativist? Is your approach to knowledge generation more positivist or interpretivist? Do these aspects fit with the methodological stance that you take in your research?</br></br></br>Most people operate somewhere between the extremes. Additionally, it is possible to alter one’s positionality in response to different contexts. For instance, when addressing a research question which requires broad statistics, one might take a more positivist stance; when in-depth inquiry of a qualitative nature is required, one might take a more interpretivist stance. The important point is that we are cognisant of our perspective and its influence upon the knowledge that we create.)
- AI In Healthcare: Technology Basics + ([[File:AI Image1.png|center|frameless|600x600px]]</br></br></br>AI, or artificial intelligence, refers to the development of digital systems that can perform tasks that typically require human intelligence.</br></br></br>These tasks include learning, reasoning, problem-solving, perception, language understanding, and speech recognition.</br></br></br>At its core, AI leverages principles from computer science, mathematics, and cognitive psychology to replicate intelligent behaviour in machines.</br></br></br>AI utilises algorithms, data, and computational power to simulate intelligent behaviour, enabling machines to adapt, improve, and perform complex functions autonomously.</br></br></br>Several core scientific concepts underpin the development and functionality of AI. Work your way through the presentation below to hear about some of them:</br></br></br>This list of core scientific concepts in AI is subject to ongoing research and development. The field of AI is rapidly evolving, and new techniques, algorithms, and applications are continuously emerging.</br></br></br>As researchers and scientists make advancements in AI technology and explore novel use cases, the understanding and implementation of these concepts may evolve.</br></br></br>Generative AI is a type of foundation model that is becoming more and more evident in everyday life as well as in healthcare. Test your understanding of generative AI by answering the following questions.)
- AI In Healthcare: Technology Basics + ([[File:AI Image10.png|center|frameless|600x600px]] Think about places where data relating to your own health might be stored – tick all that apply.)
