From The Embassy of Good Science
Group discussions (e.g. groups solving cases, analysing vignettes, making a decision, or discussing possible courses of action or value conflicts) reflect the group’s understanding of the topics and are suitable in the HE context and across disciplines. Sometimes the entire training may consist of discussions and only at the end does the group present its outcome, which the facilitator evaluates (Kirkpatrick’s level 2). Alternatively, the facilitator may record the discussion and evaluate the entire discussion process. Group discussions can be monitored by facilitators on site with, e.g., the ECAG template, or recorded and evaluated afterwards based on the SOLO taxonomy. In addition, content criteria can be identified in the discussions, and discourse analysis can be used to monitor attitudes and values.
For example, Figure 2 (from Tammeleht et al., 2022) displays a (recorded) group discussion timeline indicating the levels of understanding, the time devoted to certain topics (A-D) and the order of topics. This approach is time-consuming but feasible with small groups. ECAG is more convenient and can be used in class for on-site evaluation.
[[File:Img20.png|center|frameless|500x500px]]
Figure 2. Group discussion timeline (from Tammeleht et al., 2022).
This tool is suitable for all target groups in the HE context.
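Once a transcript has been segmented and manually coded against the SOLO taxonomy, the timeline evaluation described above can be summarised computationally. The sketch below is a hypothetical illustration, not part of any BEYOND tool: the segment data, topic labels (A-D) and helper functions are invented for the example; only the five SOLO level names come from Biggs and Collis’s taxonomy.

```python
# Hypothetical sketch: summarising a manually SOLO-coded discussion into
# the kind of timeline information shown in Figure 2. All segment data
# and topic labels are invented examples.

SOLO_LEVELS = ["prestructural", "unistructural", "multistructural",
               "relational", "extended abstract"]

# Each coded segment: (start_minute, end_minute, topic, SOLO level)
segments = [
    (0, 5, "A", "unistructural"),
    (5, 12, "A", "multistructural"),
    (12, 20, "B", "multistructural"),
    (20, 28, "C", "relational"),
    (28, 30, "D", "relational"),
]

def time_per_topic(coded):
    """Total minutes the group devoted to each topic, in order discussed."""
    totals = {}
    for start, end, topic, _level in coded:
        totals[topic] = totals.get(topic, 0) + (end - start)
    return totals

def peak_level(coded):
    """Highest SOLO level reached anywhere in the discussion."""
    return max(coded, key=lambda seg: SOLO_LEVELS.index(seg[3]))[3]

print(time_per_topic(segments))  # → {'A': 12, 'B': 8, 'C': 8, 'D': 2}
print(peak_level(segments))      # → relational
```

A facilitator using the ECAG template on site would collect equivalent information by hand; the point is only that a coded timeline supports simple summaries such as time per topic and the highest level of understanding reached.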
== Group discussions ==
Guide participants by going through the following steps:
#Players read the case and the solutions.
#Then everyone individually picks the solution that they are most likely to support (this is an individual decision and discussion at that stage should be discouraged).
#When all players have made their choice, all individuals within a group should simultaneously reveal their choices to each other (easiest to do by simply raising the number of fingers corresponding to the solution). Often different solutions are chosen, and the realization of this peer disagreement is an important aspect of the methodology.
#Each player then takes turns explaining their reasons behind their choice. Others listen without intervening, commenting or criticising.
#After everyone has provided their reasons, discussion begins with the aim of finding consensus in the group. Players consider whether reaching a common decision is possible, give reasons for their choices, and explain why other solutions may not seem as good. After listening to the other players’ explanations, each player can either stick with their initial choice, change it, or propose a new solution.
#When the group has reached a consensus (or decided not to pursue one), it moves on to the next case.
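The round structure above can be summarised schematically. The sketch below is purely illustrative: the player names, choices and the `play_case` helper are all invented, and the code only mirrors the order of steps (individual choice, simultaneous reveal, detection of peer disagreement, then discussion toward consensus).

```python
# Illustrative sketch of one round of the exercise above. Player names,
# choices and this helper are hypothetical; the code encodes the order
# of steps, not any real group's reasoning.

def play_case(choices):
    """choices maps each player to the solution number they picked
    individually (step 2), before any discussion."""
    # Step 3: simultaneous reveal - everyone sees all choices at once.
    revealed = dict(choices)
    # Peer disagreement exists when more than one solution was chosen.
    if len(set(revealed.values())) == 1:
        return "consensus", revealed  # step 6: move to the next case
    # Steps 4-5: each player explains in turn, then open discussion;
    # players may keep their choice, change it, or propose a new one.
    return "discuss", revealed

status, votes = play_case({"Player 1": 2, "Player 2": 3, "Player 3": 2})
print(status)  # → discuss
```

The interesting case is precisely the `"discuss"` branch: the simultaneous reveal makes peer disagreement visible before anyone has argued a position.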
These FAIRsharing Collections provide a focal point for the discoverability and visualisation of standards (reporting requirements, terminologies, models/formats, identifier schemas) in the Social Sciences (147 standards: https://fairsharing.org/6224) and Life Sciences (1,043 standards: https://fairsharing.org/6225), as well as data policies from publishers and funders (272 policies). These Collections are dynamic, living representations of discipline-specific communities’ recommendations and of requirements set by policy makers. At-a-glance views of other disciplines are available by selecting the relevant subject areas at: https://fairsharing.org/browse/subject
Detailed information on the content of these three Collections, showing coverage and relationships, is available in their respective narrative reports at https://doi.org/10.17605/OSF.IO/H4R6M
Based on the podcast, identify '''one realistic change''' you could make, for example:
*Asking a different research question
*Paying attention to gendered or social contexts you currently overlook
*Being more critical about funding sources or collaborations
*Using a more creative or reflective method in a workshop, class, or research meeting
Keep it small and concrete, and '''use these questions''' to check whether your reflection is aligned with the podcast’s messages:
*Can I explain how this change responds to a limitation the podcast's guest describes in current research practices?
*Does this change move beyond simply “adding” a group, and instead question assumptions or structures?
*Does it reflect the idea that research is connected to real lives, values, and consequences?
If you can answer “yes” to at least two, you are applying the podcast’s insights.
Wildlife research can provide important scientific insights into biodiversity conservation, but it also raises ethical concerns regarding animal welfare, ecosystem disruption, and community involvement. As scientists and researchers work to understand animal behaviors and ecosystems, ethical considerations are becoming more critical than ever. In this session, you will learn about key ethical issues that arise in wildlife ecology research. The goal is to help various stakeholders involved in wildlife research integrate ethical thinking into every stage of their work, ensuring that wildlife research advances science while protecting the lives and landscapes it depends on.
Watch the video below on “Ethical Issues in Wildlife Ecology Research?”, which gives an overview of practical ethical concerns in wildlife research.
Now that we have learned about care ethics and the ways in which care has been conceptualized and enacted by Indigenous environmental movements, let's look at a particular story brought to us by a microplastics researcher. In this story we hear a few examples of how care-based and environmentally aware practices can be embedded in the context of research.
Listen to the podcast below.
When training professors, trainers, mentors, and supervisors in research integrity, the aim is to equip them with the knowledge, skills, and resources to effectively train, mentor and guide their students and junior colleagues in conducting research with integrity.
Resources for professors and senior academics include (please see the last section of the BEYOND trainer guide for an overview of materials divided by topics and target groups):
*The [[Guide:Bbe860a3-56a9-45f7-b787-031689729e52|VIRT2UE]] train the trainer course on research integrity and the training module introducing [[Instruction:D3ee617b-5d9b-4c47-a015-030b0354c9d2|supervision and mentorship practices]].
*The training materials to foster mentorship and supervision developed by [https://h2020integrity.eu/toolkit/tools-researchers-supervisors/ INTEGRITY].
*The training for supervisors and leaders ([https://www.researchethicstraining.net/leadershiplevel leadership level]) developed by the RID-SSISS project.
*The [https://www.academicintegrity.eu/wp/bridge-module-for-supervisors/ module for supervisors] developed by the [https://www.academicintegrity.eu/wp/bridge/ BRIDGE project].
Trainers can select one or more of the following tools for evaluating training effectiveness for Professors and Senior Academics:
{| class="wikitable"
|+Table 7: BEYOND Tools for evaluating training effectiveness for academics and experts
!'''Tool for collecting learning outputs'''
!'''Details'''
!'''Analysis instrument **'''
|-
|'''Self-Reflection Form/Compass'''
|App under development, [https://forms.office.com/Pages/ShareFormPage.aspx?id=WXWumNwQiEKOLkWT5i_j7twYn7PlpvpDlgGDpz2LgIdUMk5XRTVYQTVKRFRDWDlHOUdGU1FHTUlFVi4u&sharetoken=03epmvYBRpmfXvpRg9os form] * (for copying and editing)
|SOLO taxonomy, reflection levels, content criteria
|-
|'''Pre-post texts'''
|Collect a short text (e.g. a response to a case or short essay) before the training and after the training
|SOLO taxonomy, reflection levels, content criteria
|-
|'''Learning diaries'''
|Ask learners to keep a diary over a certain period; for each submission, provide some guiding questions or topics
|SOLO taxonomy, reflection levels, content criteria
|-
|'''Group reports'''
|Ask groups working together to provide a (short) group report (or provide a template with points to work on)
|SOLO taxonomy, content criteria
|-
|'''Group discussions'''
|Monitor the group discussions to evaluate the level of understanding and content discussed (scaffold as appropriate)
|SOLO taxonomy, content criteria
|-
|'''Retention check'''
|After a certain time (a few weeks or months), ask learners to provide a short text (an analysis of a case, or a short essay on an ethics topic/question). Compare the levels of understanding to another piece collected during or right after the training.
|SOLO taxonomy, content criteria
|-
|'''Vignettes'''
|Can be used for measuring ethical sensitivity in (non-)training contexts.
|Statistics, EASM (based on the SOLO taxonomy), content criteria
|-
|'''National surveys'''
|Can be used for analysing training-related content in reports and monitoring the display of REI leadership.
|Statistics, REI leadership framework
|}
For instance, to measure participants’ reactions during or right after the training, the Self-Reflection Form can be used. In addition, if learners worked in groups, their group discussions can be monitored, and if they provided a group report or pre- and post-texts, the learning process can be evaluated with the SOLO taxonomy to measure levels of understanding. Moreover, if possible, a couple of months after the training an additional case study could be given to the same learners, and the content of their analysis could again be evaluated with the SOLO taxonomy. Analysing vignettes and participating in national REI surveys would also be relevant for this target group. This kind of effectiveness measurement makes it possible to triangulate measurements at different time points.
An example for implementation can be found here: [https://helsinkifi.sharepoint.com/:p:/r/sites/BEYONDHelsinkiteam/Shared%20Documents/ENERI%20CR%20material%20example.pptx?d=w41ec1fe12bbb495781dbe81d55a09d71&csf=1&web=1&e=yxKgSC ENERI CR material example.pptx]
* The Self-Reflection Form link enables the facilitator to make a copy of the form, which they can then edit; the data will accumulate in the facilitator’s cloud service (Google or Microsoft).
** Analysis instruments are described in WP4.2, later available on the Embassy’s website.
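The triangulation idea described above (comparing the same learners at several measurement points) can be illustrated with a small sketch. Everything below is hypothetical: learner names and coded levels are invented examples; only the SOLO level names are taken from the taxonomy.

```python
# Hypothetical sketch of triangulating SOLO levels across three
# measurement points (pre-text, post-text, retention check months
# later). Learner names and coded levels are invented examples.

SOLO = ["prestructural", "unistructural", "multistructural",
        "relational", "extended abstract"]

results = {
    "learner_1": {"pre": "unistructural", "post": "relational",
                  "retention": "multistructural"},
    "learner_2": {"pre": "multistructural", "post": "relational",
                  "retention": "relational"},
}

def change(levels, a, b):
    """SOLO steps gained (positive) or lost (negative) between points."""
    return SOLO.index(levels[b]) - SOLO.index(levels[a])

for name, levels in results.items():
    print(name,
          "training gain:", change(levels, "pre", "post"),
          "| retained:", change(levels, "pre", "retention"))
```

Comparing the pre-to-post gain with the pre-to-retention gain distinguishes understanding that was merely displayed during the training from understanding that persisted.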
[[File:A group of different people.jpg|center|frameless|600x600px]]
<div><div>
This checklist is intended as a supplement to the usual ethics review process, covering matters that are mainly specific to gene editing in humans. All usual aspects of research ethics review will also need to be considered, for instance, the appropriate processing of sensitive data or the involvement of vulnerable persons, such as young children. Additionally, the checklist is not exhaustive; there may be other issues pertaining to individual studies that are not included here. Nevertheless, alongside general guidelines and processes, it provides a useful starting point for ethics reviewers.
'''<br />'''
'''1. Somatic or germline gene editing'''
a. Does the project aim to involve somatic or germline gene editing or both?
b. If germline gene editing, does the project comply with national legislation?
c. If germline gene editing, what steps have been undertaken to ensure societal acceptability?
d. If somatic gene editing, could the intervention affect the germline accidentally?
'''<br />'''
'''2. Novelty of gene editing in the project'''
a. Does the project use a novel technique, one that has already been tried in humans, or both?
b. If this is the first time it has been tested in humans, have comprehensive studies been undertaken in vitro and in animals to demonstrate proof of concept and safety?
c. If the technology has already been tested in humans, what do the findings tell us about potential risks and benefits?
'''<br />'''
'''3. Technological and other risks'''
a. Are risks of on-target effects clearly described and addressed?
b. Are risks of off-target effects clearly described and addressed?
c. Are risks of genetic mosaicism clearly described and addressed?
d. Are risks of immunogenicity clearly described and addressed?
e. Are risks associated with the treatment process clearly described and addressed?
f. Are risks of incidental findings clearly described and addressed?
'''4. Enhancement and slippery slope'''
a. Is the gene editing to be used purely for therapeutic purposes?
b. If for therapeutic purposes, are there risks that the technology could also be applied for enhancement purposes?
c. If so, how is this risk addressed?
'''5. Consent'''
a. How is the consent process being managed?
b. How is the option to opt out of the procedure being managed?
c. Is participant information sufficiently comprehensive and comprehensible so that the potential participants (or their legal representatives) will understand enough about the technology to assess the potential for harms and benefits meaningfully?
d. Are the potential participants being offered adequate support and time to reach a decision?
'''6. Data'''
a. What measures and protections are in place to prevent the exploitation of genetic and/or other biological data, for example, for profit?
b. What measures and protections are in place to prevent the misuse or exploitation of genetic and/or other biological data leading to, for example, discrimination, harassment, or marginalisation?
'''7. Equity'''
a. Who are the potential beneficiaries of this study?
b. Will the resultant therapy or other benefits be broadly accessible?
c. How are any matters of potential inequity in access addressed and justified?
'''8. Study justification'''
a. Is there a medical need for this study?
b. Might the same objectives be achieved via less risky and/or less costly methods?
[https://classroom.eneri.eu/sites/default/files/2024-10/Checklist%20for%20gene%20editing.pdf You can download the checklist here]
</div></div><div></div>
Drawing on your experience and expertise in facilitating and using the face-to-face exercises, discuss with participants the competence levels and learning needs of their own target groups, what these groups would want to discuss in detail, and how these topics could be incorporated into the exercises. Show participants how they can access the training materials on The Embassy of Good Science website. You can also explain how they can interact with the community on the platform via its discussion pages and how they can request changes to the training materials.
[[File:Gene Image6.png|center|frameless|600x600px]]
Gene editing and stem cell research intersect in many different ways. For instance, stem cells, particularly induced pluripotent stem cells, can be derived from patients with genetic diseases. These cells can then be edited using CRISPR-Cas9 to introduce or correct disease-causing mutations, allowing researchers to study the underlying mechanisms of the disease in a controlled laboratory environment. This approach can also be used to screen potential drugs for treating genetic disorders.
Additionally, by combining stem cell technology with gene editing techniques, researchers aim to develop more effective and targeted gene therapies for a wide range of disorders. There are three main types of stem cells: embryonic stem cells, induced pluripotent stem cells and adult stem cells.
The Data Transmission module is designed to provide a practical experience of how information and data get misunderstood, distorted, and reinterpreted as they are transmitted between people. The module is based on the children’s game of ‘Telephone’, which is used to illustrate the importance of tracking down the original source of any story or piece of data, especially if the data is to be used for research or schoolwork. By systematically playing the game and reflecting on its results, the module highlights the importance of responsibility in research; how and when to verify a message; how to recognise a piece of information as trustworthy; and the role played by trust in data transmission protocols. In addition, students will develop a personal awareness of how information is transmitted through listening, hearing, and understanding, and an appreciation of how this is closely linked to the data protocols of sending, accepting, and processing.
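The distortion dynamic the module builds on can be shown with a toy simulation. Everything below is a hypothetical sketch: the error model (a fixed chance per hop of garbling one word) is an invented stand-in for mishearing, not part of the module itself.

```python
# Toy simulation of the 'Telephone' dynamic: a message is passed along a
# chain, and each hop has some chance of corrupting one word.
import random

def transmit(message, hops, error_rate=0.3, seed=42):
    """Pass a message along a chain of `hops` people; each hop has a
    fixed chance of garbling one word (a stand-in for mishearing)."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    words = message.split()
    history = [" ".join(words)]  # history[0] is the original message
    for _ in range(hops):
        if words and rng.random() < error_rate:
            i = rng.randrange(len(words))
            words[i] = words[i][::-1]  # reverse one word to "corrupt" it
        history.append(" ".join(words))
    return history

chain = transmit("the original source matters", hops=6)
print(chain[0])   # what the first person said
print(chain[-1])  # what the last person heard
```

Comparing `chain[0]` with `chain[-1]` mirrors the debrief step of the game: only by going back to the original source can participants see what was lost along the way.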
[[File:GovProc img3.png|center|frameless|600x600px]]
Research governance mechanisms typically include a system for ethics approval of research. For many types of research, including research with humans, human data or animals, ethics review is compulsory, and approval must be granted before data collection can begin.
Reviews are normally undertaken by committees who seek to protect the interests of research participants, the institution, and other stakeholders.
They also ensure that research complies with local and internationally accepted ethics guidelines and legal requirements. In many countries, these committees are known as research ethics committees or RECs. In other countries, they may be known as institutional review boards (IRBs), ethics review boards (ERBs), or ethics review panels (ERPs).
RECs are normally composed of members from a range of disciplines and professional backgrounds to ensure relevant expertise and input from different perspectives. RECs must be free from influence by the researchers, funders or other stakeholders so that they can provide an unbiased opinion.
RECs have the power to authorise a project, request modifications or prevent studies that do not conform to accepted ethical norms and standards. There are different types of RECs. For instance, many countries have centralised systems for clinical research that involves patients or healthcare staff. Many universities have their own RECs and some have different RECs for different disciplines.
RECs can have different templates and processes for applications, but all have the same basic requirements. They want to know:
*About the research proposal - why the research is being conducted and exactly what it will involve
*How the research and the researchers are complying with all the relevant legal and ethical requirements
*What risks are associated with the project and how these are being mitigated
RECs have a critical role in upholding ethical standards in research. With their significant combined expertise, they can spot potential problems in research proposals and help to ensure that both researchers and participants are protected from harm.
[[File:G5.png|center|frameless|600x600px]]
Justification for this type of research cannot rest purely upon the assessment of harms and benefits for the participants. There are many other factors to take into account when assessing the ethical permissibility of leading-edge gene editing research with humans. Work through the presentation below to reveal some other important factors that might need to be considered.
The assessment of proposals like this is a complex matter, and it may demand input from a wide range of perspectives. There are specific technical questions (for instance, regarding what the therapy will involve and the potential for off-target or on-target effects), as well as broader and more general questions (for instance, ‘Does this research need to be done?’ and ‘Who stands to benefit from the research?’). The involvement of young children also demands careful consideration: ‘Is children's participation in the research necessary, or could the information be obtained in other ways?’; ‘What would be the likely consequences of not involving children?’. These considerations require individual, case-by-case scrutiny from a committee with wide-ranging expertise.
Bell, A., Kelton, J., McDonagh, N., Mladenovic, R., & Morrison, K. (2011). A critical evaluation of the usefulness of a coding scheme to categorise levels of reflective thinking. Assessment & Evaluation in Higher Education, 36(7), 797-815.
Biggs, J. and Collis. K. (1982). Evaluating the quality of learning: The SOLO taxonomy. New York: Academic Press.
Biggs, J. and Tang, C. (2007). Teaching for Quality Learning at University (3rd ed.) Buckingham: SRHE and Open University Press.
Chejara, P., Prieto, L. P., Ruiz-Calleja, A., Rodríguez-Triana, M. J., Shankar, S. K. & Kasepalu, R. (2021). EFAR-MMLA: An Evaluation Framework to Assess and Report Generalizability of Machine Learning Models in MMLA. Sensors, 21(8), 2863.
Conqvist, M. (2024). Research ethics in Swedish dissertations in educational sciences – a matter of confusion. Nordic Conference of PhD Supervision, 2024. Karlstad, Sweden.
DeLozier, S. J., & Rhodes, M. G. (2017). Flipped classrooms: A review of key ideas and recommendations for practice. Educational psychology review, 29(1), 141-151.
Gibbs, P., Costley. C., Armsby, P. and Trakakis, A. (2007). Developing the ethics of worker-researchers through phronesis. Teaching in Higher Education, 12(3), 365–375. https://doi.org/10.1080/13562510701278716
Hannula. M. S., Haataja, E., Löfström, E., Garcia Moreno-Esteva, E., Salminen-Saari, J., & Laine, A. (2022). Advancing video research methodology to capture the processes of social interaction and multimodality. ZDM Mathematics Education 54, 433–443. https://doi.org/10.1007/s11858-021-01323-5
Jordan, J. (2007). Taking the first step towards a moral action: A review of moral sensitivity across domains. Journal of Genetic Psychology, 168, 323–359.
Kember, D., Jones, A., Loke, A., McKay, J., Sinclair, K., Tse, H., Webb, C., Wong, F., Wong, M., and Yeung, E. (1999). Determining the level of reflective thinking from students’ written journals using a coding scheme based on the work of Mezirow. International Journal of Lifelong Education, 18(1), 18–30.
Kember, D., Leung, D. Y. P., Jones, A., Loke, A. Y., McKay, J., Sinclair, K., Tse, H., Webb, C., Wong, F. K. Y., Wong, M., and Yeung, E. (2000). Development of a questionnaire to measure the level of reflective thinking. Assessment & Evaluation in Higher Education, 25(4), 381–395.
Kember, D., McKay, J., Sinclair, K., & Wong, F. K. Y. (2008). A four-category scheme for coding and assessing the level of reflection. Assessment & Evaluation in Higher Education, 33(4), 369-379.
Kirkpatrick, D. L. (1959). Techniques for evaluating training programs. Journal of the American Society of Training Directors, 13, 21–26.
Kitchener, K. S. (1985). Ethical principles and ethical decisions in student affairs. In Applied ethics in student services: new directions for student services, number. 30 (pp. 17–29). San Francisco: Jossey-Bass.
Löfström, E. (2012). Students’ Ethical Awareness and Conceptions of Research Ethics. Ethics & Behavior, 22(5), 349–361. https://doi.org/10.1080/10508422.2012.679136
Löfström, E. and Tammeleht, A. (2023). A pedagogy for teaching research ethics and integrity in the social sciences: case-based and collaborative learning. In Academic Integrity in the Social Sciences (pp.127−145).
Mezirow, J. (1991). Transformative dimensions in adult learning. San Francisco: Jossey-Bass.
Mustajoki, H. and Mustajoki, A. (2017). A new approach to research ethics: Using guided dialogue to strengthen research communities. Taylor & Francis. https://doi.org/10.4324/9781315545318 .
Parder, M.-L., Tammeleht, A., Rajando, K. and Simm, K. (2024). Using vignettes for assessing ethical sensitivity in the national research ethics and integrity study. Poster presentation at the World Conference of Research Integrity, 2024. Athens, Greece.
Praslova, L. (2010). Adaptation of Kirkpatrick’s four level model of training criteria to assessment of learning outcomes and program evaluation in Higher Education. Educational Assessment, Evaluation and Accountability, 22, 215–225. https://doi.org/10.1007/s11092-010-9098-7
Salminen, A., & Pitkänen, L. (2020). Finnish research integrity barometer 2018. Finnish National Board on Research Integrity TENK publications, 2-2020.
Steele, L. M., Mulhearn, T. J., Medeiros, K. E., Watts, L. L., Connelly, S., and Mumford, M. D. (2016). How do we know what works? A review and critique of current practices in ethics training evaluation. Accountability in Research, 23(6), 319–350. https://doi.org/10.1080/08989621.2016.1186547 .
Stoesz, B. M. and Yudintseva, A. (2018). Effectiveness of tutorials for promoting educational integrity: a synthesis paper. International Journal for Educational Integrity, 14(6). https://doi.org/10.1007/s40979-018-0030-0 .
Stolper, M. & Inguaggiato, G. (n.d.) Debate and Dialogue. [[Instruction:Ac206152-effd-475b-b8cd-7e5861cb65aa|https://embassy.science/wiki/Instruction:Ac206152-effd-475b-b8cd-7e5861cb65aa]]
Tammeleht, A., Rodríguez-Triana, M. J., Koort, K., & Löfström, E. (2019). Collaborative case-based learning process in research ethics. International Journal of Educational Integrity, 15(1), 6.
Tammeleht, A., Rodríguez-Triana, M. J., Koort, K., and Löfström, E. (2020). Scaffolding collaborative case-based learning in research ethics. Journal of Academic Ethics, 19, 229–252.
Tammeleht, A., Löfström, E. & Rodríguez-Triana, M. J. (2022). Facilitating development of research ethics leadership competencies. International Journal of Educational Integrity. https://doi.org/10.1007/s40979-022-00102-3
Tammeleht, A. (2022). Facilitating the development of research ethics and integrity competencies through scaffolding and collaborative case-based problem-solving. Helsinki: Unigrafia OY, Helsinki Studies in Education.
Tammeleht, A., J. Antoniou, R. de La C. Bernabe, C. Chapin, S. van den Hooff, V. N. Mbanya, M.-L. Parder, A. Sairio, K. Videnoja, and E. Löfström. (submitted). Manifestations of research ethics and integrity leadership in national surveys - Cases of Estonia, Finland, Norway, France and the Netherlands.
Tammeleht, A., Parder, M.-L., Rajando, K., and Simm, K. (forthcoming). Using vignettes for assessing ethical sensitivity in the national research ethics and integrity survey – an example from Estonia.
Thorpe, K. (2004). Reflective learning journals: From concept to practice. Reflective Practice, 5(3), 327–343. https://doi.org/10.1080/1462394042000270655
Tucker, B. (2012). The flipped classroom. Education next, 12(1), 82-83.
Well done! You have now learnt about various types (recyclable vs. replaceable) of plastics used in a lab setting and the steps towards designing a plastic recycling pipeline for more sustainable management of plastic waste in a lab. Naturally, the recycling pipeline suggested by Green Labs Austria has to be tailored to the specific conditions of each lab.
In sum, here are some guidelines that can be adopted for the successful development of a recycling pipeline: '''''(i) communication is key for enabling easy and sensible sorting of plastic waste; (ii) the recycling pipeline should be tested initially with a smaller group before being rolled out to a much larger group; (iii) strive for adaptability by substituting non-recyclable materials with recyclable alternatives.'''''
Moving forward, please use the questions below as a guide to reflect on your next steps.
After 10 minutes (the duration may vary depending on the size of the group), end the debate and ask the group to reflect on what happened during the debate process. Help them reflect mainly on the process rather than the content of the debate. Ask the participants what the distinguishing features of the debate method are and list these on the flip chart. To stimulate the reflection process, ask questions about what you consider to be features of the debate (act as a role model demonstrating questioning behaviour/the art of inquiry). Some of the questions you could ask are listed below:
a. What struck you in your conversation with each other? What did you observe/experience?
b. Are there any details about the participants' posture or tone of voice that stayed with you?
c. How would you describe the interaction between the two groups? What did you observe/experience?
d. Do you think you understood each other clearly?
e. Which group led the debate, and why do you think that happened? (One group may have constantly pressed the opposing group while the other constantly had to defend itself. Try to work out together with the participants what caused this dynamic.)
While the participants list and comment on the features of the debate, ask them to provide examples (What did you see or experience? What was missing?). Also point out that debate/discussion can be an efficient method in some situations: for example, it can be effective for quickly clarifying initial judgments/statements/opinions regarding a moral dilemma.
a. Divide participants into small groups and ask each group to select a moderator/rapporteur (for the plenary session at the end of the exercise).
b. Ask each participant to present (in 1 minute) the situation to their subgroup by describing the moral uncertainty or concern they experienced, including the specific virtue that was important in that situation. At this point participants should not disclose how the situation ended (what they did in that situation).
a. Ask each subgroup to choose (e.g. by voting) which of the presented situations and virtues they want to reflect upon as a group. Invite the participants in each group to place themselves in the selected situation and to ask the case owner any factual clarification questions (i.e. no judgments, advice, or conclusions).
b. Invite the participants to choose which virtue they think is at stake in the presented situation and to write the three notes for their virtue themselves (using handout 2, available in the practical tips).
c. After each participant completes the form, ask them to briefly share their notes with their subgroup, engaging in a group reflection/dialogue about the differences/similarities related to the virtues and behaviors chosen. Invite participants to reflect on:
*What is interesting or surprising?
*Did you have different virtues in the first place?
*What did you learn from each other with respect to how the middle position was described and reflected upon?
*In which way were the virtues of the European Code of Conduct for Research Integrity represented in the virtues mentioned by the participants?
'''''Target audience''': secondary school students, doctoral students and early career researchers, senior academic and RE/RI experts.''
Empowering researchers to behave responsibly in research is at the heart of the [https://h2020integrity.eu/ INTEGRITY] course and each individual module. For this purpose, several modules have been developed, each addressing a different research integrity or research ethics topic.
These are specifically designed for 4 different target audiences:
#[https://h2020integrity.eu/toolkit/tools-high-school-students/teachers-guide-for-secondary-school/ high school students]
#[https://h2020integrity.eu/toolkit/tools-undergraduate-students/integrity-games/ undergraduate students]
#[https://h2020integrity.eu/toolkit/tools-phd-students/ PhD students and]
#[https://h2020integrity.eu/toolkit/tools-researchers-supervisors/ researchers and supervisors].
The training materials are presented alongside a [https://h2020integrity.eu/toolkit/tools-high-school-students/teachers-guide-for-secondary-school/ Teacher Guide].
[[File:Ge3Image4.png|center|frameless|600x600px]]
The ecologist’s perspective
As an ecologist, I have serious concerns about the proposal to use gene drive technology to eradicate malaria-carrying mosquitoes. While the goal of eliminating malaria is undeniably important, the potential risks to ecosystems, biodiversity, and the natural world need to be carefully considered before taking such a drastic step.
One of my main concerns is biodiversity disruption. Mosquitoes are not just pests; they play important roles in ecosystems. For example, male mosquitoes are pollinators for some plants, and many species of birds, fish, and bats rely on mosquitoes as a food source. If we wipe out a mosquito species, we could disrupt food chains in ways we can’t fully predict. Ecosystems are incredibly complex and fragile, so the extinction of one species can lead to a chain reaction, potentially causing other species to disappear. In regions that are already struggling with food security, this kind of disruption could lead to further ecological damage and even food shortages. The consequences could be devastating for both nature and the people who rely on it.
Then there’s the issue of gene flow to non-target species. In the wild, mosquitoes sometimes interbreed with closely related species. There’s a real risk that the gene drive could spread to non-target mosquitoes, including those that don’t carry malaria. If that happens, we could see a dramatic drop in mosquito populations beyond what’s intended, affecting species that depend on them for food or pollination. Imagine what would happen if all mosquito species suddenly disappeared—we’re talking about a potential collapse of ecosystems that rely on them, creating ripple effects throughout the environment.
And let’s not forget about ecosystem irreversibility. Once these gene drives are released into the wild, they’re self-propagating, meaning they spread on their own. If something goes wrong, there’s no way to take it back. We can’t hit an “undo” button on nature. This kind of irreversible interference with ecosystems raises ethical questions about how much we should be tampering with the natural world. We could be altering the balance of mosquito populations forever, and that’s a weighty decision to make.
We need to be absolutely sure of the impacts before moving forward, because once this technology is out there, there’s no way to reverse it. We could be making changes to the natural world that we don’t fully understand, with consequences that could last for generations.
