Difference between revisions of "Theme:Eea53f3b-1bf1-4660-8c04-5e2e9aaebdb5"

From The Embassy of Good Science
{{Related To
|Related To Resource=Resource:9b3be28c-343a-41aa-8313-232e6eebc113; Resource:H5P-528; Resource:Eca5fe9c-1ecc-457e-8e8e-0c85560518cc; Resource:04b3bf40-1488-4d3c-9255-3d21e49688d3; Resource:H5P-369; Resource:B5efc1b9-af8c-4dd5-9b6b-0e326a150b9e
|Related To Theme=Theme:639528ea-d2c2-4565-8b44-15bb9646f74b
}}

{{Tags
|Involves=Ana Marušić; Dora Pejdo; Ivan Buljan
}}

Revision as of 11:33, 19 February 2026

Improving reproducibility in science

What is this about?

The iRISE (Improving Reproducibility in Science in Europe) project is a Horizon Europe initiative aimed at strengthening the reproducibility and integrity of research across disciplines. It brings together stakeholders from social sciences, natural sciences, and biomedical sciences to identify, prioritise, and implement practices and practical tools that enhance research quality. Grounded in the principles of Responsible Research and Innovation (RRI), iRISE works to ensure that interventions to improve reproducibility are evidence-based, fit for purpose, and tailored to the needs of different stakeholder communities.

Within iRISE, our work focuses on stakeholder engagement and the prioritisation of interventions to improve reproducibility. Building on a comprehensive scoping review (WP2), we use a structured Delphi consultation process to reach cross-disciplinary consensus on which practices and tools should be adopted directly and which require adaptation before implementation. The Delphi method involves iterative rounds of expert consultation, enabling participants to review anonymised group feedback and refine their responses until consensus is achieved. This approach ensures transparency, inclusivity, and community alignment in setting priorities.

Here on The Embassy of Good Science, we share the Delphi process outputs as part of a living, community-informed knowledge base. The data presented includes two priority lists: prioritised interventions and reproducibility measures. By making these results openly available, we aim to support ongoing dialogue, encourage community contribution, and facilitate the uptake, adaptation, and continuous improvement of practices that strengthen research reproducibility across disciplines and stakeholder groups.

For whom is this important?

Academic administrators and research leaders; Academic institutions; Academics and researchers; All stakeholders engaged in science policy; All stakeholders in research; All stakeholders in science communication; All stakeholders in science funding; All stakeholders in science implementation; All stakeholders in science oversight; Anyone interested in the broader culture of research integrity; Authors; Collaborating researchers; Computer scientists; Data managers and research infrastructure teams; ENRIO member organisations; EU policymakers and legislators; EU policymakers, regulators, and standardisation bodies; Early-career researchers (PhD students, post-docs); Early-career researchers and students; Editors; Educational institutions and research organisations; Ethics and integrity bodies; Funders; Funding Agencies and EU Research Programs; Funding agencies and research integrity officers; Funding bodies and policymakers; Funding institutions; Government science and innovation policy makers; Health and Life Scientists; Health researchers and clinical research staff; Institutional Publishing Service Providers (IPSPs); Journal editors and academic publishers; Journal editors and scholarly societies; Journal publishers; Open Science Coordinators and Research Managers; Open access advocates and researchers; Peer-reviewers; PhD students, trainees, and visiting researchers; Policy Makers and Funding Agencies; Policy Makers and Science Governance Bodies; Policy makers and regulatory authorities; Prospective Research Integrity Trainers; Publishers, journals, and editors; Research Administrators; Research Funding Organisations (RFOs); Research Integrity Officers; Research Performing Organisations (RPOs); Research partners in the Global South and North; Researchers, Research Ethics Communities and Research Integrity Offices; Science Advisory Bodies; Scientists, researchers and academic institutions; Senior researchers, Supervisors; Universities and research institutions responsible for integrity frameworks; research leaders; researchers in collaborative projects; research institutions; research institutes; students, researchers, and research ethics reviewers

What are the best practices?

Reproducibility measures (score on the 1–10 scale; % agreement):

- Methodological quality (9.03) - 89.04%

- Reporting quality (9.00) - 87.67%

- Code and data availability and re-use (8.66) - 84.93%

- Computational reproducibility (8.52) - 76.71%

- Transparency of research plan (8.47) - 73.97%

- Reproducible workflow practices (8.34) - 73.97%

- Trial registration (8.21) - 78.08%

- Materials availability and re-use (8.16) - 71.23%

Prioritised interventions (score on the 1–10 scale; % agreement):

- Data management training (8.52) - 75.34%

- Data quality checks/feedback (8.33) - 72.60%

- Statistical training (8.33) - 71.23%

- Data sharing policy/guideline (8.22) - 72.60%

- Protocol/trial registration (8.15) - 73.97%

- Reproducible code/analysis training (8.11) - 71.23%

In Detail

Round 1

In the first round, the panelists scored the reproducibility measures and interventions on a 10-point Likert scale (1–10) and could also comment on their scores. Items scoring 8–10 with at least 70% agreement were added to the priority list; items scoring 1–3 with at least 70% agreement were discarded. Items that scored 4–7 with 70% agreement were reconsidered in the second round.
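The round-1 decision rule above can be sketched in a few lines of Python. This is an illustrative sketch only: it assumes "agreement" means the share of panelists whose scores fall in the relevant band, and the example scores are hypothetical, not actual iRISE panel data.

```python
def classify_item(scores):
    """Apply the Delphi round-1 rule to one item's 1-10 panel scores.

    Assumption: 'agreement' is the fraction of panelists scoring in the
    same band (8-10 to prioritise, 1-3 to discard); items reaching
    neither threshold go back to the panel in round 2.
    """
    n = len(scores)
    high = sum(1 for s in scores if 8 <= s <= 10) / n
    low = sum(1 for s in scores if 1 <= s <= 3) / n
    if high >= 0.70:
        return "priority list"       # scored 8-10 with >=70% agreement
    if low >= 0.70:
        return "discarded"           # scored 1-3 with >=70% agreement
    return "revisit in round 2"      # no consensus yet

# Hypothetical example: 8 of 10 panelists score the item in the 8-10 band.
print(classify_item([9, 8, 10, 9, 8, 9, 10, 8, 6, 5]))  # -> priority list
```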

Round 2

The panelists reviewed the reproducibility measures and interventions that scored 4–7 with 70% agreement. The rankings and anonymised comments from the first round were shared to help participants reassess their scores.

Final Round

The final round consisted of an online meeting with eight selected panelists (two researchers, two editors, two publishers, one funder, and one policymaker). During this session, the participants revisited the highest-scoring interventions that had not reached consensus in previous rounds. The panel was then asked to review the ranking order of the two prioritised lists. After the final round, no changes were made to the prioritised lists.