{{Theme
|Theme Type=Principles & Aspirations
|Has Parent Theme=Theme:Eea53f3b-1bf1-4660-8c04-5e2e9aaebdb5
|Title=Prioritising interventions and reproducibility measures to improve research reproducibility: a Delphi consultation method
|Is About=The '''iRISE (Improving Reproducibility in Science in Europe)''' project is a Horizon Europe initiative aimed at strengthening the reproducibility and integrity of research across disciplines. It brings together stakeholders from social sciences, natural sciences, and biomedical sciences to identify, prioritise, and implement practices and practical tools that enhance research quality. Grounded in the principles of Responsible Research and Innovation (RRI), iRISE works to ensure that interventions to improve reproducibility are evidence-based, fit for purpose, and tailored to the needs of different stakeholder communities.

Within iRISE, our work focuses on stakeholder engagement and the prioritisation of interventions to improve reproducibility. Building on a comprehensive scoping review (WP2), we use a structured Delphi consultation process to reach cross-disciplinary consensus on which practices and tools should be adopted directly and which require adaptation before implementation. The Delphi method involves iterative rounds of expert consultation, enabling participants to review anonymised group feedback and refine their responses until consensus is achieved. This approach ensures transparency, inclusivity, and community alignment in setting priorities.

Here on The Embassy of Good Science, we share the Delphi process outputs as part of a living, community-informed knowledge base. The data presented includes two priority lists: prioritised interventions and reproducibility measures. By making these results openly available, we aim to support ongoing dialogue, encourage community contribution, and facilitate the uptake, adaptation, and continuous improvement of practices that strengthen research reproducibility across disciplines and stakeholder groups.
|Important For=All stakeholders; Early-career researchers (PhD students, post-docs); Editors; Funders; Institutions; Policymakers; Public; Publishers; Reproducibility networks; Research Institutions and Universities; Research Performing Organisations (RPOs); Researchers
|Has Best Practice=Items were rated on a 10-point Likert scale. Scores of 8–10 were classified as '''high priority''', and consensus was defined a priori as at least 70% of panellists assigning a score within this high-priority range. A minimal illustration of this scoring rule follows the two lists below.
<u>Reproducibility measures:</u>
#Methodological quality (9.03)
#Reporting quality (9.00)
#Code and data availability and re-use (8.66)
#Computational reproducibility (8.52)
#Transparency of research plan (8.47)
#Reproducible workflow practices (8.34)
#Trial registration (8.21)
#Materials availability and re-use (8.16)

| − | |||
| − | |||
| − | |||
| − | |||
| − | |||
<u>Interventions:</u>
#Data management training (8.52)
#Data quality checks/feedback (8.33)
#Statistical training (8.33)
#Data sharing policy/guideline (8.22)
#Protocol/trial registration (8.15)
#Reproducible code/analysis training (8.11)
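
To make the scoring rule concrete, here is a minimal Python sketch of how an item's consensus status and mean score could be computed, assuming the parenthesised numbers in the lists above are mean panel scores. The item names, ratings, and helper functions (<code>is_high_priority</code>, <code>mean_score</code>) are hypothetical illustrations, not the actual iRISE panel data or analysis code.
<syntaxhighlight lang="python">
# Minimal sketch of the consensus rule described above.
# Ratings are hypothetical, not the iRISE panel data.

def is_high_priority(ratings, threshold=0.70):
    """Consensus: at least 70% of panellists rate the item 8-10
    on the 10-point Likert scale."""
    high = sum(1 for r in ratings if 8 <= r <= 10)
    return high / len(ratings) >= threshold

def mean_score(ratings):
    return sum(ratings) / len(ratings)

# Hypothetical ratings from ten panellists for two candidate items
items = {
    "Item A": [9, 10, 8, 9, 9, 10, 8, 9, 9, 9],  # broad high-band agreement
    "Item B": [5, 8, 6, 9, 4, 7, 8, 5, 6, 7],    # no high-band consensus
}

# Items reaching consensus are kept and ordered by mean score
prioritised = {name: mean_score(r) for name, r in items.items() if is_high_priority(r)}
for name, score in sorted(prioritised.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")  # -> Item A: 9.00
</syntaxhighlight>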
| − | |||
| − | |||
| − | |||
| − | |||
| − | |||
|Has Detail='''Round 1'''

In the first round, the panellists scored the reproducibility measures and interventions on a 10-point Likert scale. Items scoring 8–10 with at least 70% agreement were added to the priority list, and items scoring 1–3 with at least 70% agreement were discarded. The panellists could also comment on their scores. Reproducibility measures and interventions that scored 4–7 with 70% agreement were revisited in the second round.

'''Round 2'''

The panellists reviewed the reproducibility measures and interventions that scored 4–7 with 70% agreement. The rankings and anonymised comments from the first round were shared to help participants reassess their scores.
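
As a rough illustration, this round-by-round triage could be expressed as follows. The band boundaries come from the description above; the function names and example ratings are hypothetical, not the project's actual analysis code.
<syntaxhighlight lang="python">
# Sketch of the Round 1 triage rules described above (hypothetical
# function names and ratings; not the actual iRISE analysis code).

def agreement(ratings, lo, hi):
    """Fraction of panellists whose score falls within the band [lo, hi]."""
    return sum(1 for r in ratings if lo <= r <= hi) / len(ratings)

def triage(ratings, threshold=0.70):
    """Round 1 outcome for a single item:
    8-10 with >=70% agreement -> added to the priority list;
    1-3  with >=70% agreement -> discarded;
    4-7  with >=70% agreement -> revisited in Round 2."""
    if agreement(ratings, 8, 10) >= threshold:
        return "priority list"
    if agreement(ratings, 1, 3) >= threshold:
        return "discarded"
    if agreement(ratings, 4, 7) >= threshold:
        return "revisit in Round 2"
    # No band reached 70%: per the text, high-scoring items without
    # consensus were revisited in the final round.
    return "no consensus"

print(triage([9, 8, 10, 9, 8, 9, 8, 10]))  # -> priority list
print(triage([5, 6, 4, 7, 5, 6, 5, 7]))    # -> revisit in Round 2
</syntaxhighlight>
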
'''Final Round'''
The final round consisted of an online meeting with eight selected panellists (two researchers, two editors, two publishers, one funder, and one policymaker). During this session, the participants revisited the highest-scoring interventions that had not reached consensus in previous rounds. The panel was then tasked with reviewing the ranking order of the two prioritised lists. After the final round, there were no changes to the prioritised lists.
}}
{{Related To
|Related To Resource=Resource:Eca5fe9c-1ecc-457e-8e8e-0c85560518cc; Resource:04b3bf40-1488-4d3c-9255-3d21e49688d3; Resource:9b3be28c-343a-41aa-8313-232e6eebc113; Resource:H5P-528; Resource:B5efc1b9-af8c-4dd5-9b6b-0e326a150b9e; Resource:H5P-369
|Related To Theme=Theme:639528ea-d2c2-4565-8b44-15bb9646f74b
}}
{{Tags
|Involves=Dora Pejdo; Ivan Buljan; Ana Marušić
}}