Difference between revisions of "Instruction:4bc3edc0-d493-4f35-a742-e4cee5569562"

From The Embassy of Good Science
{{Instruction Step Trainee
|Instruction Step Title=Pilot 2 - Reproducibility Management Plan (RMP)
|Instruction Step Text=<span lang="EN-US">The [https://osf.io/pn27g Reproducibility Management Plan (RMP) Pilot] aims to create a prototype of key thematic subjects and questions that will serve as the starting point to support reproducibility at the planning stage of research. Work involves defining what an RMP is, integrating it into the ARGOS service, and testing its effectiveness with feedback from the community. The pilot addresses researchers, beneficiaries and funders for its adoption.</span>
 
}}
 
{{Instruction Step Trainee
|Instruction Step Title=Pilot 4 - The Reproducibility Checklist
|Instruction Step Text=<span lang="EN-US">The [https://github.com/GESIS-Methods-Hub/guidelines-for-methods/blob/main/README-template.md Reproducibility Checklist for Computational Social Science Research] provides a simple, structured framework to enhance the reproducibility of computational methods shared on [http://methodshub.gesis.org/ Methods Hub]. Designed for minimal effort and maximum usability, the checklist helps researchers document and share their data and code in a consistent, verifiable way that fosters trust and collaboration within the social science community. This pilot contributes to more efficient, transparent, and reproducible computational social science by integrating practical usability with clear documentation standards.</span>
 
}}
 
{{Instruction Step Trainee
|Instruction Step Title=Pilot 8 - An Editorial Reference Handbook for Reproducibility and FAIRness
|Instruction Step Text=<div>
<span lang="EN-US">The [https://publishers.fairassist.org/ <u>Editorial Reference Handbook</u>] informs and assists journals in operationalising a set of checks necessary to make the data underlying published research findings more findable, accessible, interoperable and reusable (FAIR) and so underpin reproducibility. The Handbook primarily targets in-house staff managing manuscripts, but it will also benefit reviewers, authors and even those providing services to publishers by making the fundamental checks and requirements transparent and understandable.</span>
</div><div>
<span lang="EN-US">The Handbook includes three interlinked components: a checklist of 12 elements, a flowchart that outlines the ideal internal manuscript submission workflow (who should perform the checks and when), and guidance to help users implement the checks effectively, with definitions and tips.</span>
</div><div>
<span lang="EN-US">The Handbook fills a gap: no common guidance existed on the practical implementation of these checks across a complex publishing workflow and the variety of individuals and teams involved. This pilot saw the participation of more than 30 individuals from more than 20 journals of major publishers: CUP, Cell Press, EMBO Press, Taylor & Francis, GigaScience Press, OUP, PLOS, Springer Nature.</span>
</div>
 
}}

{{Instruction Remarks Trainee}}

Revision as of 08:40, 9 October 2025

New Tools and Services for Reproducibility

Instructions for: Trainee, Trainer
Related Initiative
Goal
TIER2 aims to better understand the causes, consequences and possible solutions of perceived poor levels of reproducibility of research across research contexts. With a focus on social, life, and computer sciences, as well as research publishers and funders, the project aims to increase awareness, build capacity, and propose innovative solutions sensitive to varied research cultures. Central to its approach are eight Pilot activities designed to develop, implement, and evaluate new reproducibility-related tools and practices.
Duration (hours)
1
For whom is this important?
1
Pilot 1 - Decision Aid

TIER2's Decision Aid clarifies the meaning, relevance, and feasibility of ‘reproducibility’ for researchers, helping them identify which type of reproducibility is relevant to their research and what they must consider in judging how feasible that reproducibility would be for them.

2
Pilot 2 - Reproducibility Management Plan (RMP)

The Reproducibility Management Plan (RMP) Pilot aims to create a prototype of key thematic subjects and questions that will serve as the starting point to support reproducibility at the planning stage of research. Work involves defining what an RMP is, integrating it into the ARGOS service, and testing its effectiveness with feedback from the community. The pilot addresses researchers, beneficiaries and funders for its adoption.

3
Pilot 3 - Reproducible Workflows

Reproducible Workflows has adapted the SCHEMA open-source platform for reproducible workflows in life and computer sciences by leveraging software containerisation, workflow description languages (CWL, Snakemake), and experiment packaging specifications (RO-Crate), with particular emphasis on machine learning in computer science.
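The experiment-packaging step can be illustrated with a minimal sketch. The snippet below writes an RO-Crate 1.1 style `ro-crate-metadata.json` that registers one workflow file as part of a packaged experiment; the file names and descriptions are invented for this example and are not taken from the SCHEMA platform.

```python
import json

# Illustrative sketch only: a minimal RO-Crate 1.1 metadata document.
# "workflow.smk" and the dataset name are assumptions for this example.
crate = {
    "@context": "https://w3id.org/ro/crate/1.1/context",
    "@graph": [
        {
            # The metadata descriptor points at the root dataset.
            "@id": "ro-crate-metadata.json",
            "@type": "CreativeWork",
            "conformsTo": {"@id": "https://w3id.org/ro/crate/1.1"},
            "about": {"@id": "./"},
        },
        {
            # The root dataset lists the packaged files.
            "@id": "./",
            "@type": "Dataset",
            "name": "Example reproducible ML experiment",
            "hasPart": [{"@id": "workflow.smk"}],
        },
        {
            "@id": "workflow.smk",
            "@type": ["File", "SoftwareSourceCode"],
            "name": "Snakemake workflow describing the analysis steps",
        },
    ],
}

with open("ro-crate-metadata.json", "w", encoding="utf-8") as fh:
    json.dump(crate, fh, indent=2)
```

Packaging the workflow description alongside such machine-readable metadata is what lets a crate be rediscovered and re-run independently of the original environment.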

4
Pilot 4 - The Reproducibility Checklist

The Reproducibility Checklist for Computational Social Science Research provides a simple, structured framework to enhance the reproducibility of computational methods shared on Methods Hub. Designed for minimal effort and maximum usability, the checklist helps researchers document and share their data and code in a consistent, verifiable way that fosters trust and collaboration within the social science community. This pilot contributes to more efficient, transparent, and reproducible computational social science by integrating practical usability with clear documentation standards.
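As a hedged illustration of what such a checklist makes checkable, the sketch below verifies a repository directory against a few commonly expected files. The item names are generic conventions, not the pilot's official checklist.

```python
from pathlib import Path

# Illustration only: generic checklist items, not the official
# Reproducibility Checklist for Computational Social Science Research.
CHECKLIST = ("README.md", "LICENSE", "requirements.txt")

def check_repository(repo: str) -> dict[str, bool]:
    """Report which checklist items are present in the repository."""
    root = Path(repo)
    return {item: (root / item).exists() for item in CHECKLIST}

# Example: a repository containing only a README fails two checks.
demo = Path("demo_repo")
demo.mkdir(exist_ok=True)
(demo / "README.md").write_text("# Demo\n", encoding="utf-8")
report = check_repository("demo_repo")
```

Even a simple presence check like this gives reviewers a consistent, low-effort starting point before looking at the code itself.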

5
Pilot 5 - The Reproducibility Promotion Plan for Funders (RPP)

The Reproducibility Promotion Plan for Funders (RPP) has developed a policy template with recommendations for funders to foster reproducible practices in the research they fund, across three key areas of funding work: evaluation and monitoring, policy and definitions, and incentives. The RPP provides actionable recommendations and best-practice examples that funders and funding institutions can adapt to meet their specific needs.

All drafts of the RPP can be found on OSF.

(Image: RPP4F One Pager)

6
Pilot 6 - The Reproducibility Monitoring Dashboard

The Reproducibility Monitoring Dashboard hosts tools that enable funding agencies to track and monitor the reusability of research artifacts across various projects, programs, topics, and disciplines. This auto-generated dashboard assesses the impacts of policies related to data and code sharing.  
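The kind of metric such a dashboard might surface can be sketched as follows. The record fields and programme names below are invented for illustration; the dashboard's real data model is not described on this page.

```python
from collections import defaultdict

# Invented example records; field names are assumptions for illustration.
projects = [
    {"program": "Programme A", "data_shared": True,  "code_shared": True},
    {"program": "Programme A", "data_shared": False, "code_shared": True},
    {"program": "Programme B", "data_shared": True,  "code_shared": True},
]

def sharing_rate_by_program(rows):
    """Fraction of projects per programme that shared both data and code."""
    totals = defaultdict(int)
    compliant = defaultdict(int)
    for row in rows:
        totals[row["program"]] += 1
        if row["data_shared"] and row["code_shared"]:
            compliant[row["program"]] += 1
    return {p: compliant[p] / totals[p] for p in totals}

rates = sharing_rate_by_program(projects)
```

Aggregating per programme (or per topic or discipline) is what lets a funder compare sharing rates before and after a policy change.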

7
Pilot 7 - Editorial Workflows to Increase Data Sharing

This tool aims to increase data sharing in published work. Data sharing is an important building block for reproducibility and transparency, but current rates of sharing are low.

8
Pilot 8 - An Editorial Reference Handbook for Reproducibility and FAIRness

The Editorial Reference Handbook informs and assists journals in operationalising a set of checks necessary to make the data underlying published research findings more findable, accessible, interoperable and reusable (FAIR) and so underpin reproducibility. The Handbook primarily targets in-house staff managing manuscripts, but it will also benefit reviewers, authors and even those providing services to publishers by making the fundamental checks and requirements transparent and understandable.

The Handbook includes three interlinked components: a checklist of 12 elements, a flowchart that outlines the ideal internal manuscript submission workflow (who should perform the checks and when), and guidance to help users implement the checks effectively, with definitions and tips.

The Handbook fills a gap: no common guidance existed on the practical implementation of these checks across a complex publishing workflow and the variety of individuals and teams involved. This pilot saw the participation of more than 30 individuals from more than 20 journals of major publishers: CUP, Cell Press, EMBO Press, Taylor & Francis, GigaScience Press, OUP, PLOS, Springer Nature.

Steps

Other information
