Difference between revisions of "Instruction:4bc3edc0-d493-4f35-a742-e4cee5569562"

From The Embassy of Good Science
 
 
{{Instruction Steps Foldout Trainee}}

{{Instruction Perspective Trainee}}
{{Instruction Step Trainee
|Instruction Step Title=Pilot 1 - Decision Aid
|Instruction Step Text=<span lang="EN-US">TIER2's Decision Aid clarifies the meaning, relevance, and feasibility of ‘reproducibility’, helping researchers identify which type of reproducibility is relevant for their research and what they need to consider to judge how feasible such ‘reproducibility’ would be for them.</span>
}}
{{Instruction Step Trainee
|Instruction Step Title=Pilot 2 - Reproducibility Management Plan (RMP)
|Instruction Step Text=<span lang="EN-US">The [https://osf.io/pn27g Reproducibility Management Plan (RMP) Pilot] aims to create a prototype of key thematic subjects and questions that will serve as the starting point for supporting reproducibility at the planning stage of research. The work involves defining what an RMP is, integrating it into the ARGOS service, and testing its effectiveness with feedback from the community. The pilot targets researchers, beneficiaries, and funders to encourage its adoption.</span><div>
Please find the integrated ARGOS tool here: <u>https://argos.openaire.eu/portal/</u>

<u><span lang="EN-US">Tutorials on OpenPlato:</span></u>

# <u>'''<span lang="EN-US">ARGOS Service for Admins</span>'''<span lang="EN-US">: https://openplato.eu/course/view.php?id=150</span></u>
# <u>'''<span lang="EN-US">ARGOS Info Pack</span>'''<span lang="EN-US">: https://openplato.eu/course/view.php?id=547</span></u>
# <u>'''<span lang="EN-US">ARGOS Service for Users</span>'''<span lang="EN-US">: https://openplato.eu/course/view.php?id=122</span></u>
</div><div>
<u><span lang="EN-US">Reports on co-creation activities: <nowiki>https://osf.io/fp7zt/</nowiki></span></u>
</div>
}}
{{Instruction Step Trainee
|Instruction Step Title=Pilot 3 - Reproducible Workflows
|Instruction Step Text=<span lang="EN-US">In this pilot, [https://schema-lab.hypatia-comp.athenarc.gr/ SCHEMA] was developed as an open-source framework supporting reproducible computational research through containerized execution, metadata capture, and experiment management. It comprises the SCHEMA API, for programmatic execution, and SCHEMA Lab, a user-friendly web interface, and provides a scalable environment in which researchers can design and run reproducible computational workflows and experiments.</span>
}}
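SCHEMA's actual data model and API are documented at the link above and are not reproduced here. Purely as a generic sketch of what "metadata capture" for a containerized run can mean, a minimal run record might pin the command, the container image, and a timestamp, and derive a stable experiment identifier from them (all field names below are hypothetical, not SCHEMA's):

```python
import hashlib
import json
from dataclasses import asdict, dataclass


@dataclass
class RunRecord:
    """Hypothetical metadata for one containerized run (not SCHEMA's model)."""

    command: str     # the command executed inside the container
    image: str       # container image reference, ideally pinned by digest
    started_at: str  # ISO-8601 UTC timestamp

    def fingerprint(self) -> str:
        """Stable hash of the run description, usable as an experiment ID."""
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()[:12]
```

Recording the image by digest rather than by mutable tag is what lets the same fingerprint correspond to the same environment when the run is repeated later.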
{{Instruction Step Trainee
|Instruction Step Title=Pilot 4 - The Reproducibility Checklist
|Instruction Step Text=<span lang="EN-US">The [https://github.com/GESIS-Methods-Hub/guidelines-for-methods/blob/main/README-template.md Reproducibility Checklist for Computational Social Science Research] provides a simple, structured framework to enhance the reproducibility of computational methods shared on [https://methodshub.gesis.org/ Methods Hub]. Designed for minimal effort and maximum usability, the checklist helps researchers document and share their data and code in a consistent, verifiable way that fosters trust and collaboration within the social science community. This pilot contributes to more efficient, transparent, and reproducible computational social science by integrating practical usability with clear documentation standards.</span>

<span lang="EN-US">The [https://github.com/GESIS-Methods-Hub/guidelines-for-methods/blob/main/README-template.md <u>checklist template</u>] is publicly available through the GitHub [https://github.com/GESIS-Methods-Hub/guidelines-for-methods <u>project repository</u>] and is accompanied by [https://github.com/GESIS-Methods-Hub/guidelines-for-methods/blob/main/guidelines.md <u>user guidance</u>] for researchers who want to submit a method. These resources explain how to apply the checklist in practice, both independently and in connection with [https://methodshub.gesis.org/ <u>Methods Hub</u>].</span>
}}
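The checklist items themselves live in the linked template. As a hedged illustration of the kind of automated spot-check a documentation checklist enables, a repository can be scanned for expected artifacts before submission (the file names below are illustrative placeholders, not items taken from the actual GESIS checklist):

```python
from pathlib import Path

# Placeholder artifact names for illustration only; the real checklist
# is defined in the GESIS-Methods-Hub README template linked above.
EXPECTED_ARTIFACTS = ("README.md", "LICENSE", "requirements.txt")


def missing_artifacts(repo: Path) -> list[str]:
    """Return the expected documentation artifacts absent from a repository."""
    return [name for name in EXPECTED_ARTIFACTS if not (repo / name).exists()]
```

A researcher could run such a check locally and resolve any reported gaps before submitting a method, keeping the manual part of the checklist focused on content rather than file bookkeeping.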
{{Instruction Step Trainee
|Instruction Step Title=Pilot 5 - The Reproducibility Promotion Plan for Funders (RPP)
|Instruction Step Text=<span lang="EN-US">The Reproducibility Promotion Plan for Funders (RPP) has developed a policy template with recommendations for funders to foster reproducible practices in the research they fund, across three key areas of funding work: evaluation and monitoring, policy and definitions, and incentives. The RPP provides actionable recommendations and best-practice examples that funders and funding institutions can adapt to meet their specific needs.</span>

All drafts of the RPP can be found on [https://osf.io/3fpbj/ OSF].

[[File:RPP4F One Pager.jpg|center|thumb]]
}}
{{Instruction Step Trainee
|Instruction Step Title=Pilot 6 - The Reproducibility Monitoring Dashboard
|Instruction Step Text=<span lang="EN-US">The Reproducibility Monitoring Dashboard hosts tools that enable funding agencies to track and monitor the reusability of research artifacts across various projects, programs, topics, and disciplines. This auto-generated dashboard assesses the impacts of policies related to data and code sharing.</span><div>
# <u><span lang="EN-US">OSF link for development materials: https://osf.io/wnvtx/</span></u>
# <u><span lang="EN-US">Dashboard prototype for EU-funded Machine Learning projects: https://app.powerbi.com/view?r=eyJrIjoiZTc0MmU1ZTktNzAyMy00ZTk1LWFkZmYtNDVmYjU2YzdhMzZhIiwidCI6IjZhZTA3NzAyLWM1ZjctNGYzOC05Yjg3LWFjYWQ2MmE3NWQ5MyIsImMiOjl9</span></u>
</div>
}}
{{Instruction Step Trainee
|Instruction Step Title=Pilot 7 - Editorial Workflows to Increase Data Sharing
|Instruction Step Text=<span lang="EN-GB">A key aspect of (computational) reproducibility is the availability of data; however, sharing research data is still not the norm across disciplines. Developed co-creatively with stakeholders from major publishers, the workflow and email template provide a low-effort way for publishers to nudge researchers towards sharing their data in journals operating under a “share upon request” policy. Documentation on the workflow and email template can be found in [https://doi.org/10.17605/OSF.IO/S7GJV TIER2 D5.2].</span>
}}
{{Instruction Step Trainee
|Instruction Step Title=Pilot 8 - An Editorial Reference Handbook for Reproducibility and FAIRness
|Instruction Step Text=<div>
<span lang="EN-US">The [https://publishers.fairassist.org/ <u>Editorial Reference Handbook</u>] helps journals operationalise the checks needed to make the data underlying published research findings more findable, accessible, interoperable, and reusable (FAIR), and thereby underpin reproducibility. The Handbook primarily targets in-house staff who manage manuscripts, but it will also benefit reviewers, authors, and even those providing services to publishers by making the fundamental checks and requirements transparent and understandable.</span>
</div><div>
<span lang="EN-US">The Handbook includes three interlinked components: a checklist of 12 elements, a flowchart that outlines the ideal internal manuscript-submission workflow (who should perform the checks, and when), and guidance that helps users implement the checks effectively, with definitions and tips.</span>
</div><div>
<span lang="EN-US">The Handbook fills a gap: no common guidance existed on the practical implementation of these checks across a complex publishing workflow and the variety of individuals and teams involved. The pilot saw the participation of more than 30 individuals from more than 20 journals of major publishers: CUP, Cell Press, EMBO Press, Taylor & Francis, GigaScience Press, OUP, PLOS, and Springer Nature.</span>
</div>
}}
 
{{Instruction Remarks Trainee}}

{{Custom TabContent Close Trainee}}

Latest revision as of 10:56, 30 October 2025

New Tools and Services for Reproducibility

Instructions for: Trainee, Trainer
Related Initiative
Goal
TIER2 aims to better understand the causes and consequences of, and possible solutions to, perceived poor levels of reproducibility of research across research contexts. With a focus on the social, life, and computer sciences, as well as research publishers and funders, the project aims to increase awareness, build capacity, and propose innovative solutions sensitive to varied research cultures. Central to its approach are eight Pilot activities designed to develop, implement, and evaluate new reproducibility-related tools and practices.
Duration (hours)
1
