Questionable Research Practices in Analysis and Reporting

From The Embassy of Good Science


What is this about?

In a list of major and minor research misbehaviors developed collaboratively by a group of research integrity experts, the research phase ‘Reporting’ (which covers both the analysis and the publication of results) contains the most items.[1] The potential for misbehavior in this phase is amplified by the vast number of decisions researchers must make during analysis and reporting.

  1. Bouter, L.M., Tijdink, J., Axelsen, N., Martinson, B.C. and Ter Riet, G., 2016. Ranking major and minor research misbehaviors: results from a survey among participants of four World Conferences on Research Integrity. Research Integrity and Peer Review, 1(1), p.17.

Why is this important?

Integrity in the analysis and reporting of results is essential for a full and accurate understanding of your data. Misbehaviors related to analysis and reporting include:

  1. Report on data-driven hypotheses without disclosure [‘HARKing’ ‐ Hypothesizing After Results are Known ‐ typically with a view to making results appear more spectacular (‘Chrysalis effect’)]
  2. Delete data before performing data analysis without disclosure   
  3. Selectively delete data, modify data or add fabricated data after performing initial data‐analyses  [in other words: falsification or fabrication of data]   
  4. Perform data‐analyses not stated in the study protocol without disclosure  [or in predefined data‐analysis plan – also called ‘Significance chasing’, ‘P-hacking’, ‘data dredging’,  ‘fishing expedition’ or explorative subgroup analyses]   
  5. Report an incorrect downwardly rounded p‐value [e.g. by reporting a p value of .054 as being less than .05]   
  6. Not report all study protocol‐stipulated results  [in the aggregate of all published reports on the study at issue]   
  7. Not publish a valid ‘negative’ study  [in a form that is publicly available or accessible behind a paywall (article, report, website etc.)]   
  8. Report an unexpected finding as having been hypothesized from the start   
  9. Conceal results that contradict your earlier findings or convictions   
  10. Not report clearly relevant details of study methods  
  11. Not report replication problems   
  12. Selectively cite to enhance your own findings or convictions   
  13. Selectively cite to please editors, reviewers or colleagues   
  14. Selectively cite or cite your own work to improve citation metrics  [e.g. Impact Factor, H‐index]   
  15. Let your convictions influence the conclusions substantially   
  16. Insufficiently report study flaws and limitations   
  17. Spread study results over more papers than needed [‘salami slicing’]   
  18. Duplicate publication without disclosure   
  19. Re‐use of previously published data without disclosure [which may lead to double counting in meta‐analyses]   
  20. Modify the results or conclusions of a study due to pressure of a sponsor  [commercial or not‐for‐profit funder of the study]   
  21. Failure to disclose a sponsor of the study   
  22. Failure to disclose a relevant financial or intellectual conflict of interest  [in publications, when reviewing grant proposals, or evaluating persons or institutions]  
  23. Handle existing conflicts of interest inadequately   
  24. Communicate results to the general public before a peer reviewed publication is available  
  25. Deliberately communicate findings inaccurately in the media or during presentations   
  26. Make no clear distinction between personal views and professional comments

(List from Bouter et al. 2016[1])
  1. Bouter, L.M., Tijdink, J., Axelsen, N., Martinson, B.C. and Ter Riet, G., 2016. Ranking major and minor research misbehaviors: results from a survey among participants of four World Conferences on Research Integrity. Research Integrity and Peer Review, 1(1), p.17.

For whom is this important?

Other information
