Turning a blind eye to breaches of integrity

From The Embassy of Good Science
Revision as of 09:38, 28 October 2020 by 0000-0001-7124-9282 (talk | contribs)

What is this about?

When you witness a colleague making a mistake, it can be difficult to address. It is even more difficult when the mistake is not an honest error but an intentional breach of the rules. Do you dare to take action? If so, what can you do? Or… do you turn a blind eye? And what would that mean for you?

Why is this important?

Research aims for ‘truth’, and we trust researchers to strive for it. Errors in research – whether honest mistakes or deliberate violations – hamper truth finding. When errors are dealt with, we come closer to scientific ‘truth’, such as a theory that explains what we see around us or the best treatment for a particular condition. Addressing errors also restores trust in research. When someone knows a mistake has been made and does not act on it, truth finding is compromised and trust in research erodes.

For whom is this important?

What are the best practices?

  • A zero-tolerance culture towards putative breaches of research integrity. When institutions or individuals turn a blind eye to misbehaviour, they fail to foster a culture of research integrity. A zero-tolerance culture encourages people to report suspicions of malpractice.
  • A clear reporting system, including clear procedures and access to guidance and help (e.g. ombudspersons). A scheme must be in place to handle both expressions of concern and formal allegations of misconduct.
  • Protection of whistleblowers, and clarity about the rights of both whistleblowers and those who are accused.