Day 1: Keynote lectures by Ulrich Dirnagl and Daniel Quintana
Try This at Home: Rethinking Good Scientific Practice
Spectacular cases, such as the massive falsification of data by the Dutch psychologist Diederik Stapel, or plagiarism in the theses of several German ministers, have focused public attention on misconduct in science. The term 'Good Scientific Practice' (GSP) is often evoked in this context as a normative framework intended to recognize and prevent misconduct. While fraud in research may severely distort the scientific record, it fortunately appears to be rather rare. There is also no evidence that GSP norms are effective in preventing individual researchers from cheating. On the other hand, we have clear evidence from meta-research that in many fields there are substantial weaknesses in the planning, conduct, analysis, and reporting of research. Low internal and external validity as well as low statistical power appear to produce high rates of false positives and to inflate effect sizes unrealistically. Not surprisingly, then, the majority of scientists believe that we are in the midst of a 'reproducibility crisis'. The immense proliferation of research outputs, combined with increasing methodological complexity and the growing size of data sets, greatly complicates the sharing, evaluation, and synthesis of high-quality evidence. At the same time, non-publication of results leads to duplicative research and deprives medical decision-makers of the totality of evidence. Central to these problems is the way institutions incentivize researchers and advance their careers. This creates immense pressure to publish in top journals and accumulate third-party funding, while there is almost no focus on the content of publications or the robustness or value of research outputs. In my view, GSP worthy of the term must consider not only extreme behaviours like fraud, but also the 'normal', everyday conduct of research.
I strongly believe that even a minor shift in our scientific practices towards greater robustness, reliability, and reproducibility will have a much bigger positive overall impact on scientific output than the complete eradication of misconduct, which is impossible anyway. In my talk I will highlight some 'try this at home' GSP practices for students and researchers alike, along the lines of the extended GSP concept outlined above. However, without matching action by funders, journals, and institutions, the impact of such activities by individual researchers will remain limited.
An important element of the research process is getting feedback on your work. This can occur at a micro level, in terms of individual units of work, but also at a macro level, in terms of broad research programs. Under the traditional system, in which citations and policy implementations are used as a proxy for impact, it can take years to receive feedback that your work is useful and correct. While transparent research practices will help increase reproducibility and restore trust in the social and life sciences, an underappreciated by-product of this emerging approach is that transparency facilitates rapid feedback on your work. In this talk, I will demonstrate how I and others have used open science practices to get fast feedback. First, I will cover the open science platforms that you can use to deposit your preprints, data, and code. Second, I will highlight how social media platforms can be used as a tool to direct attention to your work and keep up to date with transparent scientific practices. Finally, I will address some common misunderstandings surrounding open science platforms and the use of social media for research. By using this rapid feedback approach, scientists can more quickly find errors in their work and evaluate whether they should persevere with a line of research or pivot to other research areas.
Day 2: Keynote lecture by Candice Morey
Researchers are increasingly pressed by their institutions, funders, and journals for greater transparency. We are asked to make data and materials publicly available and to document our hypotheses before carrying out analyses, activities which may (or may not) already have been part of a lab's workflow. These requests are often seen as burdensome. I argue that integrating transparent practices into your work - in particular, pre-registration and curating materials and data early in a project with a view to publicly releasing them later - saves time and effort in the long run, reduces otherwise inevitable errors in your analyses, and also improves the experience for students working with you on research projects (yours as well as theirs). I will describe my own experiences adopting transparent practices, including lessons I have learned implementing them in my lab's workflow.