On this page I write about open science, data sharing, and attempts to improve the replicability of research findings, among other topics. Please let me know if you would like to add examples!
Is science broken? There have been recent debates about whether science in general, and the social sciences in particular, are broken. Here I link to media articles and events addressing this question:
- An article in Nature about the UCL event “Is Science broken?”
- A German-language article in Die Zeit about the replicability of findings in the social sciences / psychology
- Open Science Framework. A tool for researchers who want to make their research and data transparent and known to others, with a special focus on networking for the sake of replication studies. Free and open source. For instance, OSF was the platform used by the “Many Labs Replication Project”, which connected researchers and labs to systematically investigate variation in replicability. LINK to OSF. LINK to the many labs replication project
- Open-access.net. Mission statement according to the page: “The information platform open-access.net satisfies the growing need for information on open access (OA) by gathering and bundling this information and processing it for various target groups and scenarios.” Available in German and English. LINK to Open-access.net
- Center for Open Science. “COS is a non-profit technology company providing free and open services to increase inclusivity and transparency of research. COS supports shifting incentives and practices to align more closely with scientific values.” LINK to the Center for Open Science
- JUNQ: The Journal of Unsolved Questions. Mission: “establishing the publication of null-results as an important cornerstone for the advancement of knowledge and scientific understanding in all disciplines thus contributing to overcome biases and fraud in research.” LINK to the JUNQ
- The Journal of Articles in Support of the Null Hypothesis. Similar in scope to JUNQ, this journal’s mission is: “We seek to change that by offering an outlet for experiments that do not reach the traditional significance levels (p < .05). Thus, reducing the file drawer problem, and reducing the bias in psychological literature. Without such a resource researchers could be wasting their time examining empirical questions that have already been examined. We collect these articles and provide them to the scientific community free of cost.” LINK to jasnh
- The Rejected Papers Project. Shares papers by Jaan Valsiner that were rejected by journals, letting readers decide for themselves whether they find them insightful. After endless discussions with colleagues about the often arbitrary wording and decisions of peer reviewers, it seems to me that this idea deserves to be spread. Here is the LINK.
- Self-Publishing. And this idea has been spread! See here a very interesting discussion of the problems with current publishing practices, and of self-publication as an alternative way to make your own research more transparent and known. LINK to Micah Allen’s text “Birth of a new school: How self-publication can improve research”.
- An interesting article about the replicability debate, which concludes: “Science Isn’t Broken. It’s just a hell of a lot harder than we give it credit for.”
Disclaimer: I bear no responsibility for the accuracy, legality, or content of the external sites or of subsequent links. Contact the external site for answers to questions regarding its content.