Nießen, D., Poppa, C., Daikeler, J., Silber, H., Weiß, B., & Richter, D. (2025). Actor-driven risk factors of publication bias: Opening the file drawer of two probabilistic panel surveys. PsyArXiv. https://doi.org/10.31234/osf.io/phk3a_v1
A new study examining 178 successful study submissions to two major German probabilistic panels between 2013 and 2021 sheds light on the persistent problem of publication bias in science. The researchers compared the hypotheses and exploratory analyses stated in the original submissions with how they were subsequently presented in published articles.
The findings reveal that publication bias—whereby results are selectively reported or left unpublished depending on their direction or strength—remains a significant threat to the robustness of scientific research. However, the study identified several factors linked to a lower risk of publication bias: an experimental study design, third-party funding, preregistration of hypotheses, and a disciplinary focus on economics.
The authors concluded that measures such as preregistration, funding incentives, and rigorous peer review can help reduce publication bias. The research points to opportunities for systemic reform in how scientific findings are reported, reviewed, and published, urging researchers, funding agencies, and journals to take active roles in safeguarding research integrity.
Poppa, C., Nießen, D., Daikeler, J., Silber, H., Weiß, B., & Richter, D. (2025). The tip of the iceberg? Insights into the prevalence of publication bias in two probability-based academic panels. PsyArXiv. https://doi.org/10.31234/osf.io/bj3g9_v1
A new study has unveiled significant publication bias in the social, behavioral, and economic sciences, revealing that a large proportion of research remains unpublished or selectively reported. Publication bias occurs when statistically significant results are given preference over nonsignificant or null findings, on the assumption that such results are more valuable and publishable. This often leads to what is known as the “file drawer bias,” where studies with null results are less likely to be published.
Researchers investigated 178 studies submitted to two major German probability-based research panels (SOEP-IS and the GESIS Panel), comparing the original hypotheses with those presented in subsequent publications. More than six years after the data became available, 43.8% of the studies remained unpublished, 44.4% had resulted in at least one peer-reviewed article, and 11.8% had been published only as gray literature.
Alarmingly, fewer than one third of all submitted hypotheses made it into published articles. Moreover, more than 80% of published hypotheses were formulated ad hoc after data collection, and over half of the studies published only such ad-hoc hypotheses, a potential sign of HARKing (Hypothesizing After the Results are Known). More than 70% of published hypotheses were supported by the data, pointing to persistent incentives to report only positive findings.
These results highlight ongoing challenges in research transparency and integrity, underscoring the need for reforms in the scientific publication process to address file drawer bias, selective reporting, and the widespread use of post-hoc hypotheses.