This study investigates the small‐sample performance of meta‐regression methods for detecting and estimating genuine empirical effects in research literatures tainted by publication selection. Publication selection exists when editors, reviewers, or researchers prefer statistically significant results. Meta‐regression methods are found to be robust to publication selection. Even when a literature is dominated by large and unknown misspecification biases, precision‐effect testing and joint precision‐effect and meta‐significance testing provide viable strategies for detecting genuine empirical effects. Publication bias is greatly reduced by combining two biased estimates: the estimated meta‐regression coefficient on precision (1/Se) and the unadjusted average effect.
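The precision‐effect test mentioned above can be sketched as a meta‐regression of each study's reported t‐statistic on its precision (1/Se), where the coefficient on precision estimates the genuine effect and the intercept reflects selection (funnel) asymmetry. The sketch below is a minimal illustration under assumed simulated data; the variable names, sample sizes, and data‐generating values are the author's assumptions for demonstration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated literature: a genuine effect of 0.5, with each
# study reporting an estimate whose noise scales with its standard error.
n = 200
se = rng.uniform(0.05, 0.5, n)          # assumed range of study standard errors
effect = 0.5 + rng.normal(0.0, se)      # reported effect estimates

# Precision-effect meta-regression: t_i = b0 + b1 * (1/Se_i) + v_i
# b1 (coefficient on precision) estimates the genuine empirical effect;
# b0 captures asymmetry consistent with publication selection.
t = effect / se
precision = 1.0 / se
X = np.column_stack([np.ones(n), precision])
(b0, b1), *_ = np.linalg.lstsq(X, t, rcond=None)
```

In this no‐selection simulation, b1 should recover a value near the true effect of 0.5, while b0 stays near zero; simulating selective reporting would inflate b0.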