Copas-Jackson-type bounds for publication bias over a general class of selection models

Kavli Affiliate: Yi Zhou

| First 5 Authors: Taojun Hu, , , ,

| Summary:

Publication bias (PB) is one of the most serious threats to the accuracy of
meta-analysis. Adjustment or sensitivity analysis based on selection models,
which describe the probability of a study being published, provides a more
objective evaluation of PB than widely used, simple graphical methods such as
the trim-and-fill method. Most existing methods rely on parametric selection
models. The Copas-Jackson bound (C-J bound) provides a worst-case bound of an
analytical form over a nonparametric class of selection models, which yields
more robust conclusions than parametric sensitivity analysis. However, the
nonparametric class of selection models in the C-J bound is restrictive: it
only covers parametric selection models that are monotonic in the standard
errors of the outcomes. The novelty of this paper is a method that constructs
worst-case bounds over a general class of selection models, weakening the
assumption of the C-J bound. We propose an efficient numerical method to obtain
an approximate worst-case bound via tractable nonlinear programming with linear
constraints. We substantiate the effectiveness of the proposed bound with
extensive simulation studies and demonstrate its applicability with two
real-world meta-analyses.
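The abstract does not spell out the optimization, so the snippet below is only a minimal sketch of the general idea of a worst-case sensitivity bound formulated as a nonlinear program with linear constraints. It is not the authors' method: the Horvitz-Thompson-style reweighting of the pooled estimate, the bounds `p_lo`, the average-publication-probability constraint `q_bar`, and the function name `worst_case_pooled_estimate` are all hypothetical simplifications chosen for illustration.

```python
# Illustrative sketch (assumptions above): bound a reweighted pooled effect
# over hypothetical per-study publication probabilities p_i, subject to
# linear box and average constraints, via SLSQP.
import numpy as np
from scipy.optimize import minimize

def worst_case_pooled_estimate(y, s, p_lo=0.2, q_bar=0.5, minimize_effect=True):
    y, s = np.asarray(y, float), np.asarray(s, float)
    w = 1.0 / s**2                              # inverse-variance weights

    def pooled(p):
        # Reweighted pooled mean; a nonlinear (ratio) function of p.
        wp = w / p
        return np.sum(wp * y) / np.sum(wp)

    sign = 1.0 if minimize_effect else -1.0
    objective = lambda p: sign * pooled(p)

    n = len(y)
    constraints = [{"type": "ineq",             # mean(p) >= q_bar, linear in p
                    "fun": lambda p: np.mean(p) - q_bar}]
    bounds = [(p_lo, 1.0)] * n                  # box constraints, also linear

    res = minimize(objective, x0=np.full(n, 0.8), method="SLSQP",
                   bounds=bounds, constraints=constraints)
    return sign * res.fun, res.x                # bound and the p attaining it

# Toy usage with made-up effect sizes and standard errors
y = [0.30, 0.25, 0.10, 0.45, 0.05]
s = [0.10, 0.15, 0.20, 0.12, 0.25]
bound, p_star = worst_case_pooled_estimate(y, s)
print(f"worst-case pooled estimate: {bound:.3f}")
```

The point of the sketch is only the structure of the problem: the objective is nonlinear in the selection probabilities while all constraints are linear, which is the kind of tractable program the abstract refers to.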

| Search Query: ArXiv Query: search_query=au:"Yi Zhou"&id_list=&start=0&max_results=3

Read More