Real Effect or Bias? Best Practices for Evaluating the Robustness of Real-World Evidence through Quantitative Sensitivity Analysis for Unmeasured Confounding

Kavli Affiliate: Xiang Zhang

| First 5 Authors: Douglas Faries, Chenyin Gao, Xiang Zhang, Chad Hazlett, James Stamey

| Summary:

The assumption of no unmeasured confounders is critical but unverifiable in
causal inference, yet quantitative sensitivity analyses to assess the
robustness of real-world evidence remain underutilized. This is likely due in
part to the complexity of implementation and to the specific, often
restrictive data requirements of each method. With the advent of broadly
applicable sensitivity analysis methods, which do not require identifying a
specific unmeasured confounder, together with publicly available code for
their implementation, roadblocks to broader use are diminishing. To spur
greater application, we present best-practice guidance for addressing the
potential for unmeasured confounding at both the design and analysis stages,
including a set of framing questions and an analytic
toolbox for researchers. The design-stage questions walk the researcher
through steps that evaluate the potential robustness of the design while
encouraging the gathering of additional data to reduce uncertainty due to
potential confounding. At the analysis stage, the questions guide researchers
in quantifying the robustness of the observed result, providing a clearer
indication of the robustness of their conclusions. We
demonstrate the application of the guidance using simulated data based on a
real-world fibromyalgia study, applying multiple methods from our analytic
toolbox for illustration purposes.
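The paper's specific toolbox is not listed in this summary, but one widely used sensitivity measure of the broadly applicable kind described above, requiring no specific named confounder, is the E-value of VanderWeele and Ding. A minimal sketch, assuming the effect estimate is on the risk-ratio scale:

```python
import math

def e_value(rr: float) -> float:
    """E-value for an observed risk ratio: the minimum strength of
    association (risk-ratio scale) that an unmeasured confounder would
    need with both treatment and outcome to explain away the result."""
    rr = rr if rr >= 1 else 1.0 / rr  # symmetric for protective effects
    return rr + math.sqrt(rr * (rr - 1))

# Example: an observed risk ratio of 2.0 gives an E-value of about 3.41,
# i.e., a confounder would need associations of RR >= 3.41 with both
# treatment and outcome to fully account for the observed effect.
print(round(e_value(2.0), 2))  # → 3.41
```

The same formula can be applied to a confidence-interval limit to ask how strong confounding would need to be to move the interval to the null.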

| Search Query: ArXiv Query: search_query=au:"Xiang Zhang"&id_list=&start=0&max_results=3

Read More