Kavli Affiliate: Cheng Peng
| First 5 Authors: Andrew Mullhaupt, Cheng Peng
| Summary:
The Neyman-Pearson region of a simple binary hypothesis testing problem is the set of
points whose coordinates represent the false positive rate and false negative
rate of some test. The lower boundary of this region is given by the
Neyman-Pearson lemma and is, up to a coordinate change, equivalent to the
optimal ROC curve. We establish a novel lower bound for the boundary in terms
of any $f$-divergence. Since the bounds generated by the hockey-stick
$f$-divergences characterize the Neyman-Pearson boundary, this bound is best
possible. In the case of KL divergence, this bound improves Pinsker’s
inequality. Furthermore, we obtain a closed-form refined upper bound for the
Neyman-Pearson boundary in terms of the Chernoff $\alpha$-coefficient. Finally,
we present methods for constructing pairs of distributions that can
approximately or exactly realize any given Neyman-Pearson boundary.
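
For orientation, a brief sketch in standard notation (an editorial gloss; the
paper's own conventions and constants may differ). With $P$ the null and $Q$
the alternative, a (possibly randomized) test $\varphi$ has false positive rate
$\alpha = \mathbb{E}_P[\varphi]$ and false negative rate
$\beta = 1 - \mathbb{E}_Q[\varphi]$, and the hockey-stick divergence is
\[
E_\gamma(Q \,\|\, P) \;=\; \sup_{A}\bigl(Q(A) - \gamma\,P(A)\bigr),
\qquad \gamma \ge 0 .
\]
Since $\mathbb{E}_Q[\varphi] - \gamma\,\mathbb{E}_P[\varphi] \le
E_\gamma(Q \,\|\, P)$ for every test, each $\gamma$ gives a supporting-line
lower bound on the boundary,
\[
\beta(\alpha) \;\ge\; 1 - \gamma\,\alpha - E_\gamma(Q \,\|\, P),
\]
and, by the Neyman-Pearson lemma and convexity of the region, the supremum
over $\gamma \ge 0$ recovers the boundary exactly; this is the sense in which
the hockey-stick bounds are tight. Pinsker's inequality,
$\mathrm{TV}(P, Q) \le \sqrt{\mathrm{KL}(P \,\|\, Q)/2}$, is the classical
KL-divergence bound that the abstract says is improved.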
| Search Query: ArXiv Query: search_query=au:"Cheng Peng"&id_list=&start=0&max_results=3