Abstract
Consider the family of power divergence statistics based on n trials, each leading to one of r possible outcomes. This family includes the log-likelihood ratio and Pearson's statistic as important special cases. It is known that in certain regimes (e.g., when r is of order n² and the allocation is asymptotically uniform as n → ∞) the power divergence statistic converges in distribution to a linear transformation of a Poisson random variable. We complement this convergence result with explicit error bounds in the Kolmogorov (or uniform) metric, which can be applied for any values of n, r and the index parameter λ for which such a finite-sample bound is meaningful. We further use this Poisson approximation result to derive error bounds for the Gaussian approximation of the power divergence statistics.
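The abstract does not state the statistic itself, so as a reference point the sketch below implements the standard Cressie–Read parametrisation of the power divergence family in numpy, with λ as the index parameter mentioned above. The function name and the toy data are illustrative, not taken from the paper; λ = 1 recovers Pearson's statistic and the λ → 0 limit gives the log-likelihood ratio statistic.

```python
import numpy as np

def power_divergence_stat(counts, probs, lam):
    """Cressie-Read power divergence statistic for observed cell counts
    n_1, ..., n_r (summing to n) and expected cell probabilities p_1, ..., p_r:

        T_lam = 2 / (lam * (lam + 1)) * sum_j n_j * ((n_j / (n * p_j))**lam - 1)

    lam = 1 recovers Pearson's chi-squared statistic; the limit lam -> 0
    gives the log-likelihood ratio statistic (handled separately below).
    Cells with n_j = 0 contribute 0, following the convention 0 * x = 0.
    """
    counts = np.asarray(counts, dtype=float)
    probs = np.asarray(probs, dtype=float)
    n = counts.sum()
    expected = n * probs
    mask = counts > 0  # zero cells contribute nothing to the sum
    if np.isclose(lam, 0.0):
        # lam -> 0 limit: the log-likelihood ratio statistic
        return 2.0 * np.sum(counts[mask] * np.log(counts[mask] / expected[mask]))
    return (2.0 / (lam * (lam + 1.0))) * np.sum(
        counts[mask] * ((counts[mask] / expected[mask]) ** lam - 1.0)
    )

# Toy illustration of the regime discussed above: r of order n^2 cells
# with uniform allocation probabilities (values chosen for illustration only).
rng = np.random.default_rng(0)
n, r = 20, 400
counts = np.bincount(rng.integers(r, size=n), minlength=r)
probs = np.full(r, 1.0 / r)
print(power_divergence_stat(counts, probs, lam=1.0))  # Pearson's chi-squared
print(power_divergence_stat(counts, probs, lam=0.0))  # log-likelihood ratio
```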
| Original language | English |
| --- | --- |
| Pages (from-to) | 25-37 |
| Number of pages | 13 |
| Journal | ALEA: Latin American Journal of Probability and Mathematical Statistics |
| Volume | 21 |
| DOIs | |
| Publication status | Published - Jan 2024 |
Keywords
- Cressie-Read statistics
- Pearson's statistic
- Poisson approximation
- log-likelihood ratio
- normal approximation
- uniform metric
ASJC Scopus subject areas
- Statistics and Probability