Surrogate-Assisted Delayed-Acceptance MCMC for Efficient Parameter Estimation of the Extended Burr Type XII Distribution under Progressive Type-II Censoring
Mehdi Rajaei Salmasi
(Islamic Azad University, Urmia Branch, Iran)
Keywords: Extended Burr XII, Progressive Type-II censoring, EM algorithm, Surrogate model, Delayed-acceptance MCMC, KD-tree, Coverage
Abstract:
We present a unified estimation framework for the Extended Burr Type XII (EBXII) distribution under progressive Type-II censoring that combines classical and modern approaches: (i) EM-based initialization tailored to censoring, (ii) maximum likelihood estimation with Fisher and bootstrap intervals, and (iii) surrogate-assisted delayed-acceptance MCMC (DA-MCMC) to accelerate Bayesian inference. The surrogate is a KD-tree $k$-nearest-neighbor model operating in a whitened logarithmic parameter space; it screens proposals cheaply in Stage 1, while Stage 2 applies unbiased corrections using exact or pseudo-marginal likelihood evaluations. On simulated EBXII datasets with $(\alpha,\beta,\gamma)=(2.5,1.5,3.0)$ and progressive censoring, our primary experiment shows that DA-MCMC reproduces the reference posteriors (comparable means and 95\% credible intervals) while requiring only 7,854 exact likelihood evaluations versus 12,000 in the reference run, a reduction of approximately 34.5\%. Coverage analysis, bootstrap validations, and ESS diagnostics are reported. An appendix provides concise symbolic derivations of the likelihood, score, and observed information, together with the EM MC-step formulations. The pipeline is implemented in Python and is fully reproducible.
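As a hedged illustration of the two-stage scheme described above (not the paper's implementation), the sketch below pairs a KD-tree $k$-nearest-neighbor surrogate with the delayed-acceptance correction of Christen and Fox (2005): Stage 1 screens a proposal with the cheap surrogate, and Stage 2 restores the exact target by weighing the exact log-posterior against the surrogate's prediction. A toy Gaussian log-posterior stands in for the EBXII likelihood under progressive censoring, whitening of the log-parameter space is omitted, and the names `exact_logpost`, `KNNSurrogate`, and `da_mcmc` are hypothetical.

```python
import numpy as np
from scipy.spatial import KDTree

rng = np.random.default_rng(0)

def exact_logpost(theta):
    # Toy stand-in for the expensive exact log-posterior; the EBXII
    # progressive-censoring likelihood would replace this.
    return -0.5 * np.sum((theta - 1.0) ** 2)

class KNNSurrogate:
    """Inverse-distance-weighted k-NN over cached exact evaluations."""
    def __init__(self, k=3):
        self.k, self.X, self.y = k, [], []
    def add(self, theta, logp):
        self.X.append(theta)
        self.y.append(logp)
        self.tree = KDTree(np.asarray(self.X))  # rebuilt as cache grows
    def predict(self, theta):
        k = min(self.k, len(self.y))
        d, idx = self.tree.query(theta, k=k)
        d, idx = np.atleast_1d(d), np.atleast_1d(idx)
        w = 1.0 / (d + 1e-12)
        return np.dot(w, np.asarray(self.y)[idx]) / w.sum()

def da_mcmc(theta0, n_iter=2000, step=0.5):
    sur = KNNSurrogate()
    lp = exact_logpost(theta0)
    sur.add(theta0, lp)
    n_exact, theta, chain = 1, theta0, [theta0]
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.shape)
        # Stage 1: cheap surrogate screen.
        s_prop, s_cur = sur.predict(prop), sur.predict(theta)
        if np.log(rng.uniform()) >= s_prop - s_cur:
            chain.append(theta)
            continue
        # Stage 2: exact evaluation; the correction factor divides out the
        # surrogate ratio used in Stage 1 (Christen & Fox, 2005).
        lp_prop = exact_logpost(prop)
        n_exact += 1
        sur.add(prop, lp_prop)  # enrich the cache (a sketch; adaptation
                                # must diminish for strict exactness)
        if np.log(rng.uniform()) < (lp_prop - lp) - (s_prop - s_cur):
            theta, lp = prop, lp_prop
        chain.append(theta)
    return np.array(chain), n_exact

chain, n_exact = da_mcmc(np.zeros(3))
print(n_exact, chain.mean(axis=0))
```

In this setup the exact-evaluation count `n_exact` is bounded by the number of Stage-1 passes, which is the mechanism behind the reported reduction from 12,000 to 7,854 exact likelihood calls.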
\begin{thebibliography}{99}
\bibitem{ChristenFox2005} Christen, J.A. and Fox, C., (2005). Markov chain Monte Carlo using an approximation. \emph{Journal of Computational and Graphical Statistics}, 14(4), 795–810.
\bibitem{Dempster1977} Dempster, A.P., Laird, N.M., and Rubin, D.B., (1977). Maximum likelihood from incomplete data via the EM algorithm. \emph{Journal of the Royal Statistical Society: Series B}, 39(1), 1–38.
\bibitem{Kundu2008} Kundu, D., and Raqab, M.Z., (2008). Estimation of R = P(Y < X) for Burr distribution. \emph{Computational Statistics \& Data Analysis}, 52(5), 2680–2692.
\bibitem{Cordeiro2012} Cordeiro, G.M., Ortega, E.M., and Nadarajah, S., (2012). The Kumaraswamy Burr XII distribution. \emph{Journal of the Korean Statistical Society}, 41(1), 75–85.
\bibitem{Conrad2016} Conrad, P.R., Marzouk, Y.M., Pillai, N.S., and Smith, A., (2016). Accelerating asymptotically exact MCMC for computationally intensive models via local approximations. \emph{Journal of the American Statistical Association}, 111(516), 1591–1607.
\bibitem{Rasmussen2006} Rasmussen, C.E. and Williams, C.K.I., (2006). \emph{Gaussian Processes for Machine Learning}. MIT Press.
\bibitem{Papamarkou2022} Papamarkou, T., Sherlock, C., and Stoehr, J., (2022). Fast and robust MCMC using surrogate likelihoods. \emph{Statistics and Computing}, 32(3), 46.
