Characterization of the Smoothest Density with Given Moments
Hong, Changkon The Korean Statistical Society 2001 Journal of the Korean Statistical Society Vol.30 No.3
In this paper, we characterize the smoothest density with prescribed moments. Hong and Kim (1995) proved the existence and uniqueness of such a density. We introduce a general optimal control problem and prove some theorems characterizing the minimizer using optimal control techniques.
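For concreteness, a variational problem of this type can be written as follows. This is an illustrative formulation only; the precise smoothness functional and constraints used by Hong and Kim (1995) may differ in detail:

```latex
\min_{f}\; \int \bigl(f''(x)\bigr)^{2}\,dx
\quad \text{subject to} \quad
f \ge 0, \qquad \int f(x)\,dx = 1, \qquad
\int x^{j} f(x)\,dx = \mu_{j}, \quad j = 1,\dots,m.
```

Introducing the state variables \(f\) and \(f'\) and the control \(u = f''\) turns a problem of this form into a standard optimal control problem, to which Pontryagin-type necessary conditions can be applied.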
Smoothing Parameter Selection Using Multifold Cross-Validation in Smoothing Spline Regressions
Hong, Changkon; Kim, Choongrak; Yoon, Misuk The Korean Statistical Society 1998 Communications for statistical applications and me Vol.5 No.2
The smoothing parameter <TEx>$\lambda$</TEx> in smoothing spline regression is usually selected by minimizing cross-validation (CV) or generalized cross-validation (GCV). However, simple CV or GCV is a poor candidate for estimating the prediction error. We define MGCV (multifold generalized cross-validation) as a criterion for selecting the smoothing parameter in smoothing spline regression. This is a version of cross-validation using the leave-<TEx>$\kappa$</TEx>-out method. Some numerical results comparing MGCV and GCV are presented.
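To illustrate the leave-<TEx>$\kappa$</TEx>-out idea, here is a minimal sketch of plain multifold cross-validation for choosing the smoothing level of a cubic smoothing spline. It uses SciPy's `UnivariateSpline`, whose smoothing factor `s` plays a role analogous to <TEx>$\lambda$</TEx>, and scores held-out squared error rather than the GCV-weighted criterion defined in the paper. The toy data, fold size, and candidate grid are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def multifold_cv_score(x, y, s, k_out=5, rng=None):
    """Leave-k-out CV score: average squared prediction error on held-out folds."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(x)
    idx = rng.permutation(n)
    folds = [idx[i:i + k_out] for i in range(0, n, k_out)]
    err = 0.0
    for fold in folds:
        train = np.setdiff1d(np.arange(n), fold)  # sorted, so x[train] stays increasing
        spl = UnivariateSpline(x[train], y[train], s=s, k=3)
        err += np.sum((y[fold] - spl(x[fold])) ** 2)
    return err / n

# toy data: noisy sine curve
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 80))
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, 80)

candidates = [0.5, 2.0, 8.0]  # illustrative grid of smoothing factors
scores = {s: multifold_cv_score(x, y, s) for s in candidates}
best = min(scores, key=scores.get)
```

The design choice to draw fold membership at random (rather than using contiguous blocks) keeps each training set spread over the whole design range, so the spline never has to extrapolate far.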
Robust Interval Estimation Using Density Power Divergence
Sangjin Lee, Changkon Hong Korean Data Analysis Society (한국자료분석학회) 2022 Korean Data Analysis Society Conference Proceedings (학술대회자료집) Vol.2021 No.2
It is well known that the maximum likelihood estimator (MLE) is asymptotically efficient but not robust with respect to both model misspecification and outliers. Basu et al. (1998) suggest a family of density-based divergence measures called 'density power divergences'. Each measure in this family is indexed by a single tuning parameter α, which controls the trade-off between robustness and asymptotic efficiency of the estimators. The Kullback-Leibler divergence (Kullback and Leibler, 1951) and the L₂-distance are members of this family. With a suitably chosen tuning parameter, a minimum density power divergence estimator (MDPDE) can be obtained. For 0 < α < 1, the estimator lies between the MLE (efficient but not robust) and the minimum L₂-distance estimator L₂E (robust but inefficient). Hong and Kim (2001) suggest a data-driven selection of α. In this paper we suggest a confidence interval using the MDPDE when the data set is contaminated. Bootstrap resampling is used to obtain the confidence interval. The resulting confidence intervals (called MDPD bootstrap confidence intervals) are expected to be robust with respect to outliers. The performance of the MDPDE bootstrap confidence intervals is investigated via a simulation study.
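As a sketch of the estimator discussed above, the following fits a normal model by minimizing the empirical density power divergence objective of Basu et al. (1998), using the closed-form integral of a normal density raised to the power 1+α. The contaminated toy sample, the choice α = 0.5, and all variable names are illustrative assumptions, not the paper's simulation design.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(theta, x, alpha):
    """Empirical density power divergence objective (Basu et al., 1998)
    for a normal model, up to an additive constant free of theta."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # log-parametrization keeps sigma > 0
    # closed form: integral over the real line of N(.; mu, sigma^2)^(1+alpha)
    integral = (2.0 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1.0 + alpha)
    dens = norm.pdf(x, mu, sigma)
    return integral - (1.0 + 1.0 / alpha) * np.mean(dens**alpha)

rng = np.random.default_rng(0)
# 95 clean observations plus 5 gross outliers
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

res = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x, 0.5))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

With α = 0.5 the outliers receive small weight `dens**alpha`, so the location estimate stays near 0, whereas the sample mean of this data is pulled towards 0.5 by the contaminating cluster.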
Robust Interval Estimation Using Density Power Divergence
Sangjin Lee, Changkon Hong Korean Data Analysis Society (한국자료분석학회) 2022 Journal of the Korean Data Analysis Society Vol.24 No.1
The maximum likelihood estimator (MLE) is known to be asymptotically efficient, but it is not robust with respect to outliers and model misspecification. Basu et al. (1998) suggest a density-based family of divergence measures called 'density power divergences'. Each measure in this family is indexed by a single tuning parameter α. This parameter controls the trade-off between asymptotic efficiency and robustness of the estimators. The L₂-distance and the Kullback-Leibler divergence belong to this family. With an appropriately chosen tuning parameter, one can obtain a minimum density power divergence estimator (MDPDE). For 0 < α < 1, the estimator lies between the MLE and the minimum L₂-distance estimator L₂E. Note that the MLE is efficient but not robust, while L₂E is robust but inefficient. Hong and Kim (2001) suggest an automatic selection of α. In this paper we suggest a confidence interval using the MDPDE when the data set is contaminated. Bootstrap resampling is used to obtain the confidence interval. It is expected that the resulting confidence intervals (called MDPD bootstrap confidence intervals) are robust with respect to outliers. The performances of the MDPDE bootstrap confidence intervals are investigated via a simulation study.
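A minimal sketch of the bootstrap step described above, for the location of a contaminated normal sample: resample the data with replacement, recompute the MDPDE of the location each time, and take percentile endpoints. The fixed α = 0.5, the percentile method, the number of replications, and the toy data are illustrative assumptions, not the paper's exact design.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def mdpde_mu(data, alpha=0.5):
    """MDPDE of the location under a normal model (Basu et al., 1998)."""
    def obj(theta):
        mu, log_s = theta
        s = np.exp(log_s)  # log-parametrization keeps the scale positive
        integral = (2.0 * np.pi * s**2) ** (-alpha / 2) / np.sqrt(1.0 + alpha)
        return integral - (1.0 + 1.0 / alpha) * np.mean(norm.pdf(data, mu, s) ** alpha)
    return minimize(obj, x0=[np.median(data), np.log(np.std(data))]).x[0]

rng = np.random.default_rng(0)
# 47 clean observations plus 3 gross outliers
x = np.concatenate([rng.normal(0.0, 1.0, 47), rng.normal(8.0, 1.0, 3)])

B = 200  # bootstrap replications (kept small here; larger in practice)
boot = np.array([mdpde_mu(rng.choice(x, size=len(x), replace=True)) for _ in range(B)])
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])  # percentile bootstrap interval
```

Because the MDPDE is recomputed on every resample, bootstrap draws that happen to contain extra copies of the outliers still yield location estimates near the clean mode, which is what makes the resulting interval robust.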