A Robust Spectral Estimator Based on M-Estimation: An Application to the Housing Price Index
Pak, Ro Jin The Korean Statistical Society 2016 The Korean Journal of Applied Statistics Vol.29 No.3
In analysing a time series in the frequency domain, the spectral estimator (or periodogram) is a very useful statistic for identifying the periods of the series. However, the spectral estimator is by nature very sensitive to outliers, so spectral estimators based on M-estimation have been studied by several researchers. M-estimation requires an appropriate choice of tuning parameter; Pak (2001) proposed an empirical method to choose the tuning parameter of Huber's M-estimating function. In this article, we implement Pak's proposal in the spectral estimator. Through simulations and an application to the Korean housing price index, we compare various M-estimation results and confirm that the method is effective.
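As a minimal sketch of the M-estimation idea underlying the robust spectral estimator, the following computes a Huber M-estimate of location with a fixed tuning constant (k = 1.345 is a common default; Pak (2001)'s data-based choice of the tuning parameter is not reproduced here, and the function names are illustrative):

```python
import numpy as np

def huber_psi(r, k=1.345):
    """Huber's psi function: identity inside [-k, k], clipped to +/-k outside."""
    return np.clip(r, -k, k)

def m_estimate_location(x, k=1.345, tol=1e-8, max_iter=200):
    """M-estimate of location via the modified-residuals iteration.
    The tuning constant k trades efficiency against robustness."""
    x = np.asarray(x, dtype=float)
    mu = np.median(x)                          # robust starting value
    s = np.median(np.abs(x - mu)) / 0.6745     # MAD scale estimate
    for _ in range(max_iter):
        step = s * np.mean(huber_psi((x - mu) / s, k))
        mu += step
        if abs(step) < tol:
            break
    return mu
```

With clean data this estimate is close to the sample mean; under heavy contamination the clipping bounds each observation's influence by k times the scale, which is the property the robust spectral estimator exploits.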
A Robust Estimation for the Composite Lognormal-Pareto Model
Pak, Ro Jin The Korean Statistical Society 2013 Communications for Statistical Applications and Methods Vol.20 No.4
Cooray and Ananda (2005) proposed a composite lognormal-Pareto model to analyze loss payment data in the actuarial and insurance industries. Their model is based on a lognormal density up to an unknown threshold value and a two-parameter Pareto density beyond it. In this paper, we implement minimum density power divergence estimation for the composite lognormal-Pareto density. We compare the performance of the minimum density power divergence estimator (MDPDE) and the maximum likelihood estimator (MLE) through simulations and an example. The minimum density power divergence estimator performs reasonably well under various violations of the assumed distribution: it fits small observations better and is more resistant to extraordinarily large observations than the maximum likelihood estimator.
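A simplified sketch of a composite lognormal-Pareto density, assuming only continuity at the threshold (the Cooray-Ananda model additionally imposes a smoothness constraint that ties the parameters together; that constraint is omitted here, and the function name is illustrative):

```python
import numpy as np
from scipy.stats import lognorm, pareto

def composite_lognormal_pareto_pdf(x, mu, sigma, theta, a):
    """Two-piece density: truncated lognormal body on (0, theta], Pareto tail on
    (theta, inf), with the mixing weight chosen so the pieces join continuously
    at theta."""
    x = np.asarray(x, dtype=float)
    F1t = lognorm.cdf(theta, s=sigma, scale=np.exp(mu))
    g1t = lognorm.pdf(theta, s=sigma, scale=np.exp(mu)) / F1t   # truncated body at theta
    g2t = pareto.pdf(theta, b=a, scale=theta)                   # Pareto tail at theta
    c1 = g2t / (g1t + g2t)                                      # continuity weight
    body = c1 * lognorm.pdf(x, s=sigma, scale=np.exp(mu)) / F1t
    tail = (1.0 - c1) * pareto.pdf(x, b=a, scale=theta)
    return np.where(x <= theta, body, tail)
```

The weight c1 equalises the two pieces at theta, so the density is continuous and integrates to one; the lognormal piece governs small losses while the Pareto tail governs large ones.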
Reducing Bias of the Minimum Hellinger Distance Estimator of a Location Parameter
Pak, Ro-Jin Korean Data and Information Science Society 2006 Journal of the Korean Data and Information Science Society Vol.17 No.1
Since Beran (1977) developed minimum Hellinger distance estimation, this method has been a popular topic in the field of robust estimation. In the process of defining the distance, a kernel density estimator has been widely used as the density estimator. In this article, however, we show that a combination of a kernel density estimator and an empirical density can result in a smaller bias of the minimum Hellinger distance estimator of a location parameter than using a kernel density estimator alone.
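A minimal sketch of the baseline the article improves on: a minimum Hellinger distance estimate of a normal location parameter using only a kernel density estimate (the paper's kernel-plus-empirical combination is not reproduced, and the function name is illustrative):

```python
import numpy as np
from scipy.stats import norm, gaussian_kde
from scipy.optimize import minimize_scalar

def min_hellinger_location(x, sigma=1.0):
    """Minimum Hellinger distance estimate of a normal location parameter.
    g is a kernel density estimate of the data; the squared Hellinger distance
    2 - 2 * integral sqrt(f_theta * g) is minimised over theta on a grid."""
    x = np.asarray(x, dtype=float)
    kde = gaussian_kde(x)
    grid = np.linspace(np.min(x) - 3.0, np.max(x) + 3.0, 2000)
    g = kde(grid)
    dx = grid[1] - grid[0]
    def hd2(theta):
        f = norm.pdf(grid, loc=theta, scale=sigma)
        return 2.0 - 2.0 * np.sum(np.sqrt(f * g)) * dx   # Riemann approximation
    return minimize_scalar(hd2, bounds=(np.min(x), np.max(x)), method="bounded").x
```

Because the distance is computed through the square roots of densities, observations in low-density regions contribute little, which is the source of the estimator's robustness.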
The Bandwidth from the Density Power Divergence
Pak, Ro Jin The Korean Statistical Society 2014 Communications for Statistical Applications and Methods Vol.21 No.5
The most widely used optimal bandwidth is the one that minimizes the mean integrated squared error (MISE) of a kernel density estimator from the true density. In this article, we propose a bandwidth that asymptotically minimizes the mean integrated density power divergence (MIDPD) between the true density and the corresponding kernel density estimator. An approximate form of the mean integrated density power divergence is derived, and a bandwidth is obtained by minimizing this approximation. The resulting bandwidth resembles the optimal bandwidth of Parzen (1962), but it reflects the nature of the model density more than the existing optimal bandwidths do. This gives one more choice of optimal bandwidth with a firm theoretical background; in addition, in an empirical study we show that the bandwidth from the mean integrated density power divergence can produce a density estimator that fits a sample better than the bandwidth from the mean integrated squared error.
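For reference, the classical MISE benchmark that the proposed MIDPD bandwidth is compared against can be sketched as the normal-reference rule for a Gaussian kernel (the MIDPD bandwidth itself requires the derivation in the paper and is not reproduced here):

```python
import numpy as np

def mise_bandwidth_gaussian(x):
    """Normal-reference bandwidth minimising the asymptotic MISE of a Gaussian
    kernel density estimator: h = (4 / (3 n))**(1/5) * sigma, i.e. roughly
    1.06 * sigma * n**(-1/5)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    sigma = np.std(x, ddof=1)
    return (4.0 / (3.0 * n)) ** 0.2 * sigma
```

The n**(-1/5) rate is shared by essentially all plug-in bandwidths; the competing criteria differ in the constant, which is where the model density enters.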
Robustizing Kalman filters with the M-estimating functions
Pak, Ro Jin The Korean Statistical Society 2018 Communications for Statistical Applications and Methods Vol.25 No.1
This article considers a robust Kalman filter from the M-estimation point of view. Pak (Journal of the Korean Statistical Society, 27, 507-514, 1998) proposed a particular M-estimating function with data-based shaping constants. We consider the Kalman filter equipped with this M-estimating function and describe its structure and estimating algorithm. Kalman filter estimates obtained with the proposed M-estimating function are shown to behave well even when the data are contaminated.
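A minimal sketch of the robustification idea: a one-dimensional Kalman filter whose measurement update passes the standardised innovation through Huber's psi, so a gross outlier moves the state estimate by a bounded amount. This uses a fixed tuning constant k, whereas the article's filter uses an M-estimating function with data-based shaping constants; the function names are illustrative.

```python
import numpy as np

def huber_psi(r, k=1.345):
    """Huber's psi: identity inside [-k, k], clipped outside."""
    return np.clip(r, -k, k)

def robust_kalman_1d(y, phi=1.0, q=0.1, r=1.0, k=1.345):
    """Scalar Kalman filter (state transition phi, process noise q, measurement
    noise r) with a Huberised measurement update."""
    x, p = 0.0, 1.0
    filtered = []
    for yt in y:
        x, p = phi * x, phi * p * phi + q            # time update (prediction)
        s = p + r                                    # innovation variance
        z = (yt - x) / np.sqrt(s)                    # standardised innovation
        gain = p / s                                 # Kalman gain (H = 1)
        x = x + gain * np.sqrt(s) * huber_psi(z, k)  # robustified measurement update
        p = (1.0 - gain) * p
        filtered.append(x)
    return np.array(filtered)
```

With k set very large the clipping never activates and the ordinary Kalman filter is recovered, which makes the role of the tuning constant explicit.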
The Minimum Squared Distance Estimator and the Minimum Density Power Divergence Estimator
Pak, Ro-Jin The Korean Statistical Society 2009 Communications for Statistical Applications and Methods Vol.16 No.6
Basu et al. (1998) proposed a minimum divergence estimation method that is free from the troublesome kernel density estimator. Their proposed class of density power divergences is indexed by a single parameter $\alpha$ that controls the trade-off between robustness and efficiency. In this article, (1) we introduce a new large class, the minimum squared distance, which ranges from the minimum Hellinger distance to the minimum $L_2$ distance, and show that under certain conditions the minimum density power divergence estimator (MDPDE) and the minimum squared distance estimator (MSDE) are asymptotically equivalent; and (2) we show that in finite samples the MDPDE generally performs better than the MSDE, although there are cases where the MSDE performs better when estimating a location parameter or a proportion of mixed distributions.
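As a minimal sketch of the Basu et al. (1998) approach, the following computes the MDPDE of a normal mean with known scale, using the empirical density power divergence objective (the integral term has the closed form $(1+\alpha)^{-1/2}(2\pi\sigma^2)^{-\alpha/2}$ for the normal density; the function name and the `bounds` argument are illustrative):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def mdpde_normal_mean(x, sigma=1.0, alpha=0.5, bounds=None):
    """Minimum density power divergence estimate of a normal mean (sigma known).
    Empirical objective:  int f^(1+alpha) - (1 + 1/alpha) * mean(f(x_i)^alpha).
    alpha -> 0 recovers the MLE; larger alpha is more robust, less efficient."""
    x = np.asarray(x, dtype=float)
    if bounds is None:
        bounds = (np.min(x), np.max(x))
    const = (1.0 + alpha) ** -0.5 * (2.0 * np.pi * sigma ** 2) ** (-alpha / 2.0)
    def objective(mu):
        f = norm.pdf(x, loc=mu, scale=sigma)
        return const - (1.0 + 1.0 / alpha) * np.mean(f ** alpha)
    return minimize_scalar(objective, bounds=bounds, method="bounded").x
```

Because each observation enters through $f(x_i)^\alpha$, points far from the bulk of the fitted density contribute almost nothing to the objective, so no kernel density estimate of the data is needed, which is exactly the practical advantage over Hellinger-type distances noted above.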
Modified Local Density Estimation for the Log-Linear Density
Pak, Ro-Jin The Korean Statistical Society 2000 Communications for Statistical Applications and Methods Vol.7 No.1
We consider the local likelihood method with a smoothed version of the model density instead of the original model density. For simplicity, the model is assumed to be the log-linear density; we then show that the proposed local density estimator is less affected by changes among observations, although its bias increases slightly more than that of the currently used local density estimator. Hence, if we use the existing method and the proposed method in a proper way, we can derive a local density estimator that fits the data better.
Robustness of Minimum Disparity Estimators in Linear Regression Models
Pak, Ro-Jin The Korean Statistical Society 1995 Journal of the Korean Statistical Society Vol.24 No.2
This paper deals with the robustness properties of minimum disparity estimation in linear regression models. The estimators are defined as the statistical quantities that minimize the blended weight Hellinger distance between a weighted kernel density estimator of the residuals and a smoothed model density of the residuals. It is shown that if the weights of the density estimator are appropriately chosen, the estimates of the regression parameters are robust.