황창하,나은영,석경하,Hwang, Chang-Ha,Na, Eun-Young,Seok, Kyung-Ha 한국데이터정보과학회 2001 한국데이터정보과학회지 Vol.12 No.2
Neural networks are increasingly regarded as a modern statistical methodology for classification and function estimation. In particular, they provide a flexible way to generalize linear regression functions and can be viewed as a parameterization of general nonlinear functions. This paper considers neural networks for function estimation. A simple way to keep a neural network from overfitting the training data is regularization, for which neural networks mainly use the weight decay method. When a weight decay neural network is used for function estimation, the number of hidden nodes, the weight decay parameter, the learning rate, and the number of learning iterations are the important parameters. This paper proposes a method that uses a genetic algorithm to optimize these parameters automatically, thereby describing a procedure for automatically training weight decay neural networks, and compares the automatically trained network with other function estimation methods.

Neural networks are increasingly being seen as an addition to the statistics toolkit, to be considered alongside both classical and modern statistical methods. They are usually useful for classification and function estimation. In this paper we concentrate on function estimation using neural networks with a weight decay factor. The use of weight decay seems both to help the optimization process and to avoid overfitting. For this type of neural network, deciding the number of hidden nodes, the weight decay parameter, and the number of learning iterations is very important; we call this the optimization of weight decay neural networks. In this paper we propose an automatic optimization based on genetic algorithms. Moreover, we compare the weight decay neural network automatically learned by this optimization with an ordinary neural network, projection pursuit regression, and support vector machines.
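The training objective described above can be sketched as follows: a single-hidden-layer network fit by gradient descent on squared error plus an L2 weight decay penalty. This is a minimal illustration, not the paper's method; the hyperparameter values (hidden nodes `H`, decay `lam`, learning rate `lr`, iteration count) are fixed arbitrary assumptions, whereas the paper searches over them with a genetic algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: noisy sine curve.
X = np.linspace(-3, 3, 80).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(80)

H = 10        # number of hidden nodes (assumed, not tuned)
lam = 1e-3    # weight decay parameter
lr = 0.05     # learning rate
W1 = 0.5 * rng.standard_normal((1, H)); b1 = np.zeros(H)
W2 = 0.5 * rng.standard_normal(H);      b2 = 0.0

for _ in range(2000):                    # number of learning iterations
    A = np.tanh(X @ W1 + b1)             # hidden activations, shape (n, H)
    err = A @ W2 + b2 - y                # residuals
    n = len(y)
    # Gradients of mean squared error plus the weight decay penalty.
    gW2 = A.T @ err / n + lam * W2
    gb2 = err.mean()
    dA = np.outer(err, W2) * (1 - A**2)  # backpropagate through tanh
    gW1 = X.T @ dA / n + lam * W1
    gb1 = dA.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
```

A genetic algorithm would encode (H, lam, lr, iterations) as a chromosome and use validation error of runs like this one as the fitness.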
Variable selection in L1 penalized censored regression
황창하,김말숙,심주용 한국데이터정보과학회 2011 한국데이터정보과학회지 Vol.22 No.5
The proposed method is based on a censored regression model with an L1 penalty. We use the iteratively reweighted least squares procedure to solve the L1 penalized log-likelihood function of the censored regression model. This provides efficient computation of the regression parameters, including variable selection, and leads to a generalized cross validation function for model selection. Numerical results are then presented to indicate the performance of the proposed method.
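The iteratively reweighted least squares idea can be sketched on the simpler uncensored case: each |b_j| in the L1 penalty is majorized by a quadratic in b_j, so every iteration reduces to a weighted ridge solve. This is an illustrative stand-in, assuming a plain least squares loss rather than the paper's censored log-likelihood.

```python
import numpy as np

def l1_irls(X, y, lam, iters=100, eps=1e-8):
    """IRLS for the lasso objective ||y - X b||^2 / 2 + lam * ||b||_1,
    using the quadratic bound |b_j| ~ b_j^2 / (2 |b_j_old|) + const."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]      # ordinary LS start
    for _ in range(iters):
        # The L1 term becomes a diagonal ridge penalty with weights lam/|b_j|.
        D = np.diag(lam / (np.abs(b) + eps))
        b = np.linalg.solve(X.T @ X + D, X.T @ y)
    b[np.abs(b) < 1e-6] = 0.0                     # threshold numerically-zero coefficients
    return b

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
true_b = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ true_b + 0.1 * rng.standard_normal(200)
b = l1_irls(X, y, lam=5.0)   # sparse estimate: variable selection happens automatically
```

Coefficients whose reweighted penalty keeps growing collapse to zero, which is how variable selection falls out of the same computation.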
황창하,김상민,박희주 한국통계학회 1997 Communications for Statistical Applications and Methods Vol.4 No.2
Multilayer neural networks are one approach to nonparametric regression function estimation. The backpropagation algorithm is widely used to train multilayer neural networks, but it is very sensitive to outliers and estimates undesirable regression functions on data that contain them. In this paper we propose a robust backpropagation algorithm based on a technique frequently used in statistical physics, and compare it through simulation with projection pursuit regression (PPR), a method mathematically very similar to neural networks, and with the ordinary backpropagation algorithm.
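One common way to robustify backpropagation is to replace the squared-error gradient (the raw residual) with a bounded influence function, so a gross outlier cannot dominate the weight updates. The sketch below uses simple Huber-style residual clipping; this is an assumed illustration of the general idea, not the statistical-physics formulation of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
X = np.linspace(-2, 2, 60).reshape(-1, 1)
y = X.ravel() ** 2 + 0.05 * rng.standard_normal(60)
y[5] += 8.0; y[40] -= 8.0               # inject two gross outliers

def fit(X, y, robust, H=8, lr=0.05, iters=3000, delta=0.5):
    """Train a one-hidden-layer net; if robust, clip residuals in the gradient."""
    r0 = np.random.default_rng(3)
    W1 = 0.5 * r0.standard_normal((1, H)); b1 = np.zeros(H)
    W2 = 0.5 * r0.standard_normal(H); b2 = 0.0
    n = len(y)
    for _ in range(iters):
        A = np.tanh(X @ W1 + b1)
        r = A @ W2 + b2 - y
        # Huber influence: bounded pull from large residuals.
        g = np.clip(r, -delta, delta) if robust else r
        dA = np.outer(g, W2) * (1 - A ** 2)
        W2 -= lr * (A.T @ g / n); b2 -= lr * g.mean()
        W1 -= lr * (X.T @ dA / n); b1 -= lr * dA.mean(axis=0)
    return np.tanh(X @ W1 + b1) @ W2 + b2

clean = y.copy(); clean[5] -= 8.0; clean[40] += 8.0   # outlier-free targets
err_ls = np.mean((fit(X, y, robust=False) - clean) ** 2)
err_rb = np.mean((fit(X, y, robust=True) - clean) ** 2)
```

Against the clean targets, the clipped-gradient fit should track the true curve while ordinary backpropagation is pulled toward the two outliers.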
Multioutput LS-SVR based residual MCUSUM control chart for autocorrelated process
황창하 한국데이터정보과학회 2016 한국데이터정보과학회지 Vol.27 No.2
Most classical control charts assume that processes are serially independent, and autocorrelation among variables makes them unreliable. To address this issue, a variety of statistical approaches have been employed to estimate the serial structure of the process. In this paper, we propose a multioutput least squares support vector regression and apply it to construct a residual multivariate cumulative sum control chart for detecting changes in the process mean vector. Numerical studies demonstrate that the proposed multioutput least squares support vector regression based control chart provides more satisfying results in detecting small shifts in the process mean vector.
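Once the serial structure is regressed away, the residual vectors can be monitored with a multivariate CUSUM. The sketch below implements Crosier's shrinking-vector form of the MCUSUM statistic under an assumed identity residual covariance; the reference value `k` and control limit `h` are illustrative choices, and the residuals here are simulated rather than produced by an LS-SVR fit.

```python
import numpy as np

def mcusum(residuals, k=0.5, h=5.0):
    """Crosier's multivariate CUSUM on residual vectors.
    Returns the chart statistic ||s_t|| at each time point; signal when it exceeds h."""
    s = np.zeros(residuals.shape[1])
    stats = []
    for x in residuals:
        c = np.linalg.norm(s + x)
        # Reset if the accumulated vector is short; otherwise shrink it by k.
        s = np.zeros_like(s) if c <= k else (s + x) * (1 - k / c)
        stats.append(np.linalg.norm(s))
    return np.array(stats)

rng = np.random.default_rng(4)
in_control = rng.standard_normal((100, 2))                     # residuals on target
shifted = rng.standard_normal((50, 2)) + np.array([0.5, 0.5])  # small mean shift
stats = mcusum(np.vstack([in_control, shifted]))
```

Because the shift (norm ≈ 0.71) exceeds the reference value k = 0.5, the statistic drifts upward after observation 100 and eventually crosses the limit h.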
Robust varying coefficient model using L1 regularization
황창하,배종식,심주용 한국데이터정보과학회 2016 한국데이터정보과학회지 Vol.27 No.4
In this paper we propose a robust version of the varying coefficient model, based on regularized regression with L1 regularization. We use the iteratively reweighted least squares procedure to solve the L1 regularized objective function of the varying coefficient model in locally weighted regression form. This provides efficient computation of the coefficient function estimates and variable selection for a given value of the smoothing variable. We present the generalized cross validation function and an Akaike information type criterion for model selection. Applications of the proposed model are illustrated through artificial examples and a real example of predicting the effects of the input variables and the smoothing variable on the output.
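The "locally weighted regression form" of a varying coefficient model can be sketched without the robust L1 step: at each point of the smoothing variable, fit a kernel-weighted least squares regression, so the coefficients become functions of that variable. The bandwidth, kernel, and data-generating model below are all illustrative assumptions.

```python
import numpy as np

def varying_coef(t_grid, t, x, y, bw=0.1):
    """Estimate b0(t), b1(t) in y = b0(t) + b1(t) * x by kernel-weighted
    least squares at each grid point (plain LS; no L1 regularization here)."""
    Xa = np.column_stack([np.ones_like(x), x])
    out = []
    for t0 in t_grid:
        w = np.exp(-0.5 * ((t - t0) / bw) ** 2)   # Gaussian kernel weights
        b = np.linalg.solve(Xa.T @ (w[:, None] * Xa), Xa.T @ (w * y))
        out.append(b)
    return np.array(out)                          # rows: grid points; cols: [b0, b1]

rng = np.random.default_rng(7)
t = rng.uniform(0, 1, 400)                        # smoothing variable
x = rng.standard_normal(400)
y = np.sin(2 * np.pi * t) + (1 + t) * x + 0.1 * rng.standard_normal(400)
B = varying_coef(np.array([0.25, 0.75]), t, x, y)
```

The recovered intercept follows sin(2πt) and the slope follows 1 + t; the paper's method additionally applies IRLS with an L1 penalty to this local objective for robustness and variable selection.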
Feature selection in the semivarying coefficient LS-SVR
황창하,심주용 한국데이터정보과학회 2017 한국데이터정보과학회지 Vol.28 No.2
In this paper we propose a feature selection method that identifies important features in the semivarying coefficient model. One important issue in the semivarying coefficient model is how to estimate its parametric and nonparametric components; another is how to identify the important features among the varying and the constant effects. We propose a feature selection method that addresses these issues using the generalized cross validation functions of the varying coefficient least squares support vector regression (LS-SVR) and the linear LS-SVR. Numerical studies indicate that the proposed method is quite effective in identifying the important features among the varying and the constant effects in the semivarying coefficient model.
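Generalized cross validation, the selection criterion named above, is easy to state for any linear smoother: GCV(λ) = n·RSS / (n − tr S)², where S is the smoother matrix. The sketch below computes it for ridge regression as an assumed stand-in for the LS-SVR smoothers in the paper.

```python
import numpy as np

def gcv_ridge(X, y, lams):
    """GCV scores for ridge regression: n * RSS / (n - trace(S))^2,
    where S = X (X'X + lam I)^{-1} X' is the smoother (hat) matrix."""
    n = len(y)
    scores = []
    for lam in lams:
        S = X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
        rss = np.sum((y - S @ y) ** 2)
        df = np.trace(S)                # effective degrees of freedom
        scores.append(n * rss / (n - df) ** 2)
    return np.array(scores)

rng = np.random.default_rng(8)
X = rng.standard_normal((100, 5))
y = X @ np.ones(5) + 0.5 * rng.standard_normal(100)
scores = gcv_ridge(X, y, [0.1, 1000.0])   # pick the lam with the smallest score
```

Here all five predictors matter, so the light penalty wins; comparing GCV scores across candidate models is what drives the feature selection in the paper.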
Partially linear support vector orthogonal quantile regression with measurement errors
황창하 한국데이터정보과학회 2015 한국데이터정보과학회지 Vol.26 No.1
Quantile regression models with covariate measurement errors have received a great deal of attention in both the theoretical and the applied statistical literature. A lot of effort has been devoted to developing effective estimation methods for such quantile regression models. In this paper we propose the partially linear support vector orthogonal quantile regression model in the presence of covariate measurement errors. We also provide a generalized approximate cross-validation method for choosing the hyperparameters and the ratios of the error variances, which affect the performance of the proposed model. The proposed model is evaluated through simulations.
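At the core of any quantile regression model is the check (pinball) loss. The sketch below fits a plain linear quantile regression by subgradient descent; it is an assumed minimal illustration of the loss, ignoring the paper's partially linear structure, support vector machinery, orthogonality correction, and measurement errors.

```python
import numpy as np

def check_loss(r, tau):
    """Pinball (check) loss: tau * r for r >= 0, (tau - 1) * r otherwise."""
    return np.where(r >= 0, tau * r, (tau - 1) * r)

def quantile_fit(X, y, tau, lr=0.1, iters=5000):
    """Linear tau-th quantile regression by full-batch subgradient descent."""
    Xa = np.column_stack([np.ones(len(y)), X])
    b = np.zeros(Xa.shape[1])
    for _ in range(iters):
        r = y - Xa @ b
        # Subgradient of the mean check loss with respect to b.
        g = -Xa.T @ np.where(r >= 0, tau, tau - 1) / len(y)
        b -= lr * g
    return b                       # [intercept, slope]

rng = np.random.default_rng(5)
x = rng.uniform(0, 2, 500)
y = 1.0 + 2.0 * x + rng.standard_normal(500)   # symmetric noise: median = mean line
b50 = quantile_fit(x, y, tau=0.5)
b90 = quantile_fit(x, y, tau=0.9)
```

With symmetric noise the tau = 0.5 fit recovers the central line, while the tau = 0.9 fit sits higher by roughly the 0.9 noise quantile, shifting the intercept upward.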
Asymmetric least squares regression estimation using weighted least squares support vector machine
황창하 한국데이터정보과학회 2011 한국데이터정보과학회지 Vol.22 No.5
This paper proposes a weighted least squares support vector machine for asymmetric least squares regression. This method achieves nonlinear prediction power, while making no assumption on the underlying probability distributions. The cross validation function is introduced to choose optimal hyperparameters in the procedure. Experimental results are then presented which indicate the performance of the proposed model.
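Asymmetric least squares estimates expectiles: the squared error is weighted by tau above the fit and 1 − tau below it. The sketch below computes a scalar expectile by this reweighting fixed-point iteration; it is an assumed illustration of the loss only, not the paper's weighted LS-SVM, which adds a kernel for nonlinear prediction.

```python
import numpy as np

def expectile(y, tau, iters=50):
    """Scalar tau-expectile by iteratively reweighted averaging:
    observations above the current estimate get weight tau, others 1 - tau."""
    m = y.mean()
    for _ in range(iters):
        w = np.where(y > m, tau, 1 - tau)
        m = np.average(y, weights=w)   # weighted-least-squares minimizer
    return m

rng = np.random.default_rng(6)
y = rng.standard_normal(10000)
e50 = expectile(y, 0.5)   # the 0.5-expectile is just the mean
e90 = expectile(y, 0.9)   # upper expectile: above the mean
```

For a standard normal sample the 0.9-expectile lands near 0.70, between the mean and the 0.9-quantile; replacing the scalar mean with a kernel regression fit under the same asymmetric weights gives the nonlinear version.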
황창하,유지영 대구효성가톨릭대학교 자연과학연구소 1997 基礎科學硏究論集 Vol.11 No.2
This paper surveys regression by feed-forward neural network approaches and compares it with the kernel, spline, and projection pursuit methods. For the comparison study we use the famous French curve as the main example.