Support Vector Machine Regression for a Gaussian Fuzzy Model
황창하 한국자료분석학회 2004 Journal of the Korean Data Analysis Society Vol.6 No.2
Support vector machines (SVMs) have been very successful in pattern recognition and function estimation problems for crisp data. In this paper, we propose an SVM approach to evaluating a fuzzy regression model with multiple crisp inputs and a Gaussian fuzzy output. The proposed algorithm is a model-free method in the sense that we do not need to assume an underlying model function. The algorithm is robust for estimating fuzzy linear and nonlinear regression models, especially when outliers are present. Numerical examples are given to illustrate the effectiveness of this approach.
Variable selection in L1 penalized censored regression
황창하,김말숙,심주용 한국데이터정보과학회 2011 한국데이터정보과학회지 Vol.22 No.5
The proposed method is based on a penalized censored regression model with an L1 penalty. We use the iteratively reweighted least squares procedure to solve the L1-penalized log-likelihood function of the censored regression model. This provides efficient computation of the regression parameters, including variable selection, and leads to a generalized cross validation function for model selection. Numerical results are presented to demonstrate the performance of the proposed method.
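The abstract's core computational step, iteratively reweighted least squares for an L1 penalty, can be sketched for the simpler uncensored linear model. The function name, toy data, and the quadratic penalty approximation below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def irls_lasso(X, y, lam=1.0, n_iter=50, eps=1e-6):
    """L1-penalized least squares via iteratively reweighted least squares.
    At each step the penalty lam*|b_j| is approximated by lam*b_j^2/(|b_j|+eps),
    turning the problem into a weighted ridge regression."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary LS start
    for _ in range(n_iter):
        W = np.diag(lam / (np.abs(beta) + eps))  # penalty reweighting
        beta = np.linalg.solve(X.T @ X + W, X.T @ y)
    return beta

# toy data: only coefficients 0 and 2 are truly nonzero
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=100)
beta_hat = irls_lasso(X, y)
```

The reweighting drives genuinely zero coefficients toward zero, which is how the same loop performs variable selection in the penalized censored model.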
Multioutput LS-SVR based residual MCUSUM control chart for autocorrelated process
황창하 한국데이터정보과학회 2016 한국데이터정보과학회지 Vol.27 No.2
Most classical control charts assume that processes are serially independent, and autocorrelation among variables makes them unreliable. To address this issue, a variety of statistical approaches have been employed to estimate the serial structure of the process. In this paper, we propose a multioutput least squares support vector regression and apply it to construct a residual multivariate cumulative sum (MCUSUM) control chart for detecting changes in the process mean vector. Numerical studies demonstrate that the proposed multioutput least squares support vector regression based control chart provides more satisfactory results in detecting small shifts in the process mean vector.
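As background, the single-output LS-SVR that the paper extends reduces to solving one linear (KKT) system rather than a quadratic program. A minimal sketch with an RBF kernel; function names, data, and parameter values are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=100.0, sigma=1.0):
    """Least squares SVR: solve the KKT system
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = X.shape[0]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def lssvr_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# fit a smooth nonlinear target
X = np.linspace(-3.0, 3.0, 50)[:, None]
y = np.sin(X).ravel()
b, alpha = lssvr_fit(X, y)
pred = lssvr_predict(X, b, alpha, X)
```

The residual chart in the paper monitors the differences between observations and such fitted values; the multioutput extension fits all response components jointly.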
Robust varying coefficient model using L1 regularization
황창하,배종식,심주용 한국데이터정보과학회 2016 한국데이터정보과학회지 Vol.27 No.4
In this paper we propose a robust version of varying coefficient models, based on regularized regression with L1 regularization. We use the iteratively reweighted least squares procedure to solve the L1-regularized objective function of the varying coefficient model in locally weighted regression form. This provides efficient computation of the coefficient function estimates and variable selection for a given value of the smoothing variable. We present a generalized cross validation function and an Akaike information type criterion for model selection. Applications of the proposed model are illustrated through artificial examples and a real example of predicting the effect of the input variables and the smoothing variable on the output.
Feature selection in the semivarying coefficient LS-SVR
황창하,심주용 한국데이터정보과학회 2017 한국데이터정보과학회지 Vol.28 No.2
In this paper we propose a feature selection method that identifies important features in the semivarying coefficient model. One important issue in the semivarying coefficient model is how to estimate the parametric and nonparametric components. Another is how to identify the important features in the varying and the constant effects. We propose a feature selection method that addresses these issues using the generalized cross validation functions of the varying coefficient least squares support vector regression (LS-SVR) and the linear LS-SVR. Numerical studies indicate that the proposed method is quite effective in identifying important features in the varying and the constant effects in the semivarying coefficient model.
Geographically weighted least squares-support vector machine
황창하,심주용 한국데이터정보과학회 2017 한국데이터정보과학회지 Vol.28 No.1
When the spatial information of each location is given specifically as coordinates, it is popular to use geographically weighted regression, which incorporates the spatial information by assuming that the regression parameters vary spatially across locations. In this paper, we relax the linearity assumption of geographically weighted regression and propose a geographically weighted least squares-support vector machine for estimating the geographically weighted mean, using the basic concept of kernel machines. A generalized cross validation function is derived for model selection. Numerical studies with real datasets compare the performance of the proposed method with other methods for predicting the geographically weighted mean.
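The geographically weighted regression baseline that the paper relaxes fits a separate weighted least squares at each target location, with Gaussian weights on spatial distance. A minimal sketch; the function name, simulated data, and bandwidth are illustrative assumptions:

```python
import numpy as np

def gwr_at(X, y, coords, target, bandwidth=0.2):
    """Classic GWR building block: weighted least squares at one target
    location, weighting observations by a Gaussian kernel on distance."""
    d2 = ((coords - target) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    Xb = np.column_stack([np.ones(len(y)), X])  # add intercept
    WX = Xb * w[:, None]
    return np.linalg.solve(Xb.T @ WX, WX.T @ y)  # [intercept, slopes]

# simulated data whose slope grows from west (u=0) to east (u=1)
rng = np.random.default_rng(1)
coords = rng.uniform(size=(300, 2))
x = rng.normal(size=300)
y = (1.0 + 2.0 * coords[:, 0]) * x + 0.1 * rng.normal(size=300)
beta_west = gwr_at(x[:, None], y, coords, np.array([0.1, 0.5]))
beta_east = gwr_at(x[:, None], y, coords, np.array([0.9, 0.5]))
```

The proposed kernel-machine version replaces the local linear fit above with an LS-SVM so that the regression surface itself may be nonlinear in the inputs.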
Partially linear support vector orthogonal quantile regression with measurement errors
황창하 한국데이터정보과학회 2015 한국데이터정보과학회지 Vol.26 No.1
Quantile regression models with covariate measurement errors have received a great deal of attention in both the theoretical and the applied statistical literature. A lot of effort has been devoted to developing effective estimation methods for such quantile regression models. In this paper we propose a partially linear support vector orthogonal quantile regression model in the presence of covariate measurement errors. We also provide a generalized approximate cross-validation method for choosing the hyperparameters and the ratios of the error variances, which affect the performance of the proposed model. The proposed model is evaluated through simulations.
Asymmetric least squares regression estimation using weighted least squares support vector machine
황창하 한국데이터정보과학회 2011 한국데이터정보과학회지 Vol.22 No.5
This paper proposes a weighted least squares support vector machine for asymmetric least squares regression. This method achieves nonlinear prediction power while making no assumption on the underlying probability distributions. A cross validation function is introduced to choose the optimal hyperparameters in the procedure. Experimental results are presented that indicate the performance of the proposed model.
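For reference, the linear version of asymmetric least squares (expectile) regression is a short iteratively reweighted least squares loop; the weighted LS-SVM of the paper replaces this linear fit with a kernel one. The code below is an illustrative sketch, not the paper's implementation:

```python
import numpy as np

def expectile_regression(X, y, tau=0.5, n_iter=100):
    """Asymmetric least squares: residuals above the fit get weight tau,
    those below get 1 - tau; iterate weighted least squares to a fixed point."""
    Xb = np.column_stack([np.ones(len(y)), X])  # add intercept
    beta = np.linalg.lstsq(Xb, y, rcond=None)[0]
    for _ in range(n_iter):
        w = np.where(y - Xb @ beta > 0, tau, 1.0 - tau)
        WX = Xb * w[:, None]
        beta = np.linalg.solve(Xb.T @ WX, WX.T @ y)
    return beta  # [intercept, slopes]

# the 0.9-expectile line sits above the 0.1-expectile line
rng = np.random.default_rng(2)
x = rng.normal(size=200)
y = 2.0 + x + rng.normal(size=200)
beta_lo = expectile_regression(x[:, None], y, tau=0.1)
beta_hi = expectile_regression(x[:, None], y, tau=0.9)
```

With tau = 0.5 the loop reduces to ordinary least squares; asymmetric tau traces out the lower or upper part of the conditional distribution without any distributional assumption.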
황창하,유지영 대구효성가톨릭대학교 자연과학연구소 1997 基礎科學硏究論集 Vol.11 No.2
This paper surveys regression by feed-forward neural network approaches and compares them with kernel, spline, and projection pursuit methods. For the comparison study we use the famous French curve as the main example.