A Memory Efficient Incremental Gradient Method for Regularized Minimization
Sangwoon Yun, Korean Mathematical Society, 2016, Bulletin of the Korean Mathematical Society Vol.53 No.2
In this paper, we propose a new incremental gradient method for solving a regularized minimization problem whose objective is the sum of $m$ smooth functions and a (possibly nonsmooth) convex function. The method uses an adaptive stepsize. Recently proposed incremental gradient methods for regularized minimization require $O(mn)$ storage, where $n$ is the number of variables, which is their main drawback. In contrast, the proposed incremental gradient method requires only $O(n)$ storage.
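To illustrate the memory profile the abstract describes, here is a minimal incremental proximal-gradient sketch in Python. It is not the paper's specific update: the adaptive stepsize is replaced by an illustrative diminishing rule, the regularizer is assumed to be $\ell_1$, and all function names are hypothetical. The point is that only the current iterate is kept ($O(n)$ memory), whereas aggregated methods store all $m$ component gradients ($O(mn)$).

```python
import numpy as np

def prox_l1(x, t):
    """Proximal operator of t * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def incremental_prox_grad(comp_grads, x0, lam, n_epochs=50, c=1.0):
    """Generic incremental proximal-gradient sketch for
        min_x  sum_{i=1}^m f_i(x) + lam * ||x||_1.
    Each component gradient is used as soon as it is computed and
    never stored, so memory stays O(n) in the iterate alone.
    """
    x = x0.copy()
    k = 0
    for _ in range(n_epochs):
        for grad_i in comp_grads:      # one pass over the m components
            k += 1
            step = c / np.sqrt(k)      # illustrative diminishing stepsize
            x = prox_l1(x - step * grad_i(x), step * lam)
    return x

# Toy usage: m least-squares components f_i(x) = 0.5 * (a_i @ x - b_i)**2
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grads = [lambda x, a=a, y=y: a * (a @ x - y) for a, y in zip(A, b)]
x_hat = incremental_prox_grad(grads, np.zeros(5), lam=0.1)
```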
A New Multiplicative Denoising Variational Model Based on $m$th Root Transformation
Sangwoon Yun, Hyenkyun Woo, IEEE, 2012, IEEE Transactions on Image Processing Vol.21 No.5
In coherent imaging systems, such as synthetic aperture radar (SAR), the observed images are contaminated by multiplicative noise. Due to the edge-preserving property of the total variation (TV), variational models with TV regularization have attracted much interest for removing multiplicative noise. However, the fidelity term of the variational model, based on maximum a posteriori estimation, is not convex, so it is usually difficult to find a global solution. Hence, the logarithmic function is commonly used to transform the nonconvex variational model into a convex one. In this paper, instead of the log, we exploit the $m$th root function to relax the nonconvexity of the variational model. An algorithm based on the augmented Lagrangian function, which has been applied to the log-transformed convex variational model, can also be applied to our proposed model. However, that algorithm requires solving a subproblem without a closed-form solution at each iteration. Hence, we adapt the linearized proximal alternating minimization algorithm, which does not require inner iterations for the subproblems. In addition, the proposed method is very simple and highly parallelizable, so it is efficient for removing multiplicative noise from huge SAR images. The proposed model shows overall better performance for multiplicative noise removal than the convex model based on the log transformation.
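To make the transformation concrete, the following is a hedged sketch assuming the Aubert-Aujol (AA) fidelity term for gamma-distributed multiplicative noise; the paper's exact formulation may differ in constants, the variable to which TV is applied, and the admissible domain of $v$.

```latex
% AA model for an observation f = u \cdot \eta (multiplicative noise);
% the fidelity term is nonconvex in the intensity u:
\min_{u>0}\; \mathrm{TV}(u) + \lambda \int_\Omega \Big(\log u + \frac{f}{u}\Big)\,dx
% Log transform w = \log u yields a convex model:
\min_{w}\; \mathrm{TV}(w) + \lambda \int_\Omega \big(w + f\,e^{-w}\big)\,dx
% mth root transform u = v^m (so \log u = m\log v and f/u = f\,v^{-m}):
\min_{v>0}\; \mathrm{TV}(v) + \lambda \int_\Omega \big(m\log v + f\,v^{-m}\big)\,dx
```

Under this reading, the $m$th root fidelity stays closer to the original intensity domain than the log transform while still taming the nonconvexity for suitable $m$.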