EXTENDED APPROXIMATION BY NEURAL NETWORKS
Hahm, Nahmwoo; Hong, Bum Il; Choi, Sung-Hee 한국정보처리학회 1998 한국정보처리학회 추계 학술발표논문집 Vol.4 No.1
In this paper, we prove that any continuous function on a bounded closed interval of $\mathbb{R}$ can be approximated by the superposition of a bounded sigmoidal function with a fixed weight. In addition, we show that any continuous function on $\mathbb{R}$ which vanishes at infinity can be approximated by the superposition of a bounded sigmoidal function with a weighted norm. Our proof is constructive.
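The constructive idea behind such superposition results can be sketched numerically: a steep sigmoid approximates a unit step, so a sum of difference-weighted, shifted sigmoids reproduces a step-function approximant of f. A minimal Python sketch, assuming an illustrative node count, steepness, and test function (none of these come from the paper):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoidal_superposition(f, a, b, n=50, steepness=200.0):
    # Approximate f on [a, b] by f(a) plus difference-weighted, shifted
    # steep sigmoids; each steep sigmoid mimics a unit step at a node.
    # The node count n and the steepness are illustrative assumptions.
    nodes = np.linspace(a, b, n + 1)
    values = f(nodes)
    def approx(x):
        out = np.full_like(np.asarray(x, dtype=float), values[0])
        for k in range(1, n + 1):
            out += (values[k] - values[k - 1]) * sigmoid(steepness * (x - nodes[k]))
        return out
    return approx

# Usage: approximate sin on [0, pi] and report the maximum grid error.
g = sigmoidal_superposition(np.sin, 0.0, np.pi)
xs = np.linspace(0.0, np.pi, 1000)
print(np.max(np.abs(g(xs) - np.sin(xs))))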
The Capability of Localized Neural Network Approximation
Hahm, Nahmwoo; Hong, Bum Il 호남수학회 2013 호남수학학술지 Vol.35 No.4
In this paper, we investigate a localized approximation of a continuously differentiable function by neural networks. To do this, we first approximate a continuously differentiable function by B-spline functions and then approximate B-spline functions by neural networks. Our proofs are constructive and we give numerical results to support our theory.
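A minimal sketch of the first stage of this two-stage scheme, fitting a cubic B-spline to samples of a smooth target; the target function, sample count, and spline degree are illustrative assumptions, and the second stage (replacing the spline by a network) is the paper's own construction:

import numpy as np
from scipy.interpolate import splrep, splev

# Stage one: represent a continuously differentiable target (here exp,
# purely illustrative) by a cubic B-spline interpolating 21 samples.
x = np.linspace(-1.0, 1.0, 21)
tck = splrep(x, np.exp(x), k=3)
grid = np.linspace(-1.0, 1.0, 500)
print(np.max(np.abs(splev(grid, tck) - np.exp(grid))))  # spline-stage error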
Hahm, Nahmwoo; Hong, Bum Il 호남수학회 2006 호남수학학술지 Vol.28 No.1
We investigate the approximation order to a function in $L_p[-1,1]$ for $0 \leq p < \infty$ by generalized translation networks. In most papers related to neural network approximation, sigmoidal functions are adopted as the activation function. In our research, we choose an infinitely many times continuously differentiable function as the activation function. Using the integral modulus of continuity and the divided difference formula, we get the approximation order to a function in $L_p[-1,1]$.
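For readers unfamiliar with the model, a generalized translation network on $\mathbb{R}$ is commonly written in the following form (one standard form from the literature; the paper's precise definition may differ):

$$N(x) = \sum_{k=1}^{n} c_k\,\phi(\lambda_k x - \theta_k), \qquad c_k,\ \lambda_k,\ \theta_k \in \mathbb{R},$$

which reduces to a classical single-hidden-layer network when $\phi$ is sigmoidal.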
THE CAPABILITY OF PERIODIC NEURAL NETWORK APPROXIMATION
Hahm, Nahmwoo; Hong, Bum Il The Kangwon-Kyungki Mathematical Society 2010 한국수학논문집 Vol.18 No.2
In this paper, we investigate the possibility of $2\pi$-periodic continuous function approximation by periodic neural networks. Using the Riemann sum and the quadrature formula, we show the capability of a periodic neural network approximation.
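The flavor of the Riemann-sum/quadrature argument can be illustrated with cosine and sine units, one natural choice of periodic neuron: coefficients computed by a Riemann sum yield a trigonometric partial sum. A sketch assuming an illustrative target, degree, and quadrature size (none taken from the paper):

import numpy as np

def fourier_partial_sum(f, degree=10, nquad=512):
    # Degree-`degree` trigonometric approximant of a 2*pi-periodic f,
    # with Fourier coefficients computed by a Riemann sum (quadrature).
    t = np.linspace(0.0, 2.0 * np.pi, nquad, endpoint=False)
    ft = f(t)
    a0 = ft.mean()
    a = [2.0 * np.mean(ft * np.cos(k * t)) for k in range(1, degree + 1)]
    b = [2.0 * np.mean(ft * np.sin(k * t)) for k in range(1, degree + 1)]
    def s(x):
        out = np.full_like(np.asarray(x, dtype=float), a0)
        for k in range(1, degree + 1):
            out += a[k - 1] * np.cos(k * x) + b[k - 1] * np.sin(k * x)
        return out
    return s

# Usage: approximate the 2*pi-periodic function |sin x|.
g = fourier_partial_sum(lambda x: np.abs(np.sin(x)))
xs = np.linspace(0.0, 2.0 * np.pi, 1000)
print(np.max(np.abs(g(xs) - np.abs(np.sin(xs)))))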
Simultaneous Approximation Algorithm Using a Feedforward Neural Network with a Single Hidden Layer
Hahm, Nahmwoo; Hong, Bum Il 한국물리학회 2009 THE JOURNAL OF THE KOREAN PHYSICAL SOCIETY Vol.54 No.6
Neural networks are a useful, flexible class of approximators that have attracted considerable attention and have been widely investigated. In this research, we obtain an improved simultaneous approximation algorithm using feedforward neural networks with a single hidden layer when the target trajectory satisfies a certain smoothness condition. We assume a smoothness condition because our algorithm simultaneously approximates the target trajectory and its derivatives by using the corresponding derivatives of the neural network. Our approach provides not only the errors but also the design of the hidden layer. Furthermore, the obtained algorithm shows that the speed of approximation by the neural networks depends on the smoothness of the target trajectory.
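In symbols, the simultaneous approximation property claimed here asks, for a target $f \in C^m[0,1]$ and any $\varepsilon > 0$, for a single network $N$ with

$$\max_{0 \le j \le m} \big\| f^{(j)} - N^{(j)} \big\|_{\infty} \le \varepsilon$$

(a restatement of the abstract's claim, not a quoted theorem).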
ON A STUDY OF ERROR BOUNDS OF TRAPEZOIDAL RULE
Hahm, Nahmwoo; Hong, Bum Il 호남수학회 2014 호남수학학술지 Vol.36 No.2
In this paper, through a direct computation with subintervals partitioning $[0,1]$, we compute better a posteriori bounds for the average case error of the difference between the true value of $I(f) = \int_0^1 f(x)\,dx$ with $f \in C^{(r)}[0,1]$ and the composite trapezoidal rule, and of the difference between the composite trapezoidal rule and the basic trapezoidal rule, for $r \geq 3$, by using a zero-mean Gaussian measure.
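The two differences the abstract bounds are easy to instantiate; the integrand, interval, and panel count below are illustrative assumptions:

import numpy as np

def basic_trapezoid(f, a, b):
    # One-panel trapezoidal rule on [a, b].
    return 0.5 * (b - a) * (f(a) + f(b))

def composite_trapezoid(f, a, b, n):
    # Composite trapezoidal rule with n equal subintervals.
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

f = np.exp
true_I = np.e - 1.0                           # I(f) = integral of e^x over [0, 1]
comp = composite_trapezoid(f, 0.0, 1.0, 64)
print(true_I - comp)                          # true value minus composite rule
print(comp - basic_trapezoid(f, 0.0, 1.0))    # composite minus basic rule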
DEGREE OF APPROXIMATION BY PERIODIC NEURAL NETWORKS
Hahm, Nahmwoo; Hong, Bum Il The Kangwon-Kyungki Mathematical Society 2014 한국수학논문집 Vol.22 No.2
We investigate an approximation order of a continuous $2\pi$-periodic function by periodic neural networks. By using the de la Vallée Poussin sum and the modulus of continuity, we obtain a degree of approximation by periodic neural networks.
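One standard normalization of the de la Vallée Poussin sum, written in terms of the Fourier partial sums $S_k f$ and Fejér means $\sigma_m f$ (the paper may use an equivalent variant):

$$V_n f = \frac{1}{n} \sum_{k=n}^{2n-1} S_k f = 2\,\sigma_{2n-1} f - \sigma_{n-1} f.$$

$V_n$ reproduces trigonometric polynomials of degree at most $n$ and its operator norm is at most 3, which yields the near-best estimate $\|f - V_n f\|_\infty \le 4\,E_n(f)$.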
The simultaneous approximation order by neural networks with a squashing function
Hahm, Nahmwoo 대한수학회 2009 대한수학회보 Vol.46 No.4
In this paper, we study the simultaneous approximation to functions in $C^m[0,1]$ by neural networks with a squashing function and the complexity related to the simultaneous approximation, using a Bernstein polynomial and the modulus of continuity. Our proofs are constructive.
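A short sketch of the Bernstein-polynomial mechanism behind simultaneous approximation: $B_n f$ converges to $f$ and $(B_n f)'$ converges to $f'$. The target function and degree below are illustrative assumptions:

import numpy as np
from math import comb

def bernstein(f, n):
    # Degree-n Bernstein polynomial of f on [0, 1] and its derivative;
    # the derivative approximates f', the key fact behind simultaneous
    # approximation arguments.
    c = [f(k / n) for k in range(n + 1)]
    def B(x):
        x = np.asarray(x, dtype=float)
        return sum(c[k] * comb(n, k) * x**k * (1 - x)**(n - k)
                   for k in range(n + 1))
    def dB(x):
        x = np.asarray(x, dtype=float)
        return n * sum((c[k + 1] - c[k]) * comb(n - 1, k)
                       * x**k * (1 - x)**(n - 1 - k) for k in range(n))
    return B, dB

# Usage: approximate exp and its derivative (also exp) on [0, 1].
B, dB = bernstein(np.exp, 40)
xs = np.linspace(0.0, 1.0, 400)
print(np.max(np.abs(B(xs) - np.exp(xs))))   # error in the function
print(np.max(np.abs(dB(xs) - np.exp(xs))))  # error in the derivative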
DEGREE OF APPROXIMATION TO A SMOOTH FUNCTION BY GENERALIZED TRANSLATION NETWORKS
Hahm, Nahmwoo; Yang, Meehyea; Hong, Bum Il The Honam Mathematical Society 2005 호남수학학술지 Vol.27 No.2
We obtain the approximation order to a smooth function on a compact subset of $\mathbb{R}$ by generalized translation networks. In our study, the activation function is an infinitely many times continuously differentiable function, but it does not have special properties around $\infty$ and $-\infty$ like a sigmoidal activation function. Using Jackson's Theorem, we get the approximation order. In particular, we obtain the approximation order by a neural network with a fixed threshold.
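The form of Jackson's Theorem typically invoked in such arguments, stated for continuous $f$ on $[-1,1]$ with $E_n(f)$ the error of best approximation by algebraic polynomials of degree at most $n$ (a textbook statement, not quoted from the paper):

$$E_n(f) \le C\,\omega\!\left(f, \tfrac{1}{n}\right),$$

and for $f \in C^r[-1,1]$, iterating gives $E_n(f) \le C_r\, n^{-r}\,\omega\!\left(f^{(r)}, \tfrac{1}{n}\right)$.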