APPROXIMATION ORDER TO A FUNCTION IN $C^1$[0, 1] AND ITS DERIVATIVE BY A FEEDFORWARD NEURAL NETWORK
Hahm, Nahm-Woo,Hong, Bum-Il The Korean Society for Computational and Applied M 2009 Journal of applied mathematics & informatics Vol.27 No.1
We study the neural network approximation to a function in $C^1$[0, 1] and its derivative. In [3], we used even trigonometric polynomials in order to get an approximation order to a function in $L_p$ space. In this paper, we show the simultaneous approximation order to a function in $C^1$[0, 1] using a Bernstein polynomial and a feedforward neural network. Our proofs are constructive.
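The Bernstein construction mentioned in the abstract can be illustrated with a short sketch. This is a generic degree-$n$ Bernstein operator and the standard closed form of its derivative, not the paper's specific network construction; the helper names `bernstein` and `bernstein_derivative` and the sample function are illustrative assumptions.

```python
import math

def bernstein(f, n):
    """Degree-n Bernstein polynomial of f on [0, 1]:
    B_n(f)(x) = sum_{k=0}^{n} f(k/n) C(n,k) x^k (1-x)^(n-k)."""
    coeffs = [f(k / n) for k in range(n + 1)]
    def B(x):
        return sum(c * math.comb(n, k) * x**k * (1 - x)**(n - k)
                   for k, c in enumerate(coeffs))
    return B

def bernstein_derivative(f, n):
    """Exact derivative of B_n(f), written with forward differences:
    B_n(f)'(x) = n sum_{k=0}^{n-1} (f((k+1)/n) - f(k/n)) C(n-1,k) x^k (1-x)^(n-1-k)."""
    diffs = [f((k + 1) / n) - f(k / n) for k in range(n)]
    def dB(x):
        return n * sum(d * math.comb(n - 1, k) * x**k * (1 - x)**(n - 1 - k)
                       for k, d in enumerate(diffs))
    return dB

# Example: f(x) = sin(pi x); both B_n(f) and its derivative converge uniformly,
# which is the "simultaneous" approximation the abstract refers to.
f  = lambda x: math.sin(math.pi * x)
df = lambda x: math.pi * math.cos(math.pi * x)
B, dB = bernstein(f, 50), bernstein_derivative(f, 50)
err  = max(abs(B(x / 100)  - f(x / 100))  for x in range(101))
derr = max(abs(dB(x / 100) - df(x / 100)) for x in range(101))
```

For this smooth test function the uniform errors of both $B_n(f)$ and $B_n(f)'$ are small already at $n = 50$, consistent with simultaneous approximation of $f$ and $f'$.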
THE SIMULTANEOUS APPROXIMATION ORDER BY NEURAL NETWORKS WITH A SQUASHING FUNCTION
Hahm, Nahm-Woo Korean Mathematical Society 2009 대한수학회보 Vol.46 No.4
In this paper, we study the simultaneous approximation to functions in $C^m$[0, 1] by neural networks with a squashing function and the complexity related to the simultaneous approximation using a Bernstein polynomial and the modulus of continuity. Our proofs are constructive.
NOTES ON EXTENDED NEURAL NETWORK APPROXIMATION
Hahm, Nahm-Woo,Hong, Bum-Il,Choi, Sung-Hee 한국전산응용수학회 1998 Journal of applied mathematics & informatics Vol.5 No.3
In this paper, we prove that any continuous function on a bounded closed interval of $\mathbb{R}$ can be approximated by the superposition of a bounded sigmoidal function with a fixed weight. In addition, we show that any continuous function over $\mathbb{R}$ which vanishes at infinity can be approximated by the superposition of a bounded sigmoidal function with a weighted norm. Our proof is constructive.
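A minimal sketch of the fixed-weight idea on a bounded interval, assuming one steep fixed inner weight `w`: with `w` large, each shifted sigmoid behaves like a unit step, so telescoping jump heights reproduce the target up to the grid spacing. The function names and the choice `w = 500` are illustrative, not the paper's construction.

```python
import math

def sigmoid(t):
    """Numerically stable logistic function (one bounded sigmoidal function)."""
    if t >= 0:
        return 1.0 / (1.0 + math.exp(-t))
    z = math.exp(t)
    return z / (1.0 + z)

def sigmoidal_superposition(f, a, b, n, w=500.0):
    """Approximate continuous f on [a, b] by a superposition of shifted copies
    of one sigmoid, all sharing the single fixed weight w.  The coefficient of
    the sigmoid stepping at node t_k is the jump f(t_k) - f(t_{k-1})."""
    nodes = [a + (b - a) * k / n for k in range(n + 1)]
    def g(x):
        return f(a) + sum((f(nodes[k]) - f(nodes[k - 1])) * sigmoid(w * (x - nodes[k]))
                          for k in range(1, n + 1))
    return g

# Example: 50 shifted sigmoids approximate sin(pi x) on [0, 1].
f = lambda x: math.sin(math.pi * x)
g = sigmoidal_superposition(f, 0.0, 1.0, 50)
err = max(abs(g(x / 200) - f(x / 200)) for x in range(201))
```

The residual error comes from the grid spacing (modulus of continuity of $f$) plus the smearing of each sigmoid near its node; both shrink as $n$ and $w$ grow.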
CONSTRUCTIVE APPROXIMATION BY GAUSSIAN NEURAL NETWORKS
Hahm, Nahm-Woo,Hong, Bum-Il The Honam Mathematical Society 2012 호남수학학술지 Vol.34 No.3
In this paper, we discuss a constructive approximation by Gaussian neural networks. We show that it is possible to construct Gaussian neural networks with integer weights that approximate arbitrarily well for functions in $C_c(\mathbb{R}^s)$. We demonstrate numerical experiments to support our theoretical results.
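As a rough illustration of approximating a compactly supported function with grid-centered Gaussians, here is a Shepard-style (normalized) Gaussian quasi-interpolant. This is a simplification for the one-dimensional case, not the paper's integer-weight construction; the function name and width choice `s = 0.8 * h` are assumptions.

```python
import math

def gaussian_quasi_interp(f, a, b, n, s=None):
    """Approximate a compactly supported continuous f by a normalized sum of
    Gaussians centered on a uniform grid over [a, b] (Shepard weighting)."""
    h = (b - a) / n
    s = s if s is not None else 0.8 * h   # Gaussian width tied to grid spacing
    centers = [a + k * h for k in range(n + 1)]
    def g(x):
        ws = [math.exp(-((x - c) / s) ** 2) for c in centers]
        total = sum(ws)                   # > 0 for x in [a, b]
        return sum(w * f(c) for w, c in zip(ws, centers)) / total
    return g

# Example: a hat function (compact support in [-1, 1]) on the window [-2, 2].
f = lambda x: max(0.0, 1.0 - abs(x))
g = gaussian_quasi_interp(f, -2.0, 2.0, 80)
err = max(abs(g(-2.0 + j / 40) - f(-2.0 + j / 40)) for j in range(161))
```

The normalized weights make the approximant exact on locally linear pieces, so the error concentrates near the kinks of the hat function and shrinks with the grid spacing.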
Hahm, Nahm-Woo,Hong, Bum-Il Korean Mathematical Society 2001 대한수학회보 Vol.38 No.2
We improve the greedy algorithm, which is one of the general convergence criteria for certain iterative sequences in a given space, by building a constructive greedy algorithm on a normed linear space using an arithmetic average of elements. We also show that the degree of approximation order is still $O(1/\sqrt{n})$ by a bounded linear functional defined on a bounded subset of a normed linear space, which offers a good approximation method for neural networks.
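The arithmetic-average idea can be sketched concretely. The following is a generic relaxed (averaged) greedy iteration of Jones-Barron type in a finite-dimensional Euclidean space, not the paper's normed-space construction: $f_n = ((n-1)f_{n-1} + g_n)/n$, with $g_n$ chosen from the dictionary to minimize the new error. For a target in the convex hull of a bounded dictionary, the classical Maurey argument gives the $O(1/\sqrt{n})$ rate quoted in the abstract. All names and parameters below are illustrative.

```python
import math, random

def relaxed_greedy(target, dictionary, steps):
    """Averaged greedy: f_n = ((n-1) f_{n-1} + g_n) / n, with g_n the
    dictionary element giving the smallest new squared error."""
    dim = len(target)
    f = [0.0] * dim
    errors = []
    for n in range(1, steps + 1):
        def new_sq_err(g):
            cand = [((n - 1) * x + gi) / n for x, gi in zip(f, g)]
            return sum((t - c) ** 2 for t, c in zip(target, cand))
        g = min(dictionary, key=new_sq_err)
        f = [((n - 1) * x + gi) / n for x, gi in zip(f, g)]
        errors.append(math.sqrt(sum((t - x) ** 2 for t, x in zip(target, f))))
    return f, errors

random.seed(0)
dim = 20
dictionary = [[random.choice([-1.0, 1.0]) for _ in range(dim)] for _ in range(200)]
# Target: a convex combination of dictionary elements, so the rate bound applies.
weights = [random.random() for _ in dictionary]
tot = sum(weights)
target = [sum(w * v[i] for w, v in zip(weights, dictionary)) / tot for i in range(dim)]
f, errors = relaxed_greedy(target, dictionary, 300)
```

Since the error-minimizing choice is at least as good as a random draw from the defining convex combination, the error after $n$ steps is bounded by $B/\sqrt{n}$, where $B$ bounds the distance from the target to the dictionary elements.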
A SIMULTANEOUS NEURAL NETWORK APPROXIMATION WITH THE SQUASHING FUNCTION
Hahm, Nahm-Woo,Hong, Bum-Il The Honam Mathematical Society 2009 호남수학학술지 Vol.31 No.2
In this paper, we actually construct the simultaneous approximation by neural networks to a differentiable function. To do this, we first construct a polynomial approximation using the Fejer sum and then a simultaneous neural network approximation with the squashing activation function. We also give numerical results to support our theory.
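The first stage of this construction, the Fejér (Cesàro) mean of the Fourier series, can be sketched directly from its definition $\sigma_n(f)(x) = \sum_{|k| \le n} \bigl(1 - \tfrac{|k|}{n+1}\bigr) c_k e^{ikx}$; the second stage (converting the trigonometric polynomial into a squashing-function network) is omitted here. The function name and the Riemann-sum coefficient estimate are illustrative assumptions.

```python
import math

def fejer_mean(f, n, m=512):
    """Fejer mean sigma_n(f) of a 2*pi-periodic f, with Fourier coefficients
    c_k estimated by a uniform Riemann sum over m sample points."""
    xs = [2 * math.pi * j / m for j in range(m)]
    fs = [f(x) for x in xs]
    def c(k):
        re = sum(fv * math.cos(k * x) for fv, x in zip(fs, xs)) / m
        im = -sum(fv * math.sin(k * x) for fv, x in zip(fs, xs)) / m
        return re, im
    coeffs = {k: c(k) for k in range(-n, n + 1)}
    def sigma(x):
        total = 0.0
        for k, (re, im) in coeffs.items():
            w = 1 - abs(k) / (n + 1)          # Fejer (Cesaro) weight
            total += w * (re * math.cos(k * x) - im * math.sin(k * x))
        return total
    return sigma

# Example: for f = cos, sigma_8(f) is (1 - 1/9) cos x = (8/9) cos x,
# since only the k = +/-1 modes survive.
sigma = fejer_mean(math.cos, 8)
```

Unlike raw Fourier partial sums, the Fejér mean converges uniformly for every continuous periodic function, which is what makes it a convenient intermediate polynomial for the network construction.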