UNIVARIATE LEFT FRACTIONAL POLYNOMIAL HIGH ORDER MONOTONE APPROXIMATION
George A. Anastassiou Korean Mathematical Society 2015 Bulletin of the Korean Mathematical Society Vol.52 No.2
Let f ∈ C^r([−1, 1]), r ≥ 0, and let L* be a linear left fractional differential operator such that L*(f) ≥ 0 throughout [0, 1]. We can find a sequence of polynomials Q_n of degree ≤ n such that L*(Q_n) ≥ 0 over [0, 1]; furthermore, f is approximated left fractionally and simultaneously by Q_n on [−1, 1]. The degree of these restricted approximations is given via inequalities using a higher order modulus of smoothness for f^(r).
George A. Anastassiou Korean Society for Computational and Applied Mathematics 2023 Journal of Applied and Pure Mathematics Vol.5 No.1
Here we give multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or $\mathbb{R}^{N}$, $N \in \mathbb{N}$, by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat the case of approximation by iterated operators of the last four types. These approximations are derived by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high order Fréchet derivatives. Our multivariate operators are defined by using a multidimensional density function induced by a parametrized Gudermannian sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer.
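The operators in the abstract above are built from a density function induced by a sigmoid. The following Python sketch illustrates the univariate analogue under assumed normalizations (the exact `sigma`, `psi`, and operator definitions below are illustrative choices, not the paper's): a normalized Gudermannian sigmoid, its induced density, and the normalized quasi-interpolation operator, whose error shrinks as n grows.

```python
import numpy as np

def sigma(x):
    # Gudermannian function gd(x) = 2*arctan(tanh(x/2)), scaled into
    # (-1, 1). The scaling is an assumed normalization for illustration.
    return (4.0 / np.pi) * np.arctan(np.tanh(x / 2.0))

def psi(x):
    # Density induced by the sigmoid; since sigma is increasing with
    # limits -1 and +1, psi is nonnegative and integrates to 1 over R.
    return 0.25 * (sigma(x + 1.0) - sigma(x - 1.0))

def quasi_interpolation(f, x, n, a=-1.0, b=1.0):
    # Univariate normalized quasi-interpolation operator
    #   A_n(f)(x) = sum_k f(k/n) psi(n*x - k) / sum_k psi(n*x - k),
    # where k ranges over the integers with a <= k/n <= b.
    k = np.arange(np.ceil(n * a), np.floor(n * b) + 1)
    w = psi(n * x - k)
    return np.dot(w, f(k / n)) / np.sum(w)

f = np.sin
xs = np.linspace(-0.8, 0.8, 41)   # stay away from the interval endpoints
err = lambda n: max(abs(quasi_interpolation(f, x, n) - f(x)) for x in xs)
print(err(10), err(50))           # the uniform error shrinks as n grows
```

The multivariate operators of the paper use a product of such one-dimensional densities, one per coordinate; the one-dimensional case already shows the Jackson-type behavior in n.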
George A. Anastassiou Korean Society for Computational and Applied Mathematics 2021 Journal of Applied and Pure Mathematics Vol.3 No.5
Here we extended our earlier high order simultaneous fractional polynomial spline monotone approximation theory to abstract high order simultaneous fractional polynomial spline monotone approximation, with applications to Prabhakar fractional calculus and non-singular kernel fractional calculi. We cover both the left and right sides of this constrained approximation. Let $f \in C^{s}([-1,1])$, $s \in \mathbb{N}$, and let $L^{*}$ be a linear left or right side fractional differential operator such that $L^{*}(f) \geq 0$ over $[0,1]$ or $[-1,0]$, respectively. Then there exists a sequence $Q_{n}$, $n \in \mathbb{N}$, of polynomial splines with equally spaced knots of given fixed order such that $L^{*}(Q_{n}) \geq 0$ on $[0,1]$ or $[-1,0]$, respectively. Furthermore, f is approximated with rates fractionally and simultaneously by $Q_{n}$ in the uniform norm. This constrained fractional approximation on $[-1,1]$ is given via inequalities involving a higher modulus of smoothness of $f^{(s)}$.
Generalized symmetrical sigmoid function activated neural network multivariate approximation
George A. Anastassiou Korean Society for Computational and Applied Mathematics 2022 Journal of Applied and Pure Mathematics Vol.4 No.3
Here we exhibit multivariate quantitative approximations of Banach space valued continuous multivariate functions on a box or $\mathbb{R}^{N}$, $N \in \mathbb{N}$, by the multivariate normalized, quasi-interpolation, Kantorovich type and quadrature type neural network operators. We also treat the case of approximation by iterated operators of the last four types. These approximations are achieved by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high order Fréchet derivatives. Our multivariate operators are defined by using a multidimensional density function induced by the generalized symmetrical sigmoid function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer.
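The construction hinges on the induced density being nonnegative and having total mass 1. A minimal numerical check, assuming the common form $x/(1+|x|^{\lambda})^{1/\lambda}$, $\lambda > 0$, for the generalized symmetrical sigmoid (an assumption; the paper's normalization may differ):

```python
import numpy as np

def gen_sym_sigmoid(x, lam=3.0):
    # Assumed form of the generalized symmetrical sigmoid: odd,
    # strictly increasing, with limits -1 and +1 as x -> -+infinity.
    return x / (1.0 + np.abs(x) ** lam) ** (1.0 / lam)

def density(x, lam=3.0):
    # For any odd sigmoid with limits -1 and +1, this difference is
    # nonnegative and integrates to 1 over the real line.
    return 0.25 * (gen_sym_sigmoid(x + 1.0, lam) - gen_sym_sigmoid(x - 1.0, lam))

xs = np.arange(-200.0, 200.0, 0.01)
mass = density(xs).sum() * 0.01      # Riemann sum for the total mass
print(mass)                          # numerically close to 1
```

The same identity holds for every sigmoid in this family, which is why the authors can swap activation functions (Gudermannian, algebraic, arctangent, ...) while reusing one approximation framework.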
Vectorial Hilfer-Prabhakar-Hardy type fractional inequalities
George A. Anastassiou Korean Society for Computational and Applied Mathematics 2023 Journal of Applied and Pure Mathematics Vol.5 No.5
We present a variety of univariate and multivariate left and right side Hardy type fractional inequalities, many of them under convexity assumptions, and others of $L_{p}$ type, $p \geq 1$, in the setting of generalized Hilfer and Prabhakar fractional calculi.
George A. Anastassiou Korean Society for Computational and Applied Mathematics 2022 Journal of Applied and Pure Mathematics Vol.4 No.5
In this article we exhibit univariate and multivariate quantitative approximation by Kantorovich-Choquet type quasi-interpolation neural network operators with respect to the supremum norm. This is done with rates using the first univariate and multivariate moduli of continuity. We approximate continuous and bounded functions on $\mathbb{R}^{N}$, $N \in \mathbb{N}$. When they are also uniformly continuous we obtain pointwise and uniform convergence. Our activation functions are induced by the arctangent, algebraic, Gudermannian and generalized symmetrical sigmoid functions.
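The four activation families named above share the structural properties the theory needs: each is odd, strictly increasing, and saturates at ±1. A short Python check, under assumed normalizations (the exact scalings and parameter choices below are illustrative, not the paper's):

```python
import numpy as np

# Candidate activation functions; each is odd and increasing with
# limits -1 and +1 (scalings are assumptions for illustration).
sigmoids = {
    "arctangent":   lambda x: (2.0 / np.pi) * np.arctan(x),
    "algebraic":    lambda x: x / np.sqrt(1.0 + x * x),
    "gudermannian": lambda x: (4.0 / np.pi) * np.arctan(np.tanh(x / 2.0)),
    "generalized":  lambda x: x / (1.0 + np.abs(x) ** 4) ** 0.25,
}

grid = np.linspace(-5.0, 5.0, 1001)
for name, s in sigmoids.items():
    assert np.all(np.diff(s(grid)) > 0)       # strictly increasing
    assert abs(s(1e6) - 1.0) < 1e-3           # right limit is +1
    assert abs(s(-1e6) + 1.0) < 1e-3          # left limit is -1
    print(name, "ok")
```

Any of these can then induce a density as in the companion papers, which is what lets one quantitative framework cover all four operator families at once.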
Trigonometric generated rate of convergence of smooth Picard singular integral operators
George A. Anastassiou Korean Society for Computational and Applied Mathematics 2023 Journal of Applied and Pure Mathematics Vol.5 No.5
In this article we continue the study of smooth Picard singular integral operators begun in \cite{2}, Chapters 10-14. This time the foundation of our research is a trigonometric Taylor's formula. We establish the convergence of our operators to the unit operator with rates via Jackson type inequalities involving the first modulus of continuity. Of particular interest here is a residual term that appears. Note that our operators are not positive. Our results are pointwise and uniform.
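For orientation, the classical (unsmoothed) Picard singular integral already converges to the unit operator as its parameter shrinks. The sketch below only illustrates this base operator; the article's smooth variants and the trigonometric Taylor machinery are not reproduced here.

```python
import numpy as np

def picard(f, x, xi, T=40.0, m=80001):
    # Classical Picard singular integral operator,
    #   P_xi(f)(x) = (1/(2*xi)) * int f(x+t) * exp(-|t|/xi) dt,
    # approximated by a Riemann sum on [-T, T] (the kernel's tails
    # beyond T are negligible for the xi used below).
    t, dt = np.linspace(-T, T, m, retstep=True)
    w = np.exp(-np.abs(t) / xi) / (2.0 * xi)
    return np.sum(f(x + t) * w) * dt

x = 0.7
for xi in (0.5, 0.1):
    print(xi, abs(picard(np.sin, x, xi) - np.sin(x)))
# For f = sin a direct computation gives P_xi(sin)(x) = sin(x)/(1 + xi^2),
# so the error behaves like xi^2 as xi -> 0.
```

Since the kernel is positive, this base operator is a positive operator; the paper's point is that its smoothed combinations are not, which is why the convergence analysis needs the residual term mentioned above.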