Exponential sampling type neural network Kantorovich operators based on Hadamard fractional integral.

Bibliographic Details
Authors: Agrawal, Purshottam N.1 (AUTHOR) pnappfma@gmail.com, Baxhaku, Behar2 (AUTHOR) behar.baxhaku@uni-pr.edu
Source: Fractional Calculus & Applied Analysis. Aug 2025, Vol. 28, Issue 4, p1887-1922. 36p.
Subject Terms: *ARTIFICIAL neural networks, *FRACTIONAL calculus, *FRACTIONAL integrals, *MATHEMATICAL optimization, *INTERPOLATION, *ASYMPTOTIC analysis, *IMAGE enhancement (Imaging systems)
Abstract: This study introduces a novel family of exponential sampling type neural network Kantorovich operators, leveraging Hadamard fractional integrals to enhance function approximation capabilities. By incorporating a flexible parameter α, derived from the Hadamard fractional integral, and employing exponential sampling, which is suited to exponentially spaced data, the proposed operators address key limitations of existing methods and yield substantial improvements in approximation accuracy. We establish fundamental convergence theorems for continuous functions and demonstrate effectiveness in the spaces of p-th power Lebesgue-integrable functions. Degrees of approximation are quantified using the logarithmic modulus of continuity, asymptotic expansions, and Peetre's K-functional for r-times continuously differentiable functions. A Voronovskaja-type theorem confirms higher-order convergence via linear combinations of the operators. Extensions to the multivariate case are proven to converge in L^p-spaces (1 ≤ p < ∞). MATLAB algorithms and illustrative examples validate the theoretical findings, confirming convergence, computational efficiency, and operator consistency. We analyze the impact of various sigmoidal activation functions on approximation errors, presented via tables and graphs for the one- and two-dimensional cases. To demonstrate practical utility, we apply the operators to image scaling, focusing on the "Butterfly" dataset. With fractional parameter α = 2, our operators, activated by a parametric sigmoid function, consistently outperform standard interpolation methods. Significant improvements in the Structural Similarity Index Measure (SSIM) and Peak Signal-to-Noise Ratio (PSNR) are observed at m = 128, highlighting the operators' efficacy in preserving image quality during upscaling. These results, combining theoretical rigor, computational validation, and practical application to image scaling, demonstrate the performance advantages of the proposed operators. By integrating fractional calculus and neural network theory, this work advances constructive approximation and image processing. [ABSTRACT FROM AUTHOR]
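The abstract's core construction — an exponential-sampling Kantorovich-type operator driven by a sigmoidal activation — can be illustrated with a minimal sketch. The kernel φ_σ(x) = ½(σ(x+1) − σ(x−1)) built from the logistic sigmoid, the truncation width `K`, and the plain (non-fractional) Kantorovich averaging below are all assumptions for illustration; the paper's actual operators additionally involve the Hadamard fractional integral with parameter α, which is omitted here.

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid, used here as the activation function."""
    return 1.0 / (1.0 + math.exp(-x))

def phi(x: float) -> float:
    """Density kernel built from the sigmoid.

    For the logistic sigmoid, sum_k phi(u - k) = 1 for every real u
    (a partition of unity), which is what makes the operator reproduce
    constants exactly.
    """
    return 0.5 * (sigmoid(x + 1.0) - sigmoid(x - 1.0))

def kantorovich_exp(f, x: float, w: float, K: int = 60, n_quad: int = 20) -> float:
    """Exponential-sampling Kantorovich-type operator (illustrative sketch).

    Approximates f at x > 0 via
        (S_w f)(x) = sum_k phi(w*log(x) - k) * w * integral_{k/w}^{(k+1)/w} f(e^u) du,
    truncating the series to |k - w*log(x)| <= K and evaluating each
    mean value by midpoint quadrature with n_quad nodes.
    """
    base = w * math.log(x)
    total = 0.0
    for k in range(int(base) - K, int(base) + K + 1):
        weight = phi(base - k)
        if weight < 1e-12:          # kernel decays exponentially; skip tiny terms
            continue
        # Mean of f(e^u) over [k/w, (k+1)/w] by midpoint quadrature.
        s = 0.0
        for j in range(n_quad):
            u = (k + (j + 0.5) / n_quad) / w
            s += f(math.exp(u))
        total += weight * s / n_quad
    return total
```

Because the Kantorovich form averages over the interval [k/w, (k+1)/w] rather than sampling at k/w, the operator carries a shift of order 1/(2w) for non-constant functions, which vanishes as w grows — consistent with the convergence the abstract reports.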
Database: Academic Search Index