Abstract:
Owing to its ability to employ many different functions as kernels, the support vector machine has demonstrated remarkable versatility across numerous machine learning problems. Gegenbauer polynomials, like the Chebyshev and Legendre polynomials introduced in previous chapters, are among the most widely used orthogonal polynomials and have produced outstanding results in support vector machine methods. In this chapter, essential properties of Gegenbauer and fractional Gegenbauer functions are presented and reviewed; kernels based on these functions are then introduced and validated. Finally, the performance of these kernels is evaluated on two example datasets.