Listing 1 - 7 of 7

Book
Kernel Determination Problems in Hyperbolic Integro-Differential Equations
Authors: ---
ISBN: 9819922607 9819922593 Year: 2023 Publisher: Singapore : Springer Nature Singapore : Imprint: Springer

Abstract

This book studies construction methods for solving one-dimensional and multidimensional inverse dynamical problems for hyperbolic equations with memory, and obtains theorems on the uniqueness, stability, and existence of solutions of these inverse problems. Using generalized solutions, it discusses the propagation of elastic or electromagnetic waves arising from sources of the pulsed directional "impact" or "explosion" type. It presents new results on the local and global solvability of kernel determination problems in a half-space, and describes the reconstruction of the coefficients of differential equations and of the convolution kernel of hyperbolic integro-differential equations by the Dirichlet-to-Neumann method. The book will be useful for researchers and students specializing in inverse problems of mathematical physics.
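The forward model behind such kernel determination problems can be illustrated with a minimal finite-difference sketch. The equation, kernel, and discretization below are illustrative assumptions, not the book's setup: an explicit scheme for a 1-D wave equation with a memory (convolution) term, using a rectangle rule for the time convolution.

```python
import math

# Forward-solver sketch for a 1-D hyperbolic equation with memory,
#   u_tt = u_xx + (k * u)(x, t),  (k * u)(x, t) = integral_0^t k(s) u(x, t - s) ds,
# discretized with explicit finite differences and a rectangle rule for
# the convolution. The kernel k below is a hypothetical example; the book
# treats the inverse problem of recovering k from boundary measurements.

def step_wave_with_memory(nx=50, nt=200, L=1.0, T=0.5):
    dx = L / nx
    dt = T / nt                         # dt/dx = 0.125 keeps the scheme stable
    k = lambda s: math.exp(-s)          # assumed memory kernel (illustrative)
    u = [[0.0] * (nx + 1) for _ in range(nt + 1)]
    # initial displacement: a smooth bump, zero initial velocity
    for i in range(nx + 1):
        u[0][i] = math.sin(math.pi * i * dx)
        u[1][i] = u[0][i]
    for n in range(1, nt):
        for i in range(1, nx):
            lap = (u[n][i + 1] - 2 * u[n][i] + u[n][i - 1]) / dx ** 2
            # rectangle-rule approximation of the convolution integral
            conv = dt * sum(k(m * dt) * u[n - m][i] for m in range(n + 1))
            u[n + 1][i] = 2 * u[n][i] - u[n - 1][i] + dt ** 2 * (lap + conv)
    return u

u = step_wave_with_memory()
```

In the inverse problem the roles are reversed: `u` is (partially) observed, and `k` is the unknown to be determined.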


Book
Statistical Inference Based on Kernel Distribution Function Estimators
Authors: ---
ISBN: 9819918618 9819918626 Year: 2023 Publisher: Singapore : Springer Nature Singapore : Imprint: Springer

Abstract

This book presents a study of statistical inference based on kernel-type estimators of distribution functions. The inference covers topics such as quantile estimation, nonparametric tests, and mean residual life expectation, to name a few. Convergence rates for kernel estimators of density functions are slower than those of ordinary parametric estimators, which are root-n consistent. If an appropriate kernel function is used, kernel estimators of distribution functions recover root-n consistency, and inferences based on them are likewise root-n consistent. Further, kernel-type estimators produce smooth estimates. Estimators based on the empirical distribution function have a discrete distribution, and their normal approximation cannot be improved; that is, the validity of the Edgeworth expansion cannot be proved. If the support of the population density function is bounded, a boundary problem arises: the estimator is not consistent near the boundary. The book also studies the mean squared errors of the estimators and the Edgeworth expansion for quantile estimators.
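The kernel distribution function estimator can be sketched in a few lines. With a Gaussian kernel, the integrated kernel is the standard normal CDF, so F_hat(x) = (1/n) * sum_i Phi((x - X_i) / h). The sample and bandwidth below are illustrative choices, not the book's:

```python
import math

# Kernel distribution function estimator: a smooth alternative to the
# step-function empirical CDF.

def phi(t):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def kernel_cdf(sample, x, h):
    # F_hat(x) = (1/n) * sum_i Phi((x - X_i) / h)
    return sum(phi((x - xi) / h) for xi in sample) / len(sample)

def empirical_cdf(sample, x):
    return sum(1 for xi in sample if xi <= x) / len(sample)

sample = [0.2, 0.5, 0.9, 1.3, 1.8, 2.1, 2.4, 3.0]
h = 0.4  # rough illustrative bandwidth, not a data-driven choice
# the kernel estimate is smooth and tracks the step-function ECDF
print(kernel_cdf(sample, 1.5, h), empirical_cdf(sample, 1.5))
```

The smoothing is what restores the validity of the Edgeworth expansion discussed in the abstract; the ECDF, being discrete, does not admit it.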


Book
Kernel Methods for Machine Learning with Math and R
Authors: ---
ISBN: 9789811903984 Year: 2022 Publisher: Singapore : Springer Nature Singapore : Imprint: Springer


Multi
Heat Kernel on Lie Groups and Maximally Symmetric Spaces
Authors: ---
ISBN: 9783031274510 9783031274503 9783031274527 Year: 2023 Publisher: Cham : Springer Nature Switzerland : Imprint: Birkhäuser

Abstract

This monograph studies the heat kernel for the spin-tensor Laplacians on Lie groups and maximally symmetric spaces. It introduces many original ideas, methods, and tools developed by the author, and it lists and derives, in explicit form, all known exact results for the heat kernel on spheres and hyperbolic spaces. Part I considers the geometry of simple Lie groups and maximally symmetric spaces in detail, and Part II discusses the calculation of the heat kernel for scalar, spinor, and generic Laplacians on spheres and hyperbolic spaces in various dimensions. This text will be a valuable resource for researchers and graduate students working in various areas of mathematics (such as global analysis, spectral geometry, stochastic processes, and financial mathematics) as well as in areas of mathematical and theoretical physics, including quantum field theory, quantum gravity, string theory, and statistical physics.
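The simplest maximally symmetric space is flat R^d, whose heat kernel is known in closed form: K_t(x, y) = (4*pi*t)^(-d/2) * exp(-|x - y|^2 / (4t)). The sketch below (an illustration, not taken from the book) evaluates this formula and checks numerically that the kernel integrates to one, a basic property shared by the curved-space kernels the book derives.

```python
import math

# Euclidean heat kernel: K_t(x, y) = (4*pi*t)^(-d/2) * exp(-|x-y|^2 / (4t)).

def heat_kernel_flat(x, y, t, d=1):
    r2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return (4 * math.pi * t) ** (-d / 2) * math.exp(-r2 / (4 * t))

# crude Riemann-sum check of normalization in one dimension
t = 0.3
grid = [-10 + 0.01 * i for i in range(2001)]
total = sum(0.01 * heat_kernel_flat([x], [0.0], t) for x in grid)
print(total)  # ≈ 1.0
```

On spheres and hyperbolic spaces the analogous closed forms involve the curvature radius; the flat kernel is recovered in the zero-curvature limit.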


Multi
Kernel Methods for Machine Learning with Math and Python
Authors: ---
ISBN: 9789811904011 9789811904004 9789811904028 Year: 2022 Publisher: Singapore : Springer Nature Singapore : Imprint: Springer

Abstract

The most crucial ability for machine learning and data science is the mathematical reasoning needed to grasp their essence, rather than reliance on knowledge or experience. This textbook addresses the fundamentals of kernel methods for machine learning by working through relevant math problems and building Python programs. The book's main features are as follows. The content is written in an easy-to-follow and self-contained style. The book includes 100 carefully selected and refined exercises; since their solutions are provided in the main text, readers can solve all of them while reading. The mathematical premises of kernels are proved and correct conclusions are drawn, helping readers understand the nature of kernels. Source programs and running examples are presented to deepen readers' understanding of the mathematics used. Once readers have a basic understanding of the functional analysis covered in Chapter 2, the applications are discussed in the subsequent chapters; no further prior knowledge of mathematics is assumed. This book considers both the kernel of a reproducing kernel Hilbert space (RKHS) and the kernel of a Gaussian process, and makes a clear distinction between the two.
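The central object in both the RKHS and the Gaussian-process view is a positive-definite kernel. A minimal sketch (illustrative, not from the book) forms the Gram matrix of the Gaussian (RBF) kernel and checks its two defining properties, symmetry and non-negativity of quadratic forms:

```python
import math

# A positive-definite kernel k defines an RKHS in which k(x, .) acts as
# a feature map. Here we build the Gram matrix of the Gaussian (RBF)
# kernel on a small sample and verify symmetry and positive
# semidefiniteness (via an arbitrary quadratic form).

def rbf(x, z, sigma=1.0):
    return math.exp(-((x - z) ** 2) / (2 * sigma ** 2))

xs = [0.0, 0.7, 1.5, 2.2]
K = [[rbf(a, b) for b in xs] for a in xs]

# symmetry: K[i][j] == K[j][i]
assert all(abs(K[i][j] - K[j][i]) < 1e-12 for i in range(4) for j in range(4))

# quadratic form c^T K c >= 0 for an arbitrary coefficient vector c
c = [1.0, -2.0, 0.5, 1.5]
q = sum(c[i] * K[i][j] * c[j] for i in range(4) for j in range(4))
print(q >= 0)  # True for a positive-definite kernel
```

In the GP view the same matrix K is a covariance matrix; in the RKHS view it is the Gram matrix of the implicit feature embeddings, which is what the book's distinction between the two kernels is about.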

Advances in large margin classifiers
Author:
ISBN: 0262194481 0262283972 1423729544 9780262283977 9780262194488 0262292408 9781423729549 9780262292405 Year: 2000 Publisher: Cambridge, Mass. MIT Press

Abstract

The concept of large margins is a unifying principle for the analysis of many different approaches to the classification of data from examples, including boosting, mathematical programming, neural networks, and support vector machines. The fact that what matters is the margin, or confidence level, of a classification (that is, a scale parameter) rather than the raw training error has become a key tool for dealing with classifiers. This book shows how this idea applies to both the theoretical analysis and the design of algorithms. The book provides an overview of recent developments in large margin classifiers, examines connections with other methods (e.g., Bayesian inference), and identifies the strengths and weaknesses of the method, as well as directions for future research. Among the contributors are Manfred Opper, Vladimir Vapnik, and Grace Wahba.
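The margin notion the book builds on can be stated concretely for a linear classifier f(x) = w·x + b: the geometric margin of a labeled point (x, y), y in {-1, +1}, is y(w·x + b)/||w||, the signed distance to the decision boundary, and large-margin methods maximize the smallest such distance over the training set. The weights and toy points below are illustrative assumptions:

```python
import math

# Geometric margin of a linear classifier f(x) = w.x + b on a labeled
# point (x, y): y * (w.x + b) / ||w||, i.e. the signed distance from x
# to the hyperplane w.x + b = 0.

def margin(w, b, x, y):
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    norm = math.sqrt(sum(wi * wi for wi in w))
    return y * score / norm

w, b = [1.0, 1.0], -1.0   # hypothetical classifier
data = [([2.0, 2.0], +1), ([0.0, 0.0], -1), ([1.5, 0.0], +1)]

# the quantity large-margin methods maximize: the worst-case margin
geometric_margin = min(margin(w, b, x, y) for x, y in data)
print(geometric_margin)
```

A positive worst-case margin means every training point is correctly classified with some slack; the generalization bounds surveyed in the book scale with this quantity rather than with the raw error count.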


Book
Learning with Fractional Orthogonal Kernel Classifiers in Support Vector Machines : Theory, Algorithms and Applications
Authors: --- ---
ISBN: 9811965536 9811965528 Year: 2023 Publisher: Singapore : Springer Nature Singapore : Imprint: Springer,

Abstract

This book contains selected chapters on support vector algorithms from different perspectives, including mathematical background, properties of various kernel functions, and several applications. Its main focus is on orthogonal kernel functions: the properties of the classical orthogonal polynomials (Chebyshev, Legendre, Gegenbauer, and Jacobi) are reviewed, and fractional forms of the corresponding kernel functions are introduced in the same chapters. For ease of use of these kernel functions, a tutorial on a Python package named ORSVM is presented. The book also exhibits a variety of applications of support vector algorithms; beyond classification, these algorithms, together with the introduced kernel functions, are used to solve ordinary, partial, integro-, and fractional differential equations. Because real-time and big-data applications of support vector algorithms are growing, a Compute Unified Device Architecture (CUDA) parallelization of support vector algorithms based on orthogonal kernel functions is also presented. The book sheds light on how to use support vector algorithms based on orthogonal kernel functions in different situations and offers machine learning and scientific machine learning researchers a significant perspective on applying fractional orthogonal kernel functions to pattern recognition and scientific computing problems.
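An orthogonal kernel of the kind discussed can be sketched from the Chebyshev recurrence T_0(x) = 1, T_1(x) = x, T_{i+1}(x) = 2x T_i(x) - T_{i-1}(x). The truncated sum of products below is one common Chebyshev-kernel variant for scalar inputs in [-1, 1]; the exact form and normalization vary across the literature, and this is an illustrative version, not necessarily the ORSVM definition:

```python
# Chebyshev polynomials via the three-term recurrence, and a simple
# orthogonal kernel built as a truncated sum of their products.

def chebyshev(n, x):
    # T_0(x) = 1, T_1(x) = x, T_{i+1}(x) = 2x T_i(x) - T_{i-1}(x)
    t_prev, t = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t = t, 2 * x * t - t_prev
    return t

def chebyshev_kernel(x, z, order=4):
    # illustrative kernel: K(x, z) = sum_{i=0}^{order} T_i(x) T_i(z)
    return sum(chebyshev(i, x) * chebyshev(i, z) for i in range(order + 1))

print(chebyshev_kernel(0.5, 0.5))  # prints 2.75
```

The fractional variants in the book replace T_i(x) with fractional-order analogues; a kernel like this drops into any SVM solver that accepts a custom kernel function.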
