Listing 1 - 7 of 7
This book studies construction methods for solving one-dimensional and multidimensional inverse dynamical problems for hyperbolic equations with memory. Theorems on the uniqueness, stability, and existence of solutions of these inverse problems are obtained. Using generalized solutions, the book discusses the propagation of elastic or electromagnetic waves arising from sources of the type of pulsed directional “impacts” or “explosions”. It presents new results on the local and global solvability of kernel determination problems in a half-space, and it describes problems of reconstructing the coefficients of differential equations and the convolution kernels of hyperbolic integro-differential equations by the Dirichlet-to-Neumann method. The book will be useful for researchers and students specializing in inverse problems of mathematical physics.
This book presents a study of statistical inferences based on kernel-type estimators of distribution functions. The inferences involve matters such as quantile estimation, nonparametric tests, and mean residual life expectation, among others. Convergence rates for kernel estimators of density functions are slower than those of ordinary parametric estimators, which have root-n consistency. If an appropriate kernel function is used, kernel estimators of distribution functions recover root-n consistency, and inferences based on kernel distribution estimators are root-n consistent as well. Further, the kernel-type estimator produces smooth estimates. Estimators based on the empirical distribution function have a discrete distribution, so the normal approximation cannot be improved; that is, the validity of the Edgeworth expansion cannot be proved. If the support of the population density function is bounded, a boundary problem arises: the estimator is not consistent near the boundary. The book also contains a study of the mean squared errors of the estimators and the Edgeworth expansion for quantile estimators.
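To make the construction concrete, here is a minimal sketch of a kernel-type distribution function estimator with a Gaussian integrated kernel, F_hat(x) = (1/n) Σ_i Φ((x − X_i)/h). This is an illustrative example only; the specific estimators, kernels, and bandwidth choices studied in the book may differ.

```python
import numpy as np
from scipy.stats import norm

def kernel_cdf(x, sample, h):
    # Kernel-type distribution estimator: F_hat(x) = (1/n) sum_i Phi((x - X_i)/h),
    # where Phi is the standard normal CDF (the integrated Gaussian kernel)
    # and h > 0 is the bandwidth.
    x = np.asarray(x, dtype=float)
    return norm.cdf((x - sample[:, None]) / h).mean(axis=0)

rng = np.random.default_rng(0)
sample = rng.normal(size=500)
grid = np.linspace(-2.0, 2.0, 9)
est = kernel_cdf(grid, sample, h=0.3)
```

Unlike the empirical distribution function, `est` is a smooth, strictly increasing function of `x`, which is what allows the normal approximation to be refined by an Edgeworth expansion.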
Statistics. --- Nonparametric statistics. --- Mathematical statistics. --- Statistical Theory and Methods. --- Applied Statistics. --- Non-parametric Inference. --- Mathematical Statistics. --- Mathematics --- Mathematical statistics --- Kernel functions
R (Computer program language) --- Kernel functions. --- Functions, Kernel --- Functions of complex variables --- Geometric function theory --- GNU-S (Computer program language) --- Domain-specific programming languages --- Kernel functions --- Machine learning --- R (Programming language)
This monograph studies the heat kernel for spin-tensor Laplacians on Lie groups and maximally symmetric spaces. It introduces many original ideas, methods, and tools developed by the author and provides, with derivations, a list of all known exact results in explicit form for the heat kernel on spheres and hyperbolic spaces. Part I considers the geometry of simple Lie groups and maximally symmetric spaces in detail, and Part II discusses the calculation of the heat kernel for scalar, spinor, and generic Laplacians on spheres and hyperbolic spaces in various dimensions. This text will be a valuable resource for researchers and graduate students working in various areas of mathematics, such as global analysis, spectral geometry, stochastic processes, and financial mathematics, as well as in areas of mathematical and theoretical physics, including quantum field theory, quantum gravity, string theory, and statistical physics.
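For concreteness, two classical exact heat-kernel formulas of the kind the monograph catalogs (these are standard results, not quoted from the book itself):

```latex
% Heat kernel on Euclidean space R^n:
K_t(x, y) = (4\pi t)^{-n/2} \exp\!\left(-\frac{|x - y|^2}{4t}\right)

% Heat kernel on hyperbolic 3-space H^3 (sectional curvature -1),
% with r = d(x, y) the geodesic distance:
K_t(r) = (4\pi t)^{-3/2} \, \frac{r}{\sinh r} \, \exp\!\left(-t - \frac{r^2}{4t}\right)
```

The hyperbolic case shows the typical pattern on maximally symmetric spaces: the flat-space Gaussian is corrected by a geometric factor (here r/sinh r) and a curvature-dependent exponential damping.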
Group theory --- Algebraic geometry --- Differential geometry. Global analysis --- Differential equations --- Mathematics --- Mathematical physics --- differential equations --- topology (mathematics) --- statistics --- mathematics --- physics --- geometry --- Heat equation. --- Kernel functions. --- Lie groups. --- Symmetric spaces. --- Heat equation --- Lie groups --- Symmetric spaces --- Kernel functions
The most crucial ability for machine learning and data science is mathematical reasoning for grasping their essence, rather than reliance on accumulated knowledge or experience. This textbook addresses the fundamentals of kernel methods for machine learning by working through relevant mathematical problems and building Python programs. The book's main features are as follows. The content is written in an easy-to-follow, self-contained style. The book includes 100 carefully selected and refined exercises; as their solutions are provided in the main text, readers can solve all of the exercises by reading the book. The mathematical premises of kernels are proven and the correct conclusions are provided, helping readers to understand the nature of kernels. Source programs and running examples are presented to help readers acquire a deeper understanding of the mathematics used. Once readers have a basic understanding of the functional analysis topics covered in Chapter 2, the applications are discussed in the subsequent chapters; beyond this, no prior knowledge of mathematics is assumed. The book considers both the kernel of a reproducing kernel Hilbert space (RKHS) and the kernel of a Gaussian process, and a clear distinction is made between the two.
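The RKHS/Gaussian-process distinction the blurb mentions can be illustrated with a short sketch (my own example, not taken from the book): the same positive-definite kernel serves as the reproducing kernel in kernel ridge regression and as the covariance function of a Gaussian process, and the resulting prediction formula is identical.

```python
import numpy as np

def rbf(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix k(x, y) = exp(-(x - y)^2 / (2 sigma^2))
    # for one-dimensional inputs X, Y.
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma**2))

X = np.array([0.0, 1.0, 2.0])       # training inputs
y = np.sin(X)                        # training targets
lam = 1e-2                           # ridge penalty (RKHS view) = noise variance (GP view)

# Both views lead to the same linear system and the same predictor:
# f(x_new) = k(x_new, X) @ (K + lam I)^{-1} y
K = rbf(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
x_new = np.array([0.5, 1.5])
pred = rbf(x_new, X) @ alpha
```

The two interpretations differ in what else they provide (a norm in the RKHS, a posterior variance for the GP), which is exactly the kind of distinction the book formalizes.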
Programming --- Artificial intelligence. Robotics. Simulation. Graphics --- neural networks --- fuzzy logic --- cybernetics --- programming (computer science) --- AI (artificial intelligence) --- Artificial intelligence. --- Artificial intelligence --- Data processing. --- Kernel functions --- Machine learning --- Python (Programming language) --- AI (artificial intelligence)
The concept of large margins is a unifying principle for the analysis of many different approaches to the classification of data from examples, including boosting, mathematical programming, neural networks, and support vector machines. The fact that what matters is the margin, or confidence level, of a classification (that is, a scale parameter) rather than the raw training error has become a key tool for dealing with classifiers. This book shows how this idea applies to both the theoretical analysis and the design of algorithms. The book provides an overview of recent developments in large margin classifiers, examines connections with other methods (e.g., Bayesian inference), and identifies strengths and weaknesses of the method, as well as directions for future research. Among the contributors are Manfred Opper, Vladimir Vapnik, and Grace Wahba.
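A minimal sketch of the quantity in question, for the linear case (an illustrative example, not the book's notation): the margin of a labeled point (x, y), with y in {−1, +1}, under a classifier f(x) = w·x + b is the signed, scale-invariant distance y·f(x)/‖w‖, and large-margin methods maximize its minimum over the training set.

```python
import numpy as np

def margins(w, b, X, y):
    # Signed geometric margins y_i * (w . x_i + b) / ||w||;
    # positive means correctly classified, and larger means more confident.
    return y * (X @ w + b) / np.linalg.norm(w)

X = np.array([[2.0, 0.0], [0.0, 2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1])
w, b = np.array([1.0, 1.0]), 0.0
m = margins(w, b, X, y)
print(m.min())  # the classifier's margin on this sample: sqrt(2) ~ 1.4142
```

Note that rescaling (w, b) by a positive constant leaves the margins unchanged, which is why the margin, and not the raw decision value, is the meaningful scale parameter.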
Machine Learning --- Algorithms --- Kernel functions --- Machine learning --- #TELE:SISTA --- Learning, Machine --- Artificial intelligence --- Machine theory --- Functions, Kernel --- Functions of complex variables --- Geometric function theory --- Algorism --- Algebra --- Arithmetic --- Foundations --- Kernel functions. --- Algorithms. --- Machine learning. --- Engineering & Applied Sciences --- Computer Science --- E-books --- COMPUTER SCIENCE/General --- Algorithms --- Machine learning --- Kernel, Functions --- Kernel functions --- Kernel functions --- Functions of complex variables --- Artificial intelligence --- Machine theory --- Algorithms --- Euclid's algorithm --- Algebra --- Arithmetic --- Foundations
This book contains select chapters on support vector algorithms from different perspectives, including mathematical background, properties of various kernel functions, and several applications. The main focus of this book is on orthogonal kernel functions, and the properties of the classical kernel functions (Chebyshev, Legendre, Gegenbauer, and Jacobi) are reviewed in some chapters. Moreover, the fractional forms of these kernel functions are introduced in the same chapters, and to ease their use, a tutorial on the Python package ORSVM is presented. The book also exhibits a variety of applications for support vector algorithms: in addition to classification, these algorithms, along with the introduced kernel functions, are utilized for solving ordinary, partial, integro-, and fractional differential equations. Furthermore, as real-time and big-data applications of support vector algorithms are growing, a Compute Unified Device Architecture (CUDA) approach to parallelizing support vector algorithms based on orthogonal kernel functions is presented. The book sheds light on how to use support vector algorithms based on orthogonal kernel functions in different situations, and it offers machine learning and scientific machine learning researchers a perspective on utilizing fractional orthogonal kernel functions in pattern recognition and scientific computing problems.
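As an illustration of what an orthogonal kernel looks like, here is one common Chebyshev-type construction for scalar inputs in [−1, 1], K(x, y) = Σ_{i=0}^{d} T_i(x)·T_i(y) with T_i the Chebyshev polynomials. This is a generic sketch; the exact kernel definitions used by ORSVM and by the book's chapters (including the weighted and fractional variants) may differ.

```python
import numpy as np
from numpy.polynomial.chebyshev import chebval

def chebyshev_kernel(x, y, degree=5):
    # K(x, y) = sum_{i=0}^{degree} T_i(x) * T_i(y), a finite sum of
    # products of Chebyshev polynomials, hence symmetric and
    # positive semidefinite by construction.
    total = 0.0
    for i in range(degree + 1):
        c = np.zeros(i + 1)
        c[i] = 1.0              # coefficient vector selecting T_i
        total += chebval(x, c) * chebval(y, c)
    return total

k = chebyshev_kernel(0.2, 0.2)
```

Since T_0 = 1 contributes 1 and every other term on the diagonal is a square, K(x, x) ≥ 1 for all x, and swapping the arguments leaves the value unchanged.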
Algebraic fields. --- Polynomials. --- Mathematical optimization. --- Quantitative research. --- Machine learning. --- Pattern recognition systems. --- Python (Computer program language). --- Field Theory and Polynomials. --- Optimization. --- Data Analysis and Big Data. --- Machine Learning. --- Automated Pattern Recognition. --- Python. --- Scripting languages (Computer science) --- Pattern classification systems --- Pattern recognition computers --- Pattern perception --- Computer vision --- Learning, Machine --- Artificial intelligence --- Machine theory --- Data analysis (Quantitative research) --- Exploratory data analysis (Quantitative research) --- Quantitative analysis (Research) --- Quantitative methods (Research) --- Research --- Optimization (Mathematics) --- Optimization techniques --- Optimization theory --- Systems optimization --- Mathematical analysis --- Maxima and minima --- Operations research --- Simulation methods --- System analysis --- Algebra --- Algebraic number fields --- Algebraic numbers --- Fields, Algebraic --- Algebra, Abstract --- Algebraic number theory --- Rings (Algebra) --- Machine learning --- Algorithms --- Kernel functions --- Python (Programming language) --- Python (Computer program language)