Listing 1 - 10 of 27 | << page >> |
Data science, information theory, probability theory, statistical learning, and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions and focuses on the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. We hope that readers will find this Special Issue of interest and that it will stimulate further research into the mathematical foundations and applications of divergence measures.
Research & information: general --- Mathematics & science --- Bregman divergence --- f-divergence --- Jensen–Bregman divergence --- Jensen diversity --- Jensen–Shannon divergence --- capacitory discrimination --- Jensen–Shannon centroid --- mixture family --- information geometry --- difference of convex (DC) programming --- conditional Rényi divergence --- horse betting --- Kelly gambling --- Rényi divergence --- Rényi mutual information --- relative entropy --- chi-squared divergence --- f-divergences --- method of types --- large deviations --- strong data-processing inequalities --- information contraction --- maximal correlation --- Markov chains --- information inequalities --- mutual information --- Rényi entropy --- Carlson–Levin inequality --- information measures --- hypothesis testing --- total variation --- skew-divergence --- convexity --- Pinsker’s inequality --- Bayes risk --- statistical divergences --- minimum divergence estimator --- maximum likelihood --- bootstrap --- conditional limit theorem --- Bahadur efficiency --- α-mutual information --- Augustin–Csiszár mutual information --- data transmission --- error exponents --- dimensionality reduction --- discriminant analysis --- statistical inference
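As a brief sketch, the two generalizations of the relative entropy named in the abstract above can be written as follows (discrete notation; the relative entropy is recovered from the Rényi divergence as α → 1, and from the f-divergence with f(t) = t log t):

```latex
% Relative entropy (Kullback--Leibler divergence) between P and Q:
D(P \| Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

% Rényi divergence of order \alpha \in (0,1) \cup (1,\infty):
D_{\alpha}(P \| Q) = \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha}

% f-divergence for a convex f with f(1) = 0:
D_f(P \| Q) = \sum_{x} Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right)
```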
This monograph presents a comprehensive, self-contained, and novel approach to the Divergence Theorem through five progressive volumes. Its ultimate aim is to develop tools in Real and Harmonic Analysis, of geometric measure theoretic flavor, capable of treating a broad spectrum of boundary value problems formulated in rather general geometric and analytic settings. The text is intended for researchers, graduate students, and industry professionals interested in applications of harmonic analysis and geometric measure theory to complex analysis, scattering, and partial differential equations. Volume I establishes a sharp version of the Divergence Theorem (aka Fundamental Theorem of Calculus) which allows for an inclusive class of vector fields whose boundary trace is only assumed to exist in a nontangential pointwise sense.
Functional analysis --- Harmonic analysis. Fourier analysis --- Mathematical analysis --- analysis (mathematics) --- Fourier series --- functions (mathematics) --- mathematical models --- mathematics --- Divergence theorem
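For orientation, the classical smooth-setting statement that Volume I sharpens is the following; the monograph's version weakens these hypotheses considerably, requiring the boundary trace only in a nontangential pointwise sense:

```latex
% Divergence Theorem: for a suitably smooth vector field F on a
% bounded domain \Omega \subset \mathbb{R}^n with outward unit
% normal \nu, the flux through the boundary equals the integral
% of the divergence over the domain.
\int_{\Omega} \operatorname{div} \vec{F} \, dV
  = \int_{\partial \Omega} \vec{F} \cdot \nu \, d\sigma
```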
Divergence theorem --- Functional analysis --- Functional calculus --- Calculus of variations --- Functional equations --- Integral equations --- Gauss-Ostrogradsky theorem --- Gauss's theorem --- Vector algebra --- Vector analysis --- Hilbert algebras --- Topological algebras --- Nonlinear functional analysis --- Microlocal analysis --- Analytic spaces --- Hardy spaces --- Orlicz spaces --- Function spaces --- Normed vector spaces --- Vector spaces --- Digital filters (Mathematics) --- Functionals --- Vector-valued functions --- Multipliers (Mathematical analysis) --- Perturbation (Mathematics) --- Operator theory --- Theory of distributions (Functional analysis) --- Functor theory --- Approximation theory --- Density functional theory --- Spectral theory (Mathematics)
Condition monitoring of machinery is one of the most important aspects of many modern industries. With the rapid advancement of science and technology, machines are becoming increasingly complex, and an exponential increase in demand is driving ever higher requirements on machine output. As a result, in most modern industries, machines have to work 24 hours a day. All these factors cause machine health to deteriorate at a higher rate than before. Breakdown of key machine components such as bearings, gearboxes, or rollers can have catastrophic effects in terms of both financial and human costs. In this perspective, it is important not only to detect a fault at its earliest point of inception but also to design the overall monitoring process, including fault classification, fault severity assessment, and remaining useful life (RUL) prediction, for better planning of the maintenance schedule. Information theory is one of the pioneering contributions of modern science and has evolved into various forms and algorithms over time. Due to its ability to address the non-linearity and non-stationarity of machine health deterioration, it has become a popular choice among researchers and an effective technique for extracting features of machines under different health conditions. In this context, this book discusses the potential applications, research results, and latest developments of information theory-based condition monitoring of machinery.
Technology: general issues --- History of engineering & technology --- fault detection --- deep learning --- transfer learning --- anomaly detection --- bearing --- wind turbines --- misalignment --- fault diagnosis --- information fusion --- improved artificial bee colony algorithm --- LSSVM --- D-S evidence theory --- optimal bandwidth --- kernel density estimation --- JS divergence --- domain adaptation --- partial transfer --- subdomain --- rotating machinery --- gearbox --- signal interception --- peak extraction --- cubic spline interpolation envelope --- combined fault diagnosis --- empirical wavelet transform --- grey wolf optimizer --- low pass FIR filter --- support vector machine --- satellite momentum wheel --- Huffman-multi-scale entropy (HMSE) --- support vector machine (SVM) --- adaptive particle swarm optimization (APSO) --- rail surface defect detection --- machine vision --- YOLOv4 --- MobileNetV3 --- multi-source heterogeneous fusion
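To make the information-theoretic feature-extraction idea concrete, here is a minimal illustrative sketch (not taken from the book) of using the Jensen-Shannon (JS) divergence, one of the keywords above, as a condition indicator: the amplitude distribution of a monitored vibration signal is compared against a healthy baseline, and a growing divergence flags deterioration. The signals, histogram binning, and fault model are all invented for illustration.

```python
import math
import random

def js_divergence(p, q):
    """Jensen-Shannon divergence (in bits) between two discrete distributions."""
    m = [0.5 * (pi + qi) for pi, qi in zip(p, q)]
    def kl(a, b):
        # Kullback-Leibler divergence; terms with a_i = 0 contribute 0.
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def histogram(signal, bins=20, lo=-3.0, hi=3.0):
    """Normalized amplitude histogram used as the feature distribution."""
    counts = [0] * bins
    for x in signal:
        i = min(bins - 1, max(0, int((x - lo) / (hi - lo) * bins)))
        counts[i] += 1
    total = sum(counts)
    return [c / total for c in counts]

# Synthetic stand-ins for vibration data: a healthy baseline, and a
# "faulty" signal whose amplitude distribution is skewed by
# intermittent impact-like offsets (a toy fault model).
random.seed(0)
healthy = [random.gauss(0.0, 1.0) for _ in range(5000)]
faulty = [random.gauss(0.0, 1.0) + (0.8 if random.random() < 0.1 else 0.0)
          for _ in range(5000)]

d_same = js_divergence(histogram(healthy), histogram(healthy))
d_diff = js_divergence(histogram(healthy), histogram(faulty))
print(d_same, d_diff)  # divergence from itself is 0; the fault raises it
```

The JS divergence is symmetric and bounded (at most 1 bit), which makes it convenient as a monitored health index compared to the unbounded, asymmetric relative entropy.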
This book focuses on the systematics, biogeography, and evolution of arachnids, a group of ancient chelicerate lineages that have taken on terrestrial lifestyles. The book opens with the questions of what arachnology represents and where the field should go in the future. Twelve original contributions then dissect the current state of the art in arachnological research. These papers provide innovative phylogenomic, evolutionary, and biogeographic analyses and interpretations of new data, and/or synthesize our knowledge to offer new directions for the future of arachnology.
BioGeoBEARS --- Caatinga --- dispersal --- Galapagos --- Neotropical --- speciation --- spiders --- tropical dry forests --- vicariance --- coin spider --- Nephilidae --- phylogenomics --- biogeography --- dispersal probability --- Arthropoda --- circular reasoning --- investigator bias --- paleontology --- Arachnida --- tissue --- X-rays --- micro-CT --- cerebrum --- nervous system --- neuroanatomy --- imaging --- Araneae --- biodiversity --- community ecology --- elevation --- Pantepui --- species turnover --- Tetragnatha --- dynamic disperser --- intermediate dispersal model of biogeography --- GAARlandia --- Tetragnathidae --- taxonomy --- taxonomic crisis --- species concepts --- data management --- monographic research --- molecular phylogeny --- divergence time --- relict group --- Linyphiidae --- phylogeny --- Caribbean biogeography --- arachnid --- Micrathena --- long distance dispersal --- distribution --- diversity --- Salticidae --- target sequencing --- reduced representation sequencing (RRS) --- spider phylogenomics --- deep phylogeny --- molecular dating --- ancestral range analysis --- endemics --- founder-event --- intermediate dispersal model --- Research. --- Biology. --- Microbiology.