Listing 1 - 3 of 3 |
Probabilities. --- Phase transformations (Statistical physics) --- Measure theory. --- Lebesgue measure --- Measurable sets --- Measure of a set --- Algebraic topology --- Integrals, Generalized --- Measure algebras --- Rings (Algebra) --- Phase changes (Statistical physics) --- Phase transitions (Statistical physics) --- Phase rule and equilibrium --- Statistical physics --- Probability --- Statistical inference --- Combinations --- Mathematics --- Chance --- Least squares --- Mathematical statistics --- Risk --- Gaussian Fields. --- Gibbs Measures. --- Markov Chains. --- Phase Transition. --- Statistical Mechanics.
This monograph offers a state-of-the-art mathematical account of functional integration methods in the context of self-adjoint operators and semigroups, using the concepts and tools of modern stochastic analysis. These ideas are then applied principally to a rigorous treatment of some fundamental models of quantum field theory. The presentation is self-contained, addresses both beginners and experts, and emphasizes the interdisciplinary character of the subject.
Integration, Functional. --- Stochastic analysis. --- Quantum field theory --- Relativistic quantum field theory --- Field theory (Physics) --- Quantum theory --- Relativity (Physics) --- Analysis, Stochastic --- Mathematical analysis --- Stochastic processes --- Functional integration --- Functional analysis --- Integrals, Generalized --- Mathematics. --- Brownian Motion. --- Feynman-Kac-Type Theorems. --- Gibbs Measures. --- Quantum Field Theory.
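The "Feynman-Kac-Type Theorems" heading refers to the classical bridge between semigroups and Brownian motion that the abstract describes. In its simplest textbook form (stated here only for orientation; the monograph's versions are far more general), for a Schrödinger operator $H = -\tfrac{1}{2}\Delta + V$ with a sufficiently regular potential $V$, the semigroup $e^{-tH}$ admits the stochastic representation

$$
\left(e^{-tH} f\right)(x) \;=\; \mathbb{E}_x\!\left[\exp\!\left(-\int_0^t V(B_s)\,ds\right) f(B_t)\right],
$$

where $(B_s)_{s \ge 0}$ is Brownian motion started at $x$. Functional-integral formulas of this type are what connect self-adjoint operators and semigroups to the Gibbs measures and stochastic analysis listed in the subject headings above.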
As the ultimate information processing device, the brain naturally lends itself to being studied with information theory. The application of information theory to neuroscience has spurred the development of principled theories of brain function and has led to advances in the study of consciousness, as well as to analytical techniques for cracking the neural code, that is, for unveiling the language neurons use to encode and process information. In particular, experimental techniques that permit precise, large-scale recording and manipulation of neural activity now make it possible, for the first time, to formulate and quantitatively test hypotheses about how the brain encodes and transmits, across areas, the information used for specific functions. This Special Issue presents twelve original contributions on novel approaches in neuroscience using information theory, and on new information-theoretic results inspired by problems in neuroscience.
synergy --- Gibbs measures --- categorical perception --- entorhinal cortex --- neural network --- perceived similarity --- graph theoretical analysis --- orderness --- navigation --- network eigen-entropy --- Ising model --- higher-order correlations --- discrimination --- information theory --- recursion --- goodness --- consciousness --- neuroscience --- feedforward networks --- spike train statistics --- decoding --- eigenvector centrality --- discrete Markov chains --- submodularity --- free-energy principle --- infomax principle --- neural information propagation --- integrated information --- mismatched decoding --- maximum entropy principle --- perceptual magnet --- graph theory --- internal model hypothesis --- channel capacity --- complex networks --- representation --- latching --- noise correlations --- independent component analysis --- mutual information decomposition --- connectome --- redundancy --- mutual information --- information entropy production --- unconscious inference --- hippocampus --- neural population coding --- spike-time precision --- neural coding --- maximum entropy --- neural code --- Potts model --- pulse-gating --- functional connectome --- integrated information theory --- minimum information partition --- brain network --- Queyranne's algorithm --- principal component analysis
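Mutual information, the quantity behind several of the keywords above ("mutual information", "channel capacity", "neural coding"), quantifies how much observing a neural response tells us about the stimulus that evoked it. The sketch below is a generic illustration of the standard definition for discrete variables, not code drawn from any of the Special Issue contributions:

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(S; R) in bits from a joint probability table.

    joint[i, j] = P(stimulus = i, response = j); entries must sum to 1.
    """
    joint = np.asarray(joint, dtype=float)
    ps = joint.sum(axis=1, keepdims=True)   # marginal P(S), shape (nS, 1)
    pr = joint.sum(axis=0, keepdims=True)   # marginal P(R), shape (1, nR)
    mask = joint > 0                        # skip zero cells (0 * log 0 = 0)
    # I(S;R) = sum_ij P(s_i, r_j) * log2( P(s_i, r_j) / (P(s_i) P(r_j)) )
    return float((joint[mask] * np.log2(joint[mask] / (ps @ pr)[mask])).sum())

# A perfectly informative binary code: the response always equals the stimulus.
perfect = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
# A useless code: the response is independent of the stimulus.
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])

print(mutual_information(perfect))      # 1.0 bit
print(mutual_information(independent))  # 0.0 bits
```

In practice the joint table would be estimated from recorded spike trains, where finite-sampling bias becomes the central difficulty; this sketch assumes the true probabilities are known.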