Listing 1 - 10 of 3067
Inference. --- Biometry.
Big Data--broadly considered as datasets whose size, complexity, and heterogeneity preclude conventional approaches to storage and analysis--continues to generate interest across many scientific domains in both the public and private sectors. However, analyses of large heterogeneous datasets can suffer from unidentified bias, misleading correlations, and increased risk of false positives. In order for the proliferation of data to produce new scientific discoveries, it is essential that the statistical models used for analysis support reliable, reproducible inference. The National Academies of Sciences, Engineering, and Medicine convened a workshop to discuss how scientific inference should be applied when working with large, complex datasets.
Inference. --- Big data.
This book is a compilation of unpublished papers written by Jean-Marie Rolin (with several co-authors) on nonparametric Bayesian estimation. Jean-Marie was professor of statistics at the University of Louvain and died on November 5th, 2018. He made important contributions to mathematical statistics with applications to fields such as econometrics and biometrics. These papers cover a variety of topics, including:
• The mathematical structure of the Bayesian model and its main concepts (sufficiency, ancillarity, invariance…)
• Representation of Dirichlet processes and of the associated Pólya urn model, with applications to nonparametric Bayesian analysis.
• Contributions to duration models and to their nonparametric Bayesian treatment.
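To give a concrete sense of the Pólya urn representation of the Dirichlet process mentioned above, here is a minimal Python sketch of the Blackwell–MacQueen urn scheme. The function name and the Gaussian base measure are illustrative choices for this listing, not taken from the papers themselves.

```python
import random

def polya_urn_sample(n, alpha, base_draw, seed=0):
    """Draw n values from a Dirichlet process DP(alpha, G0) via the
    Polya urn (Blackwell-MacQueen) scheme.

    base_draw: callable taking an RNG and returning one draw from the
    base measure G0.
    """
    rng = random.Random(seed)
    values = []
    for i in range(n):
        # With probability alpha / (alpha + i), draw fresh from G0;
        # otherwise reuse a past value chosen uniformly at random.
        if rng.random() < alpha / (alpha + i):
            values.append(base_draw(rng))
        else:
            values.append(rng.choice(values))
    return values

draws = polya_urn_sample(1000, alpha=2.0, base_draw=lambda r: r.gauss(0, 1))
# The draws cluster: far fewer distinct values than draws.
print(len(set(draws)))
```

Because past values are reused with probability proportional to their frequency, the scheme exhibits the "rich get richer" clustering that makes the Dirichlet process useful for nonparametric Bayesian analysis.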
"Everyone has heard the claim, "Correlation does not imply causation." What might sound like a reasonable dictum metastasized in the twentieth century into one of science's biggest obstacles, as a legion of researchers became unwilling to make the claim that one thing could cause another. Even two decades ago, asking a statistician a question like "Was it the aspirin that stopped my headache?" would have been like asking if he believed in voodoo, or at best a topic for conversation at a cocktail party rather than a legitimate target of scientific inquiry. Scientists were allowed to posit only the probability that one thing was associated with another. This all changed with Judea Pearl, whose work on causality was not just a victory for common sense, but a revolution in the study of the world"--
Causation. --- Inference. --- Philosophy of nature.
Inference (Logic) --- Relevance logic --- Ampliative induction --- Logic --- Reasoning
Certain combinations of sounds or signs on paper are meaningful. What makes it the case that, unlike most combinations of sounds and signs, they are? Traditional answers to these questions are based on the idea that words stand for something, but it is difficult to say what words such as good, if, or probable stand for. This book advances novel answers based on the idea that words get their meaning from the way they are used to express states of mind and what follows from them. It articulates a precise version of this idea at a time when the shortcomings of the traditional answers are hotly discussed.
Expression. --- Semantics. --- Inference. --- Language. --- Language: reference & general.
Scientists today have access to an unprecedented arsenal of high-tech tools that can be used to thoroughly characterize biological systems of interest. High-throughput “omics” technologies make it possible to generate enormous quantities of data at the DNA, RNA, epigenetic, and proteomic levels. One of the major challenges of the post-genomic era is to extract functional information by integrating such heterogeneous high-throughput genomic data. This is not a trivial task, as we are increasingly coming to understand that it is not individual genes, but rather biological pathways and networks, that drive an organism’s response to environmental factors and the development of its particular phenotype. In order to fully understand the way in which these networks interact (or fail to do so) in specific states (disease, for instance), we must learn both the structure of the underlying networks and the rules that govern their behavior. In recent years there has been increasing interest in methods that aim to infer biological networks. These methods offer the opportunity to better understand the interactions between genomic features and the overall structure and behavior of the underlying networks. So far, such network models have mainly been used to identify and validate new interactions between genes of interest. Ultimately, however, these networks could be used to predict the large-scale effects of perturbations, such as treatment with multiple targeted drugs. At present, though, we are still at an early stage in developing methods and approaches that provide a robust statistical framework for quantitatively assessing the quality of network inference and its predictive potential. This Research Topic in Bioinformatics and Computational Biology aims to address these issues by investigating the various, complementary approaches to quantifying the quality of network models.
These “validation” techniques could focus on assessing the quality of specific interactions, global and local structures, and the predictive ability of network models. These methods could rely exclusively on in silico evaluation procedures, or they could be coupled with novel experimental designs to generate the biological data necessary to properly validate inferred networks.
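One simple form of the in silico validation described above is to score an inferred edge set against a gold-standard network using precision, recall, and F1. The sketch below uses invented toy networks and a hypothetical function name purely for illustration.

```python
def edge_validation(inferred, gold):
    """Compare an inferred edge set against a gold-standard edge set
    (undirected edges, so each pair is stored as a frozenset),
    returning (precision, recall, F1)."""
    inferred = {frozenset(e) for e in inferred}
    gold = {frozenset(e) for e in gold}
    tp = len(inferred & gold)  # true-positive edges
    precision = tp / len(inferred) if inferred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical toy networks: gold standard vs. an inference result.
gold = [("A", "B"), ("B", "C"), ("C", "D")]
inferred = [("A", "B"), ("C", "B"), ("A", "D")]
print(edge_validation(inferred, gold))  # each value is about 0.67 here
```

Scoring edges as unordered pairs means ("B", "C") and ("C", "B") count as the same interaction, which matches the undirected networks typical of co-expression analysis; directed (regulatory) networks would keep the tuples ordered instead.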
Validation --- Gene Expression --- Network Inference --- bioinformatics
This book is about abduction, 'the logic of Sherlock Holmes', and about how some kinds of abductive reasoning can be programmed in a computer. The work brings together Artificial Intelligence and philosophy of science and is rich with implications for other areas such as psychology, medical informatics, and linguistics. It also has subtle implications for evidence evaluation in areas such as accident investigation, confirmation of scientific theories, law, diagnosis, and financial auditing. The book is about certainty and the logico-computational foundations of knowledge; it is about inference in perception, reasoning strategies, and building expert systems.
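Under strong simplifying assumptions, one way abductive reasoning can be programmed is as a search for the smallest set of hypotheses that jointly explains all observations (a set-cover formulation). The diagnostic toy data and function name below are invented for illustration and are not drawn from the book.

```python
from itertools import combinations

def abduce(observations, explains):
    """Toy abduction: return the smallest set of hypotheses whose
    explained observations jointly cover all given observations.

    explains: dict mapping hypothesis -> set of observations it accounts for.
    Returns None if no combination of hypotheses covers everything.
    """
    obs = set(observations)
    hyps = list(explains)
    for size in range(1, len(hyps) + 1):
        for combo in combinations(hyps, size):
            covered = set().union(*(explains[h] for h in combo))
            if obs <= covered:
                return set(combo)  # first (hence smallest) cover found
    return None

# Hypothetical diagnostic knowledge base:
explains = {
    "flu": {"fever", "fatigue"},
    "cold": {"cough"},
    "allergy": {"cough", "sneezing"},
}
print(sorted(abduce({"fever", "cough"}, explains)))  # → ['cold', 'flu']
```

Searching by increasing set size builds in a parsimony preference, a crude stand-in for the "best explanation" criteria that serious abductive systems refine with plausibility and consistency judgments.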
Knowledge, Theory of. --- Inference. --- Thought and thinking.
This work provides a framework for info-metrics, the science of modeling, inference, and reasoning under conditions of noisy and insufficient information. Info-metrics is an inherently interdisciplinary framework that emerged from the intersection of information theory, statistical inference, and decision-making under uncertainty. It allows us to process the available information with minimal reliance on assumptions that cannot be validated. This text focuses on unifying all information processing and model building within a single constrained optimization framework. It provides a complete framework for modeling and inference, rather than a problem-specific model.
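A classic example of inference as constrained optimization, in the maximum-entropy tradition that info-metrics builds on, is recovering a die's face probabilities from nothing but an observed mean. The sketch below is an illustrative exercise, not code from the book; it finds the maximum-entropy distribution, which has the exponential form p_k ∝ exp(t·k), by bisecting on the Lagrange multiplier t.

```python
import math

def maxent_die(target_mean, tol=1e-10):
    """Maximum-entropy distribution over die faces 1..6 subject to a
    mean constraint, found by bisection on the Lagrange multiplier t
    in p_k proportional to exp(t * k)."""
    faces = range(1, 7)

    def mean_for(t):
        # Mean of the exponentially tilted distribution at multiplier t.
        w = [math.exp(t * k) for k in faces]
        z = sum(w)
        return sum(k * wk for k, wk in zip(faces, w)) / z

    # mean_for is monotone increasing in t, so bisection converges.
    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    t = (lo + hi) / 2
    w = [math.exp(t * k) for k in faces]
    z = sum(w)
    return [wk / z for wk in w]

p = maxent_die(4.5)
print([round(x, 4) for x in p])  # probabilities increase toward face 6
```

With a mean constraint of 4.5 (above the fair-die mean of 3.5), the solution tilts mass toward the high faces while staying as uniform as the constraint allows, which is exactly the "minimal reliance on unvalidated assumptions" that the constrained-optimization view formalizes.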
Measurement uncertainty (Statistics) --- Inference. --- Mathematical statistics.