Response surfaces (Statistics) --- Surfaces, Response (Statistics) --- Analysis of variance --- Experimental design --- Statistics --- Graphic methods
The purpose of this master's thesis is to analyse the volatility behaviour of European REITs during periods of steady economic growth and during crises (bear markets). In particular, it examines how REITs in different countries behave relative to one another. The recent health crisis hit the stock market violently; how did six country-specific REIT indices react? The six European REIT indices studied are those of Belgium, France, the Netherlands, Germany, Spain and the United Kingdom. The ultimate goal is to draw overall conclusions about the behaviour of these REITs with respect to their volatility patterns and geography.
GARCH --- EGARCH --- GJR-GARCH --- conditional variance --- volatility --- European REITs --- Economics & management sciences > Finance
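As a rough illustration of the GARCH-family models named in the keywords above (not the author's actual specification), the sketch below fits GARCH(1,1), GJR-GARCH and EGARCH models to an index return series with the Python arch package; the data file and column name are hypothetical placeholders.

```python
# Illustrative sketch: fitting GARCH-family models to REIT index returns
# with the "arch" package. CSV file and column names are hypothetical.
import pandas as pd
from arch import arch_model

prices = pd.read_csv("belgium_reit_index.csv", index_col=0, parse_dates=True)
returns = 100 * prices["close"].pct_change().dropna()  # percentage returns

# Symmetric GARCH(1,1)
garch = arch_model(returns, vol="GARCH", p=1, q=1, dist="normal").fit(disp="off")

# GJR-GARCH adds an asymmetry (leverage) term via o=1
gjr = arch_model(returns, vol="GARCH", p=1, o=1, q=1, dist="normal").fit(disp="off")

# EGARCH models the log of the conditional variance
egarch = arch_model(returns, vol="EGARCH", p=1, o=1, q=1, dist="normal").fit(disp="off")

print(garch.summary())
# Conditional volatility paths, e.g. for comparing calm vs. crisis periods:
cond_vol = garch.conditional_volatility
```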
Bootstrapping is a conceptually simple statistical technique to increase the quality of estimates, conduct robustness checks and compute standard errors for virtually any statistic. This book provides an intelligible and compact introduction for students, scientists and practitioners. It not only gives a clear explanation of the underlying concepts but also demonstrates the application of bootstrapping using Python and Stata.
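A minimal sketch of the idea in Python (not an example taken from the book): resample the data with replacement many times, recompute the statistic on each resample, and take the standard deviation of the resampled statistics as a bootstrap standard error.

```python
# Minimal bootstrap sketch: standard error of the mean via resampling.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=3.0, size=200)  # placeholder sample

def bootstrap_se(sample, statistic, n_boot=2000, rng=rng):
    """Standard deviation of `statistic` over bootstrap resamples."""
    n = len(sample)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(sample, size=n, replace=True)
        stats[b] = statistic(resample)
    return stats.std(ddof=1)

print("sample mean:", data.mean())
print("bootstrap SE of the mean:", bootstrap_se(data, np.mean))
# For comparison, the analytical SE is data.std(ddof=1) / sqrt(len(data)).
```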
Python (Computer program language) --- Computer bootstrapping. --- Booting, Computer --- Bootstrapping, Computer --- Computer booting --- Computer systems --- Scripting languages (Computer science) --- Bootstrapping. --- Computational Statistics. --- Resampling. --- Variance Estimation Methods.
This research thesis, entitled "Les déterminants de l'investissement privé en Belgique : une modélisation VAR" (The determinants of private investment in Belgium: a VAR model), primarily aims to measure the response of private investment to economic, financial and cyclical shocks affecting the Belgian macroeconomic environment. Accordingly, the explanatory variables of private investment behaviour retained in this study are those that characterise the macroeconomic environment. The underlying research question is rooted in a theoretical and empirical literature that identifies the variables explaining investment decisions against a backdrop of economic and financial crises and macroeconomic uncertainty. The VAR (vector autoregression) methodology, through its dynamic analysis tools (impulse-response functions and forecast-error variance decomposition), allowed us to measure, on the one hand, how private investment reacts to shocks to its determinants (the explanatory variables) and, on the other hand, the weight of each determinant in explaining private investment. The results of the VAR model, through the impulse-response functions, point to the importance of shocks to GDP growth (an indicator of demand), public investment, growth of the consumer price index (inflation), the budget balance, public debt, and private and public interest rates in explaining the behaviour of private investment in Belgium. They also lead to the conclusion that private investment in Belgium has six important determinants: private investment itself, real GDP growth (demand), public investment, public debt, the budget balance, and growth of the consumer price index (inflation). The economic policy recommendations revolve around stimulating demand through productive public investment, improving the business climate, structuring the debt and balancing the budget, and finally price stability via monetary policy.
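As a rough sketch of the VAR workflow described above (not the author's actual model), the following Python example uses statsmodels to fit a VAR, compute impulse-response functions and decompose the forecast-error variance; the data file and variable names are hypothetical.

```python
# Sketch of a VAR analysis with impulse-response functions and
# forecast-error variance decomposition (statsmodels). Data file and
# column names are hypothetical placeholders.
import pandas as pd
from statsmodels.tsa.api import VAR

df = pd.read_csv("belgium_macro.csv", index_col=0, parse_dates=True)
# e.g. columns: private_investment, gdp_growth, public_investment,
#               inflation, budget_balance, public_debt, interest_rate

model = VAR(df)
results = model.fit(maxlags=4, ic="aic")   # lag order chosen by AIC

irf = results.irf(10)        # impulse-response functions over 10 periods
irf.plot(impulse="gdp_growth", response="private_investment")

fevd = results.fevd(10)      # forecast-error variance decomposition
print(fevd.summary())
```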
"Le livre en deux tomes (1500 pages) de Laurent Le Floch et Frédéric Testard couvre le programme de probabilités du lycée, de licence et des préparations aux concours de recrutement d'enseignants. Il fournira en outre une solide base pour les étudiants suivant des masters intégrant une branche probabiliste. Dans le premier tome, la démarche "en spirale" adoptée par les auteurs les conduit a développer les cadres successifs (hasard fini, discret, continu) en introduisant des outils ad hoc, regroupés à la fin de chaque grande partie. Ce n'est que dans le second tome que l'introduction des concepts relevant de l'intégration de Lebesgue les conduit aux énoncés abstraits de la théorie "moderne". Tout au long de l'ouvrage, de très nombreux exercices (plus de 700 au total) permettent aux lecteurs, grâce à des énoncés très détaillés, d'approfondir leur compréhension des notions rencontrées. L'aspect informatique est évidemment présent, et de nombreux exercices permettent ainsi de s'aguerrir à la pratique de la simulation d'expériences aléatoires, en langage Python en général"
Probabilities --- Lebesgue integral --- Analysis of variance --- Law of large numbers --- Random walks (Mathematics) --- Markov processes --- Chance --- Probabilistic relational models --- Central limit theorem --- Martingales (Mathematics)
Multivariate analysis. --- Estimation theory. --- Python (Computer program language) --- Scripting languages (Computer science) --- Estimating techniques --- Least squares --- Mathematical statistics --- Stochastic processes --- Multivariate distributions --- Multivariate statistical analysis --- Statistical analysis, Multivariate --- Analysis of variance --- Matrices
Experiments are a central methodology in the social sciences. Scholars from every discipline regularly turn to experiments, and practitioners rely on experimental evidence when evaluating social programs, policies, and institutions. This book is about how to "think" about experiments. It argues that designing a good experiment is a slow-moving process, given the host of considerations involved, which runs counter to the fast-moving temptations currently available in the social sciences. The book discusses the place of experiments in the social science process, the assumptions underlying different types of experiments, the validity of experiments, the application of different designs, how to arrive at experimental questions, the role of replications in experimental research, and the steps involved in designing and conducting "good" experiments. The goal is to ensure that social science research remains driven by important substantive questions and fully exploits the potential of experiments in a thoughtful manner.
Social sciences --- Experimental design. --- Experiments. --- Research. --- Mathematical optimization --- Research --- Science --- Statistical decision --- Statistics --- Analysis of means --- Analysis of variance --- Design of experiments --- Statistical design --- Social science research --- Experiments --- Methodology
"As a broad category of identity, "transgender" has given life to a vibrant field of academic research since the 1990s. Yet the Western origins of the field have tended to limit its cross-cultural scope. Howard Chiang proposes a new paradigm for doing transgender history in which geopolitics assumes central importance. Defined as the antidote to transphobia, transtopia challenges a minoritarian view of transgender experience and makes room for the variability of transness on a historical continuum. Against the backdrop of the Sinophone Pacific, Chiang argues that the concept of transgender identity must be rethought beyond a purely Western frame. At the same time, he challenges China-centrism in the study of East Asian gender and sexual configurations. Chiang brings Sinophone studies to bear on trans theory to deconstruct the ways in which sexual normativity and Chinese imperialism have been produced through one another. Grounded in an eclectic range of sources-from the archives of sexology to press reports of intersexuality, films about castration, and records of social activism-this book reorients anti-transphobic inquiry at the crossroads of area studies, medical humanities, and queer theory. Timely and provocative, Transtopia in the Sinophone Pacific highlights the urgency of interdisciplinary knowledge in debates over the promise and future of human diversity"--
Transsexuals --- Transgender people --- Gender nonconformity --- Gender variance (Gender nonconformity) --- Genderqueer --- Non-binary gender --- TGNC (Transgender and gender nonconformity) --- Transgenderism --- Gender expression --- Gender identity --- Persons --- Transexuals --- Transsexual people --- Transsexualism --- History --- Patients --- History.
In geodesy and geoinformation science, as well as in many other technical disciplines, it is often not possible to directly determine the desired target quantities. Therefore, the unknown parameters must be linked with the measured values by a mathematical model which consists of the functional and the stochastic models. The functional model describes the geometrical–physical relationship between the measurements and the unknown parameters. This relationship is sufficiently well known for most applications. With regard to the stochastic model, two problem domains of fundamental importance arise: 1. How can stochastic models be set up as realistically as possible for the various geodetic observation methods and sensor systems? 2. How can the stochastic information be adequately considered in appropriate least squares adjustment models? Further questions include the interpretation of the stochastic properties of the computed target values with regard to precision and reliability and the use of the results for the detection of outliers in the input data (measurements). In this Special Issue, current research results on these general questions are presented in ten peer-reviewed articles. The basic findings can be applied to all technical scientific fields where measurements are used for the determination of parameters to describe geometric or physical phenomena.
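As a generic illustration of the second question above (not taken from any of the ten articles), the sketch below solves a least squares adjustment in which the stochastic model enters through a full variance-covariance matrix of the observations; the linear functional model and the numbers are hypothetical.

```python
# Sketch: least squares adjustment with a full observation
# variance-covariance matrix (generalized/weighted least squares).
# The functional model A @ x ~ l is a hypothetical linear example.
import numpy as np

# Design matrix A (functional model) and observation vector l
A = np.array([[1.0, 2.0],
              [1.0, 3.0],
              [1.0, 5.0],
              [1.0, 7.0]])
l = np.array([3.1, 4.0, 6.1, 8.2])

# Stochastic model: variance-covariance matrix of the observations
Sigma_ll = np.array([[0.04, 0.01, 0.00, 0.00],
                     [0.01, 0.09, 0.00, 0.00],
                     [0.00, 0.00, 0.04, 0.02],
                     [0.00, 0.00, 0.02, 0.16]])
P = np.linalg.inv(Sigma_ll)                 # weight matrix

# Normal equations: x_hat = (A' P A)^{-1} A' P l
N = A.T @ P @ A
x_hat = np.linalg.solve(N, A.T @ P @ l)

# Residuals and cofactor matrix of the estimated parameters
v = A @ x_hat - l
Q_xx = np.linalg.inv(N)                     # parameter cofactor matrix
print("estimated parameters:", x_hat)
print("residuals:", v)
print("parameter covariance (up to the variance factor):", Q_xx)
```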
History of engineering & technology --- EM-algorithm --- multi-GNSS --- PPP --- process noise --- observation covariance matrix --- extended Kalman filter --- machine learning --- GNSS phase bias --- sequential quasi-Monte Carlo --- variance reduction --- autoregressive processes --- ARMA-process --- colored noise --- continuous process --- covariance function --- stochastic modeling --- time series --- elementary error model --- terrestrial laser scanning --- variance-covariance matrix --- terrestrial laser scanner --- stochastic model --- B-spline approximation --- Hurst exponent --- fractional Gaussian noise --- generalized Hurst estimator --- very long baseline interferometry --- sensitivity --- internal reliability --- robustness --- CONT14 --- Errors-In-Variables Model --- Total Least-Squares --- prior information --- collocation vs. adjustment --- mean shift model --- variance inflation model --- outlier detection --- likelihood ratio test --- Monte Carlo integration --- data snooping --- GUM analysis --- geodetic network adjustment --- stochastic properties --- random number generator --- Monte Carlo simulation --- 3D straight line fitting --- total least squares (TLS) --- weighted total least squares (WTLS) --- nonlinear least squares adjustment --- direct solution --- singular dispersion matrix --- laser scanning data