Dissertation (KU Leuven, 2020)
Efficient Probabilistic Assessment of Hygrothermal Performance: Sequential Monte Carlo and Decomposition Methods


Abstract

In this PhD project, we aim to improve the computational efficiency of probabilistic hygrothermal assessment through two main approaches. The first approach focuses on the core model itself and aims at reducing the computation time of a single deterministic simulation. In this project, these core simulation models are mainly wall models, which simulate the hygrothermal behavior of building materials and components in multi-layer walls. Several one- (1D), two- or three-dimensional (2D or 3D) models can be found in the literature. However, applying these models is usually very time consuming, due to the high number of degrees of freedom after spatial and temporal discretization. Instead of these original models, Van Gelder et al. used statistical surrogate models (such as polynomial regression, Kriging, etc.) to reduce the simulation time. Since these statistical surrogate models can only deliver static results, however, surrogate models that mimic the dynamic behavior (such as the time evolution of temperatures) need to be developed. To lower the computational complexity while retaining the dynamic behavior of the original model, model order reduction (MOR) methods are commonly used. Through model order reduction, a large original model is approximated by a reduced model, and the solution of the original system can be recovered from the solution of the reduced model.

The second approach restricts the number of repetitions of the core deterministic model needed within the Monte Carlo framework, which is the tool applied for estimating the probability distribution of the output parameters. The current state of the art in the Monte Carlo method is based on a replicated optimized Latin hypercube sampling strategy. Optimized Latin hypercube sampling divides each parameter into n intervals and ensures that only a single sample is placed in each interval. Even though optimized Latin hypercube sampling has a good convergence rate (roughly 1/n), it is a variance-reduction method, which makes its convergence difficult to monitor. To make convergence monitoring possible, replicated Latin hypercube sampling was presented by Janssen: instead of a single n-run optimized Latin hypercube design, it uses permuted repetitions of smaller designs to reach the set number of runs n. As a consequence, it allows evaluating the variances on the Monte Carlo outcomes, which in turn permits halting the calculation when the desired accuracy levels are reached. The main drawback of this method, however, is that it does not converge as fast as regular optimized Latin hypercube designs. Another approach is to use low-discrepancy sampling designs to generate the input variables of the Monte Carlo framework. Singhee [6] showed that low-discrepancy sampling designs are often a better choice than both simple random sampling and Latin hypercube sampling, owing to their lower variance, faster convergence and better accuracy. This result motivates us to study the application of a sequential sampling method based on a low-discrepancy design for improving the efficiency of the Monte Carlo analysis.
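To illustrate the model-order-reduction idea mentioned in the abstract, the following is a minimal sketch of projection-based reduction (proper orthogonal decomposition with Galerkin projection) applied to a toy 1D heat-conduction wall model. The finite-difference model, its dimensions and all parameter values are illustrative assumptions, not the models used in the dissertation.

```python
# Minimal sketch: POD-based model order reduction of a linear 1D wall model.
# The grid size, diffusivity and time step below are illustrative assumptions.
import numpy as np

n = 200                       # full-order degrees of freedom (1D wall nodes)
dx, alpha, dt = 0.001, 1e-7, 60.0

# Full-order system dT/dt = A T (finite differences, zero boundary values)
A = alpha / dx**2 * (np.diag(-2.0 * np.ones(n))
                     + np.diag(np.ones(n - 1), 1)
                     + np.diag(np.ones(n - 1), -1))

def simulate(A_sys, T0, steps, dt):
    """Implicit-Euler time stepping of dT/dt = A_sys T."""
    I = np.eye(A_sys.shape[0])
    step_matrix = np.linalg.inv(I - dt * A_sys)   # fine for small systems
    T, out = T0.copy(), [T0.copy()]
    for _ in range(steps):
        T = step_matrix @ T
        out.append(T.copy())
    return np.array(out)

# 1) Collect snapshots of the full model
T0 = np.linspace(20.0, 0.0, n)                    # initial temperature profile
snapshots = simulate(A, T0, steps=500, dt=dt).T   # shape (n, steps + 1)

# 2) Proper orthogonal decomposition: keep the r dominant modes
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 5
V = U[:, :r]                                      # reduced basis, shape (n, r)

# 3) Galerkin projection yields a small r x r system
A_r = V.T @ A @ V
T0_r = V.T @ T0

# 4) Solve the reduced model and lift the solution back to the full space
z = simulate(A_r, T0_r, steps=500, dt=dt).T       # shape (r, steps + 1)
T_approx = V @ z

print("max reconstruction error:", np.abs(T_approx - snapshots).max())
```

Here the reduced system has only r = 5 unknowns instead of 200, yet its lifted solution stays close to the full model, which is the mechanism the first approach relies on.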
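The replicated Latin hypercube idea can be sketched as follows. For brevity, independent small Latin hypercube designs stand in for Janssen's permuted repetitions, and the three-parameter test function is a hypothetical stand-in for the hygrothermal core model; the tolerance and design sizes are illustrative assumptions.

```python
# Minimal sketch: replicated Latin hypercube sampling with variance-based
# convergence monitoring (independent replicates as a stand-in for permuted
# repetitions of a base design).
import numpy as np
from scipy.stats import qmc

def model(x):
    """Placeholder deterministic 'core model': maps 3 inputs to one output."""
    return np.sin(np.pi * x[:, 0]) + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 2]

d, m = 3, 64            # input dimension, runs per replicate
rel_tol = 0.01          # stop once the relative standard error is below 1 %
max_replicates = 50

means = []
for k in range(1, max_replicates + 1):
    # each replicate is a small Latin hypercube design of m runs
    sampler = qmc.LatinHypercube(d=d, seed=k)
    x = sampler.random(m)
    means.append(model(x).mean())

    if k >= 3:
        # spread across replicate means -> standard error of the MC estimate
        se = np.std(means, ddof=1) / np.sqrt(k)
        if se / abs(np.mean(means)) < rel_tol:
            break

print(f"stopped after {k} replicates ({k * m} runs), "
      f"mean estimate = {np.mean(means):.4f} +/- {se:.4f}")
```

The replicate-to-replicate spread gives an honest error estimate, so the loop can halt as soon as the requested accuracy is reached, which is exactly the benefit (and the single large optimized design is what it trades away in convergence speed).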
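Finally, a rough comparison of a low-discrepancy (Sobol') design with simple random sampling on the same toy function illustrates the faster convergence that motivates the sequential low-discrepancy approach; the sample sizes and test function are again illustrative assumptions.

```python
# Minimal sketch: Sobol' low-discrepancy sampling versus simple random
# sampling for estimating the mean of a toy 3-parameter function.
import numpy as np
from scipy.stats import qmc

def model(x):
    return np.sin(np.pi * x[:, 0]) + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 2]

d = 3
exact = 2.0 / np.pi + 2.0 / 3.0 + 0.25      # analytic mean over the unit cube
rng = np.random.default_rng(0)

print(f"exact mean = {exact:.5f}")
for m in (6, 8, 10, 12):                    # sample sizes 2**m
    n = 2 ** m
    est_mc = model(rng.random((n, d))).mean()            # simple random
    sobol = qmc.Sobol(d=d, scramble=True, seed=0)
    est_qmc = model(sobol.random_base2(m)).mean()        # Sobol' sequence
    print(f"n = {n:5d}   MC error = {abs(est_mc - exact):.5f}   "
          f"QMC error = {abs(est_qmc - exact):.5f}")
```

For smooth responses the Sobol' estimate typically approaches the exact value noticeably faster than simple random sampling, which is the behavior reported by Singhee [6] and the starting point for the sequential low-discrepancy sampling studied in the project.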
