Listing 1 - 9 of 9 |
"Nursing points of attention regarding infection prevention in oncology patients with a port catheter" is a topic suggested to us by one of our placement sites. The ward indicated that it felt it lacked knowledge about the port catheter, and the risk of infection was cited as a barrier to accessing one. We therefore investigated how nurses can act in a considered way when accessing, caring for and removing a port catheter so that the risk of infection is kept as small as possible. A port catheter is a fully subcutaneously implanted central venous access port. It has many advantages, the greatest being its long dwell time: the literature speaks of months to years. In oncology patients it is often inserted to support their treatment, for example to administer chemotherapy. Oncology patients have lowered resistance due to their disease, the use of cytostatics and immunosuppressive medication, which makes them highly susceptible to micro-organisms from outside. Because the port catheter is a direct route into the bloodstream, micro-organisms can enter the body quickly through it. Several types of infection are associated with the port catheter, each with its own consequences: catheter-related local infections, catheter-related bacteraemia and catheter-related sepsis. Being able to distinguish between them is important for treatment. Infection prevention is crucial at every moment a nurse is in contact with the port catheter. Only by being aware of the danger an infection poses to the patient, and of how it can be prevented, can the port catheter be used safely.
Fluorescence microscopy is a key technique for research in biology, medicine and related fields. Certain molecules, called fluorophores, emit light of a specific colour when they are illuminated. Miniaturising a fluorescence microscope onto a chip would make the technique more widely accessible, and also suitable for applications that need fast, parallel analysis, such as DNA sequencing. A chip for fluorescence microscopy involves several interesting elements. Light from a laser is transported across the chip by waveguides, which send beams from different directions into a central region where an interference pattern forms. Fluorophores in the sample placed on top of this central region are excited by the light and start to emit light of their own, which can be captured with a CMOS sensor, a chip similar to that in an ordinary camera. However, the captured light yields blurry pictures, because it is not focused as it would be by conventional lenses. Illuminating only specific spots instead of the full sample makes it possible to calculate the origin of the captured light, since we know which part was illuminated. These light spots can then be moved to scan the entire sample and build up a detailed image. The goal is to design a chip with these abilities. Crossing waves create interference patterns, and interference patterns with bright spots at regular locations are called optical lattices. Not every optical lattice is suitable for structured illumination: the spots must be bright enough and sufficiently far apart. The integer lattice method, developed by D. Kouznetsov, provides a systematic way of investigating possible lattices, and it also defines how they can be created and translated. This led to the currently existing design of an optical lattice-generating chip. It measures 5.6 mm × 5.6 mm, but the lattice is generated only in a central region of 100 μm × 100 μm.
This is inefficient, so an alternative way to design a compact chip is desired. A computer can simulate the pattern generated by a device and optimise the device for a lattice. This process, called inverse design, produces compact but irregular devices; the optical lattice can then no longer be translated to scan the sample. In this thesis, explainable AI is used to extract design concepts for a compact, regular optical lattice-generating chip. Shapley values, a form of explainable AI, originate from game theory and can indicate the regions of an image most relevant to a neural network. A neural network is a layered structure of mathematical operations with tunable parameters, able to perform very complex tasks. Using inverse design, a large database of two classes of optical lattice-generating devices was created, and a neural network was trained on this database to distinguish devices generating different optical lattices. The Shapley values of the dataset with respect to this network were computed and analysed. Specific features similar to the elements of the integer lattice method were recognised, and the Shapley values provided guidance towards a new design. This design is defined by mathematics rather than the seemingly random shapes of inverse design, and it is slightly more compact than inverse designs. Unfortunately, it shares the difficulties of inverse design in fabrication and in translating the optical lattice, and further research is needed.
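The interference idea behind an optical lattice can be sketched numerically: summing a few coherent plane waves travelling in different in-plane directions gives an intensity pattern with bright spots at regular positions. The wavelength, grid size and beam directions below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

# Sketch: interference of crossing plane waves forming an optical lattice.
# All numerical values here are assumptions for illustration only.
wavelength = 1.55e-6            # metres (assumed telecom wavelength)
k = 2 * np.pi / wavelength      # wavenumber

# Sample a small central illumination region on a grid.
x = np.linspace(0, 10e-6, 200)
y = np.linspace(0, 10e-6, 200)
X, Y = np.meshgrid(x, y)

# Four coherent beams entering from different in-plane directions.
angles = np.deg2rad([0, 90, 180, 270])
field = np.zeros_like(X, dtype=complex)
for theta in angles:
    kx, ky = k * np.cos(theta), k * np.sin(theta)
    field += np.exp(1j * (kx * X + ky * Y))   # superpose the plane waves

# Bright spots appear where all waves are in phase; with four unit-amplitude
# waves the peak intensity is |4|^2 = 16.
intensity = np.abs(field) ** 2
```

Adding a common phase offset to one beam shifts the pattern, which is the mechanism by which such a lattice can be translated to scan the sample.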
An MRI scan allows the internal organs of a human to be imaged. A whole-body scan is necessarily taken in multiple segments, and as a result differences in brightness (signal intensity) arise between these parts. Radiologists who examine the images cannot view the entire body at the same quality in a single glance. Moreover, these differences make it difficult to compare the signal intensity of lesions in different segments, and lesions on the border between two segments can go unnoticed. It is therefore important to equalize the overall intensity level of each image segment across the entire body (normalization). The current model used at UZ Leuven hospital achieves this normalization with a statistical measure, the 95th percentile, which represents a sort of maximum intensity in which the highest values are ignored. However, this method does not always work well, because this maximum can vary with the tissue composition of a segment. For example, water produces a strong signal and can be present in large amounts in the abdomen in certain diseases. When such high-intensity values are present in only a few segments, the current model performs poorly, complicating the interpretation of the images. To improve normalization, this thesis devised and implemented six new normalization methods. Two of these are based on the diffusion map, an image obtained with a special type of MRI scan in which the brightness at each point indicates how freely hydrogen atoms can move at that point. Imagine having this diffusion map on paper: you cut out the parts with a certain greyscale value and place the result as a mask over the original scan. Normalization can then be done based on the median or the 95th percentile of the signal intensity values that remain visible. The next two methods do not use the signal itself but its variation: they calculate the local variation and normalize based on that.
The fifth method looks at the maximum of the Laplace image, in which each pixel is the sum of all differences with its surrounding pixels. Finally, the sixth method first determines the body contour, selects a band just below this contour, and normalizes based on the 95th percentile of that band. These methods were applied to a dataset of 24 scans, and the resulting images were compared both visually and with several quality metrics: the variation of the 95th percentile per slice, the difference between the slices at the edges of the segments, and the difference in the average intensity of a selected tissue across the segments. All methods improved the intensity consistency between segments, the Laplace and body contour-based methods only to a limited extent, those based on the diffusion map and on variation more so. The method that normalizes on the median of the region selected by the diffusion map stood out strongly and appears promising as a replacement for the current model used in the hospital. Although it still needs to be optimized and tested, this thesis suggests that this normalization method could be a valuable tool for radiologists in the analysis of MRI images.
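Two of the normalization ideas described above can be sketched as follows. The data are synthetic and the function names are our own; this is not the UZ Leuven model, only an illustration of scaling each segment by its 95th percentile, and of scaling by the median of a masked region (as a mask derived from a diffusion map would be used).

```python
import numpy as np

# Synthetic stand-ins for three MRI segments with different overall
# brightness (the gamma distribution and scale factors are assumptions).
rng = np.random.default_rng(0)
segments = [rng.gamma(2.0, 50.0, size=(64, 64)) * s for s in (1.0, 1.6, 0.7)]

def normalize_p95(segment, target=1.0):
    """Scale a segment so its 95th percentile equals `target`
    (the approach of the current model, per the abstract)."""
    return segment * (target / np.percentile(segment, 95))

def normalize_masked_median(segment, mask, target=1.0):
    """Scale a segment so the median of the voxels selected by a mask
    (e.g. one derived from the diffusion map) equals `target`."""
    return segment * (target / np.median(segment[mask]))

p95_normalized = [normalize_p95(s) for s in segments]

# Toy stand-in for a diffusion-map mask: keep the brighter half of one segment.
mask = segments[0] > np.median(segments[0])
med_normalized = normalize_masked_median(segments[0], mask)
```

After normalization, each segment's 95th percentile (or masked median) agrees across segments, which is the consistency property the quality metrics in the thesis measure.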
Economic law --- competition --- economic law --- Belgium