Search: "Computer"
Showing results 21 - 25 of 7042 dissertations containing the word Computer.
21. QoS Driven Coordinated Management of Resources to Save Energy in Multi-Core Systems
Abstract: Reducing the energy consumption of computing systems is a necessary endeavor. However, saving energy should not come at the expense of degrading user experience. To this end, in this thesis, we assume that applications running on multi-core processors are associated with a quality-of-service (QoS) target in terms of performance constraints.
22. Deterministic, Explainable and Resource-Efficient Stream Processing for Cyber-Physical Systems
Abstract: We are undeniably living in the era of big data, where people and machines generate information at an unprecedented rate. While processing such data can provide immense value, it can prove especially challenging because of the data's Volume, Variety and Velocity.
23. Realistic Real-Time Rendering of Global Illumination and Hair through Machine Learning Precomputations
Abstract: Over the last decade, machine learning has gained a lot of traction in many areas, and with the advent of new GPU models that include acceleration hardware for neural network inference, real-time applications have also started to take advantage of these algorithms. In general, machine learning and neural network methods are not designed to run at the speeds required for rendering in high-performance real-time environments, except for very specific and typically limited uses.
24. Energy and Route Optimization of Moving Devices
Abstract: This thesis highlights our efforts in energy and route optimization of moving devices. We have focused on three categories of such devices: industrial robots in a multi-robot environment, generic vehicles in a vehicle routing problem (VRP) context, and automated guided vehicles (AGVs) in a large-scale flexible manufacturing system (FMS).
25. Multi-LSTM Acceleration and CNN Fault Tolerance
Abstract: This thesis addresses the following two problems in the field of Machine Learning: the acceleration of multiple Long Short-Term Memory (LSTM) models on FPGAs, and the fault tolerance of compressed Convolutional Neural Networks (CNNs). LSTMs represent an effective solution for capturing long-term dependencies in sequential data, such as sentences in Natural Language Processing applications, video frames in Scene Labeling tasks, or temporal series in Time Series Forecasting.