Search: "Memory requirements"
Showing results 21 - 25 of 152 dissertations containing the words "Memory requirements".
21. The neurophysiology of working memory : functional mapping of the human brain with positron emission tomography
Abstract: Working memory (WM) refers to the retention, or keeping on-line, of information over short periods of time. WM is thought to be important for a variety of cognitive functions, including problem solving, learning, and reading.
22. Structural, Electronic and Mechanical Properties of Advanced Functional Materials
Abstract: The search for alternative and renewable energy resources, together with the efficient use of energy and the development of systems that reduce energy consumption, is driven by the exponential growth of the world population, limited conventional fossil-fuel resources, and the increasing demand for clean, environmentally friendly substitutes. Hydrogen, being the simplest, most abundant, and cleanest energy carrier, has the potential to fulfill some of these requirements, provided that efficient, safe, and durable systems for its production, storage, and usage are developed.
23. Interframe Quantization for Noisy Channels
Abstract: The demand for efficient transmission and storage of, for example, speech signals places high demands on the development of signal compression techniques. This thesis deals with vector quantization (VQ), a powerful technique for signal compression.
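To make the idea of vector quantization concrete, here is a minimal illustrative sketch (not taken from the thesis, and ignoring codebook training and channel noise): each input vector is mapped to the nearest codeword in a small codebook, so that only the codeword index needs to be transmitted. The codebook size, dimensions, and variable names are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical codebook: 4 codewords, each a 2-D vector.
codebook = rng.normal(size=(4, 2))

def quantize(vectors, codebook):
    # For each input vector, return the index of the nearest
    # codeword under Euclidean distance.
    dists = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
    return np.argmin(dists, axis=1)

signal = rng.normal(size=(8, 2))      # toy "signal" blocks to compress
indices = quantize(signal, codebook)  # only these indices are transmitted
reconstruction = codebook[indices]    # the decoder looks up each index
```

In a real VQ codec the codebook would be trained (e.g. with a Lloyd/k-means style algorithm) to minimize expected distortion, and, for noisy channels, index assignment would be chosen so that bit errors map to nearby codewords.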
24. Techniques to Cancel Execution Early to Improve Processor Efficiency
Abstract: The evolution of computer systems toward ever better execution efficiency has traditionally embraced various approaches across microprocessor generations. Unfortunately, although contemporary processors offer an unprecedented level of computing capability, they still suffer from several inefficiencies.
25. Multi-LSTM Acceleration and CNN Fault Tolerance
Abstract: This thesis addresses two problems in the field of machine learning: the acceleration of multiple Long Short-Term Memory (LSTM) models on FPGAs, and the fault tolerance of compressed Convolutional Neural Networks (CNNs). LSTMs are an effective solution for capturing long-term dependencies in sequential data, such as sentences in Natural Language Processing applications, video frames in Scene Labeling tasks, or temporal series in Time Series Forecasting.