Machine Learning Under Budget Constraints

Machine Learning under Resource Constraints - Applications

Author: Katharina Morik
language: en
Publisher: Walter de Gruyter GmbH & Co KG
Release Date: 2022-12-31
Machine Learning under Resource Constraints addresses, in three volumes, novel machine learning algorithms that are challenged by high-throughput data, by high dimensions, or by complex structures in the data. Resource constraints are given by the relation between the demands of processing the data and the capacity of the computing machinery. The resources are runtime, memory, communication, and energy; hence, modern computer architectures play a significant role. Novel machine learning algorithms are optimized for minimal resource consumption, and learned predictors are executed on diverse architectures to save resources. The series provides a comprehensive overview of the novel approaches to machine learning research that consider resource constraints, as well as the application of the described methods in various domains of science and engineering. Volume 3 describes how resource-aware machine learning methods and techniques are used to successfully solve real-world problems. The book provides numerous specific application examples. In the areas of health and medicine, it demonstrates how machine learning can improve risk modelling, diagnosis, and treatment selection for diseases. Machine-learning-supported quality control during the manufacturing process, which reduces material and energy costs and shortens testing times, is shown through diverse real-time applications in electronics and steel production as well as milling. Additional application examples show how machine learning can make traffic, logistics, and smart cities more efficient and sustainable. Finally, mobile communications can benefit substantially from machine learning, for example by uncovering hidden characteristics of the wireless channel.
Machine Learning under Resource Constraints - Fundamentals

Author: Katharina Morik
language: en
Publisher: Walter de Gruyter GmbH & Co KG
Release Date: 2022-12-31
Machine Learning under Resource Constraints addresses, in three volumes, novel machine learning algorithms that are challenged by high-throughput data, by high dimensions, or by complex structures in the data. Resource constraints are given by the relation between the demands of processing the data and the capacity of the computing machinery. The resources are runtime, memory, communication, and energy; hence, modern computer architectures play a significant role. Novel machine learning algorithms are optimized for minimal resource consumption, and learned predictors are executed on diverse architectures to save resources. The series provides a comprehensive overview of the novel approaches to machine learning research that consider resource constraints, as well as the application of the described methods in various domains of science and engineering. Volume 1 establishes the foundations of this new field. It goes through all the steps from data collection, through summarization and clustering, to the different aspects of resource-aware learning, i.e., hardware, memory, energy, and communication awareness. Several machine learning methods are inspected with respect to their resource requirements and how to enhance their scalability on diverse computing architectures ranging from embedded systems to large computing clusters.
Machine Learning Under Budget Constraints

This thesis studies the problem of machine learning under budget constraints; in particular, we focus on the cost of the information the system uses to predict accurately. Most methods in machine learning define quality as performance (e.g., accuracy) on the task at hand but ignore the cost of the model itself: for instance, the number of examples and/or labels needed during learning, the memory used, or the number of features required to predict at test time. In this manuscript we propose several methods for cost-sensitive prediction with respect to the quantity of features used. We present three models that learn to predict under such a constraint, i.e., that learn a strategy to gather only the information necessary to predict well at a small cost. The first model is a static approach applied to cold-start recommendation. We then define two adaptive methods that allow a better trade-off between cost and accuracy in a more generic setting. We rely on representation learning techniques, along with recurrent neural network architectures and gradient descent algorithms for learning. In the last part of the thesis, we study the problem of active learning, where one aims to constrain the number of labels used to train a model. We present a novel approach to this problem using meta-learning, with an instantiation based on bi-directional recurrent neural networks.
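The static setting described above can be illustrated with a toy sketch: given a per-feature acquisition cost and an importance score for each feature, greedily select features by importance-per-cost until a budget is exhausted. This is only an illustrative simplification of the general idea, not the thesis's actual models; the importance scores, costs, and budget below are invented.

```python
import numpy as np

def select_features_under_budget(importances, costs, budget):
    """Greedy static feature selection under a budget.

    Pick features in decreasing order of importance-per-cost,
    skipping any feature that would exceed the remaining budget.
    Returns the sorted indices of the chosen features and the
    total cost spent.
    """
    order = np.argsort(-importances / costs)  # best ratio first
    selected, spent = [], 0.0
    for j in order:
        if spent + costs[j] <= budget:
            selected.append(int(j))
            spent += float(costs[j])
    return sorted(selected), spent

# Toy example: 4 features with varying usefulness and cost.
importances = np.array([0.9, 0.5, 0.4, 0.1])
costs       = np.array([3.0, 1.0, 1.0, 0.5])
selected, spent = select_features_under_budget(importances, costs, budget=2.5)
# The two cheap, high-ratio features are taken first; the expensive
# feature 0 no longer fits, so the remaining budget goes to feature 3.
```

The adaptive methods in the thesis go further: instead of fixing one feature subset for all inputs, a learned policy (e.g., a recurrent network trained by gradient descent) decides per example which feature to acquire next, trading off cost against accuracy.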