Book contents
- Frontmatter
- Contents
- List of Figures
- List of Tables
- 1 Introduction
- 2 Preliminary
- 3 Fundamental Theory and Algorithms of Edge Learning
- 4 Communication-Efficient Edge Learning
- 5 Computation Acceleration
- 6 Efficient Training with Heterogeneous Data Distribution
- 7 Security and Privacy Issues in Edge Learning Systems
- 8 Edge Learning Architecture Design for System Scalability
- 9 Incentive Mechanisms in Edge Learning Systems
- 10 Edge Learning Applications
- Bibliography
- Index
4 - Communication-Efficient Edge Learning
Published online by Cambridge University Press: 14 January 2022
Summary
Edge learning enables the training of large-scale machine learning models on big datasets by implementing data parallelism across multiple nodes. However, the iterative interactions among learning nodes, together with the considerable volume of data exchanged in each interaction, incur substantial communication overhead, which greatly hinders the scalability of edge learning. In this chapter, we introduce the mainstream approaches to communication-efficient edge training: compressing communication data, reducing the synchronization frequency, overlapping computation with communication, and optimizing the transmission network. Specifically, we propose two hybrid mechanisms for communication-efficient edge learning. The first, QOSP, integrates gradient quantization for communication compression with overlap synchronization parallelism for simultaneous computation and communication. The second improves communication efficiency during the aggregation of client-side updates by quantizing the gradients and exploiting the inherent superposition of radio-frequency signals. Finally, we discuss future directions for communication-efficient edge learning.
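The gradient-quantization idea mentioned above can be sketched in a few lines. The following is a minimal illustration of QSGD-style stochastic uniform quantization, not the exact scheme used in QOSP; the function names and the choice of four quantization levels are our own assumptions for the sake of the example:

```python
import numpy as np

def quantize(grad, levels=4):
    """Stochastically quantize a gradient vector to a few integer levels.

    Each worker would transmit only the scalar `scale` and the int8
    `codes`, instead of full-precision floats, compressing communication.
    Rounding is randomized so the quantizer is unbiased:
    E[dequantize(quantize(g))] = g.
    """
    scale = np.max(np.abs(grad))
    if scale == 0.0:
        return scale, np.zeros(grad.shape, dtype=np.int8)
    # Map each |g_i| / scale into [0, levels], then round up with
    # probability equal to the fractional part (stochastic rounding).
    normalized = np.abs(grad) / scale * levels
    lower = np.floor(normalized)
    round_up = np.random.rand(*grad.shape) < (normalized - lower)
    codes = (np.sign(grad) * (lower + round_up)).astype(np.int8)
    return scale, codes

def dequantize(scale, codes, levels=4):
    """Reconstruct an (unbiased) estimate of the gradient at the receiver."""
    return scale * codes.astype(np.float32) / levels
```

With four levels, each coordinate needs only a sign and roughly three bits instead of 32, at the cost of added quantization noise; averaging over many workers or iterations washes this noise out in expectation, which is why the unbiasedness property matters.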
In: Edge Learning for Distributed Big Data Analytics: Theory, Algorithms, and System Design, pp. 42-72. Publisher: Cambridge University Press. Print publication year: 2022.