Communication Efficient Distributed Learning Over Wireless Channels

Idan Achituve, Wenbo Wang, Ethan Fetaya, Amir Leshem*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Vertically distributed learning exploits the local features collected by multiple learning workers to form a better global model. However, the data exchange between the workers and the model aggregator required for parameter training incurs a heavy communication burden, especially when the learning system is built on capacity-constrained wireless networks. In this letter, we propose a novel hierarchical distributed learning framework in which each worker separately learns a low-dimensional embedding of its locally observed data. The workers then perform distributed max-pooling to transmit the synthesized input to the aggregator in a communication-efficient manner. For data exchange over a shared wireless channel, we propose an opportunistic carrier sensing-based protocol that implements the max-pooling of the outputs of all the workers. Our simulation experiments show that the proposed learning framework achieves almost the same model accuracy as a model trained on the concatenation of all the workers' raw outputs, while significantly reducing the communication load.
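The abstract's core mechanism, coordinate-wise max-pooling resolved over a shared channel by opportunistic carrier sensing, can be illustrated with a minimal simulation sketch. The letter's actual protocol and parameters are not reproduced on this page; the function name opportunistic_max_pooling, the backoff window t_max, and the linear value-to-backoff mapping below are illustrative assumptions, not the authors' specification. The sketch only shows why a single transmission per embedding coordinate suffices to deliver the coordinate-wise maximum to the aggregator.

import numpy as np

def opportunistic_max_pooling(embeddings, t_max=1.0):
    """Sketch of per-coordinate max-pooling via opportunistic carrier sensing.

    embeddings : (num_workers, dim) array of local embedding outputs.
    t_max      : maximum backoff window (illustrative parameter).

    Each worker maps its value to a backoff time through a strictly
    decreasing function, so the worker holding the largest value for a
    coordinate senses an idle channel first and transmits; the others
    overhear the transmission and defer. Only one scalar per coordinate
    is sent, instead of one scalar per worker per coordinate.
    """
    num_workers, dim = embeddings.shape
    lo, hi = embeddings.min(), embeddings.max()
    # Normalize values to [0, 1]; a larger value yields a shorter backoff.
    norm = (embeddings - lo) / (hi - lo + 1e-12)
    backoff = t_max * (1.0 - norm)

    pooled = np.empty(dim)
    transmissions = 0
    for d in range(dim):
        winner = np.argmin(backoff[:, d])  # first worker to seize the channel
        pooled[d] = embeddings[winner, d]  # only the winner transmits
        transmissions += 1
    return pooled, transmissions

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    local_embeddings = rng.normal(size=(8, 16))  # 8 workers, 16-dim embeddings
    pooled, tx = opportunistic_max_pooling(local_embeddings)
    assert np.allclose(pooled, local_embeddings.max(axis=0))
    print(f"coordinates: {pooled.size}, transmissions: {tx} "
          f"(vs. {local_embeddings.size} scalars if every worker sent its full embedding)")

In this toy setting the aggregator receives 16 scalars instead of 128, which is the kind of communication saving the letter targets; the real protocol additionally has to cope with sensing delays and collisions on the wireless channel.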

Original language: English
Pages (from-to): 1402-1406
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 30
State: Published - 2023
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 1994-2012 IEEE.

Keywords

  • Learning over wireless channels
  • max-pooling
  • opportunistic carrier sensing
  • vertically distributed learning

