Evaluating the Communication Efficiency in Federated Learning Algorithms

Muhammad Asad, Ahmed Moustafa, Takayuki Ito, Muhammad Aslam

Research output: Contribution to conference › Paper › peer-review

29 Citations (Scopus)

Abstract

In the era of advanced technologies, mobile devices are equipped with computing and sensing capabilities that generate vast amounts of data, which are well suited for training learning models. Coupled with advances in Deep Learning (DL), these models power numerous useful applications, e.g., image processing, speech recognition, healthcare, and vehicular networks. Traditionally, Machine Learning (ML) approaches require the data to be centralised in cloud-based data-centres. However, this data is often large in volume and privacy-sensitive, which makes uploading it to such data-centres for model training impractical and leads to critical issues of high latency and communication inefficiency. Recently, in light of new privacy legislation in many countries, the concept of Federated Learning (FL) has been introduced. In FL, mobile users collaboratively learn a global model by aggregating their locally trained models without sharing the privacy-sensitive data. These mobile users usually have slow network connections to the data-centre where the global model is maintained. Moreover, complex, large-scale networks involve heterogeneous devices with diverse energy constraints, which raises the challenge of communication cost when implementing FL at scale. To this end, in this research, we begin with the fundamentals of FL, then highlight recent FL algorithms and evaluate their communication efficiency with detailed comparisons. Furthermore, we propose a set of solutions to alleviate the existing FL problems from both a communication perspective and a privacy perspective.
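To make the aggregation idea in the abstract concrete, the following is a minimal, illustrative sketch of federated averaging in Python: each client runs a few local training steps on data that never leaves the device, and the server only averages the resulting model weights. The linear model, synthetic data, and hyper-parameters are assumptions for clarity; this is not one of the specific algorithms the paper evaluates.

```python
# Minimal federated-averaging sketch: clients train locally, the server
# aggregates weights without ever seeing the raw (privacy-sensitive) data.
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient steps on squared loss."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Synthetic client data (assumed for illustration); sizes are heterogeneous.
true_w = np.array([2.0, -1.0])
clients = []
for n in (30, 50, 20):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    clients.append((X, y))

# Server-side rounds: only model weights are communicated, not data.
global_w = np.zeros(2)
for rnd in range(10):
    local_ws, sizes = [], []
    for X, y in clients:
        local_ws.append(local_update(global_w, X, y))
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    # Weighted average of client models, proportional to local data size.
    global_w = np.average(local_ws, axis=0, weights=sizes / sizes.sum())

print("estimated global weights:", global_w)  # approaches [2, -1]
```

In this setup the per-round communication cost is just the size of the model weights times the number of participating clients, which is the quantity the paper's comparisons focus on.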
Original language: English
Pages: 552-557
Number of pages: 6
DOIs
Publication status: Published - 05 May 2021
Event: IEEE 24th International Conference on Computer Supported Cooperative Work in Design - Dalian, China
Duration: 05 May 2021 – 07 May 2021

Conference

Conference: IEEE 24th International Conference on Computer Supported Cooperative Work in Design
Abbreviated title: IEEE CSCWD 2021
Country/Territory: China
City: Dalian
Period: 05 May 2021 – 07 May 2021

Keywords

  • federated learning
  • collaborative learning
  • communication cost
  • decentralised data
