Abstract
We envision a mobile edge computing (MEC) framework for machine learning (ML) technologies, which leverages distributed client data and computation resources for training high-performance ML models while preserving client privacy. Toward this future goal, this work aims to extend Federated Learning (FL), a decentralized learning framework that enables privacy-preserving training of models, to work with heterogeneous clients in a practical cellular network. The FL protocol iteratively asks random clients to download a trainable model from a server, update it with their own data, and upload the updated model to the server, while asking the server to aggregate multiple client updates to further improve the model. While clients in this protocol never have to disclose their private data, the overall training process can become inefficient when some clients have limited computational resources (i.e., longer update times) or poor wireless channel conditions (longer upload times). Our new FL protocol, which we refer to as FedCS, mitigates this problem and performs FL efficiently while actively managing clients based on their resource conditions. Specifically, FedCS solves a client selection problem with resource constraints, which allows the server to aggregate as many client updates as possible and to accelerate performance improvement of ML models. We conducted an experimental evaluation using publicly available large-scale image datasets to train deep neural networks in simulated MEC environments. The experimental results show that FedCS completes its training process in a significantly shorter time than the original FL protocol.
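To make the protocol described above concrete, below is a minimal, hypothetical Python sketch of one FedCS-style round: the server greedily selects clients whose estimated update and upload times fit within a round deadline, then aggregates their models FedAvg-style. All names (Client, select_clients, aggregate, the deadline parameter) are illustrative assumptions, and the sequential greedy packing is a simplification of the scheduling formulation in the paper.

```python
# Hypothetical sketch of one FedCS-style round, not the paper's exact algorithm.
from dataclasses import dataclass
from typing import Dict, List

import numpy as np


@dataclass
class Client:
    cid: int
    num_samples: int   # size of the client's local dataset
    t_update: float    # estimated local-update time (s), from resource info
    t_upload: float    # estimated model-upload time (s), from channel state


def select_clients(candidates: List[Client], deadline: float) -> List[Client]:
    """Greedily pack as many clients as possible into one round.

    Approximates FedCS's client-selection step: clients whose estimated
    update + upload time still fits within the remaining round budget are
    accepted, fastest first (a simplification of the paper's formulation).
    """
    selected, elapsed = [], 0.0
    for c in sorted(candidates, key=lambda c: c.t_update + c.t_upload):
        if elapsed + c.t_update + c.t_upload <= deadline:
            selected.append(c)
            elapsed += c.t_update + c.t_upload
    return selected


def aggregate(global_model: np.ndarray,
              updates: Dict[int, np.ndarray],
              clients: List[Client]) -> np.ndarray:
    """FedAvg-style aggregation: average client models weighted by data size."""
    total = sum(c.num_samples for c in clients)
    new_model = np.zeros_like(global_model)
    for c in clients:
        new_model += (c.num_samples / total) * updates[c.cid]
    return new_model
```

The key design point is that slow or poorly connected clients are filtered out before the round starts, so the server aggregates as many updates as possible per unit time instead of waiting on stragglers.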
Publication Info
- Year: 2019
- Type: article
- Citations: 1292
- Access: Closed
Identifiers
- DOI: 10.1109/icc.2019.8761315