Related Publications

Federated Machine Learning

Today’s artificial intelligence still faces two major challenges. One is that, in most industries, data exists in the form of isolated islands. The other is the strengthening of...

2019 · ACM Transactions on Intelligent Syste... · 5133 citations

Relational Knowledge Distillation

Knowledge distillation aims at transferring knowledge acquired in one model (a teacher) to another model (a student) that is typically smaller. Previous approaches can be expres...

2019 · 2019 IEEE/CVF Conference on Computer ... · 1437 citations
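The abstract above describes the standard distillation setup that relational distillation builds on: a student is trained to match a teacher's temperature-softened output distribution. A minimal sketch of that baseline objective (the classic Hinton-style loss, not the relational variant this paper proposes, and not code from either paper):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T flattens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=4.0):
    """KL(teacher || student) over temperature-softened distributions.

    The T*T factor keeps gradient magnitudes comparable across
    temperatures, as in the original distillation formulation.
    """
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * T * T
```

The loss is zero when the student's logits induce the same softened distribution as the teacher's, and positive otherwise, so minimizing it pulls the student toward the teacher's full output distribution rather than just its top-1 label.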

Publication Info

Year: 2025
Type: article
Volume: 12
Issue: 1
Citations: 0
Access: Closed

Citation Metrics

Citations (OpenAlex): 0

Cite This

Ping Zhang, Wenlong Lu, X. R. Zhou et al. (2025). FedMGKD: a multi-granularity trusted knowledge distillation framework for edge personalized federated learning. Complex & Intelligent Systems, 12(1). https://doi.org/10.1007/s40747-025-02142-x

Identifiers

DOI: 10.1007/s40747-025-02142-x