FedPDM: Representation Enhanced Federated Learning with Privacy Preserving Diffusion Models

38 Pages. Posted: 24 May 2025

Wei Guo

affiliation not provided to SSRN

Fuzhen Zhuang

affiliation not provided to SSRN

Yiqi Tong

affiliation not provided to SSRN

Xiao Zhang

Shandong University

Zhaojun Hu

Renmin University of China

Jiejie Zhao

Zhongguancun Laboratory

Jin Dong

Beijing Academy of Blockchain and Edge Computing

Abstract

Most existing semi-parameter-sharing federated learning (FL) frameworks use generative models to share only part of the model parameters with the server, which strengthens data privacy for each client. However, these generative models often suffer utility degradation due to poor representation robustness, and representation inconsistency between local and global models exacerbates client drift under non-IID scenarios. Furthermore, existing semi-parameter-sharing FL frameworks overlook the representation leakage risks of sharing the generator and fail to balance privacy and utility. To address these challenges, we propose FedPDM, a semi-parameter-sharing FL framework built on a privacy-preserving diffusion model (PDM). Specifically, the PDM aligns the model with features from the privacy extractor without exposing the extractor itself, mitigating the utility degradation caused by poor representation robustness. Moreover, a feature-level penalty term is added to the PDM's optimization objective to prevent representation leakage. We further design a two-stage aggregation strategy that addresses representation inconsistency through initialization correction with a Gaussian constraint for knowledge distillation. Finally, we provide the first theoretical convergence analysis for semi-parameter-sharing FL, showing that our framework converges at a rate of O(1/T). Extensive experiments on four datasets show that FedPDM improves average accuracy by 1.78% to 5.56% over various state-of-the-art baselines.
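
The abstract describes the PDM objective only at a high level. The minimal PyTorch sketch below shows one way a standard denoising loss could be combined with a feature-level anti-leakage penalty on a client; the toy networks, the cosine-similarity penalty form, and the weight lambda_feat are assumptions for illustration, not the authors' implementation.

    # Illustrative sketch only: the abstract does not give FedPDM's exact objective.
    # ToyDenoiser, the stand-in privacy extractor, the penalty form, and lambda_feat
    # are assumptions used to show a denoising loss plus a feature-level penalty.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ToyDenoiser(nn.Module):
        """Stand-in for the client-side diffusion (denoising) network."""
        def __init__(self, dim=32):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(dim + 1, 64), nn.ReLU(), nn.Linear(64, dim))

        def forward(self, x_noisy, t):
            # Condition on the (normalized) timestep by simple concatenation.
            t_feat = t.float().unsqueeze(-1) / 1000.0
            return self.net(torch.cat([x_noisy, t_feat], dim=-1))

    def pdm_loss(denoiser, privacy_extractor, x, lambda_feat=0.1):
        """Denoising loss plus an assumed feature-level anti-leakage penalty."""
        noise = torch.randn_like(x)
        t = torch.randint(0, 1000, (x.size(0),))
        x_noisy = x + noise                          # simplified forward process
        pred_noise = denoiser(x_noisy, t)
        denoise_loss = F.mse_loss(pred_noise, noise)

        # Penalize similarity between features recoverable from the denoised
        # output and the privacy extractor's true features (hypothetical form).
        with torch.no_grad():
            private_feat = privacy_extractor(x)
        recovered_feat = privacy_extractor(x_noisy - pred_noise)
        leakage = F.cosine_similarity(recovered_feat, private_feat, dim=-1).mean()

        return denoise_loss + lambda_feat * leakage

    # Usage on random data:
    extractor = nn.Linear(32, 16)                    # stand-in privacy extractor
    denoiser = ToyDenoiser(dim=32)
    loss = pdm_loss(denoiser, extractor, torch.randn(8, 32))
    loss.backward()

In this reading, the denoising term preserves generator utility while the penalty discourages the shared generator from reproducing the privacy extractor's representations; how FedPDM actually formulates and weights these terms is specified in the paper itself.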

Keywords: federated learning, diffusion model, privacy protection, split learning

Suggested Citation

Guo, Wei and Zhuang, Fuzhen and Tong, Yiqi and Zhang, Xiao and Hu, Zhaojun and Zhao, Jiejie and Dong, Jin, FedPDM: Representation Enhanced Federated Learning with Privacy Preserving Diffusion Models. Available at SSRN: https://ssrn.com/abstract=5267340 or http://dx.doi.org/10.2139/ssrn.5267340

Wei Guo

affiliation not provided to SSRN ( email )

No Address Available

Fuzhen Zhuang

affiliation not provided to SSRN ( email )

No Address Available

Yiqi Tong (Contact Author)

affiliation not provided to SSRN ( email )

No Address Available

Xiao Zhang

Shandong University ( email )

Zhaojun Hu

Renmin University of China ( email )

Room B906
Xianjin Building
Beijing, 100872
China

Jiejie Zhao

Zhongguancun Laboratory ( email )

Building 1, Courtyard 2, Cuihu North Ring Road
Beijing, 100876
China

Jin Dong

Beijing Academy of Blockchain and Edge Computing ( email )

Beijing
China
