Received: 2024-12-17    Accepted: 2025-02-24    Last revised: 2025-02-19
Optimisation strategies for image classification models based on federated learning
XU Meijing, LI Luqun, LI Shuang
College of Information, Mechanical and Electrical Engineering, Shanghai Normal University
Abstract:
Taking image classification as its research setting, this paper proposes an improved LeNet-5 model combined with Federated Transfer Learning (FTL) to address both the need for data privacy protection and the performance limitations of the original LeNet-5 on complex image classification tasks. Unlike previous centralised-learning studies, the model is developed under a federated learning framework by comparing three weight initialisation strategies (Xavier, He, and transfer learning) and showing that the transfer-learning strategy performs best in improving model performance while protecting data privacy. The resulting improved LeNet-5 model based on federated transfer learning not only significantly outperforms the original model in classification accuracy and convergence speed, but also effectively protects data privacy, providing a valuable reference for future research on weight initialisation under privacy-preserving federated learning.
Key words:  federated learning  LeNet-5  transfer learning  privacy preservation  weight initialisation
DOI:
CLC number:
Fund program: Young Scientists Fund of the National Natural Science Foundation of China (62302307)
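To make the comparison in the abstract concrete, the sketch below shows how the three weight initialisation strategies (Xavier, He, and transfer learning) could be applied to a LeNet-5 client model before federated training. This is a minimal illustration assuming a PyTorch implementation; the LeNet5 class, the init_weights helper, and the pretrained-weights file name are hypothetical and are not taken from the paper.

```python
# Minimal sketch (not the paper's code): three weight-initialisation strategies
# for a LeNet-5 model under federated learning, assuming a PyTorch implementation.
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    """Classic LeNet-5 layout for 32x32 single-channel inputs (assumed here)."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5), nn.ReLU(), nn.AvgPool2d(2),
            nn.Conv2d(6, 16, kernel_size=5), nn.ReLU(), nn.AvgPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120), nn.ReLU(),
            nn.Linear(120, 84), nn.ReLU(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

def init_weights(model, strategy, pretrained_path=None):
    """Apply one of the three initialisation strategies compared in the paper."""
    if strategy == "transfer" and pretrained_path is not None:
        # Transfer-learning initialisation: start from weights pre-trained on a
        # related dataset instead of random values (file name is hypothetical).
        model.load_state_dict(torch.load(pretrained_path))
        return model
    for m in model.modules():
        if isinstance(m, (nn.Conv2d, nn.Linear)):
            if strategy == "xavier":
                nn.init.xavier_uniform_(m.weight)   # Xavier (Glorot) initialisation
            elif strategy == "he":
                nn.init.kaiming_normal_(m.weight)   # He initialisation
            nn.init.zeros_(m.bias)
    return model

# Example: build a Xavier-initialised global model; with strategy="transfer" a path
# to pre-trained weights would be passed instead. In a federated setting this global
# model would then be broadcast to clients for local training and aggregation.
global_model = init_weights(LeNet5(), "xavier")
```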