Big Data Research ›› 2023, Vol. 9 ›› Issue (2): 122-146. doi: 10.11959/j.issn.2096-0271.2022051
Federated meta learning: a review

Chuanyao ZHANG1,2, Shijing SI1, Jianzong WANG1, Jing XIAO1

Online: 2023-03-15
Published: 2023-03-01

About the author: Chuanyao ZHANG (1998- ), male, is a master's student at the University of Science and Technology of China and an algorithm engineer at Ping An Technology (Shenzhen) Co., Ltd. His main research interests are meta-learning and federated learning.
Abstract: With the proliferation of mobile devices, massive amounts of data are generated continuously. As data privacy policies grow more detailed, the flow and use of data are strictly regulated. Federated learning can break down data barriers and jointly exploit the data of different clients for modeling. Because users' habits differ, the data on different clients vary greatly, and addressing the statistical challenge posed by this data imbalance is an important topic in federated learning research. Exploiting the fast-adaptation ability of meta-learning to train a personalized model for each data node has become an important way to tackle data imbalance in federated learning. Starting from the background of federated learning, this paper systematically introduces its problem definition, taxonomy, and main challenges: privacy protection, data heterogeneity, and limited communication. It then reviews federated meta-learning work on data heterogeneity, communication constraints, and robustness against malicious attacks, and concludes with a summary and outlook.
Chuanyao ZHANG, Shijing SI, Jianzong WANG, Jing XIAO. Federated meta learning: a review[J]. Big Data Research, 2023, 9(2): 122-146.
Table 2
Taxonomy of federated learning algorithms

| Category | Algorithm | Characteristics |
| --- | --- | --- |
| Server-side aggregation optimization | FedSGD | Each round runs one step of gradient descent over all client data; convergence is slow |
| | FedAvg | Aggregates client parameters weighted by each client's share of the total data; clients run multiple local update steps to speed up convergence |
| | FedAvgM | Mitigates the impact of data heterogeneity on federated averaging by introducing momentum updates |
| | FedAdagrad, FedYogi, FedAdam | Federated versions of the adaptive optimizers used in non-federated settings (AdaGrad, Adam, and Yogi); the adaptive optimizers markedly speed up convergence under data heterogeneity |
| | FedPer | Improves model fairness via personalized models composed of shared base layers plus personalization layers |
| | FedMA | Constructs the shared global model in a layer-wise manner |
| Client-side optimization | FedProx | Regularizes each client's local loss with a quadratic penalty around the previous weights, reducing the impact of data heterogeneity |
| | Fed-EMD | Reduces the impact of data imbalance by creating a data subset shared across all data nodes; incurs high communication cost and weakens privacy |
| | APFL | Derives generalization bounds for the global and local models to find the optimal mixing parameter |
| | SCAFFOLD | Uses control variates to correct "client drift" in local updates |
| | Ditto | Uses the Euclidean distance between the local and global models as a regularizer, encouraging personalized models toward the globally optimal model |
| | MOON | Uses the similarity between model representations to correct each client's local training |
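The FedAvg row above can be made concrete with a short sketch. This toy implementation (the function name and data layout are our own, not from the paper) aggregates per-client parameter vectors with weight n_k/n, each client's share of the total data:

```python
# Minimal sketch of the FedAvg aggregation rule described in Table 2:
# the server weights each client's parameters by n_k / n (its share of
# the total data) and sums them. Names are illustrative, not from the paper.

def fedavg_aggregate(client_params, client_sizes):
    """Aggregate per-client parameter vectors with data-size weights."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    global_params = [0.0] * dim
    for params, n_k in zip(client_params, client_sizes):
        w = n_k / total  # weight = this client's share of all data
        for i, p in enumerate(params):
            global_params[i] += w * p
    return global_params

# Example: two clients with unequal data volumes.
clients = [[1.0, 2.0], [3.0, 4.0]]
sizes = [1, 3]  # the second client holds 3x the data
print(fedavg_aggregate(clients, sizes))  # [2.5, 3.5]
```

FedSGD corresponds to running this aggregation after a single local gradient step; FedAvg's speedup comes from letting each client take several local steps before its parameters enter this weighted sum.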
Table 3
Taxonomy of supervised meta-learning

| Category | Methods | References |
| --- | --- | --- |
| Optimizer-based | Meta-Learner LSTM, RL2, LTL | [42-44] |
| Memory-based | MANN, SNAIL | [45-46] |
| Base generalization model-based | MAML, iMAML, ES-MAML, Reptile, LEO | [5, 47-48, 50-51] |
| Metric-learning-based | Meta-critic Network, Matching Network, Prototypical Network, Siamese Network, Relation Network | [52-56] |
| Data-augmentation-based | Sill-Net, MetaAugment, MetaGAN, MEDA | [61, 63-65] |
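The MAML entry in Table 3 optimizes a shared initialization so that one inner-loop gradient step adapts well to each task. A minimal sketch on a toy 1-D task family with losses (theta - c_t)^2 and analytic gradients (all names, constants, and the task family are illustrative, not from the paper):

```python
# Toy MAML: each task t has loss (theta - c_t)^2. The inner loop takes one
# gradient step per task; the outer loop updates the shared initialization
# theta so that one adaptation step works well on average across tasks.

def maml_step(theta, task_centers, alpha=0.1, beta=0.05):
    """One meta-update of the shared initialization theta."""
    meta_grad = 0.0
    for c in task_centers:
        adapted = theta - alpha * 2.0 * (theta - c)      # inner-loop step
        # d/d theta of (adapted - c)^2, via the chain rule (full MAML:
        # the gradient flows back through the inner update).
        meta_grad += 2.0 * (adapted - c) * (1.0 - 2.0 * alpha)
    return theta - beta * meta_grad / len(task_centers)

theta = 0.0
for _ in range(200):
    theta = maml_step(theta, task_centers=[-1.0, 3.0])
# The meta-initialization converges to the task mean, from which one
# inner step adapts quickly to either task.
print(round(theta, 3))  # 1.0
```

First-order variants such as Reptile in the same table row drop the chain-rule factor `(1 - 2 * alpha)` and instead move the initialization toward the adapted parameters directly.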
Table 4
Taxonomy of federated meta-learning algorithms

| Category | Method | Characteristics |
| --- | --- | --- |
| Data heterogeneity | FedMeta | Replaces the shared global model of federated learning with a shared meta-learner, sharing a parameterized algorithm in a more flexible way |
| | FedAvg-Reptile | Inserts a meta-learning fine-tuning stage after the federated averaging stage to provide a reliable initialization model |
| | Per-FedAvg | Focuses on the convergence analysis of MAML in the federated setting and provides a provably convergent method for the non-convex case |
| | q-MAML | Minimizes an aggregated weighted loss so that devices with higher loss values get higher relative weight in the objective, making model performance fairer across devices |
| | ARUBA | Combines online convex optimization with sequence-prediction algorithms, treating meta-learning as online learning over a sequence of losses |
| | PFL | Dynamically modifies the device loss functions in each training round to eliminate the meta-model's bias toward different users |
| | FedFomo | Learns n server meta-models instead of a single average model and sends different meta-models to different client groups |
| | FedRecon | Uses meta-learning to train global parameters from which local parameters can be rapidly reconstructed on each client |
| Resource constraints | NUFM | Uses a non-uniform device-selection scheme (NUFM) to speed up model convergence |
| | ADMM-FedMeta | Decomposes the original problem into many subproblems that can be solved in parallel across edge nodes and platforms; linear approximation and Hessian estimation reduce the computational complexity to O(n) |
| | EEFML | Uses projected stochastic gradient ascent (P-SGA) to solve for the meta-model in reverse, greatly reducing computation cost and improving communication efficiency |
| | FedMeta w/UGA | Proposes an unbiased gradient aggregation algorithm (UGA) and introduces a meta-updating step with a controllable meta-training set to push the optimization objective toward the target distribution |
| Privacy protection | RobustFedML | Achieves algorithmic robustness by constructing adversarial data |
| | FL-MN | The classification mechanism of matching networks effectively lowers the success rate of backdoor attacks, but also reduces accuracy on some tasks |
| | Meta-FL | Moves the defense checkpoint from the client level to the aggregation level, effectively mitigating backdoor attacks |
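The personalization pattern shared by methods like FedMeta and Per-FedAvg in Table 4 can be sketched in a few lines: the server distributes one meta-initialization, each client adapts it with a few local gradient steps on its own data, and the server averages the adapted parameters back. Toy 1-D client losses (theta - c_k)^2 stand in for real objectives; all names are hypothetical:

```python
# Sketch of per-client personalization from a shared meta-initialization.
# Each client fine-tunes the server's theta locally, keeps its personalized
# model, and the server aggregates the adapted parameters for the next round.

def local_adapt(theta, c_k, alpha=0.1, steps=3):
    """Client-side fine-tuning from the shared initialization."""
    for _ in range(steps):
        theta = theta - alpha * 2.0 * (theta - c_k)  # local gradient step
    return theta

def federated_round(theta, client_centers):
    """One communication round: adapt on every client, then average."""
    adapted = [local_adapt(theta, c) for c in client_centers]
    return sum(adapted) / len(adapted), adapted

theta = 0.0
theta, personalized = federated_round(theta, [2.0, 4.0])
# Each client keeps its own entry of `personalized`; the server keeps
# the averaged theta as the meta-initialization for the next round.
```

This is exactly where the heterogeneity methods in the table differ: FedFomo replaces the single average with multiple server meta-models, while q-MAML reweights the average by per-device loss instead of data size.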
References

[1] 王健宗, 孔令炜, 黄章成, 等. 联邦学习算法综述[J]. 大数据, 2020, 6(6): 64-82.
WANG J Z, KONG L W, HUANG Z C, et al. Research review of federated learning algorithms[J]. Big Data Research, 2020, 6(6): 64-82.
[2] BERTINETTO L, HENRIQUES J F, TORR P H S, et al. Meta-learning with differentiable closed-form solvers[J]. arXiv preprint, 2018, arXiv:1805.08136.
[3] VILALTA R, DRISSI Y. A perspective view and survey of meta-learning[J]. Artificial Intelligence Review, 2002, 18(2): 77-95.
[4] MICHALSKI R, CARBONELL J, MITCHELL T. Machine learning: an artificial intelligence approach[M]. Heidelberg: Springer, 2013.
[5] FINN C, ABBEEL P, LEVINE S. Model-agnostic meta-learning for fast adaptation of deep networks[C]// Proceedings of the 34th International Conference on Machine Learning. New York: ACM Press, 2017: 1126-1135.
[6] YANG Q, LIU Y, CHENG Y, et al. Federated learning[J]. Synthesis Lectures on Artificial Intelligence and Machine Learning, 2019, 13(3): 1-207.
[7] YANG Q, LIU Y, CHEN T J, et al. Federated machine learning: concept and applications[J]. ACM Transactions on Intelligent Systems and Technology, 2019, 10(2): 1-19.
[8] ALEDHARI M, RAZZAK R, PARIZI R M, et al. Federated learning: a survey on enabling technologies, protocols, and applications[J]. IEEE Access, 2020, 8: 140699-140725.
[9] 王健宗, 孔令炜, 黄章成, 等. 联邦学习隐私保护研究进展[J]. 大数据, 2021, 7(3): 130-149.
WANG J Z, KONG L W, HUANG Z C, et al. Research advances on privacy protection of federated learning[J]. Big Data Research, 2021, 7(3): 130-149.
[10] WEI K, LI J, DING M, et al. Federated learning with differential privacy: algorithms and performance analysis[J]. IEEE Transactions on Information Forensics and Security, 2020, 15: 3454-3469.
[11] DWORK C. Differential privacy: a survey of results[C]// Proceedings of International Conference on Theory and Applications of Models of Computation. Heidelberg: Springer, 2008: 1-19.
[12] DWORK C, ROTH A. The algorithmic foundations of differential privacy[J]. Foundations and Trends® in Theoretical Computer Science, 2013, 9(3/4): 211-407.
[13] WEI K, LI J, DING M, et al. Federated learning with differential privacy: algorithms and performance analysis[J]. IEEE Transactions on Information Forensics and Security, 2020, 15: 3454-3469.
[14] SATTLER F, WIEDEMANN S, MÜLLER K R, et al. Robust and communication-efficient federated learning from non-i.i.d. data[J]. IEEE Transactions on Neural Networks and Learning Systems, 2020, 31(9): 3400-3413.
[15] ZHAO Y, LI M, LAI L Z, et al. Federated learning with non-IID data[J]. arXiv preprint, 2018, arXiv:1806.00582.
[16] BRIGGS C, FAN Z, ANDRAS P. Federated learning with hierarchical clustering of local updates to improve training on non-IID data[C]// Proceedings of 2020 International Joint Conference on Neural Networks. Piscataway: IEEE Press, 2020: 1-9.
[17] MCMAHAN H B, MOORE E, RAMAGE D, et al. Communication-efficient learning of deep networks from decentralized data[J]. arXiv preprint, 2016, arXiv:1602.05629.
[18] ARIVAZHAGAN M G, AGGARWAL V, SINGH A K, et al. Federated learning with personalization layers[J]. arXiv preprint, 2019, arXiv:1912.00818.
[19] KAIROUZ P, MCMAHAN H B, AVENT B, et al. Advances and open problems in federated learning[J]. Foundations and Trends® in Machine Learning, 2021, 14(1-2): 1-210.
[20] KONEČNÝ J, MCMAHAN H B, YU F X, et al. Federated learning: strategies for improving communication efficiency[J]. arXiv preprint, 2016, arXiv:1610.05492.
[21] HSU T M H, QI H, BROWN M. Measuring the effects of non-identical data distribution for federated visual classification[J]. arXiv preprint, 2019, arXiv:1909.06335.
[22] REDDI S, CHARLES Z, ZAHEER M, et al. Adaptive federated optimization[J]. arXiv preprint, 2020, arXiv:2003.00295.
[23] DUCHI J C, HAZAN E, SINGER Y. Adaptive subgradient methods for online learning and stochastic optimization[J]. Journal of Machine Learning Research, 2011, 12: 2121-2159.
[24] KINGMA D P, BA J. Adam: a method for stochastic optimization[J]. arXiv preprint, 2014, arXiv:1412.6980.
[25] ZAHEER M, REDDI S J, SACHAN D, et al. Adaptive methods for nonconvex optimization[C]// Proceedings of the 32nd International Conference on Neural Information Processing Systems. New York: ACM Press, 2018: 9815-9825.
[26] WANG H Y, YUROCHKIN M, SUN Y K, et al. Federated learning with matched averaging[J]. arXiv preprint, 2020, arXiv:2002.06440.
[27] ARIVAZHAGAN M G, AGGARWAL V, SINGH A K, et al. Federated learning with personalization layers[J]. arXiv preprint, 2019, arXiv:1912.00818.
[28] LI T, SAHU A K, ZAHEER M, et al. Federated optimization in heterogeneous networks[J]. Proceedings of Machine Learning and Systems, 2020, 2: 429-450.
[29] KONEČNÝ J, MCMAHAN B, RAMAGE D. Federated optimization: distributed optimization beyond the datacenter[J]. arXiv preprint, 2015, arXiv:1511.03575.
[30] DENG Y Y, KAMANI M M, MAHDAVI M. Adaptive personalized federated learning[J]. arXiv preprint, 2020, arXiv:2003.13461.
[31] KARIMIREDDY S P, KALE S, MOHRI M, et al. SCAFFOLD: stochastic controlled averaging for federated learning[J]. arXiv preprint, 2019, arXiv:1910.06378.
[32] LI T, HU S Y, BEIRAMI A, et al. Ditto: fair and robust federated learning through personalization[J]. arXiv preprint, 2020, arXiv:2012.04221.
[33] LI Q B, HE B S, SONG D. Model-contrastive federated learning[C]// Proceedings of 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE Press, 2021: 10708-10717.
[34] KHODAK M, BALCAN M F, TALWALKAR A. Adaptive gradient-based meta-learning methods[J]. arXiv preprint, 2019, arXiv:1906.02717.
[35] VANSCHOREN J. Meta-learning: a survey[J]. arXiv preprint, 2018, arXiv:1810.03548.
[36] VILALTA R, DRISSI Y. A perspective view and survey of meta-learning[J]. Artificial Intelligence Review, 2002, 18(2): 77-95.
[37] 李凡长, 刘洋, 吴鹏翔, 等. 元学习研究综述[J]. 计算机学报, 2021, 44(2): 422-446.
LI F Z, LIU Y, WU P X, et al. A survey on recent advances in meta-learning[J]. Chinese Journal of Computers, 2021, 44(2): 422-446.
[38] HSU K, LEVINE S, FINN C. Unsupervised learning via meta-learning[J]. arXiv preprint, 2018, arXiv:1810.02334.
[39] KHODADADEH S, BOLONI L, SHAH M. Unsupervised meta-learning for few-shot image classification[J]. Advances in Neural Information Processing Systems, 2019, 32.
[40] SNELL J, SWERSKY K, ZEMEL R. Prototypical networks for few-shot learning[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. New York: ACM Press, 2017: 4080-4090.
[41] RAVI S, LAROCHELLE H. Optimization as a model for few-shot learning[C]// Proceedings of International Conference on Learning Representations, 2016.
[42] DUAN Y, SCHULMAN J, CHEN X, et al. RL2: fast reinforcement learning via slow reinforcement learning[J]. arXiv preprint, 2016, arXiv:1611.02779.
[43] ANDRYCHOWICZ M, DENIL M, COLMENAREJO S G, et al. Learning to learn by gradient descent by gradient descent[C]// Proceedings of the 30th International Conference on Neural Information Processing Systems. New York: ACM Press, 2016: 3988-3996.
[44] SANTORO A, BARTUNOV S, BOTVINICK M, et al. Meta-learning with memory-augmented neural networks[C]// Proceedings of the 33rd International Conference on Machine Learning. New York: ACM Press, 2016: 1842-1850.
[45] MISHRA N, ROHANINEJAD M, CHEN X, et al. A simple neural attentive meta-learner[J]. arXiv preprint, 2017, arXiv:1707.03141.
[46] RAJESWARAN A, FINN C, KAKADE S, et al. Meta-learning with implicit gradients[J]. arXiv preprint, 2019, arXiv:1909.04630.
[47] SONG X Y, GAO W B, YANG Y X, et al. ES-MAML: simple Hessian-free meta learning[J]. arXiv preprint, 2019, arXiv:1910.01215.
[48] SALIMANS T, HO J, CHEN X, et al. Evolution strategies as a scalable alternative to reinforcement learning[J]. arXiv preprint, 2017, arXiv:1703.03864.
[49] NICHOL A, ACHIAM J, SCHULMAN J. On first-order meta-learning algorithms[J]. arXiv preprint, 2018, arXiv:1803.02999.
[50] RUSU A A, RAO D, SYGNOWSKI J, et al. Meta-learning with latent embedding optimization[J]. arXiv preprint, 2018, arXiv:1807.05960.
[51] SUNG F, ZHANG L, XIANG T, et al. Learning to learn: meta-critic networks for sample efficient learning[J]. arXiv preprint, 2017, arXiv:1706.09529.
[52] VINYALS O, BLUNDELL C, LILLICRAP T, et al. Matching networks for one shot learning[C]// Proceedings of the 30th International Conference on Neural Information Processing Systems. New York: ACM Press, 2016: 3637-3645.
[53] SNELL J, SWERSKY K, ZEMEL R. Prototypical networks for few-shot learning[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. New York: ACM Press, 2017: 4080-4090.
[54] KOCH G, ZEMEL R, SALAKHUTDINOV R. Siamese neural networks for one-shot image recognition[C]// Proceedings of ICML Deep Learning Workshop. [S.l.: s.n.], 2015.
[55] SANTORO A, RAPOSO D, BARRETT D G T, et al. A simple neural network module for relational reasoning[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. New York: ACM Press, 2017: 4974-4983.
[56] CHAWLA N V, BOWYER K W, HALL L O, et al. SMOTE: synthetic minority over-sampling technique[J]. Journal of Artificial Intelligence Research, 2002, 16: 321-357.
[57] INOUE H. Data augmentation by pairing samples for images classification[J]. arXiv preprint, 2018, arXiv:1801.02929.
[58] GOODFELLOW I, POUGET-ABADIE J, MIRZA M, et al. Generative adversarial networks[J]. Communications of the ACM, 2020, 63(11): 139-144.
[59] CUBUK E D, ZOPH B, MANE D, et al. AutoAugment: learning augmentation policies from data[J]. arXiv preprint, 2018, arXiv:1805.09501.
[60] ZHANG H P, CAO Z, YAN Z A, et al. Sill-Net: feature augmentation with separated illumination representation[J]. arXiv preprint, 2021, arXiv:2102.03539.
[61] RAJENDRAN J, IRPAN A, JANG E. Meta-learning requires meta-augmentation[C]// Proceedings of the 34th International Conference on Neural Information Processing Systems. New York: ACM Press, 2020: 5705-5715.
[62] ZHOU F W, LI J W, XIE C L, et al. MetaAugment: sample-aware data augmentation policy learning[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35(12): 11097-11105.
[63] ZHANG R X, CHE T, GHAHRAMANI Z, et al. MetaGAN: an adversarial approach to few-shot learning[C]// Proceedings of the 32nd International Conference on Neural Information Processing Systems. New York: ACM Press, 2018: 2371-2380.
[64] SUN P, OUYANG Y, ZHANG W, et al. MEDA: meta-learning with data augmentation for few-shot text classification[C]// Proceedings of 2021 International Joint Conference on Artificial Intelligence. [S.l.: s.n.], 2021: 3929-3935.
[65] CHEN F, LUO M, DONG Z H, et al. Federated meta-learning with fast convergence and efficient communication[J]. arXiv preprint, 2018, arXiv:1802.07876.
[66] JIANG Y H, KONEČNÝ J, RUSH K, et al. Improving federated learning personalization via model agnostic meta learning[J]. arXiv preprint, 2019, arXiv:1909.12488.
[67] FALLAH A, MOKHTARI A, OZDAGLAR A. Personalized federated learning with theoretical guarantees: a model-agnostic meta-learning approach[J]. Advances in Neural Information Processing Systems, 2020, 33: 3557-3568.
[68] ZAFAR M B, VALERA I, GOMEZ RODRIGUEZ M, et al. Fairness beyond disparate treatment & disparate impact: learning classification without disparate mistreatment[C]// Proceedings of the 26th International Conference on World Wide Web. Switzerland: International World Wide Web Conferences Steering Committee, 2017: 1171-1180.
[69] LI T, SANJABI M, BEIRAMI A, et al. Fair resource allocation in federated learning[J]. arXiv preprint, 2019, arXiv:1905.10497.
[70] KHODAK M, BALCAN M F, TALWALKAR A. Adaptive gradient-based meta-learning methods[J]. arXiv preprint, 2019, arXiv:1906.02717.
[71] ZHANG M, SAPRA K, FIDLER S, et al. Personalized federated learning with first order model optimization[C]// Proceedings of International Conference on Learning Representations. [S.l.: s.n.], 2020.
[72] ACAR D A E, ZHAO Y, ZHU R, et al. Debiasing model updates for improving personalized federated training[C]// Proceedings of International Conference on Machine Learning. [S.l.: s.n.], 2021: 21-31.
[73] SINGHAL K, SIDAHMED H, GARRETT Z, et al. Federated reconstruction: partially local federated learning[J]. arXiv preprint, 2021, arXiv:2102.03448.
[74] YUE S, REN J, XIN J, et al. Efficient federated meta-learning over multi-access wireless networks[J]. IEEE Journal on Selected Areas in Communications, 2022, 40(5): 1556-1570.
[75] YUE S, REN J, XIN J, et al. Inexact-ADMM based federated meta-learning for fast and continual edge learning[C]// Proceedings of the 22nd International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing. New York: ACM Press, 2021: 91-100.
[76] ELGABLI A, ISSAID C B, BEDI A S, et al. Energy-efficient and federated meta-learning via projected stochastic gradient ascent[C]// Proceedings of 2021 IEEE Global Communications Conference. Piscataway: IEEE Press, 2022: 1-6.
[77] YAO X, HUANG T C, ZHANG R X, et al. Federated learning with unbiased gradient aggregation and controllable meta updating[J]. arXiv preprint, 2019, arXiv:1910.08234.
[78] EDMUNDS R, GOLMANT N, RAMASESH V, et al. Transferability of adversarial attacks in model-agnostic meta-learning[C]// Proceedings of 2017 Deep Learning and Security Workshop. [S.l.: s.n.], 2017.
[79] YIN C X, TANG J, XU Z Y, et al. Adversarial meta-learning[J]. arXiv preprint, 2018, arXiv:1806.03316.
[80] LIN S, YANG G, ZHANG J S. A collaborative learning framework via federated meta-learning[C]// Proceedings of 2020 IEEE 40th International Conference on Distributed Computing Systems. Piscataway: IEEE Press, 2021: 289-299.
[81] CHEN C L, GOLUBCHIK L, PAOLIERI M. Backdoor attacks on federated meta-learning[J]. arXiv preprint, 2020, arXiv:2006.07026.
[82] YIN D, CHEN Y D, RAMCHANDRAN K, et al. Byzantine-robust distributed learning: towards optimal statistical rates[J]. arXiv preprint, 2018, arXiv:1803.01498.
[83] ARAMOON O, CHEN P Y, QU G, et al. Meta federated learning[J]. arXiv preprint, 2021, arXiv:2102.05561.
[84] BONAWITZ K, IVANOV V, KREUTER B, et al. Practical secure aggregation for privacy-preserving machine learning[C]// Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security. New York: ACM Press, 2017: 1175-1191.
[85] ZHENG W B, YAN L, GOU C, et al. Federated meta-learning for fraudulent credit card detection[C]// Proceedings of the 29th International Joint Conference on Artificial Intelligence. New York: ACM Press, 2021: 4654-4660.
[86] LI X F, LIU S H, LI Z F, et al. FlowScope: spotting money laundering based on graphs[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2020, 34(4): 4731-4738.
[87] DAL POZZOLO A, BORACCHI G, CAELEN O, et al. Credit card fraud detection: a realistic modeling and a novel learning strategy[J]. IEEE Transactions on Neural Networks and Learning Systems, 2018, 29(8): 3784-3797.
[88] ZHAO H, JI F, LI Q, et al. Federated meta-learning enhanced acoustic radio cooperative framework for ocean of things[J]. IEEE Journal of Selected Topics in Signal Processing, 2022, 16(3): 474-486.
[89] LIN Y J, REN P J, CHEN Z M, et al. Meta matrix factorization for federated rating predictions[C]// Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM Press, 2020: 981-990.
[90] JALALIRAD A, SCAVUZZO M, CAPOTA C, et al. A simple and efficient federated recommender system[C]// Proceedings of the 6th IEEE/ACM International Conference on Big Data Computing, Applications and Technologies. New York: ACM Press, 2019: 53-58.
[91] BEEL J. Federated meta-learning: democratizing algorithm selection across disciplines and software libraries[J]. Science (AICS), 2018, 210: 219.
[92] 吴建汉, 司世景, 王健宗, 等. 联邦学习攻击与防御综述[J]. 大数据, 2022, 8(5): 12-32.
WU J H, SI S J, WANG J Z, et al. Threats and defenses of federated learning: a survey[J]. Big Data Research, 2022, 8(5): 12-32.