Shuffled Model of Differential Privacy in Federated Learning


Title1: Shuffled Model of Differential Privacy in Federated Learning

Title2: Shuffled Model of Federated Learning: Privacy, Accuracy and Communication Trade-Offs

Authors: A. M. Girgis, D. Data, S. N. Diggavi, P. Kairouz, A. T. Suresh
Conference: 24th International Conference on Artificial Intelligence and Statistics (AISTATS)
Abstract:We consider a distributed empirical risk minimization (ERM) optimization problem with communication efficiency and privacy requirements, motivated by the federated learning (FL) framework. Unique challenges to the traditional ERM problem in the context of FL include (i) need to provide privacy guarantees on clients' data, (ii) compress the communication between clients and the server, since clients might have low-bandwidth links, (iii) work with a dynamic client population at each round of communication between the server and the clients, as a small fraction of clients are sampled at each round. To address these challenges we develop (optimal) communication-efficient schemes for private mean estimation for several $\ell_p$ spaces, enabling efficient gradient aggregation for each iteration of the optimization solution of the ERM. We also provide lower and upper bounds for mean estimation with privacy and communication constraints for arbitrary $\ell_p$ spaces. To get the overall communication, privacy, and optimization performance operation point, we combine this with privacy amplification opportunities inherent to this setup. Our solution takes advantage of the inherent privacy amplification provided by client sampling and data sampling at each client (through Stochastic Gradient Descent) as well as the recently developed privacy framework using anonymization, which effectively presents to the server responses that are randomly shuffled with respect to the clients. Putting these together, we demonstrate that one can get the same privacy, optimization-performance operating point developed in recent methods that use full-precision communication, but at a much lower communication cost, i.e., effectively getting communication efficiency for "free".
Keywords: Privacy; Servers; Differential privacy; Optimization; Estimation; Data models; Collaborative work
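The shuffled model the abstract relies on can be illustrated with a minimal simulation: each client applies a local randomizer to its data, an intermediary shuffles the reports to break the link between clients and messages (which is what yields the privacy amplification), and the server debiases the aggregate. The sketch below is a hypothetical illustration using binary randomized response for mean estimation, not the paper's actual compressed mechanisms; the function names and the choice of randomizer are assumptions.

```python
import math
import random

def local_randomizer(bit, epsilon):
    """Binary randomized response: report the true bit with
    probability p = e^eps / (e^eps + 1), otherwise flip it."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p else 1 - bit

def shuffled_mean(bits, epsilon):
    """Estimate the mean of clients' bits in the shuffled model."""
    reports = [local_randomizer(b, epsilon) for b in bits]
    # The shuffler permutes reports so the server cannot tie a
    # report back to a client; this is the anonymization step that
    # amplifies the local privacy guarantee.
    random.shuffle(reports)
    # Server-side debiasing: E[report] = (1 - p) + bit * (2p - 1).
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    raw_mean = sum(reports) / len(reports)
    return (raw_mean - (1 - p)) / (2 * p - 1)

# Toy run: 20,000 clients, half holding 1, local epsilon = 5.
random.seed(0)
clients = [i % 2 for i in range(20000)]
estimate = shuffled_mean(clients, epsilon=5.0)
```

With a large client population the debiased estimate concentrates near the true mean (0.5 here), while each individual report remains locally randomized; the paper's contribution is doing this with compressed (few-bit) reports in general $\ell_p$ spaces without losing that accuracy.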

DOI:10.1109/JSAIT.2021.3056102
Original PDF: Shuffed Model of Differential Privacy in Federated Learning.pdf