Abstract
Analysis of users' data has become widespread, but user privacy remains a major concern, particularly when the data are collected from several sources or shared among multiple entities. One example of such distributed analysis is the aggregation of user statistics. Differential privacy has proven to be an effective tool for perturbing aggregate results; however, existing deployment techniques have several limitations: they mostly support centralized databases and are prone to collusion in distributed settings, they impose a trade-off between privacy and utility, or they are inefficient in terms of communication and computation costs. To address these issues, we present DstrDP (Distributed Differential Privacy), a protocol for private data aggregation whose goal is to generate differentially private aggregate results from distributed databases. In particular, DstrDP focuses on count queries and employs the Laplace perturbation mechanism. DstrDP generates Laplace noise in a way that preserves the optimal utility of users' data, relies on no trusted party, and resists collusion as long as the decryption key remains confidential. We describe our proposed approach, use a decision tree classifier as a case study to show that DstrDP protects the privacy of intermediate results, and confirm the efficiency of the protocol by evaluating its performance.
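
For context, the standard centralized Laplace mechanism referenced above answers a count query by adding noise drawn from Laplace(1/epsilon) to the true count, since a count has L1 sensitivity 1. The following Python sketch illustrates this baseline primitive only; it is not the distributed noise-generation procedure of DstrDP, which is described in the body of the paper.

    import numpy as np

    def laplace_count(true_count: int, epsilon: float) -> float:
        """Release an epsilon-differentially private count.

        A count query has L1 sensitivity 1 (adding or removing one
        user changes the result by at most 1), so Laplace noise with
        scale 1/epsilon suffices for epsilon-differential privacy.
        """
        noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
        return true_count + noise

    # Example: perturb a true count of 1000 with privacy budget epsilon = 0.5.
    print(laplace_count(1000, epsilon=0.5))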