Correlated Differential Privacy: Hiding Information in Non-IID Datasets
Dr. Gang Li
TULIP Lab, Deakin University, Australia
Abstract: Privacy preservation has attracted increasing interest over the past decades. Differential privacy offers a rigorous and provable privacy guarantee for data mining and data release, but it rests on the assumption that records are independent. In real applications, records in a dataset are rarely independent, and a differential privacy mechanism applied to a correlated dataset discloses more information than expected; recent research has drawn attention to this new privacy violation. A solid solution for correlated datasets is still lacking, and how to reduce the large amount of noise incurred by differential privacy in this setting remains to be explored. Our work proposes an effective correlated differential privacy solution by defining a correlated sensitivity and designing a correlated data releasing mechanism. By taking the correlation levels between records into account, the proposed correlated sensitivity can significantly decrease the noise. The correlated data releasing mechanism, CIM, is designed on an iterative basis to answer a large number of queries. Compared with the traditional method, the proposed correlated differential privacy solution enhances the privacy guarantee for a correlated dataset at a smaller cost in accuracy.
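To make the idea of correlated sensitivity concrete, here is a minimal Python sketch, not the authors' implementation. It assumes a correlation matrix delta whose entry delta[i][j] in [0, 1] gives the correlation degree between records i and j, and scales the Laplace noise to the largest correlation-weighted effect any single record has on the query answer. The function names (correlated_sensitivity, correlated_laplace) and the exact weighting are illustrative assumptions.

```python
import numpy as np

def correlated_sensitivity(query, data, delta):
    """Illustrative correlation-aware sensitivity (an assumption, not the
    authors' exact definition). delta[i][j] in [0, 1] is the assumed
    correlation degree between records i and j."""
    n = len(data)
    base = query(data)
    # How much the answer changes when each single record is removed.
    changes = np.array(
        [abs(base - query(np.delete(data, j, axis=0))) for j in range(n)]
    )
    # Record-level sensitivity: correlation-weighted sum of those changes;
    # the dataset-level sensitivity is the worst case over all records.
    return (np.abs(delta) @ changes).max()

def correlated_laplace(query, data, delta, epsilon, rng=None):
    """Answer `query` with Laplace noise calibrated to the correlated
    sensitivity instead of a group-size-multiplied global sensitivity."""
    rng = rng or np.random.default_rng()
    cs = correlated_sensitivity(query, data, delta)
    return query(data) + rng.laplace(scale=cs / epsilon)

# Example: a counting query over 100 binary records.
rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(100, 1))
delta = np.eye(100)  # identity = independent records
count = lambda D: float(D.sum())
print(correlated_laplace(count, data, delta, epsilon=1.0, rng=rng))
```

When delta is the identity matrix (independent records), the sketch reduces to the standard global sensitivity of the query; when records are only partially correlated, the weighted sums stay below the naive bound obtained by multiplying global sensitivity by the size of a correlated group, which is how considering correlation levels can decrease the noise.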
Bio: Dr. Gang Li is a senior lecturer in the School of Information Technology at Deakin University, Australia. His research interests include data mining, machine learning, and business intelligence. He received the PAKDD 2014 Best Student Paper Award, the ACM/IEEE ASONAM 2012 Best Paper Award, and the 2007 Nightingale Prize. He is currently an associate editor for Decision Support Systems (Elsevier), and has been a guest editor for the Chinese Journal of Computers, Enterprise Information Systems, Concurrency and Computation: Practice and Experience, and Future Generation Computer Systems, among others.