
A common thread among provable privacy frameworks, such as information-theoretic privacy and differential privacy, is the use of randomization to safeguard privacy. The definition of differential privacy assumes randomized functions, and the information-theoretic tools used so far have been based on random variables. However, many popular heuristic privacy-preserving methods, such as k-anonymity and l-diversity, are deterministic (i.e., deterministic mappings, such as suppression and generalization, applied to non-stochastic datasets). This is partly because randomized, or stochastic, privacy-preserving policies have been shown to cause problems, such as un-truthfulness. For instance, randomized privacy-preserving policies in financial auditing have been criticized for complicating fraud detection. The generation of unreasonable or unrealistic outputs by randomization can also cause undesirable financial outcomes (e.g., misleading investors or market operators by reporting noisy outputs that suggest a lack of liquidity in a bank). Randomized privacy-preserving policies have likewise encountered difficulties in the health and social sciences. Finally, undesirable properties of differentially-private additive noise, especially Laplace noise, can make it less appealing: optimal estimation in the presence of privacy-preserving Laplace noise is computationally expensive, and the probability of returning impossible reports (e.g., a negative median income) can be relatively high owing to the slow-decaying tails of the Laplace distribution. Beyond these drawbacks of randomized policies, the popularity of non-stochastic methods may also stem from the simplicity of implementing deterministic policies, which laymen can apply without a working knowledge of random variables and their generation. This motivates the development of non-stochastic privacy frameworks.
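As a rough illustration of the last point about Laplace noise (a generic Laplace mechanism, not a mechanism from the papers listed below), the sketch below releases a nonnegative statistic under differential privacy; the median-income value, sensitivity, and privacy budget are hypothetical numbers chosen only to show how often the slow-decaying Laplace tail pushes the released value below zero.

```python
import numpy as np

def laplace_release(true_value, sensitivity, epsilon, rng):
    """Standard Laplace mechanism: add zero-mean Laplace noise with
    scale = sensitivity / epsilon to the true statistic."""
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Hypothetical figures for illustration only.
true_median = 400.0   # nonnegative statistic (e.g., median income in some unit)
sensitivity = 100.0   # assumed global sensitivity of the query
epsilon = 0.1         # assumed privacy budget

rng = np.random.default_rng(0)
releases = np.array([laplace_release(true_median, sensitivity, epsilon, rng)
                     for _ in range(100_000)])

# Slow-decaying Laplace tail: P(release < 0) = 0.5 * exp(-true_median * epsilon / sensitivity)
analytic = 0.5 * np.exp(-true_median * epsilon / sensitivity)
print(f"empirical P(negative median income) ~ {np.mean(releases < 0):.3f}")
print(f"analytic  P(negative median income) = {analytic:.3f}")
```

With these assumed numbers the tail probability is about 0.34, so roughly a third of the noisy releases would report an impossible negative median income, matching the concern raised above.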
- F. Farokhi, “Development and Analysis of Deterministic Privacy-Preserving Policies Using Non-Stochastic Information Theory,” IEEE Transactions on Information Forensics & Security, vol. 14, no. 10, pp. 2567–2576, 2019.
- F. Farokhi, “Non-Stochastic Hypothesis Testing with Application to Privacy Against Hypothesis-Testing Adversary,” in Proceedings of the 58th IEEE Conference on Decision and Control (CDC), Dec 11-13, 2019, Nice, France.
- F. Farokhi, “Noiseless Privacy,” Submitted.
- N. Ding, F. Farokhi, “Developing Non-Stochastic Privacy-Preserving Policies Using Agglomerative Clustering,” Submitted.
Collaborators: Ni Ding
Funding: Platforms for Open Data (PfOD) by Department of the Prime Minister and Cabinet, The University of Melbourne