Use Our Tools

The SmartNoise System

SmartNoise is jointly developed by Microsoft and Harvard's Institute for Quantitative Social Science (IQSS) and School of Engineering and Applied Sciences (SEAS) as part of the Open Differential Privacy (OpenDP) initiative. The project aims to connect solutions from the research community with lessons learned from real-world deployments, making Differential Privacy broadly accessible. SmartNoise is designed as a collection of components that can be flexibly configured, enabling developers to choose the right combination for their environments.

The SmartNoise tools primarily focus on the "global model" of Differential Privacy, as opposed to the "local model." In the global model, a trusted data collector is presumed to have access to the unprotected data and wishes to protect public releases of aggregate information. The core library includes implementations of the most common mechanisms and statistics, along with utility functions such as filtering and imputation. It provides an interface for composing operations into an analysis graph that can be validated for its privacy properties. This validator is the critical piece of functionality: it ensures that the releases users produce carry formal privacy guarantees.
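To make the global model concrete, here is a minimal, self-contained sketch of its core idea: a trusted curator computes a true aggregate and perturbs it with Laplace noise calibrated to the query's sensitivity before release. This is a toy illustration using only the Python standard library, not the SmartNoise API; the function names are ours.

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def release_count(true_count, epsilon):
    """Globally differentially private release of a counting query.

    A counting query has sensitivity 1 (one person changes the count
    by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller values of `epsilon` mean stronger privacy and larger noise; the curator, not each respondent, adds the noise, which is what distinguishes the global model from the local one.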


To Learn More About SmartNoise, Please Visit:

Microsoft SmartNoise Differential Privacy Machine Learning Case Studies


SmartNoise Early Adoption Acceleration Program


SmartNoise Sample Repository

For developers and data scientists, we recommend the SmartNoise Samples repository, which provides example code and notebooks to:

  • Demonstrate the use of the system platform
  • Teach the properties of differential privacy
  • Highlight some of the nuances of the system implementation

Based on public California census data, the example histogram shows true values compared to differentially private (DP) values. The DP values retain high accuracy while protecting individual privacy. For details, please see the histogram notebook.
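The histogram release in the notebook can be sketched in a few lines: because each person falls into exactly one bin, the bins partition the data and each bin count has sensitivity 1, so independent Laplace noise per bin gives an epsilon-DP histogram under parallel composition. This is a toy standard-library sketch, not the SmartNoise implementation; the function name and half-open bin convention are illustrative.

```python
import math
import random

def dp_histogram(values, edges, epsilon):
    """Epsilon-DP histogram over half-open bins [edges[i], edges[i+1]).

    Disjoint bins mean one person affects one count by at most 1,
    so each count gets Laplace noise with scale 1/epsilon.
    """
    bins = list(zip(edges[:-1], edges[1:]))
    counts = [sum(1 for v in values if lo <= v < hi) for lo, hi in bins]
    scale = 1.0 / epsilon

    def lap():
        u = random.random() - 0.5
        return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

    return [c + lap() for c in counts]
```

For realistic epsilon values (e.g. 0.1 to 1), the noisy counts track the true counts closely whenever the bins are reasonably populated, which is why the DP histogram in Figure 1 looks so similar to the true one.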

These examples use the SmartNoise Core Python bindings and the SmartNoise SDK, which provide basic building blocks for working with sensitive data, with implementations based on vetted and mature differential privacy research.


Figure 1. Histogram of Education: actual values vs. differentially private values.

Source: OpenDP Development Team, “Histograms,” SmartNoise Samples: Differential Privacy Examples, Notebooks, and Documentation, 2020. https://github.com/opendp/smartnoise-samples/blob/master/analysis/histograms.ipynb (accessed Mar. 04, 2021).


For questions: