Launching the Differential Privacy Deployments Registry

At the Eyes-Off Data Summit and the 2025 OpenDP Community Meeting, we and collaborators announced the launch of the Differential Privacy Deployments Registry—a public-facing repository of real-world differential privacy deployments. The registry will give practitioners an opportunity to share details of their deployments and observe others’ choices, ideally spurring the formation of best practices for deploying differential privacy.

The U.S. National Institute of Standards and Technology has proposed hosting the registry. The proposal is accompanied by a white paper detailing a proposed governance structure, along with details about the registry itself. The white paper is open for public comment until December 5, 2025. Comments on any aspect of the proposal are encouraged, and we look forward to the community’s broad engagement with the registry.

Below, we describe some of the research we (Priyanka Nanayakkara, Elena Ghazi, Salil Vadhan) did to inform the design of the registry and evaluate what value it could offer the community. The following post is based on our recent arXiv preprint.


It’s often useful to analyze data about people. Census statistics help us decide how to allocate funds across school districts, while pageview counts help us study how information spreads online. Simultaneously, these data pose privacy risks: for example, they can inadvertently reveal where people live or their browsing habits.

In response, in 2006, researchers Dwork, McSherry, Nissim, and Smith introduced a theoretical standard of privacy for computing statistical releases. They called this standard differential privacy (DP). DP limits how much information is revealed about individuals during a computation while still allowing the analyst to learn overall patterns in the data.
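To make the idea concrete, here is a minimal sketch (our own illustration, not taken from any particular deployment) of the classic Laplace mechanism for an ε-DP count. A counting query has sensitivity 1, so Laplace noise with scale 1/ε suffices:

```python
import random

def dp_count(true_count, epsilon):
    """Release a count under epsilon-DP via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon yields an epsilon-DP release.
    """
    # The difference of two exponentials with rate epsilon is
    # Laplace-distributed with scale 1/epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise
```

Smaller values of ε mean more noise and stronger privacy; the released count is random, so repeated releases on the same data consume additional privacy budget.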

In the years since its introduction, DP has gained traction as a practical tool for protecting privacy when releasing sensitive datasets, publishing statistics, and training AI models. Organizations like Google, Apple, the U.S. Census Bureau, and the Wikimedia Foundation have all used DP to protect people’s data.

Each DP deployment requires making a series of implementation choices that affect the resulting privacy guarantees and the utility of the data product. These choices include setting privacy-loss parameters, choosing the privacy unit, and determining security measures for the underlying data. Ideally, there would be best practices practitioners could use to guide their decisions. However, it has been difficult for the practitioner community to develop best practices because individual practitioners have little insight into the choices others are making.
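Two textbook facts about pure ε-DP illustrate how these choices interact: privacy losses add up across releases on the same data (basic sequential composition), and an event-level guarantee weakens when one user contributes many records (group privacy). The function names below are ours, for illustration only:

```python
def composed_epsilon(per_query_epsilons):
    """Basic (sequential) composition for pure epsilon-DP: privacy
    losses add up across releases on the same data."""
    return sum(per_query_epsilons)

def user_level_epsilon(event_epsilon, max_events_per_user):
    """Group privacy for pure epsilon-DP: if each user contributes at
    most k records, an event-level epsilon implies a user-level
    guarantee of k * epsilon."""
    return max_events_per_user * event_epsilon
```

For instance, an event-level ε of 0.1 with up to 20 events per user yields a user-level guarantee no better than ε = 2, which is why the privacy unit matters as much as the parameters themselves.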

In 2019, researchers Dwork, Kohli, and Mulligan proposed a public-facing repository (“registry”) of DP deployments, where practitioners could make their implementation choices public and view others’ deployment details. In theory, a registry could help the community form best practices over time. In 2024, our collaborators at Oblivious (Berrios, Usynin, Fitzsimons) created an initial registry prototype in blog-post format with a basic set of attributes for each deployment.

We build on the effort they started by developing a hierarchical schema for any given DP deployment, designing a registry prototype as an interactive interface, and populating it with 21 real-world deployments according to our schema. Many of these contributions have been incorporated into the live registry announced in September and linked at the top of this post.

A screenshot of the differential privacy deployment card.

Differential privacy deployment card

Inspired by model cards, we develop a differential privacy deployment card to describe any given DP deployment. The card describes a mix of technical and sociotechnical details. For example, the card includes details such as the mechanisms used to achieve DP, the deployment model (e.g., central, local), and privacy-loss parameters, as well as rationale for why those specific implementation choices were made and the intended use of the data product. Details of the card’s contents are described further in our paper.
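As a rough sketch of what a card captures, one could model its core fields as a simple data structure. The field names below are hypothetical, chosen for illustration; the actual card schema is detailed in our paper:

```python
from dataclasses import dataclass

@dataclass
class DeploymentCard:
    # Hypothetical field names for illustration; the real card schema
    # is described in the paper and the live registry.
    data_curator: str                 # e.g., "Apple"
    data_product: str
    deployment_model: str             # e.g., "central" or "local"
    mechanisms: list[str]             # mechanisms used to achieve DP
    privacy_unit: str                 # e.g., "user" or "user-day"
    privacy_loss: dict[str, float]    # e.g., {"epsilon": 1.0}
    rationale: str = ""               # why these choices were made
    intended_use: str = ""            # intended use of the data product

# A made-up example entry (not a real deployment):
card = DeploymentCard(
    data_curator="ExampleOrg",
    data_product="2024 mobility statistics",
    deployment_model="central",
    mechanisms=["Laplace mechanism"],
    privacy_unit="user-day",
    privacy_loss={"epsilon": 1.0, "delta": 1e-9},
    rationale="Matched parameters of comparable published deployments.",
    intended_use="Public research on commuting patterns.",
)
```

Note that the card deliberately pairs technical fields (mechanisms, parameters) with sociotechnical ones (rationale, intended use), since both shape how a deployment should be evaluated.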

Registry prototype

We designed the registry prototype as an interactive interface for technically skilled practitioners—such as privacy engineers or technical leads—to find and analyze information about prior deployments.

The registry contains an interactive table where each row represents a DP deployment (B). The columns of the table describe high-level attributes about the deployment—such as the data curator (e.g., Apple) and privacy unit & privacy-loss parameters—which roughly correspond to sections in the deployment card. Columns are searchable, sortable, and filterable to support a user in finding deployments they are most interested in.

To learn more about a particular deployment, a user can click on its row in the table. Doing so pulls up its deployment card with more details (C). In the screenshot below, the user has selected deployments by Microsoft and the U.S. Census Bureau. They may refer to the guide (D) for background information on DP terms and concepts that appear in the registry.

Screenshot of the Differential Privacy Deployment Registry

To see overall trends in the registry, a user can interact with the exploratory visualizations panel (A). These visualizations depict the number of deployments across the selected feature (e.g., flavor/privacy measure, deployment model) and over time. A user can brush the visualization on the right to filter deployments shown on the other plot (according to the selected time frame).

Screenshot of DP registry visualization

Exploratory user study

Appetite for the registry has grown in the DP community since it was first proposed. But without a working registry, it has been difficult to foresee how one would be used in practice and what kinds of governance challenges might arise in making it a trusted resource.

Thus, we conducted an exploratory user study with DP practitioners to answer the following research questions:

  • RQ1: How does the registry challenge or reinforce practitioners’ existing beliefs about DP and prior DP deployments?
  • RQ2: How might practitioners use the registry to
    • (a) analyze and/or evaluate past deployments,
    • (b) inform future deployments, both in terms of making deployment decisions and establishing the suitability of DP for a specific use case, and
    • (c) identify, explore, and reflect on norms in the DP ecosystem?
  • RQ3: What challenges and opportunities do practitioners envision around the registry gaining adoption?

We recruited 16 participants with experience either making or evaluating a real-world DP deployment. They each completed the following tasks using the registry prototype, thinking aloud throughout.

Tasks

  1. Examine a known deployment. Participants were asked to select a deployment they were already familiar with and reflect on whether they learned anything new. They were also asked to reflect on whether their understanding of DP or how it is deployed changed after interacting with the registry.
  2. Evaluate deployments. Participants selected one deployment that stood out as particularly “good” and one that stood out as particularly “poor,” and explained their reasoning.
  3. Make a new deployment. Participants used the registry to determine the appropriateness of DP for a hypothetical use case and to consider how they might use the registry to make implementation choices for that use case.
  4. Explore emergent norms. Participants identified emergent norms among DP deployments in the registry and shared their reaction to these norms.

Findings

We analyzed participants’ responses using a thematic analysis approach. Below, we describe emergent themes which respond to our research questions.

RQ1: How does the registry challenge or reinforce practitioners’ existing beliefs about DP and prior DP deployments?

  • Participants’ high-level understanding of DP or how it is applied in practice did not always change after using the registry. However, the registry consistently supported them in discovering low-level details about deployments. This included both technical information about decisions like hyperparameter tuning and algorithms as well as sociotechnical information, such as the intended use of a data product.
  • Interacting with the registry inspired participants to ask specific questions, both technical and sociotechnical, about deployments that they may not have asked otherwise.

RQ2a: How might practitioners use the registry to analyze and/or evaluate past deployments?

  • Participants performed holistic evaluations, focusing on implementation choices alongside a host of societal factors. For instance, they asked questions about whether the deployment would help society and whether DP was necessary in the first place. Furthermore, they considered privacy-loss parameters in the context of the privacy unit and impacts on utility.
  • Deployments with higher transparency (i.e., more details available) were seen as better, suggesting an incentive for organizations to share more details about their deployments.

RQ2b: How might practitioners use the registry to inform future deployments, both in terms of making deployment decisions and establishing the suitability of DP for a specific use case?

  • For the most part, participants did not rely on the registry to determine the appropriateness of DP for new use cases. Instead, they relied heavily on their prior knowledge about when DP is well-suited.
  • Participants used “similar” deployments to inform new deployments, but defined similarity in different ways: dataset size, dataset domain/topic, etc.

RQ2c: How might practitioners use the registry to identify, explore, and reflect on norms in the DP ecosystem?

  • Participants used both the exploratory visualizations and interactive table to reflect on norms. They approached trends with curiosity, but also skepticism, noting the small sample size represented in the registry.
  • Participants felt that over time, the registry could help with the emergence of best practices. However, they also noted a potential risk of popular, but suboptimal, choices becoming standard without more intentional approaches to developing best practices.

RQ3: What challenges and opportunities do practitioners envision around the registry gaining adoption?

  • Opportunity: The registry can help inform future implementation choices. Along these lines, there was some desire for normative guidance around best practices.
  • Opportunity: Support broader communication about DP. Participants felt the registry would help them advocate for the use of DP within their organizations, particularly with legal teams. They also saw opportunities for the registry to help facilitate communication between data curators and data subjects, and with students learning DP.
  • Opportunity: Create standardization and community. Participants envisioned the registry as a “hub” for the community to form best practices.
  • Challenge: Effort and risk involved with adding new entries. In addition to the time required to add new entries to the registry, participants thought adding new entries could pose downstream regulatory risk or invite additional scrutiny.
  • Challenge: Moderation. Participants foresaw challenges around maintaining entries and making sure their details are correct. They proposed several ideas around moderation, including having a contact listed for each deployment. This person would be responsible for fielding questions and possible corrections.

What’s next?

To be successful in the long run, the registry needs to become an active, community-driven resource for the privacy community. Based on our user study, we suggest the following ideas to achieve this vision.

  • Develop human-in-the-loop approaches to partially automate the process of registering new deployments. Approaches that make use of an LLM agent to automatically generate drafts of entries for expert review could lower the effort required to add new entries to the registry.
  • Create a discussion board to enable back-and-forth discussion and more intentional emergence of best practices. Creating more opportunities for communication between practitioners on the registry could help them actively shape best practices, instead of popular choices automatically becoming standard.
  • Expand the registry to support policymakers, data users, and data subjects. The registry could become useful not only for practitioners, but also for others across the data ecosystem if it included interactive modes with tailored explanations and information for their needs.