
Are there changes to the aggregation model that could make it useful for training machine learning models? #332

Open
palenica opened this issue Feb 16, 2022 · 2 comments
Labels
possible-future-enhancement Feature request with no current decision on adoption

Comments

@palenica (Collaborator)

For example: multiple-pass querying, different privacy accounting schemes, or different batching schemes.

@alois-bissuel (Contributor)

I guess supporting Gaussian noise instead of Laplacian noise would improve the utility of the API when reporting over a few dimensions.
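A back-of-the-envelope sketch of why this helps (illustrative only, not part of the API; the sensitivity assumptions and the classic Gaussian-mechanism calibration are mine): for a d-bucket report where each record's contribution has L2 norm 1, the worst-case L1 norm grows like sqrt(d), so Laplace noise calibrated to L1 sensitivity grows with d while Gaussian noise calibrated to L2 sensitivity does not.

```python
import math

def laplace_std(epsilon: float, d: int) -> float:
    """Per-coordinate noise std for the Laplace mechanism on a d-bucket
    report, assuming each record's contribution has L2 norm 1 spread
    evenly over d buckets (so worst-case L1 sensitivity is sqrt(d))."""
    l1_sensitivity = math.sqrt(d)
    b = l1_sensitivity / epsilon      # Laplace scale parameter
    return b * math.sqrt(2)           # std of a Laplace(b) variable

def gaussian_std(epsilon: float, delta: float, d: int) -> float:
    """Per-coordinate noise std for the classic Gaussian mechanism;
    the L2 sensitivity stays 1 no matter how many buckets are used."""
    l2_sensitivity = 1.0
    return l2_sensitivity * math.sqrt(2 * math.log(1.25 / delta)) / epsilon

if __name__ == "__main__":
    eps, delta = 1.0, 1e-5
    for d in (1, 16, 64, 256):
        print(d, round(laplace_std(eps, d), 2),
              round(gaussian_std(eps, delta, d), 2))
```

Under these assumptions the Gaussian std is constant in d while the Laplace std grows like sqrt(d), so Gaussian noise pulls ahead once a report spans more than a handful of buckets (Laplace is still better for very low-dimensional reports).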

@csharrison (Collaborator)

> I guess supporting Gaussian noise instead of Laplacian noise would improve the utility of the API when reporting over a few dimensions.

Similarly, there's what we discussed before about moving from L1 constraints to L2 (or L0 + Linf) constraints in this issue:
#249

I think that, plus Gaussian noise, is the key to unlocking advanced DP composition tricks to improve the privacy/utility trade-off in the face of multiple queries.
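To make the multiple-queries point concrete, here is a small sketch (illustrative, not this API's accounting; the formula is the advanced composition bound from the DP literature, and the parameter values are arbitrary) comparing basic composition, where k epsilon-DP queries cost k * epsilon, against advanced composition, which pays only on the order of sqrt(k):

```python
import math

def basic_composition(epsilon: float, k: int) -> float:
    """Basic composition: running k epsilon-DP queries costs k * epsilon."""
    return k * epsilon

def advanced_composition(epsilon: float, k: int, delta_prime: float) -> float:
    """Advanced composition bound: k epsilon-DP queries are together
    (eps_total, k*delta + delta_prime)-DP with
    eps_total = sqrt(2k ln(1/delta')) * eps + k * eps * (e^eps - 1)."""
    return (math.sqrt(2 * k * math.log(1 / delta_prime)) * epsilon
            + k * epsilon * (math.exp(epsilon) - 1))

if __name__ == "__main__":
    eps, k, dprime = 0.1, 100, 1e-6
    print(basic_composition(eps, k))                      # 10.0
    print(round(advanced_composition(eps, k, dprime), 2)) # well below 10
```

With 100 queries at epsilon = 0.1 each, basic composition spends a total budget of 10 while the advanced bound stays around 6, and the gap widens as k grows; pairing this with Gaussian noise (which composes even more tightly under accounting schemes like zCDP) is what makes repeated querying plausible.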

@csharrison added the possible-future-enhancement label on Jun 26, 2023