
Privacy

  • November 13, 2023
  • HDSIComm

Improving Technical Communication with End Users about Differential Privacy

Differential privacy (DP) is widely regarded as a gold standard for privacy-preserving computation over users’ data. A key challenge with DP is that its mathematical sophistication makes its privacy guarantees difficult to communicate to users, leaving them uncertain about how and whether they are protected. Despite recent widespread deployment of DP, relatively little is known about what users think of differential privacy and how to effectively communicate the practical privacy guarantees it offers.
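For readers unfamiliar with how DP's guarantee is realized in practice, a minimal sketch (a generic illustration, not code from the talk) is the Laplace mechanism: a numeric query answer is released with noise whose scale is calibrated to the privacy parameter epsilon, so smaller epsilon means more noise and stronger privacy.

```python
import math
import random

def laplace_mechanism(true_count, epsilon, sensitivity=1.0):
    """Release a count with Laplace noise calibrated to epsilon.

    Smaller epsilon => larger noise scale => stronger privacy.
    `sensitivity` is how much one person's data can change the count.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverting its CDF.
    u = random.random() - 0.5
    noise = -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_count + noise
```

Averaged over many releases the noise cancels, which is why aggregate statistics stay useful even though any single release hides individual contributions.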

This talk will cover a series of recent and ongoing user studies aimed at measuring and improving communication with non-technical end users about differential privacy. The first set of studies explores users’ privacy expectations related to differential privacy and measures the efficacy of existing methods for communicating the privacy guarantees of DP systems. We find that users care about the kinds of information leaks against which differential privacy protects and are more willing to share their private information when the risk of these leaks is reduced. Additionally, we find that the ways in which differential privacy is described in the wild set users’ privacy expectations haphazardly, which can be misleading depending on the deployment. Motivated by these findings, the second set of user studies develops and evaluates prototype descriptions designed to help end users understand DP guarantees. These descriptions target two important technical details in DP deployments that are often poorly communicated to end users: the privacy parameter epsilon (which governs the level of privacy protection) and the distinction between the local and central models of DP (which governs who can access exact user data). Based on joint work with Gabriel Kaptchuk, Priyanka Nanayakkara, Elissa Redmiles, and Mary Anne Smart, including https://arxiv.org/abs/2110.06452 and https://arxiv.org/abs/2303.00738.
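The local/central distinction mentioned above can be made concrete with randomized response, the classic local-model mechanism (again a generic sketch under assumed parameter names, not material from the studies): each user perturbs their own bit before sharing it, so the analyst never sees exact data, yet an unbiased population estimate can still be recovered.

```python
import math
import random

def randomized_response(true_bit, epsilon):
    """Local-model DP: each user perturbs their own bit before sharing.

    Report the truth with probability e^eps / (e^eps + 1), else flip.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_bit if random.random() < p_truth else 1 - true_bit

def estimate_proportion(reports, epsilon):
    """Debias the aggregate of randomized responses.

    observed = (2p - 1) * true_proportion + (1 - p), so invert that.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)
```

In the central model, by contrast, a trusted curator sees exact data and adds noise only to the published result; the local model needs no trusted party but pays for it with noisier estimates, which is exactly the trade-off end users are rarely told about.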

  • March 1, 2023
  • Kaleigh O'Merry

Responsible AI: Privacy and Fairness in Decision and Learning Systems

Differential Privacy has become the go-to approach for protecting sensitive information in data releases and learning tasks that are used for critical decision processes. For example, census data is used to allocate funds and distribute benefits, while several corporations use machine learning systems for criminal assessments, hiring decisions, and more. While this privacy notion provides strong guarantees, we will show that it may also induce biases and fairness issues in downstream decision processes. These issues may adversely affect many individuals’ health, well-being, and sense of belonging, and are currently poorly understood.

In this talk, we delve into the intersection of privacy, fairness, and decision processes, with a focus on understanding and addressing these fairness issues. We first provide an overview of Differential Privacy and its applications in data release and learning tasks. Next, we examine the societal impacts of privacy through a fairness lens and present a framework to illustrate which aspects of the private algorithms and/or data may be responsible for exacerbating unfairness. We then show how to extend this framework to assess the disparate impacts arising in Machine Learning tasks. Finally, we propose a path to partially mitigate these fairness issues and discuss grand challenges that require further exploration.
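One intuition behind these disparate impacts can be sketched in a few lines (a toy simulation, not the framework from the talk): when a fixed budget is allocated proportionally to noisy counts, the same absolute noise is a far larger relative error for small groups, so privacy noise can systematically distort their share.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) by inverting its CDF."""
    u = random.random() - 0.5
    return -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))

def noisy_allocation(group_counts, budget, epsilon):
    """Allocate a fixed budget proportionally to DP-noised group counts.

    Identical noise hits small groups much harder in relative terms,
    one way DP can skew downstream allocation decisions.
    """
    noisy = [max(c + laplace_noise(1.0 / epsilon), 0.0) for c in group_counts]
    total = sum(noisy)
    return [budget * n / total for n in noisy]
```

Running this repeatedly for one large and one small group shows the small group's allocation fluctuating far more around its fair share, which is the kind of effect census-based funding formulas can inherit.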

  • March 1, 2023
  • Kaleigh O'Merry

Cryptographic Advances in Reasoning about Adversaries

A key challenge in cryptography is to ensure that a protocol resists all computationally feasible attacks, even when an adversary decides to follow a completely arbitrary and unpredictable strategy. This […]
