PEPR '19 Conference Program

Monday, August 12

7:45 am–8:45 am

Continental Breakfast

8:45 am–9:00 am

Opening Remarks

Program Co-Chairs: Lorrie Cranor, Carnegie Mellon University, and Lea Kissner, Humu

9:00 am–10:30 am

Finding and Fixing Problems

The Privacy Engineering Mindset: From Design to Launch

Monday, 9:00 am–9:22 am

Sha Sundaram, Snap

The privacy engineering mindset is a set of methodologies, tools, and patterns you can employ to create products that protect users and their data. Having done this at scale at companies like Google and Snap, I am eager to share my lessons learned and practices with you.

In this talk, we will delve into the nuances of privacy, security, risk, and impact to users, and identify a blueprint for an effective privacy program you can set up at your company. We will discuss instituting the right collaborative practices and incentives to work with partners in engineering, privacy & security, product, and legal teams to develop products that respect user privacy rights. Finally, we'll elaborate on the role a privacy engineer can play in shaping a product from design to launch.

Sha Sundaram, Snap

Sha Sundaram is a seasoned privacy and security engineer. She has brought her 10 years of experience in privacy engineering at Google and Symantec Research Labs to lead Snap’s Privacy Engineering team. As the first privacy engineer at Snap, Inc. starting in 2016, she built the privacy program for this hot startup from the ground up. Today, she is deep in the trenches of designing Snapchat app features with privacy in mind. She also leads the privacy engineering reviews for the majority of Snap releases and all of Snap's acquisitions.

Sha holds an MS in Computer Science from Stanford University. Her first major influence was John Mitchell at Stanford University, where she worked on the TRUST (Team for Research in Ubiquitous Secure Technology) team as well as on early efforts to bring privacy into theoretical computer science.

Our Friend from Privacy: Building and Harnessing Meaningful Working Relationships

Monday, 9:22 am–9:45 am

Amber Yust, Google

Privacy engineering doesn't work in a vacuum; we support a broader product, engineering, and business organization to help create respectful products. Based on years of practical experience partnering with teams at Google, this talk will focus on techniques for bridging the gap between privacy subject matter experts and the people they work with on a daily basis (including each other!) to create meaningful impact throughout an organization.

Amber Yust, Google

Amber Yust has helped build Google's privacy engineering organization from the ground up. Currently a staff privacy engineer and manager at Google, she and her team work to bring order to chaos in a world that runs on technology but is made of, by, and for human beings.

Addressing Privacy Risks of Targeted Advertising

Monday, 9:45 am–10:07 am

Stephen Weis, Aspen Institute Tech Policy Hub

A real-world case study of how a large advertising network responded to a vulnerability that leaked personally identifiable information.

Stephen Weis, Aspen Institute Tech Policy Hub

Steve Weis is a cryptographer & entrepreneur who focuses on securing people's data with applied cryptography. He is currently a Technology Policy Fellow at the Aspen Institute. Previously, Steve was a software engineer at Facebook working on data privacy.

User-centric Privacy: Designing Effective Privacy Protections That Meet Users' Needs

Monday, 10:07 am–10:30 am

Florian Schaub, University of Michigan

Privacy engineering aims to respect and protect users' privacy. User studies provide insights on users' privacy needs, concerns, and expectations, which are essential to understand what a system's actual privacy issues are from a user perspective. Drawing on the speaker's research on privacy notices and controls online, on smartphones, and in the context of smart speakers, this talk discusses how and why privacy controls are often misaligned with user needs, and how user studies can inform the design of user-centric privacy protections that more effectively meet users' needs as well as benefit companies.

Florian Schaub, University of Michigan

Florian Schaub is Assistant Professor of Information and of Electrical Engineering and Computer Science at the University of Michigan. Dr. Schaub’s research focuses on investigating and supporting people’s privacy and security behavior and decision making in complex socio-technological systems. His research interests span privacy, human-computer interaction, and emergent technologies, such as the Internet of Things. Dr. Schaub received his doctoral degree in Computer Science from the University of Ulm, Germany, and was a postdoctoral fellow in Carnegie Mellon University’s School of Computer Science.

10:30 am–11:00 am

Break with Refreshments

11:00 am–12:30 pm

Hard Bits of Product Privacy

Privacy as a Service: Building an End-to-End Consent Platform

Monday, 11:30 am–12:00 pm

Roche Janken, Uber

Confirming user consent in a consistent and transparent way is an important aspect of giving consumers visibility and control over data-sharing choices. Uber’s privacy engineering team built a consent service to make it easy for product teams to do the right thing when it comes to confirming user consent for data collection when needed. Because Uber is a global company with businesses across multiple verticals, primary design considerations included flexibility for unanticipated use cases, ease of development so product teams can add functionality, and reliability. With only a few lines of code, feature teams can use this service in features such as ebike onboarding and restaurant-partner support. This session will focus on how to design a consent service flexible enough that teams with unanticipated use cases can easily confirm consent.
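The "few lines of code" integration described above might look like the following minimal sketch. This is a hypothetical illustration, not Uber's actual API: the `ConsentService` class and its `record`/`check` methods are invented names, and the real service manages consent across many data stores and products.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a consent-service interface. All names and
# semantics here are illustrative assumptions, not Uber's actual API.
@dataclass
class ConsentService:
    # Maps (user_id, purpose) -> the user's recorded decision.
    _records: dict = field(default_factory=dict)

    def record(self, user_id: str, purpose: str, granted: bool) -> None:
        """Store the user's consent decision for a data-collection purpose."""
        self._records[(user_id, purpose)] = granted

    def check(self, user_id: str, purpose: str) -> bool:
        """Return True only if the user explicitly granted consent.
        Unknown (user, purpose) pairs default to no consent."""
        return self._records.get((user_id, purpose), False)

# A feature team's integration is then only a few lines:
consent = ConsentService()
consent.record("rider-42", "ebike-location-sharing", granted=True)
if consent.check("rider-42", "ebike-location-sharing"):
    pass  # proceed with data collection for this feature
```

Defaulting unknown pairs to "no consent" reflects the abstract's goal of making the easy path the privacy-respecting one: a feature team that forgets to collect consent simply gets no data.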

Roche Janken, Uber

Hello! I am a senior software engineer on the Privacy Team at Uber. I feel really lucky to have found this field because when I’m not quietly coding on projects that support user privacy, I get to work with really wonderful and interesting people across disciplines. What a joy to explore unknown territory in a creative and collaborative way.

Secure Messaging? More Like Secure Mess

Monday, 12:00 pm–12:30 pm

Gennie Gebhart and Erica Portnoy, Electronic Frontier Foundation

There is no such thing as a perfect or one-size-fits-all messaging app. But people who build and maintain secure messengers have to start somewhere. In this talk, we’ll share our experience in the secure messaging space, what that has taught us about different groups' secure messaging needs, and how to balance those needs.

Gennie Gebhart, Electronic Frontier Foundation

Gennie conducts and manages research and advocacy for the Electronic Frontier Foundation on consumer privacy, surveillance, and security issues. Prior to joining EFF, Gennie earned a Master of Library and Information Science from the University of Washington Information School, where she published on Internet censorship in Thailand and zero-rating in Ghana, and investigated mobile access and technology terms in Myanmar (Burma) and public Internet access in Laos. While at the UW, she also co-founded and led a successful initiative for a university Open Access policy.

Erica Portnoy, Electronic Frontier Foundation

Erica Portnoy develops the Let's Encrypt client Certbot, which makes it easy for people who run websites to turn on https, keeping their users private and secure against network-based attackers. She writes and speaks about encryption in practice, including what people need from secure messaging providers and what the next generation of encryption in the cloud might look like. Erica has also worked on EFF's net neutrality project, writing technical filings and opinion pieces and organizing technologists from the networking industry to speak up for technical accuracy in policy decisions.

12:30 pm–2:00 pm

Monday Luncheon

2:00 pm–3:30 pm

Privacy Infrastructure

Now You See It, Now You Don't: Uber's Data Deletion Service

Monday, 2:00 pm–2:30 pm

Yash Doshi and Harshal Shah, Uber

Deletion at scale is a complex and iterative process: as more and more systems and features are built, measuring the completeness of a deletion and auditing it become key to success.

Uber’s data deletion system is a single-source authority for deleting user accounts and related data. It’s managed across multiple data stores and integrates with Uber’s complex environment of microservices. It supports a variety of deletion triggers including user requests, account inactivity, and deletion/retention policies.

The system also integrates with insurance, legal, safety, and money systems to evaluate whether any regulatory or legal requirements could block the process, such as subpoenas, lawsuits, arrears, safety incidents, insurance claims, or fraudulent activity. This talk will discuss how to build a scalable and reliable system with full audit capabilities that allows you to delete a user from all the systems where the data may reside.
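The blocker-evaluation step described above can be sketched as follows. This is an illustrative sketch under stated assumptions, not Uber's implementation: the checks are hypothetical stand-ins for the insurance, legal, safety, and money systems the abstract mentions.

```python
from typing import Callable, List

# A blocker returns True if the named system requires deletion to be held
# (e.g. an open insurance claim or a legal hold). Hypothetical interface.
Blocker = Callable[[str], bool]

def can_delete(user_id: str, blockers: List[Blocker]) -> bool:
    """Deletion proceeds only when no integrated system reports a hold."""
    return not any(blocked(user_id) for blocked in blockers)

# Hypothetical checks standing in for the real legal/insurance/safety systems:
def has_open_insurance_claim(user_id: str) -> bool:
    return False  # illustrative: no open claims in this sketch

def has_outstanding_arrears(user_id: str) -> bool:
    return user_id in {"user-7"}  # e.g. an unpaid balance holds deletion

blockers = [has_open_insurance_claim, has_outstanding_arrears]
can_delete("user-1", blockers)  # no holds: safe to fan out deletion
can_delete("user-7", blockers)  # arrears block the request
```

Keeping each system behind a uniform check like this is one way to make the gate auditable: the orchestrator can log which blocker held a request and re-evaluate once the hold clears.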

Machine Learning at Scale with Differential Privacy in TensorFlow

Monday, 2:30 pm–3:00 pm

Nicolas Papernot, Google Brain

This talk will illustrate how learning with rigorous differential privacy guarantees is possible using TensorFlow Privacy, an open-source library that makes it easier not only for developers to train ML models with privacy in real-world systems, but also for researchers to advance the state-of-the-art in ML with strong privacy guarantees.
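As background for the talk above, the core idea TensorFlow Privacy builds on is DP-SGD: clip each example's gradient to bound its influence, then add Gaussian noise calibrated to that bound. The sketch below illustrates one step of that idea in plain Python; the function and parameter names are assumptions for illustration, not the library's API (TF Privacy wraps this logic into optimizer classes).

```python
import math
import random

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.1):
    """One illustrative DP-SGD step: clip, sum, noise, average.
    Parameter names are assumptions, not TensorFlow Privacy's API."""
    # 1. Clip each example's gradient to L2 norm <= clip_norm, bounding
    #    any single example's influence on the update.
    clipped = []
    for g in per_example_grads:
        norm = math.sqrt(sum(x * x for x in g))
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        clipped.append([x * scale for x in g])
    # 2. Sum the clipped gradients and add Gaussian noise whose scale is
    #    calibrated to the clipping bound.
    dim = len(clipped[0])
    summed = [sum(g[i] for g in clipped) for i in range(dim)]
    sigma = noise_multiplier * clip_norm
    noisy = [s + random.gauss(0.0, sigma) for s in summed]
    # 3. Average over the batch to get the noisy update direction.
    n = len(clipped)
    return [x / n for x in noisy]
```

The clipping bound is what makes the privacy analysis possible: because no example can move the sum by more than `clip_norm`, noise of a known scale yields a quantifiable differential privacy guarantee, accumulated across training steps by a privacy accountant.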

Nicolas Papernot, Google Brain

Nicolas Papernot is a research scientist at Google Brain working on the security and privacy of machine learning. He will join the University of Toronto and the Vector Institute as an assistant professor and Canada CIFAR AI Chair in fall 2019. He earned his Ph.D. in Computer Science and Engineering at the Pennsylvania State University, working with Prof. Patrick McDaniel and supported by a Google PhD Fellowship in Security and Privacy. Nicolas received a best paper award at ICLR 2017. He is also the co-author of CleverHans, an open-source library widely adopted in the technical community to benchmark machine learning in adversarial settings, and tf.Privacy, an open-source library for training differentially private models with TensorFlow. He serves on the program committees of several conferences including ACM CCS, IEEE S&P, and USENIX Security. In 2016, he received his M.S. in Computer Science and Engineering from the Pennsylvania State University and his M.S. in Engineering Sciences from the Ecole Centrale de Lyon.

Differentially Private Data Release under Partial Information

Monday, 3:00 pm–3:30 pm

David Zeber, Mozilla Corporation

Differential privacy (DP) is now a standard technique for releasing reports based on sensitive data. However, selecting and tuning a DP mechanism so as to obtain high utility of the privacy-protected data is often difficult without detailed knowledge of the characteristics of the sensitive dataset. We propose an applied methodology for guiding the implementation of DP data protections using only partial summary information about the private data. In this setting, candidate DP mechanisms can be evaluated across possible realizations of the sensitive dataset, the selection of which is feasibly constrained using the available partial information. We demonstrate our approach for the problem of reporting the DP-protected distribution of item frequencies from a dataset of user-item pairs.
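The reporting task the abstract ends with can be made concrete with a minimal Laplace-mechanism sketch. This is background illustration, not the talk's methodology: it assumes each user contributes at most one (user, item) pair, so each count has sensitivity 1, and the epsilon value and names are illustrative.

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sample from the Laplace(0, scale) distribution.
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_item_frequencies(user_item_pairs, epsilon=1.0):
    """Release item counts with Laplace noise. Assumption: each user
    contributes at most one pair, so per-count sensitivity is 1 and
    Laplace(1/epsilon) noise gives epsilon-differential privacy."""
    counts = {}
    for _user, item in user_item_pairs:
        counts[item] = counts.get(item, 0) + 1
    scale = 1.0 / epsilon
    return {item: c + laplace_noise(scale) for item, c in counts.items()}
```

Note one caveat this sketch glosses over: releasing only the items present in the data leaks which items occurred at all; a careful release noises counts over the full item domain. Tuning choices like epsilon and the treatment of the domain are exactly where the partial-information methodology in the talk applies.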

David Zeber, Mozilla Corporation

David Zeber is a research engineer at Mozilla. Reaching across data science, machine learning and differential privacy, his work focuses on collecting and modeling user data in a privacy-preserving way to improve user experience in the Firefox browser and on the Web.

3:30 pm–4:00 pm

Break with Refreshments

4:00 pm–5:30 pm

Privacy in Interesting Contexts

Making Ethical Decisions for the Immersive Web

Monday, 4:00 pm–4:30 pm

Diane Hosfelt, Mozilla

This talk focuses on building a platform that encourages ethical development and usage in an environment where ubiquitous sensors and continuous computer vision processing are required. Spatial computing and immersive experiences expose, by necessity, information that poses a threat to privacy. We’re still in a position to intervene in the Mixed Reality development process, instead of attempting to retrofit ethical decisions into an established design.

Diane Hosfelt, Mozilla

Diane is the security and privacy lead for the Mixed Reality team at Mozilla. She studied at Johns Hopkins University, where her research focused on applications of machine learning to cryptography. She’s currently developing new paradigms for privacy and consent in MR environments.

Privacy Engineering in the Automotive Domain

Monday, 4:30 pm–5:00 pm

Frank Kargl, Ulm University

This talk addresses privacy protection in the automotive domain and for connected vehicles. As cars and mobility services such as tolling or charging have the potential to track their users' mobility behavior, this field became aware of the need for privacy protection comparatively early. A particular challenge is integrating privacy engineering into the daily practice of software and system engineers who have no deep knowledge of privacy concepts. The talk will outline how our understanding of automotive privacy has developed over the past 15 years and how privacy engineering can be put into practice.

Frank Kargl, Ulm University

Frank Kargl is a full professor at Ulm University and has worked on automotive security and privacy since 2004. He has cooperated with the automotive industry in many projects and has been involved in international standardization and harmonization in these areas. He was a partner in the European PRIPARE project, which investigated new approaches to privacy engineering.

Privacy in Unusual Contexts: A Case Study of A Theater Company

Monday, 5:00 pm–5:30 pm

Maggie Oates, Carnegie Mellon University

From social media APIs to VR performance art, artists are collecting, generating, and transforming digital data. Given that one of art’s central projects throughout history is the disruption of social norms, this field poses unusual privacy challenges. Can privacy frameworks like contextual integrity even help us think about privacy norms for non-normative contexts? In addition, galleries, museums, production companies, and individual artists often operate as small businesses without substantive expertise in data security or privacy. I will present one case study (in progress) of a small theater company that faces privacy challenges during the development of an immersive theater piece involving audience data.

Maggie Oates, Carnegie Mellon University

Maggie Oates is a Societal Computing PhD student at Carnegie Mellon University interested in applying arts-based methods in computing research. She graduated from Indiana University with a BS in Computer Science and currently serves on the Board of Trustees for AnitaB.org, a global nonprofit serving women in computing.

Tuesday, August 13

7:55 am–8:55 am

Continental Breakfast

8:55 am–9:00 am

Opening Remarks

Program Co-Chairs: Lorrie Cranor, Carnegie Mellon University, and Lea Kissner, Humu

9:00 am–10:30 am

Privacy Engineering Careers

Explore NIST's Privacy Engineering Collaboration Space

Tuesday, 10:15 am–10:30 am

Kaitlin Boeckl, NIST

Explore NIST’s recently launched Privacy Engineering Collaboration Space and learn how you can engage at this session led by NIST. The Privacy Engineering Collaboration Space is an online venue open to the public where practitioners can discover, share, discuss, and improve upon open source tools, solutions, and processes that support privacy engineering and risk management. Individuals affiliated with organizations ranging from Google to the Future of Privacy Forum have shared tools and use cases. The space has an initial focus on de-identification, including differential privacy techniques, and privacy risk assessment. Join this session to learn more about contributions to date and how you can contribute.

Kaitlin Boeckl, NIST

Katie Boeckl is a privacy risk strategist at the National Institute of Standards and Technology (NIST) as part of the Privacy Engineering Program and the Privacy Framework team. Katie manages the Privacy Engineering Collaboration Space, works to advance international privacy standards, and develops privacy risk management guidance. At NIST, she has served as a co-author for NIST Special Publication (SP) 800-37, revision 2: Guide for Applying the Risk Management Framework to Federal Information Systems, worked to implement the National Strategy for Trusted Identities in Cyberspace (NSTIC), and contributed to NIST SP 800-63, revision 3: Digital Identity Guidelines. Katie has a B.A. in English from the University of Maryland, College Park, where she specialized in technology through a digital cultures honors program.

10:30 am–11:00 am

Break with Refreshments

11:00 am–12:30 pm

Privacy Program in Practice

How to Run an Engineering-Focused Privacy Program

Tuesday, 11:00 am–11:30 am

Giles Douglas

Getting engineering involved with the review and design of projects leads to better interactions. I will present how to make this successful from both a program and individual perspective.

Giles Douglas

Giles was the engineering lead for privacy review at Google for the last 2 years. He spent 14 years at Google and led the internal payments team to scale and secure the system. He has worked on public-facing websites since 1996.

Giles received a B.Sc. (Hons) Mathematics from the University of Warwick in 1995.

Privacy, Triage, and Risk

Tuesday, 11:30 am–12:00 pm

Yonatan Zunger, Humu

Triage is often overlooked in threat management. But without understanding the significance of risks, making intelligent tradeoffs, and conveying those in a way which builds consensus among all stakeholders, even the best ideas will not happen. Privacy triage is notoriously hard: the threat models are extremely diverse, spanning both rare, catastrophic risks and dispersed continuous damage (the two things people are worst at understanding), and key stakeholders have frequently not internalized what failures can look like for real people. This talk will present a collection of tools which have proven useful for both making triage decisions and for getting difficult stakeholders on board.

Yonatan Zunger, Humu

Yonatan Zunger is Chief Ethics Officer and Distinguished Engineer at Humu. Prior to this, he spent 14 years at Google, where he was responsible for a wide range of technical, privacy, and policy matters, ranging from leading high-capacity search, to being overall technical lead for all Social efforts, to navigating hate and harassment policy and spearheading technical data governance. If there was something that involved the overlap between engineering, law, policy, privacy, and SVP-wrangling, it was probably his job.

Building and Scaling a Data Stewardship Program for Products Used by Hundreds of Millions of People

Tuesday, 12:00 pm–12:30 pm

Rebecca Weiss, Mozilla Corporation

Data stewardship programs provide a unique solution to balancing the costs of adhering to privacy principles while still collecting the data needed for modern product development. In this talk, I will discuss how Mozilla uses data stewardship as we build Firefox and share the lessons we’ve learned along the way.

Rebecca Weiss, Mozilla Corporation

Rebecca Weiss is the Director of Data Science at Mozilla. She leads a team focusing on generating insights into the state of the browser and the Web at large. She was also the lead Data Steward of Firefox, where she worked on creating privacy standards for data collection in the browser. Previously, she was a Fellow at the Berkman Center for Internet and Society as well as the Brown Institute for Media Innovation. She holds a PhD from Stanford, a SM in Technology Policy from MIT, and a BA in Cognitive Systems from the University of British Columbia. She also knows a surprising amount about comic books and video games.

12:30 pm–12:40 pm

Closing Remarks

Program Co-Chairs: Lorrie Cranor, Carnegie Mellon University, and Lea Kissner, Humu

12:40 pm–2:15 pm

Tuesday Luncheon

2:15 pm–5:45 pm

SOUPS 2019 Tuesday Afternoon Sessions

Please join us for the final sessions of the Fifteenth Symposium on Usable Privacy and Security, whose subject matter is aligned with the interests of the PEPR community.