PEPR '20 Conference Program

All the times listed below are in Pacific Daylight Time (PDT).

Attendee Files 
PEPR '20 Attendee List (PDF)

Thursday, October 15

8:00 am–8:10 am

Opening Remarks

Program Co-Chairs: Lorrie Cranor, Carnegie Mellon University, and Lea Kissner, Apple

8:10 am–9:25 am

Data Governance

Session Chair: Lorrie Cranor, Carnegie Mellon University

Beyond Access: Using ABAC Frameworks to Implement Privacy and Security Policies

Amanda Walker, Nuna, Inc.

Available Media

Over the last several decades, access control systems have evolved steadily from single bits (“write protect”) through identity- and role-based approaches to complex, abstract frameworks such as Attribute-Based Access Control (ABAC). In practice, however, these frameworks are most often used to answer traditional questions of read and write access. In this talk, I will explore how frameworks like ABAC can be used to implement more abstract controls and policies, such as purpose constraints and other data handling policies that need to depend on attributes of the data, the code, and the surrounding context.
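
As a rough illustration of the idea (a minimal sketch, not code from the talk; all class names and policy fields are invented), an ABAC-style check can condition access on the declared purpose and on attributes of the data, not just on the caller's identity:

```python
# Illustrative sketch of a purpose-aware ABAC check. All names here are
# hypothetical; they are not taken from the talk or any particular system.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Request:
    subject_role: str             # attribute of the caller
    purpose: str                  # declared purpose, e.g. "billing"
    data_categories: frozenset    # attributes of the data being accessed

@dataclass(frozen=True)
class Policy:
    allowed_roles: frozenset
    allowed_purposes: frozenset
    restricted_categories: frozenset = field(default_factory=frozenset)

def is_permitted(req: Request, policy: Policy) -> bool:
    """Allow access only when role, purpose, and data attributes all pass,
    going beyond a traditional read/write identity check."""
    return (req.subject_role in policy.allowed_roles
            and req.purpose in policy.allowed_purposes
            and not (req.data_categories & policy.restricted_categories))

policy = Policy(allowed_roles=frozenset({"billing-service"}),
                allowed_purposes=frozenset({"billing"}),
                restricted_categories=frozenset({"health-record"}))
print(is_permitted(Request("billing-service", "billing",
                           frozenset({"address"})), policy))   # True
print(is_permitted(Request("billing-service", "analytics",
                           frozenset({"address"})), policy))   # False: wrong purpose
```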

Amanda Walker, Nuna, Inc.

Amanda headed up privacy infrastructure engineering at Google for many years, leading and managing teams that built APIs, services, and other components used by Google’s product teams to solve privacy-related problems. In 2019, she left Google to take up a new challenge as Vice President of Engineering for Nuna, Inc., a health care data analytics company that builds data-driven platforms to support value-based health care.

Privacy Architecture for Data-Driven Innovation

Derek Care, Legal Director, Privacy at Uber; Nishant Bhajaria, Privacy Architecture and Strategy at Uber

Available Media

Building privacy governance for a decentralized and innovative workplace is challenging, but doing so can earn customer trust and provide competitive differentiation.

In this session, an engineer and an attorney will explain how you can execute this during data collection and data sharing to provide an end-to-end privacy program for your most valued stakeholders: your users.

There are several challenges to getting this right: measuring risk for the data you collect, tailoring your security tools for privacy, sharing data with privacy in mind, and quantifying improvements and investments.

Succeeding requires a critical combination of engineering techniques and management know-how. The speakers will share their experiences and lessons on how to engage engineers, privacy/security specialists, and executive leadership in service of user data privacy.

Derek Care, Legal Director, Privacy at Uber

Derek Care is a Director on Uber’s Legal - Privacy team, where he has helped build Uber’s privacy program and drive compliance with global privacy laws. Prior to joining Uber, he was privacy counsel at Bloomberg L.P., and counsel in the privacy group at Bingham McCutchen LLP. He has spoken extensively on how to operationalize privacy compliance.

Nishant Bhajaria, Privacy Architecture and Strategy at Uber

Nishant Bhajaria has a B.S. and M.S. in computer science, and has built and led cross-functional privacy teams at Nike, Netflix, Google, and Uber. His efforts are geared towards building tools and shaping organizational dynamics to improve privacy and protect user trust. He has spoken and written about these topics extensively and has also taught privacy courses hosted by LinkedIn.

Responsible Design through Experimentation: Learnings from LinkedIn

Guillaume Saint-Jacques, LinkedIn Corporation

Available Media

As technology advances, there is increasing concern about individuals being left behind. Businesses are striving to adopt responsible design practices and avoid any unintended consequences of their products. We propose a novel approach to fairness and inclusiveness based on experimentation. We use experimentation in order to assess not only the intrinsic properties of products and algorithms but also their impact on people. We do this by introducing an inequality approach to A/B testing. We show how to perform causal inference over this inequality measure. We provide real examples from LinkedIn, as well as an open-source, highly scalable implementation of the computation of the Atkinson index and its variance in Spark/Scala. We also provide over a year's worth of learnings, gathered by scaling our method and analyzing thousands of experiments, on which areas and which kinds of product innovations seem to foster fairness through inclusiveness.
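
For context, the Atkinson index mentioned above measures inequality in a distribution of outcomes. A plain-Python sketch of the index (for an assumed inequality-aversion parameter; this is only an illustration of the measure, not the talk's Spark/Scala implementation, which also computes its variance):

```python
# Plain-Python sketch of the Atkinson inequality index for epsilon != 1.
# Purely illustrative; the open-source implementation the abstract names
# is in Spark/Scala and scales to thousands of experiments.
def atkinson(values, epsilon=0.5):
    n = len(values)
    mean = sum(values) / n
    # Equally-distributed-equivalent outcome: the generalized mean of
    # order (1 - epsilon).
    ede = (sum(v ** (1 - epsilon) for v in values) / n) ** (1 / (1 - epsilon))
    return 1 - ede / mean  # 0 = perfect equality; grows toward 1 with inequality

print(atkinson([10, 10, 10, 10]))  # 0.0: everyone receives the same outcome
print(atkinson([1, 1, 1, 37]))     # substantially greater than 0
```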

9:25 am–9:45 am

Break

9:45 am–11:00 am

Privacy-Preserving Data Analysis

Session Chair: Lea Kissner, Apple

Building and Deploying a Privacy Preserving Data Analysis Platform

Frederick Jansen, Boston University

Available Media

This talk focuses on our experience building and deploying various iterations of a web-based secure multi-party computation (MPC) platform. Our experience demonstrates that secure computations can add value to questions of social good when otherwise constrained by legal, ethical, or privacy restrictions, and that it is feasible to deploy MPC solutions today.

Frederick Jansen, Boston University

Frederick Jansen is the Director of the Software & Application Innovation Lab (SAIL) at Boston University. At SAIL he currently leads a team of software engineers, and was previously a senior software engineer responsible for the design, development, and management of software development projects in support of computational research efforts across the University.

A Differentially Private Data Analytics API at Scale

Ryan Rogers, LinkedIn

Available Media

We present a privacy system that leverages differential privacy to protect LinkedIn members' data while also providing audience engagement insights to enable marketing analytics applications. We detail the differentially private algorithms and other privacy safeguards used to provide results that can be used with existing real-time data analytics platforms, specifically with the open-source Pinot system. Our privacy system provides user-level privacy guarantees. As part of our privacy system, we include a budget management service that enforces a strict differential privacy budget on the results returned to the analyst. This budget management service brings the latest research in differential privacy into a product that maintains utility given a fixed differential privacy budget.
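
As a hedged sketch of the general pattern (with invented interfaces; this is not LinkedIn's actual service), a budget manager can debit a fixed epsilon budget before each noisy query is answered:

```python
# Minimal sketch of a per-analyst DP budget ledger in front of the Laplace
# mechanism. Class and function names are invented for illustration.
import math
import random

class BudgetExceeded(Exception):
    pass

class BudgetLedger:
    """Tracks the remaining epsilon budget for one analyst."""
    def __init__(self, total_epsilon: float):
        self.remaining = total_epsilon

    def charge(self, epsilon: float) -> None:
        if epsilon > self.remaining:
            raise BudgetExceeded("differential privacy budget exhausted")
        self.remaining -= epsilon

def noisy_count(true_count: int, epsilon: float, ledger: BudgetLedger,
                sensitivity: float = 1.0) -> float:
    """Debit the ledger, then add Laplace(sensitivity / epsilon) noise."""
    ledger.charge(epsilon)
    u = random.random() - 0.5                      # Uniform(-0.5, 0.5)
    scale = sensitivity / epsilon
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

ledger = BudgetLedger(total_epsilon=1.0)
print(noisy_count(1234, epsilon=0.25, ledger=ledger))  # a noisy answer
print(ledger.remaining)                                # 0.75 still available
```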

Ryan Rogers, LinkedIn

Ryan Rogers is a Senior Software Engineer in the applied research group at LinkedIn where he works on designing and implementing private algorithms and systems for data analytics and machine learning. Prior to working at LinkedIn, he worked with the ML Privacy team at Apple where he was the technical lead on developing the private algorithms for the private federated learning project. He received his PhD in Applied Mathematics from the University of Pennsylvania where he was advised by Aaron Roth and Michael Kearns.

Improving Usability of Differential Privacy at Scale

Milinda Perera and Miguel Guevara, Google LLC

Available Media

We present a framework to improve the usability of Differential Privacy (DP) by allowing practitioners to quantify and visualize privacy vs utility trade-offs of DP.

While DP has long been seen as a robust anonymization technique, there is a significant disconnect between theory, implementation, and usability. One of the biggest problems practitioners face when using DP is forming mental models around the benefits that DP provides to end users and how DP affects data utility. Many users are not accustomed to thinking in terms of epsilons, deltas, and sensitivity bounds, and they shouldn't have to! Our system helps users think in terms of utility loss and user anonymity gains.

Our talk has three parts. First, we provide a very quick primer on DP. Second, we explain why and how we built this framework. Third, we demo the system using a real dataset in real time!
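
To make the "utility instead of epsilons" point concrete, here is a toy translation from an epsilon to an expected error on a count. It assumes counts are released with Laplace noise, which is a common mechanism but an assumption on my part, not a detail taken from the talk:

```python
# Toy illustration of reporting utility loss instead of raw epsilons.
# Assumes a Laplace(sensitivity / epsilon) mechanism; this is an assumed
# mechanism for illustration, not necessarily what the framework uses.
import math

def utility_summary(epsilon: float, sensitivity: float = 1.0) -> str:
    scale = sensitivity / epsilon        # Laplace scale b
    typical_error = scale                # E|Laplace(b)| = b
    error_95 = scale * math.log(20)      # P(|noise| > b * ln 20) = 5%
    return (f"epsilon={epsilon}: typical error ~{typical_error:.1f}; "
            f"within +/-{error_95:.1f} of the true count 95% of the time")

for eps in (0.1, 0.5, 1.0):
    print(utility_summary(eps))
```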

Milinda Perera, Google LLC

Milinda Perera is a Software Engineer in the Privacy group at Google. His primary focus areas include engineering usable anonymization, scaling pseudonymization, and improving privacy through cryptography. He holds a Ph.D. in Cryptography from the City University of New York (CUNY).

Miguel Guevara, Google LLC

Miguel Guevara is a Product Manager working in Google’s Differential Privacy team. His primary focus area is building systems that apply differential privacy at scale. He holds a master's in Public Policy and is currently pursuing a master's in Computer Science.

11:00 am–11:40 am

Networking Break

Join a Birds-of-a-Feather Session via Zoom to discuss topics of interest with other attendees. Topics and Zoom info will be posted on the conference Slack.

11:40 am–12:55 pm

Design

Session Chair: Manya Sleeper, Google

How to (In)Effectively Convey Privacy Choices with Icons and Link Text

Lorrie Faith Cranor, Carnegie Mellon University; Florian Schaub, University of Michigan

Available Media

Clear communication about privacy through icons and/or just a few words can be a difficult challenge. We have been involved in a number of research projects that evaluated proposed and implemented privacy choice icons and short text, including, most recently, studies related to the CCPA opt-out button. In this talk we will discuss our research into privacy icons and text, explaining both our methods and our findings. We will demonstrate how to conduct these sorts of studies quickly and at low cost, and discuss why they are so important. We will also provide lessons learned about what works and what doesn’t when you want to communicate about privacy through icons and short text.

Lorrie Faith Cranor, Carnegie Mellon University

Lorrie Faith Cranor is the Director and Bosch Distinguished Professor of the CyLab Security and Privacy Institute and FORE Systems Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University. She also directs the CyLab Usable Privacy and Security Laboratory (CUPS) and co-directs the MSIT-Privacy Engineering master's program. In 2016 she served as Chief Technologist at the US Federal Trade Commission. She co-founded Wombat Security Technologies, a security awareness training company that was acquired by Proofpoint. She is a fellow of the ACM and IEEE and a member of the ACM CHI Academy.

Florian Schaub, University of Michigan

Florian Schaub is an Assistant Professor in the School of Information at the University of Michigan. His research focuses on empowering users to effectively manage their privacy in complex socio-technological systems. His research interests span privacy, human-computer interaction, mobile and ubiquitous computing, and the Internet of Things. Before joining the University of Michigan, he was a postdoctoral fellow in the School of Computer Science at Carnegie Mellon University. He received his doctoral degree and Diplom in Computer Science from the University of Ulm, Germany, and a Bachelor in Information Technology from Deakin University, Australia.

Beyond the Individual: Exploring Data Protection by Design in Connected Communal Spaces

Martin J. Kraemer, University of Oxford

Available Media

There's a gap between the personal focus of data protection legislation and practices, and the communal implications of internet-connected technology. Through our research, we've started to explore how existing design tools and methods can help understand and address communal implications. In this talk, I will highlight opportunities for design in such spaces by discussing two case studies from participatory design workshops. These case studies show potential for our method to inform data protection design that is appropriate to social groups and their dynamics, complementing individual perspectives on data protection.

Martin Kraemer, University of Oxford

Martin is a final-year doctoral student at the University of Oxford. His research aims to improve the understanding of and design for communal privacy practices in smart homes. Together with other HCI researchers, he runs the project Informing the Future of Data Protection by Design in Smart Homes, funded by the UK Information Commissioner's Office.

Throwing Out the Checklist

Dan Crowley, Quizlet

Available Media

Effective privacy design is a crucial component of technology products and has been a global focus area for regulators and companies alike for decades. Despite this, many classic "privacy-by-design" processes ultimately fail. Why is that? This session focuses on the gaps in most PbD processes, notably how classic PbD systems exhibit inflexibility, are under-resourced, and conflict with predominant engineering cultures. It also discusses what companies, especially smaller ones, and their privacy champions can do to achieve better privacy outcomes. By shifting from programs designed around process-oriented mandates to programs that emphasize building a “privacy-by-ethos” culture, companies can set themselves (and their users) up for success when it comes to privacy best practices.

Dan Crowley, Quizlet

Dan Crowley is the Global Head of Trust & Safety and Data Protection at Quizlet, an online learning platform serving students and teachers around the world. In his role, he oversees all public policy, privacy, data protection, compliance, user safety, and content moderation programs. Mr. Crowley develops tools, policies, and processes that maintain Quizlet as an appropriate platform for all audiences and works to ensure the responsible use of data across all of Quizlet’s products and throughout engineering, product development, design, and other business processes.

12:55 pm–1:15 pm

Break

1:15 pm–2:30 pm

Product Privacy

Session Chair: Nwokedi Idika, Google

Product Privacy Journey: Towards a Product Centric Privacy Engineering Framework

Igor Trindade Oliveira, Work & Co

Available Media

The Product Journey and the Consumer Journey are essential parts of the product development process. When thinking about privacy from the perspective of the user, there is an underlying part of the consumer journey that must be mapped: the product's privacy journey. This talk discusses the challenges of embedding privacy into the product creation process, how privacy product principles help consumers, and how those principles can be translated into product requirements.

Igor Trindade Oliveira, Work & Co

Igor is an Associate Partner at Work & Co, where he works closely with Product, Design, and Engineering teams across several projects to conceptualize and design successful product strategies with privacy as the competitive advantage.

Wikipedia and the Lean Data Diet

Nuria Ruiz, Principal Engineer, Wikimedia Foundation

Available Media

Privacy is one of the lesser-known charms of Wikipedia. Wikipedia’s stance on privacy allows users to access and modify a wiki in anonymity, without fear of giving away personal information, editorship, or browsing history. In this talk we will go into the challenges that this strong privacy stance poses for the Wikimedia Foundation, including how it affects data collection, and some creative workarounds that allow WMF to calculate metrics in a privacy-conscious way.

Nuria Ruiz, Wikimedia Foundation

Before moving bytes for good as an engineer for Wikipedia, Nuria spent time working in performance, mobile apps and web frameworks in the retail and social spaces. She learned about scale in the bootcamp of life working for Amazon in the early years. Nuria is a physicist by training and started writing software in a Physical Oceanography Lab in Seattle. A long time ago. When Big Data was just called "science".

Privacy Professional Boss Mode

Melanie Ensign, Discernible Inc.

Available Media

Despite the growth in new regulations around the world, reliance on legal mandates often has diminishing returns for privacy professionals exerting influence beyond their immediate team. This presentation will introduce new and experienced professionals from all privacy disciplines to pragmatic techniques for thinking creatively about your role, building influence for privacy throughout cross-functional organizations, and reducing your dependence on regulatory hammers to secure privacy outcomes.

Friday, October 16

8:00 am–8:10 am

Quick Kickoff

Program Co-Chairs: Lorrie Cranor, Carnegie Mellon University, and Lea Kissner, Apple

8:10 am–9:25 am

Privacy-Preserving Technologies

Session Chair: Lorrie Cranor, Carnegie Mellon University

Privacy in Deployment

Patricia Thaine, Private AI, University of Toronto; Pieter Luitjens, Private AI; Dr. Parinaz Sobhani, Georgian Partners

Available Media

This talk is a guide to using privacy technology in deployment. First, we will give a brief overview of the current state of privacy technology for (a) Differential Privacy & Anonymization and (b) Secure Multiparty Computation, Homomorphic Encryption, and Secure Enclaves. We will then go over the current obstacles to deploying privacy-preserving software; namely, identifying privacy risks & risk management, and the capabilities & limitations of privacy tool sets and the backgrounds required to use them. Obstacles differ depending on whether one is attempting to retrofit a codebase in order to integrate privacy post hoc or choosing the tech stack for a new codebase that integrates Privacy by Design. With those two scenarios in mind, we will discuss strategies for choosing privacy tools, for deciding whether to compute on the edge, on-premises, or in the cloud, and for thinking about the right risk management frameworks.

Patricia Thaine, Private AI, University of Toronto

Patricia Thaine is the Co-Founder and CEO of Private AI, as well as a Computer Science PhD Candidate at the University of Toronto and a Postgraduate Affiliate at the Vector Institute. Her research is focused on privacy-preserving natural language processing, machine learning, and applied cryptography. She also does research on computational methods for lost language decipherment. Patricia is a recipient of the NSERC Postgraduate Scholarship, the RBC Graduate Fellowship, the Beatrice “Trixie” Worsley Graduate Scholarship in Computer Science, and the Ontario Graduate Scholarship. She has eight years of research and software development experience, including at the McGill Language Development Lab, the University of Toronto's Computational Linguistics Lab, the University of Toronto's Department of Linguistics, and the Public Health Agency of Canada.

Pieter Luitjens, Private AI

Pieter Luitjens is the Co-Founder and CTO of Private AI. He worked on software for Mercedes-Benz and developed the first deep learning algorithms for traffic sign recognition deployed in cars made by one of the most prestigious car manufacturers in the world. He has over 10 years of engineering experience, with code deployed in multi-billion dollar industrial projects. Pieter specializes in ML edge deployment & model optimization for resource-constrained environments. He has a Bachelor of Science in Physics and Mathematics and a Bachelor of Engineering from the University of Western Australia, as well as a master's degree from the University of Toronto.

Parinaz Sobhani, Georgian Partners

Parinaz Sobhani is the Director of Machine Learning on the Georgian Impact team and is responsible for leading the development of cutting-edge machine learning solutions for growth-stage startup companies. Parinaz holds a Ph.D. from the University of Ottawa with a research focus on solving opinion mining problems using natural language processing and deep neural network techniques. She has more than 10 years of experience in developing and designing new models and algorithms for various artificial intelligence tasks. Prior to joining Georgian Partners, Parinaz worked at Microsoft Research, where she developed end-to-end neural machine translation models. She has also worked for the National Research Council of Canada, where she designed and developed deep neural network models for natural language understanding and sentiment analysis.

Design of a Privacy Infrastructure for the Internet of Things

Norman Sadeh, Carnegie Mellon University

Available Media

We have recently launched a Privacy Infrastructure for the Internet of Things. The infrastructure revolves around a growing collection of registries where owners of IoT resources (e.g., IoT devices and services) and volunteer contributors can publicize the presence of IoT resources and their data practices, including any privacy settings made available by these resources. An IoT Privacy Assistant app, available in the iOS App Store and on Android's Google Play store, enables people to discover IoT resources around them, find out about the data they collect, and interact with privacy settings made available by these resources. In this presentation, we will discuss some of the design challenges associated with the development of such a platform and how we approached these challenges as we developed a first instance of this technology. Within a month of launch, our user community has grown to over 15,000 users, with our infrastructure hosting over 100,000 IoT resource descriptions.

Norman Sadeh, Carnegie Mellon University

Norman Sadeh is a Professor of Computer Science at Carnegie Mellon University, where among other things he co-founded and co-directs the Privacy Engineering Master's Program. Norman's primary research interests span mobile & IoT, cybersecurity, privacy and Artificial Intelligence. He was founding CEO and Chairman of Wombat Security Technologies, a company that grew to over 200 employees, prior to being acquired by Proofpoint for $225M in March 2018. In the late nineties, Norman also served as Chief Scientist of the European Commission's 550M EUR research initiative in e-Commerce, which at the time included all pan-European research in cybersecurity and privacy and related policy initiatives.

A Backdoor by Any Other Name, and How to Stop It

Max Hunter, EFF

Available Media

Recent attacks on encryption have diverged. On the one hand, we’ve seen Attorney General William Barr call for “extraordinary access” to encrypted communications, using arguments that have barely changed since the 1990s. But we’ve also seen suggestions from a different set of actors for more purportedly “reasonable” interventions, particularly the use of client-side scanning to stop the transmission of contraband files, most often child sexual abuse material (CSAM).

On their face, proposals to do client-side scanning seem to give us the best of all worlds: they preserve encryption, while also combating the spread of illegal and morally objectionable content.

But unfortunately it’s not that simple. While it may technically maintain some properties of end-to-end encryption, client-side scanning would render the user privacy and security guarantees of encryption hollow. This talk will explain why that is, and what we can do to keep encryption encrypted.

Maximillian Hunter, EFF

Max manages a team of engineers who maintain Certbot, STARTTLS Everywhere, and other projects to encrypt the Internet. Max writes and speaks primarily about consumer privacy, security, and tech policy on cryptography. They serve on the boards of the Internet Security Research Group (which operates Let's Encrypt) and the Nordic Center for Data Privacy.

9:25 am–9:45 am

Break

9:45 am–10:55 am

Incidents

Session Chair: Florian Schaub, University of Michigan

Building an Effective Feedback Loop for Your Privacy Program through Privacy Incident Response

Sri Pravallika Maddipati

Available Media

Privacy Incident Response (PIR) can be challenging for most organizations given the growing number of regulations and notification obligations. While most of the focus is on timely response to incidents, breach notification, and quick fixes to minimize damage, the communication of lessons learned to the appropriate product and privacy teams is often ignored. This talk focuses on maturing the incident response program to analyze key privacy incident trends and metrics that can highlight the success, progress, and challenges of the privacy program.

Sri Pravallika Maddipati, Google LLC

Sri is a Privacy Incident Manager at Google with diverse experience spanning incident response, privacy engineering, security strategy development, security assessments and audits, and security architecture. She plays an integral role in responding to privacy reports and facilitating the remediation of privacy issues while helping teams achieve their business goals. She thrives at the intersection of privacy engineering, security assurance, and privacy regulations. She strongly believes that privacy is about providing transparent, unambiguous, and trustworthy user-centric products and services, not just compliance.

When Things Go Wrong

Lea Kissner

Available Media

Despite all we do to prevent them, mistakes happen. We’re fallible humans working with exceedingly complicated systems in a world of users with a dizzying array of different needs. Unsurprisingly but sadly, our systems sometimes end up with vulnerabilities, and those vulnerabilities can turn into incidents, hurting people affected by our systems. In this talk we go through the stages of incident handling: finding the cut, stopping the bleeding, and cleaning up the blood. After the incident is over, our work is not done: we need to find the root cause and ensure that neither this particular incident nor related ones happen again. We will go through real-world examples of things going wrong and how to make them go right.

Lea Kissner

Lea was the Chief Privacy Officer of Humu. They work to build respect for users into products and systems through product design, privacy-enhancing infrastructure, application security, and novel research into both theoretical and practical aspects of privacy. They were previously the Global Lead of Privacy Technology at Google, working for over a decade on projects including logs anonymization, infrastructure security, privacy infrastructure, and privacy engineering. They earned a Ph.D. in computer science (with a focus on cryptography) at Carnegie Mellon University and a BS in electrical engineering and computer science from UC Berkeley.

Taking Responsibility for Someone Else's Code: Studying the Privacy Behaviors of Mobile Apps at Scale

Serge Egelman

Available Media

Modern software development has embraced the concept of "code reuse," which is the practice of relying on third-party code to avoid "reinventing the wheel" (and rightly so). While this practice saves developers time and effort, it also creates liabilities: the resulting app may behave in ways that the app developer does not anticipate. This can cause very serious issues for privacy compliance: while an app developer did not write all of the code in their app, they are nonetheless responsible for it. In this talk, I will present research that my group has conducted to automatically examine the privacy behaviors of mobile apps vis-à-vis their compliance with privacy regulations. Using analysis tools that we developed and commercialized (as AppCensus, Inc.), we have performed dynamic analysis on hundreds of thousands of the most popular Android apps to examine what data they access, with whom they share it, and how these practices comport with various privacy regulations, app privacy policies, and platform policies. We find that while potential violations abound, many of the issues appear to be due to the (mis)use of third-party SDKs. I will provide an account of the most common types of violations that we observe and how app developers can better identify these issues prior to releasing their apps.

Serge Egelman

Serge Egelman is the Research Director of the Usable Security and Privacy group at the International Computer Science Institute (ICSI), which is an independent research institute affiliated with the University of California, Berkeley. He is also CTO and co-founder of AppCensus, Inc., which is a startup that is commercializing his research by performing on-demand privacy analysis of mobile apps for developers, regulators, and watchdog groups. He conducts research to help people make more informed online privacy and security decisions, and is generally interested in consumer protection. This has included improvements to web browser security warnings, authentication on social networking websites, and most recently, privacy on mobile devices. Seven of his research publications have received awards at the ACM CHI conference, which is the top venue for human-computer interaction research; his research on privacy on mobile platforms has been cited in numerous lawsuits and regulatory actions, as well as featured in the New York Times, Washington Post, Wall Street Journal, Wired, CNET, NBC, and CBS. He received his PhD from Carnegie Mellon University and has previously performed research at Xerox PARC, Microsoft, and NIST.

10:55 am–11:40 am

Networking Break

Join a Birds-of-a-Feather Session via Zoom to discuss topics of interest with other attendees. Topics and Zoom info will be posted on the conference Slack.

11:40 am–12:55 pm

Frameworks and Risk

Session Chair: Lea Kissner, Apple

Assessing Privacy Risk with the IPA Triad

Mark Funk, Obscure Group

Available Media

This talk introduces the IPA Triad, a generalized privacy framework consisting of three properties: Identity, Presence, and Activity. Like the CIA triad of information security, these properties provide a useful approach for modeling privacy risks in an applied problem space.

Mark Funk, Obscure Group

Mark Funk runs a small (“fun size”) security and privacy consultancy. Their experience involves nearly 15 years of product, systems, security, and privacy engineering. Loving big and small challenges alike, they have been employee number 1 / ~100 / ~1,000 / ~100,000 at tech companies across a wide variety of product verticals.

Engineering Ethics into the NIST Privacy Framework

R. Jason Cronk, Enterprivacy

Available Media

Version 1.0 of the NIST Privacy Framework invites organizations to use organizational privacy values and business objectives to design a target profile. But how does an organization determine its "privacy values"? This talk will examine five different models for privacy and how organizations can develop a set of custom privacy values from those models.

R. Jason Cronk, Enterprivacy

R. Jason Cronk is the author of Strategic Privacy by Design, one of the textbooks for the IAPP's privacy technologist certification (CIPT). He was designated in 2014 as a privacy-by-design ambassador by the Ontario Information and Privacy Commissioner's office and has been a leading figure in privacy by design and privacy engineering ever since. Currently he works as a consultant and trainer for the boutique firm Enterprivacy Consulting Group. He can be found tweeting @privacymaverick.

When Engineers and Lawyers Talk: Right-Sizing Your Data Protection Risk Profile

Rafae Bhatti, Mode

Available Media

The path to navigating data protection risks is often filled with uncertainty. Overestimating the risks stifles growth, and underestimating them can derail the business. To be able to measure data protection risks and right-size a company's risk profile, we need to view them through both a technical and a legal lens. Engineers and lawyers need to talk.

This talk will provide practical examples of how right-sizing the risk profile helps simplify compliance. It will cover scenarios of data retention, use, and sharing, as well as breach notification. We will review key architectural decisions and engineering trade-offs that often shape an organization’s compliance processes. These decisions and trade-offs often center around the purpose of use, a concept that engineering teams do not traditionally pay attention to. Therefore, viewing system requirements through a data protection lens helps clarify legal obligations and simplify compliance.

Rafae Bhatti, Mode

Rafae Bhatti is an information security expert and a lawyer who works with cloud-based start-ups in Silicon Valley to help build their cybersecurity and compliance programs. He is currently the Director of Security and Compliance at Mode Analytics. He is a speaker, a published author, and an inventor on three granted patents. Rafae received a Ph.D. in Computer Engineering from Purdue University and, in his spare time, also obtained a J.D. from Santa Clara University.

12:55 pm–1:00 pm

Closing Remarks

1:00 pm–1:45 pm

Virtual Ice Cream Social

Grab your own ice cream or other tasty treat and join us for a virtual social event with PEPR attendees.