Panel: Privacy Challenges and Opportunities in LLM-Based Chatbot Applications

Tuesday, September 12, 2023 - 2:00 pm–2:40 pm

Sameera Ghayyur, Snap Inc.; Jay Averitt, Microsoft; Eric Lin, DynamoFL; Eric Wallace, UC Berkeley; Apoorvaa Deshpande, Snap Inc.; Hunter Luthi, Google Bard


Recent advances in Large Language Model (LLM) technology have sparked great interest in AI chatbots. Several are now publicly available, including ChatGPT by OpenAI, Google Bard, My AI by Snap, and Microsoft's AI-powered Bing. In this panel we discuss the privacy challenges and opportunities these chatbots present: what is new about this chat data, tradeoffs in data collection, implications of personalization, and considerations when using third-party LLMs.

Sameera Ghayyur, Snap Inc.

Sameera Ghayyur is currently a privacy engineer at Snap Inc., where she is the primary privacy reviewer on the My AI chatbot product, among many other features in Snapchat. In the past, she has also worked on the privacy teams at Meta and Honeywell. She received her Ph.D. in computer science from the University of California, Irvine, where her research focused on accuracy-aware privacy-preserving algorithms. She also has experience working as a software engineer and a lecturer.

Jay Averitt, Microsoft

Jay Averitt is currently a Senior Privacy Product Manager at Microsoft, where he manages Technical Privacy Reviews involving M365 CoPilot, GPT, and other LLM products. He was previously a Privacy Engineer at Twitter, where he managed technical privacy reviews across the platform. He has 10+ years of experience in privacy as both a privacy technologist and a privacy attorney. He graduated with a BS in Management Information Systems from Auburn University and a JD from the University of Alabama School of Law.

Eric Lin, DynamoFL

Eric Lin is the Head of ML Ops at DynamoFL, where he leads a team of ML researchers and engineers empowering enterprise companies to deploy private and trustworthy generative AI models. His team focuses on democratizing the latest research techniques in privacy and safety to a broader audience. Eric previously researched privacy-preserving, trustworthy, and on-device ML optimizations during his BA and MS at Harvard. He has also shipped AI-powered products to over 1 billion users as a PM at Microsoft and Apple.

Eric Wallace, UC Berkeley

Eric Wallace is a PhD student at UC Berkeley advised by Dawn Song and Dan Klein. His research interests are in making large language models more robust, trustworthy, secure, and private. Eric's work is supported by the Apple Fellowship in AI/ML, and in the past he has been at Google, FAIR, AI2, and the University of Maryland. His expertise also includes memorization and privacy, in particular how LMs and diffusion models can memorize their training data, raising concerns regarding privacy, copyright agreements, GDPR statutes, and more.

Apoorvaa Deshpande, Snap Inc.

Apoorvaa Deshpande has been a privacy engineer at Snap Inc. for 3+ years, where she primarily works on privacy engineering reviews for monetization products (including monetization of LLMs) as well as on building privacy-enhancing technologies. Prior to that, she completed her Ph.D. in computer science (specifically, cryptography) at Brown University. She holds an M.Sc. in Mathematics and a B.E. in Computer Science from BITS Pilani, India.

Hunter Luthi, Google Bard

Hunter Luthi is the Privacy Lead and Privacy Engineering Manager for Google Search, Assistant, and Bard. Hunter and his team are responsible for reviewing and consulting on product design, including championing privacy-by-design philosophies with Google teams. Prior to Google, Hunter was the Program Manager for Privacy Solutions at TrustArc. He has a JD from Santa Clara University, where he received a Privacy Certificate with Honors, and a BS in Informatics from Indiana University Southeast.

@conference {290847,
author = {Sameera Ghayyur and Jay Averitt and Eric Lin and Eric Wallace and Apoorvaa Deshpande and Hunter Luthi},
title = {Panel: Privacy Challenges and Opportunities in {LLM-Based} Chatbot Applications},
year = {2023},
address = {Santa Clara, CA},
publisher = {USENIX Association},
month = sep
}