<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:spotify="https://www.spotify.com/ns/rss">
  <channel>
    <generator>Fame Host (https://fame.so)</generator>
    <title>Privacy in Practice</title>
    <link>https://podcasts.fame.so/privacy-in-practice</link>
    <itunes:new-feed-url>https://feeds.fame.so/privacy-in-practice</itunes:new-feed-url>
    <description>Privacy in Practice, brought to you by VeraSafe, is the podcast for actionable insights and real-world strategies for privacy and compliance teams. Hosted by privacy pros Kellie Du Preez and Danie Strachan, each episode unpacks the practical side of compliance and data management, bringing together industry leaders and thought-provoking discussions. Whether you’re leading privacy efforts at your company or just beginning to explore this field, tune in for meaningful conversations that provide a straightforward approach to data privacy and empower listeners to make informed, confident decisions. Privacy isn’t just about regulatory boxes—it’s about fostering trust and resilience in a digital world.</description>
    <copyright>© 2026 VeraSafe. All Rights Reserved.</copyright>
    <language>en</language>
    <pubDate>Mon, 04 Nov 2024 15:43:21 +0000</pubDate>
    <lastBuildDate>Fri, 24 Apr 2026 12:52:21 +0000</lastBuildDate>
    <image>
      <url>https://content.fameapp.so/uploads/3jqv36k1/87d72f40-9ac1-11ef-b4f9-ad8b31dcff8d/87d73060-9ac1-11ef-8aa6-53a7e909a53e.jpg</url>
      <title>Privacy in Practice</title>
      <link>https://podcasts.fame.so/privacy-in-practice</link>
      <description>Privacy in Practice, brought to you by VeraSafe, is the podcast for actionable insights and real-world strategies for privacy and compliance teams. Hosted by privacy pros Kellie Du Preez and Danie Strachan, each episode unpacks the practical side of compliance and data management, bringing together industry leaders and thought-provoking discussions. Whether you’re leading privacy efforts at your company or just beginning to explore this field, tune in for meaningful conversations that provide a straightforward approach to data privacy and empower listeners to make informed, confident decisions. Privacy isn’t just about regulatory boxes—it’s about fostering trust and resilience in a digital world.</description>
    </image>
    <googleplay:author>VeraSafe</googleplay:author>
    <googleplay:image href="https://content.fameapp.so/uploads/3jqv36k1/87d72f40-9ac1-11ef-b4f9-ad8b31dcff8d/87d73060-9ac1-11ef-8aa6-53a7e909a53e.jpg"/>
    <itunes:category text="Business">
      <itunes:category text="Management"/>
    </itunes:category>
    <itunes:category text="News">
      <itunes:category text="Tech News"/>
    </itunes:category>
    <itunes:category text="Government"/>
    <googleplay:summary>Privacy in Practice, brought to you by VeraSafe, is the podcast for actionable insights and real-world strategies for privacy and compliance teams. Hosted by privacy pros Kellie Du Preez and Danie Strachan, each episode unpacks the practical side of compliance and data management, bringing together industry leaders and thought-provoking discussions. Whether you’re leading privacy efforts at your company or just beginning to explore this field, tune in for meaningful conversations that provide a straightforward approach to data privacy and empower listeners to make informed, confident decisions. Privacy isn’t just about regulatory boxes—it’s about fostering trust and resilience in a digital world.</googleplay:summary>
    <googleplay:explicit>No</googleplay:explicit>
    <googleplay:block>No</googleplay:block>
    <itunes:type>episodic</itunes:type>
    <itunes:author>VeraSafe</itunes:author>
    <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/87d72f40-9ac1-11ef-b4f9-ad8b31dcff8d/87d73060-9ac1-11ef-8aa6-53a7e909a53e.jpg"/>
    <itunes:summary>Privacy in Practice, brought to you by VeraSafe, is the podcast for actionable insights and real-world strategies for privacy and compliance teams. Hosted by privacy pros Kellie Du Preez and Danie Strachan, each episode unpacks the practical side of compliance and data management, bringing together industry leaders and thought-provoking discussions. Whether you’re leading privacy efforts at your company or just beginning to explore this field, tune in for meaningful conversations that provide a straightforward approach to data privacy and empower listeners to make informed, confident decisions. Privacy isn’t just about regulatory boxes—it’s about fostering trust and resilience in a digital world.</itunes:summary>
    <itunes:subtitle>Privacy in Practice, brought to you by VeraSafe, is the podcast for actionable insights and real-world strategies for privacy and compliance teams. Hosted by privacy pros Kellie Du Preez and Danie Strachan, each episode unpacks the practical side of compliance and data management, bringing together industry leaders and thought-provoking discussions. Whether you’re leading privacy efforts at your company or just beginning to explore this field, tune in for meaningful conversations that provide a straightforward approach to data privacy and empower listeners to make informed, confident decisions. Privacy isn’t just about regulatory boxes—it’s about fostering trust and resilience in a digital world.</itunes:subtitle>
    <itunes:keywords>VeraSafe, Kellie Du Preez, Danie Strachan, Privacy in Practice, Data Privacy Compliance, TrustArc, OneTrust, Nymity, BigID, DataGrail, WireWheel, GDPR and Privacy Laws, Data Protection Strategies, Privacy and Emerging Tech, Compliance Best Practices</itunes:keywords>
    <itunes:owner>
      <itunes:name>Kellie Du Preez and Danie Strachan</itunes:name>
      <itunes:email>team@fame.so</itunes:email>
    </itunes:owner>
    <itunes:complete>No</itunes:complete>
    <itunes:explicit>No</itunes:explicit>
    <itunes:block>No</itunes:block>
    <item>
      <title>Empowering Teams to Exercise Judgement in Privacy Decisions</title>
      <link>https://podcasts.fame.so/e/r874vmwn-empowering-teams-to-exercise-judgement-in-privacy-decisions</link>
      <itunes:title>Empowering Teams to Exercise Judgement in Privacy Decisions</itunes:title>
      <itunes:episode>16</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">k08ynwk1</guid>
      <description>In this episode of Privacy in Practice, Kellie du Preez and Danie Strachan speak with Leah Camilla R. Besa-Jimenez, Group Head, Enterprise Risk Management at PLDT, about how she approaches privacy inside one of the largest Southeast Asian telecommunications companies. The discussion focuses on privacy as an operational practice and the importance of using risk matrices to structure decisions, coaching teams to exercise judgment, raising privacy issues early in project conversations, and shifting the company mindset from data ownership to data stewardship.</description>
      <content:encoded><![CDATA[<div>In this episode of <em>Privacy in Practice</em>, hosts Kellie du Preez and Danie Strachan sit down with Leah Camilla R. Besa-Jimenez, Group Head, Enterprise Risk Management at PLDT, to discuss how she approaches privacy inside one of the largest Southeast Asian telecommunications companies. The discussion focuses on privacy as an operational practice, not only a legal one: using risk matrices to structure decisions, coaching teams to exercise judgment, raising privacy issues early in project conversations, and shifting the company mindset from data ownership to data stewardship.<br><br></div><div>The conversation centers on how privacy functions inside day-to-day operations. Leah explains that privacy is largely about process: how data is handled, how risks are assessed, and how teams are trained to identify issues before launch. The episode also discusses how leaders must empower privacy teams to make better decisions.<br><br></div><div><strong><br>What this episode covers:<br></strong><br></div><ul><li>Why privacy work is often operational, not only legal</li><li>How risk is assessed using impact, likelihood, and specific privacy dimensions</li><li>Why teams need to exercise judgment instead of waiting for answers</li><li>Why privacy questions should be addressed early in design and planning</li><li>Why customer data should be treated as something the company stewards, not owns</li><li>And so much more!<br><br></li></ul><div><br></div><div><strong>Connect with Leah Camilla R. 
Besa-Jimenez here: </strong><a href="https://www.linkedin.com/in/leahbesajimenez/">LinkedIn</a></div><div><strong>Connect with Kellie du Preez here: </strong><a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></div><div><strong>Connect with Danie Strachan here: </strong><a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a><br><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.<br><br></div><div><br></div><h1><strong>Episode Highlights:<br></strong><br></h1><ul><li>[00:05:21] Using Risk Matrices to Structure Decisions</li></ul><div><br>Risk assessments are used to guide conversations by assigning scores based on impact and likelihood. This helps teams explain their reasoning and makes discussions more structured and less reactive.<br><br></div><ul><li>[00:09:04] Privacy by Design as a Cultural Practice<br><br></li></ul><div>Privacy by design is described as a behavior, not just a process. Embedding privacy depends on how teams think, interact, and raise issues during day-to-day work.<br><br></div><ul><li>[00:10:59] Breaking Risk Into Specific Dimensions<br><br></li></ul><div>Risk is not treated as a single concept. It is broken down into customer impact, compliance impact, potential harm, exercise of rights, and operational cost, allowing for more precise evaluation.<br><br></div><ul><li>[00:14:54] Clarifying Roles in Decision-Making<br><br></li></ul><div>Privacy teams do not make business decisions. 
Their role is to outline the risks associated with each option so that the business can make an informed choice.<br><br></div><ul><li>[00:19:51] Planning for Exercise of Rights Early<br><br></li></ul><div>Teams are expected to consider how customers will exercise their rights as part of the design process, not after implementation.<br><br></div><ul><li>[00:22:10] Starting With Conversation, Not Just Assessment<br><br></li></ul><div>Rather than relying only on formal reviews, early conversations are used to understand what teams want to build and to identify privacy considerations before requirements are finalized.<br><br></div><ul><li>[00:24:18] Moving From Fear to Practical Enablement<br><br></li></ul><div>Privacy often starts as a response to risk or pressure, but the goal is to integrate it into how the organization operates so it supports decision-making rather than blocking it.<br><br></div><ul><li>[00:27:43] Reframing Data as Stewardship<br><br></li></ul><div><br>Customer data is not owned by the company. It is entrusted to the company to process according to what the customer agreed to, which changes how responsibilities are understood.<br><br></div><div><br></div><h1><strong>Episode Resources:<br></strong><br></h1><ul><li>Leah Camilla R. 
Besa-Jimenez on <a href="https://www.linkedin.com/in/leahbesajimenez/">LinkedIn</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 14 Apr 2026 13:29:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wz7xy578.mp3" length="33995614" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/caca6dd0-3806-11f1-9a3f-034f92e0b581/caca6ee0-3806-11f1-a24a-b13991d23a3b.png"/>
      <itunes:duration>2124</itunes:duration>
      <itunes:summary>In this episode of Privacy in Practice, Kellie du Preez and Danie Strachan speak with Leah Camilla R. Besa-Jimenez, Group Head, Enterprise Risk Management at PLDT, about how she approaches privacy inside one of the largest Southeast Asian telecommunications companies. The discussion focuses on privacy as an operational practice and the importance of using risk matrices to structure decisions, coaching teams to exercise judgment, raising privacy issues early in project conversations, and shifting the company mindset from data ownership to data stewardship.</itunes:summary>
      <itunes:subtitle>In this episode of Privacy in Practice, Kellie du Preez and Danie Strachan speak with Leah Camilla R. Besa-Jimenez, Group Head, Enterprise Risk Management at PLDT, about how she approaches privacy inside one of the largest Southeast Asian telecommunications companies. The discussion focuses on privacy as an operational practice and the importance of using risk matrices to structure decisions, coaching teams to exercise judgment, raising privacy issues early in project conversations, and shifting the company mindset from data ownership to data stewardship.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>California Is Watching: Unpacking Enforcement Trends with Daniel Goldberg</title>
      <link>https://podcasts.fame.so/e/1n2r59zn-california-is-watching-unpacking-enforcement-trends-with-daniel-goldberg</link>
      <itunes:title>California Is Watching: Unpacking Enforcement Trends with Daniel Goldberg</itunes:title>
      <itunes:episode>15</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">219938v1</guid>
      <description>California continues to set the pace for U.S. privacy enforcement, and 2025 is proving to be a pivotal year. In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Daniel Goldberg, Partner and Chair of the Data Strategy, Privacy, and Security Group at Frankfurt Kurnit Klein &amp; Selz and 2025 California Privacy Lawyer of the Year, to unpack what’s really happening behind the scenes of CCPA enforcement.
Drawing on his direct experience defending companies in California privacy enforcement actions, Daniel shares firsthand insights from public and non-public investigations, including trends emerging from actions involving companies like Sephora, DoorDash, and Jam City. The conversation explores what regulators actually prioritize, why misconfigured opt-outs and vendor oversight remain the most common pitfalls, and how the new Delete Act and data broker rules could dramatically shift compliance obligations.</description>
      <content:encoded><![CDATA[<div>California continues to set the pace for U.S. privacy enforcement, and 2025 is proving to be a pivotal year. In this episode of <em>Privacy in Practice</em>, hosts Kellie du Preez and Danie Strachan welcome Daniel Goldberg, Partner and Chair of the Data Strategy, Privacy, and Security Group at Frankfurt Kurnit Klein &amp; Selz and 2025 California Privacy Lawyer of the Year, to unpack what’s really happening behind the scenes of CCPA enforcement.<br><br></div><div>Daniel shares firsthand insights from public and non-public investigations, including trends emerging from actions involving companies like Sephora, DoorDash, and Jam City. The conversation explores what regulators actually prioritize, why misconfigured opt-outs and vendor oversight remain the most common pitfalls, and how the new Delete Act and data broker rules could dramatically shift compliance obligations.<br><br></div><div><strong><br>What this episode covers:<br></strong><br></div><ul><li>California’s accelerating enforcement landscape under the CCPA and what’s changed in 2025<br><br></li><li>Why most enforcement actions stem from misconfigured consumer rights processes<br><br></li><li>The rising settlement amounts and what being on notice means for businesses<br><br></li><li>Public vs. 
non-public settlements and what regulators really prioritize<br><br></li><li>Vendor risk exposure, cookie banner failures, and contract deficiencies<br><br></li><li>How to operationalize opt-outs across websites, apps, and connected ecosystems<br><br></li><li>Data broker registration enforcement and what the Delete Act changes in practice<br><br></li></ul><div>And so much more!<br><br><br></div><div><strong>Connect with Daniel Goldberg here: </strong><a href="https://www.linkedin.com/in/danielmgoldberg/">LinkedIn</a></div><div><strong>Connect with Kellie du Preez here: </strong><a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></div><div><strong>Connect with Danie Strachan here: </strong><a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a><br><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.<br><br></div><div><br></div><h1><strong>Episode Highlights:<br></strong><br></h1><ul><li>[02:58] What’s Changing in California Privacy Enforcement<br><br></li></ul><div>California’s privacy regime has moved from abstract compliance talk to sustained, precedent-setting enforcement. What began as a slow, symbolic action after the CCPA’s 2020 rollout has evolved into a coordinated enforcement landscape led by both the California Attorney General and the California Privacy Protection Agency (CPPA). Public cases, such as Sephora, DoorDash, Tilting Point Media, and Jam City only represent part of the picture. 
Across actions, regulators are signaling consistent priorities, including clear notice, lawful targeted advertising practices, functional opt-outs, and compliant vendor contracts.<br><br></div><ul><li>[11:17] How Enforcement Starts<br><br></li></ul><div>Regulatory enforcement can begin the same way customers experience your brand: someone identifies a potential issue while browsing your website, reading about your company, or attempting to exercise their privacy rights. Consumer-facing companies are naturally more visible to regulators, since regulators are consumers too, but the biggest trigger is still complaints, especially when an organization advertises privacy rights and then fails to honor those rights due to misconfiguration, slow response, or a broken process. Many cases aren’t about willful neglect. Often, they stem from gaps between what a company thinks the law requires and what regulators expect in practice. And while companies often study post-investigation website changes for guidance, those changes do not represent regulator endorsement, nor should they be used as a compliance template. The practical move is to think like a regulator.<br><br></div><ul><li>[22:36] Why GDPR Compliance Is Not Enough in the U.S.<br><br></li></ul><div>While the GDPR is rooted in human rights, most U.S. state privacy laws are grounded in consumer protection. That philosophical divide shapes everything from compliance strategy to enforcement risk. U.S. laws tend to focus on notice, opt-outs, unfair or deceptive practices, and contractual safeguards. GDPR, by contrast, is structured around broader accountability and rights-based principles. As enforcement expands and legal challenges grow, companies need more than a checklist. 
An effective privacy strategy requires practical judgment about real-world risk.<br><br></div><ul><li>[45:10] The Cross-Device Opt-Out Obligations<br><br></li></ul><div>Regulators expect companies to honor opt-out signals across all platforms where a user can be reasonably identified. Companies are expected to effectuate an opt-out across connected systems when data links exist or when a user is identifiable within the ecosystem. If systems genuinely don’t connect and it’s not reasonable to link them, the obligation may be narrower. But companies cannot use fragmentation or technical silos as a workaround or excuse for not honoring opt-outs.<br><br></div><ul><li>[53:22] The AI Service Provider Gray Area<br><br></li></ul><div>When an AI vendor uses client data to train its own models, it may no longer qualify as a “service provider” under the CCPA. Danie Strachan draws a direct parallel to the GDPR controller and processor distinction, while Daniel and Kellie discuss why deidentification is the only defensible safe harbor for companies navigating this gray area.<br><br></div><div><br></div><h1><strong>Episode Resources:<br></strong><br></h1><ul><li>Daniel Goldberg on <a href="https://www.linkedin.com/in/danielmgoldberg/">LinkedIn</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website<br></a><br></li></ul><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 17 Mar 2026 10:00:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/8z7xnjlw.mp3" length="64041863" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/1ab95e70-21ec-11f1-a74a-89cc531a24c3/1ab95f70-21ec-11f1-a5d5-13531aaab57e.png"/>
      <itunes:duration>4002</itunes:duration>
      <itunes:summary>California continues to set the pace for U.S. privacy enforcement, and 2025 is proving to be a pivotal year. In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Daniel Goldberg, Partner and Chair of the Data Strategy, Privacy, and Security Group at Frankfurt Kurnit Klein &amp; Selz and 2025 California Privacy Lawyer of the Year, to unpack what’s really happening behind the scenes of CCPA enforcement.
Drawing on his direct experience defending companies in California privacy enforcement actions, Daniel shares firsthand insights from public and non-public investigations, including trends emerging from actions involving companies like Sephora, DoorDash, and Jam City. The conversation explores what regulators actually prioritize, why misconfigured opt-outs and vendor oversight remain the most common pitfalls, and how the new Delete Act and data broker rules could dramatically shift compliance obligations.</itunes:summary>
      <itunes:subtitle>California continues to set the pace for U.S. privacy enforcement, and 2025 is proving to be a pivotal year. In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Daniel Goldberg, Partner and Chair of the Data Strategy, Privacy, and Security Group at Frankfurt Kurnit Klein &amp; Selz and 2025 California Privacy Lawyer of the Year, to unpack what’s really happening behind the scenes of CCPA enforcement.
Drawing on his direct experience defending companies in California privacy enforcement actions, Daniel shares firsthand insights from public and non-public investigations, including trends emerging from actions involving companies like Sephora, DoorDash, and Jam City. The conversation explores what regulators actually prioritize, why misconfigured opt-outs and vendor oversight remain the most common pitfalls, and how the new Delete Act and data broker rules could dramatically shift compliance obligations.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>The Arc of a Cyber Incident and Strategies for Enterprise Response, with Lisa Sotto</title>
      <link>https://podcasts.fame.so/e/68r7265n-the-arc-of-a-cyber-incident-and-strategies-for-enterprise-response-with-lisa-sotto</link>
      <itunes:title>The Arc of a Cyber Incident and Strategies for Enterprise Response, with Lisa Sotto</itunes:title>
      <itunes:episode>14</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">805r7v31</guid>
      <description>In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Lisa Sotto, Chair of the Global Privacy and Cybersecurity Practice at Hunton Andrews Kurth, for a practitioner-level conversation on the full arc of a cyber incident, from first detection through board notification and the regulatory long tail that follows.
Drawing on Lisa’s decades of advising Fortune 500 companies and global regulators, the conversation examines why incident response efforts often fail when confined to IT and how organizations can meet complex international notification requirements with imperfect information. Lisa breaks down real‑world ransomware negotiation dynamics and touches on the Bybit investigation, which has received significant media attention worldwide, as well as what high‑profile cases like the Uber criminal conviction and the Drizly FTC consent order signal for executive accountability. Kellie draws on VeraSafe’s own client experience with cyber insurance and cross-border breach notification, while Danie bridges the US liability discussion to Europe’s NIS 2 Directive and its implications for executive oversight. 

This episode goes far beyond cybersecurity basics. It’s a strategic, practitioner‑level briefing for leadership teams who need to understand not just how incidents unfold, but how to respond effectively under intense regulatory and operational pressure.</description>
      <content:encoded><![CDATA[<div>In this episode of <em>Privacy in Practice</em>, hosts Kellie du Preez and Danie Strachan welcome Lisa Sotto, Chair of the Global Privacy and Cybersecurity Practice at Hunton Andrews Kurth, and a Star Performer for Privacy and Data Security (Chambers and Partners), for a detailed, practitioner-level conversation on how cyber incidents actually unfold, from first anomaly detection through board notification and the regulatory long tail that follows.<br><br></div><div>The discussion traces what Sotto calls “the arc of an incident”: mobilizing the response team under privilege, retaining forensic investigators and extortion negotiators, coordinating with law enforcement agencies, and managing global notification obligations. Kellie raises the practical complexity of locating affected data subjects when address data is unavailable, the cost dynamics of cyber insurance, and why controllers remain responsible for regulatory notification even when the breach originates with a vendor. Danie examines the ethical and legal dimensions of ransomware payment decisions and extends the accountability discussion across the Atlantic to NIS 2's executive liability provisions—noting that several VeraSafe clients were unaware the directive applied to them.<br><br></div><div>Together, the three examine the emerging trend of personal executive accountability through the lens of three landmark matters: the criminal conviction of Uber’s chief security officer, the FTC’s individual consent order against Drizly’s CEO, and the SEC’s recently dismissed action against SolarWinds and its CISO. 
They also explore why boards have evolved from what Lisa describes as “deer in headlights” to active participants in cyber governance, and what practical steps—from tabletop exercises to vendor diligence to immutable backups—separate organizations that survive a breach from those that do not.<br><br><br></div><div><strong>What this episode covers:<br></strong><br></div><ul><li>The current nation-state and criminal threat landscape, including SALT Typhoon, the $1.5B Bybit theft, and DPRK imposter IT workers</li><li>How social engineering and agentic AI have rendered traditional phishing detection obsolete</li><li>The "become aware" notification threshold and the strategic case for early regulatory disclosure</li><li>Why one incident response plan with severity levels outperforms multiple plans</li><li>Ransomware payment decisions: sanctions risk, decryptor reliability, and the limits of criminal promises</li><li>NIS2 executive accountability and the CCPA cybersecurity audit requirements</li><li>How law enforcement agencies operate as strategic partners rather than adversaries during active incidents</li><li>And so much more!</li></ul><div><br><br></div><div>Lisa Sotto is Chair of the Global Privacy and Cybersecurity Practice at Hunton Andrews Kurth. 
Recognized as one of the National Law Journal's 100 most influential lawyers in America and named a Star Performer for Privacy and Data Security by Chambers and Partners, Lisa brings decades of experience advising governments and Fortune 500 companies on cybersecurity incident response and data protection strategy.<br><br></div><div><strong>Connect with Lisa Sotto here: </strong><a href="https://www.linkedin.com/in/lisa-sotto-028b086/">LinkedIn</a></div><div><strong>Connect with Kellie du Preez here: </strong><a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></div><div><strong>Connect with Danie Strachan here: </strong><a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.<br><br><br></div><h1><strong>Episode Highlights:<br></strong><br></h1><ul><li>[05:02] How Ransomware Attacks Really Work Today</li></ul><div>Cybercrime today is not cinematic; it’s routine, opportunistic, and relentless. Financially motivated attackers target any organization they can access, exploiting technical gaps or human weaknesses through social engineering. Once inside, they quietly explore systems, stage sensitive data for exfiltration, and then apply pressure. These groups evolve, disband, and re-form, but their playbook stays consistent: find a vulnerability, take what’s valuable, and extort.<br><br></div><ul><li>[10:27] The First Critical Hours After a Breach</li></ul><div>AI has transformed cybersecurity risk by making sophisticated attacks easy to execute and nearly impossible to spot. Perfectly written phishing emails, deepfake voices, and fake videos have erased the old warning signs, shifting the threat from technical weaknesses to human instincts. 
Urgency, authority, and the desire to be helpful are now the most exploited vulnerabilities. Training still matters, but it’s no longer enough to rely on yesterday’s cues. Organizations must assume deception will look real and build new safeguards to protect people from being manipulated into doing the wrong thing for the right reasons.<br><br></div><ul><li>[21:10] Why Cyber Is a Leadership and Board Responsibility</li></ul><div>Cybersecurity incidents don’t fail organizations because of technology alone. They fail when teams operate in silos. A breach is an enterprise-wide crisis that requires coordinated action across IT, security, legal, privacy, communications, HR, risk, and audit, with consistent internal and external messaging. Daily alignment is essential. Equally important, involving law enforcement early can materially improve outcomes. These agencies treat companies as victims, share threat intelligence, and help map attacker tactics, while collaboration may later support prosecution through bodies like the US Department of Justice.<br><br></div><ul><li>[10:30] The Arc of a Cyber Incident: Mobilising the Response Under Privilege</li></ul><div>Lisa walks through the full incident lifecycle: anomaly detection (or the dreaded media call), mobilising the pre-assembled response team through out-of-band communications, retaining forensic investigators under legal privilege, engaging extortion negotiators, coordinating with law enforcement, and navigating cyber and data protection notification obligations across jurisdictions with timelines ranging from 3 hours (Chile) to 72 hours (EU). 
She describes live threat actors listening on incident response calls and the necessity of forcing cameras on.&nbsp;<br><br></div><ul><li>[31:19] Personal Liability: From the Uber Conviction to the SolarWinds Dismissal</li></ul><div>The discussion traces the three landmark cases establishing the trend of individual executive accountability: (1) the criminal conviction of Uber’s CSO for concealing a breach; (2) the FTC’s consent order that followed Drizly’s CEO personally for ten years, requiring him to implement security measures at any company where he holds a leadership role; and (3) the SEC’s action against SolarWinds’ CISO for allegedly misrepresenting security posture, which was recently dismissed. Strachan adds NIS2’s executive accountability provisions and du Preez notes the CCPA cybersecurity audit requirements — critically observing that these apply to companies with revenue as low as $50M, not just billion-dollar enterprises.<br><br></div><ul><li>[44:40] Ransomware Negotiations: Sanctions, Reliability, and the Ethics of Payment</li></ul><div>This section provides insights into a practitioner’s framework for the ransom payment decision: when payment for a decryptor may be the only alternative to shutting doors, why immutable backups change the calculus, why paying for data deletion rarely works (criminals may not delete, and notification obligations remain regardless), how sanctions screening determines whether payment is legally permissible, and the sobering reality that some decryptors arrive corrupted and some companies are re-extorted after paying.<br><br></div><div><br></div><h1><strong>Episode Resources:</strong></h1><ul><li>Lisa Sotto on <a href="https://www.linkedin.com/in/lisa-sotto-028b086/">LinkedIn</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li><li>VeraSafe <a 
href="https://verasafe.com/">Website</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 17 Feb 2026 10:00:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wqym1llw.mp3" length="48692312" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/e46d2790-0be9-11f1-96b1-dfa908e082f2/e46d28a0-0be9-11f1-863e-e9e9a63583c8.png"/>
      <itunes:duration>3043</itunes:duration>
      <itunes:summary>In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Lisa Sotto, Chair of the Global Privacy and Cybersecurity Practice at Hunton Andrews Kurth, for a practitioner-level conversation on the full arc of a cyber incident, from first detection through board notification and the regulatory long tail that follows.
Drawing on Lisa’s decades of advising Fortune 500 companies and global regulators, the conversation examines why incident response efforts often fail when confined to IT and how organizations can meet complex international notification requirements with imperfect information. Lisa breaks down real‑world ransomware negotiation dynamics and touches on the Bybit investigation, which has received significant media attention worldwide, as well as what high‑profile cases like the Uber criminal conviction and the Drizly FTC consent order signal for executive accountability. Kellie draws on VeraSafe’s own client experience with cyber insurance and cross-border breach notification, while Danie bridges the US liability discussion to Europe’s NIS 2 Directive and its implications for executive oversight.

This episode goes far beyond cybersecurity basics. It’s a strategic, practitioner‑level briefing for leadership teams who need to understand not just how incidents unfold, but how to respond effectively under intense regulatory and operational pressure.</itunes:summary>
      <itunes:subtitle>In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Lisa Sotto, Chair of the Global Privacy and Cybersecurity Practice at Hunton Andrews Kurth, for a practitioner-level conversation on the full arc of a cyber incident, from first detection through board notification and the regulatory long tail that follows.
Drawing on Lisa’s decades of advising Fortune 500 companies and global regulators, the conversation examines why incident response efforts often fail when confined to IT and how organizations can meet complex international notification requirements with imperfect information. Lisa breaks down real‑world ransomware negotiation dynamics and touches on the Bybit investigation, which has received significant media attention worldwide, as well as what high‑profile cases like the Uber criminal conviction and the Drizly FTC consent order signal for executive accountability. Kellie draws on VeraSafe’s own client experience with cyber insurance and cross-border breach notification, while Danie bridges the US liability discussion to Europe’s NIS 2 Directive and its implications for executive oversight.

This episode goes far beyond cybersecurity basics. It’s a strategic, practitioner‑level briefing for leadership teams who need to understand not just how incidents unfold, but how to respond effectively under intense regulatory and operational pressure.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>How CBPR Certification Builds Trust and Enables Global Scale, with Charmian Aw</title>
      <link>https://podcasts.fame.so/e/xny7900n-how-cbpr-certification-builds-trust-and-enables-global-scale-with-charmian-aw</link>
      <itunes:title>How CBPR Certification Builds Trust and Enables Global Scale, with Charmian Aw</itunes:title>
      <itunes:episode>13</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">l14rmpp1</guid>
      <description>In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Charmian Aw, Partner at Hogan Lovells, to examine the growing relevance of the Cross-Border Privacy Rules (CBPR) System in an increasingly global data economy. Learn why organizations such as Cisco, Mastercard, and Alibaba have obtained certification, why the framework is gaining renewed attention among multinational organizations, and how it complements existing transfer mechanisms such as Standard Contractual Clauses (SCCs). The conversation also explores how CBPR certification plays a role in procurement, regulatory cooperation, and the evolution of responsible data processing.</description>
      <content:encoded><![CDATA[<div>In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Charmian Aw, Partner at Hogan Lovells, to examine the growing relevance of the Cross-Border Privacy Rules (CBPR) System in an increasingly global data economy. Learn why organizations such as Cisco, Mastercard, and Alibaba have obtained certification, why the framework is gaining renewed attention among multinational organizations, and how it complements existing transfer mechanisms such as Standard Contractual Clauses (SCCs). The conversation also explores how CBPR certification plays a role in procurement, regulatory cooperation, and the evolution of responsible data processing.<br><br></div><div><strong><br>What You'll Learn:</strong>&nbsp;<br><br></div><ul><li>Why the CBPR System is gaining momentum globally beyond APEC</li><li>The commercial case for pursuing both PRP and CBPR certification</li><li>How certification actually works</li><li>The competitive advantage hiding in your procurement checklist</li><li>Why AI and healthcare use cases are accelerating CBPR adoption</li><li>How the Global Cross-Border Privacy Enforcement Arrangement (Global CAPE) enables regulators to share information and coordinate cross-border investigations</li><li>Why regulatory recognition matters and how it may evolve as more jurisdictions join</li><li>And so much more!<br><br></li></ul><div><br>Charmian Aw is a leading privacy and cybersecurity advisor with deep knowledge of frontier technologies, including AI, data protection frameworks, and international compliance strategy. In this episode, she shares insights from the recent Global CBPR forum, which she attended alongside VeraSafe, a recognized CBPR Accountability Agent. The discussion offers a practical and engaging look at how regulatory developments translate into real-world operations and commercial outcomes. 
Together we discuss how organizations can move beyond compliance as a checkbox to use accountability frameworks such as the CBPR and PRP Systems to support trust, scalability, and business value.<br><br><br></div><div><strong>Connect with Charmian Aw here: </strong><a href="https://www.linkedin.com/in/charmian-aw-40bb158a/">LinkedIn<strong><br></strong></a><strong>Connect with Kellie du Preez here: </strong><a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></div><div><strong>Connect with Danie Strachan here: </strong><a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn<br></a><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.<br><br><br></div><h1><strong>Episode Highlights:</strong></h1><ul><li>[04:45] Why the CBPR Deserves More Attention</li></ul><div><br></div><div>The CBPR is widely underhyped because many organizations still approach it with the wrong mindset, treating it as a nice-to-have rather than a strategic, regulator-backed tool. As data flows across borders faster than regulation can keep up, relying on fragmented, country-by-country compliance is neither scalable nor sustainable. CBPR reframes that conversation by offering a unified, multi-stakeholder framework that is designed to support continuous, compliant data transfers across diverse jurisdictions with visible regulatory participation and endorsement<br><br></div><ul><li>[17:40] How Certification Actually Works</li></ul><div><br></div><div>To earn CBPR certification, organizations must apply in their home country through an approved accountability agent, an auditor-like partner who evaluates whether your privacy program meets the framework’s principles. The process involves completing a structured assessment, closing identified gaps, and maintaining ongoing compliance through annual renewals. 
Importantly, certification requires core privacy practices to already be in place and encourages a truly holistic privacy program.<br><br></div><ul><li>[23:51] Overlap With ISO, SOC 2, and GDPR</li></ul><div><br></div><div>CBPR certification strategically aligns with existing frameworks, such as GDPR, ISO 27001, SOC 2, and the Data Privacy Framework (DPF). Many of the same controls already exist in your privacy and security programs, making CBPR a natural next step rather than a reinvention of the wheel. More importantly, its value isn’t measured by the cost of certification but rather by the trust signal it sends to customers, regulators, and procurement teams. As more organizations add CBPR certification to procurement and vendor risk checklists, failing to adopt it risks becoming a competitive disadvantage. The true ROI lies in regulatory endorsement, market confidence, and being positioned ahead of the compliance curve.<br><br></div><ul><li>[28:51] Practical Business Benefits</li></ul><div><br></div><div>Organizations relying solely on adequacy decisions or SCCs are betting their compliance on paperwork that few people truly understand or can operationalize. Global CBPR certification flips that dynamic: instead of signing complex, non-negotiable contracts and hoping the business can keep up, it delivers a regulator-endorsed, trusted compliance stamp. 
By requiring real assessments, cross-functional involvement, and evidence-based governance, Global CBPR transforms data transfer compliance from a legal checkbox into a practical, scalable framework that reduces contract risk, builds customer trust, and future-proofs operations in a fragmented regulatory world.<br><br><br></div><h1><strong>Episode Resources:</strong></h1><ul><li>Charmian Aw on <a href="https://www.linkedin.com/in/charmian-aw-40bb158a/">LinkedIn</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 20 Jan 2026 10:22:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/853k9x28.mp3" length="55577354" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/6bcff140-f5ea-11f0-a413-4b955ca943d9/6bcff240-f5ea-11f0-8fc0-c74f0e6310f4.png"/>
      <itunes:duration>3473</itunes:duration>
      <itunes:summary>In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Charmian Aw, Partner at Hogan Lovells, to examine the growing relevance of the Cross-Border Privacy Rules (CBPR) System in an increasingly global data economy. Learn why organizations such as Cisco, Mastercard, and Alibaba have obtained certification, why the framework is gaining renewed attention among multinational organizations, and how it complements existing transfer mechanisms such as Standard Contractual Clauses (SCCs). The conversation also explores how CBPR certification plays a role in procurement, regulatory cooperation, and the evolution of responsible data processing.</itunes:summary>
      <itunes:subtitle>In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Charmian Aw, Partner at Hogan Lovells, to examine the growing relevance of the Cross-Border Privacy Rules (CBPR) System in an increasingly global data economy. Learn why organizations such as Cisco, Mastercard, and Alibaba have obtained certification, why the framework is gaining renewed attention among multinational organizations, and how it complements existing transfer mechanisms such as Standard Contractual Clauses (SCCs). The conversation also explores how CBPR certification plays a role in procurement, regulatory cooperation, and the evolution of responsible data processing.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>Privacy Challenges in the Gaming Industry, with Alex Roberts</title>
      <link>https://podcasts.fame.so/e/1np7pvq8-privacy-challenges-in-the-gaming-industry-with-alex-roberts</link>
      <itunes:title>Privacy Challenges in the Gaming Industry, with Alex Roberts</itunes:title>
      <itunes:episode>12</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">z0r4v380</guid>
      <description>Gaming is now one of the world's biggest entertainment industries, and with growth comes scrutiny. In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Alex Roberts, Partner and Head of TMT in China at Linklaters, to explore how gaming companies navigate privacy compliance across multiple jurisdictions. From children's online safety requirements in APAC to class action litigation trends in the U.S., the discussion offers practical insights on building privacy programs that work globally. Whether you're managing vendor relationships, responding to data subject requests, or trying to understand regional regulatory variations, this episode sheds light on how to approach privacy compliance in one of the world's most dynamic and complex industries.</description>
      <content:encoded><![CDATA[<div>The gaming industry has surpassed Hollywood as one of the world's largest entertainment sectors, projected to reach $500 billion by 2030. But with this explosive growth comes increasing regulatory scrutiny across every major market. In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Alex Roberts, Partner and Head of TMT in China at Linklaters, for an in-depth conversation about privacy and data protection compliance in the global gaming industry.</div><div><br><strong><br>What You'll Learn:</strong>&nbsp;</div><ul><li>Why gaming’s rapid growth is drawing heightened attention from privacy regulators</li><li>How APAC privacy laws have evolved beyond their "lax" reputation</li><li>The nuances of children's privacy and online safety requirements across major jurisdictions</li><li>How organizations can think about establishing a practical&nbsp; "high watermark" for global compliance</li><li>Why the UK Online Safety Act introduces particularly demanding obligations for gaming companies</li><li>How data subject access requests are being used strategically in litigation</li><li>The role of third-party age verification providers in meeting compliance requirements</li><li>How enforcement and&nbsp; litigation trends differ between APAC, Europe, and the U.S.</li><li>Why privacy notices are increasingly scrutinized in investigations and lawsuits</li><li>The importance of vendor due diligence in privacy compliance</li><li>How unsanctioned or “shadow” AI tools introduce unexpected privacy and security risks</li><li>Practical strategies for building collaborative, cross-functional privacy programs</li><li>And so much more!<br><br></li></ul><div><br></div><div><strong>Alex Roberts</strong> is a Partner and Head of Technology, Media &amp; Telecommunications (TMT) in China at Linklaters, where he also co-leads the firm's global gaming sector practice. 
With over thirteen years of experience in the Asia-Pacific region, Alex has advised financial institutions, corporates, and fintechs operating in heavily regulated sectors, helping them navigate the complex legal and regulatory landscape in gaming and technology. His work spans multiple jurisdictions and regulatory frameworks, offering clients practical guidance on balancing innovation with compliance in one of the world's most dynamic markets. Alex brings a collaborative, communications-focused approach to privacy compliance, recognizing that technical legal skills must be paired with effective stakeholder engagement to build sustainable privacy programs.<br><br><br></div><div><strong>Connect with Alex Roberts here: </strong><a href="https://www.linkedin.com/in/alex-roberts-%E7%BD%97%E4%B8%96%E8%BD%A9/">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.<br><br><br></div><div><strong>Episode Highlights:<br></strong><br></div><div><strong>[00:02:34] Why Gaming Has Become a Regulatory Hotspot<br></strong>Gaming has surpassed Hollywood to become one of the world's largest entertainment industries, valued at approximately $300 billion and projected to grow to $500 billion by 2030. With that scale has come heightened regulatory attention across major jurisdictions. As Alex explains, whenever you have a hot market—whether gaming or other tech sectors—regulators focus on whether users are protected and whether businesses have appropriate long-term safeguards. Data plays a central role in the gaming industry, from personalizing player experiences to supporting advertising and partnership models. As a result, regulators in APAC, Europe, and the U.S. 
are strengthening their approaches to data protection in the gaming sector, creating an increasingly complex compliance landscape for companies operating globally.<br><br><br></div><div><strong>[00:06:31] Children's Online Safety: The Gaming Industry's Critical Challenge<br></strong>Children's online safety has become a critical regulatory focus for gaming companies, driven by concerning statistics showing that approximately one-fifth of gamers report receiving hateful and abusive language in gaming environments over the past twelve months. Australia has taken a leadership role in this area, with other APAC markets also paying closer attention to how children’s data is handled. Most gaming companies have their users' best interests at heart, but moderating and helping people stay safe in digital environments requires clear safeguards, particularly for young audiences. This is one of the areas where gaming companies and regulators are focusing most intensely, with requirements ranging from enhanced consent mechanisms to age-verification expectations and other safeguards aimed at protecting younger users.<br><br><br></div><div><strong>[00:08:01] The "GDPR Effect" Across APAC<br></strong>The misconception that APAC has lax privacy regulations is largely outdated. Alex notes that roughly thirteen jurisdictions in the region have drawn heavily from the GDPR, with some adopting large portions of it and others incorporating key principles. This isn't a region where privacy regulations are lacking anymore. For example, Singapore's privacy framework reflects many GDPR-aligned principles, while China has implemented what Alex describes as "GDPR plus with Chinese characteristics." This regional convergence around European-style privacy frameworks means gaming companies can no longer treat APAC as a regulatory-light environment. 
However, the similarities also create opportunities—organizations can identify their "high watermark" jurisdiction (the one with the strictest requirements among their key markets) and use that as a baseline for their compliance program, adjusting for market-specific nuances rather than building entirely separate programs for each country.<br><br><br></div><div><strong>[00:31:35] Shadow AI and the Communication Imperative for Privacy Professionals<br></strong>One of the growing risks emerging in privacy compliance is "Shadow AI"—employees using consumer versions of AI tools without organizational knowledge or approval. IBM statistics indicate that 90% of AI tools used within organizations have not been signed off by CISO teams. Someone might casually mention they "normally just use ChatGPT" for certain tasks, suddenly revealing that confidential information has been fed into public AI models. This creates immediate compliance risks related to data security, confidentiality, and the potential exposure of proprietary information. Addressing this challenge requires more than technical controls—it depends heavily on effective communication and collaboration across legal, operations, technology, design, and marketing teams. Privacy professionals who can communicate effectively and build strong cross-functional relationships will continue to add value even as AI transforms other aspects of legal work. 
As Alex emphasizes, "You can still have a job if you can talk to other people and communicate effectively—that's only going to become more and more important in this space."<br><br><br></div><div><strong>Episode Resources:<br></strong><br></div><ul><li>Alex Roberts on <a href="https://www.linkedin.com/in/alex-roberts-%E7%BD%97%E4%B8%96%E8%BD%A9/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Wed, 10 Dec 2025 10:00:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wqymxv4w.mp3" length="95475840" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/5862b4e0-d50e-11f0-a568-470fb158557f/5862b600-d50e-11f0-abfa-ef238a9da7aa.png"/>
      <itunes:duration>2386</itunes:duration>
      <itunes:summary>Gaming is now one of the world's biggest entertainment industries, and with growth comes scrutiny. In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Alex Roberts, Partner and Head of TMT in China at Linklaters, to explore how gaming companies navigate privacy compliance across multiple jurisdictions. From children's online safety requirements in APAC to class action litigation trends in the U.S., the discussion offers practical insights on building privacy programs that work globally. Whether you're managing vendor relationships, responding to data subject requests, or trying to understand regional regulatory variations, this episode sheds light on how to approach privacy compliance in one of the world's most dynamic and complex industries.</itunes:summary>
      <itunes:subtitle>Gaming is now one of the world's biggest entertainment industries, and with growth comes scrutiny. In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Alex Roberts, Partner and Head of TMT in China at Linklaters, to explore how gaming companies navigate privacy compliance across multiple jurisdictions. From children's online safety requirements in APAC to class action litigation trends in the U.S., the discussion offers practical insights on building privacy programs that work globally. Whether you're managing vendor relationships, responding to data subject requests, or trying to understand regional regulatory variations, this episode sheds light on how to approach privacy compliance in one of the world's most dynamic and complex industries.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>Lessons from Ireland's Former Data Protection Commissioner with Helen Dixon</title>
      <link>https://podcasts.fame.so/e/rnklpvz8-lessons-from-ireland-s-former-data-protection-commissioner-helen-dixon</link>
      <itunes:title>Lessons from Ireland's Former Data Protection Commissioner with Helen Dixon</itunes:title>
      <itunes:episode>11</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">70wjlw21</guid>
      <description>What does a regulator really want to see when they investigate your company? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Helen Dixon, Ireland's former Data Protection Commissioner, for an unprecedented conversation about privacy compliance from the regulator's perspective. Helen shares practical insights on building effective privacy programs with limited budgets, handling data subject access requests without triggering complaints, and what actually matters when regulators assess compliance efforts. Whether you're running a growing SME or managing privacy for a larger organization, this episode offers invaluable guidance on navigating GDPR requirements with common sense, fairness, and pragmatism.</description>
      <content:encoded><![CDATA[<div>What really happens when a regulator investigates your organization? And more importantly, what can you do to stay off their radar while building a sustainable privacy program? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Helen Dixon, Ireland's former Data Protection Commissioner, for a candid conversation about privacy compliance from the regulator's perspective. With over ten years leading one of Europe's most influential data protection authorities—overseeing landmark cases including Schrems litigation and levying over €3 billion in GDPR fines against tech giants—Helen brings unparalleled insights into what regulators actually look for in privacy programs.<br><br></div><div><strong><br>What You'll Learn:</strong>&nbsp;<br><br></div><ul><li>Why fairness and common sense matter more than perfect documentation</li><li>How to prioritize privacy compliance with a €100K budget (or limited time)</li><li>The most common mistakes SMEs make that trigger regulatory complaints</li><li>Why engaging with data subjects during access requests can prevent escalation</li><li>What regulators really think about risk registers and documented compliance gaps</li><li>How cooperative behavior influences regulatory outcomes (even though the GDPR doesn't explicitly say so)</li><li>The difference between EU and U.S. regulatory approaches</li><li>Why physical security and CCTV remain surprisingly common compliance pitfalls</li><li>How to handle the e-privacy compliance challenge across fragmented EU member states</li><li>The empathy regulators have for DPOs and privacy professionals facing organizational resistance</li><li>And so much more!<br><br></li></ul><div><br>Helen Dixon most recently served as Ireland's Commissioner for Communications Regulation. 
Prior to that, she was the Irish Data Protection Commissioner for ten years, overseeing several landmark cases including the high-profile Schrems litigation and rulings against prominent companies such as Meta, and investigations into Twitter, TikTok, Apple, and others. Her office levied over €3 billion worth of fines for GDPR violations and, in some cases more critically, imposed important corrective actions against some of the world's largest tech companies. Helen's tenure spanned many eras in data protection, from leading a small, remote DPC through the waves of GDPR implementation to ultimately building a powerhouse office of several hundred staff. She has been called "the world's first global privacy regulator" and "a paragon of judicious, balanced, disciplined, and principled enforcement and regulation." She is currently in a listening and learning phase, having joined the board of an environmental organization focused on circular economy and packaging waste, while preparing to return to data protection and digital regulation work.<br><br><br></div><div><strong>Connect with Helen Dixon here: </strong><a href="https://www.linkedin.com/in/helen-dixon-1765318/">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.</div><div><br><br></div><div><strong>Episode Highlights:</strong></div><div><strong><br>[00:15:24] Fairness and Common Sense: The Underrated Compliance Strategy<br></strong><br></div><div>Helen emphasizes that for SMEs in particular, applying fairness and common sense can prevent many compliance problems from escalating. While this sounds simple, it's remarkably powerful in practice. 
She observed repeatedly as a regulator how organizations made situations exponentially worse by trying to restrict information, delay responses, or refuse to accept their obligations under the GDPR. When faced with a data subject request or complaint, the instinct to be defensive or minimize disclosure often backfires spectacularly. Instead, approaching situations with genuine fairness—asking "what's the right thing to do here for this person?"—and common sense problem-solving frequently resolves issues before they become formal complaints or investigations. This doesn't mean organizations should ignore legitimate exemptions or protections, but rather that the starting point should be cooperation and good faith rather than adversarial positioning. Privacy professionals should advocate internally for this approach, helping business stakeholders understand that fairness and transparency typically cost less and create better outcomes than defensive strategies. This principle applies across GDPR compliance—from how you handle access requests to how you design data collection practices to how you respond to regulator inquiries.<br><br><br></div><div><strong>[00:22:12] How to Spend €100K on Privacy Compliance (If You Had It)<br></strong><br></div><div>When asked how she would prioritize a hypothetical €100K privacy budget for a 100-employee company, Helen provides a comprehensive roadmap that reveals regulatory thinking about what actually matters. First, bring in a third party to conduct a gap analysis and refresh any areas that need updating. Start with fundamental questions about what business you're actually in, what data supports that business, what categories of personal data you hold, whether legacy data can be disposed of, how data is secured and stored, and what jurisdictional issues arise. This foundational data mapping and inventory work is non-negotiable. 
Second, conduct a thorough risk assessment informed by recent case studies and case law—identifying particular exposure areas like website cookies (given French authority enforcement trends), data transfers and associated documentation, or specific high-risk processing activities. Third, review all documentation including internal policies, procedures, and templates that staff routinely use (especially for acknowledging and responding to data subject rights), plus public-facing privacy policies and notices. Fourth, address "peripheral" issues that seem trivial but trip organizations up—like event photography, delegate lists, and physical security measures. Finally, if budget remains, invest in refreshed staff training that goes beyond compliance boxes to help people understand the purpose and value of the GDPR, perhaps bringing in external speakers to enliven the topic. For organizations with limited budgets, Helen's framework provides a prioritization methodology: data mapping and risk assessment first, then documentation, then training and culture.<br><br><br></div><div><strong>[00:28:57] The Art of Handling Data Subject Access Requests<br></strong><br></div><div>Helen offers invaluable advice on handling data subject access requests (DSARs) that reveals how regulators assess organizational behavior. First and most importantly: don't approach access requests with excessive emotion or spend energy speculating about the requester's "real" motivations, even when you know it stems from a grievance. Accept that individuals have rights under the GDPR regardless of their reasons. Second, don't be afraid to engage directly with data subjects—in fact, Helen actively encourages it. While the request may come in writing, making contact to understand what they're actually looking for can dramatically limit the scope of work and satisfy their needs more effectively. 
If you assume reasonableness on the part of the requester (which is often justified), they frequently have a specific thing they're seeking for a specific purpose, and understanding that context allows you to provide responsive information without unnecessary over-disclosure. Third, for extensive requests where searches will be disproportionately burdensome, engagement is essential for communicating timelines, processes, and the scope of proportionate searches. Fourth, communicate clearly about what you're doing and involve the data subject in understanding how you'll fulfill the request. Helen acknowledges these situations are "incredibly difficult" because they often arise in contested circumstances, but looking at case studies helps organizations learn what makes situations easier versus harder. The overarching principle: bringing interactions back to a human level, assuming good faith, and avoiding cynical or suspicious approaches serves everyone better and often prevents escalation to formal complaints.<br><br><br></div><div><strong>[00:44:06] What Regulators Really Think About Compliance Efforts (Even If the GDPR Doesn't Say So)<br></strong><br></div><div>In one of the episode's most revealing moments, Helen candidly explains the gap between what the GDPR technically requires and how regulatory enforcement actually works in practice. The GDPR doesn't explicitly set out prioritization methods—it theoretically requires DPAs to investigate every complaint from over 400 million EU persons, doesn't prioritize by organization type or size (beyond its general risk-based approach), and suggests imposing corrective measures and considering fines in every case of infringement. But in reality, this approach would be completely unworkable, tying up all regulatory resources in court cases and administration without achieving meaningful behavior change. So what actually happens? 
Helen explains that "in practice, how compliant and cooperative and course-correcting an organization is as you're dealing with them does count." For one-off complaints from organizations that haven't previously come to regulatory attention, if you can see they made an honest mistake, did the analysis but took the wrong fork in the road, or are genuinely trying to comply, regulators will often simply get satisfaction for the data subject and move on to focus on bigger systemic risks, repeat offenders, or situations where organizations clearly should have known better and failed to implement appropriate safeguards. This doesn't mean organizations should document known compliance gaps carelessly, but it does mean that demonstrating good faith, cooperation, and willingness to course-correct matters significantly—even though the GDPR doesn't explicitly provide for this distinction. Privacy professionals should understand this reality when advising on regulatory strategy and engagement approaches.<br><br><br></div><div><strong>Episode Resources:<br></strong><br></div><ul><li>Helen Dixon on <a href="https://www.linkedin.com/in/helen-dixon-1765318/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 11 Nov 2025 10:43:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/8l4m06v8.mp3" length="43481704" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/80a11ce0-bbf2-11f0-81a0-e9ff40f06aff/80a11e80-bbf2-11f0-b59f-85ae27557dd1.png"/>
      <itunes:duration>3262</itunes:duration>
      <itunes:summary>What does a regulator really want to see when they investigate your company? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Helen Dixon, Ireland's former Data Protection Commissioner, for an unprecedented conversation about privacy compliance from the regulator's perspective. Helen shares practical insights on building effective privacy programs with limited budgets, handling data subject access requests without triggering complaints, and what actually matters when regulators assess compliance efforts. Whether you're running a growing SME or managing privacy for a larger organization, this episode offers invaluable guidance on navigating GDPR requirements with common sense, fairness, and pragmatism.</itunes:summary>
      <itunes:subtitle>What does a regulator really want to see when they investigate your company? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Helen Dixon, Ireland's former Data Protection Commissioner, for an unprecedented conversation about privacy compliance from the regulator's perspective. Helen shares practical insights on building effective privacy programs with limited budgets, handling data subject access requests without triggering complaints, and what actually matters when regulators assess compliance efforts. Whether you're running a growing SME or managing privacy for a larger organization, this episode offers invaluable guidance on navigating GDPR requirements with common sense, fairness, and pragmatism.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>Understanding PETs with Monisha Varadan</title>
      <link>https://podcasts.fame.so/e/1np7xlj8-understanding-pets-with-monisha-varadan</link>
      <itunes:title>Understanding PETs with Monisha Varadan</itunes:title>
      <itunes:episode>10</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">z0r45zn0</guid>
      <description>Privacy enhancing technologies sound great in theory, but how do you actually implement them? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Monisha Varadan, EMEA Privacy Lead at Google, to explore the practical realities of implementing PETs. From differential privacy to synthetic data, Monisha demystifies these cutting-edge tools and explains how organizations can leverage them to enable business innovation while protecting user privacy. Whether you're a privacy professional exploring PETs for the first time or a technical leader looking to balance privacy with utility, this episode offers practical guidance on when, why, and how to use privacy enhancing technologies effectively.</description>
      <content:encoded><![CDATA[<div>Privacy enhancing technologies (PETs) have evolved from academic concepts to practical tools that can transform how organizations handle sensitive data. In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Monisha Varadan, EMEA Privacy Lead at Google, for an in-depth exploration of how PETs work in practice and why they matter more than ever in the age of AI.<br><br><br></div><div><strong>What You'll Learn:</strong>&nbsp;<br><br></div><ul><li>Why PETs are business enablers, not just compliance tools</li><li>The difference between privacy by design philosophy and PETs as implementation tools</li><li>How Google uses differential privacy in real-world products like Maps and spam detection</li><li>Why synthetic data matters for AI model training and its privacy limitations</li><li>When to think about PETs in the product development lifecycle (hint: before your DPIA)</li><li>The gap between conceptual and practical PETs and how to bridge it</li><li>Why using a PET doesn't automatically tick GDPR compliance boxes</li><li>How the PETs landscape is becoming more accessible through startups and open-source libraries</li><li>Which industries are leading PET adoption and why</li><li>The role of regulators in advancing PET implementation</li><li>And so much more!</li></ul><div><br><br></div><div>Monisha Varadan is the EMEA Privacy Lead and Senior Privacy Engineer at Google, based in Paris, where she sits in the DPO office within the Privacy, Safety, and Security organization. With over twenty years of experience working across various industries and geographies, including previous roles as Head of Chrome Partnerships for APAC at Google, Monisha brings a unique perspective on privacy that spans technical solutions, regional regulatory variations, and cultural differences in privacy expectations. She is also a Lecturer in Corporate Entrepreneurship at INSEAD and Partner at Zephyr Ventures. 
Her work focuses on helping organizations understand that privacy is both a cultural and technical change process, and she has become a leading voice in making privacy enhancing technologies more accessible and practical for organizations of all sizes.<br><br></div><div><strong>Connect with Monisha Varadan here: </strong><a href="https://www.linkedin.com/in/monishavaradan/?utm_source=chatgpt%2Ecom&amp;originalSubdomain=fr">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.<br><br><br></div><div><strong>Episode Highlights:<br></strong><br></div><div><strong>[00:09:59] PETs as Business Enablers, Not Just Compliance Tools<br></strong><br></div><div>The conversation about PETs often centers on compliance, but Monisha reframes them as powerful business enablers. Privacy enhancing technologies make previously impossible use cases possible by allowing organizations to use data in privacy-preserving ways. Rather than viewing PETs solely as mechanisms to comply with regulations like GDPR, privacy professionals should present them to business stakeholders as tools that unlock new opportunities. This shift in framing is crucial for getting buy-in from revenue teams and C-level executives who may otherwise see privacy as a cost center or obstacle to innovation. For organizations considering PET implementation, the key message is that these technologies can solve real business problems—enabling data sharing, analysis, and innovation that would otherwise be blocked by privacy concerns. 
Privacy professionals should learn to speak the language of business enablement when advocating for PET adoption.<br><br><br></div><div><strong>[00:13:29] When to Think About PETs: Before Your DPIA<br></strong><br></div><div>Monisha challenges the common practice of considering PETs only at the DPIA stage or when trying to mitigate privacy risks after a product is already designed. Instead, organizations should think about PETs from the very beginning—when first developing a product idea that involves data use. Integrating PET consideration into the earliest stages of product development makes implementation much more intuitive and effective than retrofitting privacy solutions later. This early integration aligns with true privacy by design principles, where privacy isn't an afterthought but a fundamental part of the product development process. For privacy professionals, this means engaging with product teams at the ideation stage, not just during formal privacy assessments. Organizations that make PET consideration instinctive and cultural from the start will find implementation significantly easier than those trying to solve privacy problems after the fact.<br><br><br></div><div><strong>[00:19:03] Real-World PETs: Google Maps Busyness and COVID Mobility Reports<br></strong><br></div><div>Monisha brings PETs to life with concrete examples from Google products. The "busyness times" feature in Google Maps uses differential privacy by introducing noise into location data, allowing users to see crowd levels at locations without revealing individual movements. This same technology powered COVID-19 mobility reports that helped governments understand movement patterns during the pandemic without compromising individual privacy. These examples demonstrate that the goal isn't to know exactly who was where, but rather to understand aggregate patterns—the number of people, not their identities. 
Privacy professionals can use these examples to help stakeholders understand that PETs allow you to get the insights you need while protecting individual privacy. The key is being clear about what you actually need from the data versus what would be nice to have, and designing your data collection accordingly.<br><br><br></div><div><strong>[00:30:13] The Reality Check: PETs Are Not Silver Bullets<br></strong><br></div><div>Throughout the conversation, Monisha provides crucial reality checks about PET implementation. It's not as easy as taking a PET "off the shelf" and solving all your problems—it takes significant work, iteration, and careful selection of the right tool for the right use case. There's a substantial gap between conceptual PETs (how they work in theory) and practical PETs (how to actually implement them), and organizations are still building bridges across this gap. Using a PET doesn't automatically mean you've "ticked a GDPR box"—you still need to think carefully about how you're using the technology and the data. However, despite these challenges, progress is happening. The PET landscape is becoming more accessible through startups building practical implementations, open-source libraries, and better tooling. 
For privacy professionals just starting to explore PETs, Monisha's advice is clear: don't be discouraged by the complexity, but do approach implementation with realistic expectations, invest in understanding which PET solves which problem, and be prepared for iteration and experimentation.<br><br><br></div><div><strong>Episode Resources:<br></strong><br></div><ul><li>Monisha Varadan on <a href="https://www.linkedin.com/in/monishavaradan/?utm_source=chatgpt%2Ecom&amp;originalSubdomain=fr">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 21 Oct 2025 10:00:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wmkmv5zw.mp3" length="57949321" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/47b1d250-ae4b-11f0-b3ce-a94e1cfb8073/47b1d370-ae4b-11f0-80b9-75fd937fa2f8.png"/>
      <itunes:duration>2171</itunes:duration>
      <itunes:summary>Privacy enhancing technologies sound great in theory, but how do you actually implement them? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Monisha Varadan, EMEA Privacy Lead at Google, to explore the practical realities of implementing PETs. From differential privacy to synthetic data, Monisha demystifies these cutting-edge tools and explains how organizations can leverage them to enable business innovation while protecting user privacy. Whether you're a privacy professional exploring PETs for the first time or a technical leader looking to balance privacy with utility, this episode offers practical guidance on when, why, and how to use privacy enhancing technologies effectively.</itunes:summary>
      <itunes:subtitle>Privacy enhancing technologies sound great in theory, but how do you actually implement them? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Monisha Varadan, EMEA Privacy Lead at Google, to explore the practical realities of implementing PETs. From differential privacy to synthetic data, Monisha demystifies these cutting-edge tools and explains how organizations can leverage them to enable business innovation while protecting user privacy. Whether you're a privacy professional exploring PETs for the first time or a technical leader looking to balance privacy with utility, this episode offers practical guidance on when, why, and how to use privacy enhancing technologies effectively.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>Adaptive Privacy in Clinical Research with Aarthi Iyer</title>
      <link>https://podcasts.fame.so/e/1n2r4pqn-adaptive-privacy-in-clinical-research-with-aarthi-iyer</link>
      <itunes:title>Adaptive Privacy in Clinical Research with Aarthi Iyer</itunes:title>
      <itunes:episode>9</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">21997m51</guid>
      <description>How do you protect patient data when clinical trials move beyond traditional clinic walls? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Aarthi Iyer, Senior Director &amp; Senior Corporate Counsel at Cogent Biosciences and Microsoft Fellow, to explore the evolving landscape of privacy in clinical research. From managing decentralized trials with wearables and apps to navigating AI applications like synthetic control arms, Aarthi shares insights on balancing innovation with patient protection. Whether you're working in clinical research or simply interested in how privacy adapts to emerging technologies, this conversation offers practical guidance on building privacy programs that support both scientific integrity and patient rights.</description>
      <content:encoded><![CDATA[<div>Clinical trials are rapidly evolving beyond traditional clinic settings, creating new opportunities and privacy challenges. In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Aarthi Iyer, Senior Director &amp; Senior Corporate Counsel at Cogent Biosciences and Microsoft Fellow focused on AI in clinical research.<br><br></div><div><strong><br>What You'll Learn:</strong>&nbsp;<br><br></div><ul><li>How decentralized clinical trials work and what privacy considerations they create</li><li>Why patient data collection through wearables and apps requires enhanced vendor management</li><li>How to balance GDPR consent requirements with informed consent in clinical trials</li><li>What role AI is playing in modern clinical research, from patient recruitment to synthetic control arms</li><li>How to manage complex vendor relationships when dealing with digital health technologies</li><li>Why clinical trial sponsors have unique advantages in privacy compliance</li><li>What privacy professionals new to clinical trials should focus on first</li><li>How to approach controller and processor relationships across global trial sites</li><li>Why pseudonymized data in clinical trials still requires careful privacy protection</li><li>And so much more!</li></ul><div><br><br></div><div>Aarthi Iyer serves as Senior Director &amp; Senior Corporate Counsel at Cogent Biosciences, a clinical-stage biotechnology company developing precision therapies for patients with rare and serious diseases. With experience spanning roles at Dartmouth-Hitchcock Health and clinical research consulting, Aarthi brings a unique perspective on the legal and ethical complexities of data privacy in clinical trials. As a Microsoft Fellow focused on AI in clinical research, she explores how emerging technologies can enhance patient outcomes while maintaining rigorous privacy protections. 
Her work sits at the intersection of privacy law, clinical innovation, and emerging technologies.<br><br></div><div><strong>Connect with Aarthi Iyer here: </strong><a href="https://www.linkedin.com/in/aarthi-i-842397133/">LinkedIn</a><br><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.<br><br><br></div><div><strong>Episode Highlights:<br></strong><br></div><div><strong>[00:02:26] Embracing Change as a Privacy Professional<br></strong><br>The clinical trials landscape evolves constantly, and Aarthi emphasizes that successful privacy professionals must be prepared to adapt. Rather than seeking rigid playbooks, privacy teams should build flexible frameworks that can accommodate regulatory changes, new technologies, and shifting stakeholder needs. This adaptability is particularly crucial in clinical trials, where innovations like decentralized studies and AI applications continuously reshape how data is collected and processed. Privacy professionals should view change as an opportunity to refine and improve their programs rather than a threat to established processes.<br><br></div><div><strong>[00:03:20] The Data Subject-Centered Shift in Clinical Research<br></strong><br>One of the most significant changes Aarthi has observed is how privacy regulations have placed data subjects at the center of clinical research considerations. Unlike other areas of drug development that focus primarily on scientific and regulatory requirements, privacy compliance requires constant consideration of individual rights and protections. 
This shift has improved the research landscape by ensuring that patient perspectives and rights are integrated into trial design from the beginning, creating more ethical and transparent research practices.<br><br></div><div><strong>[00:15:05] Decentralized Clinical Trials<br></strong><br>Decentralized clinical trials allow patients to participate from their homes using wearables, apps, and remote monitoring devices, significantly improving access for rural and vulnerable populations. However, these innovations create complex privacy challenges around continuous data collection, vendor management, and ensuring appropriate safeguards are in place. Organizations must carefully evaluate digital health technologies, establish clear data collection boundaries, and maintain human oversight throughout the process to protect patient privacy while enabling innovative research methods.<br><br></div><div><strong>[00:21:39] Vendor Management in Clinical Research<br></strong><br>Clinical trial sponsors have developed sophisticated vendor management processes that other industries can learn from. The approach involves multiple layers: regulatory due diligence using established guidance, comprehensive privacy and security questionnaires, collaboration with experienced clinical sites, and ongoing vendor relationship management throughout the trial lifecycle. The key is recognizing that no single organization can handle all aspects of complex clinical research alone—success requires building trusted partnerships with specialized vendors who understand the clinical trial environment.<br><br></div><div><strong>[00:29:31] AI Applications in Clinical Trials<br></strong><br>AI is transforming clinical research through applications like patient recruitment optimization, automated document drafting, enhanced medical imaging analysis, and synthetic control arms for vulnerable populations. These innovations can reduce research timelines, improve patient access, and enhance data quality. 
However, implementation requires careful consideration of data training sets, bias prevention, regulatory compliance, and patient consent regarding AI usage. The key is focusing on operational objectives and implementing AI tools systematically with appropriate oversight and validation.<br><br><br></div><div><strong>Episode Resources:<br></strong><br></div><ul><li>Aarthi Iyer on <a href="https://www.linkedin.com/in/aarthi-i-842397133/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 09 Sep 2025 10:00:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wmkm9yzw.mp3" length="35594775" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/0689afd0-8d58-11f0-917e-871703d9469d/0689b100-8d58-11f0-837c-eb19c8329df3.png"/>
      <itunes:duration>2229</itunes:duration>
      <itunes:summary>How do you protect patient data when clinical trials move beyond traditional clinic walls? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Aarthi Iyer, Senior Director &amp; Senior Corporate Counsel at Cogent Biosciences and Microsoft Fellow, to explore the evolving landscape of privacy in clinical research. From managing decentralized trials with wearables and apps to navigating AI applications like synthetic control arms, Aarthi shares insights on balancing innovation with patient protection. Whether you're working in clinical research or simply interested in how privacy adapts to emerging technologies, this conversation offers practical guidance on building privacy programs that support both scientific integrity and patient rights.</itunes:summary>
      <itunes:subtitle>How do you protect patient data when clinical trials move beyond traditional clinic walls? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Aarthi Iyer, Senior Director &amp; Senior Corporate Counsel at Cogent Biosciences and Microsoft Fellow, to explore the evolving landscape of privacy in clinical research. From managing decentralized trials with wearables and apps to navigating AI applications like synthetic control arms, Aarthi shares insights on balancing innovation with patient protection. Whether you're working in clinical research or simply interested in how privacy adapts to emerging technologies, this conversation offers practical guidance on building privacy programs that support both scientific integrity and patient rights.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>International Data Transfers with Kellie du Preez and Danie Strachan</title>
      <link>https://podcasts.fame.so/e/qn0vpx5n-international-data-transfers-with-kellie-du-preez-and-danie-strachan</link>
      <itunes:title>International Data Transfers with Kellie du Preez and Danie Strachan</itunes:title>
      <itunes:episode>8</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">p1knvj50</guid>
      <description>International data transfers are notoriously complex, and while there’s no silver bullet, gaining a better handle on them starts with the right conversation. In this deep-dive episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan tackle one of privacy's most technical and misunderstood topics. From defining what actually counts as an international data transfer to navigating SCCs, the EU-U.S. Data Privacy Framework, and transfer impact assessments, this conversation takes a grounded look at a complicated topic. Whether you're dealing with cloud providers, global vendors, or simply trying to understand why TikTok was fined €530 million, this episode provides a practical framework for approaching cross-border data flows with more confidence and clearer direction.</description>
      <content:encoded><![CDATA[<div>In this comprehensive episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan dive deep into the complex world of international data transfers, one of the most technical and misunderstood areas of privacy compliance.<br><br></div><div><strong><br>What You'll Learn:<br></strong><br></div><ul><li>What qualifies as an international data transfer</li><li>How hidden transfers via cloud services, APIs, SDKs, and website plugins can slip through unnoticed</li><li>Why major companies like TikTok and Uber faced hefty data transfer fines</li><li>How to evaluate cross-border transfer mechanisms, including adequacy decisions, certifications, and contractual safeguards</li><li>What to consider when conducting a Transfer Impact Assessment&nbsp;</li><li>How cross-border data transfer requirements vary across Brazil, China, Canada, and other jurisdictions</li><li>How to manage vendor relationships when data crosses multiple borders</li><li>Why transparency in privacy notices is just as important as technical safeguards</li><li>Practical strategies for building scalable data transfer compliance programs</li><li>And so much more!<br><br></li></ul><div><br>Kellie du Preez is a privacy compliance leader and former litigation attorney who transitioned from defending banks in Boston to focusing on global privacy compliance. With experience as both an IP litigator and privacy professional, she brings a unique perspective on balancing practical business needs with regulatory requirements. As a Data Protection Officer and privacy consultant at VeraSafe, Kellie helps organizations navigate complex privacy challenges with a focus on creating workable, cost-effective solutions.<br><br></div><div>Danie Strachan is a privacy professional who began his career in South African legal practice, where he developed deep experience in data protection law during the implementation of South Africa's Protection of Personal Information Act (POPIA). 
As a senior privacy counsel at VeraSafe, he specializes in helping organizations understand and implement privacy requirements across multiple jurisdictions, including the EU. Danie brings valuable insight into the evolution of privacy regulations and practical approaches to compliance.<br><br><br></div><div><strong>Connect with Kellie du Preez here: </strong><a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></div><div><strong>Connect with Danie Strachan here: </strong><a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a></div><div><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.</div><div><br><br></div><div><strong>Episode Highlights:</strong></div><div><strong><br>[00:01:41] What Qualifies as an International Data Transfer:</strong> The biggest misconception about international data transfers is that they require physical movement of data. Kellie explains that data transfers can happen whenever someone in one country accesses, views, or processes data hosted in another country. This includes cloud hosting, API calls, website plugins, and even employee access during travel. Understanding this broader definition is important because many organizations unknowingly trigger data transfer requirements through routine business operations. Privacy professionals need to look beyond obvious data exports to identify hidden data transfers through embedded tools, analytics platforms, and vendor relationships that span multiple countries.<br><br></div><div><strong>[00:04:20] The GDPR Data Transfer Framework: Adequacy, SCCs, and Beyond:</strong> Under the GDPR, transfers of personal data outside the EEA require a valid transfer mechanism before they can occur. This isn't something you can retrofit after the fact. 
Danie explains that adequacy decisions (such as those for the UK or Japan) provide the simplest path, while countries without adequacy require additional safeguards like Standard Contractual Clauses (SCCs). The EU-U.S. Data Privacy Framework offers an option for certified U.S. companies, but many organizations also implement SCCs as a fallback. Privacy teams must understand that SCCs alone aren't sufficient—they trigger additional obligations like Transfer Impact Assessments to evaluate the receiving country's legal environment and potential government access to data.<br><br></div><div><strong>[00:12:15] Learning from Major Data Transfer Violations:</strong> The TikTok and Uber cases demonstrate how data transfer violations can result in massive fines and other enforcement actions. TikTok's €530 million fine stemmed from two critical failures: inadequate safeguards when Chinese employees accessed EU user data, and lack of transparency about these data transfers in privacy notices. Similarly, Uber faced €290 million in fines for transferring French driver data to the U.S. without proper protections. These cases highlight that data transfer compliance requires both technical safeguards (encryption, access controls) and transparency (clear disclosure in privacy notices). Privacy professionals must ensure that users understand not just what data is collected, but where it goes and who can access it.<br><br></div><div><strong>[00:27:20] Transfer Impact Assessments:</strong> Simply signing SCCs doesn't complete your data transfer compliance—you must also conduct a Transfer Impact Assessment (TIA) to evaluate real-world risks in the destination country. This involves analyzing local laws, government access powers, legal remedies available to data subjects, and the practical ability of recipients to resist data requests. The French data protection authority (CNIL) has issued updated guidance with six steps for conducting effective TIAs. 
Privacy professionals should view TIAs as living documents that require regular updates as laws and geopolitical situations change, not one-time compliance exercises.<br><br></div><div><strong>[00:32:55] Global Data Transfer Requirements Beyond the GDPR</strong>: Other jurisdictions impose their own rules for cross-border transfers of personal data, and the specifics vary widely. Brazil, for example, has introduced new Standard Contractual Clauses that must be in place by August 2025, while Canada and Australia take a more general approach by requiring organizations to ensure the protection of personal data without prescribing specific contract language. China enforces stricter requirements, such as CAC-conducted security assessments and official certifications, and some countries, including South Africa in certain cases, require explicit regulator approval before a transfer can take place. These examples highlight the importance of assessing not only where data is going, but also whose data is being processed and the unique legal frameworks that may apply.<br><br></div><div><strong>[00:37:53] U.S. Data Transfer Rules and Certification Options</strong>: Kellie and Danie discuss the bulk data transfer rule and explore various certification programs that can help organizations manage cross-border data flows, such as the EU-U.S. Data Privacy Framework, the APEC CBPR and PRP Systems, Global CBPR, and other tools.&nbsp;<br><br></div><div><strong>[00:43:42] Myth-Busting and Common Misconceptions</strong>: In a rapid-fire Q&amp;A, Kellie and Danie tackle widespread misunderstandings about international data transfers. They explain why ISO 27001 certification doesn’t eliminate the need for SCCs, how cloud-hosted data can still count as a transfer if accessed abroad, and why pseudonymized or hashed data may still be considered personal data.
They also clarify that “GDPR-compliant” vendors still require legal safeguards, that website plugins, APIs, and SDKs can create hidden transfers, and that even large, well-known vendors often use subprocessors—sometimes requiring their own TIAs.<br><br><br></div><div><strong>Episode Resources:</strong></div><div><br></div><ul><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 12 Aug 2025 10:00:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wnnm20kw.mp3" length="138346560" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/521ab600-7742-11f0-9dd5-cdb7dea53184/521ab7a0-7742-11f0-a02e-95a04e301401.png"/>
      <itunes:duration>3458</itunes:duration>
      <itunes:summary>International data transfers are notoriously complex, and while there’s no silver bullet, gaining a better handle on them starts with the right conversation. In this deep-dive episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan tackle one of privacy's most technical and misunderstood topics. From defining what actually counts as an international data transfer to navigating SCCs, the EU-U.S. Data Privacy Framework, and transfer impact assessments, this conversation takes a grounded look at a complicated topic. Whether you're dealing with cloud providers, global vendors, or simply trying to understand why TikTok was fined €530 million, this episode provides a practical framework for approaching cross-border data flows with more confidence and clearer direction.</itunes:summary>
      <itunes:subtitle>International data transfers are notoriously complex, and while there’s no silver bullet, gaining a better handle on them starts with the right conversation. In this deep-dive episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan tackle one of privacy's most technical and misunderstood topics. From defining what actually counts as an international data transfer to navigating SCCs, the EU-U.S. Data Privacy Framework, and transfer impact assessments, this conversation takes a grounded look at a complicated topic. Whether you're dealing with cloud providers, global vendors, or simply trying to understand why TikTok was fined €530 million, this episode provides a practical framework for approaching cross-border data flows with more confidence and clearer direction.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>Privacy in Clinical Trials with Jim Schneider</title>
      <link>https://podcasts.fame.so/e/2nxzxz2n-privacy-in-clinical-trials-with-jim-schneider</link>
      <itunes:title>Privacy in Clinical Trials with Jim Schneider</itunes:title>
      <itunes:episode>7</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">61mkjkm1</guid>
      <description>How do privacy professionals protect patient data in high-risk, life-saving research? In this episode of Privacy in Practice, Jim Schneider, Senior Director of Compliance and Data Privacy at Immunome, offers an introduction to the foundational privacy challenges in clinical trials. From untangling GDPR and HIPAA to navigating global data transfers and the ethics of consent, Jim shares hard-earned insights from over 15 years in biopharma privacy leadership. Whether you're navigating privacy in high-stakes, regulated environments or simply want to understand how theory meets practice, this episode offers thoughtful perspectives.</description>
      <content:encoded><![CDATA[<div>Clinical trials pose unique privacy challenges where scientific integrity, ethics, and compliance intersect. In this episode, Jim Schneider of Immunome shares practical insights from over 15 years of biopharma privacy leadership. From untangling the complexities of GDPR and HIPAA to tackling cross-border data transfers and handling future use of trial data, Jim explains how privacy professionals embed privacy into clinical research without slowing down innovation. This episode offers foundational insights for privacy professionals and business leaders seeking to understand the basics of compliance in clinical trials and how it can drive business growth while navigating legal hurdles.&nbsp;<br><br><br></div><div>What You'll Learn:<br><br></div><ul><li>How to balance GDPR and GCP in clinical trial consent</li><li>Why HIPAA often doesn't apply to clinical trial sponsors</li><li>How to manage data subject rights while maintaining trial integrity</li><li>Practical steps for handling cross-border data transfers in clinical research</li><li>Key insights on the future use of clinical trial data</li><li>And so much more!</li></ul><div><br><br></div><div>Jim Schneider is the Senior Director, Counsel of Compliance and Data Privacy at Immunome, a clinical-stage oncology company. 
With deep experience across biopharma, including previous roles at Seattle Genetics and Boston Scientific, Jim specializes in translating regulatory mandates into operational reality, particularly in complex, data-heavy environments like clinical trials.</div><div><br><br></div><div><strong>Connect with Jim Schneider here: </strong><a href="https://www.linkedin.com/in/lawlawyerattorneycompliance/">LinkedIn</a></div><div><strong>Connect with Kellie du Preez here: </strong><a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></div><div><strong>Connect with Danie Strachan here: </strong><a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a></div><div><br><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.</div><div><br><br></div><div><strong>Episode Highlights:<br></strong><br></div><div><strong>[00:04:30] Balancing Ethics and Privacy Laws in Clinical Trial Consent<br></strong>In clinical trials, obtaining informed consent is not just a regulatory checkbox—it's about ensuring participants fully understand what they are signing up for. Jim explains the delicate balance privacy professionals must strike when navigating both GCP (Good Clinical Practices) and GDPR requirements. GCP sets out comprehensive standards for clinical trial conduct, with a strong emphasis on ethical considerations and ensuring participants are well-informed, while GDPR emphasizes data privacy and requires explicit consent for data processing. Privacy professionals must integrate both frameworks without compromising the integrity of the trial. Jim shares how clear, simple language in consent forms and transparency about how data will be used can help maintain trust while meeting legal obligations. 
Key Takeaway: Consent forms should not only meet regulatory standards but also empower participants to make informed decisions about their involvement in a trial.<br><br></div><div><strong>[00:10:20] Pseudonymization: Protecting Identity, Ensuring Science<br></strong>Pseudonymization is a vital tool in maintaining participant privacy while upholding the scientific integrity of clinical trials. By replacing identifiable information with unique codes, clinical trial sponsors can safeguard participant identities while continuing to use the data for research. Jim discusses how pseudonymization works in practice, ensuring compliance with regulations like GDPR without obstructing the analysis required for clinical trials. The challenge, however, lies in ensuring that pseudonymized data remains useful while preventing unauthorized access. Privacy professionals must establish robust data security practices to protect this data, ensuring it is only accessible to authorized personnel. Bottom Line: Regularly audit the effectiveness of pseudonymization techniques and ensure that data encryption methods are up to date to mitigate the risk of a data breach.<br><br></div><div><strong>[00:16:00] Managing Data Deletion Requests in Trials<br></strong>Under GDPR, participants have the right to request the deletion of their personal data, but this can be complicated in the context of clinical trials. Deleting data can compromise the integrity of a trial, as every data point is crucial for the validity of the research. Jim explains how privacy professionals must handle data deletion requests delicately. While participants can withdraw from the trial, their data may need to remain in the dataset for scientific and regulatory reasons. This requires transparent communication, where privacy professionals must clearly explain the implications of data removal to participants, including why their data might remain in the system. 
Actionable Insight: Always include clear data retention policies in your consent forms, and ensure that participants understand the limits of their right to deletion in a research context.<br><br></div><div><strong>[00:27:00] Understanding HIPAA in Clinical Trials<br></strong>HIPAA’s role in clinical trials can be confusing, particularly when it comes to the data responsibilities of sponsors versus clinical sites. While clinical sites must comply with HIPAA to protect patient health information, clinical trial sponsors are typically not covered by HIPAA. Jim demystifies this distinction, explaining that HIPAA governs health data at the site level but does not directly apply to the sponsor's handling of data. Instead, sponsors are guided by other privacy laws like GDPR, which provides a framework for managing personal data in a clinical research context. Privacy professionals must navigate these differing regulations, ensuring that the clinical sites comply with HIPAA while the sponsor follows the applicable data privacy laws. Key Insight: HIPAA primarily applies to clinical sites and other covered entities. As a sponsor or CRO, you may not be directly subject to HIPAA, but you should be prepared to respect HIPAA-compliant data handling practices while fulfilling your obligations under other applicable privacy laws, such as the GDPR.&nbsp;<br><br></div><div><strong>[00:29:45] Managing Consent for Biobanking and Future Use of Data<br></strong>As clinical trials generate vast amounts of valuable data, biobanking and future use of data become essential considerations for privacy professionals. Jim explains that obtaining consent for future use of data is complex and varies depending on the jurisdiction. Some regions require that this consent be obtained separately from the initial trial consent. Privacy professionals must be diligent in communicating to participants that their data may be used in future research, emphasizing transparency and clarity in consent forms. 
The future use of data can significantly enhance scientific discovery, but it must be handled carefully to ensure ongoing compliance with data protection laws. Bottom Line: Stay up-to-date on the evolving regulations around future data use and ensure that separate consents are obtained where required, allowing participants to make informed decisions about their data’s future use.<br><br></div><div><br></div><div><strong>Episode Resources:<br></strong><br></div><ul><li>Jim Schneider on <a href="https://www.linkedin.com/in/lawlawyerattorneycompliance/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Wed, 09 Jul 2025 13:39:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/8l4m6pv8.mp3" length="84328056" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/2b4aa200-5cca-11f0-b1e9-bd9d7de03d62/2b4aa3a0-5cca-11f0-822e-95f215e2860a.png"/>
      <itunes:duration>2108</itunes:duration>
      <itunes:summary>How do privacy professionals protect patient data in high-risk, life-saving research? In this episode of Privacy in Practice, Jim Schneider, Senior Director of Compliance and Data Privacy at Immunome, offers an introduction to the foundational privacy challenges in clinical trials. From untangling GDPR and HIPAA to navigating global data transfers and the ethics of consent, Jim shares hard-earned insights from over 15 years in biopharma privacy leadership. Whether you're navigating privacy in high-stakes, regulated environments or simply want to understand how theory meets practice, this episode offers thoughtful perspectives.</itunes:summary>
      <itunes:subtitle>How do privacy professionals protect patient data in high-risk, life-saving research? In this episode of Privacy in Practice, Jim Schneider, Senior Director of Compliance and Data Privacy at Immunome, offers an introduction to the foundational privacy challenges in clinical trials. From untangling GDPR and HIPAA to navigating global data transfers and the ethics of consent, Jim shares hard-earned insights from over 15 years in biopharma privacy leadership. Whether you're navigating privacy in high-stakes, regulated environments or simply want to understand how theory meets practice, this episode offers thoughtful perspectives.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>The Critical Friend: How to Build Privacy Programs That Actually Work with Sean Milford</title>
      <link>https://podcasts.fame.so/e/r874z1jn-the-critical-friend-how-to-build-privacy-programs-that-actually-work-with-sean-milford</link>
      <itunes:title>The Critical Friend: How to Build Privacy Programs That Actually Work with Sean Milford</itunes:title>
      <itunes:episode>6</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">k08yvr61</guid>
      <description>What do you do when your privacy budget is shrinking, your vendors won’t negotiate, and your team is spread thin? In this episode of Privacy in Practice, Sean Milford, Global Head of Data Privacy at Syndigo, shares a pragmatic toolbox of strategies for privacy pros, from making privacy engineering real to simplifying vendor management and data transfers. It’s a masterclass in staying effective, scalable, and sane, no matter your team size or budget.</description>
      <content:encoded><![CDATA[<div>What does it <em>actually</em> take to run an effective privacy program inside a fast-moving, resource-strapped business? In this episode of <em>Privacy in Practice</em>, Sean Milford, Global Head of Data Privacy at Syndigo, shares a playbook for turning privacy theory into operational results. If you're managing privacy across teams, tools, and time zones, this episode is a masterclass in making it work.</div><div><br></div><div>What You'll Learn:</div><ul><li>Why privacy is a team sport and how to lead across law, marketing, and tech</li><li>What privacy engineering looks like beyond theory</li><li>How to build vendor programs that scale and stick</li><li>Why privacy ops succeed (or fail) at the department level</li><li>How to use maturity models to prioritize risk</li><li>When to use DPF, SCCs, or BCRs and why backups matter</li><li>How to run privacy programs in remote or global teams</li><li>What it means to be a “critical friend” to the business</li><li>Why centralized “Base Camp” docs make compliance scalable</li><li>And so much more!</li></ul><div><br></div><div>Sean Milford is the Global Head of Data Privacy at Syndigo, where he leads global privacy initiatives across multiple industries. With over 20 years of experience in privacy, cybersecurity, and compliance, Sean has worked for major global brands including Visa, PwC, Dell Technologies, and HSBC. He is also completing an advanced master’s in Privacy, Cybersecurity, and Data Management at Maastricht University. 
Known for his ability to translate between legal, technical, and business audiences, Sean specializes in building operationally grounded privacy programs that scale.</div><div>For those seeking to strengthen their understanding of privacy program design and implementation, Sean Milford recommends the following resources:<br><br></div><ol><li>“The Checklist Manifesto” by Atul Gawande</li><li>“Privacy Design Strategies: The Little Blue Book” by Jaap-Henk Hoepman</li><li>“Strategic Privacy by Design” by R. Jason Cronk</li><li>“Design Assurance Standard v1.0” by the Institute of Operational Privacy Design (IOPD)<br><br></li></ol><div><strong>Connect with Sean Milford here: </strong><a href="https://www.linkedin.com/in/seanmilford/">LinkedIn</a></div><div><strong>Connect with Kellie du Preez here: </strong><a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></div><div><strong>Connect with Danie Strachan here: </strong><a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a></div><div><br></div><div>If this episode gave you a new way to think about privacy in practice, we’d love it if you subscribed, rated, and left a quick review. It helps us keep bringing practical insights to privacy pros like you.</div><div><br></div><div><strong>Episode Highlights:</strong></div><div><strong><br>[00:06:31] From Privacy by Design to Privacy Engineering That Works<br></strong>"Privacy by design" might sound good in theory, but what does it look like in practice? Sean pulls back the curtain on privacy engineering and explains how teams can go beyond policy to actually embed privacy into technical architecture. 
He shares how his background in software development helps him ask the right questions about data flow, system boundaries, and access control, so that privacy becomes a functional requirement, not just a legal obligation. Sean argues that privacy teams don’t need to build everything from scratch; instead, they should integrate privacy into existing engineering and operations frameworks. This means speaking the language of developers, mapping privacy goals to security controls, and packaging requirements as design-ready inputs. The result? Privacy programs that are built to scale and designed to last.<br><br></div><div><strong><br>[00:15:16] Privacy Is a Team Sport And Needs a Game Plan<br></strong>One of the biggest misconceptions in privacy is that it lives with the legal team. Sean challenges that idea and emphasizes why privacy must be co-owned across legal, engineering, marketing, and business operations. He explains that privacy success often hinges on alignment, not expertise, because no single function can fully see or manage data risks. Using examples from his global roles, Sean shares how to “translate” privacy across disciplines so it becomes relevant and actionable to each stakeholder group. For instance, you’ll need different narratives when working with marketing on data minimization vs. engineers on system design. His tip? Think of the privacy team as a “critical friend” to the business: supportive, honest, and embedded enough to influence decisions in real time.<br><br></div><div><strong><br>[00:24:07] Vendor Management Without the Burnout<br></strong>Vendor risk is one of privacy’s most daunting tasks, but Sean offers a refreshingly pragmatic framework for managing it. He outlines how to tier vendors based on business value and data sensitivity, and explains how to align contract terms, like SCCs, DPF, or audit rights, with real leverage. 
The episode dives deep into what sustainable vendor management looks like in practice: integrating privacy reviews into existing supply chain workflows, standardizing assessments, and being honest about where you can’t negotiate. Sean also touches on how to maintain accountability post-onboarding, especially in global environments where ongoing audits can be challenging. The key lesson? A repeatable, right-sized approach will keep you compliant without consuming your entire privacy function.<br><br></div><div><strong><br>[00:38:18] Scaling Privacy with Lean Teams and Frameworks That Stick<br></strong>What if you're a privacy team of one or none? Sean tackles the challenge of building impact with limited resources, sharing a two-pronged strategy: manage the day-to-day (“run the business”) while carving out time for strategic improvements (“change the business”). He explains how capability maturity models and frameworks like Nymity or the ICO Accountability Framework help prioritize what matters most, rather than chasing perfection in every domain. Sean also introduces the concept of a “Base Camp”, a central, accessible hub where teams can get privacy answers fast, reducing noise and confusion. He shares practical tips for embedding privacy in distributed teams, including pre-built Jira templates, checklists, and cross-functional playbooks. For resourceful teams working across time zones, this is a blueprint for doing more with less, without compromising on quality or clarity.<br><br></div><div><strong><br>[00:46:55] Turning Policy into Practice: Why Privacy Ops Lives (or Dies) at the Department Level<br></strong>You can have a gold-standard privacy policy, but it’s worthless if no one in the organization knows how to use it. Sean explains how successful privacy operations rely on decentralization, not in ownership, but in execution. 
By requiring each business unit to develop its own operating procedures based on core policy, companies embed accountability where it counts. He also emphasizes the importance of tools and playbooks, like linking Jira tickets to privacy workflows or using checklists that guide users through compliant actions. It’s not just about having a policy; it’s about integrating that policy into the systems and rhythms of real teams. The lesson: privacy doesn’t live in the binder; it lives in the daily decisions made by customer support, engineers, marketers, and product managers.<br><br></div><div><strong><br>[00:51:08] Working With External Advisors: The ‘Critical Friend’ Mindset<br></strong>Engaging outside consultants and legal partners isn’t just about outsourcing expertise; it’s about building long-term collaboration that can flex with evolving needs. Sean shares why the best external advisors act like “critical friends”, professionals who offer blunt honesty when needed, but remain embedded in your business context. He outlines what makes these relationships work: mutual trust, shared tools (like access to internal frameworks or SharePoint hubs), and a human connection that extends beyond contracts. Especially in remote-first or high-pressure environments, these external partnerships can provide stability, strategic insight, and backup when bandwidth is low. The takeaway? 
Your privacy partners should help you build, not just review, your program.<br><br></div><div><br></div><div><strong>Episode Resources:</strong></div><ul><li>Sean Milford on <a href="https://www.linkedin.com/in/seanmilford/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Wed, 11 Jun 2025 12:59:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/895jpk98.mp3" length="106164488" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/d01db7f0-46c2-11f0-9c18-a3f0d26267f7/d01db920-46c2-11f0-8688-2d043ac4223e.png"/>
      <itunes:duration>2654</itunes:duration>
      <itunes:summary>What do you do when your privacy budget is shrinking, your vendors won’t negotiate, and your team is spread thin? In this episode of Privacy in Practice, Sean Milford, Global Head of Data Privacy at Syndigo, shares a pragmatic toolbox of strategies for privacy pros, from making privacy engineering real to simplifying vendor management and data transfers. It’s a masterclass in staying effective, scalable, and sane, no matter your team size or budget.</itunes:summary>
      <itunes:subtitle>What do you do when your privacy budget is shrinking, your vendors won’t negotiate, and your team is spread thin? In this episode of Privacy in Practice, Sean Milford, Global Head of Data Privacy at Syndigo, shares a pragmatic toolbox of strategies for privacy pros, from making privacy engineering real to simplifying vendor management and data transfers. It’s a masterclass in staying effective, scalable, and sane, no matter your team size or budget.</itunes:subtitle>
      <itunes:keywords/>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>AI Governance Without the Hype with Shane Witnov</title>
      <link>https://podcasts.fame.so/e/l8qwy4r8-ai-governance-without-the-hype-with-shane-witnov</link>
      <itunes:title>AI Governance Without the Hype with Shane Witnov</itunes:title>
      <itunes:episode>5</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">80nv5xj0</guid>
      <description>How should companies approach AI governance when the laws are still taking shape? 

In this episode of Privacy in Practice, Shane Witnov, AI Policy Director at Meta, joins us to share practical insights from years at the intersection of privacy, policy, and technology. He reveals how Meta stays ahead of developing regulations and uses existing privacy frameworks to build robust AI policies that support its safe, ethical use. Whether you're a privacy professional or business leader, this episode is packed with actionable insights that will prepare your organization for the future of AI while protecting your users and brand.</description>
      <content:encoded><![CDATA[<div>The future of AI is now, but how can you ensure it’s used responsibly while also driving business growth? In this episode of <em>Privacy in Practice</em>, Shane Witnov, AI Policy Director at Meta, provides a behind-the-scenes look at how the company navigates the complex intersection of AI innovation and privacy. Shane reveals how Meta uses its proven privacy frameworks to govern AI at scale and stay ahead of emerging regulations, offering a blueprint that businesses of all sizes can follow. This episode shows that AI governance doesn’t have to be an obstacle; instead, it can be your next strategic advantage.</div><div><br></div><div>What You'll Learn:</div><ul><li>Why AI governance doesn’t have to mean starting from scratch.&nbsp;</li><li>Why AI governance can (and should) build on your existing privacy, security, and data use frameworks.</li><li>How to use proven privacy frameworks to govern AI safely.</li><li>Why open-source AI models offer a better privacy solution.</li><li>How to set clear, actionable guidelines for safe AI use without banning existing tools.</li><li>Why staying ahead of state-level AI bills is crucial for protecting your business.</li><li>How to identify AI risks early with red-teaming and practical testing.</li><li>Why transparency isn’t just about labels.</li><li>How to build trust through real-world impact.</li><li>And so much more!</li></ul><div><br></div><div>Shane Witnov serves as AI Policy Director at Meta, where he focuses on the intersection of technology, privacy, and public policy, particularly in artificial intelligence. With a background in digital civil liberties and privacy law, he has been instrumental in guiding Meta's approach to AI governance and ethical implementation since joining the company in 2015. 
His expertise spans privacy compliance, AI ethics, and technological innovation, having previously worked with organizations like the Electronic Frontier Foundation and gaining valuable insights through his experience in law and technology.&nbsp;</div><div><br></div><div><strong>Connect with Shane Witnov here: </strong><a href="https://www.linkedin.com/in/switnov/">LinkedIn</a></div><div><strong>Connect with Kellie du Preez here: </strong><a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></div><div><strong>Connect with Danie Strachan here: </strong><a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></div><div><strong>Follow VeraSafe here: </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a></div><div><br></div><div>If you enjoyed this episode, make sure to subscribe, rate, and review it.</div><div><br></div><div><strong>Episode Highlights:<br></strong><br></div><div><strong>[00:06:43] Convergence Over Compliance: Building AI Governance That Scales<br></strong>Effective AI governance is about more than simply meeting regulatory requirements. Shane explains how Meta's "Convergence-based" approach helps create scalable, user-focused privacy solutions. By prioritizing features based on the value they offer to users globally, rather than tailoring to niche or less-used legal requirements, businesses can build systems that serve both compliance needs and real user benefits. Shane highlights the internal question at Meta: “Are we building a toggle for 5 users or 20% of users?” This distinction is critical in determining whether a control should be globally prioritized or tailored for specific jurisdictions. 
The takeaway for privacy professionals is clear: don’t waste resources on solutions no one uses; instead, build solutions that provide value now and set your business up for future regulatory developments.<br><br></div><div><strong>[00:15:45] Why AI Isn’t an Exception: Use the Frameworks You Already Have<br></strong>Shane cautions against AI exceptionalism, the idea that AI requires entirely new governance structures. Instead, start with existing privacy and risk frameworks, and then layer in AI-specific considerations like robustness, reliability, and appropriate use. He stresses that Meta used its well-established privacy risk processes as the foundation for AI model evaluations and red teaming. This approach, which builds on years of work, offers privacy and compliance teams a practical and cost-effective way to start governing AI while evolving as new risks emerge. The message is clear: don't start from scratch; evolve your existing frameworks to meet the needs of emerging technologies.<br><br></div><div><strong>[00:26:54] Bans Don’t Work, Clear Guidance Does<br></strong>Many businesses fear AI's potential risks and react by banning tools like ChatGPT outright. Shane warns that this is a mistake. "If you don’t give guidance, your employees are probably using it anyway," he points out. Rather than imposing blanket bans, organizations should focus on providing clear, actionable guidelines for acceptable uses. For example, encouraging employees to use AI for internal tasks like summarizing meeting notes or drafting emails is acceptable, while uploading customer data to these platforms is not. This approach empowers employees to use AI safely and responsibly, without stifling productivity or innovation. Whether you’re a privacy officer or a business leader, this segment provides a roadmap for creating clear boundaries and ensuring safe AI use.</div><div><strong><br>[00:34:43] How to Start AI Governance with No Budget and No Team<br></strong>No team? No budget? 
No problem! Shane offers a simple, three-step process for small businesses or startups to start implementing AI governance:<br><br></div><ol><li>Assign Someone to Oversee AI Use: This doesn’t need to be a full-time role, just someone who can monitor AI developments and risks.</li><li>Run Low-Risk Pilot Programs: Start with non-critical workflows that can benefit from AI, and gradually scale up as you gather insights.</li><li>Test with a Red-Team Mindset: Identify vulnerabilities and risks early on by testing AI tools before fully implementing them.<br><br></li></ol><div>By following these steps, businesses can take meaningful action without needing large teams or massive budgets. Shane emphasizes that AI governance is about being iterative and thoughtful rather than perfect, which is especially important for smaller organizations working with limited resources.<br><br></div><div><strong>[00:38:34] Transparency Isn’t Just Labels: It’s Context That Matters<br></strong>Shane explains how transparency around AI usage is evolving. While labeling AI-generated content is one approach, it often doesn’t align with user concerns. For example, Meta’s attempt to label AI-edited images using metadata standards (like those from Photoshop) led to confusion and frustration among users, who didn’t care about the technical aspects of AI use; they just didn’t want to be misled. This highlights an important lesson for privacy leaders: transparency isn’t about disclosing every instance of AI use; it’s about providing meaningful context that aligns with users' expectations. By focusing on user impact rather than technical disclosures, organizations can build trust and ensure that transparency efforts are both meaningful and effective.<br><br></div><div><strong>[00:21:00] From Focus Groups to Global Consensus: Listening as a Governance Tool<br></strong>How do you know if your AI tools align with user values? Ask them. 
Shane explains how Meta uses a variety of methods, including global focus groups, UX research, and deliberative democracy forums, to gather input from real users about how AI should be governed. These forums, which bring together ordinary users after structured education on ethical dilemmas, often reveal surprising alignment. For example, when presented with challenging questions, 70% of participants reached consensus on issues that initially seemed divisive. The key takeaway for privacy professionals is clear: building real-world input into your governance framework can help ensure that AI tools align with the needs and values of the people who use them.<br><br></div><div><strong>Episode Resources:</strong></div><ul><li>Shane Witnov on <a href="https://www.linkedin.com/in/switnov/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 20 May 2025 09:56:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/84v46l08.mp3" length="111242376" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/c7fdb490-3554-11f0-9837-c9a19afd267d/c7fdb5a0-3554-11f0-b052-51f07daa324d.png"/>
      <itunes:duration>2781</itunes:duration>
      <itunes:summary>How should companies approach AI governance when the laws are still taking shape? 

In this episode of Privacy in Practice, Shane Witnov, AI Policy Director at Meta, joins us to share practical insights from years at the intersection of privacy, policy, and technology. He reveals how Meta stays ahead of developing regulations and uses existing privacy frameworks to build robust AI policies that support its safe, ethical use. Whether you're a privacy professional or business leader, this episode is packed with actionable insights that will prepare your organization for the future of AI while protecting your users and brand.</itunes:summary>
      <itunes:subtitle>How should companies approach AI governance when the laws are still taking shape? 

In this episode of Privacy in Practice, Shane Witnov, AI Policy Director at Meta, joins us to share practical insights from years at the intersection of privacy, policy, and technology. He reveals how Meta stays ahead of developing regulations and uses existing privacy frameworks to build robust AI policies that support its safe, ethical use. Whether you're a privacy professional or business leader, this episode is packed with actionable insights that will prepare your organization for the future of AI while protecting your users and brand.</itunes:subtitle>
      <itunes:keywords>Privacy In Practice, Privacy, Data Protection, Marketing, Digital Marketing, Big Data, Information Privacy, Privacy Law, GDPR, US Privacy Laws, CCPA, Privacy Compliance, Data Security, Privacy Regulations, VeraSafe, EDPB Guidance, Privacy Challenges, Privacy Compliance Tips, Compliance, Data Mapping, Risk Management, Privacy Best Practices, DPO, Data Protection Officer, Privacy Program, Privacy Consulting, AI ACT, EU AI Regulations, AI Compliance, Artificial Intelligence Law, Ethical AI, AI Governance, Startup Privacy, Corporate Compliance, Privacy Strategy, Tech Law, Cybersecurity, Consumer Privacy, Privacy Awareness</itunes:keywords>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>Privacy vs. Marketing? It Doesn’t Have to Be a Fight, with Sachiko Scheuing</title>
      <link>https://podcasts.fame.so/e/qn0q7r48-privacy-vs-marketing-it-doesn-t-have-to-be-a-fight-with-sachiko-scheuing</link>
      <itunes:title>Privacy vs. Marketing? It Doesn’t Have to Be a Fight, with Sachiko Scheuing</itunes:title>
      <itunes:episode>4</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">p1kpwqm0</guid>
      <description>What if privacy wasn’t just about compliance but your next competitive edge? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Dr. Sachiko Scheuing, European Privacy and AI Governance Officer at Acxiom. Sachiko reveals how smart businesses are turning privacy compliance into a growth strategy. From digital advertising and customer data use to PETs, data minimization, and AI governance, this conversation dives into how privacy and marketing can work hand-in-hand to build trust and deliver results. Packed with practical insights and real-world stories, this episode is a must-listen for privacy pros and marketing leaders alike.</description>
      <content:encoded><![CDATA[<div>Privacy and marketing can (and should) work together—and the most forward-thinking businesses are proving it. In this episode of <em>Privacy in Practice</em>, hosts Kellie du Preez and Danie Strachan welcome Dr. Sachiko Scheuing, European Privacy and AI Governance Officer at Acxiom. As co-chair of the Federation of European Data and Marketing, Sachiko shares her front-line perspective on how responsible data use, smart marketing, and privacy compliance can fuel both trust and growth.</div><div><br></div><div>What You'll Learn:</div><ul><li>How digital advertising empowers small and medium-sized businesses</li><li>The three essential categories of AdTech: SEO, Walled Gardens, and Open Internet</li><li>Why PETs and data minimization are key to responsible data use</li><li>How to build a privacy-first culture that drives business success</li><li>Why DPOs are perfectly positioned to lead AI governance</li><li>Practical strategies for data minimization using pseudonymization and anonymization</li><li>Sachiko’s "Inform, Involve, Initiate" framework to improve privacy practices</li><li>And so much more!</li></ul><div><br></div><div>Dr. Sachiko Scheuing serves as the European Privacy and AI Governance Officer at Acxiom, where she leads privacy and compliance initiatives across Europe. With over 20 years of experience in marketing, technology, and compliance, she has established herself as a leader in privacy, AI governance, and data protection. As co-chair of the Federation of European Data and Marketing, she is pivotal in shaping data protection and marketing policies across Europe. She is also the author of "How to Use Customer Data: Navigating GDPR, DPDI and a Future with Marketing AI," a comprehensive guide on marketing compliance under the GDPR. 
Her expertise in balancing privacy requirements with innovative marketing strategies, combined with her advocacy for responsible data practices and AI governance, makes her a respected voice in the privacy and data protection community.<br><br><strong>Episode Highlights:</strong></div><div><br></div><div><strong>[00:09:08] Why Digital Advertising Isn’t Evil</strong></div><div><br>Digital advertising gets a bad rap, but Sachiko explains why it’s time to reframe the conversation. She challenges the common perception of digital advertising as inherently problematic by highlighting its role in democratizing marketing opportunities. Before the rise of digital ads, only big players like Coca-Cola or IKEA could afford mass-market campaigns. Today, targeted advertising makes it possible for local businesses to reach relevant audiences affordably and effectively. This democratization empowers small and medium-sized enterprises—the backbone of most economies—while helping fund the mostly free internet we all rely on today. The takeaway: privacy professionals can help organizations strike a balance—leveraging ethical digital advertising to support both business growth and equitable access to information, all while maintaining compliance and trust.<br><br></div><div><strong>[00:15:34] Privacy Culture Starts at the Top</strong></div><div>A privacy policy alone won’t build trust—genuine privacy compliance requires a cultural shift, and that shift starts at the top. Sachiko shares how executive commitment to privacy has been essential at Acxiom, where leadership views trust as a business advantage. When executives visibly prioritize privacy, they send a clear message that trust, transparency, and data ethics are fundamental to the organization’s values. But it has to go beyond lip service. Leaders must take concrete actions—allocating resources to privacy programs, funding staff training, and embedding privacy into strategic decision-making. 
Even when the immediate return isn’t obvious, executives who champion privacy are investing in the company’s future viability and credibility. They empower privacy teams to drive meaningful change across the business, creating an environment where privacy becomes second nature. The result? Privacy shifts from a regulatory obligation to a strategic differentiator—fueling customer loyalty, enhancing brand reputation, and ensuring sustainable success. And in an era of tightening regulations and rising consumer expectations, organizations that don’t make privacy leadership a priority risk falling behind.</div><div><br></div><div><strong>[00:33:37] Using PETs, Anonymization, and RoPA to Strengthen Privacy</strong></div><div>In a thoughtful exchange with Kellie and Danie, Sachiko discusses how privacy-enhancing technologies (PETs)—and techniques like pseudonymization, anonymization, differential privacy, and synthetic data—are gaining traction as practical tools for reducing risk without compromising data utility. These approaches are especially valuable for organizations working with large data sets or building AI systems, offering ways to protect individuals while still enabling insight and innovation.</div><div>The conversation then turns to Records of Processing Activities (RoPA) as a foundational step for understanding your data landscape. A key takeaway is that there’s no single correct structure—what works depends on the business. Whether organized by purpose, system, or data subject type, a well-thought-out RoPA helps teams identify where PETs can be most effective and where compliance risks may be hiding. 
For those new to these concepts, the group shares practical suggestions for building literacy—from free online trainings to industry conferences and Sachiko’s own book, How to Use Customer Data.&nbsp;</div><div><br><strong>Episode Resources:<br></strong><br></div><ul><li>Sachiko Scheuing on <a href="https://www.linkedin.com/in/dr-sachiko-scheuing-46b168/">LinkedIn</a></li><li><a href="https://www.amazon.in/How-Use-Customer-Data-Navigating/dp/139861517X">How to Use Customer Data: Navigating GDPR, DPDI and a Future with Marketing AI&nbsp;</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 08 Apr 2025 10:00:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wrj7nz1w.mp3" length="106766876" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/76f93120-139c-11f0-8e14-ed1157313cda/76f932a0-139c-11f0-be14-4127abb41a5a.png"/>
      <itunes:duration>2669</itunes:duration>
      <itunes:summary>What if privacy wasn’t just about compliance but your next competitive edge? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Dr. Sachiko Scheuing, European Privacy and AI Governance Officer at Acxiom. Sachiko reveals how smart businesses are turning privacy compliance into a growth strategy. From digital advertising and customer data use to PETs, data minimization, and AI governance, this conversation dives into how privacy and marketing can work hand-in-hand to build trust and deliver results. Packed with practical insights and real-world stories, this episode is a must-listen for privacy pros and marketing leaders alike.</itunes:summary>
      <itunes:subtitle>What if privacy wasn’t just about compliance but your next competitive edge? In this episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Dr. Sachiko Scheuing, European Privacy and AI Governance Officer at Acxiom. Sachiko reveals how smart businesses are turning privacy compliance into a growth strategy. From digital advertising and customer data use to PETs, data minimization, and AI governance, this conversation dives into how privacy and marketing can work hand-in-hand to build trust and deliver results. Packed with practical insights and real-world stories, this episode is a must-listen for privacy pros and marketing leaders alike.</itunes:subtitle>
      <itunes:keywords>Privacy In Practice, Privacy, Data Protection, Marketing, Digital Marketing, Big Data, Information Privacy, Privacy Law, GDPR, US Privacy Laws, CCPA, Privacy Compliance, Data Security, Privacy Regulations, VeraSafe, EDPB Guidance, Privacy Challenges, Privacy Compliance Tips, Compliance, Data Mapping, Risk Management, Privacy Best Practices, DPO, Data Protection Officer, Privacy Program, Privacy Consulting, AI ACT, EU AI Regulations, AI Compliance, Artificial Intelligence Law, Ethical AI, AI Governance, Startup Privacy, Corporate Compliance, Privacy Strategy, Tech Law, Cybersecurity, Consumer Privacy, Privacy Awareness</itunes:keywords>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>The Future of Mental Privacy with Kristen Mathews</title>
      <link>https://podcasts.fame.so/e/r8kmxz08-the-future-of-mental-privacy-with-kristen-mathews</link>
      <itunes:title>The Future of Mental Privacy with Kristen Mathews</itunes:title>
      <itunes:episode>3</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">71w7y2p1</guid>
      <description>We’ve all had that moment—you think about a thing, and moments later, something related to it appears on your phone, smartwatch, or tablet. In this world of hyper-personalization, do our very thoughts become data opportunities? And can this shift be regulated? In the latest episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Kristen Mathews, Partner at Cooley's Cyber Data Privacy Practice Group, to explore the emerging field of mental privacy and neurotech regulation. From wearable devices capable of detecting brain activity to the complexities of regulating neural data, this eye-opening conversation unpacks the unique privacy challenges—and opportunities—presented by brain-monitoring technologies. Whether you're interested in the future of data protection or want to understand how businesses can responsibly navigate this evolving space, this conversation provides practical insights into the intersection of AI, neurotech, and privacy law.</description>
      <content:encoded><![CDATA[<div>How long can we hold the line on our thoughts being accessed by new technology? In the latest episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Kristen Mathews, Partner at Cooley's Cyber Data Privacy Practice Group, to explore the evolving landscape of mental privacy—its challenges, opportunities, and the critical questions shaping its future.</div><div><br></div><div>Together, they:&nbsp;</div><ul><li>Examine how businesses collect and use personal information beyond their core services</li><li>Explore how emerging technologies can collect and interpret brain activity</li><li>Investigate the unique challenges of protecting neural data and whether traditional privacy laws are enough</li><li>Reflect on the role of AI in processing neural data and the intersection with emerging regulations like the EU AI Act</li><li>Consider the potential for industry self-regulation in the neurotech space</li><li>Highlight the positive applications of neurotech, from medical uses like seizure prediction to mental wellness applications</li><li>Emphasize the importance of privacy-conscious implementation</li></ul><div><br></div><div>Kristen Mathews is a partner in Cooley's Cyber Data Privacy Practice Group and a pioneering voice in the emerging field of mental privacy and neurotech regulation. With a career spanning multiple decades in privacy law, she has been at the forefront of numerous privacy developments, including early data breach responses, online behavioral advertising implementation, and biometric data protection. Her current focus on mental privacy and neurotech regulation demonstrates her commitment to addressing tomorrow's privacy challenges today. 
Her practical approach to balancing innovation with privacy protection makes her a valuable voice for privacy professionals navigating emerging technologies and their associated regulatory challenges.</div><div><br></div><div><br><strong>Episode Highlights:</strong></div><div><br></div><div><strong>[00:03:18] The Data Exchange Framework for Privacy Communication</strong></div><div>Privacy professionals should reframe data collection as a "data exchange" between businesses and consumers, where both parties receive clear value. This framework helps organizations clearly communicate what data they need to provide their service versus additional data they collect for other purposes. Companies should explicitly demonstrate the benefits users receive in exchange for their data, making the value proposition transparent. The approach requires privacy teams to work closely with product and marketing teams to articulate the exchange in user-friendly terms. This helps build trust and reduces the risk of users feeling "cheated" when they later discover unexpected data uses.</div><div><br></div><div><strong>[00:07:14] Effective Privacy Notice Design: Beyond the Legal Document</strong></div><div>Kristen emphasizes that privacy notices should be integrated into the user interface at the exact moment users need the information, not just buried in legal documents. Privacy professionals should ensure notices match the voice and tone of the service, using the same language style that resonates with users. The information should be presented concisely and prominently, avoiding overwhelming users with legal jargon. This approach helps build trust and transparency while reducing the likelihood of litigation and complaints. 
For maximum effectiveness, privacy teams should coordinate with UI/UX designers to create notices that appear at key decision points in the user journey.</div><div><br></div><div><strong>[00:28:29] Protecting Neural Data: A Layered Security Approach</strong></div><div>For organizations working with neural data, Kristen recommends implementing multiple layers of protection beyond standard privacy measures. Privacy teams should consider storing neural data locally on devices rather than in the cloud, implementing strong encryption that only allows individual device access, and carefully evaluating the effectiveness of de-identification methods. Organizations need to think about future-proofing their privacy protections, anticipating how advancing technology might affect data security. This approach helps protect sensitive neural data from breaches, unauthorized access, and potential subpoenas while maintaining functionality for legitimate uses.</div><div><br></div><div><strong>[00:31:10] Proactive Self-Regulation in Emerging Technologies</strong></div><div>Privacy professionals working with emerging technologies should consider implementing self-regulation before legislation mandates specific requirements. Drawing from the successful example of the ad tech industry, companies should develop privacy protection frameworks that align with their business models while protecting individual rights. Early self-regulation can help shape future legislation in practical ways that work for both businesses and consumers. This approach requires privacy teams to collaborate across industries to establish standards that address key concerns while maintaining innovation. 
Organizations that take the lead in self-regulation often have more influence over eventual regulatory requirements.</div><div><br></div><div><br><strong>Episode Resources:<br><br></strong><br></div><ul><li>Kristen Mathews on <a href="https://www.linkedin.com/in/kristen-mathews-6025257/">LinkedIn</a></li><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div><br></div><div><br><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 11 Mar 2025 19:36:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wl4xxrlw.mp3" length="30741427" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/0952cd70-04a8-11f0-8545-abd38246f54d/0952ce70-04a8-11f0-98f3-1d6aa8928b94.png"/>
      <itunes:duration>2196</itunes:duration>
      <itunes:summary>We’ve all had that moment—you think about a thing, and moments later, something related to it appears on your phone, smartwatch, or tablet. In this world of hyper-personalization, do our very thoughts become data opportunities? And can this shift be regulated? In the latest episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Kristen Mathews, Partner at Cooley's Cyber Data Privacy Practice Group, to explore the emerging field of mental privacy and neurotech regulation. From wearable devices capable of detecting brain activity to the complexities of regulating neural data, this eye-opening conversation unpacks the unique privacy challenges—and opportunities—presented by brain-monitoring technologies. Whether you're interested in the future of data protection or want to understand how businesses can responsibly navigate this evolving space, this conversation provides practical insights into the intersection of AI, neurotech, and privacy law.</itunes:summary>
      <itunes:subtitle>We’ve all had that moment—you think about a thing, and moments later, something related to it appears on your phone, smartwatch, or tablet. In this world of hyper-personalization, do our very thoughts become data opportunities? And can this shift be regulated? In the latest episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Kristen Mathews, Partner at Cooley's Cyber Data Privacy Practice Group, to explore the emerging field of mental privacy and neurotech regulation. From wearable devices capable of detecting brain activity to the complexities of regulating neural data, this eye-opening conversation unpacks the unique privacy challenges—and opportunities—presented by brain-monitoring technologies. Whether you're interested in the future of data protection or want to understand how businesses can responsibly navigate this evolving space, this conversation provides practical insights into the intersection of AI, neurotech, and privacy law.</itunes:subtitle>
      <itunes:keywords>Privacy In Practice, Data Privacy, Data Protection, Privacy Law, Mental Privacy Rights, Brain Data Protection, Privacy Professional Insights, Neurotech Privacy, Data Protection Regulations, US Privacy Laws, Privacy Compliance, Data Security, Privacy Regulations, Verasafe, Privacy Challenges, Privacy Program, Privacy Consulting, AI ACT, EU AI Regulations, AI Compliance, Artificial Intelligence Law, Ethical AI, AI Governance, Privacy Strategy, Tech Law, Cyber Security, Consumer Privacy, Privacy Awareness</itunes:keywords>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>Integrating IT Know-How into Privacy Law with Peter Jaffe</title>
      <link>https://podcasts.fame.so/e/x8ymj528-integrating-it-know-how-into-privacy-law-with-peter-jaffe</link>
      <itunes:title>Integrating IT Know-How into Privacy Law with Peter Jaffe</itunes:title>
      <itunes:episode>2</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">l04nvyq0</guid>
      <description>What if you could build a more effective privacy program by bridging the gap between legal requirements and technical implementation? In this episode of Privacy in Practice, we sit down with Peter Jaffe, VP &amp; Sr. Associate General Counsel for Privacy, Technology, Facilities &amp; Operations at National Geographic Society, to explore the critical intersection of privacy law and technology. Peter shares invaluable insights on building privacy programs that work, from finding internal allies and managing stakeholder relationships to navigating the technical aspects of privacy compliance. Whether you're a privacy professional looking to strengthen your technical understanding or a lawyer seeking to make your privacy advice more practical, this episode offers actionable strategies for creating privacy programs that truly serve your organization's needs. Tune in to discover why understanding both the technical and human elements of privacy is crucial for success in a complex regulatory landscape.</description>
      <content:encoded><![CDATA[<div>What if you could build a more effective privacy program by bridging the gap between legal requirements and technical implementation? In this episode of Privacy in Practice, we sit down with Peter Jaffe, VP &amp; Sr. Associate General Counsel for Privacy, Technology, Facilities &amp; Operations at National Geographic Society.<br><br><br>Together, we:<br><br>- Explore Peter's unique journey into privacy law<br>- Examine the critical intersection of technical knowledge and privacy law<br>- Discuss essential technical concepts for privacy professionals<br>- Delve into effective strategies for building privacy programs<br>- Consider the role of privacy professionals as translators between stakeholders<br>- Explore practical approaches to privacy training<br>- Share valuable insights on managing global privacy compliance<br>- And so much more!<br><br>With over a decade of experience spanning both private practice and in-house roles, Peter brings a unique blend of technical acumen and legal expertise to privacy law, having evolved from his early career in financial services litigation to becoming a respected voice in privacy and data protection. His approach combines rigorous technical understanding with human-centered privacy principles, making him particularly effective at bridging the gap between legal requirements and practical implementation. Peter's experience in building privacy programs, managing data breaches, and navigating complex regulatory landscapes, coupled with his ability to translate technical concepts for diverse stakeholders, provides valuable insights for privacy professionals at all levels. 
His emphasis on understanding both the technical architecture and human elements of privacy makes him an especially relevant voice for organizations working to build sustainable privacy programs.<br><br><strong>Episode Highlights:<br><br>[05:27] Technical Foundation for Privacy Professionals</strong><br>Peter emphasizes that privacy professionals need a basic understanding of technical concepts to be effective advisors. He recommends learning fundamentals of object-oriented programming, database structures, and access controls - even if informally through self-study. This technical knowledge helps professionals spot nuances that impact transparency requirements and data-handling practices. For privacy leaders developing their skills, starting with introductory programming concepts and database fundamentals provides a crucial foundation for understanding modern privacy challenges. Most importantly, this technical literacy enables better communication with IT teams and more practical implementation guidance.<br><br><strong>[11:36] Strategic Approach to Delivering Difficult Privacy News</strong><br>When delivering challenging privacy-related messages, Peter advocates for a methodical, analytical approach rather than emotional reactions. He recommends first identifying applicable laws and requirements, then systematically exploring options for compliance, and finally presenting a clear risk analysis across liability, litigation, and reputational dimensions. This structured approach helps maintain professional relationships while ensuring stakeholders understand the full context of privacy decisions. Privacy professionals can use this framework to transform potentially confrontational situations into collaborative problem-solving opportunities. 
The method also helps build credibility and trust with business partners who may be skeptical of privacy requirements.<br><br><strong>[18:40] Building Effective Privacy Programs: The "Why" Before "How"</strong><br>Peter stresses the importance of establishing the foundational "why" of privacy programs before diving into implementation. Privacy leaders should help organizations understand both risk factors (regulatory, litigation, reputation) and positive motivators (customer trust, contractual obligations). This foundation requires a deep understanding of organizational culture, risk tolerance, and stakeholder expectations. The approach should align with existing governance structures rather than imposing a one-size-fits-all solution. Most critically, success depends on finding internal allies with relevant technical skills who can help discover and manage privacy requirements effectively.<br><br><strong>[30:35] Targeted Privacy Training Strategy</strong><br>Rather than attempting to cover all privacy principles broadly, Peter recommends separating training content based on audience needs. For general staff, focus on helping them recognize personal information and understand when to ask for help rather than technical compliance details. This targeted approach improves retention and practical application of privacy concepts. Training should be customized to reflect the organization's specific context and use cases rather than relying solely on generic materials. 
The key measure of success is whether employees know how to identify privacy issues and engage appropriate resources when needed.<br><br><strong>Episode Resources:<br><br></strong><br></div><ul><li>Peter Jaffe on <a href="https://www.linkedin.com/in/peter-jaffe-9b5b0330/">LinkedIn</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li><li>VeraSafe on<strong> </strong><a href="https://www.linkedin.com/company/verasafe/posts/?feedView=all">LinkedIn</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 25 Feb 2025 12:00:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/895z1j68.mp3" length="96024116" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/447a6470-f362-11ef-9c0a-eb8c803333e2/447a6570-f362-11ef-808a-3dcb0c0736b2.png"/>
      <itunes:duration>2400</itunes:duration>
      <itunes:summary>What if you could build a more effective privacy program by bridging the gap between legal requirements and technical implementation? In this episode of Privacy in Practice, we sit down with Peter Jaffe, VP &amp; Sr. Associate General Counsel for Privacy, Technology, Facilities &amp; Operations at National Geographic Society, to explore the critical intersection of privacy law and technology. Peter shares invaluable insights on building privacy programs that work, from finding internal allies and managing stakeholder relationships to navigating the technical aspects of privacy compliance. Whether you're a privacy professional looking to strengthen your technical understanding or a lawyer seeking to make your privacy advice more practical, this episode offers actionable strategies for creating privacy programs that truly serve your organization's needs. Tune in to discover why understanding both the technical and human elements of privacy is crucial for success in a complex regulatory landscape.</itunes:summary>
      <itunes:subtitle>What if you could build a more effective privacy program by bridging the gap between legal requirements and technical implementation? In this episode of Privacy in Practice, we sit down with Peter Jaffe, VP &amp; Sr. Associate General Counsel for Privacy, Technology, Facilities &amp; Operations at National Geographic Society, to explore the critical intersection of privacy law and technology. Peter shares invaluable insights on building privacy programs that work, from finding internal allies and managing stakeholder relationships to navigating the technical aspects of privacy compliance. Whether you're a privacy professional looking to strengthen your technical understanding or a lawyer seeking to make your privacy advice more practical, this episode offers actionable strategies for creating privacy programs that truly serve your organization's needs. Tune in to discover why understanding both the technical and human elements of privacy is crucial for success in a complex regulatory landscape.</itunes:subtitle>
      <itunes:keywords>Privacy In Practice, Data Privacy, Data Protection, Privacy Law, GDPR, US Privacy Laws, Privacy Compliance, Data Security, Privacy Regulations, Verasafe, EDPB Guidance, Privacy Challenges, Privacy Compliance Tips, Data Mapping, Risk Management, Privacy Best Practices, DPO, Data Protection Officer, Privacy Program, Privacy Consulting, AI ACT, EU AI Regulations, AI Compliance, Artificial Intelligence Law, Ethical AI, AI Governance, Startup Privacy, Corporate Compliance, Privacy Strategy, Tech Law, Cyber Security, Consumer Privacy, Privacy Awareness</itunes:keywords>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>Beyond the Checkbox: Practical Privacy Strategies for Real-World Compliance</title>
      <link>https://podcasts.fame.so/e/q80ql3jn-beyond-the-checkbox-practical-privacy-strategies-for-real-world-compliance</link>
      <itunes:title>Beyond the Checkbox: Practical Privacy Strategies for Real-World Compliance</itunes:title>
      <itunes:episode>1</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">p0kp24x1</guid>
      <description>In this inaugural episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan dive into the practical realities of modern privacy compliance. This foundation-setting conversation explores the evolving landscape of data protection, from the challenges of managing processor relationships to the EU AI Act prohibitions that kicked in recently. Whether you're a privacy professional navigating complex regulations or a compliance leader building sustainable programs, this episode offers actionable insights on balancing regulatory demands with business realities. Learn how to approach privacy compliance pragmatically while staying ahead of evolving regulations and emerging privacy challenges.</description>
      <content:encoded><![CDATA[<div>In this inaugural episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan introduce VeraSafe's new podcast focused on making privacy compliance practical and accessible. Together, they:</div><div><br></div><ul><li>Share their personal journeys into privacy law</li><li>Explore why privacy compliance is both challenging and rewarding</li><li>Discuss the importance of balancing theoretical compliance requirements with real-world business constraints</li><li>Examine recent EDPB guidance on controller obligations&nbsp;</li><li>Address the growing regulatory emphasis on understanding technical implementations and data flows</li><li>Examine the latest challenges with the EU AI Act</li><li>Emphasize the need for a holistic approach to privacy compliance</li><li>And so much more!</li></ul><div><br></div><div>Kellie du Preez is a privacy compliance leader and former litigation attorney who transitioned from defending banks in Boston to focusing on global privacy compliance. With experience as both an IP litigator and privacy professional, she brings a unique perspective on balancing practical business needs with regulatory requirements. As a Data Protection Officer and privacy consultant at VeraSafe, Kellie helps organizations navigate complex privacy challenges with a focus on creating workable, cost-effective solutions.</div><div><br></div><div>Danie Strachan is a privacy professional who began his career in South African legal practice, where he developed deep experience in data protection law during the implementation of South Africa's Protection of Personal Information Act (POPIA). As a senior privacy counsel at VeraSafe, he specializes in helping organizations understand and implement privacy requirements across multiple jurisdictions, including the EU. 
Danie brings valuable insight into the evolution of privacy regulations and practical approaches to compliance.</div><div><br><strong>Episode Highlights:</strong></div><div><br></div><div><strong>[00:20:58] Understanding Your Data Processing Chain -&nbsp;</strong></div><div>Privacy professionals must take a more active role in understanding their complete data processing ecosystem. Recent EDPB guidance emphasizes that organizations can't simply delegate responsibility to processors; they need detailed knowledge of all subprocessors and their security measures. This includes knowing where data is hosted, what security measures are in place, and maintaining proper documentation of the entire processing chain. For DPOs and privacy leads, this means implementing robust vendor management processes, maintaining detailed data maps, and regularly reviewing subprocessor arrangements. This increased oversight may require updating data processing agreements and implementing new monitoring systems.</div><div><br></div><div><strong>[00:36:28] Beyond Checkbox Compliance -</strong></div><div>Privacy compliance requires moving beyond surface-level documentation to meaningful implementation. Organizations often focus too heavily on having privacy notices and policies while neglecting the actual operational aspects of privacy compliance. Privacy professionals need to dive deep into understanding actual data flows, processing activities, and technical implementations. This includes regular audits of data collection practices, storage durations, and processing purposes. The key is connecting written policies to practical implementation through technical controls and operational procedures.</div><div><br></div><div><strong>[00:42:28] Preparing for the EU AI Act -&nbsp;</strong></div><div>With the February 2025 deadline for prohibited AI systems now in effect, privacy professionals need to conduct comprehensive AI audits within their organizations.
This includes identifying all AI systems in use, evaluating them against the EU AI Act's risk categories, and developing plans to address any systems that are prohibited. Privacy teams should focus particularly on workplace monitoring systems, automated decision-making tools, and any AI systems that could affect individual rights. Creating an AI inventory and risk assessment framework should be an immediate priority.</div><div><br></div><div><strong>[00:47:51] Managing Vendor AI Implementation -</strong></div><div>Privacy professionals must establish processes to evaluate AI capabilities being introduced through existing vendor relationships. Many vendors are rolling out AI features without explicit notification, creating compliance risks. Privacy teams should implement specific AI review procedures as part of vendor management, require vendors to provide detailed information about AI features, and establish clear internal protocols for when teams need to involve privacy review of new AI capabilities. This requires ongoing communication with business units and regular vendor technology reviews.</div><div><br><strong>Episode Resources:<br></strong><br></div><ul><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Tue, 11 Feb 2025 12:00:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wj07321w.mp3" length="117836408" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/5990a720-e79d-11ef-bb64-b9c31bb31f04/5990a830-e79d-11ef-9383-fb255c508861.png"/>
      <itunes:duration>2945</itunes:duration>
      <itunes:summary>In this inaugural episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan dive into the practical realities of modern privacy compliance. This foundation-setting conversation explores the evolving landscape of data protection, from the challenges of managing processor relationships to the EU AI Act prohibitions that kicked in recently. Whether you're a privacy professional navigating complex regulations or a compliance leader building sustainable programs, this episode offers actionable insights on balancing regulatory demands with business realities. Learn how to approach privacy compliance pragmatically while staying ahead of evolving regulations and emerging privacy challenges.</itunes:summary>
      <itunes:subtitle>In this inaugural episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan dive into the practical realities of modern privacy compliance. This foundation-setting conversation explores the evolving landscape of data protection, from the challenges of managing processor relationships to the EU AI Act prohibitions that kicked in recently. Whether you're a privacy professional navigating complex regulations or a compliance leader building sustainable programs, this episode offers actionable insights on balancing regulatory demands with business realities. Learn how to approach privacy compliance pragmatically while staying ahead of evolving regulations and emerging privacy challenges.</itunes:subtitle>
      <itunes:keywords>Privacy In Practice, Data Privacy, Data Protection, Privacy Law, GDPR, US Privacy Laws, Privacy Compliance, Data Security, Privacy Regulations, Verasafe, EDPB Guidance, Privacy Challenges, Privacy Compliance Tips, Data Mapping, Risk Management, Privacy Best Practices, DPO, Data Protection Officer, Privacy Program, Privacy Consulting, AI ACT, EU AI Regulations, AI Compliance, Artificial Intelligence Law, Ethical AI, AI Governance, Startup Privacy, Corporate Compliance, Privacy Strategy, Tech Law, Cyber Security, Consumer Privacy, Privacy Awareness</itunes:keywords>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
    <item>
      <title>Privacy in Practice</title>
      <link>https://podcasts.fame.so/e/xnvqk6r8-privacy-in-practice-trailer</link>
      <itunes:title>Privacy in Practice</itunes:title>
      <itunes:episode>0</itunes:episode>
      <itunes:block>No</itunes:block>
      <googleplay:block>No</googleplay:block>
      <guid isPermaLink="false">71y7yzl1</guid>
      <description>Privacy in Practice, brought to you by VeraSafe, is the podcast for actionable insights and real-world strategies for privacy and compliance teams. Hosted by privacy pros Kellie du Preez and Danie Strachan, each episode unpacks the practical side of compliance and data management, bringing together industry leaders and thought-provoking discussions.</description>
      <content:encoded><![CDATA[<div>Privacy in Practice, brought to you by VeraSafe, is the podcast for actionable insights and real-world strategies for privacy and compliance teams. Hosted by privacy pros Kellie du Preez and Danie Strachan, each episode unpacks the practical side of compliance and data management, bringing together industry leaders and thought-provoking discussions.<br><br>In every episode, listeners will discover actionable solutions for today's most pressing privacy challenges. From navigating privacy laws like the GDPR and U.S. state regulations to exploring the intersection of privacy with AI, cybersecurity, and emerging technologies, we cover the topics that matter most to modern privacy professionals. Our discussions feature insights from professionals to help you develop and support privacy programs that drive business growth rather than hinder it.<br><br>Whether you’re leading privacy efforts at your company or just beginning to explore this field, tune in for meaningful conversations that provide a straightforward approach to data privacy and empower listeners to make informed, confident decisions. 
Privacy isn’t just about regulatory boxes—it’s about fostering trust and resilience in a digital world.<br><br><strong>Key Links</strong></div><ul><li>VeraSafe <a href="https://verasafe.com/">Website</a></li><li>Kellie du Preez on <a href="https://www.linkedin.com/in/kellie-du-preez-a8abb3/">LinkedIn</a></li><li>Danie Strachan on <a href="https://www.linkedin.com/in/danie-strachan/">LinkedIn</a></li></ul><div><br></div><div>Privacy in Practice is handcrafted by our friends over at: <a href="https://www.fame.so/?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=masters-of-community-with-david-spinks?utm_medium=podcast&amp;utm_source=bcast&amp;utm_campaign=fame-client">fame.so</a>.<br><br>Connect with us at podcast@verasafe.com <br><br>This podcast is brought to you by <a href="https://verasafe.com/">VeraSafe</a>.</div>]]></content:encoded>
      <pubDate>Wed, 05 Feb 2025 16:31:00 +0000</pubDate>
      <author>VeraSafe</author>
      <enclosure url="https://media.fame.so/wk47v5z8.mp3" length="5613716" type="audio/mpeg"/>
      <itunes:author>VeraSafe</itunes:author>
      <itunes:image href="https://content.fameapp.so/uploads/3jqv36k1/87d72f40-9ac1-11ef-b4f9-ad8b31dcff8d/87d73060-9ac1-11ef-8aa6-53a7e909a53e.jpg"/>
      <itunes:duration>140</itunes:duration>
      <itunes:summary>Privacy in Practice, brought to you by VeraSafe, is the podcast for actionable insights and real-world strategies for privacy and compliance teams. Hosted by privacy pros Kellie du Preez and Danie Strachan, each episode unpacks the practical side of compliance and data management, bringing together industry leaders and thought-provoking discussions.</itunes:summary>
      <itunes:subtitle>Privacy in Practice, brought to you by VeraSafe, is the podcast for actionable insights and real-world strategies for privacy and compliance teams. Hosted by privacy pros Kellie du Preez and Danie Strachan, each episode unpacks the practical side of compliance and data management, bringing together industry leaders and thought-provoking discussions.</itunes:subtitle>
      <itunes:keywords>Privacy In Practice, Data Privacy, Data Protection, Privacy Law, GDPR, US Privacy Laws, Privacy Compliance, Data Security, Privacy Regulations, Verasafe, Digital Privacy, Privacy Professionals, Privacy Career, Privacy Leadership, Data Protection Officer, DPO, Cybersecurity, Privacy Best Practices, Compliance Leadership, Privacy Consulting, Privacy Trends, Data Governance, Global Privacy, Digital Nomads, Remote Work, Privacy Advocacy, Privacy Culture, Privacy and Innovation, Emerging Privacy Trends, Virtual Collaboration, Chief Privacy Officer, Privacy Insights, Privacy Education, Privacy Strategy</itunes:keywords>
      <itunes:explicit>No</itunes:explicit>
      <googleplay:explicit>No</googleplay:explicit>
    </item>
  </channel>
</rss>
