This article was published in 2025 and references a historical event from 2013, included here for context and accuracy.
- Tension: Data-driven marketing thrives on consumer information, yet its survival depends on the very trust that aggressive data practices erode.
- Noise: Industry debates focus on compliance frameworks and technical solutions while missing that trust cannot be engineered through policies alone.
- Direct Message: Privacy responsibility isn’t a regulatory burden to distribute; it’s the foundational value that must permeate every decision your organization makes.
To learn more about our editorial approach, explore The Direct Message methodology.
In 2013, Rick Erwin of Experian Marketing Services posed a question that seemed straightforward at the time: where does responsibility lie for customer privacy in data-driven marketing?
His answer, that every organization touching consumer data shares equal responsibility, felt comprehensive.
Today, more than a decade later, after GDPR, CCPA, countless data breaches, and the erosion of third-party cookies, that question reveals something more troubling.
We’ve spent years debating who should be responsible while watching consumer trust steadily decline.
The real issue isn’t about distributing responsibility across a supply chain.
It’s about whether any amount of shared accountability can substitute for organizations that genuinely value privacy as a core principle rather than a compliance obligation.
The paradox of shared responsibility
Erwin’s 2013 framework identified every participant in the data ecosystem as equally responsible: data originators, aggregators, compilers, agencies, brokers, processors, trading desks, and advertisers.
This sounds reasonable until you recognize the inherent contradiction. When everyone is equally responsible, no one is individually accountable.
The intervening years have borne this tension out. Despite industry-wide agreements on privacy principles, we’ve witnessed an escalating series of failures.
Facebook’s Cambridge Analytica scandal in 2018 exposed how data could flow through multiple parties, with each assuming someone else was ensuring proper consent.
Equifax’s 2017 breach compromised 147 million consumers’ personal information, revealing that even companies whose business model depends on data security can fail catastrophically at protecting it.
The marketing technology stack has also grown exponentially more complex since 2013. Today’s digital advertising ecosystem involves dozens of intermediaries between a brand and a consumer; each touching data, each theoretically responsible, each with different standards and capabilities.
When a programmatic ad impression gets served, data passes through supply-side platforms, demand-side platforms, data management platforms, verification services, and more.
Asking each to be equally responsible for privacy is like asking every person who touches a package in a shipping network to personally guarantee its contents remain intact.
The compliance theater distracting us
The decade since Erwin’s article has seen an explosion of privacy frameworks, certifications, and compliance programs. Organizations now employ chief privacy officers, conduct privacy impact assessments, and implement elaborate consent management platforms.
The industry has responded to the responsibility question with infrastructure: policies, processes, audits, and documentation.
Yet consumer trust in how companies handle their data has not improved proportionally. According to a 2019 Pew Research study, 79% of Americans reported being concerned about how companies use their data, and 81% felt they had little to no control over the data companies collect.
These numbers reflect a fundamental disconnect: the industry has built impressive compliance machinery while consumers feel less protected than ever.
This represents what we might call compliance theater, the creation of visible privacy processes that satisfy regulatory requirements without fundamentally changing how organizations view and value consumer data.
Companies can check every box on a privacy assessment while still designing products that maximize data extraction, create deliberately confusing privacy choices, and treat consumer information as an asset to be monetized rather than a trust to be honored.
The noise here is the belief that privacy responsibility can be engineered through proper frameworks and distributed accountability.
We’ve confused having privacy policies with having privacy principles, having consent mechanisms with having genuine respect for consumer autonomy.
What actually protects consumer trust
Privacy responsibility cannot be delegated, distributed, or documented into existence. It must be the default assumption embedded in every product decision, every data partnership, and every business model choice your organization makes.
Erwin was right that this requires values “so central to the beliefs of your company that they find their way into every corner of your operation.”
What the decade since has taught us is that formal value statements and review processes, while necessary, are insufficient.
What protects consumer trust is not having the right words in your values document but making the hard business choices that those values demand.
This means declining profitable data partnerships when you cannot verify how that data was collected. It means building products that work well with minimal data rather than designing for maximum data collection and then adding privacy controls. It means accepting that some highly effective targeting capabilities may not be ethical to deploy, even if they’re technically legal and your competitors use them.
Building organizations that default to privacy
The path forward requires moving beyond shared responsibility to embedded accountability.
This starts with recognizing that privacy cannot be primarily the domain of privacy officers and compliance teams. It must be integrated into how product managers define success, how engineers build systems, how salespeople represent capabilities, and how executives evaluate growth opportunities.
Practically, this means several shifts.
First, privacy review processes should occur before product development begins, not as a gate before launch. By the time a product reaches a formal review, significant resources have been invested and business momentum created, making it difficult to recommend fundamental changes.
Second, organizations need to develop muscle memory for privacy-respecting alternatives. When a product team wants to implement a feature that requires extensive data collection, the default question should be “how can we deliver similar value with less data” rather than “how can we get proper consent for this data.”
Third, privacy metrics need to carry equal weight with business metrics in how organizations evaluate success. If a product drives significant revenue but requires privacy practices that erode trust or push ethical boundaries, that should be visible and weighted in how leadership views that product’s performance.
Finally, organizations must be willing to publicly acknowledge when their privacy practices fall short and commit to specific improvements, treating privacy failures with the same seriousness as security breaches or financial misstatements.
The marketing industry in 2025 faces a choice that has become only more stark since 2013. We can continue building more sophisticated compliance infrastructure while consumer trust continues to erode, or we can recognize that privacy responsibility cannot be distributed across a supply chain but must be embedded within each organization’s core decision-making.
The companies that will thrive in the coming decade won’t be those with the most comprehensive privacy policies, but those where privacy is so fundamental to how they operate that it requires no separate policy at all.