Short of forsaking all modern amenities and living in a cave on a deserted island, all customers will have data about their lives stored and accessed by businesses. This is a fact of 21st century life: Personal data from Internet searches, email inquiries, credit card payments, mail orders, and any other consumer-to-company interactions that happen both on- and offline will be collated and stored.
Consumers and watchdogs worry about exactly what data is collected, how it’s used, and how consumers are educated about the process—and they expect companies to respect consumers’ privacy and, accordingly, to self-regulate and limit the use of their data. Industries that rely on consumer data to find, target, or advertise to prospects and customers, on the other hand, argue that excessive regulation will stifle business and harm the economy.
Ultimately, the consumer privacy debate is about balance: When does data collection become data corruption? And when does relevant targeting cross over to obtrusive tracking?
Privacy experts examine how to achieve that balance as it relates to five key customer data privacy issues.
The indefinable Do Not Track
Interest in Do Not Track (DNT) initiatives gained a great deal of attention in the media last year. “Do Not Track has been born out of concerns by consumers that there were too many people involved in the online process, and that more people were collecting their data than they were aware of,” says Brooks Dobbs, chief privacy officer of KBM Group and member at the World Wide Web Consortium’s (W3C) Tracking Protection Working Group.
In other words, by looking at a user’s Web activity holistically—tracking his or her movements across different websites—browsers gain comprehensive insight into that user’s personal habits and activities. DNT was initiated to give users the power to block this type of monitoring. Last September Google released the 23rd iteration of its Chrome browser, which included DNT support. Now, the top five browsers—Microsoft Internet Explorer, Apple Safari, Mozilla Firefox, Opera, and Google Chrome—support the DNT standard.
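Mechanically, a browser with DNT enabled signals the preference by adding a "DNT: 1" header to every HTTP request it sends; it is then entirely up to the receiving site to honor it. A minimal server-side sketch of reading that signal (the function name and sample headers are illustrative):

```python
def tracking_allowed(request_headers):
    """Return False when the browser has signaled Do Not Track."""
    # Per the W3C Tracking Preference Expression draft, a browser with
    # DNT enabled sends "DNT: 1" on every request; absence of the header
    # means the user has expressed no preference.
    return request_headers.get("DNT") != "1"

# Sample headers as a DNT-enabled browser would send them (illustrative)
opted_out = {"User-Agent": "Mozilla/5.0", "DNT": "1"}
no_preference = {"User-Agent": "Mozilla/5.0"}

print(tracking_allowed(opted_out))      # False
print(tracking_allowed(no_preference))  # True
```

Note that the header only expresses a preference; nothing in the mechanism itself defines what a site must stop doing once it sees the flag, which is exactly the definitional gap Dobbs describes.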
On the surface, DNT sounds like a viable compromise. But Dobbs points out that it’s not that simple; one of the biggest problems keeping DNT from being universally adopted is definitional: There’s no standard consensus on the meaning of track.
“We’re now two years into the process and [the] Working Group…hasn’t defined track, hasn’t defined who it applies to, and certainly hasn’t laid out ways that consumers could have this communicated to have them make a material choice,” Dobbs says.
The W3C may not have officially established a DNT definition, but Sarah Branam, privacy manager at Epsilon, believes that DNT should indicate “Do Not Target” as opposed to “Do Not Collect.”
“You always have to collect information for certain purposes, such as fraud prevention, audit trails; logging that information is necessary, but you should always enable a consumer to opt out of that specific target,” Branam explains.
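Branam’s “Do Not Target” reading can be made concrete: a request handler would still write every event to an audit trail for fraud prevention, but skip the ad-selection step when the user has opted out. A rough sketch of that split, with hypothetical function and field names:

```python
import logging

audit_log = logging.getLogger("audit")

def select_ad(record):
    # Placeholder targeting logic, purely for illustration
    return "ad-for-" + record["event"]

def handle_event(user_id, event, do_not_target=False):
    """Collect for audit and fraud purposes always; target only if permitted."""
    record = {"user": user_id, "event": event}
    # "Do Not Collect" would forbid even this audit-trail write;
    # "Do Not Target" permits it but stops the step below.
    audit_log.info("audit trail: %s", record)
    if do_not_target:
        return None          # honor the opt-out: no ad is selected
    return select_ad(record)
```

Under this split, handle_event("u1", "shoe-search", do_not_target=True) still logs the event but returns no ad, while the same call without the flag returns "ad-for-shoe-search".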
Tim Suther, chief marketing and strategy officer at Acxiom, argues that companies simply need to be assiduous about when they collect consumer data. “If you’re not going to use data for a particular case, you’ve got to think about, is it wise to even collect it?” Suther asks. “For us, it’s really use-driven.”
While these points make sense for corporate best practices, they still don’t address the chief legislative problem: the lack of a true definition for DNT. This vagueness potentially gives marketers plausible deniability—and reason to pay too little attention to the issue. If demand for DNT legislation spikes, many marketers might find they’d been ignoring the fire alarm.
“I think many people bury their head in the sand and say, ‘I don’t do behavioral advertising; Do Not Track doesn’t apply to me.’ But it applies to everybody,” Dobbs says. Some form of tracking is necessary to measure the efficacy of online advertising, Dobbs argues. Otherwise, how can marketers know how many ads were actually served? If one million impressions were purchased, were one million impressions actually delivered? DNT legislation without a proper DNT definition would severely threaten that business model.
Moreover, there are questions about whom DNT should actually apply to. Dobbs points out that currently, DNT only applies to “third parties.” But determining what constitutes these third parties is nebulous at best. For instance, a visitor to Facebook interacts with the site in a first-party context. Yet if that visitor later clicks a Like button alongside, say, an article on the Huffington Post website, Facebook essentially becomes a third party.
So, while consumers might embrace the concept of DNT, in practice it’s far too vague to form the basis of solid legislation that will protect consumer privacy without harming online business practices.
The hidden complexity of opt-in/opt-out
Marketers often present customers the opportunity to opt in to various services in exchange for providing certain personal data. But it’s not a magic bullet for privacy matters. Like DNT, implementing opt-in or opt-out clauses sounds simple on the surface, but is far more complex in practice.
Specifically, deciding when and under what circumstances to enable such measures can be confusing. Moreover, there’s a difference between opting in, in which consumers make a choice, versus opting out, in which consumers respond to a choice that an enterprise has already made for them. “Certainly having permissioned data directly from individuals is very valuable, [and] that’s an opt-in kind of model,” says Suther, adding that current opt-in and opt-out guidelines are situational rather than clear-cut. “I think there’s a role for both in this world.”
Katherine Race Brin, Federal Trade Commission (FTC) attorney for the division of privacy and identity protection, agrees. She says companies should provide consumer opt-ins when collecting sensitive information, such as personal data or data based on geolocation.
Additionally, she recommends that organizations have consumers opt in when using data for a purpose other than the one initially expressed when the data was first acquired.
“Companies should get affirmative express consent, or opt-in, before using consumer data in a materially different manner than claimed when the data was collected,” Brin explains. “So if they’re using the data for something, they can’t then turn around and use it for something different without getting express user consent in that instance.”
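Brin’s rule, that consent granted at collection time scopes what the data may later be used for, maps naturally onto a purpose check in code. A sketch, with a hypothetical in-memory consent store standing in for a real consent-management system:

```python
class ConsentError(Exception):
    """Raised when data would be used beyond its original consent."""

# Purposes each user expressly opted in to when the data was collected
# (hypothetical store; a real system would persist and audit this)
consents = {"alice": {"order_fulfillment", "service_email"}}

def use_data(user, purpose):
    """Refuse any use the user did not affirmatively consent to."""
    if purpose not in consents.get(user, set()):
        raise ConsentError(
            "materially different use of %s's data (%s) requires a fresh opt-in"
            % (user, purpose))
    return "using %s's data for %s" % (user, purpose)
```

Here use_data("alice", "order_fulfillment") proceeds, while use_data("alice", "ad_targeting") raises ConsentError until alice expressly opts in to that new purpose.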
Mobile clouds the privacy picture
Mobile confounds the privacy puzzle even more. According to advertising network Chitika, 24% of mobile Google searches have a local element (by comparison, 29% of mobile searches on Bing and 25% on Yahoo are local), which brings location data, and the heightened privacy concerns that come with it, into play. Additionally, shared mobile devices make it harder for marketers to know who they’re tracking—an adult or a child—which affects whether certain privacy regulations apply.
What’s more, consumers are extremely protective of their activity on mobile devices. According to the September 2012 “Privacy and Data Management on Mobile Devices” report by Pew Research Center’s Internet & American Life Project, among the 88% of the adult population that owns a cell phone, 32% admit to clearing their mobile browser or search history. And 19% say they’ve switched off location-tracking settings.
Branam believes that consumers, at this point, aren’t ready to release certain information from their mobile devices—notably geolocation data. “Perhaps many consumers aren’t quite ready for this type of specific advertising and this channel of advertising,” she says.
Another issue with mobile privacy, Brin notes, is that mobile privacy disclosures can often appear too lengthy for a mobile screen and, as a result, are impractical.
“Disclosures have to be clear to consumers. So having disclosures that are [multiple pages] and complicated [are] not effective in the online context in general and it’s specifically not effective in the mobile context,” Brin says. “It’s not the responsibility of the consumer necessarily to try and figure out information about what is happening to their data. That needs to be provided to them in a clear, easy-to-read fashion in the context of their transaction—the mobile context.”
Brin adds that the disclosure issue only intensifies when placed within the mobile app environment, especially when it comes to mobile apps targeting children. The FTC’s “Mobile Apps for Kids: Current Privacy Disclosures are Disappointing” report found that more information was being gathered and shared than parents were aware of.
“Many of these apps, most in fact, failed to provide information about the data that was collected through the apps, let alone specific information about the type of data collected and the purpose that it was collected,” Brin explains. “[And] many of the apps shared information with third parties without disclosing that fact to parents.”
Who’s responsible for consumer education?
Everyone agrees on the importance of customer education on the privacy value proposition—essentially, better services in exchange for data.
“I think as an industry we need to start doing a much better job of explaining the value exchange and not being afraid to withhold value based on a user’s choice; without lessening our responsibility to present that choice,” KBM’s Dobbs says.
But whose job is it, within the industry, to provide this education? The answer is not so concrete. John M. Simpson, consumer advocate at Consumer Watchdog, believes responsibility for consumer education around privacy needs to be shared by government entities, businesses, and consumer interest groups. This is easier said than done, however, because it requires these disparate entities to concur on the messaging around privacy to avoid fragmentation and confusion.
Mobile Media Summit founder and CEO Paran Johar says the responsibility should begin with trade organizations such as the Mobile Marketing Association and the Direct Marketing Association (DMA). “They need to provide the framework for educating consumers and communicating that value exchange in terms of what they’re getting for their data,” Johar argues, also noting that if tech firms and companies were to do so individually, the messaging would be too varied.
However, Dennis Dayman, chief privacy and security officer at marketing automation provider Eloqua (recently acquired by Oracle), says that consumers are responsible for educating themselves about privacy issues, while marketers should educate their organization about data and privacy policies.
Perhaps enterprises prefer consumers to be autodidacts when it comes to privacy issues, because many marketers aren’t entirely clear on how to proceed should they be compelled to educate consumers themselves. “I don’t think there’s a best way to do that because you have consumers who span the spectrum of knowledge about direct marketing and what their savviness is, just in regards to being online or offline,” Epsilon’s Branam says.
Acxiom’s Suther also argues that if consumers really wanted to shelter their data, they would take advantage of the free tools to limit data collection. For example, the Ghostery app, available for all top Web browsers, allows consumers to identify and block tracking components that exist on each website.
“You have this interesting paradox. People say they’re concerned, but they don’t take advantage of the wealth of tools that are available that could stop [data] collection,” Suther says.
But Consumer Watchdog’s Simpson insists that businesses also have an obligation to clearly explain to consumers what data is gathered, what it’s used for, and with whom it’s shared. “If they use information in ways they have not disclosed, they should be penalized,” Simpson says.
Fighting over self-regulation
From the standpoint of the federal government and consumers, allowing marketers to self-regulate their data policies is essentially a fox-in-the-henhouse proposition. Many business leaders counter that the lack of clarity in defining the terminology and responsibilities associated with data marketing makes it difficult to establish responsible legislation.
For example, when asked for the definition of a data broker—a term widely used to describe Acxiom—Suther demurs. “I don’t have a good answer for you, to be honest,” he says. “The way that people talk about data brokers is so broad that you could include every business,” Suther admits.
But FTC’s Brin argues that legislation could help establish formal guidelines that would provide the needed clarity. “What we’ve called for,” she says, “is broad-ranging privacy legislation that we know would provide us with some of the tools to tackle some of these problems a little more directly.”
Eight congressional members made headlines last summer after sending letters to nine data companies—including Acxiom and Epsilon—that questioned practices around collecting the personal data of children and teens; the letters also asked whether “low value” consumers would lose out on certain opportunities.
The DMA responded, arguing that “unnecessary restrictions on marketing could undermine economic and job growth.”
“I believe that government should stay out of that,” Johar says. “As long as the trade organizations can self-regulate, that’s a better approach because, ultimately, [they] can pound that insight in terms of best practices, and they weigh pros and cons in terms of the consumer perspective versus the industry perspective.”
Indeed, DMA has created programs specifically designed to assist marketers on best practices related to data collection and use.
Additionally, Suther argues that data marketers already take extra measures to responsibly handle and use consumer data. “We like to think that we do more than what the law requires us to do; that’s the essence of self-regulation,” Suther says. “There’s no law that required us to establish a chief privacy officer in 1991, but we did. There’s no law that requires us to scan 70,000 privacy policies a year, but we do that.”
Moreover, with the constant development of technology, legislation can quickly appear dated and out of step with the times, Branam adds. “We want to have a process in place that is fluid, [and] that can evolve with technology, consumer preferences, and as consumers become more educated,” she says.
Although legislation may help tie up a few loose data-application ends, it’s ineffective if it’s unclear to whom it applies. Laws that don’t respect or keep pace with technological innovation (a likely outcome, given Congress’s glacial tempo) risk being obsolete the instant they’re introduced.
“It’s a fine line because you don’t want to have something that’s so specific that you’re inhibiting technological advancements or innovation,” the FTC’s Brin says. “At the same time, we have to make sure that consumers are protected and that privacy doesn’t fall by the wayside as we move forward with innovation.”