Your phone buzzes as you wander through the supermarket; must be passing another special offer. Like all good shops, they use customer data to hand-pick specials for you based on your shopping and browsing history. Of course, it’s rare that you’re in a shop these days. It’s more likely Alexa has picked up that you’re nearly out of milk and ordered it for you before you realize it’s needed. She’s good like that.
Alexa and Siri talk to each other these days. Just as Alexa knows your online browsing habits—including those sites you’d rather she turned a blind eye to—so does Siri know your offline habits. She tracks your movement via GPS, knows exactly where you are at any given time. That information is fed back to the cloud, where any company can access it. They need to know where you are and what you’re doing so they can anticipate what you might need or what you might be interested in—now, or tomorrow, or next week.
You pause in front of the travel counter; the screen greets you by name. “Lauren,” it says, “Australia beckons!” Just as your phone buzzes again with a message from your mum. You have been calling home a lot lately; maybe it’s time for a visit. At your feet, a robot brings you some sun cream; it’s on special, too.
As you contemplate the sunshine on your back, a shout from behind catches your attention. A couple is arguing; he pushes her against the rails, then storms off.
You run over to check on her. “Are you OK?”
“Yeah,” she says, rubbing her back. “Bloody Siri told him to get a divorce lawyer. How she knew about my affair, I’ll never understand . . .”
Image attribution: Thought Catalog
We’re not too far off this scenario today; marketers have been tracking and segmenting based on data for years, tailoring and personalizing how they speak with their audiences to ensure greatest conversion. It’s just that most people don’t realize the extent to which they are being tracked.
They hadn’t, that is, until the words “Cambridge Analytica” came into the news, and Facebook’s Mark Zuckerberg faced the US Senate to answer questions about his company’s business practices. We all know by now that Facebook inadvertently allowed the profiles of up to 87 million people to be collected by the political data-mining firm Cambridge Analytica, and that information was then allegedly used to influence the US presidential election. The revelations by whistleblower Christopher Wylie shocked the world—although, as Professor Mark Ritson wrote in Marketing Week as the scandal broke, it’s “not a scandal of wrongdoing but rather one of modern, legitimate marketing practice.”
This data mining, this customer data tracking, is not new. It’s a case of marketing ethics, yes, but it’s not new. There’s that creepy tale from a few years back, the one where Target figured out a teen girl was pregnant before her father did. This 2012 story from the New York Times goes into great detail about how a statistician helped Target’s marketing department work out the signals that a consumer is pregnant. It’s called predictive analytics, and it was the new frontier in marketing once upon a time.
Then other new frontiers came along: proximity beacons, chatbots, AI and robot writers, adtech. Social media kept chugging along, too, always inventing new ways to engage—and new ways to track and convert. And as we marketers lapped it up, we also assumed consumers would be fine with it. After all, they weren’t complaining, right?
Enter: GDPR. The European Union’s General Data Protection Regulation is some of the most wide-ranging data protection regulation ever seen, seemingly designed to preempt the very scandal that engulfed Cambridge Analytica and Facebook. It sets many, many parameters for how companies can use the personal data they collect—and, indeed, what constitutes personal data—but the big takeaway from any GDPR conversation is the notion of consent. A consumer must consent to provide a company with their data, and that company must then collect and store that data in a way that cannot be tied back to an identifiable individual. They must also, it goes without saying, use that data in an ethical manner—that is, not to influence an election.
Unfortunately, the GDPR compliance deadline of May 25, 2018 was too late for those 87 million people affected by the Facebook leak. The perfect storm created by both of these issues, however, is threatening marketing practice as we know it. The most extreme voices say it’s unethical to segment and target consumers in this way. In a March Reuters poll, some 63 percent of Americans said they would like to see “less targeted advertising” in the future; just nine percent wanted more. Forty-one percent said targeted ads are “worse” than traditional advertising, too.
“I think they make a lot of assumptions that are not true,” poll respondent Maria Curran, 56, told Reuters in a follow-up interview. “It’s like if I show an interest in healthy eating, all of a sudden all of the ads are about weight control and exercise and how to lose weight. I just get inundated.” And that’s something we’ve all seen. When I announced my engagement on Facebook a few years ago, I was suddenly targeted with wedding-related ads.
Image attribution: Dayne Topkin
Back to Mark Ritson in Marketing Week; he writes: “It turns out that, as marketers, we quickly start to lose the perspective of the market as we spend hundreds of days a year inside a company that is launching or managing a product. We start to think the product is the center of the world, not the customer that we are designing it for. We begin to assume the claims we make in the advertising are what the customer should care about. We start using dumb verbs like ‘convert’ and ‘educate’ to describe what we will do with our marketing rather than smarter ones like ‘listen’ and ‘serve’.
“Even though there is a mountain of evidence and precedent that shows that the best way to make money is to find out what the customer is doing and wanting and then design products for them, we start making ‘innovative’ products in a vain attempt to change what they want and how they currently do things. As the great marketing guru Seth Godin puts it, we spend too much time finding customers for our products rather than doing it the other, more successful, way around.”
“It used to be so much easier,” he cries from the brand’s war room. It’s what you all think, but no one dares say. Back when technology was the saving grace, when your audience didn’t know how much you knew about them—and often didn’t care. Now it’s all about engaging, communicating. There’s no instant answer, no data crunching that can be done to tell you where to move next.
Over there, through that door, Darren is conducting yet another focus group. He’s brought in a bunch of teenagers to taste-test the latest release. The brand head is watching through the one-way window, listening intently. On the bank of desks in the middle of the room, a bunch of interns are debating the next phase of the chatbot; they’ve been trying to get people to talk once they land on the website, but people are so nervous these days, thinking any technology is out to get their personal details.
The marketing leadership is poring over the latest survey results and reviewing the new campaign. They’re assured it’s based on real consumer data, but no one’s really sure these days. It’s all a bit nervy; that last campaign targeting 30-something females bombed so badly that the next move must be carefully judged. Who knew that not all women wanted to get pregnant?
It definitely used to be so much easier, back before that whole data thing blew up . . .
. . . but it’s not all doom and gloom. “A while ago I was looking for a special kind of glove for my job,” Kamaal Greene, a firefighter from Detroit, told Reuters. “I put it in my Amazon cart and forgot about it. Then, later, the ad popped up on Facebook, and I was like ‘oh shoot.’ It reminded me and I clicked on it and bought it.”
Image attribution: Raw Pixel
We’ve all been so wrapped up in GDPR compliance and Facebook scandals that we’ve forgotten the big thing: Yes, there are plenty out there who don’t want to be tracked and see personalized advertising. But there are also plenty who do.
Like Reuters in the US, YouGov surveyed the UK population to get their thoughts on all of this. They then went one step further, segmenting the British adult population using two variables: the degree to which people believe ads help them buy products, and their acceptance of targeted advertising in principle. They found three-quarters of the population are somewhat suspicious of targeted advertising, but a large percentage also believe tailored ads help them choose what to buy, or say they are likely to engage with them. And that’s good news for marketers.
I believe the main reason this has caused such a scandal is less about the fact it’s happening and more because it was a dark art that no one talked about. GDPR is forcing marketers around the world to be more transparent about how they work with consumer data and to give those consumers a choice about whether they want that to happen.
Steve Jobs, back in 2010, said something that we should all listen to right now: “Privacy means people know what they’re signing up for, in plain English, and repeatedly. I’m an optimist; I believe people are smart, and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you’re going to do with their data.”
It’s true, you won’t get everyone. There will always be a certain percentage of your audience that opts out. That doesn’t mean they won’t eventually become customers; it just means you need to diversify your marketing mix so it’s less reliant on targeting technologies. Get a bit old school. Talk to people. Make decisions based on what you do know. Think more about the consumer and less about the product. And remember that privacy is not the end of the world; it’s just a new road that needs to be negotiated.
For more stories like this, subscribe to the Content Standard newsletter.
Feature image attribution: Nathaniel Dahan