Our data isn’t just being used to sell us things – it’s shaping how we see the world. Can we achieve ethical marketing in a data landscape that’s evolving at warp speed?
Today, data has an almost universal hold on our culture. From personalised advertising that shows us only what we already like, to political campaigning that builds a skewed worldview from manufactured ‘news’, our personal data is being used to create a million alternate universes, none of them quite like the others.
Personalisation: a source of mixed messages?
Nowadays, an organisation can simultaneously serve totally different messages to different people – who may well believe they’re looking at the same thing. Web pages, social feeds, email campaigns, and even individual ads can be versioned and segmented to a highly granular degree. Is that a recipe for inauthenticity, or just a more efficient way to communicate?
Maybe a bit of both. Personalisation in marketing is by no means a negative thing. Many consumers welcome the chance to receive promotions for what they’re actually interested in, rather than being hammered with irrelevant and repetitive ads. There are other benefits to the technology too, especially when it’s applied in non-commercial ways.
Take music and film. We’re living in a time when we can listen to almost any song ever recorded at the touch of a keyboard (or a voice command, if you’re so inclined). Algorithmic recommendations introduce us to new artists and surface what’s popular among people who liked the same things we did. It’s a far cry from depending on radio stations for new music, or saving up to buy physical discs or tapes we might not end up enjoying.
But in order to personalise, companies have to be selective about what they display, and to do that, they use certain rules and assumptions. Those assumptions are invisible to us as individuals, and they can have a powerful influence on our lives.
Think of the popular complaint about social media ‘echo chambers’. These happen when users are served only what they know and like – i.e. what they’ve either clicked on or searched for already. Algorithms take behaviours and use them to deliver more of the same kinds of content. The result is a self-reinforcing circle of information that might feel to the end-user like a balanced view of the world, but totally cuts out certain unwelcome or seemingly irrelevant perspectives.
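The loop described above can be made concrete with a toy sketch. This is not any real platform’s algorithm – just a minimal, hypothetical recommender that ranks content by overlap with topics the user has already clicked, which is enough to show how a single click starts narrowing the feed:

```python
from collections import Counter

def recommend(catalogue, click_history, k=3):
    """Rank unclicked items by overlap with topics the user already clicked.

    catalogue: dict mapping item id -> set of topic tags
    click_history: list of item ids the user has clicked
    Returns the k unclicked items most similar to past clicks.
    """
    # Tally the topics the user has engaged with so far.
    seen_topics = Counter()
    for item in click_history:
        seen_topics.update(catalogue[item])

    # Score each candidate by how many of its topics the user has seen.
    def score(item):
        return sum(seen_topics[topic] for topic in catalogue[item])

    candidates = [item for item in catalogue if item not in click_history]
    return sorted(candidates, key=score, reverse=True)[:k]

catalogue = {
    "story_a": {"politics-left"},
    "story_b": {"politics-left"},
    "story_c": {"politics-right"},
    "story_d": {"sport"},
}

# One click on a left-leaning story is enough to push
# similar stories to the top of the next feed.
feed = recommend(catalogue, click_history=["story_a"], k=2)
```

Each round of clicks feeds the next round of recommendations, so without a deliberate diversity mechanism the feed converges on what the user already engages with – the echo chamber in miniature.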
Who sees what? More segments, less oversight
Our targeted, personalised worlds are tailored just to us, a whirl of quick-fire messages that disappear as fast as they arrive. The result is that no one else sees exactly what you see, and there’s little or no public debate or shared awareness to help put it in context. This lack of a joined-up media experience means that a lot of information is being widely spread without ever really entering public consciousness. Except to those who send and receive them, the content of these messages is a mystery.
Without a single, mainstream channel of information – which the mass media provided for previous generations – we end up in a culture where one hand doesn’t know what the other is doing. And while we still watch TV and read newspapers, the internet is gaining ground as a main source of news. According to Ofcom’s latest figures, online has mushroomed in popularity as a news source, with 48% of Brits using it to find out what’s happening, up from 32% in 2013.
13% of us use the internet as our only news source. (For comparison, just 1% only use newspapers.) Considering the internet is essentially an unregulated, open-access platform, that’s a pretty broad scope for misinformation.
The great privacy debate
Perhaps the biggest issue with personal data collection and use is the implications for privacy. Organisations – made up of people we’ve never met – know more about us than our families and friends.
We spend a huge chunk of our lives online, doing our shopping, chatting with friends, and even finding partners via the internet. All the while, the platforms we use are collecting data and building huge dossiers of information about our most personal thoughts and preferences. Mostly this is for harmless purposes, and in the vast majority of cases the data is only ever processed by machines – but the information is still there. And as Apple’s Tim Cook pointed out recently, ‘these stockpiles of personal data serve only to enrich the companies that collect them. This should unsettle us.’
Our collected data is pooled across services and companies
While most of us are happy to share some data in return for ‘free’ services like email, apps and other online tools, what we may not realise is that what we share with one organisation is usually pooled with data from many others, creating a combined profile that’s far more powerful than the sum of its parts. Many people simply aren’t aware of the value of their data to companies, or the scale of these combined profiles.
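The pooling described above is, mechanically, just a join on a shared identifier. The sketch below is a deliberately simplified illustration with made-up services and fields, not any real data broker’s process – it shows how three partial records, each harmless on its own, combine into one detailed profile:

```python
def merge_profiles(*sources):
    """Combine partial records keyed by a shared identifier (here, an email).

    Each source holds only a fragment; the joined profile holds it all.
    """
    combined = {}
    for source in sources:
        for email, fields in source.items():
            # Merge this source's fields into the running profile.
            combined.setdefault(email, {}).update(fields)
    return combined

# Three hypothetical services, each knowing one slice of the same person.
shop = {"jo@example.com": {"purchases": ["running shoes"]}}
social = {"jo@example.com": {"interests": ["marathons"], "age": 34}}
fitness_app = {"jo@example.com": {"avg_weekly_km": 40}}

profile = merge_profiles(shop, social, fitness_app)
# The pooled record is far richer than anything a single service held.
```

In practice the matching key might be an email address, a phone number, or a device fingerprint, but the principle is the same: the combined profile is more revealing than the sum of its parts.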
In fact, for many people, the realisation that they produce and own this valuable commodity called personal data – let alone that they might be able to control it – comes only after it has already been claimed and used by organisations.
That’s one of the reasons the GDPR legislation was introduced across Europe earlier this year. The regulation aims to create more accountability among businesses collecting personal data, and to make sure explicit, informed consent is in place before collection happens. We’ve written plenty about GDPR already, and while in the medium term it’s an extra task for marketers, it’s likely a step in the right direction for everyone. Whether it’s enough to turn the tide and bring order to the wild data frontier, only time will tell.
Facebook-gate: who are the people holding the power?
Who could forget the look on Mark Zuckerberg’s face as he was questioned by Congress over the 2016 election scandal? The Facebook founder ended up in the hot-seat in front of 42 US Senators when it emerged that the data of 87 million voters might have been misused to influence election results.
Cambridge Analytica, a third-party data analysis and political consultancy firm, attempted to sway US voters in the 2016 election using data scraped from Facebook users who had never consented to its use. It used this personal data to blast pro-Trump content at carefully selected individuals. The company’s leadership was also filmed admitting to smear tactics, honey-trapping and bribes in its political work.
Events like these highlight the incredible power that’s fallen into the hands of tech companies. What might have begun as a start-up in an undergraduate’s bedroom can, in the space of a few short years, become the company in charge of some huge ethical and social decisions, and have an impact on a global scale.
Political neutrality (or the lack of it) is one example of the influence these companies can have, especially those large enough to basically have a monopoly on their audience. For example, Twitter and Facebook were recently accused of ‘purging’ anti-establishment organisations from their platforms under the guise of an anti-spam push.
Another unexpected area of control is the wellbeing of other, smaller businesses. Tripadvisor, once a straightforward review platform, now holds the power to make or break millions of hotels, restaurants and other leisure venues across the globe, and shapes the worldwide travel industry. To say it has struggled with the meteoric rise in responsibility is an understatement.
Where do we go next?
Two things have become clear: firstly, that there is enormous power in the use of personal data, and secondly, that governments and societies are starting to wake up to that power and ask questions about who is using it.
It’s clear that data isn’t going away, and we need to harness it in a way that’s not only in sync with the current ethical standards, but that looks ahead to what kind of role data will play in the future. As marketers, our responsibility is to develop an approach to data that’s as transparent and ethical as possible, while still taking advantage of the benefits it can deliver.
For most organisations, the first step is to look at what data you collect, how you use it, and how clearly these processes are communicated to the people who produce it in the first place. If you’re GDPR-compliant, you’re already well on the way to achieving this.
Another place to look is your social and SEO strategy. At this stage of the game, there’s very little to be achieved through organic search and social reach alone, and the majority of views will be paid for. That means using the data provided by platforms like Google and Facebook, and targeting a subsection of their users. Make sure the platforms you use are consistent with your company’s values on data and privacy, since they’ll be targeting individuals on your behalf. The same goes for any agencies or services you outsource your marketing to.
Make a long-term investment in transparency
It’s certainly possible to get away with doing the bare minimum on informed consent and legal compliance. Although the net is tightening, companies are still in a position to make short-term gains and take advantage of poor public understanding around privacy and data manipulation. But as Facebook’s experience shows, tactics like these are subject to risk, and if you find yourself on the wrong side of an increasingly sensitive debate, the consequences are serious. (Cambridge Analytica, by the way, has now gone out of business.)
Companies with authenticity as a core value will find that consistent messaging, a solid brand and a well-defined set of values will naturally ripple their way out across their marketing, however many channels and personalised experiences that involves.
And when customers are willingly sharing their data with you, along with their feedback and input, real data symbiosis becomes a possibility. True knowledge of your users, when it comes in the form of deep understanding, positive experiences and earned loyalty, can never be a bad thing.