The general public is increasingly cynical and suspicious of how technology – particularly social media – captures and uses personal data.
But the only reason data has become so valuable to these platforms is because marketers are willing to pay for access.
“If you’re not paying for it, you’re not the customer; you’re the product.” You’ve probably come across versions of that quote many times before – often in relation to social media. It’s not a new idea; similar observations were made about television advertising nearly fifty years ago.
But the nature of the product is different now. With television, all the advertiser was buying was access to the attention of a largely anonymous audience for thirty seconds or so – assuming we weren’t putting the kettle on. Targeting was about as sophisticated as showing commercials for laundry powder during daytime programs when housewives were more likely to be watching (hence the creation of the soap opera).
But in the digital age, advertisers are buying far more than our anonymous attention. They’re buying access to who we are; our wider interests, behaviours, beliefs, relationships. And consumers are increasingly uncomfortable about it.
While Google remains the heavyweight champion of surveillance capitalism, social media is attracting most of the negative attention right now. There are the ongoing arguments about the spread of misinformation and the weaponising of social media to influence voters during elections. And then there are the books and documentaries – like recent Netflix chart-topper The Social Dilemma – that characterise social media as an unhealthy addiction eroding attention, impacting relationships and damaging the very fabric of our society.
Surveillance capitalism describes how companies now view private data – particularly data that captures how each of us thinks and behaves – as a valuable commodity. And the marketing industry is the primary customer for this commodity, dazzled by data-driven ad platforms that promise levels of segmentation those old radio and TV advertisers could only dream of.
Instead of targeting programs more likely to have a higher percentage of housewives in the audience – putting aside the outdated misogyny for a moment – laundry powder brands can use re-targeting technologies to serve advertising to anyone who recently searched for a new washing machine or looked for tips on how to remove stubborn stains. To achieve this, the default privacy settings of Facebook (and others) allow them to track a user’s activity across the internet – not just within Facebook.
This can seem a lot like stalking to users who suddenly realise one or more tech platforms are effectively watching over their shoulder – not necessarily for the user’s benefit, but to benefit the platforms’ real customers: marketers.
So, if the marketing industry is enabling and fuelling this surveillance capitalism, and the general public increasingly sees it as a problem – to the point that #DeleteFacebook has become a thing – marketers need to pick a side.
I don’t mean brands should leave social media – that’s still where the audience is – but they should look beyond the attractive numbers and apply ethics to their digital and data strategies. This is why many businesses are adopting data ethics guidelines on top of any existing IT or social media policies.
Instead of asking what we can do with all this data, data ethics considers whether we should. Here are a few things to consider when developing your own data ethics guidelines.
1. Is how you collect and use data consistent with current community expectations?
Don’t assess your data practices solely in terms of whether they comply with today’s regulations, because regulations will change – such as when the European Union’s General Data Protection Regulation (GDPR) came into effect in 2018.
Instead, review how your business collects and uses data in light of current community concerns about data privacy, as these concerns will likely drive future regulations.
2. Is the same true for the external platforms, tools and services you use?
Even if you’re not directly engaged in creepy or questionable data tactics, you may still be benefitting from these practices when using other marketing services and platforms.
Do you know how that ad platform decides where to place your ad? Do you know where the data comes from to make that handy feature possible?
3. How comfortable would you feel explaining to someone how you were able to target them?
Sometimes, the best test is not whether a practice makes you uncomfortable – after all, you stand to benefit from it. A better test: if you had to explain to a concerned friend or family member how that ad appeared on their screen, and which of their data made it possible, would you be comfortable with what you’d have to say?
It’s easy to say that social media’s recent bad reputation has nothing to do with us, or that we’re powerless to do anything about surveillance capitalism because that’s the system we’re in. But we’re the business model. We’re the customer.
Without government intervention or new regulations, we’re the only ones who can bring about change.