A staggering amount of data is generated online every second, and with technology’s ability to deliver on personalisation catching up, the possibilities in digital are mind-boggling. But when using customer data to personalise digital experiences, the best sentiment for organisations to hold on to might be “don’t be creepy”.
Being able to use the data is just the first step in personalisation. Setting the right strategy, creating the right processes and thinking about your ‘ethical rule book for data use’ is crucial to making sure your brand doesn’t suffer any long-term damage by making your audience feel uncomfortable with the way their personal information is being used.
3 factors for your rule book.
If your organisation is thinking about embarking on a personalisation journey – whether it be basic targeting, a re-marketing exercise or a complete personalised communications plan – there are 3 simple factors to consider including in your rule book:
1. Be transparent.
Knowing that the content being shown has been personalised makes a big difference to how your customers will receive it. And giving people the choice to ‘opt out’ or change their personalisation settings can significantly help audiences feel empowered.
Take, for example, Amazon. They are upfront and clear about what is being personalised and why, and clearly indicate that their product recommendations are ‘based on what you have viewed or purchased’. This, I think, has contributed to the general public’s acceptance of personal data being used in this way.
2. Be sensitive.
The damage from getting it wrong can be truly catastrophic. Everyone has a creepy story of being stalked by brands online. As marketers, we sometimes get so wrapped up in the possibilities that we lose touch with reality and forget what it means to be sensitive to the people we’re trying to connect with.
In a study by the Pew Research Center in the U.S. this year, participants (unsurprisingly) rated health information, phone and email conversations, and details of physical location as the most sensitive data. Perhaps more interesting was that the least sensitive information was purchasing habits, media preferences and political views.
Think about what you’re selling. For example, if you’re personalising for health insurance, you may need to think a little more closely about your ethical rule book than you would for, say, an online retailer.
3. Stick to your purpose.
Fuelling an already shaky reputation when it comes to the ethical use of data, Facebook was slammed in the media earlier this year for their ‘emotion experiment’, in which they altered the content displayed in feeds. Facebook hid emotive words from member news feeds to see the impact it would have on people’s reactions to other posts and the posts they created.
Needless to say, it raised eyebrows. Firstly, because it was entering the murky area of human research, but secondly because the experiment had only shaky links to Facebook’s purpose. What did Facebook stand to gain from this kind of experiment?
Unlike Amazon’s product recommendations, Facebook’s purpose was unclear, which continued to fuel Facebook’s untrustworthy reputation.
Set your intent, make a plan, and be open about it.
Before embarking on a personalisation journey, set out your plan. Your organisation needs to have a conversation about how it uses customer data, and about what it is and isn’t comfortable with. Tell your audiences what you’re personalising and give them the freedom to opt out.
Be polite, and don’t get all ‘Facebook-ish’ on your customers.