Explore Ethical Use Principles and Best Practices in Personalization
Learning Objectives
After completing this unit, you’ll be able to:
- Describe the biggest perceived risks of real-time personalization.
- Build trust with customers through transparency and consent.
- Describe the differences between intent-based behaviors and demographic attributes.
Ethical Use of Technology at Salesforce
Salesforce deeply cares about the way we design, develop, and deliver our products. As part of our ongoing commitment to the ethical use of technology, we published our Responsible Marketing Principles. This module is intended to help you apply these principles to real-time Salesforce personalization solutions that connect you with your customers while earning trust in every interaction.
Perceived Risks
Marketers and their colleagues across the business who set up personalized interactions have a tough job. It’s a big responsibility to be accountable for databases of demographic and behavioral data, and to send messages that are relevant, welcome, and effective to thousands, if not millions, of people at a time. Meanwhile, regulations and consumer expectations continue to evolve at a rapid pace. A single misstep can damage the brand equity you’ve worked so hard to build. That’s one reason why trust is so important to Salesforce.
But ethical use of data is about more than complying with regulations. It’s also just good business. Customers are constantly making decisions, not only about a company’s products and experiences, but also about their trust in the company. According to our State of the Connected Customer 2020 report (registration required), 89% of customers are more loyal to companies they trust, and 65% have stopped buying from companies whose actions they considered untrustworthy. In addition, 84% of customers are more loyal to companies with strong security controls, and 80% are more loyal to companies with good ethics.
We know that trust is heavily influenced by a brand’s perceived values and ethics. We’ve found that consumers and marketers align closely on the biggest perceived risks of real-time personalization in marketing:
- Security events, like data breaches
- Data being collected, shared, or used in unanticipated ways
- Personalizing interactions that feel invasive or unwanted to consumers
- Inadvertent bias introduced by relying on demographic attributes for interactions instead of behavioral and engagement data
To help you manage data ethically, we’ve outlined several recommendations to address the risks above. These are intended to help you adhere to the Responsible Marketing Principles when implementing personalization in your Salesforce solutions.
Be Transparent About Security
Consumers are concerned about the risk of identity theft and invasions of privacy. Marketers are concerned about the impact on the brand’s image if consumers perceive that the company isn’t mindful of, or invested in, security and privacy.
Salesforce has a high standard for security safeguards and responses. We prioritize and provide service-level agreements around any reported issue. Teams of developers and project managers identify, triage, and resolve these findings. Salesforce makes our security documents—like external vulnerability/penetration tests and SOC Compliance—publicly available.
Salesforce is a trusted partner for real-time personalization technology. Being transparent about this partnership with your customers can help you cement their trust in, and loyalty to, your brand. When you win customer trust, you build a foundation for continued and deeper engagement.
Develop Trust Through Consent and Transparency
Marketing Cloud Personalization offers mechanisms to collect data from the open web in order to personalize experiences. However, it’s important to be transparent with consumers. They expect it. Let them know what data you’re collecting and how you’re using it. Provide clear benefits in exchange for that data. Give them a way to opt out of data collection. And respect their choices regardless.
We can’t emphasize this enough: When building personalized experiences, make sure you check for and obtain consent before collecting data. Ask for and collect only the data for which you have a valid use case, such as personalization or identification. For example, if your current personalization designs don’t require first and last names, don’t collect them. Develop trust by designing experiences that clearly show what data you collect and the purpose for which you collect it.
If you’re asking for a customer’s email address, tell them that they’ll receive a one-time offer now and exciting offers in the future. When you’re transparent and forthcoming about data collection and use, those interactions build trust. Don’t bury these disclosures in a privacy policy or in the terms and conditions; state them in plain language. Poor technology choices or misconfigurations, like sharing data with advertising platforms without disclosure, recommending products a customer just bought, or emailing someone who hasn’t given explicit permission, can deter customers from returning to your brand.
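To make the idea concrete, here’s a minimal, product-agnostic sketch of a consent gate in TypeScript. The `ConsentStore`, `trackIfConsented`, and `ALLOWED_ATTRIBUTES` names are hypothetical placeholders, not Marketing Cloud Personalization APIs; the point is simply that no event is collected without consent, and that only attributes with a valid use case are kept.

```typescript
// Hypothetical consent gate. None of these names are Marketing Cloud
// Personalization APIs; they stand in for whatever consent store and
// tracking calls your implementation actually uses.

type Purpose = "personalization" | "identification";

interface ConsentStore {
  // True only if the visitor has explicitly granted this purpose.
  hasConsent(purpose: Purpose): boolean;
}

interface TrackedEvent {
  name: string;
  attributes: Record<string, string>;
}

// Collect only what the declared use case needs, and only with consent.
const ALLOWED_ATTRIBUTES = ["productId", "category", "pageType"];

function trackIfConsented(
  consent: ConsentStore,
  purpose: Purpose,
  event: TrackedEvent,
  send: (e: TrackedEvent) => void,
): void {
  // No consent, no collection: drop the event instead of queuing it.
  if (!consent.hasConsent(purpose)) return;

  // Strip anything the current personalization design doesn't need,
  // such as first and last names.
  const attributes = Object.fromEntries(
    Object.entries(event.attributes).filter(([key]) =>
      ALLOWED_ATTRIBUTES.includes(key),
    ),
  );
  send({ name: event.name, attributes });
}
```

The design choice worth noting is that events failing the consent check are dropped outright rather than stored for later, which keeps “respect their choices” from depending on downstream cleanup.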
Today, we take for granted the ability to stitch together consumers, households, and devices through a variety of methods. Things like cookies and mobile device IDs allow companies to remember who you are without authentication. As we move toward a cookieless future, it’s increasingly important for companies to provide clear value and incentives for sharing data. Give your customers an easy way to log in and stay authenticated. Show them the benefits of authenticated experiences, such as curated content or more relevant recommendations.
Use Data to Personalize the Experience
Personalization means serving your customers. It’s about providing better, more relevant experiences and, most importantly, ones that are driven by your customers’ intent. Within a curated experience, customers are more likely to share personal and sometimes sensitive information when the benefit to them is clear and understood.
We know that 92% of marketers say their customers and prospects expect personalized experiences. But if the personalization doesn’t actually provide any value to the customer, it can feel intrusive or manipulative. For marketers, it’s critical to always center personalization tactics on the customer. Keep these two questions in mind: What value or benefit does the customer get in exchange for their data? Is it commensurate with the information they’re sharing?
Let’s look at some examples. In Marketing Cloud Personalization, you can personalize an experience with the data collected and tie it back to a browser cookie. Within a web campaign, you can configure your site to display the information collected by Marketing Cloud Personalization. This can be as simple as a banner that shows imagery based on previous interactions, or as sophisticated as a list of recently viewed products.
You can also use Content Zones to gate more detailed personalizations behind authenticated experiences. For example, I might see a banner image of a runner on a site’s homepage based on my previous interactions. But before the site shows me the list of recently viewed products, it makes me log in so that my wife, who also has access to our shared computer, doesn’t inadvertently learn what her birthday present is going to be ahead of time.
Once you have a campaign template that accounts for both logged-in and anonymous visitors, you can target even more advanced use cases, such as progressive form filling, more safely and securely.
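Here’s a minimal sketch of that gating logic in TypeScript. The `Visitor` and `ZoneContent` types and the `contentForVisitor` function are hypothetical, not the Content Zones API; they just illustrate returning low-sensitivity personalization for anonymous visitors and reserving recently viewed products for authenticated ones.

```typescript
// Hypothetical sketch of gating richer personalization behind login.
// These types and functions are illustrative, not the Content Zones API.

interface Visitor {
  isAuthenticated: boolean;
  interestSegment?: string; // e.g. "running", inferred from prior browsing
  recentlyViewed: string[]; // product IDs tied to the authenticated profile
}

interface ZoneContent {
  bannerImage: string;
  recentlyViewed?: string[];
}

function contentForVisitor(visitor: Visitor): ZoneContent {
  // Anonymous visitors get lightweight, low-sensitivity personalization.
  const bannerImage =
    visitor.interestSegment === "running"
      ? "/banners/runner.jpg"
      : "/banners/default.jpg";

  if (!visitor.isAuthenticated) {
    return { bannerImage };
  }

  // Recently viewed products can reveal purchase intent (like a gift),
  // so show them only after the visitor logs in.
  return { bannerImage, recentlyViewed: visitor.recentlyViewed };
}
```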
Prioritize Behavior-Based Intent Over Demographic Attributes
All too often, demographic targeting causes bias and fails to deliver the right messaging to the right people. Let’s say, for example, you run a campaign to boost sales of an anti-aging serum and you target customers aged 55 and over. You’ve introduced bias into the campaign because you assumed that only certain age groups are interested in looking younger. You’re also losing out on potential sales by limiting yourself to a narrow demographic. To mitigate the risk of bias inherent in demographic targeting, use interest- and intent-based targeting. That way, you match your products and services to the people most likely interested in something similar, irrespective of demographic factors. You also avoid reinforcing stereotypes, which can harm your brand.
You can also unintentionally introduce bias into machine learning systems through proxies of sensitive information. When building and training machine learning models, carefully consider which attributes and behaviors you select. To avoid introducing bias into the model, think about which data are most valuable. Omit data that can contribute to bias against sensitive classes, such as race, religion, age, and national origin.
For example, in geographic areas where a particular racial group makes up a significantly high percentage of the population, there can be a strong correlation between race and postal code. Even though race is never directly input into the system, discrimination can still result from this indirect association. However, the existence of proxy bias doesn’t necessarily mean you can’t use postal codes at all. If a postal code is a good indicator of which store the customer should visit (a store within 5 miles is better than one 50 miles away), it can be beneficial, and still ethical, to use.
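As an illustration, here’s a hedged TypeScript sketch of this kind of bias-aware feature selection. The attribute names, the `EXCLUDED_FROM_MODEL` list, and the `nearestStore` helper are all assumptions for the example; the idea is that sensitive attributes and likely proxies stay out of the scoring model, while postal code is still used for a narrow, customer-serving purpose like a store locator.

```typescript
// Hypothetical feature-selection sketch; attribute and function names
// are illustrative, not part of any Salesforce API.

interface CustomerProfile {
  age?: number;
  postalCode?: string;
  pageViews: number;
  addToCarts: number;
  emailClicks: number;
}

// Attributes kept out of model training: sensitive classes and
// attributes reserved for a narrow, customer-serving purpose.
const EXCLUDED_FROM_MODEL = new Set(["age", "postalCode"]);

function modelFeatures(profile: CustomerProfile): Record<string, number> {
  const features: Record<string, number> = {};
  for (const [key, value] of Object.entries(profile)) {
    if (EXCLUDED_FROM_MODEL.has(key) || typeof value !== "number") continue;
    features[key] = value;
  }
  return features; // only behavioral signals reach the model
}

// Postal code can still feed a use that clearly serves the customer,
// such as a store locator, without ever entering the scoring model.
function nearestStore(
  postalCode: string,
  distanceMiles: (postalCode: string, storeId: string) => number,
  storeIds: string[],
): string {
  return storeIds.reduce((best, id) =>
    distanceMiles(postalCode, id) < distanceMiles(postalCode, best) ? id : best,
  );
}
```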
When building personalized experiences, focus on your customer’s intent, not just the attributes or demographics you collect. Intent-based or behavior-based signals include page views, clicks on products, explicitly setting a preference for a type of item, and so on. For example, segmenting a population based on a binary identifier can have unintended consequences. If marketers want to create a campaign for makeup products and only select “women” or “female,” they automatically limit the audience and potential reach of the campaign. Not only can they lose out on a lot of customers, but, worse, they alienate people by using a binary gender identifier.
Instead, use Marketing Cloud Personalization Recipes and its multidimensional catalog to surface targeted intent based on consumer behaviors instead of attributes alone.
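The sketch below shows the general idea in TypeScript, independent of any specific product feature (it isn’t the Recipes API). The event types, weights, and threshold are assumptions; the point is that audience membership comes from observed interest in a category, not from a demographic attribute like gender.

```typescript
// Hypothetical behavior-based affinity sketch; event names, weights,
// and the score threshold are illustrative assumptions.

type BehaviorType = "pageView" | "productClick" | "preferenceSet";

interface BehaviorEvent {
  type: BehaviorType;
  category: string; // e.g. "makeup", "running-shoes"
}

// Stronger signals of intent get higher weights.
const WEIGHTS: Record<BehaviorType, number> = {
  pageView: 1,
  productClick: 3,
  preferenceSet: 5,
};

// Score each category from observed behavior, with no demographic input,
// so a makeup campaign reaches everyone who shows interest in makeup.
function categoryAffinity(events: BehaviorEvent[]): Map<string, number> {
  const scores = new Map<string, number>();
  for (const e of events) {
    scores.set(e.category, (scores.get(e.category) ?? 0) + WEIGHTS[e.type]);
  }
  return scores;
}

function audienceForCategory(
  visitors: Map<string, BehaviorEvent[]>,
  category: string,
  minScore = 5,
): string[] {
  return [...visitors.entries()]
    .filter(([, events]) => (categoryAffinity(events).get(category) ?? 0) >= minScore)
    .map(([visitorId]) => visitorId);
}
```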
Customers Expect Something in Exchange
Let’s review what you learned in this unit. When your customers share personal and sensitive information, they expect you to provide valuable, relevant content in return. It’s important that you are up front about the use of your customers’ data, obtain consent, and provide clear benefits in exchange. As you design and build your personalization solution, applying these principles can help you earn and maintain the customer trust you need to power ongoing and deeper engagement.
Next, let’s take a look at ethical use of cross-channel behavioral (or trigger-based) messaging.