
Detect and Handle Inconsistent and Unreliable Field Values

Learning Objectives

After completing this unit, you’ll be able to:

  • Identify inconsistent values that should be standardized.
  • Locate unreliable values that can lead to an incorrect understanding.
  • Detect single-value or default-only fields and describe how to handle them.

Small Field Issues That Have Big Impact at NTO

As Luna profiles customer and order data at Northern Trail Outfitters (NTO), she sees that many issues are not obvious. They are small problems that are easy to miss.

Some fields show the same meaning in different ways. For example:

  • Over time, labeling has changed, so identical issues might not be categorized consistently.
  • Country and state values are entered in different formats across systems, like CRM, ecommerce, and point of sale. This forces teams to fix the data before they can use it.
  • Some fields look complete but aren’t. For example, a recent compliance inquiry into the Email Opt Out field revealed that nearly all recent contact records had this field set to No, making it impossible to tell a deliberate choice from an untouched default.
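Issues like these surface quickly with a simple value-frequency profile. A minimal sketch in Python, where the sample country values from each system are hypothetical:

```python
from collections import Counter

# Hypothetical country values pulled from three source systems.
crm = ["US", "USA", "United States", "CA"]
ecommerce = ["United States", "US", "Canada"]
pos = ["USA", "U.S.A.", "CA", "Canada"]

# Count how often each raw spelling appears across all sources.
freq = Counter(crm + ecommerce + pos)

# Several distinct spellings for the same country signal a
# standardization problem.
for value, count in freq.most_common():
    print(f"{value!r}: {count}")
```

Here the profile reveals four different spellings for the same country (US, USA, U.S.A., United States), which no single system would have exposed on its own.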

These issues can cause problems later.

  • Customers and end users become frustrated when values are inconsistent or misleading—self-service fails, cases get misrouted, and people can’t find the right answer.
  • Developers waste time on one-off transforms, mapping logic, and repeated cleanup scripts just to make basic reporting and automation work.
  • Leaders lose confidence in AI. If the data is not reliable, AI tools cannot make safe or accurate decisions.

Detect and Decide: What to Review and What to Do

Luna used Salesforce Customer Success’s Data Quality Management Framework and data profiling to find risks in her data and understand their impact.

Data profiling helps her decide how to fix these issues. She can cleanse the data at the source or use Data 360 tools to standardize, filter, or remove values before unifying the data and using it downstream.

For each scenario below, Luna notes the data profiling signals to look for and the cleansing approaches to apply.

Scenario: Check for consistent data or find values to standardize.

Within a single object:

  • Data profiling signals: Identify string fields (for example, text or picklist) with a manageable distinct value count (for example, <= 1000). Review top and bottom value frequencies to find long-tail variants that appear rarely and likely need consolidation.
  • Cleansing approach: Use bulk data cleansing tools (for example, AgentExchange solutions) or controlled update jobs, after data steward decisions, to standardize values at scale.

Across related objects or sources:

  • Data profiling signals: Compare semantically similar fields (for example, CRM Contact versus Marketing Cloud Subscriber). Review the average value length and frequency to detect mismatched standards. Examples: country or state variations across CRM versus ecommerce or POS, or inconsistent status or stage labels.
  • Cleansing approaches: Use Data 360 data transforms to standardize values, with data model objects (DMOs) that provide a consistent view across objects. Create a crosswalk or value-mapping table that provides source-to-target mapping rules for your automated processes. For example, most country name standardization logic relies on the ISO 3166-1 reference, whose English short names, alpha-2 codes, and alpha-3 codes represent the United States as United States = US = USA.

Scenario: Find values that should be cleaned or filtered.

  • Data profiling signals: Identify values that appear disproportionately often, for example, 10 times more often than the average across all field values. Determine whether they are bad values (for example, the placeholder email na@na.com) or good but not fit-for-purpose content (for example, a shipping facility address that is valid for fulfilling orders but not for contacting the customer with marketing mailers).
  • Cleansing approaches: Clean invalid values at the source, when possible, using bulk-cleansing tools. When source cleanup isn’t feasible, or when you need to preserve raw values, use Data 360 transforms to filter, normalize, or exclude misleading values from matching and unification inputs, that is, the pieces of data used to decide whether records belong to the same customer and should be combined. Use top and bottom value frequency results from data profiling tools in your data transformation pipelines to streamline and automate cleaning.
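The “10 times more often than the average” heuristic above is straightforward to implement once value frequencies are in hand. A sketch with hypothetical email data:

```python
from collections import Counter

# Hypothetical field values: one placeholder floods the field,
# the rest are ordinary one-off addresses.
emails = (
    ["na@na.com"] * 40
    + [f"user{i}@example.com" for i in range(20)]
)

freq = Counter(emails)
average = sum(freq.values()) / len(freq)  # mean occurrences per distinct value

# Flag values that appear at least 10x more often than the average.
suspects = [value for value, count in freq.items() if count >= 10 * average]
print(suspects)  # ['na@na.com']
```

The flagged values still need human review: some are junk to remove, while others (like an operational shipping address) are valid in the source but should be excluded from unification inputs.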

Scenario: Find fields with little or no useful data.

  • Data profiling signals: Identify fields with a single distinct value, especially if that value is the default. Be mindful of null values; in binary fields, for example, the absence of data can itself be a value. Examples: Email Opt Out set to No on nearly every contact, and status fields left at their default.
  • Cleansing approaches: If a field is populated only with a default value, re-evaluate the default or the validation rule. For all other fields, identify the root cause. If the data can be corrected using a third-party source, address it through enrichment. If the data cannot be reliably corrected, exclude the field from scoping, user access, and integrations, and consider deprecating the field due to low data quality.
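Detecting single-value or default-only fields can be automated by scanning each field’s distinct non-null values. A sketch over hypothetical contact records (field names are illustrative):

```python
# Hypothetical contact records; None represents a null.
contacts = [
    {"email_opt_out": "No", "status": None, "tier": "Gold"},
    {"email_opt_out": "No", "status": None, "tier": "Silver"},
    {"email_opt_out": "No", "status": None, "tier": "Gold"},
]

def low_signal_fields(records):
    """Return fields with at most one distinct non-null value."""
    flagged = []
    for field in records[0]:
        distinct = {r[field] for r in records if r[field] is not None}
        if len(distinct) <= 1:
            flagged.append(field)
    return flagged

print(low_signal_fields(contacts))  # ['email_opt_out', 'status']
```

A field like email_opt_out, set to No everywhere, is exactly the signal that the default or validation rule deserves a second look before the field is trusted downstream.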

For NTO, a contextual and correct understanding of each customer is essential. Using data profiling insights, Luna analyzes which email addresses, phone numbers, and mailing addresses appear repeatedly.

For example, placeholder values such as na@na.com, shared phone numbers, or generic addresses appear across many records. While these values might exist in transactional systems for operational reasons, they don’t represent reliable identifiers for understanding the customer.

Luna evaluates these patterns and decides how to handle them. Instead of deleting the values outright, she designs a process that keeps them available in the source systems, where they might still be needed operationally, and filters them out before identity resolution occurs in Data 360 to prevent false matches.
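Luna’s design can be sketched as a pre-processing pass that blanks out known placeholder identifiers before records reach identity resolution, while the source rows stay untouched. The placeholder lists and field names here are illustrative:

```python
# Illustrative placeholder values discovered during profiling.
PLACEHOLDER_EMAILS = {"na@na.com", "none@none.com"}
PLACEHOLDER_PHONES = {"000-000-0000"}

def prepare_for_matching(record: dict) -> dict:
    """Return a copy safe for identity resolution; the source record is unchanged."""
    cleaned = dict(record)
    if cleaned.get("email") in PLACEHOLDER_EMAILS:
        cleaned["email"] = None  # don't match customers on a shared placeholder
    if cleaned.get("phone") in PLACEHOLDER_PHONES:
        cleaned["phone"] = None
    return cleaned

record = {"email": "na@na.com", "phone": "555-0100"}
print(prepare_for_matching(record))  # {'email': None, 'phone': '555-0100'}
```

Because the function returns a copy, the operational systems keep their raw values while the matching step never sees identifiers that would falsely merge unrelated customers.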

Then, Luna assesses which fields in different source systems need to be reconciled to the same meaning. To ensure a consistent understanding of metrics like customer sales by country, or of marketing opt-in preferences, the underlying fields must have consistent values. Luna designs a process in which Data 360 DMOs provide access to unified information, and Data 360 data transforms standardize data that is inconsistent across different sources.

This approach allows NTO to retain operational data while ensuring unreliable values do not distort the customer profiles used for analytics, automation, and AI-driven experiences.

Continue Your Journey

In the next unit, you explore another important data-cleanup decision: Which records should remain operational, and which should be archived or purged?
