Salesforce Data Quality is Top of Mind for Many
As a Product Owner and R&D Leader, I constantly remind myself to look for the human considerations that drive choices. I look for opportunities to empathize with what people value and are afraid of — so I can know where to meet them. If you want to be in service to people, you have to empathize with their pain, wants, fears, and experiences.
This especially applies to assessing data quality in the Salesforce ecosystem. I’ve spoken with well over one hundred individuals within the community this year — starting at TDX, continuing at Connections for our Cuneiform for CRM product launch, and ending with Dreamforce ’23: customers, practitioners, architects, administrators, executives, and partners. There is a consensus within the community that data quality is essential. This was especially true at Dreamforce, given Salesforce’s Data and AI announcements. Interestingly, however, there doesn’t seem to be a consensus on how to put this belief into practice.
We’ve all heard the sayings describing the impact of low-quality data on systems: garbage in, garbage out (or rubbish in, rubbish out in the UK). Today, we call incorrect GPT responses “hallucinations”. I like Marc Benioff’s take on what to call incorrect AI responses: lies. As AI embeds itself into the mainstream, we need to be more skeptical about the quality of our data and the impact this data has on decision-making.
If we continue to rely on data that is probably right, we will eventually use that data to make decisions that, in hindsight, we will know were definitely wrong.
Companies Know the State of their Data Quality
Across each of this year’s Salesforce conferences, I spoke with administrators, architects, line of business leaders, practice leads, and executives about data quality and what they’re doing to address it. They all told a version of the same story.
They recognized the state of their data quality and its impact on their business (their data quality wasn’t great; there was a definite cost to the business, but the business continues to grow). They also recognized the value of investing in improving their data quality — and the importance of tying these investments to business outcomes. Most also acknowledged lacking the resources or expertise to take their data quality challenges head-on. Many shared challenges trying to justify an initial investment in assessing data quality. The ones that were able to secure initial investments reported mixed results.
What was interesting about these stories was how much they had in common. Individuals who obtained investments for data quality assessments and those who didn’t shared the same critical details of their experience:
- The entire process proved time-consuming.
- Scoping data quality challenges was complex.
- A lack of resources and expertise hindered effectiveness.
- The business impact of assessment findings was challenging to identify.
- Results were often outdated by the time they were presented to stakeholders.
- Findings formally stated what stakeholders already knew directionally.
- Delivering results in a manner that resonated with stakeholders was elusive.
- Demonstrating value and justifying the investment posed challenges.
In other words — the assessment process was challenging to complete. Individuals didn’t have the expertise to conduct the assessment, compile the results, and explain data quality’s impact on their business — all while performing their day job. The exercise was too expensive for any one person to complete promptly. Consequently, the assessment lost its value, and the diluted results failed to resonate with stakeholders.
Traditional one-time data quality assessment approaches are incompatible with today’s business needs. Businesses need a fast, low-cost, and repeatable assessment process aligned with their data ingestion and processing tempo.
The Challenge with Salesforce Data Quality Assessments
Most first-time data quality initiatives follow a fairly standard process. First, target data that the business perceives as underperforming and of low quality. Assess and measure the data quality of these records. Compile the results and prioritize what records (and underlying processes) should be remediated first. Drive priorities using the correlated business value obtained by remediation. Then, correct these items, reassess the data, and measure the business impact of the corrections to verify progress.
This is a straightforward approach to data quality that most Salesforce customers can adopt. The challenge with this approach is its cost (it is expensive to perform) and the ROI of the assessment (which isn’t easy to measure).
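The assess-measure-prioritize loop described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions — the record shape, the completeness KPI, and the 0.9 threshold are all hypothetical, and none of this reflects any vendor's actual API:

```python
# Hypothetical record shape: a CRM record is a dict of field name -> value.
Record = dict

def completeness(records: list[Record], fields: list[str]) -> float:
    """Share of required field values that are populated across all records."""
    if not records or not fields:
        return 0.0
    filled = sum(
        1 for r in records for f in fields if r.get(f) not in (None, "")
    )
    return filled / (len(records) * len(fields))

def assess(records: list[Record], required_fields: list[str], threshold: float = 0.9) -> dict:
    """One pass of the loop: measure the KPI, then flag records to remediate."""
    score = completeness(records, required_fields)
    # Records missing any required field drive the remediation priority list.
    flagged = [
        r for r in records
        if any(r.get(f) in (None, "") for f in required_fields)
    ]
    return {"score": score, "needs_remediation": flagged, "pass": score >= threshold}

# Usage: two contact records, one missing an email address.
contacts = [
    {"Name": "Ada", "Email": "ada@example.com"},
    {"Name": "Bo", "Email": ""},
]
result = assess(contacts, ["Name", "Email"])
print(result["score"])  # 0.75 — 3 of 4 required values populated
print(result["pass"])   # False — below the 0.9 threshold
```

In practice, the reassessment step simply means running `assess` again after remediation and comparing scores — which is exactly where a one-time, manual process breaks down and a repeatable one pays off.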
The first goal of a data quality assessment should be to quickly produce tangible results that inspire the next set of actions. Unfortunately, data quality assessments rarely do that. They often take too long and require diverse skill sets that fall outside those of teams driving the assessment. This combination can produce immediately outdated results. In the days or weeks it takes to review and process assessment results, the initially assessed data may have changed so much that the assessment results are no longer valid.
These considerations are especially true for customers on the Salesforce Platform. CRM data, after all, waits for no one.
An expensive, outdated data quality assessment that excludes business impact from its results will not be sticky with stakeholders. This lack of stickiness is one of the reasons that data quality assessments (and data quality initiatives) don’t get investment or backing from stakeholders. If stakeholders are not presented with relatable and actionable assessment results, they will continue to perceive their data quality to be sufficient — given their constraints.
Operating constraints lead companies to settle for data quality they perceive as sufficient unless they are presented with alternatives. Data quality assessments should counter this by delivering valuable insights more affordably — challenging those constraints.
Rethinking the Data Quality Assessment Approach
If we want data quality assessment results to be sticky and actionable by stakeholders — we have to think differently about how to perform the assessment, generate the results, and demonstrate business impact. Data quality assessments must be fast, leverage trustworthy data, show results in a consumable and relatable format, and tie results back to business impact.
Stakeholder and business needs are driving these data quality assessment requirements. As Salesforce announced at Dreamforce ’23, it wants to change how businesses work with customers — again. Its Data Cloud and Einstein 1 products are designed to ingest and unify customer data from multiple sources to drive digital interactions — powered, of course, by AI.
Data quality assessments must match the velocity at which companies capture, nurture, and leverage data for business outcomes. Companies will never have less data than they have today — and AI automation needs will only increase.
Use Cuneiform for CRM to Assess Salesforce Data Quality
Meeting these requirements calls for a different kind of tooling. Data quality assessments should be 100% declarative and customizable. They should create results in minutes. The results should be relatable and consumable by technical, business, and executive teams. And they should support automation.
Cuneiform® for CRM is a data quality monitoring solution available for free to Salesforce customers via the Salesforce AppExchange. As a 100% native Salesforce solution, Cuneiform® for CRM accelerates and automates data quality assessments in CRM orgs. It offers modern profiling capabilities, unprecedented scale, and a familiar interface with all the data management insights necessary to assess and monitor CRM data and metadata quality and reliability.
Capable of profiling millions of records in minutes, Cuneiform® for CRM provides the most comprehensive collection of data management insights in the AppExchange. It is also declaratively extensible, allowing users to measure the data quality of profiled records via customizable KPIs — and correlate those KPIs to business impact metrics on Salesforce objects.
If you don’t know the shape of your CRM data, have data quality issues with no way of measuring them or prioritizing them, or don’t have the resources to conduct a data assessment on your own — use Cuneiform® for CRM today to automate your data assessments.