The CRM Landscape

Go-to-market (GTM) teams rely on CRM platforms as their source of truth for account information to drive B2B and B2C strategy. Account data is inherently complex, and keeping a CRM platform up to date with high-quality, enriched account information is a major challenge.

Ultimately, an optimized CRM platform provides accessible information and insights, automated processes and updates, and embedded analytics within a GTM team’s workflow. The manual nature of data collection and the limited computational power of CRM platforms make this aspirational state difficult, if not impossible, to achieve.

At Snowflake, we experienced these same challenges in our Salesforce CRM implementation. Inconsistent data between the CRM system and the reporting tools used by the Sales team, combined with the lack of a reliable, stable system supporting GTM data, eroded trust in our data and left our GTM team unable to make strategic decisions about accounts at scale.

Early in our CRM journey, our Head of Sales made a seemingly simple request:

“Please make our account scoring better.”

The Sales team had been working with a legacy account scoring mechanism but had no context around the data they were provided and no insight into how to interpret the scoring model. Account scoring had become untrusted, unused, and irrelevant. And so we realized we actually had two problems to solve:

  • Improve the accuracy and interpretability of the account scoring mechanism.
  • Regain the Sales team’s trust in, and engagement with, account data and the scoring process.

To address the breadth and depth of the data challenge at hand, we took the following three steps:

1. Define CRM implementation requirements to ensure high data quality. Data quality is the foundation of data-driven strategic insights for all business applications—and it’s nuanced and complex in CRM systems.

2. Improve account enrichment with multiple internal and external data sources.

3. Redefine the scoring methodology to include data science and human intuition.

These overarching objectives now drive our Salesforce CRM strategy and continue to yield phenomenal outcomes for our business across growth, cost, and risk.

How We Improved Data Enrichment at Snowflake

We set out to change data enrichment by using Snowflake’s own solutions, because our CRM data was a key input to business decisions and needed to be improved.

  • Define. After reviewing what was leading to low data quality in the CRM system, we identified these leading drivers:
    – No data standards were in place.
    – No minimum data quality was defined or enforced.
    – There were no data enrichment processes.
    – There was a disconnect between the data available from vendors and the data in the CRM system.
  • Improve. We did the following for this step:
    – Set up minimum data quality standards for each entity and ensured uniform classification.
    – Used Snowflake Secure Data Sharing with data vendors to enable easier access and a faster data refresh cadence.
    – Wrote algorithms to find consensus on which data vendor source to pick (a minimal sketch follows this list).
    – Automated the process to run daily, with an ever-increasing number of data points being monitored and refreshed.
  • Redefine. As we shared the data and analytics tools with members of the Sales team, we soon learned the shortcomings of certain data partners and how far-reaching the implications of stale data could be, because the data is now widely used across the company in multiple critical reporting and planning processes. As a result, we moved toward a “hybrid ownership” approach, in which data providers and the Sales team are jointly responsible for the quality of the data shared via Snowflake. Hybrid ownership is enabled by giving the Sales team a system that encourages feedback on data quality through a number of feedback loops, which in turn creates a data ecosystem where data partners’ data flows through enrichment to the Sales team. The Sales team then verifies the data and provides feedback, which is reviewed by planning teams for approval.
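To make the consensus step concrete, here is a minimal sketch of how vendor values might be reconciled and checked against a minimum quality bar. The field names, vendor priority order, and tie-breaking rule are illustrative assumptions, not Snowflake’s actual implementation.

```python
from collections import Counter

# Hypothetical vendor priority, used to break ties when vendors disagree.
VENDOR_PRIORITY = ["vendor_a", "vendor_b", "vendor_c"]

# Illustrative minimum data quality standard: fields every enriched
# account record must carry before it is written back to the CRM.
REQUIRED_FIELDS = {"industry", "employee_count", "country"}


def meets_minimum_quality(record: dict) -> bool:
    """Return True if the record has a non-empty value for every required field."""
    return all(record.get(field) not in (None, "") for field in REQUIRED_FIELDS)


def consensus_value(field: str, vendor_records: dict):
    """Pick a single value for one field from several vendors' records.

    Rule used in this sketch: take the value most vendors agree on; on a tie,
    fall back to the highest-priority vendor that supplied a value.
    """
    values = [rec[field] for rec in vendor_records.values() if rec.get(field)]
    if not values:
        return None
    counts = Counter(values)
    top_value, top_count = counts.most_common(1)[0]
    if list(counts.values()).count(top_count) == 1:
        return top_value  # clear majority winner
    for vendor in VENDOR_PRIORITY:  # tie: defer to vendor priority
        value = vendor_records.get(vendor, {}).get(field)
        if value:
            return value
    return top_value


# Example: two of three vendors agree on industry, so the majority value wins;
# employee_count is a three-way tie, so the highest-priority vendor wins.
vendors = {
    "vendor_a": {"industry": "Software", "employee_count": 5200, "country": "US"},
    "vendor_b": {"industry": "Software", "employee_count": 5000, "country": "US"},
    "vendor_c": {"industry": "IT Services", "employee_count": 4800, "country": "US"},
}
enriched = {field: consensus_value(field, vendors) for field in REQUIRED_FIELDS}
print(enriched, meets_minimum_quality(enriched))
```

In a pipeline like the one described above, records that fail the minimum quality check would be flagged for review rather than written back to the CRM.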

How We Improved Account Scoring at Snowflake

Previously, human intuition drove the sales decision-making process, which introduced a higher degree of unconscious bias into sales territories. In addition, many potentially good accounts were never invested in. As the Sales organization grew, the complexity of the account scoring process increased, and it became increasingly difficult for the Sales team to construct, modify, and expand territories.

  • Define. We identified a number of pain points for this step:
    – Intuition does not scale.
    – Intuition is not data-driven, which leaves significant room for error.
    – The process was outsourced and not aligned with the business users, so outcomes did not match what the field needed.
    – Relying on intuition alone, without accounting for what the field needed, was a mistake.
    – There was a disconnect between the creators of the CRM processes and the end users about what information would help drive decisions and the sales cycle.

We decided to tackle this problem by using machine learning (ML) models that aimed to answer the following questions:

  • What is the potential of each account with Snowflake (total addressable market)?
  • How likely are we to acquire an account if we pursue it (account propensity)?

Snowflake’s Data teams tackled the problems and developed the first versions of ML models leveraging Snowflake’s data sharing features and infrastructure. 
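As an illustration of the account propensity question, here is a minimal sketch of how such a model might be trained on tabular account data. The feature names and the choice of scikit-learn’s gradient boosting are our own illustrative assumptions, not a description of Snowflake’s production models.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Illustrative account features; a real model would draw on many more signals
# (firmographics, technographics, estimated IT spend, engagement history, ...).
FEATURES = ["employee_count", "estimated_it_spend", "cloud_index", "web_visits_90d"]


def train_propensity_model(accounts: pd.DataFrame):
    """Fit a classifier that estimates how likely an account is to convert.

    `accounts` is expected to have the FEATURES columns plus a binary
    `converted` label (1 = the account became a customer).
    """
    X = accounts[FEATURES]
    y = accounts["converted"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42, stratify=y
    )
    model = GradientBoostingClassifier(random_state=42)
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"holdout AUC: {auc:.3f}")
    return model


def score_accounts(model, accounts: pd.DataFrame) -> pd.DataFrame:
    """Attach a 0-1 propensity score to each account for territory planning."""
    scored = accounts.copy()
    scored["propensity"] = model.predict_proba(accounts[FEATURES])[:, 1]
    return scored.sort_values("propensity", ascending=False)
```

The ranked output of something like `score_accounts` is what ultimately feeds planning and territory decisions; the model itself is only useful once its scores land in the Sales team’s workflow.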

  • Improve. For this step, we did the following:
    – Identified obsolete and redundant accounts that were creating noise in the ML models, and removed them.
    – Rescored accounts using both an internal process and an external data services vendor (EverString).
    – Conducted interviews with the teams to help them determine the outcome variables.
    – Created a human computation layer to account for human intuition (a minimal sketch follows this list).
  • Redefine.
    – By operationalizing the ML models, we learned a number of lessons about how to maximize the Sales team’s role in the sales processes. The most important lesson was that the team needs a very clear role in the process and a clear idea of what it aims to improve. By doubling down on their use case for the planning and account exchange/swap process, we were able to redefine the ML models’ roles and specialize them for particular tasks.
    – Propensity modeling focused on the speed of acquisition to encourage Sales Reps to focus their activity on “hot” accounts, which in turn shortened the sales cycle and enabled Sales Operations teams to assign better accounts and increase the number of active leads.
    – The total addressable market model focused on finding bigger and better accounts, which in turn enabled optimized segmentation, discovered using a combination of 100+ account features heavily aimed at understanding each company’s IT spend and cloud index.
    – In terms of results, both initiatives brought a number of improvements to teams across the company and improved how data is perceived and used. For the Sales organization, those improvements translated to better Sales Reps on better accounts and higher productivity. For Snowflake, the result was a better and more adaptable GTM strategy.
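To make the human computation layer concrete, the sketch below shows one simple way a model score and a rep’s adjustment could be blended and logged for review. The weighting scheme and field names are illustrative assumptions, not the mechanism Snowflake uses.

```python
from dataclasses import dataclass


@dataclass
class AccountScore:
    account_id: str
    model_score: float      # 0-1 propensity score from the ML model
    rep_adjustment: float   # -1 to +1 adjustment from the account's sales rep
    rep_note: str = ""      # free-text justification, reviewed by planning teams


# Illustrative blend: the model carries most of the weight, but a rep's
# field knowledge can nudge the final score up or down within bounds.
MODEL_WEIGHT = 0.8
HUMAN_WEIGHT = 0.2


def blended_score(score: AccountScore) -> float:
    """Combine the ML score with the human adjustment and clamp to [0, 1]."""
    raw = MODEL_WEIGHT * score.model_score + HUMAN_WEIGHT * score.rep_adjustment
    return min(1.0, max(0.0, raw))


# Example: the model rates the account 0.55, but the rep knows of an active
# cloud migration and bumps it; the note gives planners context for the override.
example = AccountScore("ACME-001", 0.55, 0.6, "Mid-migration to cloud, strong champion")
print(f"{example.account_id}: {blended_score(example):.2f}")
```

Keeping the rep’s note alongside the adjustment is what turns the override into a feedback loop: planning teams can review the justification before the blended score flows back into territory decisions.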

Ensuring Success

Operations teams must determine an optimal architecture that is flexible and agile enough to adapt to the future needs of the business. When managed incorrectly, a CRM platform can become clogged with inaccurate, noncritical, and outdated data, leading to incorrect decisions.

Having a successful GTM strategy requires integrating and reconciling many sources of data, running complex advanced analytics, and ensuring a cadence that keeps the platform data clean.