
Keys to Building Great B-to-B Measurement

June 16, 2015 | By Ross Graber

  • Best-in-class b-to-b measurement is the product of a deliberate, process-driven approach
  • Using the SiriusDecisions 2015 Summit Measurement Program of the Year winner, Dell, we highlight keys to success
  • Dell’s case study demonstrates why measurement programs require custom-fit strategies grounded in best practices

My favorite part of being an analyst is helping clients and watching them succeed. At SiriusDecisions Summit, I had the privilege of highlighting client successes in developing measurement, as we announced the winners of our Measurement Programs of the Year awards.

Great b-to-b measurement doesn’t just happen as a result of buying the right technologies or through a clever selection of metrics. Great measurement is the result of a deliberate, process-driven approach built around the business’s goals and visibility needs. It’s much easier said than done.

To illustrate what this looks like, I’ll highlight elements from the measurement journey of 2015 Program of the Year winner Dell. Roberto Conterno, director of global marketing operations, led many phases of the measurement effort and shared the team’s story with me. As you develop your organization’s measurement program, consider these keys to Dell’s success:

Focused on the business problem. Dell leadership needed greater visibility into marketing performance across regions. While operating at a significant scale, the organization had developed marketing reporting organically in many operating areas, and the results were difficult to compare. Leadership needed a more standardized and structured way to compare performance to streamline decision-making. At the same time, the team realized that any effort would need not only to provide visibility for executives but also to help field marketers improve their performance. These needs focused the measurement development effort.

Understood the stakeholders and their needs. While the project team had a strong measurement framework and metrics vision to kick-start the process, they understood the critical role of stakeholder participation in creating a comprehensive measurement program. They made an effort to identify the different reporting users and worked with them to understand the business questions they wanted to answer. This wasn’t a matter of asking stakeholders, “What metrics would you like to see?” but “What decisions are you trying to make?” and “What would you need to know to feel more comfortable making those decisions?” It was also a way to push back on non-critical metrics by asking, “How would that metric enable better decisions?” This process ensured that the metrics selected would collectively answer critical user questions.

Those same questions helped build a logical structure within and across dashboards. “What is and isn’t going well, and where?” questions guided most executive dashboard designs. The executive dashboards were designed to serve as heat maps to direct further inspection into atypical performance areas. Understanding that campaign managers and field marketers would be accountable for responding to executive inquiries, the team used the “Why are things going well or not?” question to guide development of the next layer of dashboards.

Applied hierarchy. As with many organizations, marketing was engaged in various activities executed for various purposes. The team recognized that its approach to measurement required organizing principles to impose order on what would otherwise be a broad, disconnected and difficult-to-decipher set of metrics. The team created a hierarchy that differentiated executive-level visibility needs from program-level needs focused on measurement areas such as demand creation, reputation and sales enablement. And they used our workstreams to build the different components.

Normalized the vocabulary. One of the leading challenges global organizations experience when comparing performance is ensuring that the underlying data means the same thing across all regions. Investments need to be captured and coded in the same ways; for example, if regions are going to produce cost-per-lead information, there must be a common understanding of demand creation spend. Dell standardized the vocabulary used at the point of data capture by developing standards, socializing those standards across the business and deploying technology to support the process.
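To make the vocabulary point concrete, here is a minimal sketch, not Dell’s actual system, of why normalization at the point of data capture matters. The category labels, mapping table and figures are all hypothetical: two regions code the same demand creation spend under different labels, so cost-per-lead is only comparable after those labels are translated into a shared standard.

```python
# Hypothetical mapping from region-specific spend labels to a standard vocabulary
STANDARD_CATEGORY = {
    "lead gen": "demand creation",
    "demand gen": "demand creation",
    "acquisition marketing": "demand creation",
}

def demand_creation_spend(line_items):
    """Sum spend whose normalized category is 'demand creation'."""
    return sum(
        amount
        for category, amount in line_items
        if STANDARD_CATEGORY.get(category.lower(), category.lower()) == "demand creation"
    )

def cost_per_lead(line_items, leads):
    """Demand creation spend divided by leads produced."""
    return demand_creation_spend(line_items) / leads

# Two regions reporting the same underlying activity under different labels
emea = [("Lead Gen", 50_000.0), ("Events", 20_000.0)]
apj = [("Demand Gen", 50_000.0), ("Events", 20_000.0)]

# After normalization, the regions' cost-per-lead figures are directly comparable
assert cost_per_lead(emea, 500) == cost_per_lead(apj, 500) == 100.0
```

Without the mapping step, a naive comparison would treat “Lead Gen” and “Demand Gen” as different spend categories and the regional numbers would silently diverge, which is exactly the problem standardized capture is meant to prevent.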

Evangelized the vision. It wouldn’t have been sufficient to develop measurement the organization didn’t understand or know how to use to improve performance. Dell articulated the end-to-end metric framework at a high level and actively used it to help the organization understand what would be measured, how it related to business performance and how it could be used to make improvements. The team spent time sharing this with stakeholders and educating them on how it worked – because measurement systems need to be used in order to be useful.

Developing b-to-b measurement is never a one-size-fits-all activity; measurement must always be custom-fit to the needs and strategies of the organization. Dell’s approach is chock-full of best practices that b-to-b organizations would be well-advised to follow as they develop measurement programs of their own.

Congratulations again to Roberto Conterno and the team at Dell for being recognized as a Measurement Program of the Year.

Ross Graber

Ross Graber is a Senior Research Director of Marketing Operations Strategies at SiriusDecisions. He brings over 15 years of b-to-b marketing experience, with a focus spanning marketing measurement, demonstrating ROI, data management, process development, marketing technology, customer marketing and sales enablement. Follow Ross on Twitter @rossgraber.
