Unlocking the Mystery of Meaningful Metrics
One of the biggest challenges in the association space is that, as staff, our business goals and daily tasks mirror those of the corporate world, only without the corporate-sized budgets and resource pools to help us accomplish them.
Marketing is an area where this challenge is particularly observable. Recently, the Chicago chapter of the American Marketing Association (CAMA) hosted an event for its Nonprofit Shared Interest Group, “Nonprofit Round Tables: Your Marketing Challenges and Triumphs,” where four roundtable discussion groups focused on different aspects of this unique challenge. I had the privilege of facilitating the “Metrics That Matter” group, where we talked about the difficulties we each faced in making meaningful and constructive sense of marketing data—website analytics, in particular.
The Rabbit Hole of Marketing
A shared frustration among many was inferring meaning from analytics data and translating it into actionable tasks. The Big Data Revolution has given marketers the ability to track and measure their efforts more than ever, but tying numbers to decisions in a meaningful way is a skill that continues to elude even the most experienced. Website data is among the hardest to analyze because there is rarely one clear way to interpret the numbers; what’s “good” depends largely on your own organizational goals, and it doesn’t always mean “up and to the right.”
Each participant shared examples that connected website visitor behavior to some other marketing effort, whether a traditional direct marketing piece or a social media campaign. These stories confirmed what we all knew and reinforced for one another: Websites should not be operating on their own. Each effort and its results could and should inform others, moving us toward an approach that integrates traditional and digital marketing, an especially important shift for associations, which still do a lot of direct mail, for example.
Integrating what we learn from our website with lessons we’ve learned from all other tactics enables us to create a cohesive and more effective experience for the end user. After all, we’re not looking at the numbers to learn about the website; we’re learning about our customers—what works for them and what doesn’t. Marketing data should perpetuate motion, telling us what to change or try next instead of simply evaluating how we did.
Story By Numbers
Another primary concern for participants in this group was figuring out how to effectively report on website performance to executives and “non-numbers” people. This comes as no surprise, considering the numbers are confusing even to those who are “numbers people.” (I live somewhere in the middle.) We landed on this: we should use the numbers to tell a story about what we’re doing and what our customers are telling us.
Effectively communicating the numbers behind our tactics requires exposition that includes context, with a heavy reliance on experiential and individualized knowledge of what your audience values and worries about. The setting includes organizational and individual histories. The characters are our volunteers and customers. Each tactic is like its own plot line. Placing the metrics within a storyline makes the reporting much more accessible.
To Each His Own
A layer of complexity in the world of analytics is a universal one, not unique to nonprofits. I mentioned earlier that there is rarely one way to interpret data. Participants discussed how this is true not only for a website’s metrics as a whole, but even for different points during the marketing cycle and for different tactics. Each of our organizations operates on a different annual meeting and product launch cycle, and each of our customers’ industries has its own cycles as well. Because we’re talking about numbers, you’d think there would be some way to land on a hard-and-fast rule, but the truth is that data is subjective—for marketers, at least.
This is why organizational goals and benchmarks are especially useful: they provide a reasonable expectation for performance and background on the personality of your audience. It’s always beneficial to be aware of average metrics for comparable sites, if only to set a goal. However, I’ve found that comparing data for comparable periods within your organization’s marketing cycles (for example, the three months prior to your annual meeting in previous years) often tells me more about how we’re doing and what’s working. When it comes to metrics, it’s especially important to ensure we’re comparing apples to apples.
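The apples-to-apples comparison can be sketched in a few lines. All of the numbers and names below are hypothetical; the idea is simply to total a comparable window (say, the three months before the annual meeting) in each year and compare year over year, rather than comparing unlike months:

```python
# Hypothetical monthly website sessions for the three months
# before each year's annual meeting.
pre_meeting_sessions = {
    2016: [12_400, 15_100, 21_800],
    2017: [13_050, 16_700, 24_300],
}

def window_total(sessions_by_year, year):
    """Total sessions in the comparable pre-meeting window for a year."""
    return sum(sessions_by_year[year])

def yoy_change(sessions_by_year, prev_year, this_year):
    """Year-over-year percent change for the comparable window."""
    prev = window_total(sessions_by_year, prev_year)
    curr = window_total(sessions_by_year, this_year)
    return round(100 * (curr - prev) / prev, 1)

print(yoy_change(pre_meeting_sessions, 2016, 2017))
```

A raw month-to-month comparison would swing wildly with the meeting cycle; comparing the same window across years isolates real growth.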
This conversation about the subjective nature of data led us to affirm one another’s resolve to keep trying and testing new things. Today’s world of marketing requires a posture of continuous learning, and we agreed that a culture of testing, where experimentation and adaptability are valued over one-size-fits-all tactics, is key to maintaining that posture. We talked about the never-ending process of analyzing customer journeys: considering and optimizing the path for a prospective customer or donor, removing friction points, and giving them something in return for taking the action we’d asked of them. Several people shared experiences that helped them learn what worked and resonated with their target audiences, and how they stuck with those things. What worked for one organization didn’t necessarily work for others, but consistency and strategy could be seen in each effort.
Community Is the Key
Each time I participate in an event hosted by the AMA, I’m reminded that these are my people and that the real key to success isn’t found in a best practice or some magical tool or tactic. The key to success is connection to the community of other marketing professionals, sharing lessons learned and even those yet to be learned.
Each of the other three groups focused on its own marketing challenge. The “Benchmark Reports: How Using Information from Benchmark Reports Improves Marketing Efficiency” group, facilitated by Dan Kaplan, marketing manager at the American Library Association (ALA), discussed how the annual Association Benchmark Survey by Informz, their digital marketing service, provided guidance as well as a standard against which to measure their own efforts. Through this process, he noted, staff had to quickly learn about e-mail marketing, and benchmarks were one way to acquire knowledge they otherwise wouldn’t have, such as typical open and click rates, the best days and times to send e-mails, the effect of frequency on response, the best length of subject lines, and the optimal number of links in an e-mail. Industry benchmarks show collective results, which can inform your own practices, especially when partnered with contextual data.
“How Automation Helps Do More with Le$$,” facilitated by Jake Cashman of the Commission on Rehab Counselor Certifications, discussed how automation can be especially useful for organizations with smaller staffs and budgets. Association staff are used to living at the intersection of limited resources and small budgets, so we are always on the lookout for ways to increase efficiency and effectiveness, which is why marketing automation has gained buzzword status. Participants in this group discussed how they determined categories of prospects and rules for triggering e-mails (simple versus complex), all part of the initial setup of a campaign. This step was where people felt the greatest need for input and guidance, but once the flows are established, the standardization of campaigns reduces costs, saves time, and optimizes ROI.
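The trigger logic participants described can be sketched as plain conditional rules. Everything below (field names, categories, campaign names) is hypothetical and illustrates only the simple-versus-complex distinction, not any particular platform’s configuration:

```python
def categorize(prospect):
    """Assign a prospect to a category based on engagement data."""
    if prospect.get("is_member"):
        return "member"
    if prospect.get("downloads", 0) >= 2:
        return "warm"
    return "cold"

def triggered_email(prospect):
    """Decide which automated e-mail (if any) a prospect's state fires."""
    category = categorize(prospect)
    # Simple trigger: one condition -- every cold prospect
    # enters the welcome series.
    if category == "cold":
        return "welcome-series"
    # Complex trigger: combined conditions -- warm non-members who
    # opened the last campaign get a membership pitch.
    if category == "warm" and prospect.get("opened_last_campaign"):
        return "membership-pitch"
    return None

print(triggered_email({"downloads": 3, "opened_last_campaign": True}))
```

Once rules like these are written down, the campaign runs itself; the up-front thinking about categories and conditions is exactly the setup step the group found hardest.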
Karen Schrimmer of the National Association for Healthcare Quality (NAHQ) facilitated the last group, “Presenting an Edgy Idea to Your Board,” where participants explored how data can be especially useful when proposing a new idea. She noted that this group had a lively, animated conversation about how to effectively present a concept that might be outside your board’s comfort zone and increase your chance of acceptance.
Mo’ Numbers, Mo’ Problems?
While the Big Data Revolution has made us all more accountable and, therefore, better at our jobs, the sheer abundance of data also creates a faux sense that the gap between what we’re doing and what we should be doing is massive. I say “faux” because I don’t think effective use of data is directly proportional to the amount of data you’re using, as if consuming more data meant doing a better job. It’s not about how much data; it’s about the right data. What are you trying to learn? What are you trying to achieve? Is your goal realistic and measurable? By tying very specific goals and questions to specific pieces of data, you are better able to determine your success and, therefore, your next steps.
Andy Crestodina, cofounder of Orbit Media Studios and a top-rated speaker and expert on all things web, makes a clear distinction between analytics reporting and analysis: “Website Analytics is a decision support tool, not just a reporting tool. Ask questions and find answers. Form a hypothesis, test it, and analyze the data.” If we’re hoping to use data to finalize or confirm anything, we’re doing it wrong. Though you will definitely find some answers hidden in that sea of numbers, the real aim is to ask more questions, test more theories, and, ultimately, have even more numbers to look at.
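The “form a hypothesis, test it, analyze the data” loop can be made concrete with a standard two-proportion z-test. The traffic and conversion numbers below are invented for illustration; the hypothesis is that a redesigned landing page (B) converts better than the original (A):

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: 120 conversions from 4,000 visits to page A,
# 165 conversions from 4,100 visits to page B.
z = two_proportion_z(conv_a=120, n_a=4000, conv_b=165, n_b=4100)
print(round(z, 2))  # values above roughly 1.96 indicate significance at the 5% level
```

Either way the test comes out, it answers one question and immediately raises the next: why did B win (or not), and what should we test next?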
June Pinyo is the content strategist in AMC’s Creative Media Services department and co-lead for the AMC Content Managers User Group, Marketing Special Interest Group, and Informz Process Owners Special Interest Group. For more tips and conversation on content marketing, follow June on Twitter.