How I measure marketing impact and why it's evolving in 2025
plus: your chance to win a $500 Amazon gift card
We’re partnering with Captivate Collective to put together the Customer Marketing Tech Landscape Report coming in Q1 2025, and we’d love your input.
Take three minutes to fill out this survey. You’ll help us create something super useful for you and be entered to win a $500 Amazon gift card.
There are three big things I’ve realized about marketing measurement:
1. I should’ve paid more attention in my undergrad Excel classes at IU (sorry, Dad)
2. Perfect attribution doesn’t exist; every attribution model is made up in the first place
3. There’s no one-size-fits-all measurement framework and set of metrics to use
That last one comes with a disclaimer.
When people talk about marketing measurement, they naturally want something they can copy, paste, and run with. It’s easier that way.
But here’s the reality: it depends.
Like so much in marketing, the best approach is the one that’s specific to your team and company—including how you measure success.
At UserEvidence, we’ve built our measurement strategy step by step, starting with the basics and improving as we go. This is what worked for us this year, where we’ve found gaps, and how we’re leveling up in 2025.
Fixing the foundation
When I joined UserEvidence, our sales and marketing data was… rough. I made RevOps a big priority early on, dedicating about two and a half months to cleaning up HubSpot workflows and lots of bulk historical CSV updates.
At the time, it felt like a slog—like I wasn’t doing “real” marketing. At least the kind of marketing that gets noticed externally.
Looking back, that investment set us up for success. It was probably one of my first big wins as a VP, right up there with overhauling everything about our website and implementing RevenueHero to fix our very broken lead routing process.
Cleaner data means faster, smarter decisions. And because of that critical foundation work early on, we have confidence in tracking core metrics like board-level KPIs for our GTM team and conversion rates across each stage of our funnel.
I also wrote all about building out our pipeline model if you want to go deeper on that.
My takeaway: Get the basics right first. There’s no point in obsessing over attribution, especially if your data is a mess.
The attribution dilemma
If you’ve worked in B2B marketing, you know attribution can only get you so far.
Personally, I don’t spend much time worrying about it.
We use a first-touch model with some agreed-upon logic and a 30-day look-back window. This has worked well for us so far.
But it also works well because we acknowledge that it has gaps and we know what those gaps are.
There’s a lot that never shows up in marketing dashboards: LinkedIn comments, Slack messages, DMs from your ICP.
That’s why I make a point to include screenshots of this stuff in quarterly board meetings and weekly marketing updates with Evan and Ray.
This year, I also prioritized self-reported attribution. It’s another part of the marketing measurement story.
And yes, I know self-reported attribution is flawed too.
Before, we included an optional “where did you hear about us?” field on the confirmation page after someone booked a demo.
We didn’t have much success with this setup. Most prospects simply skipped it.
At Wynter’s Spryng event in Austin, Trinity Nguyen from UserGems suggested making it a required field on the booking form itself, so we did.
It’s not perfect, but that small tweak gave us insights we couldn’t get before.
For example, while our attribution setup showed organic social drove 5.5% of inbound opportunities this year, self-reported data revealed it was actually 32.6%.
Here’s a little ChatGPT cheat code: I exported all the self-reported attribution responses (intentionally designed as an open text field) and mapped them to the existing values we use for the deal source and deal channel fields in HubSpot.
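For the mapping step, a deterministic first pass can also get you most of the way before (or instead of) handing the messy rest to ChatGPT. Here’s a minimal sketch of that idea — the keyword rules and channel names below are illustrative assumptions, not UserEvidence’s actual HubSpot deal source/channel values:

```python
# Hypothetical sketch: normalize free-text "Where did you hear about us?"
# answers into canonical channel values. Keywords and channel names are
# made up for illustration.

KEYWORD_TO_CHANNEL = {
    "linkedin": "Organic Social",
    "podcast": "Podcast",
    "google": "Organic Search",
    "webinar": "Webinar",
    "colleague": "Word of Mouth",
    "friend": "Word of Mouth",
}

def map_response(text: str, default: str = "Other") -> str:
    """Return the first channel whose keyword appears in the response."""
    lowered = text.lower()
    for keyword, channel in KEYWORD_TO_CHANNEL.items():
        if keyword in lowered:
            return channel
    return default

responses = [
    "Saw a post on LinkedIn",
    "A colleague recommended it",
    "Googled customer evidence tools",
]
print([map_response(r) for r in responses])
# → ['Organic Social', 'Word of Mouth', 'Organic Search']
```

Anything that falls through to "Other" is the smaller pile you'd review by hand or feed to an LLM.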
Comparing our first-touch attribution model to self-reported attribution tells a better data story.
My takeaway: Attribution can be misleading if you’re only looking at it one way. Combining system data with self-reported insights gives you a clearer picture.
Efficiency is the name of the game
Lately, we’ve been digging deeper into the efficiency of our marketing investments. Not just the direct results, but also how marketing impacts the broader GTM team.
Sure, some things can be measured as a dollar in, X dollars out. But marketing does more than drive direct results.
It has a ripple effect, influencing what our SDRs and AEs can accomplish. Investing in marketing supports their ability to source pipeline, qualified pipeline, and bookings.
This approach helps me answer bigger questions like:
“If we had another $100K to invest, where would it have the greatest impact?”
Breaking it down in a simple Google Sheet (screenshot below) has given us better visibility into where we’re seeing the highest returns and where we can invest smarter.
How we’re looking at the efficiency of our spend, minus the actual numbers. 🤫
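The math behind a sheet like this is simple: pipeline sourced per dollar spent, per channel. Here’s a tiny sketch of that calculation — every number and channel name below is made up for illustration, since the real figures stay in the sheet:

```python
# Illustrative spend-efficiency math. All channels and numbers are
# hypothetical placeholders, not real UserEvidence data.

channels = {
    # channel: (marketing spend, pipeline sourced)
    "Events": (40_000, 300_000),
    "Paid social": (25_000, 120_000),
    "Content/SEO": (15_000, 140_000),
}

# Rank channels by pipeline generated per dollar of spend.
ranked = sorted(
    channels.items(),
    key=lambda kv: kv[1][1] / kv[1][0],
    reverse=True,
)

for name, (spend, pipeline) in ranked:
    print(f"{name}: ${pipeline / spend:.2f} of pipeline per $1 spent")
```

A view like this is what makes the "where would another $100K have the greatest impact?" question answerable at a glance.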
Big shoutout to Kyle Lacy for helping me wrap my head around this.
He’s been a sounding board and lifesaver in more ways than I can count (honestly, he deserves his own Evidently edition, but I’ll save that for another day).
Taking measurement to the next level
We’re investing more in marketing in 2025, which means we need to be more confident about where those dollars go.
No matter how good our workflows are, the tools we have today limit our ability to get the full visibility we need.
I’ve set aside budget for a revenue attribution tool next year. Not because we’re obsessing over attribution or trying to start a credit war (let’s be real, no one wins those), but because we need to get smarter.
We need to understand what’s driving the right behaviors, so we can double down on what works and stretch every marketing dollar further.
I’m already looking at the usual suspects. Cue my email inbox and LinkedIn DMs blowing up. But if you’ve used tools in this category before, I’d love to hear about your experience. Happy to trade a marketing favor too.
Evidently is going on a little holiday break so I’ll see you next year (I’m officially that guy).
Stuff I’m digging this week
The Long Game with Dave Gerhardt—I asked Dave if I could interview him before Drive with a few catches: we’d do it over golf, ask questions I hadn’t heard him talk about before, and film it all. Check out the trailer and subscribe to our YouTube channel to get the first episode when it drops tomorrow.
MKT1 guide to annual planning—Emily Kramer’s three-part guide on annual planning has come in huge for me as I finish our marketing plan for next year. Definitely writing my own mini-series on annual planning in Evidently real soon.
More non-work content in my LinkedIn feed—Devin and Heike Young shot some hilarious content after recording Reed Between The Lines. The LinkedIn echo chamber is very real so I love seeing stuff like this in my feed. It makes me laugh and always catches me in the right moment.
Opinions are cheap. Proof is gold.
In the latest episode of The Proof Point, we got real meta talking about The Proof Point. I sat down with our senior content marketing manager Jillian Hoefer to highlight the wins, challenges, and opportunities for us to shake things up in season two.
My biggest call-outs:
We’re doubling down on proof-driven, actionable content next year that cuts through the noise and delivers real value.
Virtual interviews have gotten the job done, but event-based, face-to-face recordings are what we’re focusing on next year.
The podcast isn’t just for downloads. It’s a goldmine for social content, internal enablement, and even customer education.
UserEvidence, who?
UserEvidence creates customer evidence content for go-to-market teams, generating verified competitive intelligence, product stats, and ROI data.
Happy customers help you credibly prove the value of your product.