The TL;DR
Too many B2B companies create boring, repetitive, and uninspired content that fails to engage buyers or move the needle. It’s time to change that. To truly stand out, companies need to create original, value-packed content that inspires and educates.
We think The Evidence Gap is a great example of this. We created it using this playbook:
- Identify a clear gap—or disconnect—in your business or market you want to validate. For us, it was about better understanding the gap between Customer Marketing teams, sellers, and B2B buyers.
- Reverse engineer your survey by starting with the story you want to tell—your major claims and key headlines for the report are a great place to start. Then, work backward to pinpoint the questions you need to ask.
- Use a small, refined audience to “test” your survey and questions before distributing them to the entire panel.
Most B2B marketing content is pretty, polished, and perfectly useless for sales.
We’re obsessed with case studies that are so overdone they might as well be fiction. We plaster G2 badges everywhere like they actually mean something. (Not to mention big-name logos from companies that stopped working with us three years ago.)
Talk to your Sales team and you’ll get a dose of reality (real fast):
“We have four products and sell into 16 industries. How will this help my accounts?”
“We’re losing to [insert competitor name] every day. What do I have to show we’re better?”
“This doesn’t help me with mid-market deals or anything my prospects care about. Next.”
Our research for The Evidence Gap confirmed what our CEO Evan Huck and I have seen firsthand: B2B marketers think we’re doing better than we are.
We’re stuck on generic case studies and testimonials while buyers and sellers want more.
If you’ve ever wondered why original research has been getting more buzz lately, that’s exactly why.
Original research is built on unique data that your competition can’t copy or access.
Original insights are one of the few ways to develop a differentiated point of view and truly stand out in a world flooded with post-filled feeds. True, it may not be the easiest way to stand out (yet), but it’s definitely one of the best.
And it’s not just about having content that’s different; it’s about owning the conversation in a way that no one else can.
With survey data, we can create narratives and reports that are grounded in reality, backed by evidence, and—most importantly—aligned with what our audience truly cares about.
That’s exactly what we aimed to deliver with The Evidence Gap—and with over 700 downloads and counting, it’s clear we struck a chord. (Not to mention a HubSpot feature on the report.)
Evan and I did a deep dive on The Proof Point podcast about our process, learnings, and the strategy that made it happen, but I’ll share some of the best tactical takeaways with you here.
Original research starts with a “why”
The key to getting meaningful insights from an audience is to ask clear questions that get at the heart of the matter. (Thanks for that one, George Gallup.)
But to ask the right questions, you need a clear hypothesis—your why. That’s where we started with The Evidence Gap.
Evan perfectly captured the essence of our hypothesis:
“I’d chat with Product Marketing, Sales, and Implementation teams, and they’d all tell me they have big gaps in their customer evidence libraries. But when I’d talk to Customer Marketers, they’d say, ‘We’re crushing it. We created four case studies this year and got great quotes.’”
Sales’ response? “Sure, but those case studies don’t help us win deals because they’re not what buyers want.”
Anecdotally, the disconnect between Customer Marketers, sellers, and buyers was clear.
But did it really exist, and if so, how can GTM teams bridge the gap?
Those were the questions that shaped The Evidence Gap, and while the focus of your original content will be different, your path to “why” will be similar: pinpoint a question or gap in your GTM strategy or industry.
After we fine-tuned our overarching hypothesis, we moved on to survey design. Here’s our best advice:
Survey design: Land on the right questions by starting at the finish line
We started by outlining the story we wanted the report to tell. What was the dream claim we wanted to make? Once we had our vision, along with a hypothesis and a few potential headlines for the report, we worked backward to identify the data we’d need to tell that story and the questions we’d have to ask.
For instance, we wanted to explore whether sellers were genuinely fed up with the quality and diversity of content from Customer Marketing teams. To test this, we started with a simple question: “How often do you feel you have the content you need to close a deal?”
With that question, we could gather the data necessary to uncover a compelling stat and start crafting the narrative we wanted to share. (And, yes, sellers are frustrated, with 90% wishing they had better customer evidence.)
Building on that, we designed a survey of 15 to 20 questions tailored to three key audiences—marketers, sellers, and buyers—with 5 to 6 core questions consistent across all audiences, like this one:
The goal of asking a range of questions is to make sure you’re getting value from every angle. You can’t always predict which questions will lead to insightful answers, so casting a wide net increases your likelihood of striking gold.
Evan put it this way: It’s like throwing a bunch of lines into the water with different bait. You don’t know which one will get a bite, so you try a variety and wait for a nibble.
I’ll admit, this can get messy fast, especially if you’re surveying multiple audiences like we were. So, to keep things manageable (and your sanity intact), consider organizing your questions by persona. A simple Google Sheet worked wonders for us, helping us stay organized as the survey and report came to life.
Pro Tip: Throw in an unexpected question midway through your survey to make sure your audience is still engaged. For example, in a 20-question survey, your tenth question could be something like: “If you were stranded on a lifeboat in the middle of the ocean, would you rather have speedy wifi or a taco truck floating nearby?”
Aligning your hypothesis with your survey questions
I learned a ton by sending my first survey back in November 2023, but one lesson stood out: Sending a survey isn’t as simple as jotting down questions, gathering a bunch of email addresses, hitting send, and collecting the data.
It’s a process that demands thought, planning, and intention. (And yes, I’ll admit that we made it exponentially more challenging by basically sending one survey to three different audiences.)
That’s why, despite the marketer in me wanting to get everything to the finish line ASAP, we decided to slow down. We simplified where we could, moved with intention, and leaned into an iterative process that included a “test run” to make sure we were on the right track.
One of our core hypotheses, for example, was that marketers often see themselves as strong content creators, but their counterparts in Sales don’t always agree. (Evan shared a bit more about this hypothesis on the podcast, if you want to dive in deeper.)
To test this, we asked our panel a straightforward question: “How would you rate the effectiveness of the content from your Customer Marketing team?”
Simple enough, right? Not quite. Sellers, being naturally polite, were hesitant to openly critique their colleagues, even when their responses were anonymous.
So, we tweaked the phrasing to encourage more candid feedback without putting anyone on blast and landed on: “Do you wish you had better content to share with prospects and customers?”
One of our biggest learnings from this report: Running a quick test to check whether you’re asking the right questions (and wording them well) is a smart way to make sure you’re on the right path before committing to a full panel. Our approach? We tested our questions with a group of about 30 people before going out to the entire panel.
It’s an agile, low-risk approach that keeps you from making costly mistakes or diving into research that would drain resources without bringing you any closer to the claims you’re trying to prove.
Hindsight being 20/20, it was ambitious. Most surveys stick to one audience, but we launched one survey to three distinct audiences: buyers, sellers, and marketers. It was like running three separate surveys at the same time, with the added challenge of weaving all the data into a cohesive and compelling story.
While it wasn’t easy, this three-pronged approach was the only way to fully understand the evidence gap.
So, we polled 619 respondents across mostly tech-related industries—IT and services (40%) and computer software (26%)—with each group receiving a unique survey. The survey ran for two weeks, and once the responses were in, we spent a month analyzing the data, shaping the story, designing the report itself, and creating a promotion plan that would give the report the brightest spotlight possible.
How we promoted it (AKA the real reason we got 700+ downloads so quickly)
In true marketer’s fashion, there was a mad dash at the finish to get this thing out the door. But I had a nagging gut feeling that I needed to do something bigger for launch day to make sure we hit the right audience. So…I did something kind of reckless.
In the 11th hour, I reached out to a handful of influencers in my network whose audiences matched our ICP. I sent them the draft report and asked them to react to a specific stat or finding. Then, we added those quotes to the report for some additional social proof.
This wasn’t only a content play. Truth be told, it was mostly a distribution play. I knew that if I got buy-in from these influencers early, and featured them in the report, they’d be more likely to share it on launch day.
And, luckily, they did.
I was actually interviewing Jillian Hoefer, our current Senior Content Marketing Manager, for her role at the time, and she asked me in our final interview conversation: “What did you have to do to take over LinkedIn like that this week? You guys and the report were EVERYWHERE.”
What we learned
With a well-designed survey and strategic promotion that included an influencer LinkedIn campaign (check out some of the launch day buzz below) and a feature in my email newsletter, our report generated 700+ downloads in 30 days.

The big takeaway? Buyers are hungry for statistical evidence to guide their purchase decisions. Despite that, most GTM teams are falling short (even as their Customer Marketing teams celebrate their wins).
We also found:
- Relevance matters more than ever: Buyers want to know your solution will work for them. “Relevance and specificity are huge components of quality,” Evan explains. “Industries like financial services and healthcare have unique requirements and regulations, and buyers need to know you’ve been successful with others just like them.”
- Rigor isn’t optional: The stakes for B2B purchases have never been higher. A bad decision doesn’t just impact budgets; it can erode credibility and trust and put careers on the line. As Evan puts it, “If you make a big infrastructure investment and it doesn’t deliver, you’re not just losing $100,000, you’re losing your ability to advocate for future investments.” That’s why buyers are demanding data-backed evidence that justifies their spending and mitigates risk.
- The burden is on vendors: Logos and G2 badges might look nice on your website, but they’re table stakes at this point. “When I look at a G2 page,” Evan shares, “I’m not focused on the review scores. I’m diving into the reviews, especially the bad ones.” Buyers need more than surface-level proof—they need substantive, statistically significant, and relevant evidence proving this isn’t your first rodeo.
- The report won’t promote itself: Just like with any other piece of content, if you build it, (unfortunately) they will not come. Distribution and promotion on launch day are just as important as the hypothesis, the survey, and the data. Getting creative paid off for us, even if it was an 11th-hour idea.
Net-net: The Evidence Gap confirmed the existence of a gap between Customer Marketers, sellers, and buyers, and spotlighted the dire need for GTM teams to close it. The responsibility for filling it now falls on us, and we hope this report acts as a catalyst to get the ball rolling in the right direction.
Want to explore how UserEvidence can help you turn research into original content that builds authority? Demo UserEvidence Research Content today.