🔶 "We replaced 80% of support with AI, without tanking CSAT"

From layoff to leverage: how Databox restructured support and grew revenue using an AI Bot + smarter knowledge systems.

Read time: 13 minutes

Hey! Sam here 👋 I’ve spent the last 10+ years leading marketing and growth for early-stage tech startups, helping them scale from 0 to 8 figures. Every month, I share actionable insights to help founders and marketing leaders like you drive sustainable growth.

If you're scaling with limited headcount, you’ve probably wondered:

“How do we scale customer support without hiring 5 more people?”

Most founders never find out.

Databox did.

You don't forget a line like this:

"We replaced 80% of support with AI, without tanking CSAT."

That was the opening to a post by Pete. Not a headline. Not a joke. Not a vanity case study. Just a founder stating, matter-of-factly, that they cut most of their support team—and things got better.

The results:

  • 20-60% revenue increase from support-generated opportunities

  • AI CSAT scores: 29% to 75%

  • Support team now focuses on expansion deals, not basic questions

Naturally, I had questions.

Because I've worked with startups that dream of this moment: founders who want to automate and scale without headcount, but stall the moment they touch Intercom or a GPT.

So I tracked down the guy behind the scenes.

His name is Emil Korpar, Director of Customer Support at Databox. He's the one who implemented the AI bot, built the knowledge system, trained the team, and scaled support as a revenue function.

Here's what I learned from our conversation 👇


Why Most AI Bots Suck

Most founders think AI support is plug-and-play. Connect your help center, flip the switch, watch tickets disappear.

Wrong.

The problem? Your help articles are written for people who already understand your product. They answer "how to do X" but customers are asking "why isn't X working" and "what the hell does this error mean?"

Your help center says: "Click the Integration tab to connect your data source."

Your customer thinks: "I clicked it. Nothing happened. Is this broken? Did I break it? Should I panic?"

That gap between what you document and what people actually need? That's where AI either becomes your secret weapon or your customer service nightmare.

Most bots fail because they're trained on documentation written by people who know too much, for people who know too little.

🟡 TRY THIS: Audit your current content through the lens of actual customer questions, not internal product knowledge. Pull your top 10 most common support questions. Ask: "Would our help article actually help someone new to our product?" If 5+ are "no," you have a content problem, not an AI problem.

The 3-Platform Knowledge System That Outperforms Any Help Center

How to stop confusing customers and start helping them

Instead of cramming everything into one help center, Emil helped build 3 distinct platforms:

1️⃣ Platform 1: Help Center

  • How-to guides and feature explanations (e.g., "Here's how to build a dashboard")

  • Standard documentation most companies already have

2️⃣ Platform 2: Metrics Library

  • Data limitations customers don't expect

  • Edge cases that cause confusion

  • What actually counts toward each metric

This is where Databox documented all the stuff that makes customers go "wait, what?"

Example: Customer wants to track Twitter engagement. Your help center says "Connect your Twitter account." But what it doesn't say:

  • Twitter only syncs 90 days of historical data

  • Retweets count differently than you'd expect

  • If you disconnect and reconnect, you lose your historical data

Databox built a searchable database where every metric has its own page showing:

  • How much historical data it actually pulls

  • What counts toward that metric (and what doesn't)

  • Common gotchas that break the data

  • Which other metrics pair well with it

Why this matters: When customers click on "Twitter Engagement" in Databox, they see this technical info right in the product. The AI can access it too, but users get answers before they even need to ask.

Your version: What are the 10 things about your product that make customers go "that's weird" or "why doesn't this work like I expected"? Document those edge cases.
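If you want those edge cases in a format both your product UI and your AI bot can read, one simple option is a structured record per metric. Here's a minimal sketch in Python; the field names and example values are my assumptions, not Databox's actual schema (only the Twitter caveats above come from Emil's system).

```python
from dataclasses import dataclass, field

# Hypothetical schema for one "metric page" in a Metrics-Library-style doc.
# Field names and render format are assumptions, not Databox's implementation.
@dataclass
class MetricPage:
    name: str
    historical_window_days: int                               # how much history actually gets pulled
    counts_toward: list[str] = field(default_factory=list)    # what's included in the number
    gotchas: list[str] = field(default_factory=list)          # edge cases that break the data
    pairs_well_with: list[str] = field(default_factory=list)  # related metrics worth surfacing

    def to_markdown(self) -> str:
        """Render the page as plain text you can show in-product and feed to an AI bot."""
        lines = [f"# {self.name}",
                 f"Historical data pulled: last {self.historical_window_days} days"]
        lines += [f"- Counts toward this metric: {item}" for item in self.counts_toward]
        lines += [f"- Gotcha: {item}" for item in self.gotchas]
        lines += [f"- Pairs well with: {item}" for item in self.pairs_well_with]
        return "\n".join(lines)

# Example built from the Twitter caveats mentioned above.
twitter_engagement = MetricPage(
    name="Twitter Engagement",
    historical_window_days=90,
    counts_toward=["Likes", "Replies", "Retweets (counted differently than you'd expect)"],
    gotchas=["Disconnecting and reconnecting the account drops historical data"],
    pairs_well_with=["Follower Growth", "Post Impressions"],
)

print(twitter_engagement.to_markdown())
```

The point isn't the tooling; it's that every "that's weird" moment gets one canonical, machine-readable answer instead of living in a support rep's head.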

3️⃣ Platform 3: Databox Community
The "Use Cases" Library

This isn't a typical community forum. It's documentation disguised as real customer scenarios.

Example: Instead of a help article titled "How to Build a Sales Dashboard," Emil created a community post titled "How can I make sure I have visibility into my sales team's performance on a monthly basis?"

The post walks through:

  • Why a sales manager would want this (context your help docs skip)

  • Which specific metrics to track and why

  • How to set up alerts when things go off track

  • What this looks like in practice after 3 months

It's written in customer language, not product language. When someone searches "sales team performance," they find this instead of a generic "dashboard building" tutorial.

Your version: Take your 5 most common support requests and rewrite them as business scenarios. Instead of "How to export data," write "How to create monthly reports your CEO will actually read."

Don't ask "what should our AI know?" Ask "what confuses the hell out of our customers, and why?"

Example: Customer asks about historical data for a metric. Surface answer: explain data limitations. True insight: they want month-over-month performance analysis. Document both the technical answer AND the business use case.

🟡 TRY THIS: For the next week, have your support team tag every ticket with one word: "HOW" (how do I do X), "WHY" (why isn't this working), or "WHAT" (what does this mean). Whichever category gets the most tags is your AI starting point. Don't build all three systems at once—nail one category first.
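If your helpdesk can export tickets with those tags, a few lines of Python will tell you where to start. A minimal sketch, assuming a CSV export with a tag column (the file and column names are placeholders; adapt them to whatever your tool actually exports):

```python
import csv
from collections import Counter

# Count HOW / WHY / WHAT tags from a hypothetical helpdesk export.
# "tickets.csv" and the "tag" column are assumptions; adjust to your tool's format.
with open("tickets.csv", newline="") as f:
    tags = Counter(row["tag"].strip().upper() for row in csv.DictReader(f) if row.get("tag"))

for tag, count in tags.most_common():
    print(f"{tag}: {count}")

# Rough mapping from the biggest bucket to where your AI content effort goes first:
# HOW  -> help-center how-to articles
# WHY  -> Metrics-Library-style edge-case documentation
# WHAT -> plain-language explanations of errors and terminology
if tags:
    print("Start with:", tags.most_common(1)[0][0])
```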

The Revenue Twist: Support as a Profit Center

Here's where it gets interesting. Databox didn't just automate support; they turned it into a revenue engine.

Their support reps have dual KPIs: customer satisfaction AND revenue generation.

The results?

"20-30% more revenue compared to the previous quarter, which later moved closer to 40-50-60% of revenue within a quarter" 

Emil Korpar

How? When AI handles basic questions, support reps focus on discovery and opportunity identification. They're not just solving problems — they're uncovering expansion opportunities.

Emil's framework:

  • Use product usage data to identify high-potential accounts

  • Train reps on discovery techniques during support conversations

  • Create clear handoff processes to sales when opportunities emerge

  • Segment ruthlessly: high-potential accounts get the white-glove treatment. Low-intent users get efficient, AI-first support.

But here's the critical detail: They don't overwhelm support reps with competing priorities. Emil tried having reps do both reactive support and proactive revenue hunting — it didn't work.

Solution: Specialize your team. Some focus on reactive support with light opportunity identification. Others focus on proactive outreach and deep discovery.

🟡 TRY THIS: This week, have your support team end every conversation with: "Just curious, what originally brought you to [product feature] today?" Track the answers. You'll discover expansion opportunities you never knew existed. Start here before building any fancy systems.

Timeline Reality Check: It Takes a Year

Most founders expect 30-day wins. Here's what actually happens.

If you're looking for quick wins, this might not be for you. Emil's honest about the timeline:

"It took us three months to get everything set up and have it work, so that we could see, hey, this is actually moving in the right direction. It took us all together about a year to be satisfied with the results."

➡️ Months 1-3: Basic setup and initial content creation

  • AI CSAT score improved from 29% to ~50%

  • Focus: Stop the AI from embarrassing you

➡️ Months 4-6: Iteration based on performance gaps

  • Resolution rates steadily climbing

  • Focus: Reading between the lines on what customers really need

➡️ Months 7-12: Optimization and team restructuring

  • AI CSAT score hit 75%+

  • Focus: Revenue generation and advanced use cases

The metrics that convinced them it was working:

  • AI satisfaction scores climbing consistently

  • Resolution rates improving

  • Deflection rates increasing

  • Most importantly: First response time improved (they measure the % of chats answered within 5 minutes; a quick way to compute this yourself is sketched after this list)
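That last metric is easy to calculate yourself. A minimal sketch, assuming you can export chats with a created-at and first-response timestamp (the file and column names are placeholders, not a real Databox export):

```python
import csv
from datetime import datetime, timedelta

# Percent of chats answered within 5 minutes, from a hypothetical chat export.
# "chats.csv", "created_at", and "first_response_at" are assumed column names.
THRESHOLD = timedelta(minutes=5)

answered_fast = total = 0
with open("chats.csv", newline="") as f:
    for row in csv.DictReader(f):
        created = datetime.fromisoformat(row["created_at"])
        first_reply = datetime.fromisoformat(row["first_response_at"])
        total += 1
        if first_reply - created <= THRESHOLD:
            answered_fast += 1

print(f"Answered within 5 minutes: {answered_fast / total:.0%}" if total else "No chats in export")
```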

🟡 TRY THIS: Plan for a 12-month implementation cycle. Measure both AI performance and business impact metrics. Set some benchmarks before you start:

  • Month 1 target: AI handles 20% of inquiries without escalation

  • Month 3 target: AI CSAT above 40%

  • Month 6 target: AI CSAT above 60%

If you're not hitting these milestones, pause implementation and fix your content gaps before moving forward.

Step-by-Step Implementation Framework

The exact roadmap Emil wishes he'd had from day one

Here's the thing: most founders jump straight to AI without doing the groundwork. That's like trying to scale a broken process: you just get more chaos, faster.

Phase 1: Foundation (Months 1-3)

  1. Use your platform's "unanswered questions" feature to identify knowledge gaps

  2. Manually review your most frequently asked questions

  3. Start with existing help center optimization

  4. Set proper team expectations: Frame AI as giving them superpowers, not replacing them

Phase 2: Content Expansion (Months 4-6)

  1. Build your technical documentation equivalent (Databox’s Metrics Library)

  2. Create use case repository based on actual customer scenarios

  3. Implement revenue tracking for support conversations

  4. Begin training support reps on discovery techniques

Phase 3: Optimization (Months 7-12)

  1. Monitor AI satisfaction scores and resolution rates obsessively

  2. Use content suggestion features to identify new gaps

  3. Refine support-to-revenue handoff processes

  4. Consider team specialization (reactive vs. proactive roles)

Databox's content creation hack: They used a custom GPT trained on all their resources to help create community posts. Fed it the raw customer question and context, got structured use case content back.
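Emil didn't share exactly how that GPT was wired up, but the same idea is easy to prototype with any LLM API. Here's a minimal sketch using the OpenAI Python SDK; the model name, prompt wording, and helper function are illustrative assumptions, not Databox's actual setup. Treat the output as a draft for a human to review, not something to auto-publish.

```python
from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative instructions; Databox's actual custom GPT prompt isn't public.
SYSTEM_PROMPT = (
    "You write community 'use case' posts for a product knowledge base. "
    "Given a raw customer question and some context, return a post with: "
    "a title phrased the way a customer would search, the business context, "
    "which metrics or features to use and why, and setup steps in plain language."
)

def draft_use_case_post(raw_question: str, context: str, model: str = "gpt-4o") -> str:
    """Turn a raw support question into a structured use-case draft for human review."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Customer question: {raw_question}\n\nContext: {context}"},
        ],
    )
    return response.choices[0].message.content

print(draft_use_case_post(
    "How can I make sure I have visibility into my sales team's performance on a monthly basis?",
    "B2B SaaS, sales manager persona, monthly reporting cadence.",
))
```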

🟡 TRY THIS: In your next all-hands, say this exactly: "We're adding AI to make you better at your jobs, not to replace you. Here's what this means: less time answering 'Where is the export button?' and more time on expansion conversations that directly impact your quarterly bonus." Then show them the revenue opportunity numbers from companies similar to yours.

If this was helpful, forward it to a founder friend who's drowning in support tickets. They'll thank you for it.

Whenever you’re ready, there are two ways I can help you:

  • Follow me on LinkedIn for content like this during the week.

  • Work with me 1:1 to help you grow your business to $20MM ARR. Let’s chat.

    Ideal if you’re a founder doing $2M+ ARR

Until next time 👋

Sam

PS. Want the 6 biggest AI implementation mistakes from my conversation with Emil? Reply with "AI" and I'll send it over.

PPS. I’m here to help YOU grow. What’s the one thing you’re struggling with in your business right now?