


Recovery.com Director of B2B Marketing.




Behavioral health marketers hear the same mandate on repeat: be data-driven. But with dozens of dashboards and just as many opinions, it’s easy to confuse numbers with insight. In this conversation with Matthew Fung-A-Fat, a digital marketing leader at Acadia Healthcare, the fog clears. The focus shifts from “more metrics” to measuring what matters, designing simple experiments, and treating every click like a person asking for help.
Here’s a deep, actionable listicle that distills the episode into playbooks any treatment center—local IOP or national network—can put to work today.
Operators fixate on admissions. Marketers earn those outcomes by measuring what drives them. Track the whole path: website users → on-site engagement → conversion (call/form/chat) → qualified lead → admission. When the final number dips, knowing which upstream stage stalled lets teams fix the right thing, fast.
Core metrics to instrument: website users by source and medium, engaged sessions, conversions (calls, forms, chats), qualified leads, and admissions.
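As a rough sketch, that full path can be instrumented as a weekly funnel health check. The stage names follow the path described above; the counts are hypothetical:

```python
# Hypothetical weekly counts for the funnel described above.
funnel = {
    "website_users": 4200,
    "engaged_users": 1600,
    "conversions": 310,      # calls + forms + chats
    "qualified_leads": 140,
    "admissions": 52,
}

def stage_rates(counts):
    """Conversion rate between each adjacent funnel stage."""
    stages = list(counts.items())
    return {
        f"{a}->{b}": round(nb / na, 3)
        for (a, na), (b, nb) in zip(stages, stages[1:])
    }

rates = stage_rates(funnel)
# When admissions dip, the stage whose rate moved is the one to fix.
for step, rate in rates.items():
    print(step, rate)
```

Comparing these stage-to-stage rates week over week shows which upstream stage stalled, rather than just that the final number dipped.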
Not all calls are equal. Insurance questions, wrong level of care, and distance deflections all register as “a call” but predict different outcomes. Tag calls with basic qualifiers (viable/not viable + reason) and read transcripts. Even a simple binary gives far clearer optimization signals than raw volume.
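A minimal sketch of the viable/not-viable tagging idea, with invented call records and sources, showing how even a simple binary per source becomes an optimization signal:

```python
from collections import Counter

# Hypothetical tagged calls: (source, viable?, reason).
calls = [
    ("paid_search", True, "ready_to_admit"),
    ("paid_search", False, "wrong_level_of_care"),
    ("organic", True, "insurance_verified"),
    ("paid_search", False, "out_of_area"),
    ("organic", True, "ready_to_admit"),
]

def viable_rate_by_source(tagged_calls):
    """Share of viable calls per source, not just raw call volume."""
    totals, viable = Counter(), Counter()
    for source, is_viable, _reason in tagged_calls:
        totals[source] += 1
        viable[source] += is_viable  # bool counts as 0/1
    return {s: viable[s] / totals[s] for s in totals}

print(viable_rate_by_source(calls))
```

Two channels with identical call volume can have very different viable rates; this is the number worth optimizing toward.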
If marketing sees source/medium but admissions doesn’t, no one can connect campaign to call quality. Push source/medium and landing page data into the admissions view and bring admissions feedback back to marketers. Weekly walk-throughs turn disconnected analytics into shared decisions.
A universal “good dwell time” misleads. Inpatient/acute seekers want clarity and speed; shorter sessions with high call rates are healthy. Residential prospects research; more pages and longer time on site are normal. MAT visitors often want logistics and availability; design for quick answers and direct actions.
Practical rule of thumb: define an “engaged user” for each service line (e.g., ≥90 seconds + a meaningful interaction). Judge content by that benchmark, not a generic site average.
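The per-service-line “engaged user” rule can be sketched as follows; the thresholds and session records are assumptions for illustration, not benchmarks from the episode:

```python
# Hypothetical sessions: (service_line, seconds_on_site, meaningful_interaction?)
sessions = [
    ("detox", 45, True),
    ("residential", 240, True),
    ("residential", 80, False),
    ("mat", 95, True),
]

# Assumed per-line thresholds: detox/acute seekers convert fast, so a
# shorter dwell time still counts as engaged when paired with an action.
THRESHOLDS = {"detox": 30, "residential": 90, "mat": 90}

def is_engaged(line, seconds, interacted):
    """Engaged = meaningful interaction + line-specific minimum dwell time."""
    return interacted and seconds >= THRESHOLDS[line]

engaged = [s for s in sessions if is_engaged(*s)]
print(len(engaged), "of", len(sessions), "sessions engaged")
```

Judging each service line against its own definition avoids penalizing a detox page for doing exactly what its visitors need: getting them to a phone call quickly.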
Clever buttons (“Begin your journey”) create hesitation. Clear labels (“Submit a web form”) reduce uncertainty and lift conversions. In a multi-location test, the envelope icon + “Submit a web form” outperformed an icon alone or vague wording. People click when they understand exactly what happens next.
Experimentation is a means to reduce friction for people seeking care. Prioritize tests that shorten time to help: button copy and size, on-page phone visibility, insurance clarity, location proximity cues, and “what to expect” content. The question to ask before launching any test: Will this change make it easier to get care?
A network with 170+ locations can split tests by location cohort; a single site can split by time (A for odd days, B for even days) or traffic source. The point is to control exposure and gather enough signal. Keep the design simple: one primary hypothesis, one clear KPI, enough runway to reach confidence.
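The odd/even-day split mentioned above can be made deterministic with a one-line assignment rule. This is a sketch of the idea, not a full testing framework:

```python
import datetime

def assign_variant(visit_date: datetime.date) -> str:
    """Single-site time split: variant A on odd days, B on even days."""
    return "A" if visit_date.day % 2 == 1 else "B"

# Every visitor on a given day sees the same variant, which controls
# exposure without needing per-user randomization infrastructure.
print(assign_variant(datetime.date(2024, 3, 3)))
```

A deterministic rule like this keeps the design simple (one hypothesis, one KPI) and makes the exposure record easy to reconstruct later.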
Chasing lower cost-per-“time on site” looked great—until calls and admissions didn’t budge. When a proxy metric becomes the north star, budgets drift toward cheap impressions and away from real impact. If the business metric (qualified calls, admissions) isn’t improving, reset the optimization target—even if it means rethinking the attribution model.
If attribution is shaky, the answer isn’t always to optimize to “safer” metrics. Consider multi-touch or U-shaped models that respect both discovery and decision touchpoints. Then re-align bidding to the best available lead quality or call outcome signal, not just on-site behavior.
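A U-shaped (position-based) model can be sketched as a simple weighting function. The 40/20/40 split below is a common industry convention, not a figure from the episode:

```python
def u_shaped_credits(n_touches, end_weight=0.4):
    """Credit weights for a U-shaped model: 40% each to the first (discovery)
    and last (decision) touch, remainder split evenly across middle touches."""
    if n_touches == 1:
        return [1.0]
    if n_touches == 2:
        return [0.5, 0.5]
    middle = round((1 - 2 * end_weight) / (n_touches - 2), 6)
    return [end_weight] + [middle] * (n_touches - 2) + [end_weight]

# A four-touch journey: first and last touches get the bulk of the credit.
print(u_shaped_credits(4))
```

Weighting both ends respects discovery and decision touchpoints, which is the point of moving off a last-click-only view when the attribution model is shaky.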
Listening to calls is non-negotiable. Patterns in questions reveal content gaps, CTA ambiguity, and trust blockers. Pull phrases directly from callers into headlines and FAQs. When families ask, “How do I pay?” or “What treatment is right for my daughter?”, build resources with those exact words. That’s real SEO and real service.
Pro move: Start internal meetings by reading an anonymized web form or chat message aloud. It recenters the room on the human stakes.
Ad teams who build landing pages should regularly read patient testimonials. Tone and language become warmer, more welcoming, and nonjudgmental. It prevents performance copy from drifting into transactional shorthand and keeps empathy front and center.
Admissions hears conversion friction; clinical leaders ensure accuracy; BD hears provider objections and referral needs. A monthly, cross-functional feedback loop produces more accurate service descriptions, stronger referral resources, and messaging that reflects actual care.
When choosing between layouts, headlines, or proof elements, getting target-demographic feedback (“Do you trust this? What’s confusing?”) clarifies a path forward. It’s more expensive than an A/B test but can prevent weeks of guessing and produce a design that converts on the first try.
A strict opt-in consent model will slash visible analytics (think 85–90% less tracking). Plan for it. When event data is sparse, pause automated bid strategies, and lean on trend analysis across channels, time periods, and creative cohorts. Where regulations allow, explore state-specific consent banners that respect local law while maximizing clarity and trust.
Cookie loss doesn’t end optimization—it changes it. Aggregate by weekly cohorts (e.g., channel × campaign × landing page) and monitor directional changes in qualified leads and admissions. Compare matched weeks, control for seasonality, and use budget holdouts to estimate incrementality. It’s less flashy than real-time conversion feeds and surprisingly effective.
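The weekly-cohort aggregation can be sketched with the standard library alone; the records and field names below are invented for illustration:

```python
import datetime
from collections import defaultdict

# Hypothetical daily records: (date, channel, campaign, qualified_leads, admissions)
rows = [
    (datetime.date(2024, 3, 4), "paid_search", "detox_brand", 12, 3),
    (datetime.date(2024, 3, 6), "paid_search", "detox_brand", 9, 2),
    (datetime.date(2024, 3, 11), "paid_search", "detox_brand", 15, 5),
]

def weekly_cohorts(records):
    """Aggregate qualified leads and admissions by ISO week x channel x campaign."""
    out = defaultdict(lambda: [0, 0])
    for d, channel, campaign, leads, admits in records:
        key = (d.isocalendar()[:2], channel, campaign)  # ((year, week), ...)
        out[key][0] += leads
        out[key][1] += admits
    return dict(out)

print(weekly_cohorts(rows))
```

Comparing matched weeks from these cohorts (same channel, same campaign, seasonality controlled) gives the directional signal the paragraph describes, without per-user tracking.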
Before data warehouses, there is Excel (or Google Sheets). Export three streams: web analytics (users and on-site conversions by source/medium), call tracking (calls and their viable/not-viable tags), and the CRM (qualified leads and admissions).
Join on date and campaign, build a simple funnel view, and review weekly. When ready, pipe Sheets into Looker Studio for visual dashboards. Keep it boring; keep it reliable.
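The date-and-campaign join can be sketched in plain Python before it ever touches Looker Studio. The streams and field names below are hypothetical stand-ins for the exports:

```python
# Each stream keyed by (date, campaign), as the exports would be after cleanup.
ads = {("2024-03-04", "detox_brand"): {"spend": 250.0, "clicks": 40}}
calls = {("2024-03-04", "detox_brand"): {"calls": 6, "qualified": 3}}
crm = {("2024-03-04", "detox_brand"): {"admissions": 1}}

def join_streams(*streams):
    """Outer-join dicts keyed by (date, campaign) into one funnel row per key."""
    keys = set().union(*streams)
    return {
        k: {field: value for s in streams for field, value in s.get(k, {}).items()}
        for k in keys
    }

funnel_rows = join_streams(ads, calls, crm)
print(funnel_rows)
```

One row per (date, campaign) is exactly the boring, reliable funnel view the paragraph asks for; a spreadsheet VLOOKUP or a Sheets query accomplishes the same thing.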
The biggest failure point is basic plumbing: call tracking not swapping numbers, forms not posting to the CRM, chats not tagged by page. Fix the pipes before judging the ads.
A perfect ad is useless if the conversion point isn’t tracked.
Call trees (IVRs) are part of the experience. If callers stall at “press 2 for everything else,” test clearer wording, fewer branches, or direct-to-admissions routes during peak hours. Conversion isn’t only a web problem; it’s an operations flow problem too.
Detox pages should answer availability, insurance, location proximity, and what happens in the next 15 minutes. Residential pages should deliver program specifics, daily schedule, family involvement, length of stay, aftercare, and environment. Match content depth to the decision horizon.
Empathy isn’t a “soft” thing; it’s a KPI you can design for, define, and track.
If it can be observed, it can be improved.
AI will accelerate creative iteration, headline testing, and audience matching—and do it with more context than any team can process manually. The centers that win will pair AI’s speed with guardrails: compliant data use, bias checks, and human review for tone and accuracy. The north star doesn’t change: faster, clearer paths to life-changing care.
We believe everyone deserves access to accurate, unbiased information about mental health and recovery. That’s why we have a comprehensive set of treatment providers and don't charge for inclusion. Any center that meets our criteria can list for free. We do not and have never accepted fees for referring someone to a particular center. Providers who advertise with us must be verified by our Research Team and we clearly mark their status as advertisers.
Our goal is to help you choose the best path for your recovery. That begins with information you can trust.