Chat vs. Voice vs. Video: Which Sales Assessment Mode to Use (and When)

Should you assess SDRs with a phone simulation or a chat test? Should AEs do video roleplay? A decision framework based on role type and skills tested.

Tags: sales assessment, assessment modes, sales hiring

Should you assess your SDR candidates with a chat simulation or a phone call? Should your AE candidates do a video roleplay, or is voice enough? Is there a reason to use all three?

The answer depends on one question: what does the job actually look like?

A great cold caller might be a mediocre email writer. A polished video presenter might fall apart on an unscripted phone call. Assessment modes aren't interchangeable — they test different skills, reveal different strengths, and match different roles.

Here's how to choose.

The Modality-Matching Principle

The strongest predictor of job performance is a work sample — an assessment that closely mirrors the actual work. This principle has a specific implication for sales hiring:

Assess the candidate in the channel they'll actually sell in.

If the job is cold calling, the assessment should be a phone call. If the job is running Zoom demos, the assessment should be a video conversation. If the job is email prospecting, the assessment should test written communication.

This seems obvious. But most sales assessments default to whatever format is easiest to administer — usually text-based — regardless of whether the role involves any writing at all.

An SDR who makes 60 calls a day should not be evaluated primarily on their writing ability. An enterprise AE who runs video demos should not be assessed on how they handle a chat-based objection. The skills are different. The assessment should match.

Chat Assessment: The Written Sales Simulation

What it tests

Written communication. Structured thinking. The ability to craft persuasive emails, handle objections in text, build rapport without vocal tone or body language, and advance a conversation toward a next step — all through the written word.

Chat assessments also reveal how candidates organize their thoughts. In writing, there's no filler. No "um" to buy time. The response is the response. You see their thinking process clearly.

Best for

  • SDR/BDR roles focused on email outreach. If the rep's primary channel is email prospecting — cold outreach sequences, LinkedIn messages, follow-up emails — chat assessment tests the exact skill they'll use daily.
  • Inside sales roles with a writing component. Reps who split time between calls and email need competence in both, and chat tests the written half.
  • High-volume screening. Chat assessments are the fastest to complete (typically 15–20 minutes), require no audio/video setup, and work on any device. When you need to screen 50+ candidates for an SDR class, chat is the most scalable mode.

What it misses

Verbal skills. Tone. Confidence. The ability to think on their feet without time to compose a response. A candidate might write beautifully but stumble through a live conversation. Chat tells you half the story for roles that sell through speaking.

Ideal plan

Available on all plans, including Starter ($399/mo).

Voice Assessment: The Phone Call Simulation

What it tests

Real-time verbal selling. The candidate has a live phone conversation with an AI buyer — no scripts, no prep time, no rewrite button. This tests:

  • Talk-to-listen ratio. Do they listen more than they talk? The best sales reps typically maintain a 40/60 ratio — 40% talking, 60% listening. Candidates who dominate at 65%+ are pitching, not selling.
  • Response latency. How quickly do they respond when the buyer raises an objection or goes silent? On a phone call, a 4-second pause feels like an eternity. Response latency reveals how fast they think under verbal pressure.
  • Objection handling in real time. There's no backspace key on a phone call. When the buyer says "I'm not interested," the candidate's immediate response — before they have time to think — shows whether objection handling is a trained reflex or a knowledge gap.
  • Call control. Do they guide the conversation or follow it? Can they transition from rapport-building to discovery to next-step commitment without losing the buyer?
  • Tone and energy. Confident without being aggressive? Warm without being soft? The voice reveals things text can't.
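The talk-to-listen ratio above is simple to compute once a call has been diarized into speaker turns. A minimal sketch, assuming an illustrative segment format and using the thresholds from this article (nothing here is a product API):

```python
# Compute a candidate's talk share from diarized call segments.
# Each segment is (speaker, duration_seconds); labels are illustrative.
segments = [
    ("rep", 12.0), ("buyer", 20.0), ("rep", 8.0),
    ("buyer", 35.0), ("rep", 15.0), ("buyer", 10.0),
]

rep_time = sum(d for s, d in segments if s == "rep")
total_time = sum(d for _, d in segments)
talk_share = rep_time / total_time  # fraction of the call the candidate spoke

# 40/60 target from this article; 65%+ flags a rep who pitches, not sells
if talk_share >= 0.65:
    verdict = "dominating (pitching)"
elif talk_share <= 0.45:
    verdict = "on target (listening)"
else:
    verdict = "borderline"

print(f"talk share: {talk_share:.0%} -> {verdict}")
```

In this example the rep speaks 35 of 100 seconds, which lands on the listening side of the 40/60 target.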

Best for

  • Phone-heavy SDR/BDR roles. Cold calling, warm follow-ups, appointment setting. If the rep's primary tool is the phone, voice is the right assessment.
  • Inside sales. Discovery calls, demo scheduling, qualifying over the phone. Voice assessment captures the full skill set.
  • Second-stage screening. After a chat assessment filters candidates down to the top 30%, voice assessment adds a higher-fidelity layer for phone-first roles.

What it misses

Body language. Eye contact. How the candidate presents visually. For roles that involve video demos or face-to-face selling, voice captures the audio but not the visual dimension.

Ideal plan

Included on Growth ($699/mo) and Scale ($1,299/mo). Unlimited voice assessments.

Video Assessment: The Face-to-Face Simulation

What it tests

Everything voice tests, plus the visual dimension. The candidate joins a video call with an AI avatar that looks, sounds, and behaves like a real buyer. This is the highest-fidelity assessment available — the closest thing to watching a candidate run a live deal.

Video-specific signals:

  • Presence. How do they carry themselves on camera? Do they project confidence or shrink into the frame? In video selling, presence is half the message.
  • Body language. Leaning in during discovery. Nodding during active listening. Maintaining eye contact (looking at the camera, not the screen). Fidgeting during tough objections. These are involuntary signals that reveal comfort level and engagement.
  • Presentation skills. Can they hold attention through a 10-minute demo? Do they use visuals effectively? Do they check in with the buyer or monologue?
  • Reaction and recovery. When the AI buyer drops a curveball — "Actually, we're now looking at your competitor" — how does the candidate's face react before their mouth catches up? Video captures the moment of genuine response.

Best for

  • Account Executives. If the AE runs Zoom-based discovery calls, demos, and negotiations, video is the definitive assessment. It tests the exact context they'll sell in.
  • Enterprise sales roles. When the deal size justifies the investment, video assessment provides the highest-fidelity signal. For roles with $500K+ quota and complex multi-stakeholder deals, the $25 per assessment is trivial.
  • Final-stage evaluation. After chat or voice has filtered the top candidates, video assessment gives you the definitive look at who can perform under realistic conditions.
  • Sales manager candidates. Coaching conversations, pipeline reviews, and team scenarios all happen on video. Assess managers in the format they'll manage in.

What it misses

Honestly, very little. Video is the highest-fidelity mode. The trade-offs are practical, not informational: it costs more ($25/assessment on Growth; 50/month included on Scale), candidates need a camera and a quiet space, and it's overkill for roles that never involve video selling.

Ideal plan

$25/completed assessment on Growth ($699/mo). 50 included per month on Scale ($1,299/mo).

Combining Modes: The Multi-Modal Assessment

For roles where multiple selling channels matter, combining modes creates a progressively higher-fidelity evaluation:

Stage 1: Chat (screen 30 → 10)
Send a chat assessment to all candidates. It's the fastest, most scalable mode. Review scores and transcripts. Advance the top 10.

Stage 2: Voice (filter 10 → 5)
The top 10 from chat take a voice assessment. Now you see how they perform under real-time pressure. Different candidates surface — the brilliant writer who stumbles on the phone, and the average writer who comes alive in conversation. Advance the top 5.

Stage 3: Video (evaluate 5 → 2)
For your finalists, video assessment gives the complete picture. Body language, presence, visual selling skills. You now have three data points on each finalist: how they write, how they talk, and how they present.

Stage 4: Interview (decide 2 → 1)
The final interview is now targeted. You've already seen these candidates sell in three channels. The interview focuses on culture, team fit, and probing the specific gaps the assessments identified.

Each mode is a progressively higher-fidelity filter. By the time you interview the final 2, you've already seen them sell in every channel your team uses.

The Decision Framework

| Role Type | Primary Mode | Secondary Mode | Recommended Plan |
| --- | --- | --- | --- |
| SDR/BDR (phone-first) | Voice | Chat (pre-screen) | Growth |
| SDR/BDR (email-first) | Chat | – | Starter |
| Inside Sales | Voice | Chat (pre-screen) | Growth |
| Account Executive | Video | Voice (pre-screen) | Growth or Scale |
| Enterprise AE | Video | Voice (pre-screen) | Scale |
| Sales Manager | Video | – | Scale |
| Customer Success (sales component) | Voice | Chat | Growth |
| Solutions Engineer / SE | Chat | Video (demo scenarios) | Growth |
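The framework above is easy to encode as a lookup in your own screening tooling. A sketch assuming the role names and plan tiers from the table (the function and data structure are illustrative, not a product API):

```python
# Role -> (primary mode, secondary mode, recommended plan), per the table above.
FRAMEWORK = {
    "SDR/BDR (phone-first)": ("voice", "chat (pre-screen)", "Growth"),
    "SDR/BDR (email-first)": ("chat", None, "Starter"),
    "Inside Sales": ("voice", "chat (pre-screen)", "Growth"),
    "Account Executive": ("video", "voice (pre-screen)", "Growth or Scale"),
    "Enterprise AE": ("video", "voice (pre-screen)", "Scale"),
    "Sales Manager": ("video", None, "Scale"),
    "Customer Success (sales component)": ("voice", "chat", "Growth"),
    "Solutions Engineer / SE": ("chat", "video (demo scenarios)", "Growth"),
}

def recommend(role: str) -> str:
    """Return a one-line assessment recommendation for a role."""
    primary, secondary, plan = FRAMEWORK[role]
    modes = primary if secondary is None else f"{primary}, then {secondary}"
    return f"{role}: assess via {modes} ({plan} plan)"

print(recommend("Enterprise AE"))
```

Keeping the mapping in one place makes it trivial to audit that every open req is matched to the channel it will actually sell in.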

Three Rules of Thumb

Rule 1: Match the channel to the job. If they'll sell on the phone, test them on the phone. If they'll sell on Zoom, test them on Zoom. The obvious answer is usually the right one.

Rule 2: Use the fastest mode for volume, the richest mode for finalists. Chat for screening 50. Voice for evaluating 10. Video for deciding between 3. Efficiency at the top, fidelity at the bottom.

Rule 3: Don't over-assess. An SDR doesn't need a video assessment. An email-first BDR doesn't need a voice assessment. More modes ≠ better decisions. Better-matched modes = better decisions.


Build your first assessment scenario in 10 minutes. Choose chat, voice, video — or all three.

Start Free

Next step

See how Miki turns editorial thinking into a live hiring workflow.

Browse more essays or jump straight into a product demo if you want to see the assessment layer behind the ideas.

Book a demo · See Miki in action