
What Council Evaluators Actually Look For

You can read every published evaluation criterion and still lose. Here's what actually happens when evaluators score your bid.

Where This Comes From

This guide is based on conversations with former procurement officers from NHS trusts, county councils, and housing associations. Between them they've evaluated thousands of bids - and watched the same mistakes cost SMEs contracts, over and over.

How Scoring Actually Works

Most public sector tenders use a 0-5 scoring scale. The published criteria tell you what they're scoring. They don't tell you how evaluators actually read your response and decide between a 3 and a 4.

0-1 — Unacceptable / Poor: Missing, non-compliant, or doesn't answer the question

2 — Acceptable: Basic response, some relevant content, lacks detail

3 — Good: Competent response, addresses requirements, some evidence

4 — Very Good: Strong response with clear evidence, exceeds minimum requirements

5 — Excellent: Outstanding response, demonstrates innovation, comprehensive evidence

The Difference Between 3 and 4 (Where Contracts Are Won)

"Most SME bids score 3. They answer the question, they're competent, they're fine. But 'fine' doesn't win. The bids that score 4 give me specific evidence I can reference in my evaluation notes. Numbers, dates, named clients, measurable outcomes."

— Former NHS Trust Procurement Manager

Score 3 Response

"We have extensive experience in cleaning healthcare facilities. Our trained staff follow all relevant guidelines and we maintain high standards of cleanliness. We will ensure your premises are cleaned to the required standard."

Generic, no evidence, could apply to any contract

Score 4-5 Response

"We currently deliver cleaning services to Royal Devon NHS Trust (ref: Sarah Johnson, 01234 567890), maintaining 98.7% compliance with PLACE standards across 47 clinical areas. Our 12 NHS-trained operatives complete weekly COSHH refreshers via our in-house learning portal. We will replicate this approach at your site..."

Specific evidence, named reference, measurable KPIs

What Evaluators Actually Do (Behind The Scenes)

1. They Skim First

Evaluators have dozens of bids to score, so they'll skim your response in 30 seconds before reading it properly. If they can't quickly see that you've addressed the key criteria, you've already lost points. Use headings that mirror the question. Bold key information. Make it scannable.

2. They Look For Reasons To Differentiate

When 5 bids all say "we have experience in this sector," evaluators need something to separate them. Specifics are your differentiator. Names, numbers, dates, KPIs - anything concrete that proves you're not just claiming experience.

3. They Need To Justify Their Scores

Evaluators have to write notes explaining why they scored you a 3 vs a 4. If you don't give them quotable evidence, they can't justify giving you high marks - even if they think you'd do a good job. Make their job easy: give them sentences they can copy into their evaluation notes.

4. They Cross-Reference

Smart evaluators will check if your methodology matches your staffing, if your pricing matches your proposed resource levels, if your case studies are relevant to the contract scope. Inconsistencies between sections raise red flags and can drop your score across multiple questions.

The Criteria They Don't Publish

Beyond the official scoring matrix, evaluators are (consciously or not) asking themselves:

  • "Will this supplier be a nightmare to manage?" - Vague methodology, no clear contact structures, or unrealistic promises signal problems.
  • "Do they actually understand what we need?" - Generic responses that could apply to any contract lose to tailored responses that show you've read the specification carefully.
  • "Can I defend choosing them to my boss?" - If something goes wrong, evaluators need to show they made a defensible decision. Evidence-rich bids give them that protection.
  • "Is this price realistic?" - Prices significantly below competitors raise concerns about quality, hidden costs, or ability to deliver.

Quick Wins: What Scores Points

Do This

  • Name specific client references with contact details
  • Include measurable KPIs from current contracts
  • Use headings that mirror the question structure
  • Show you've read the specification (reference page numbers)
  • Explain HOW you'll deliver, not just WHAT you'll deliver

Avoid This

  • "We have extensive experience..." (no evidence)
  • Generic responses copied from previous bids
  • Walls of text with no formatting
  • Over-promising on timelines or quality
  • Ignoring word limits (often disqualifies)

Score 4-5 On Your Next Bid

Our AI generates responses that hit every evaluation criterion with specific, evidenced content. You supply your company details - we write responses that score.