Evaluating a Candidate's English Level: A Practical Guide for Hiring Managers
Updated: 28 March 2026
Hiring the right person often comes down to whether they can actually do the job in the language it requires. A CV might say “fluent English,” but what does that mean in practice? Can they lead a client call? Write a proposal? Handle a tense negotiation?
Here is a practical framework for evaluating English levels during recruitment, updated for how teams actually hire in 2026.
Why self-assessment is unreliable
Candidates regularly overestimate their language ability. Studies of language self-assessment repeatedly find that self-reported levels can be off by one or even two CEFR bands. A candidate who writes “advanced English” on their CV might struggle to hold a technical conversation beyond rehearsed phrases.
This is not about honesty. Most people genuinely believe they are stronger than they are because they have never been tested in a structured way.
The AI writing problem
This has become more complicated with AI writing tools. A candidate whose emails are polished and fluent may be relying heavily on ChatGPT or Grammarly. That is fine for written communication, but it tells you nothing about their ability to speak in a meeting, respond to a question on the spot, or handle a phone call.
When evaluating English in 2026, you need to assess both written and spoken ability — and you need to know whether the written ability is genuinely theirs.
Understanding the CEFR framework
The Common European Framework of Reference for Languages (CEFR) remains the standard for describing language levels across Europe. Here is a quick guide to what each level looks like in a professional context:
- A1-A2: Can handle basic greetings and simple written instructions. Not suitable for roles that require working in English.
- B1: Can participate in routine meetings with preparation. Reads standard emails but may miss nuance.
- B2: Functions independently in English. Can present, negotiate, and write reports with reasonable accuracy. This is the minimum for most international roles.
- C1: Handles complex professional situations with ease. Can catch subtlety, adapt tone, and manage sensitive conversations.
- C2: Near-native proficiency. Rarely needed and often overstated as a requirement.
Practical assessment methods
1. Structured interview tasks
Instead of asking “How is your English?”, build English into your interview process naturally:
- Conduct one interview round entirely in English
- Ask the candidate to summarise a document or article on the spot
- Give them a short role-play scenario relevant to the position (e.g. “A client has just called to say they are unhappy with the delivery timeline. Walk me through how you would handle that call.”)
2. Written assessment
Ask candidates to complete a task that mirrors actual job requirements:
- Draft a short email responding to a client complaint
- Write a summary of a meeting based on brief notes you provide
- Review and edit a paragraph with intentional errors
Important: ask them to complete written tasks in a supervised environment or during a live interview, not at home. This helps you assess their actual writing ability rather than their ability to use AI tools.
3. Remote evaluation techniques
For remote hiring, which is now standard for many European companies:
- Use video calls with the camera on to observe body language and confidence
- Share a screen and ask them to talk through a document or spreadsheet
- Send an asynchronous task with a reasonable deadline to see how they perform without pressure
- Include a spontaneous follow-up question during a live call about the async task they submitted — this tests whether they truly understood what they wrote
4. AI-assisted screening
Language assessment platforms have evolved significantly. Tools like Duolingo English Test, EF SET, and Linguaskill provide quick, standardised scores that correlate well with CEFR levels. These work well as a first filter, but they should never replace a conversation.
Newer platforms also offer AI-powered speaking assessments where candidates respond to prompts and receive automated scores. These can be useful for high-volume hiring, but for roles where communication quality matters, a human evaluator remains essential.
What level do you actually need?
Before evaluating candidates, be honest about what the role requires. Many job descriptions ask for C1 English when B2 would be perfectly adequate. Inflating language requirements shrinks your talent pool unnecessarily and can introduce bias against otherwise excellent candidates.
Ask yourself:
- Will this person be writing client-facing documents in English?
- Will they attend meetings where English is the primary language?
- Do they need to understand and produce technical content?
- Or do they mostly need to read emails and chat informally with colleagues?
Match the requirement to the reality, not to an idealised version of the role.
Consider the trajectory, not just the snapshot
A candidate at B2 who is motivated, articulate, and improving rapidly may outperform a C1 speaker who is complacent. Language level is not fixed — it is a moving target. Factor in potential, not just current ability.
Red flags during evaluation
Watch for these signs that a candidate’s English may not match their stated level:
- They respond to complex questions with rehearsed phrases
- They avoid follow-up questions or redirect to safer topics
- Written samples are noticeably stronger than spoken English (suggesting heavy use of correction tools)
- They struggle with numbers, dates, or spontaneous explanations
- They cannot paraphrase — if they can only say something one way, their range is limited
Building language development into the role
Sometimes the best candidate has B1+ English and strong potential. Rather than rejecting them, consider investing in targeted language training as part of their onboarding. A well-designed programme can move someone from B1 to B2 in six to twelve months, and you get a loyal employee who knows you invested in them.
This is particularly relevant in competitive hiring markets. If you are struggling to find candidates who meet both the technical and language requirements, adjusting the language bar and providing training can open up a much larger talent pool.
Evaluating English across cultures
Be aware that communication styles vary across cultures, and this can affect how you perceive a candidate’s English ability:
- Direct cultures (Netherlands, Scandinavia, Germany) may sound more fluent because their communication style is concise and assertive
- Indirect cultures (Japan, Korea, parts of Southern Europe) may sound less confident even when their English is strong, because they use more hedging and politeness strategies
- Accent is not level — a strong accent does not mean weak English. Focus on clarity, accuracy, and communication effectiveness, not on whether someone sounds like a native speaker
Key takeaways
- Never rely on self-assessment alone
- Be aware that AI tools can mask writing weaknesses — test spoken English separately
- Match the CEFR level to real job requirements, not aspirational ones
- Use practical tasks that mirror the actual work
- Consider language training as part of your talent strategy, not a barrier
- For remote roles, evaluate across both synchronous and asynchronous tasks
- Assess communication effectiveness, not accent
Getting language evaluation right means better hires, fewer surprises, and teams that actually communicate effectively from day one.