The DVSA have published their Annual Report and Accounts – you can read it for yourself here.
In Summary: The DVSA’s view of 2024‑25 portrays a year of consolidation and delivery. They highlight continued emphasis on road safety and customer service, particularly through efforts to reduce car test waiting times and improve test pass rates by better preparing learners. The agency also emphasises improving digital services, investing in its people (notably apprenticeships and wellbeing), and recovering areas stretched by previous test backlog efforts.
With tight governance, efficiency targets, and a fees strategy aiming to fully recover costs, the tone is one of confidence – building on recent gains and returning to a “business as usual” operating model after a period of intense demand.
However, at the DITC we feel that many outside the DVSA would not agree. We try to act as spokespeople within the industry. Supported by our members, we shine a light to challenge and highlight injustice while acting as a voice for the ADIs and PDIs on the front line. We therefore decided to hold the DVSA to their own standards: those of the SC1 form used to mark instructor competence. We did so using AI to ensure a fair, even-handed view – and there were a lot of pages to get through, considering such little progress. Let’s see how they did… (The following is generated by ChatGPT.)
➤ Lesson Planning (Total: 3/12)
- Identify goals/needs – 1
DVSA clearly states objectives like reducing wait times, sustainability, and EV integration, showing awareness of their “pupils” (stakeholders).
- Lesson structure – 0
There’s no structured plan detailing milestones or staged initiatives, just broad themes with little operational breakdown.
- Practice areas – 1
They note geographic disparities (“London and the South East” struggle most) and promise region-specific action.
- Adaptive planning – 1
Some adaptability is evident, e.g. deploying all qualified examiners temporarily, but the report lacks evidence of evolving plans throughout the fiscal year.
➤ Risk Management (Total: 6/15)
- Shared responsibility – 0
No clear allocation of accountability; blame shifts between “high demand” and “staff shortages”.
- Clear instructions – 1
Targets are stated (e.g. the 7-week wait), but lack transparency on implementation steps and timelines.
- Awareness of context – 2
The report acknowledges regional disparities and staff challenges, showing some situational awareness.
- Timely intervention – 2
Efforts such as recruiting examiners, deploying staff, and sending reminders highlight active interventions, but these often trail behind the issue.
- Safety feedback – 1
There’s financial and sustainability data, but no real feedback loop to instructors, learners, or examiners.
➤ Teaching & Learning Strategies (Total: 6/24)
- Style tailored to pupil’s needs – 1
They reference inclusive digital services and D&I strategies, but don’t define how regional or learning needs will shape delivery.
- Encouraging learner responsibility – 1
Campaigns like “Ready to Pass?” encourage preparedness but lack measurable results.
- Use of examples to clarify outcomes – 0
No case studies, regional snapshots, or data stories are included.
- Technical accuracy – 2
The report embeds concrete metrics (test counts, wait times, emissions targets), indicating thoughtfulness.
- Timely feedback during delivery – 1
Quarterly performance monitoring is referenced, but not shared publicly or with stakeholders.
- Follow-up on queries – 1
They mention ongoing consultations (e.g. on EVs and climate adaptation), but no clear stakeholder engagement summary.
- Non-discriminatory manner – 2
D&I, inclusivity, and wellbeing appear throughout the report; this is one of the stronger areas.
- Encourage reflection – 0
There’s no mention of reflective practices, internal reviews, or support for capability building among staff or contractors.
➤ Summary Scores
- Lesson Planning: 3/12
- Risk Management: 6/15
- Teaching & Learning Strategies: 6/24
- Total: 15/51 → Fail, with particular weakness in the essential planning and feedback competencies.
- Risk Management: 6/15 = immediate fail (on the SC1, a low risk management score is an automatic fail regardless of the overall total).
❗ Examiner-style Feedback
You’ve shared high-level ambitions – reducing test waiting times, integrating EVs, and hitting environmental targets – but this annual report reads more like corporate posturing than an actionable learning intervention. You know where you want to go, but not how you’ll get there, or how you’ll know when you’ve succeeded.
Key issues:
- Lesson planning is weak—no schedule of milestones or regional rollout plans.
- Risk sharing is vague; you signal issues but don’t specify who’s accountable.
- Feedback loops are underdeveloped—there’s no system for instructors, learners, or examiners to shape policy.
- Reflective practice and adaptive learning are missing. There’s no evidence you’re learning from experience or adjusting mid-term.
There are bright spots: acknowledgment of regional variance, explicit delivery metrics, and an inclusive internal culture. Still, these are overshadowed by a lack of execution clarity and stakeholder engagement.
That is ChatGPT’s take on the report. What do you think? We’d love to hear from you.
Posted by Chris Bensted
July 21, 2025