
Improving and innovating in education: our collaborative approach

30 Mar 2021

Brendon Edmonds

Head of Education

Our new corporate strategy includes a commitment to 'Continuously improve and innovate' the way we regulate. This means understanding the experiences of education providers as they engage with us, and ensuring the processes and outcomes we deliver with providers remain proportionate and efficient. Working collaboratively in this way helps us provide public confidence in the training routes of future registrants, and of existing registrants seeking to extend their scope of practice. This blog highlights our work in recent months to fulfil this commitment.

Piloting our new model of quality assurance

We are well under way with the pilot of our new quality assurance model, which we hope to implement fully at the start of the 2021-22 academic year. This work will transform our relationships with education providers and see us working more closely with sector partners and professional bodies to share data and intelligence.


By the end of the pilot, we aim for the new model to deliver proportionate and consistent risk-based outcomes, operate efficient and flexible quality assurance processes, and use a range of data and intelligence sources to inform decision making.

Findings from our first pilot cycle demonstrate that the model gives providers and our visitor assessors more flexibility, and adds value for everyone involved in assessing the quality of programmes. Education providers and visitors are very supportive of our institution-wide approach to meeting standards and of delivering our approval process in focused stages. Our issues-led, flexible approach to assessment has also been well received. This is all positive and indicates the model itself is sound, but it is important to note these early results are based on a small sample. The next two pilot cycles will give us better insight into the scalability of the model across a wider profile of providers.

We have also identified a number of incremental improvements from the first pilot cycle, which we will address in our next cycle between March and May. These include:

  • Improvements to guidance for providers and visitors
  • Clearer definition of key milestones through issues-led assessment
  • Ensuring providers understand why we may ask for more information through monitoring, and how good performance is incentivised throughout
  • Continuing to develop our online services to support provider and visitor needs

Read more about our new quality assurance model >

Education provider reflections on our work

We recently surveyed education providers and visitor assessors, asking them to reflect on a wide range of areas regarding our performance over the past two years.

Around 41% of approved providers responded to the survey, an overall increase in responses compared to previous years. In most instances (75-80% of responses), both education providers and visitors were satisfied with our performance in relation to timeliness, clarity of communications and our ability to work collaboratively.

[Graph 1: Type of respondents]

However, we have identified some key areas for development, most of which will provide valuable insight to inform our new QA model, particularly in relation to:

  • promoting greater consistency in the application of standards
  • using technology to make it easier to provide clear and relevant submissions of evidence
  • improving the collaborative nature of assessments undertaken with providers

Read more about our survey and findings >

Reflecting on outcomes from our processes

Our annual review of outcomes continues to show our standards and processes encourage programme growth, with around a 6% increase in overall programme numbers over the last academic year and a 20% increase over the last four years. The external drivers for programme growth are many and differ across the UK, with the move to degree-level education for paramedics generating the most programme growth in this period.

The report also highlights that complexity in provision means some elements of our processes have taken longer to complete. For example, we took 26 days on average to produce approval visitors' reports. We have also seen a decrease in major change notifications, following similar reductions in the past two years. Covid-19 has likely had some impact here, with our focus being on supporting providers with temporary or one-off changes to enable student progression.

We have also seen an improvement in our performance outcomes in a number of areas:

  • The introduction of our new provider/profession pathway within the approval process resulted in a 15% increase in providers achieving approval with us. By identifying and addressing issues earlier in the process, we avoided the need to apply conditions on programmes. This also contributed to doubling the proportion of providers achieving approval within three months of our visit, up to 40% during this period.
  • The number of cancelled visits reduced by 8%, and no cancellations occurred after we had produced the visitors' report. This means the time and effort stakeholders invested in visits and in meeting conditions was not wasted.
  • We maintained consistent outcomes in monitoring across both face-to-face and documentary assessment of provider submissions. This means we have successfully standardised the way visitors reach decisions through this process over the last two years, removing disparities seen previously.
  • Assessing new apprenticeships via a documentary assessment continues to be proportionate and beneficial to providers. We took around 2.7 months on average to complete these assessments, meaning new apprenticeship programmes could be up and running sooner.

Read our annual review of our data and outcomes and our review of degree apprenticeships for more information.
