AI school inspections face resistance


Plans to use algorithms to identify failing schools have been criticised by the National Association of Head Teachers (NAHT).

A data science company, part-owned by the UK government, has been training algorithms to rate schools using machine learning – a form of AI.

It proposes to work with England's education watchdog, Ofsted, to help prioritise inspections.

The NAHT said effective inspection of schools could not be based on data alone.

“We need to move away from a data-driven approach to school inspection,” the union said in a statement.

“It is important that the whole process is transparent and that schools can understand and learn from any evaluation.

“Leaders and teachers must have absolute confidence that the inspection system will treat teachers and leaders fairly.”

Social purpose company the Behavioural Insights Team, part-owned by the innovation charity Nesta, has set out in a report how such an artificial intelligence system works.

Lead author Michael Sanders told the BBC: “If it were put in the field, it would be used to prioritise which schools should be inspected, and we are hoping to work with Ofsted over the next 12 months to improve the algorithm and adapt it to that purpose.”

The data used to train the algorithm includes past Ofsted inspections, other school performance data and census information, all of which is publicly available.

It also analyses responses about individual schools submitted by parents via Ofsted Parent View.
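The report does not publish its model or feature set, but the general approach it describes – training a classifier on public school data to rank schools by inspection priority – can be sketched in miniature. Everything below is hypothetical: the features (last inspection grade, an exam-performance figure, a parent-satisfaction score) and the synthetic data are stand-ins for the kinds of public inputs the article mentions, not the team's actual inputs.

```python
# Illustrative sketch only: a toy logistic-regression ranker trained on
# synthetic data, mimicking the kind of public inputs described in the
# article (last Ofsted grade, exam data, Parent View responses).
import math
import random

random.seed(0)

def make_school():
    """Generate one synthetic school with loosely correlated features."""
    last_grade = random.randint(1, 4)  # 1 = outstanding ... 4 = inadequate
    exam_score = random.gauss(50 - 5 * last_grade, 8)      # out of 100
    parent_sat = random.gauss(0.9 - 0.1 * last_grade, 0.1)  # 0..1
    # Hypothetical label: 1 = found failing at the next inspection.
    risk = 0.5 * last_grade - 0.05 * exam_score - 1.0 * parent_sat
    label = 1 if risk + random.gauss(0, 0.3) > 0 else 0
    # Leading 1.0 is the bias term; exam score rescaled to 0..1.
    return [1.0, float(last_grade), exam_score / 100, parent_sat], label

train = [make_school() for _ in range(500)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit logistic regression by plain batch gradient descent.
w = [0.0] * 4
for _ in range(2000):
    grad = [0.0] * 4
    for x, y in train:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
        for j in range(4):
            grad[j] += (p - y) * x[j]
    for j in range(4):
        w[j] -= 0.1 * grad[j] / len(train)

def priority(x):
    """Predicted probability that a school needs an early inspection."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)))

# Rank schools so inspectors would visit the highest-risk ones first.
ranked = sorted(train, key=lambda s: priority(s[0]), reverse=True)
```

Note the output is only a ranking to prioritise human inspections, matching the limited role described in the article, not a verdict on any school.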

The data produced by the algorithm will not be shared with schools, and Mr Sanders said it would not be useful to do so.

“If we took the results of the algorithms and offered five things that would make a school better, that would be naive,” he said.

“Ofsted inspectors, who carry out holistic inspections, are in a much better place to provide advice.”

Currently, the algorithms are designed solely as a tool to help Ofsted, but Mr Sanders acknowledged there could be future applications.

“Predicting GCSE grades is currently based on teachers’ judgements, but there is research suggesting that this is not all that accurate,” he said.

“Using data to build a better picture could be a better way to help young people in their education.”

But, he added: “Any other applications would require ethical and supervisory safeguards.”