
e-Safeguarding in schools: Changing legal obligations

Head teachers, governors, and academics have all been scratching their heads in recent weeks, puzzled by the consultation document from the Department for Education entitled ‘Keeping children safe in education’, which aimed to provide statutory guidance for schools and colleges but led to widespread confusion.

The problem is that, as the law evolves to reflect changing circumstances and risks, it has become a legal minefield for schools and colleges. The truth is that the development of the law is often effected not through some direct, evolving ‘thread’ of case law or amended legislation, but by adding layer upon layer of additional law to a pre-existing matrix which is not, in itself, particularly accessible. Indeed, disparate areas of law that historically originated in different places and for different purposes begin to intersect with one another.

A perfect example of this was last year’s positive statutory obligations on anti-radicalisation, where law-makers added a new tier of e-Safety obligations for educators. With it, schools and colleges needed to think of their IT infrastructure as an avenue through which those obligations may be subverted by the dark side of the Internet.

The upshot of all this is that there is no single, seamless, integrated matrix of e-Safeguarding Law; and the current tsunami of technological, societal, and child behavioural change shows no sign of ebbing.

Mischief abound

One thing that remains consistent, though, is what the law repeatedly says to teachers, governors, managers and ICT designers alike:

'It has been repeatedly said in cases about children that their ingenuity in finding unexpected ways of doing mischief to themselves and others should never be underestimated.'

Let us paraphrase that statement for the 21st-century world, in which a school, and indeed the pupils themselves, increasingly rely on technology for day-to-day teaching and learning. It is no stretch of the imagination that a judge's broad description of the risk would be along the lines of children 'meddling with the school’s ICT at the risk of some significant exposure to extreme material, and further meddling with it to bully and harass each other'. I doubt there is an ICT professional across the land who would disagree with that statement!

As Dave Tonnison, director of communications and information systems strategy at Wymondham College Academy Trust put it recently: 'The implementation of ICT in any school or college produces tensions (such as the Need To Educate vs. e-Safety and School Legal Exposure) that combine to impact these institutions in ways that I believe are unique. For example, no other type of organisation, of which I am aware, is required to allow 80 per cent of its network users to be people with characteristics which include (some or all of) inquisitiveness, hunger for new experience, extreme competitiveness, boundless imagination, rebelliousness and self-obsession. A user group which is not only extraordinarily technology-savvy, but expert in self-justification and regularly hormone-driven.'

A morphing duty of care

The duty of care that lies upon teachers, governors, ICT risk managers, and Local Authorities cannot be delegated away or ignored. Operating in ignorance of the law, or in wilful blindness to it, affords no defence.

The toxicity of the darker corners of the web to developing minds, the interest of children in inappropriate risk taking and the susceptibility of school ICT to be used as a vector of hostility and bullying are now unarguable facts.

Taking all of the above into account, the school and all of the personnel who support it must look to the duty of care that arises when pupils access the school’s network. That duty should be institutionally acknowledged, calculated, and understood, since only then can appropriate measures (whether procedural, structural, or technological) be introduced to fulfil this inescapable duty.

Emerging laws

e-Safety and e-Safeguarding in schools form a broad, contiguous spectrum that covers everything from the exposure of minors to unsavoury pornography to the more extreme forms of cyberbullying. Unlike the Law of Negligence, Health and Safety Law has the capacity to impose criminal sanctions, tried in the Crown Court before a judge and jury.

Last summer, an additional intersection was added between the Prevent duty and e-Safeguarding. The document 'The Prevent duty - Departmental advice for schools and childcare providers' (June 2015) states that:

'In order for schools and childcare providers to fulfil the Prevent duty, it is essential that staff are able to identify children who may be vulnerable to radicalisation, and know what to do when they are identified.'

Further, under the heading 'IT Policies' the departmental advice says:

'The statutory guidance makes clear the need for schools to ensure that children are safe from terrorist and extremist material when accessing the internet in schools. Schools should ensure that suitable filtering is in place.'

A new breed of filtering technology

As e-Safeguarding Law continues to develop, educational establishments will need to match their risk assessments, procedures, and technological measures to the legal obligations upon them. By satisfying the law’s requirements in this way, they will provide an appropriately safe level of internet access for all their students.

Many schools use specialist Internet filtering tools to restrict access to harmful content. However, many rely on outdated solutions that cannot properly monitor today's cloud application use. They therefore need to ensure that their core systems can track and block all the modern-day vectors that could target vulnerable minors.

Because of this, schools need to go a step further and ensure they have a solution that not only knows which websites are being visited, but can also track, at a more granular level, what content is being accessed or posted. Ideally, a solution that can automatically monitor for inappropriate phrases related to issues such as terrorism or radicalisation within the comments posted underneath a seemingly innocuous YouTube video or shared link on Facebook. Only then can educators truly keep pupils and staff alike safe in education establishments and ensure compliance with the myriad of new and emerging laws.
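To make the idea of granular, phrase-level monitoring concrete, here is a minimal sketch in Python of how flagged-phrase detection over user-posted text might work. The function name, the watch-list contents, and the sample comments are all hypothetical illustrations, not the behaviour of any real filtering product; a production system would use curated, regularly updated phrase lists and far more sophisticated matching.

```python
import re

# Hypothetical watch-list of phrases for illustration only; real deployments
# would draw on curated lists maintained by safeguarding authorities.
WATCHLIST = ["join the fight", "martyrdom operation"]

def flag_inappropriate(text, watchlist=WATCHLIST):
    """Return the watch-list phrases found in a piece of posted content.

    Matching is case-insensitive and bounded at word edges, so a phrase
    never triggers from inside a longer, unrelated word.
    """
    hits = []
    lowered = text.lower()
    for phrase in watchlist:
        # \b word boundaries prevent partial-word matches
        if re.search(r"\b" + re.escape(phrase.lower()) + r"\b", lowered):
            hits.append(phrase)
    return hits

# Example: scanning comments posted under a video or shared link
comments = [
    "Great video, thanks for sharing!",
    "DM me if you want to join the fight overseas",
]
flagged = [(c, flag_inappropriate(c)) for c in comments if flag_inappropriate(c)]
```

In this sketch, only the second comment would be flagged, despite the page it sits on looking entirely innocuous; that is the gap between URL-level filtering and content-level monitoring that the paragraph above describes.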

Ed Macnair, CEO of CensorNet

This article is based on Dr Brian Bandey’s legal research into e-Safety Law in Education, sponsored by CensorNet