The information technology sector in the UK is growing rapidly. Technology is driving social change and, increasingly, it underpins the smooth running of our work and personal lives.
We have become avid users: the average Briton now spends more time engaging with technology than sleeping. But we need to go beyond being mere digital consumers; we need a new wave of digital makers.
Coding is the language we use to create technological innovations, and it therefore lies at the heart of technological progress. Today, coding represents a fundamental skill set which, once acquired, can empower young people to take advantage of the numerous – and rewarding – career paths within this burgeoning industry.
High demand, low supply
There are currently more than one million professionals working in the IT industry, which generates up to £100 billion for the UK economy each year. And, as IT becomes an increasingly integral part of business, the industry is growing at a phenomenal pace. To keep up with the sector’s rapid growth, an estimated 300,000 additional skilled digital professionals will be required by 2020.
But there is a skills shortage.
Despite competitive salaries awaiting would-be employees at some of the world’s most innovative companies, demand is not being met by the supply of information technology professionals. Indeed, the digital sector stood out among all others as the one with the highest percentage of unfilled vacancies in 2013. And this is impeding growth: 68 per cent of employers believe the skills shortage has affected their firms’ ability to develop new products and services.
Education pays the best interest
There is a stark misalignment between the UK’s education system and the industry’s requirements, as evidenced by the number of students choosing to take an IT-related A-level or to study computer science at university. Just 375 students studied Computing A-level in 2013; only 7 per cent of those were girls, and even fewer came from low-income backgrounds. Despite the tech sector’s growth and society’s increased interest in technology over the past decade, the number of students opting to take ICT or computing A-levels nearly halved between 2003 and 2013.
But this has not gone unnoticed. In an effort to better equip UK students with computing skills, a mandatory change in the curriculum of all state primary and secondary schools was introduced in September of this year.
Primary and secondary schools are well placed to equip pre-GCSE students with the problem-solving techniques, known collectively as computational thinking (CT), that programmers use to write the code underlying countless computer applications. But schools cannot lead the way.
Typically, a new curriculum can take three years or more to develop for schools, whilst industry cycles tend to last half that time. Keeping pace with these cycles requires a level of agility, expertise and collaboration that makes it very difficult for schools to teach coding.
According to the 2013 Next Gen Skills report, around 40 per cent of ICT teaching roles in schools were unfilled, revealing that the education system is simply not attracting the volume or calibre of computing teachers required. Even with the syllabus overhaul, some 116,000 primary and secondary school teachers and leaders will need to familiarise themselves with what will be, for many of them, an entirely new discipline.
A new approach
The world is becoming increasingly coded, and computing skills form the bedrock of its success. We cannot continue to ignore the shortage of talent in this area. Nor can we rely on a few elite institutions to produce the workforce of tomorrow; demand will never be met that way.
At present, there is no specialist educational institution serving students with a passion for computing from the age of 16 upwards. We need such an institution, with the stated aim of developing a pedagogy for the practical learning of higher-level digital skills and providing a pathway for students from the age of 16 into highly skilled employment.
Technology rides roughshod over the traditional aspiration of progressing to university. A degree is no substitute for a body of previous work that you can evidence to would-be employers. Coupled with the rising cost of higher education, this makes it time for a viable, high-quality alternative.
By Mark Smith, director of the National College for Digital Skills