We increasingly hear of the importance of teaching 21st Century skills in the classroom to ensure that our students are prepared for the workplace of tomorrow. But what does this really mean, and are these so-called future skills actually beneficial for our youth?
The drive towards 21st Century skills has been inspired by two factors, the first being the influence of technology. Globally, people now have access to a wide range of devices and internet technologies that simply weren’t available before. These technologies have become part of our everyday lives, and more and more people are questioning why they are largely absent from education.
The second factor is the idea that knowledge can only take a person so far. Previous models of education have been based on the accumulation of knowledge and learning facts for exams, but for the world we live in today — and particularly the world of tomorrow — this simply isn’t enough anymore.
What do we mean when we talk about 21st Century skills?
There is a growing realisation that to be part of the global tech-driven future, people need certain skills to support knowledge, not only to compete against one another, but also against artificial intelligence and computerised systems. It is essential to develop skills that allow people to adapt to these changes and advance their careers in the future.
When people talk about 21st Century skills, they are often referring to the four Cs: Critical Thinking, Creativity, Communication and Collaboration. These abilities are different from knowledge: they are competencies that support learning across subjects, topics and disciplines.
It is also essential to include computational thinking in this discussion: the understanding of how computers and their programming work. Where once you had to know how to read, write and do mathematics, you now also need basic programming skills. Even though not everyone will become a programmer, just as not everyone becomes a writer or a mathematician, you still need these skills to thrive. The idea is that if you understand how computers and programming work, you can operate more effectively in a world where everything is connected to the internet and technology.
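To make this concrete, here is one hypothetical illustration of what such basic programming literacy looks like in practice. The few lines of Python below take an everyday task, splitting a restaurant bill, and break it into the precise, unambiguous steps a computer needs; the function name and scenario are invented for this sketch, not drawn from any particular curriculum.

```python
def split_bill(total, tip_percent, diners):
    """Computational thinking in miniature: decompose an everyday
    task (splitting a bill) into explicit, step-by-step instructions."""
    if diners < 1:
        raise ValueError("Need at least one diner")
    tip = total * tip_percent / 100        # step 1: work out the tip
    grand_total = total + tip              # step 2: add it to the bill
    return round(grand_total / diners, 2)  # step 3: share it evenly

print(split_bill(90.00, 10, 3))  # three diners, 10% tip → 33.0 each
```

The point is not the arithmetic itself, but the habit of mind: identifying inputs, edge cases (what if there are no diners?) and a sequence of checkable steps, which transfers to problems far beyond code.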
Are we losing sight of what it means to learn?
These skills are essential and should be promoted, but there are risks associated with accepting these factors and ideas as gospel. One of the primary issues that has arisen around this movement is the idea that people don’t need to know anything now: they can just Google it. This is compounded by popular quotes from prominent technology entrepreneurs, such as: “You don’t need a degree to get a job.”
This simply isn’t the case. As an employer, I don’t want to hire someone who has to Google everything and, realistically, not attending university is no guarantee that you’ll become a successful technology giant — just ask the thousands of college dropouts who don’t have jobs.
While the practice of teaching surface knowledge and facts for exams is no longer enough to thrive in this world (was it ever?), we cannot diminish the importance of establishing functional, foundational knowledge as a prerequisite to being a creative and critical thinker. A basic understanding of concepts that can be applied to a range of situations and problems is needed. Once you have a hold on this knowledge, you can use Google to learn more and empower yourself, but Google itself cannot be the sole driver of learning.
The influence of technology and this drive for 21st Century skills has also introduced a school of thought that children cannot learn unless they are sufficiently entertained. However, if you look at studies into how the brain works, you can see that while yes, there is a need to grab the attention of learners, the act of learning isn’t necessarily always fun and does require effort. Being able to employ mental effort, discipline and rigour is essential to ensure learning takes place.
How can we develop both functional foundational knowledge and future skills?
One of the leading challenges faced by teachers in developing this foundational knowledge and these transferable skills, in a way that each feeds off the other, is that they are neither subject nor curriculum specific. So how can they be taught?
The key is to pull together areas of knowledge and 21st Century skills outside of a typical classroom environment, and one of the best ways of approaching this is to devise group problem-solving activities that are multidisciplinary. For example, you could look at a business-based project that cuts across a variety of subjects such as accounting, programming, hospitality studies and geography. Having students work together on a single project like this fosters not only collaboration and communication, but also critical thinking and creativity, because they have to find ways of applying book-based knowledge to solve a practical problem. This scenario allows students to apply a range of skills and to develop their talents alongside their peers.
Addressing difficult questions
One of the things I feel is often left out of discussions around 21st Century skills is the idea of ethics and how technology impacts on the world around us. No technology is value-neutral. It comes with the assumptions and biases of the programmer, along with the environmental and socio-political impacts of its creation. This involves addressing some very difficult questions — questions few people want to think about.
These include: the possibility that manufacturing certain technologies contributes to global warming; the chance that a factory in India or China uses child labour; or the likelihood that new technology will put some people out of their jobs.
Michael Fullan’s work on deep learning and pedagogy makes the point that, alongside the four Cs, there is the need to develop character and citizenship. These qualities allow you to take a step back from a situation and evaluate it from an ethical perspective, and not just on a local level, but on global and community levels as well.
A technology-driven society
The use of technology is fundamentally changing the way we work and live, and will increasingly do so for education as well. Understanding its mechanisms is important. We need schools and teachers who understand the meaning of what it is to prepare learners for the 21st Century, not only for themselves to thrive as responsible citizens, but just as importantly, for us to thrive as a caring, technology-driven society.
There will always be a confluence of factors that contribute to the evolution of required skills, but a broad base of understanding will never be obsolete. With these skills and abilities, and sensitivity to ethical concerns, the future workforce will be prepared for any challenges the world presents.
Dr Lieb Liebenberg is an academic and chief executive of ITSI.