The possibilities that technology offers us to improve learning are tremendously exciting. In the past few years, the question has shifted from whether technology should have a place in the classroom to understanding how technology can be integrated into lessons to achieve specific learning goals.

But as technology advances at lightning pace, it can be difficult for schools to decide which new technologies to commit time and resources to. Here, we look at the evidence to ‘bust’ six common myths around digital technology.

Myth 1. ‘New technologies are being developed all the time. Past research is irrelevant to what we have now, or will be available tomorrow.’

The best research tells us that it’s not technology itself that leads to learning gains, but how it’s used in the classroom. This means that past research can help us to develop a clear rationale for why we expect a new technology to be more effective than the last.

Myth 2. ‘Pupils today are digital natives – they learn differently from older people.’

The first problem with this myth is that there’s no evidence the human brain has evolved in the last 50 years. It may be that young people have learned to focus their attention differently, but our learning capacity is the same as it was before digital technologies became so ingrained in our daily lives. Secondly, just because young people have grown up with technology, it doesn’t mean they are experts in using it for their own learning.

Myth 3. ‘The internet has changed how we access knowledge; today’s children don’t need to know stuff, they just need to know where to find it.’

There’s no denying that the internet has changed how we access information. But information only becomes knowledge when it is used for a purpose. Googling is great for finding answers to a pub quiz question, but would you trust your doctor if she were relying solely on Wikipedia? To become an expert in a field, you also need experience of applying that information, so that you understand where to focus your attention and where new information will help you to make decisions and judgements.

Myth 4. ‘Students are motivated by technology so they must learn better when they use it.’

Most young people do enjoy using technology in schools to support their learning. But assuming that any increased motivation and engagement will automatically lead to better learning is dangerous. It’s possible that increased engagement or motivation may help increase the time pupils spend on learning activities, or their commitment and determination to complete a task. However, it’s only when this engagement can be harnessed for learning that there will be any academic benefit.

Myth 5. ‘We must use technology because it is there!’

We should use the wide range of digital technologies available to support learning and teaching in schools, but only where they improve aspects of teaching and learning and help to prepare children and young people for their lives after school. The curriculum, and the ways in which pupils work and are assessed, should reflect the society and culture that pupils will join when they leave formal education.

Myth 6. ‘If a little technology is a good thing, then a lot will be much better.’

The evidence tells us that heavy use of technology is not linked with better learning. This suggests that there is an optimal level of technology use that can support learning: too little and you don’t see the benefit; too much and the gains decline.

The EEF’s ‘Big Picture’ theme on digital technology brings together all of our work in this area, including projects, evidence reviews and Toolkit strands.

COMING SOON: NEW GUIDANCE ON DIGITAL TECHNOLOGY. Available to download from the EEF website in spring 2019.
