How can we tell if digital is good for education?

Last month I was at The Schools Network’s 2011 Annual Conference, where I was fortunate to enjoy the keynotes of three particularly interesting educational thinkers – Alan November, Lord Puttnam and Tony Wagner. Their ideas, and my discussions with colleagues, have helped drive and shape my thinking, and have encouraged me to set down some thoughts on the current possibilities for education.

The main topic at the conference was the place of technology in education, and its potentially transformative role. Some delegates were grappling with this question in practice in their schools, whilst others were still uncertain as to whether to embrace technology in the classroom. I’d like to take this opportunity to propose a conceptual toolkit for evaluating digital technology in education, before applying it in the field.

The acid test for any educational procedure or process is whether it improves and promotes learning. So before we look at digital technology I’d like to start by looking at the learning process, the culture of learning, and the purposes of learning. If we arrive at a shared understanding of these, we’ll be in a stronger position to evaluate the potential – or otherwise – of digital technologies for education.

Our understanding of learning has broadened from simply seeing it as a deferential act – the transmission of knowledge by an expert and its absorption by those under instruction. We now give more focus to the interactive and peer-led elements of the learning process. Group discussion in the classroom, for example, has increased in prominence at the expense of ‘chalk and talk’.

Unfortunately, however, this more passive provision of learning persists – a situation not helped by our examination system. So the misunderstanding, or misapplication, of the learning process is a problem that pre-dates digital technology. Whether we embrace digital technologies or not, this problem should be addressed – and perhaps there is something in digital technology that could be used to do so.

Learning has always been an active process. Revising for exams exemplifies this: it is focused on active engagement rather than vainly hoping for the passive absorption of knowledge. The strongest elements of my Cambridge education – writing weekly essays and then developing arguments in one-on-one supervisions, and fantastic informal interdisciplinary discussions with peers at mealtimes – were all intensely active learning experiences. Lectures, on the other hand, were less helpful.

The conference’s keynote speakers were of a similar opinion as to the nature of the learning process. Alan November argued that the core components of the learning process are purpose, autonomy and mastery, and that they are not currently being adequately facilitated. He posed a question spanning both the culture and process of learning, asking “Who owns the learning?” He advocated that learning should be owned and driven by the individual learner, rather than by teachers. Similarly, Tony Wagner argued that the current system socialises young people into being passive consumers (by focusing on instilling specific competencies and units of knowledge for exam success) rather than motivated active learners.

But perhaps this exploration of the processes and culture of learning is missing the point. Any attempt to assess an education system must surely ask about its purpose. So what’s the point of education? Whilst I would argue that the pursuit of knowledge and intellectual and personal growth is in and of itself a positive for society and the individual, it remains the case that the investment of economic resources is most easily justifiable if it produces economic returns. So what does the economy need out of education?

Tony Wagner’s excellent keynote approached the digital question from this perspective, asking “What does it mean to be an educated adult today?” Tony argued that we should focus less on teaching units of content and more on developing the skills by which knowledge is mastered. Knowledge is constantly changing, so the skills rather than the content must be the focus. Having surveyed business leaders about what they demand of their workforces, he concluded that “Employers don’t care about what you know. They care about what you can do with what you know.”

Tony mapped out a set of ‘survival skills’ for navigating the 21st century economy:
1) Critical thinking and problem-solving
2) Collaboration across networks and leading by influence
3) Agility and adaptability
4) Initiative and entrepreneurialism
5) Effective oral and written communication
6) Accessing and analysing information (For example, if you memorised the periodic table of elements, or the list of planets, only a few years ago, your knowledge is already out of date.)
7) Curiosity and imagination

I wonder what an education system predicated on instilling these skills, rather than one based around discrete subjects and knowledge bases, would look like. Presumably it would still have space for students to follow personal interests and intellectual abstractions to at least the same extent as the current system allows. If that is the case, then perhaps even the pursuit of knowledge and intellectual advancement for its own sake has nothing to fear from such a reconfiguration.

I would agree with Tony Wagner, and see the purpose of learning as developing conceptual flexibility and an ability to interrogate and master skills and information. And if we place this conceptual and analytical ability at the fore, rather than prioritising any specific unit of knowledge, then the distinction between more practical and less practical subjects dissolves. The place of arts subjects is secured – indeed, perhaps the more abstract and purely conceptual the better. It’s no surprise that humanities degrees are highly valued for the skills they inculcate, even if my knowledge of the twelfth-century renaissance will probably never see daily use.

How should we conceive of – and define – conceptual and analytical ability? Is it a meta-skill, or can it too be subdivided? As a history graduate I have utilised and developed a wide variety of both practical and theoretical toolkits – spanning linguistic analysis, visual rhetoric, political processes, cultural analysis and work with bodies of statistical evidence – yet perhaps my analytical toolkit could still be sharper and wider. Perhaps a richer understanding of mathematics, IT systems and philosophy would be of benefit.

So I agree with Lord Puttnam of Queensgate that computer science is the essential knowledge of the twenty-first century, in the sense that we should be actively engaged with the transformative impact of digital technologies, rather than mere consumers; but I think the issue runs deeper than that.

Having looked at the processes of learning, the culture of learning, and the purpose of learning, we are in a position to draw together the toolkit that will allow us to properly think about digital technology and education.

So in evaluating the potential of any piece of digital technology for education, we must ask:

1) Does it foster active – and perhaps peer-led – learning?
2) Does it help us meet the purpose of education?

Testing this toolkit in the wilds of digital technology flags up some serious questions when we look at the ‘traditional’ e-learning model.

It’s easy enough to harness computers to deliver some level of teaching and assessment electronically. The traditional e-learning model is based around the user rote-learning specific units of information, or simple competencies, and being tested on that knowledge by the software. Whilst useful and powerful up to a point – it allows students to master units of competency – this model risks being a step back to the days of passive learners and deferential, non-interactive education. In almost all cases, learners should be able to question, disrupt, link ideas, and challenge – I would add a ‘challenge’ or ‘panic’ button to any static e-learning system (sketched below). So we need to make sure that digital technology is harnessed to improve the learning experience.
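
To make that suggestion concrete, here is a minimal sketch of what such a button might add to a static quiz loop. It is purely illustrative – the questions, names and behaviour below are my own assumptions, not any real e-learning platform’s design – but it shows how little it would take to let a learner talk back to the software rather than merely answer it.

```python
# A toy static e-learning quiz loop, extended with the 'challenge' button
# suggested above. Hypothetical sketch only, not based on any real platform.

QUESTIONS = [
    {"prompt": "In which year was the Battle of Hastings?", "answer": "1066"},
    {"prompt": "What is the chemical symbol for iron?", "answer": "Fe"},
]


def run_quiz(questions):
    score = 0
    challenges = []  # learner objections, collected for a teacher to review
    for q in questions:
        response = input(f"{q['prompt']} (or type 'challenge'): ").strip()
        if response.lower() == "challenge":
            # The 'panic button': record what the learner wants to question,
            # turning a one-way test into the start of a discussion.
            reason = input("What would you like to challenge here? ")
            challenges.append({"prompt": q["prompt"], "reason": reason})
            continue
        if response == q["answer"]:
            score += 1
    print(f"Score: {score}/{len(questions)}")
    print(f"Challenges raised for discussion: {len(challenges)}")
    return score, challenges


if __name__ == "__main__":
    run_quiz(QUESTIONS)
```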

Thankfully the social media revolution presents us with the opportunity to truly harness the power of computers and the internet to improve learning as a social process. Social media is predicated upon communities, interaction and content creation and sharing. Social media and learning are therefore natural partners. Let’s quickly explore some examples of this.

Online publishing is easy, and collaborative tools (such as Google Docs and wikis) mean that group learning is readily achievable. By collating, interrogating and sharing information as groups, learners can grapple with the provisional nature of knowledge and improve the skills needed to navigate a world of knowledge in flux.
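
As a small illustration of that provisionality, here is a toy sketch of a wiki-style page with an attributed revision history. Everything in it is hypothetical – this is not any real wiki’s data model – but it captures the property that matters for learning: every version of the group’s shared understanding stays visible and open to revision, rather than any one version being treated as final.

```python
# A toy in-memory 'wiki page' with attributed revision history.
# Hypothetical sketch only, not any real wiki's data model.

from dataclasses import dataclass, field


@dataclass
class WikiPage:
    title: str
    revisions: list = field(default_factory=list)  # list of (author, text)

    def edit(self, author: str, text: str) -> None:
        """Record a new revision rather than overwriting the old one."""
        self.revisions.append((author, text))

    @property
    def current(self) -> str:
        return self.revisions[-1][1] if self.revisions else ""

    def history(self):
        for i, (author, text) in enumerate(self.revisions, start=1):
            yield f"rev {i} by {author}: {text}"


page = WikiPage("Causes of the First World War")
page.edit("Asha", "The assassination of Franz Ferdinand caused the war.")
page.edit("Ben", "The assassination was the trigger; alliance systems and "
                 "imperial rivalry were underlying causes.")
for line in page.history():
    print(line)
```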

Forums are a great vehicle for discussion-based learning. I’ve also recently been recommended the Edmodo platform by @wjputt, which is essentially a Facebook for communities of learners and teachers. I’m currently setting up an online philosophy study group to see how continuing collaborative higher education might work in this space.

The online space can be used to create new communities of learning – outside and beyond the classroom. These can extend beyond the curriculum and have a life beyond a single year group’s experience in a single academic year. As Alan November asked: “Are your students leaving a legacy that will benefit other students?”

So what are the key messages from this discussion?

When evaluating the potential of a digital technology, ask yourself:

1) Does this new technology foster active – and perhaps peer-led – learning?
2) Does this new technology help us meet the purpose of education?

Static e-learning can be useful, but education usually requires much more than this.

To obtain the best from digital technology:

1) Think social. Online collaborative learning, through wikis and forums, works hand-in-hand with content creation and publishing, both of which are now very easy.

2) Think transformative. It seems that digital technology offers some solutions that map nicely onto existing problems. It also offers the possibility of an educational transformation, and must not be seen simply as an extra ingredient in the pedagogical mix. As Lord Puttnam of Queensgate asserted: “if all we do with technology is support existing practices, why would we expect anything better than existing results?” So we should not just digitise old practices and methodologies; rather, we must fully harness the transformed possibilities of learning through the digital medium.

So whilst it’s important to rigorously analyse the usefulness of digital technology for learning, it is even more important to remember that we’re only just starting to learn about how digital technology could help improve learning.


