Imagine you’re standing in a seminar room of 20 students. Or perhaps it’s a lecture theatre of 300 students. Looking around, you can see students watching you. Others stare at their screens, at their phones or off into the distance. How many would you say were ‘engaged’? And how would you tell? A recent summary of research on student ‘engagement’ suggests it’s an elusive concept to define and measure. It looks different for different students. So what does ‘engagement’ mean? And what does ‘student engagement’ look like? Well, for a start, student engagement isn’t about bums on seats.
Engagement as code for (lecture) attendance
We’ve all had the experience of thinking to ourselves, or hearing from another law teacher, ‘They’re just not engaged.’ It often comes from a real place of concern for law students. Although it can also sound a little like ‘Where is my audience?’ But the rough measure of engagement we tend to use is a blunt one – how many are turning up to class?
This makes law school a little different from other kinds of schooling. Unless you have a compulsory attendance requirement, law students aren’t forced to come to class. Recorded lectures and well-organised study groups mean that students can get the same content in different ways. Educators can make some content available 24 hours a day, 7 days a week on YouTube or other platforms (and usually in a more accessible way).
The passive lecture has been on and off life support for a while. It’s dead one minute and recovering the next. Some disciplines limit the number of lectures delivered in favour of active learning. Universities flip classrooms back and forth, depending on the data.
Despite the near-death experience of the lecture, ‘bums on seats’ still gets used as a triage measure for … something. But is it really engagement? If you get an 80% attendance rate at lectures, can you really tell the dean that those students are 100% ‘engaged’? And what does that even mean?
What is student ‘engagement’?
There are almost as many definitions of ‘engagement’ as there are studies on its role and effect. And there is no clear concept of how student engagement should be measured. In 2010, Dr Vicki Trowler reviewed more than 1000 analyses of student engagement to identify definitions and themes. Even that wasn’t a complete collection. To limit the scope of the review – which Trowler acknowledged was enormous – she confined it to research that the authors themselves identified as dealing with ‘student engagement’. She also excluded a mountain of ‘grey literature’ on blogs and other sources.
Through her review, Trowler found that ‘engagement’ was not a single, binary dimension (‘engaged’ or ‘not engaged’) but a mix of behavioural, emotional and cognitive responses. Research into those responses might, in turn, focus on individual students’ learning, on whether students identified with their institution, or on students’ involvement in managing their institution.
In short, there isn’t an agreed focus for research or action on student engagement. But, what Trowler did find was:
The majority of literature on student engagement is concerned directly or indirectly with improving student learning.
Student ‘engagement’ – and law student engagement – isn’t about bums on seats. And we should stop using it that way. It’s actually shorthand for asking, ‘Are students learning?’
When do students say they’re learning?
This might sound like an odd question. But learning is personal. It isn’t something that happens on command. This isn’t a new idea. More than three hundred years ago, John Locke was arguing something similar.
In a recent issue of the Journal of Further and Higher Education, Lisa Payne at Coventry University attempted to synthesise some of the work on student engagement as a way of beginning to talk about it meaningfully. Payne also asked her own students about what affected their own learning. Her research also helps answer the question, ‘Why is student engagement important?’
Drawing on the same research as Trowler, Payne argued that there are identifiable negative and positive influences on students. A lot of them are unsurprising. Things like timetabling and hassles with administration work against student learning. On the other hand, having some autonomy in learning, applying it practically and seeing its intrinsic value all support student learning.
In a lot of ways, this isn’t new. Locke wrote about it. So did John Dewey. More than 10 years ago, researchers at Brigham Young University noted:
The idea that students must be actively engaged in the learning process in order for it to be effective is not new. The roots for active learning reach back in the literature to John Dewey… A diverse body of educational research has shown that academic achievement is positively influenced by the amount of active participation in the …
‘Engagement’ isn’t static
But what’s really interesting about Payne’s article is her argument that student ‘engagement’ isn’t something that applies to a group as a whole – and she suggests a simple way of representing it, as a set of levels.
But the value of Payne’s developing model isn’t just in explaining individual students’ engagement in a course. It might also apply to a student’s engagement within a single learning episode.
Go back to the example we started with – a room full of students, each adopting a different listening ‘posture’. But, if we add the element of time, do they stay that way throughout the lecture or seminar? If they do, you could apply a consistent ‘score’ to their ‘engagement’ and a rough measure of the quality of learning. If it’s used individually, you might even be able to assess the effectiveness of your teaching.
‘I’ve got 3 at 1, 10 at 2 and 5 at 3. Is that enough to continue?’
I’m not suggesting for a minute that this is anything other than a rough measure. But, standing in the middle of a lecture, that’s all you can effectively do. And the immediacy of the assessment might encourage you to change the delivery, change the activity, or stop and try something completely different.
Law student engagement isn’t just about bums on seats
‘Student engagement’ isn’t just about whether students turn up. It’s about whether they are learning. Assessing that can be hard. But if something like Payne’s levels can begin to provide a quick ‘check-in’, and a way of explaining whether students are ‘engaged’, it’s something worthy of further research.