Acting on evidence

“We owe almost all our knowledge not to those who have agreed but to those who have differed.”

Charles Caleb Colton

Over the past few days I’ve seen a lot of conversation generated by the recently released paper from Dr Michael Johnson of the New Zealand Initiative titled No Evidence, No Evaluation, No Exit, subtitled “Lessons from the ‘Modern Learning Environments’ Experiment”.

Of course, headlines like this are always bound to attract attention – which is why they are so constructed. They’re intended to be provocative, challenging and position-taking. This, it may be argued, is what is required to ensure there is sound debate about the ideas and influences shaping our education system.

The focus of the NZI paper is the strategic direction of the Ministry of Education since 2011 to build, upgrade, and maintain school buildings as modern learning environments (MLEs) – also referred to as flexible learning environments (FLEs), innovative learning environments (ILEs) or 21st Century Learning Environments (21stC LEs).

Dr Johnson’s key claim is that the Ministry of Education didn’t research the effect these environments would have on students’ learning before forcing schools to adopt them – a claim he discussed in a recent radio interview with Kate Hawkesby.

I find this a challenging proposition – particularly as I have been quite intimately involved with a lot of what has been happening in the evolution of MLEs in New Zealand over the past twenty years, and have had the privilege of working alongside practitioners and researchers in a number of countries whose work has contributed to the development of this thinking and design.

It would be easy to be provoked into a defensive response based purely on reading the headline. But having taken the time to download and read the report, and to reflect on the now rather expansive library of research and reports I’ve had access to, my view is actually to welcome this report – not because I particularly agree with its analysis or conclusions, but because the provocation it provides has challenged me to think more deeply (and broadly) about what I’ve seen happening.

A key point here is that I do believe that, as a profession, there’s a danger in our uncritical acceptance of every ‘new wave’ of thinking. This is a problem identified in academic literature around the world – thus my use of the quote from John Dewey in the image at the top of this post – and repeated below:

It is… advisable that the teacher should understand, and even be able to criticise, the general principles upon which the whole education system is formed and administered. He [sic] is not like a private soldier in an army, expected merely to obey, or like a cog in a wheel, expected merely to respond to and transmit external energy; he [sic] must be an intelligent medium of action.

John Dewey, 1895[1]

That was my first admonishment – to not become ‘positional’ in my response to this report and assume that my understandings and experience are somehow of more value. We must never allow our educational debates to become binarised in the way we engage – regardless of the modelling we are exposed to in the political realm.

I have to confess that, based purely on my initial reading of the headline, I made an invalid assumption that Dr Johnson may not have considered a lot of the readings and research that I am aware of. On further investigation I found this isn’t the case – in fact, several of the key papers and reports I’d have referred to are included in his reference list, including the work of the Learning Environments Applied Research Network (LEaRN) based out of Melbourne University. I’ve had the privilege of visiting this group on a number of occasions and learning from the work they do. It’s a delightful mix of research into pedagogical practice alongside concerns around inclusion, equity, community engagement and, of course, learning space design.

I guess my surprise is that the conclusions Dr Johnson draws from these papers aren’t necessarily the same as those I’d draw. A lot of this may have to do with what those in the research world call ‘research bias’. The starting point of any research endeavour can often be a search for evidence that supports a particular, pre-determined view. I have to confess that I find myself guilty of that at times. So is that what’s happening here – is it my particular bias that’s being challenged? Or could it be that there’s bias in the approach Dr Johnson has taken?

In his introduction to the paper, Dr Johnson poses some really valid questions that the paper purports to address, including:

  • Do flexible classrooms improve teaching and learning?
  • Is self-directed learning really effective? If so, for which students, and for what kinds of knowledge?
  • Can integrating computer technology into classroom pedagogy enhance teaching? If so, in what circumstances and for what kinds of learning?
  • What is the impact of different classroom designs on students with cognitive, affective and behavioural challenges?

These are exactly the questions that the researchers at LEaRN are asking in the myriad of investigations they support – as are many other researchers around the world. Researchers from the USA – Walker, Brooks and Baepler – in their 2011 paper titled Pedagogy and Space: Empirical Research on New Learning Environments (also cited in Dr Johnson’s bibliography), reported on their research into the learning that occurred in what they refer to as Active Learning Classrooms (ALCs). They found that, “after controlling for potentially confounding factors such as instructor, instructional methods, assessments, and student demographics”, teaching in an ALC contributed significantly to student learning outcomes.

Now it’s completely unfair to base an argument on a single case, but it does appear to me that the headline of the NZI paper may be misleading given the fact that such quantitative analysis does exist – dating, in this case, as far back as 2011 when the changes in NZ were being introduced.

Unfortunately, the case is not helped by the fact that, as noted in the report, the Ministry of Education was only able to provide two papers as ‘evidence’ when they were approached about this. It would appear that the research that was done at the time by the MoE team must have found its way into an inaccessible archive somewhere 🙁

As Dr Johnson rightly addresses in his report, the questions about modern learning environments and their impact on student learning can’t be answered purely by looking at the architecture and specifications around heat, light, ventilation etc. He also questions the effectiveness of ‘self-directed’ pedagogies, concluding that there is “some evidence that self-directed learning can be motivational for young adults and mature students, but very little that it enhances learning, even for older students”. And Dr Johnson questions the integration of digital technologies, citing “little evidence that technology enhances the acquisition of key skills like literacy and numeracy, or disciplinary learning at the secondary school level” and concluding that “[digital technology] is no substitute for direct instruction by expert teachers”.

The OECD (curiously not cited in Dr Johnson’s bibliography) have been investigating these very same concerns for some years now. Their Innovative Learning Environments (ILE) project ran for more than a decade from the mid-2000s. In the foreword to their publication The OECD Handbook for Innovative Learning Environments the authors state: “Over the last decade, the OECD region has seen a 20 percent rise in spending per school student but yet little significant improvement in learning outcomes.” The OECD approach considers the complexity of the education ecosystem, and provides “a concise, non-technical summary of a substantial body of international reflection on learning and innovation, underpinned in each case by a full publication.” Those publications include, for example, Evidence in Education: Linking Research and Policy and The Nature of Learning: Using Research to Inspire Practice.

Now I’m not suggesting here that any of this literature negates what the NZ Initiative are saying – in fact, there are some points of agreement that can be found. What is important to me is that we:

(a) actively engage in these debates and inquiries and don’t abdicate the ground to others to sort out for us, and

(b) do so from a critical perspective, declaring any biases and acknowledging the complexity of the environment we’re trying to navigate, so that any solutions are differentiated and fit for purpose in the local context.

Herein lies an issue for MLEs that is identified by Dr Johnson. While the introduction of MLEs was a government-led initiative, the legislative environment provided local communities with the ability to determine how this might be implemented in their context. What ensued, however, in the absence of critical conversations and informed debate, was a mix of (perceived?) pressure from the central Ministry, advice from architects and other ‘experts’, and lobbying from groups at the local level, with arguments on both ‘sides’ driven by personal perspectives or feelings.

Circling back to the OECD research, then, the key issue for me is that we devote time in our conversations to taking account of the broader, connected issues at stake here, and deliberately seek to understand the full ecosystem we are operating as part of. One of the biggest challenges I have with the NZI perspective (and with many other viewpoints expressed) is that it is easy to focus on and critique any innovation such as MLEs, self-directed learning or digital technologies in education, but to do so with any credibility we have to be equally open to critiquing the basis of our current system and approaches, which inevitably are used as the de facto baseline for our studies.

For example, it’s easy enough to compare the academic achievement of learners who have been taught in a traditional classroom with one teacher against that of learners taught in an open/flexible/modern environment by using conventional test scores – but are we open to considering whether these scores are measuring the right things? The academic world has been alight with literature on this topic for some time, recognising that the traditional regime of testing may privilege certain groups of students while disadvantaging others, and that the substance of what is being assessed (i.e. content knowledge and processing) may not, in our modern world, be as important for our young learners as the competencies identified and promoted by groups such as the OECD – and embraced as part of the NZ Curriculum for the past 20 years.

Now I fully appreciate that this argument is nuanced too. Basic forms of literacy (reading, writing, numeracy skills etc.) will continue to be important (perhaps even more so) for functioning well and thriving in the modern world, and so must be given the focus and attention they deserve.

My point really is that, in order to more completely understand the complexity of the debate, in addition to the sorts of questions Dr Johnson poses (cited above) we need also to be asking the following…

  • Where is the evidence that the traditional structures of one room, one teacher, one subject, one hour etc. are the most effective for achieving the learning outcomes we desire?
  • Where is the evidence that many of the new initiatives being introduced as a ‘one size fits all’ solution in our schools and classrooms will benefit every learner?
  • Where is the evidence to support the notion that schools alone should shoulder the responsibility for addressing all of the things that are perceived as needing ‘fixing’ in our young people? (i.e. what is the purpose of school?)
  • Where is the evidence that our current approach to preparing and supporting teachers for these increasingly complex roles is working? Why do we persist with traditional processes for meeting supply and demand?
  • Where is the evidence that local schools and local communities can continue to be effective in making decisions for their schools and learners? And if they are, where’s the evidence for the kinds of support required to make them effective?
  • Where is the evidence to show why we have such large numbers of young people not attending school? Is it because of how our schools are structured and organised? Or the curriculum? Or something else?
  • Where is the evidence to help us understand why so many teachers are leaving the profession? And why so few are choosing to enter it? And what is the impact of this on our ability to build a coherent, locally owned and developed approach to teaching and learning that reflects local community needs and values?

None of these questions can be answered easily. Many of them are already being asked, but the impact is limited – a case of ‘letting a thousand flowers bloom’.

But they are all questions that those of us working as professionals in teaching, and in education generally, should be engaged in talking about.

To that end I am grateful to Dr Johnson and his colleagues at the NZ Initiative for providing such a provocation in the form of this report. The response of the profession should and must be to interrogate it, identify personal biases, examine the evidence at source and engage in co-constructing the ways forward that will genuinely meet the needs of our future generations.

[1] McLellan, J. A., & Dewey, J. (1895). What psychology can do for the teacher. In J. A. McLellan & J. Dewey, The psychology of number and its applications to methods of teaching arithmetic (pp. 1–22). D Appleton & Company.

DISCLAIMER: This post does not purport to be an intensively researched or evidence-based piece of writing. Rather, it represents the point of view of someone who has been involved in the evolution of thought and practice around the links between pedagogical practice and learning space design for some time now, and who endeavours to maintain a focus on these developments from both a theoretical and practice perspective.

