For a professional armed with expert knowledge about effective education practice, there will always be a strong temptation to focus on mobilising that knowledge through direct instruction of educators. This is not surprising, given that the experience of seeing individuals learn is a) a fundamental part of education, and b) exhilarating for the person in the teaching position – for many of us with an education role, it is the most rewarding part of the job. But there is compelling evidence that a critical part of supporting professional development and learning is the use of tools and resources which help reinforce and embed that learning on a day-to-day basis, when facilitators can’t be there.

Much of the research about knowledge mobilisation in education has, sensibly and understandably, focussed on teachers. This post draws on the results of a Best Evidence Synthesis (BES) of the evidence around effective leadership of learning, conducted by a team at the University of Auckland in New Zealand[1]. This research was part of a series of publicly funded reports collecting the best-quality evidence available on various elements of high-quality education practice (such as Mathematics and Teacher Professional Development). In the Leadership BES, the focus was on using good evidence of impact for learners to build a picture of how professional learning can be effectively supported by colleagues with leadership responsibilities (i.e. headteachers and principals, but also subject and year-group leaders). As part of the broad sweep of the work, considerable attention is given to the importance of carefully designed, tested and refined materials to support colleagues’ learning, which the authors of the report refer to as “smart tools”[2].

So What Are Smart Tools?

Drawing on the work of other researchers, the report’s authors explore the concept of tools as “externalised representations of ideas that people use in their practice”[3] – in other words, tools are a way of getting complex ideas down on paper. The BES notes that this definition of tools, while valuable, is also so broad that it can obscure the value and purpose of using tools to support professional learning. To combat this, it focuses particularly on “smart tools”, i.e. those which have “direct or indirect evidence showing how they can assist in improving teaching and learning”[4].

The report narrows the focus even further to tools which help their users to achieve intended learning goals, and emphasises that they can be either developed by the leader of learning or imported ready-made from elsewhere. Finally, the authors point out that while there is a wide range of possible smart tools, the evidence shows that they all share certain qualities. Tools with evidence of success:

  1. Clearly explain the logic for the change they are trying to bring about – “why this will make a difference”
  2. Directly acknowledge the starting points of the professionals who are implementing them – “where the user is now on this topic”
  3. Include guidance about misconceptions which can arise from a misunderstanding of the content – “what this evidence is NOT telling us”
  4. Clearly connect abstract principles to relevant, solid detail, with specific examples of how it looks in practice – “what you as a teacher can/should do with this information”
  5. Are logically structured around a clear and unambiguous purpose – “how this point is connected with the main idea”
  6. Are constantly focused on the main idea, and keep complexity to a minimum
  7. Support understanding through the use of graphics and diagrams which explain their ideas clearly, and which are aligned with, and make explicit connections to, the body text and main theories[5] – “what this diagram tells you about the main idea”

What Do Tools Which Incorporate These Qualities into Their Design Look Like?

The Centre for the Use of Research and Evidence in Education (CUREE) uses these design principles in a wide variety of tools it produces for teachers and other educators. As the first of two examples, CUREE has been working on a project in North Wales in which middle and specialist subject leaders are supported to increase their knowledge and experience of research evidence-informed practice. This was achieved through a tailor-made package of smart tools, known as a Research Route Map (see image), which gave the various materials a core purpose around which learning activities were structured. The materials provided via this Route Map are explicitly designed to support the transfer of knowledge, and are integrated with the professional learning content so that participants encounter concepts repeatedly, with a view to effecting specific changes to practice in their schools. The interim report on the progress of this project can be found here.

Research Route Map 

CUREE has also created a set of materials to support teachers in adopting a version of the Response to Intervention (RtI) approach first introduced in the United States in 2004. The intervention itself is delivered via facilitator-led professional development sessions, but is built around a toolkit which a) supports teachers to identify which learners need the most support and how to provide it, and b) provides teachers with easily applied guidance on the various specific literacy strategies which come under the RtI “umbrella”. A copy of the evaluation report for a trial of CUREE’s version of this approach can be found here.

Response to Intervention 

These examples are provided not in the expectation that they be taken as gospel on how knowledge mobilisation can be supported or advanced with tools, but rather to show the variety of approaches which are possible when it comes to using precisely designed materials to support practitioner learning. While deploying pre-made tools is an entirely valid way to use the evidence from the Best Evidence Synthesis, the process of creating and testing new ones is a powerful learning experience in its own right, and the discussion of how to do this most effectively can only benefit from the input of a diverse group of experienced professionals – like you!


[1] Robinson, V.; Hohepa, M.; and Lloyd, C. (2009), School Leadership and Student Outcomes: Identifying What Works and Why (Best Evidence Synthesis), University of Auckland.

[2] Ibid, p 132

[3] Spillane, J. P.; Reiser, B. J.; and Reimer, T. (2002). Policy implementation and cognition: Reframing and refocusing implementation research. Review of Educational Research, 72, pp 387-431.

[4] Robinson et al, School Leadership BES, p 133

[5] Robinson et al, School Leadership BES, pp 139-140



Bart Crisp is Senior Research Manager and Manager for Research In Practice at the Centre for Use of Research and Evidence in Education (CUREE) in England. He works on many of CUREE’s professional development and learning programmes, including their work on Peer Review for school leaders, Enquiry-based Teaching, and use of research evidence in practice. He has also worked on a number of CUREE’s research projects, including the Developing Great Teaching systematic review, Developing Great Subject Teaching rapid evidence review, and CUREE’s research on teachers’ professional identity for Education International. He has experience in Teaching English as a Foreign Language, working as an English language assistant at Zhen Ze Middle School in the People’s Republic of China.