This post explores how the real-time decisions of educators, playworkers and other staff who oversee children fit into the overall risk management process, and how they are held to account for those decisions. I have written it at the suggestion of the UK Play Safety Forum. The PSF would welcome comments on the position set out here – as would I.
I will start with describing a real-life scenario from a Forest School session run by Bayonne Nursery a few years ago. (Those who have heard me talk on risk will recognise it from a video clip that I often show.) A group of four-year-old children are exploring an area of woodland. After clearing away fallen branches from around a large tree trunk that crosses over a dry ditch, three girls start to shimmy across. Two succeed, while the third becomes alarmed and gives up. Forest school-trained educators, present throughout, do not intervene at any point – not even to give encouragement or warnings. This is despite the fact that at points, things look like they might be getting challenging, uncomfortable or even slightly dangerous.
In staffed situations such as schools, early years settings, out of school/free time facilities, outdoor learning programmes and playwork settings, the real-time judgements of front-line staff (paid and voluntary) about whether, when and how to intervene are fundamental to shaping children’s experiences. Interventions and decisions are informed by staff’s values and understandings about the goals and objectives of their setting and practice – and, crucially, by their thinking about risk. They are highly sensitive to circumstances, and may happen in a matter of seconds.
The overall process, labelled dynamic risk benefit assessment (RBA) in Managing Risk in Play Provision [pdf link], is complex, fluid, largely intuitive, and difficult to document. In staffed settings, dynamic RBA may well be the most significant part of risk management. Yet as Managing Risk in Play Provision says, it may be undervalued by risk assessment perspectives that focus on the need for written evidence that procedures have been followed.
So how should staff and organisations show they are being reasonable in their approach to dynamic RBA? Some organisations have developed analytical tools such as flowcharts and decision trees in an attempt to set out how decision-making processes might work. Learning through Landscapes has one on its website, and a group of playworkers from Wrexham and Conwy Councils and Glyndwr University has also developed something along similar lines.
Such tools may be helpful in opening up professional debate about relevant factors and options in different circumstances. But how useful are they in capturing sound decision-making in dynamic RBA situations, which may happen so quickly and be dealt with so intuitively that there is no time for reflection, let alone record-keeping? There is a real risk that such tools could be seen not simply as a prompt for discussion, but as a measure of compliance: a requirement that staff are expected to follow – and expected to show that they have followed through documentation or other records.
It is hard to see how this demand for an ‘audit trail’ can be met without adding to the burden of staff, and without distorting the very decision-making that such processes are supposed to be supporting.
Rather than trying to document decision-making through claiming that it is supported by a particular process, a more practical and promising approach may be to emphasise the role of professional competence. This could be shown through relevant experience, skills, qualifications, supervision procedures, professional development and evidence of sound judgements in the past. Good practice in dynamic RBA is also likely to be supported through giving staff opportunities to reflect on their experiences and practice, for instance through ensuring they have space and time to discuss minor adverse experiences and ‘near misses’.
Can sound decision-making in dynamic RBA be ‘audited’ or proven through any kind of documentation? Are flowcharts or decision trees valuable tools, or traps? Should we reject the demand for case-by-case evidence that procedures were followed, and instead focus on the importance of relevant experience, knowledge and skills, supported by time for reflection?
My own view is that when it comes to questions about the soundness of dynamic RBA judgements, the right place to focus is the competences of the individual or staff team, rather than compliance with any procedure. Whether or not you agree, I would welcome your thoughts and comments, and will feed these into upcoming debates at Play Safety Forum meetings.
Tim, Excellent piece as ever. My feeling very much supports yours. I would add one or two things. First, it is vital that the organisation has a policy that recognises the importance of risk and the implications inherent in such a policy. For me it should clearly state that written risk assessments are neither expected nor appropriate in these circumstances. Then it must be down to the competence of staff, as you say. Possibly the only other expectation would be reference to incidents, particularly near misses, in a daily log of sessions. I am ignorant about this, but, perhaps naively, I imagine that most organisations where this would be relevant would be keeping such logs anyway? I particularly would not want to add to their burdens!
One last comment is that I couldn’t read the diagrams on your blog as they are too small on my computer. This may be because I couldn’t see how to make the presentation bigger or get rid of the column on the left!
Hi Robin, and thanks for the constructive feedback. Your two comments sound sensible to me. Thanks too for the technical comment – I’ve changed the post so that clicking on the diagrams takes you to an enlarged, more readable version.
I shall be interested to read further comments on this. In my experience of ‘training’ TAs et al., the flow chart is a useful discussion tool, but more useful is modelling this in action during a session outside (typically for me when introducing loose parts play) and discussing incidents as they arise. I agree with you Tim that it is the competence and confidence of these staff that is key… and part of that confidence stems from the support (or lack of it) that they get from the senior leadership team.
Excellent Tim. Totally agree that dynamic RBA should be competence-based and should not be measured by compliance with or recording actions as prescribed by flow-charts and the like. Humans just don’t think and make decisions like that – certainly not in real time. Daniel Kahneman’s ‘Thinking, Fast and Slow’ is brilliant on this, and I think should be read by everyone interested in risk-benefit assessment. Very crudely summarised, he proposes that we have two types of decision-making processes: fast, intuitive thinking, and slow, rational thinking. Also that we are biased towards being loss-averse: we are more likely to act to avert a loss than to achieve a gain.
His research has all been with adults – I wonder if it might be different with children – that they are more likely to try to achieve a gain in and through their play than risk a loss? The more I think about this, the more I think it might be true – as Dylan sang “When you have nothing, you have nothing to lose.”
Robin – I’d be a bit concerned about the idea of proposing logging near misses, if only because my near miss might be your miss by a mile and vice versa – define “near”! And surely adventurous, exploratory, what-if play is about having near misses? But it is a very good question to put, and it gives rise to all sorts of interesting RBA avenues to pursue.
Thanks for the comments and shares so far. Mick – David Ball has shared with the PSF a brief note on decision-making which goes along the lines you suggest. This certainly influenced my thinking. Felicity – you are right that a shared understanding and approach up and down the ‘chain of command’ is essential (which is why we have put so much effort into getting support for RBA from the likes of HSE).
A useful and thought-provoking piece Tim, and one I shall continue to ponder and be challenged by.
I think it worth sharing some of our (Learning through Landscapes/Grounds for Learning’s) thoughts and other processes, as this may help demonstrate why we felt the need for that diagram.
One of our discussions with staff when I was implementing our new risk management system was about demonstrating staff competence and our organisation’s culture regarding risk. How can we do this simply, without it becoming a burdensome paper trail? How can we do it in a way that supports staff, identifies issues that need addressing and rewards an open and honest culture? And how do I gather enough evidence of all this, should we need to demonstrate to an insurer, the HSE or a court that we have struck a balance and taken reasonable measures?
The answer was to be able to demonstrate our training, observations from colleagues or managers, records of good judgements as well as near misses and accidents, and our process. The diagram is only one part of a much bigger system, and should be read alongside the policies and risk assessments.
As we do not have daily meetings or a staff diary, we use a simple system of emails and monthly staff meeting records. I save the emails sent to me on H&S and summarise them annually – with good practice, such as postponing a session because of high winds, given as much weight as near misses or accidents. Every LTL/GfL meeting starts with an opportunity to discuss or report any risk issues. This includes decisions staff have made and want to relay to colleagues for discussion and thought.
That diagram highlights the thought process – not each individual decision made. Not every decision is recorded; it is up to staff to share them in the most appropriate manner. Any issues that are brought up are treated positively – even when a poor judgement has been made or an incident has happened. The diagram has helped demonstrate our culture to staff, within the confines of our risk assessments.
An example would be asking a child to not run in a crowded corridor, while on the same day expecting that child to run around outdoors at break. These simple, natural judgements happen all the time – yet so many risk assessments and systems seek to lock them down or formalise every last detail, reducing competence and confidence to make those judgements. So far that diagram has helped illustrate the decision making that goes on, to our customers, on a number of courses.
Our approach to risk in play, youth work and education is changing, and we need to demonstrate clearly to others how it is different. It is now out of step with much of industry, and with the training that many H&S professionals have. Both the content and tone of our risk assessments are different – and we do not adhere to other industries’ ‘ERICPD’ approach. That diagram highlights that we go straight in with the behaviour (Discipline) and Reduce before we Exclude and Isolate. See your example of the wobbly bridge.
The diagram is not purely about the safety of our customers – it is also to do with liability. We can clearly demonstrate our staff competence and our culture around risk management. We can demonstrate that staff are allowed to make simple judgements. We have simple records of all this. I am really keen to find out how else we can do it. An example from my old outdoor centre employer: the daily staff meeting notes and risk management sheets were recorded in the staff diary. Simple, fast and open to anyone to view.
The bigger question for me is why we still have so many education H&S systems and cultures that are not fit for purpose. They are too long-winded, too focused on micro-hazards or low-likelihood risks. They use numerical scores to reduce complex, judgement- and context-laden decisions to a single number. They fail because they set out to cover every last detail – and then do not cover every last detail. They have no feedback loop, other than when things go badly wrong. They do not engage staff, require thought or encourage openness.
It requires culture change, and I think as LTL/GfL we set out to be one of the leaders of that change. It has been great to reflect on what seems like a simple enough decision and diagram, and try to unpick what message and culture that it sets. We are really keen to learn from others, and see how simple these systems can become. I think we are about right so far – but please do keep the conversation going.
I would like to highlight that we are inviting conversation this autumn, at a discussion day on ‘Striking a balance when managing risks’. Do come and join us – http://www.ltl.org.uk/resources/results.php?id=886
Thanks Matt for such a thoughtful and detailed response. I found it very helpful in getting a better picture of what you are trying to do with the flowchart and how it fits into the bigger picture. I agree that the imposition of inappropriate systems from factory/workplace H & S into education and play is the bigger problem, and of course RBA is squarely aimed at tackling this. I do not underestimate the challenges facing those like LtL/GfL in promoting RBA, and I admire the progress you have made.
What the wobbly bridge aims to show is that the whole premise of orthodox risk management – namely the imperative to reduce risk – is wrongheaded in play contexts – and many learning ones too. (Slightly tangentially, one question I have about your flowchart is whether it too implies that risks should always be reduced or controlled. The possibility that an offer/activity/space is too unchallenging is not raised.)
In terms of how any flowchart such as yours is used, I would still want to hold to the distinction in the post, between supporting and building good decision-making on the one hand (where I can see potential, but also difficulties due to the intuitive, complex nature of dynamic RBA) and being a tool for compliance and accountability on the other (where I can only see problems).
I’m sure some of these issues will be explored at your 28 Sept event in Stirling (which sadly I cannot make due to a diary clash). I also invite others to chip in – it is always valuable to hear how change is unfolding at the chalk-face (and in the playgrounds and woods).
Excellent Tim – not sure you got my last comment. Just one thing to add – I think leader competence is essential, and yes, the whole organisation needs to take this on board – up and down! Two things I would like to add are leader confidence and experience – these need to be explicit. A leader can demonstrate competence in ‘knowing’ the value of RBAs and the process, but making those split-second decisions also takes confidence and experience.
Hi Jon – thanks for taking the time to comment. I agree that leadership in settings is important. I haven’t seen any other comment from you here by the way – just checked WordPress’s spam folder too, and nothing there. An internet hiccup, perhaps?
Hi Tim… yeh, looks like an internet glitch, or it could have been my operator error! It included an example of my own with 16 ten-year-old boys in the woods two Saturdays ago, where my own judgement on what was and wasn’t safe was very much based on experience and my own confidence (I may have made a different decision had I been younger and more gung-ho!). Thanks for the continuing support and advocacy for this approach to risk, play and the RBA process.
Jon – you make a good point about competence, confidence and experience. As we developed our system I used what I knew from outdoor centres, where multiple staff are leading dynamically hazardous and challenging activities on a daily basis.
Our evidence base for their skills came through some qualifications, through simple log books of sessions led (not every single one – a ‘slice’ or summary) and a few observations through the year. These varied from peer feedback (including from visiting staff), to a ‘coffee cup walk-by’ (me, wandering around site with a cuppa and watching what was happening), to a full observation by a senior colleague, myself or a technical expert. We recorded ‘positive’ incidents as well – the day we chose not to canoe, or altered activities because of the weather forecast, and so on.
The simple observations were a couple of lines in a diary, signed.
This system was simple and varied in its evidence base. It was applied pretty routinely (daily), but this reflects the increased level of hazard and risk overall for some activities. The Adventure Activities Licensing Authority inspector commended us on such a simple, positive and evidence-based system.
In a play or education setting, they could be applied on a monthly or annual basis.
Sadly, my employer had a teenage guest fatality and a subsequent inquiry. This highlighted some shortcomings – one was that there was not enough opportunity for staff to question, feed back or record any issues or fears they had; another was keeping an up-to-date paper trail of competence and decisions made. The system outlined above was partly informed by this experience.
I’m going to echo the sentiments of others here, to some extent. Risk assessment, as you say, is very much an aspect of the individual.
Having a written statement, flow chart or mind map to outline the expected procedures is a great idea. As in any organisation, this provides a more or less objective standard against which to measure the actions of individual staff members.
Looking at the population generally, I’m sure you would find a roughly bell-curved distribution of approaches to supervising children at play. This might range from falling asleep with a book over the eyes to vainly chasing and actively policing at close range.
The hiring interview should roughly approximate the “slice” of the bell curve the organization is aiming for, and this can be fine tuned by reference to the documentation on best practices. As long as this reference text can be used with a guided hand and not treated as an absolute in and of itself, this seems eminently reasonable.
Wow, as a Nature educator from Canada, I just learned so much! Thanks Tim for the article and others for comments.