Safety netting is something that is drummed into us at a very early stage in our medical careers. It’s a recommended part of a standard consultation, and one we were all taught to do as GP trainees, although as a recent BMJ review article discusses (BMJ 2022;378:e069094), there is little underlying evidence to underpin how effective it is, or indeed what form safety netting should take. But be that as it may, safety netting is generally considered an important part of good primary care consultations, one that can help pick out the (relatively rare) person developing more serious illness from those with benign or self-limiting disease, and may reduce unnecessary re-consultations.
But over time it feels as though safety netting has become synonymous with the situation where a patient either has serious early disease that is indistinguishable from benign disease, or has benign disease that may turn into something more serious. The well-trodden examples might be the child who has signs of only a benign viral infection but is in fact in the very early stages of developing meningococcal septicaemia, or the patient with back pain and sciatica who then goes on to develop cauda equina syndrome. But these scenarios are all based on the premise that we have accurately and correctly assessed the information in front of us - that no other clinician (or AI for that matter) could have spotted that THAT child was en route to meningococcal septicaemia, or predicted that THAT person with back pain and sciatica would go on to develop cauda equina syndrome.
But what if we are just outright wrong? The evidence of a more serious condition is there, but we’ve either asked the wrong question or missed that crucial bit of information, leading us to an incorrect diagnosis or down the wrong pathway. Indeed, if you look back at Roger Neighbour’s original introduction to the concept of safety netting back in 1987, it proposes we ask three questions, the second of which is ‘How will I know if I’m wrong?’. I’m sure we all like to think we aren’t ‘wrong’ too often, but we may be deluding ourselves somewhat if we think that is the case. The fact that we are human means it is inevitable that we are prone to error, and how cognitive bias fits into this was discussed recently in an excellent article in the BJGP (BJGP 2022;72(722):433).
Unlike safety netting, the concept of cognitive bias was not something that crossed my radar during GP training or in the early stages of my career, and I wish it had. Even a simple understanding that diagnostic errors are often made not through lack of knowledge but through cognitive bias would have been really helpful. As discussed in the BJGP article, we have two complementary ways of thinking: a slow, conscious processing of information in a step-by-step manner, which is limited by the amount of data it can handle, and a fast, less conscious mode of thinking, which can process masses of data and draw on other information such as environmental cues to give a rapid answer. But this latter type of thinking, which we have to use all the time, especially in time-pressured GP consultations, is prone to cognitive bias - the data and knowledge may all be in place, but the cognitive shortcuts our brains use to interpret that information may be flawed and give the wrong answer.
And, as we all know, this is where it gets very difficult for us in the medical profession. Whereas most good entrepreneurs and business people talk openly about the importance of making mistakes, learning from them and improving performance, we work in a profession where mistakes are poorly tolerated. Despite the talk of a ‘no blame culture’, we know that when a complaint comes in after we’ve made a mistake, that ‘support’ seems to evaporate, and it can be a very lonely place. But we are human, so we will make mistakes - how can we square that particular circle?
Well, the BJGP article gives three very candid pieces of advice. First, we need to be honest that human nature means we will make mistakes - ‘…being open about our propensity to error with ourselves, our patients, and our colleagues makes it easier to learn from experience’. Second, it comes back to safety netting. There will be times when the diagnosis is staring us in the face, if not jumping up and down waving its arms, but our cognitive blind spot means we still miss it. The BMJ paper described a number of facets of good safety netting, one of which is giving patients ‘permission’ to come back if things don’t improve - crucial to picking up any incorrect provisional diagnoses. And finally, we need to embrace the principles of a growth mindset: ‘We will never be free from bias or error. Our security lies not in perfection, but in recognising and learning from our imperfections’.