
From the Side-lines to the Spotlight: Confronting Bias in Veterinary Medicine

This is article thirty-four in an RCVS Knowledge series of features on patient safety, clinical human factors, and the principles and associated themes of Quality Improvement (QI).

Bias. While we commonly associate the word with newspaper headlines and political campaigns, it also plays an important role in healthcare. But how? Bias describes a cognitive process. It helps us to get stuff done. In the context of care quality it also has drawbacks. Let’s look at a scenario to explain this a bit further.

Jan popped her head through Sarah’s consulting room door.
‘Sarah, just to let you know Mrs. Taylor phoned. She says it’s nothing personal, but she’s taking Tilly to River Vets for a second opinion.’ 
Sarah’s heart sank, ‘Oh, really … did she say why?’ 
'Well, she feels that there might be more to it than the high temperature.’ 
Sarah thought back to the consultation … a busy evening surgery.

Tilly, ah yes, she’d been lethargic for a few days and was pyrexic. I didn’t find much else on the physical exam … Mrs. Taylor, ah yes, she said she was a pensioner, I wanted to avoid a blood sample because they’re not cheap. I was quite sure a course of non-steroidal medication would fix it, normally these things are just a virus. Oh, that’s right and the next client had 5 puppies for first vaccine, I didn’t want to be running behind.

‘Ok, thanks for letting me know.’
Jan closed the door and Sarah tried to concentrate on the export certificate in front of her. She read a line, but the words weren’t making any sense … her mind kept wandering back to Tilly. Anxiety was crowding in.

On behalf of Sarah, let’s pause to ask two important questions: how do we humans make decisions, and why are they sometimes wrong?

Decision making

Decision making is complex, but Dual Process Theory (DPT) can help us out here – DPT says that people essentially make decisions using one of two systems: ‘System 1’ or ‘System 2’ 1. System 1 thinking is rapid and feels intuitive to us – there is little conscious thought involved before the answer ‘appears’ in our mind 2. System 2 is slower and more effortful. We employ System 2 when trying to solve a crossword or a Sudoku puzzle, for example. It uses logic and rationale, and we search for clues in the available evidence 2. Crucially, we are more likely to commit to this more difficult approach if we have time and don’t feel flustered (this might be why I like doing Sudoku on Saturday afternoons and not Monday mornings!).

If we are in a hurry, System 1 decision making will jump in, but the disadvantage of this process is that it relies on potentially unreliable assumptions. Our brain ‘matches’ the limited information in front of us – a poorly cat, for example – with similar experiences from our past 3. These ‘unconscious biases’ are not always correct. In fact, it is thought that between 50% and 96% of all medical mistakes are due to errors of cognition 4. To use an analogy, we can think of System 1 (bias) as the brash football manager shouting at his players from the side-line, and System 2 as the quiet assistant who thinks before he speaks – analysing the game from every angle to understand the challenges his team is facing. It’s worth remembering that System 1 is not ‘bad’, but neither is it a rigorous way of analysing complex issues 2.

Bias

In the medical professions, diagnostic biases have been categorised to help us understand the problem more fully. The setting might be an intensive care unit, a house visit, an imaging suite or an operating theatre:

Anchoring bias – Focusing on initial information or symptoms, resulting in a diagnosis before a full assessment has been made
Availability bias – Recent encounters with a disease lead to the same diagnosis in a patient with a similar presentation, despite counter-factual information
Confirmation bias – Seeking and only accepting information that confirms a prior diagnosis, and ignoring symptoms and findings that do not support it
Gambler’s fallacy – Believing that the same diagnosis is less likely if recent diagnoses have been the same, akin to believing that a coin flipped several times is unlikely to land heads every time
Sutton’s slip – Making the most likely diagnosis while discounting possible, but less likely, causes
Hindsight bias – A judgement made in hindsight, often about a person or people, with an incomplete understanding of the circumstances and the experiences of those on the ground at the time. Hindsight bias is why we feel tempted to blame people – our System 1 ‘football manager’ jumps in before System 2, our calm ‘football assistant’, has had time to look at the problem.
Table 1. Some of the cognitive biases that can influence clinicians in the process of making a diagnosis 2.

In our story, anchoring bias led Sarah to make a quick diagnosis for Tilly the cat.

Of course, and importantly, bias is not limited to clinical decision making. In human medicine, bias – ‘the evaluation of something or someone that can be positive or negative’ 2 – can lead to patients being given inferior care based on ethnicity, religion, sexual identity, socio-economic class, education, disability, neurodiversity, and geographic origin 5. For example, NHS data show that the risk of maternal and perinatal mortality is five times higher in black women than in white women 2. Meanwhile, in the veterinary industry, research suggests that sexist attitudes still exist in the small and large animal sectors 6, and in another survey 29% of vets reported experiencing some form of discrimination at work 10. In the context of quality improvement, it’s worth remembering that equitable care – care that does not change because of personal characteristics – is one of the Institute of Medicine’s (IOM) six pillars of care quality. Evidence shows that everyone has cognitive biases – they are part of being human 7 and the result of cultural influences – but this doesn’t mean we should accept them. We have an individual and shared responsibility to address bias, and though these statistics are not indictments of our healthcare systems per se, they are effective reminders of its potential effects.

Elsewhere, bias can influence the interpretation of research studies themselves: healthcare professionals and researchers can place more significance on studies written by academics in wealthier countries than on those from poorer ones 2.

Our cognitive biases can also influence the performance of inter-disciplinary teams. An example might be an older vet starting an operation without consulting a younger registered nurse, because of a belief (a bias) that they alone have all the knowledge and expertise. However, human factors experts – practitioners in effective teamwork – emphasise the importance of open communication and equitable relationships as part of a shallow hierarchy 8.

So what can we do?

How do we tackle bias at work in all its forms?

  1. Diagnostic biases can be addressed by questioning our decisions and assumptions.
  2. A useful phrase to bear in mind is that ‘you’re wrong more often than you think’ 2.
  3. Artificial Intelligence has been proposed as a potential solution, and commentators suggest far greater use of checklists, protocols and decision trees – giving vets time to order their thoughts before reaching for the syringe drawer 4.
  4. When a mistake has been made, it is also important to contextualise the cognitive error and examine the factors that contributed towards it 9.
  5. We should consider addressing the prejudices and discrimination that can linger in our unconscious – some authors describe health equity training 5.

Being mindful of our thinking is difficult but has significant benefits in terms of care quality – engaging our System 2 decision making processes helps boost our performance at work. Put another way, it can be worth remembering that the football manager of our mind will always benefit from consulting their trusty assistant.

Checklist - What you can do next

  • Addressing unconscious bias is an important step in creating fairer and more inclusive working environments in the veterinary profession. Access the Unconscious Bias CPD course by RCVS Academy, the College's free digital learning platform, to explore how to raise self-awareness, and adopt strategies to reduce bias and promote equity, diversity and inclusion in the workplace. You can access the course for free using your 'My Account' login details.
  • Read our QI Feature ‘Maximising welfare benefits by contextualising case management’ focusing on an evidence-based decision-making approach to cases within the context of the patient and owner circumstances, combined with clinical expertise. 
  • Download the associated 'Conversation guide for discussing contextualised care' to help mitigate unconscious bias by asking questions that may otherwise be assumed and identify any areas of support that may be required when diagnosing and managing complex cases.
  • Access free CPD with our QI Boxset to discover how you can use Quality Improvement techniques to structure systems and processes, and learn from everything that happens in practice, supporting team members to deliver high standards of care. Our free QI CPD is available in bite-sized episodes to help you establish a quality improvement structure in practice.
  • Access free CPD with the EBVM for Practitioners course to discover how incorporating an evidence-based veterinary medicine (EBVM) approach into your daily practice has the potential to increase confidence in decision-making, and contribute to an atmosphere of cohesion and support among veterinary teams. 
  • Find out what it takes to be a Good Veterinary Workplace with the British Veterinary Association's (BVA) view on the core values all good workplaces should have, allowing veterinary professionals to continue to safeguard animal welfare and team wellbeing through the prism of workplace culture, diversity, equality and fair treatment. 
  • Have a look at Civility Saves Lives, a website run by health care professionals aiming to raise awareness of the power of civility in medicine, to understand how incivility in the workplace can have a big impact on team performance and patient outcomes, and what we can do to combat incivility and improve our ability to do our jobs.

References

  1. Tay, S.W., Ryan, P. and Ryan, C.A. (2016) Systems 1 and 2 thinking processes and cognitive reflection testing in medical students. Canadian Medical Education Journal, 7 (2), e97-e103. https://doi.org/10.36834/cmej.36777 
  2. Gopal, D.P. et al. (2021) Implicit bias in healthcare: clinical practice, research and decision making. Future Healthcare Journal, 8 (1), pp. 40-48. https://doi.org/10.7861/fhj.2020-0233
  3. Hammond, M.E.H. et al. (2021) Bias in medicine: Lessons learned and mitigation strategies. JACC: Basic to Translational Science, 6 (1), pp. 78-85. https://doi.org/10.1016/j.jacbts.2020.07.012
  4. McKenzie, B. A. (2014) Veterinary clinical decision-making: cognitive biases, external constraints, and strategies for improvement. Journal of the American Veterinary Medical Association, 244 (3), pp. 271-276. https://doi.org/10.2460/javma.244.3.271
  5. Health equity: Definition, examples, and action (2020) [Medical News Today] [online]. Available from: https://www.medicalnewstoday.com/articles/health-equity [Accessed 25 March 2024]
  6. Knights, D. and Clarke, C. (2019) Gendered practices in veterinary organisations. Veterinary Record, 185 (13), pp. 407-407. https://doi.org/10.1136/vr.104994
  7. Emberton, M. (2021) Unconscious bias is a human condition. Permanente Journal, 25 (2). https://doi.org/10.7812/TPP/20.199
  8. Sukhera, J. et al. (2022) Exploring implicit influences on interprofessional collaboration: a scoping review. Journal of Interprofessional Care, 36 (5), pp. 716-724. https://doi.org/10.1080/13561820.2021.1979946
  9. Turner, M. (2021) Building a safety culture in practice - a whole team approach [RCVS Knowledge] [online]. Available from: https://knowledge.rcvs.org.uk/news-and-events/features/building-a-safety-culture-in-practice-a-whole-team-approach/ [Accessed 25 March 2024]
  10. Summers, O.S. et al. (2023) A cross-sectional study examining perceptions of discriminatory behaviors experienced and witnessed by veterinary students undertaking clinical extra-mural studies. Frontiers in Veterinary Science, 10. https://doi.org/10.3389/fvets.2023.940836  

About the author

Mark Turner BVSc MRes MRCVS

Mark graduated from the University of Liverpool in 1996 and in 2017 completed a Master’s degree at the RVC investigating patient safety culture in the UK veterinary professions.

He has interests in QI, patient safety, organisational culture and systems engineering including lean. He has written for Vet Times, Companion magazine, In Practice and the Veterinary Business Journal on subjects including QI, burnout and staff engagement.

He is a practising veterinary surgeon and a member of the BSAVA’s Education Committee.


April 2024