Some things Professor Rob Briner thinks he got wrong in trying to promote evidence-based practice
This article is part of an ongoing series of articles on evidence-based knowledge management.
Professor Rob Briner is Co-founder & Scientific Director of the Center for Evidence-Based Management (CEBMa). As he is considered a leading global authority on evidence-based practice, his perspectives and publications are important components of RealKM Magazine's guidance and resources for evidence-based knowledge management (KM).
In a LinkedIn post, Professor Briner shares a list of some things he thinks he got wrong in trying, over the past 25 years, to promote the idea of evidence-based practice (EBP) in industrial-organizational (IO) psychology, human resources (HR), and management.
These things are:
- Focusing way too much on ‘the science’ (systematic reviews, pushing research) – important, but it’s just one source of evidence, and awareness doesn’t seem to change behaviour.
- Not acknowledging that universities and academics are, in general, part of the problem of low adoption, not part of the solution.
- Framing it as part of the (tedious and misplaced) practitioner/academic divide/gap debate.
- Implying that scientists are somehow better or purer practitioners compared to managers when they are not (e.g., questionable research practices).
- Making EBP sound like a technocratic solution which can only be undertaken by experts, nerds, wonks, brainiacs, elites, etc.
- Being too myth-bustery – tends to alienate rather than engage – crucial to challenge dodgy stuff but needs to be much more sophisticated.
- Not sufficiently appreciating how the work context pulls practitioners away from EBP. In other words, not appreciating managers’ constraints and incentives.
- Implying practitioners are making mistakes or are silly or odd or not thinking straight.
- Sounding (and being) smug and preachy – “you really should do this, it’s good for you.”
- Not being sufficiently clear about when an EBP approach makes more or less sense (e.g., based on problem importance).
- Not engaging effectively with reasonable objections to EBP (e.g., that it’s time-consuming).
- Insufficient focus on the ‘diagnosis’ part of EBP and too much focus on using EBP to find a solution (the “what works?” approach only works with clearly understood/diagnosed problems).
- Positioning EBP as something individuals or teams can do when it requires structural and systemic thinking and action.
- Not emphasizing that the quality of decision-making is about the process, not the outcome (a bit like saying an experiment was good because it got the result the researchers wanted).
- Failing to clarify that EBP is not a one-off thing but needs to be part of a longer-term process of individual, group, and organizational learning and development.
- Not starting from where practitioners are now and what they can realistically do now.
- Positioning EBP as about making really very well-informed decisions rather than better-informed decisions.
- Focusing on the teeny-tiny incy-wincy proportion of practitioners who want and are able to try EBP – sure it’s not for everyone but it’s also not for almost no-one.
Header image source: Adapted from Evidence Based by Nick Youngson on Alpha Stock Images, which is licensed under CC BY-SA 3.0.