Reflections from an Implementation Short Course

Last Thursday and Friday I was fortunate to attend a two-day implementation course put on by the Department of General Practice at the University of Melbourne.

The course was aimed at researchers, professionals and clinicians with an interest in developing research projects in the primary care setting.

The first day consisted of a series of presentations examining the theory, particularly Normalization Process Theory (NPT), which provides a toolkit for thinking through design and implementation issues.

There were a number of key things that I took away from it.

  • I realised that, just as in the ‘real’ world of full-time employment, getting out of the office – through conferences or meetings – is critical for gaining fresh perspectives and space from your own research. I’m so thankful that the university sees this too, reserving money from my scholarship for activities such as these. It was good to get out of my hovel to talk to others.
  • I was surprised (and rather relieved) to find that despite our disparate topics and stages of research, amongst this very accomplished and wise group of clinicians and researchers, we were all struggling with the same fundamental issues – e.g. what exactly are the theoretical frameworks and logic models of our research projects? (Thankfully this somehow didn’t simply depress me…!)
  • I really do need to get the fundamentals right and keep going back to them (this comes as no surprise really, but perhaps it’s good to be reminded of it regularly, particularly as a new researcher).
  • While they both work in my department, I haven’t had much interaction with either Professor Jane Gunn or Professor Jon Emery, but I was super impressed not just by their knowledge, but also by their very clear enthusiasm for research (both their own and participants’). It was also refreshing to hear both academics speak so honestly about significant RCTs and research that did not deliver.
  • NPT seems to be very ‘in’, at least in the department, which has strong links with its creators. I’m still not convinced – it is a subjective tool to assist in planning and/or assessing research design, but without any evidence to suggest that it is useful – and many participants seemed rather confused about how to use it. Perhaps that’s because I’m more familiar with other frameworks.
  • Systems thinking is very relevant – it’s not enough to demonstrate the efficacy of an intervention in a tightly controlled experiment; it also needs to be world-ready and easy to integrate into current practice.