Tag Archives: microbiology

“The Sick Microbiologist”


It has been a tough couple of weeks, as I have been suffering from a nasty viral infection. For several days I have been teetering along the borderline between being able to work and staying in bed. Rightly or wrongly, I have been struggling into the lab, and for that I thank (or blame) my parents for instilling a strong work ethic in me.

Nevertheless it has very much been a case of energy conservation over the past couple of weeks. To facilitate this I have been doing the following:

  • Taking long coffee and lunch breaks. Getting out of the micro lab early.
  • Keeping the number and content of emails to an absolute minimum.
  • Ruthlessly declining all options and requests for ‘voluntary’ work/projects.
  • Keeping a low profile (for infection control reasons!).

And funnily enough, despite my compromised health, I have been keeping up, and quite easily as well.

It is amazing that we manage to ‘squeeze’ into an 8 hour working day what we should really be doing in 3 or 4, maybe even less…

Maybe I need to be sick more often…



“Do the Maths”


10 years ago, the average medical microbiology lab scientist or technician might expect to process 50 samples in a day.

10 years from now, with the help of automation and interpretative software, the same staff member will likely be processing upwards of 300 samples per day. For those who do molecular or infectious serology as opposed to bacteriology, it could well be many more.

Do the maths…

If you are thinking about going into microbiology training, you need to be very aware of this. Ask your peers about the numbers being trained on the course, and the likely number of job posts at the end of training. Look for transparency and honesty from the colleges and your career advisors. If this is your passion in life, by all means consider it, but be aware of how the discipline is going to change over the next decade.

If you are already in training or holding a post as a microbiology scientist/technician, then it is so, so important to supplement your skills: Show leadership qualities, do some research, get yourself IT savvy, learn how to troubleshoot automated systems. It all matters.

And if you work in a laboratory which is already highly automated, there is no room for complacency. The automation will inevitably increase further, and the interpretative software will become more sophisticated…

This new technology creates employment as well, but it is different employment. Designers, engineers, IT specialists and salespeople are all required for automated platforms.

And clinical microbiologists, like myself, are not immune from this evolution. (see this related post)

All we can do is be acutely aware of the changes that are taking place around us, and prepare ourselves as best we can for the future. We should be a little concerned by the above, but embrace the challenge nevertheless.

For a good article on this topic, click here. Things are changing so quickly that parts of this 2013 article are already out of date….



“Cutting off the fat whilst keeping the flesh”


Reducing inappropriate or unnecessary testing is generally a good idea. Not only does it free up funds that can then be spent on other, more useful tests; it also improves the positive predictive value of the test in question by increasing the prevalence of disease in the tested population.
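The link between prevalence and positive predictive value falls straight out of Bayes' theorem, and a small worked sketch makes the point. The sensitivity, specificity and prevalence figures below are purely illustrative, not drawn from any real assay:

```python
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value: the chance a positive result is a true positive."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Hypothetical test: 95% sensitivity, 98% specificity.
# Testing everyone (disease prevalence ~1%) versus testing only patients
# who meet clinical criteria (prevalence in the tested group ~10%).
print(round(ppv(0.95, 0.98, 0.01), 2))  # unselected population -> 0.32
print(round(ppv(0.95, 0.98, 0.10), 2))  # selected population   -> 0.84
```

With the same test, simply restricting testing to a higher-prevalence group lifts the PPV from roughly a third to well over 80%, i.e. far fewer positives are false alarms.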

But how exactly do you reduce inappropriate testing? Well, you can look at guidelines from other centres or at research papers published in journals. But in my opinion the best evidence to support reducing inappropriate or unnecessary testing is your own localised data. This is particularly the case if you want to reduce unnecessary testing by introducing specific testing criteria based on certain patient or laboratory parameters.

For example if you want to restrict Hepatitis A testing to those patients with a significant ALT increase, then you need to look at the range of ALT values for all your patients with a genuine positive Hepatitis A result.

If you want to restrict Trichomonas testing to all those patients under a certain age, then you need to examine the age related prevalence rates for Trichomonas in your particular population.

The problem with cutting off the fat, however, is that you always risk cutting off a little of the flesh, i.e. you may miss the occasional positive where the patient falls outside the pre-determined testing criteria for that particular infection.

The key is in deciding whether the criteria or the cut-off level for testing that you have set are acceptable, and to do this you need to take into account, as a minimum, the severity of disease, the consequences of a missed diagnosis, and the opportunity to make the diagnosis at a later date. This is why you would never dream of restricting testing for syphilis in community age groups over a certain age just because the prevalence in this cohort is so low. If you miss a diagnosis of syphilis, the consequences could be a lot worse than if you missed a Trichomonas infection…..

But possibly the most important factor when trying to adopt selective testing criteria is to consult with and gain approval from requestors, and in particular specialists in that area of testing. Thus it is a good idea to have good working relationships with the Infectious Diseases department, and for that matter all your other users as well.

The requestors generally understand the situation. They will often be budget holders in their particular area/institution and understand exactly what you are trying to achieve. If you are reasonable, rational and communicative, only rarely will they stand in the way of what you are trying to achieve.

And the flesh and fat analogy works well in my opinion. If you try to cut off too much fat, then you will start removing the flesh as well, and you will only end up hurting the patient….