
Published Study Finds Education Effective... or Does It?

Critical reading is... well, critical

by Dennis Ernst • January 29, 2021

Technical, Phlebotomy News


As you know, I live in the literature. If a study or article on blood collection comes out, I devour it. Every now and then, though, the conclusions researchers draw leave me scratching my head.

Take, for example, a study recently published in Clinical Laboratory in which researchers in Romania concluded that education is an effective way to implement a standardized protocol for blood sample collection. Not exactly something that should stop the presses, but I'm always interested when my own livelihood is validated. When I read the results in detail, though, I was hard-pressed to draw the same conclusion as the authors.

Over a period of four months, they tracked how frequently samples drawn by the medical staff were hemolyzed, underfilled, or clotted in anticoagulant tubes. Then they implemented a peer-education program on proper technique and measured the frequencies again. Now, you would think that after educating the staff on how blood samples should be drawn, the frequency of all three indicators of poor quality would come crashing down. How else could they conclude education to be effective?

But instead of decreasing after the educational initiative, the rate of hemolyzed samples more than tripled in all units: hemolysis in the ICU and emergency department went from 4.5% to nearly 15%, and on the medical floors it went from 3% to 9%. The frequency of underfilled samples more than doubled in all units except the ICU, where a small decrease was observed. So how did the authors conclude education to be "effective"? I'm really scratching my head here.

To be fair, they did observe a mean reduction in clotted samples of 38%, but by my thinking you can't cherry-pick your numbers like that when reaching a conclusion. Unless I'm missing something, the conclusion should be that the educational intervention they implemented was effective on only one of three quality indicators. For the other two, it was an abject failure. Look, I know research is hard and exacting. But authors and those who publish their work need to remember that people rely on their conclusions being valid so they can improve processes in their own laboratories. To draw and publish a conclusion that suppresses data is to lose credibility.

To set the record straight on the value of education, consider a similar study conducted by researchers in Oman. Not only did they conclude that initiating an educational intervention dramatically increased sample quality, but their data backed it up: the rate of unacceptable samples decreased from nearly 3% to less than 1%.

I guess my point is that you have to read everything critically and not let other people draw your conclusions for you. With that said, the same goes for this post. That's why I provided the links for you. If you think I'm the one who drew the wrong conclusion, I'd like to hear from you.
