For Blood / Biologics

Sizing Up Staff Competency
Jill Drummond, MT (ASCP)





Training employees to perform tasks in a laboratory or technical environment is critical to the success of any regulated business.  Assessing employee skill levels is one way to verify that employees are actually absorbing what they are taught.

You spend a lot of time and money training new staff, but are you sure they really get it?  There are many challenges associated with bringing entry-level (high school) employees to a competent level in a technical or laboratory environment.  This article highlights how to develop competency assessments that are effective in measuring knowledge-based and skill-based tasks.
[Image: Performance Growth Model]
Most organizations have the goal of moving employees from a state of “untrained” to “fluent” in a given skill or ability.  To illustrate this, we look to the Performance Growth model, which provides us with four distinct levels of performance, starting with skill acquisition. 

Skill Acquisition - This is the initial phase of learning a new skill and the period where training takes place.  The learner is introduced to the content and, depending on the task, one or more practice sessions may be needed to learn how to perform the required steps and the proper sequence of the steps.  During this time, constant assistance and coaching are necessary. 

Skill Competency - This is the intermediate phase in learning a new skill or task.  At this point, the learner can perform the required steps in the proper sequence without assistance, but may not progress from one step to another efficiently.  In other words, they can perform the task and can do it correctly, but they may be slower than more experienced employees.  This is the point to assess initial competency, to confirm that the training delivered during the skill acquisition phase was successful. 

Skill Proficiency - This is where a learner is out in the workforce and gains more experience.  He or she will be able to efficiently and precisely perform the proper sequence of steps.  Some organizations actually establish productivity standards and expectations for experienced staff based on skill proficiency levels.  Again, proficiency levels are performance standards that you would expect of experienced employees, not someone who has just recently achieved competency. 

Skill Fluency - Finally, our ultimate goal is that the learner will achieve skill fluency.  This is the final stage in learning a new task and it is true mastery.  If performance is not fluent, it is not likely to be maintained.  With fluency, correct performance becomes second nature and mistakes or errors decline. 

When discussing competency and proficiency, it is important to know the difference.  Webster's defines competency as "having the essential abilities or qualities that make one legally qualified to do a job," while proficiency "involves qualities of being adept, skilled, or advanced in an occupation."  The difference is that competency is the minimum standard of performance required to be considered qualified in a task, while proficiency is the greater efficiency that comes with experience.

After learners complete initial competency assessments, they are deemed qualified to perform the task independently.  However, skills and knowledge continue to improve with experience on the job and learners gain greater efficiency which leads to becoming proficient in the task.

This brings us to competency assessment.  Webster’s defines competency assessment as “Putting a value on the essential abilities or qualities that make one qualified to do a job”.  For our purposes, we will view competency assessment as a process to test that the training was successful and the learner can demonstrate the required skills or apply required knowledge.

There are many advantages to designing competency-based assessments.  They help to standardize the training materials and ensure that the training is consistent, whether it is delivered in one location or several.  Competency assessments can form the basis for trainer demonstrations.  In addition, they can function as a self- or peer-assessment tool that can be used at any time.  The primary advantage of competency-based standards is that all learners have their skills measured against the same standard, which also provides a basis for follow-up evaluation if needed.   

Likewise, there are also some drawbacks to developing competency-based assessments.  They do require more time and effort to develop, and they are much more formal than an evaluation of a person's skills based on job shadowing to determine whether they are ready for the job.  Competency-based assessments are much more structured and, when developed correctly, outline specific performance standards.  In addition, an adequate number of skilled trainers are required to conduct training and assessments, because competency-based training usually involves one-on-one or small-group instruction designed to be deployed in the job environment.

When designing competency assessments it is important to remember there are two main types of tasks to be assessed, skill-based and knowledge-based.  Competency assessments need to be designed to test the specific type of task.

Skill-based assessments are designed to evaluate the use of psychomotor abilities and the application of performance skills.  In other words, the learner must be able to demonstrate that skill.  An example of an assessment instrument designed for skill-based tasks is a performance-based observation checklist.  Through direct observation, the observer can evaluate whether the learner can perform the task, step-by-step and without assistance, based on standardized performance criteria. 

The second type of assessment is appropriate for knowledge-based tasks.  These assessments are designed to evaluate the use of cognitive or mental abilities and the application of knowledge.  In other words, the learner must be able to apply what he or she has learned.  Examples include quizzes, scenarios, role-plays, etc.  This type of knowledge-based competency assessment might evaluate whether the learner can perform a calculation based on real or scenario values, or whether the learner can make a decision or solve a problem based on a case study. 
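As a minimal sketch of how such a knowledge-based item might be scored (the dilution scenario, values, and tolerance below are invented for illustration, not from the article), the learner is given scenario values, applies the calculation, and the answer is graded against a key:

```python
# Hypothetical knowledge-based assessment item: the learner calculates a value
# from scenario data, and scoring compares the answer to the answer key.
import math

def grade_calculation(learner_answer, expected, tolerance=0.01):
    """Pass if the learner's value is within 1% (relative) of the key."""
    return math.isclose(learner_answer, expected, rel_tol=tolerance)

# Invented scenario: concentration after a 1:4 dilution of a stock solution.
stock_concentration = 20.0            # arbitrary units
expected = stock_concentration / 4    # answer key: 5.0

print(grade_calculation(5.0, expected))   # correct answer passes
print(grade_calculation(4.0, expected))   # wrong answer fails
```

A tolerance like this matters in practice because rounding conventions vary; the specific threshold would come from the standards established for the task.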

Now that we have a good idea of the different types of tasks and the different types of assessments that would be appropriate for each, let us look at how to actually develop the competency assessments.  There are four major steps to developing competency assessments: 

  1. Review the process or procedure in order to "chunk" it down into manageable tasks.  A task is the smallest piece of that process that can be performed by one person.
  2. Perform task analysis to define the critical steps for a skill-based task or critical content for knowledge-based tasks. 
  3. Establish specific standards for the task.  These are the minimum expectations that define how we determine when the task is done correctly or knowledge applied accurately.  This is the most critical part of the process.  If you do not establish specific standards for each step, it is very difficult to determine whether the learner is doing it correctly. 
  4. Develop the appropriate competency assessment instrument for the type of task being taught, skill-based versus knowledge-based. 

Example of a skill-based competency assessment:

[Image: Sample skill-based competency assessment checklist]
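The sample above appeared as an image in the original article.  As an illustrative sketch only (the step descriptions, standards, and criticality flags below are hypothetical, not the author's actual form), a performance-based observation checklist can be modeled as a list of steps, each with a standard and a criticality flag; the learner achieves competency only when every critical step is performed to standard:

```python
# Illustrative model of a skill-based observation checklist.
# Steps, standards, and criticality are invented examples.
from dataclasses import dataclass

@dataclass
class Step:
    description: str   # what the observer watches the learner do
    standard: str      # minimum expectation defining "done correctly"
    critical: bool     # must be met to achieve competency

def assess(steps, observed_met):
    """Competent only if every critical step was performed to standard."""
    return all(met or not step.critical
               for step, met in zip(steps, observed_met))

checklist = [
    Step("Verify specimen label against requisition",
         "Two identifiers match exactly", critical=True),
    Step("Centrifuge specimen",
         "Correct speed and time per SOP", critical=True),
    Step("Record results",
         "Entered in the record before end of shift", critical=False),
]

print(assess(checklist, [True, True, False]))  # non-critical miss: competent
print(assess(checklist, [True, False, True]))  # critical miss: not competent
```

Writing the standard for each step down explicitly mirrors step 3 of the development process above: without a stated standard per step, the observer cannot objectively judge whether the step was done correctly.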

In conclusion, here are some tips for successful implementation of competency assessments.  Develop the assessments to be as objective as possible so that the behavior, not the personality, is assessed.  During the training period, provide both positive and constructive feedback; this is critical during skill acquisition, the first phase of learning.  Perhaps the most important tip is to allow sufficient practice time for skill acquisition.  Don't rush competency assessments.  Remember, our overall goal is to develop a workforce with solid performance and to enable each individual to move through the performance growth model and eventually reach fluency.

Jill Drummond is Director, Training and Education for Blood Systems (www.bloodsystems.org).
