Monday, March 26, 2012

Does {X} Get You Hired?

Over the past twelve to eighteen months I have taken part in many conversations about the value of something educational – a degree, a certificate, a course, volunteering – in terms of getting the participant hired or promoted.  This seems to be the current hot topic in performance improvement.

… And it implies a line of demarcation and a rite of passage across that line… that is, are you in or are you out?


Is that good or bad?  To be determined.  Let's consider the following…

A friend and mentor of mine, Judy Hale of Hale Associates, has worked with accreditations and certifications for years and has written one of the field’s go-to books on the topic, Performance-Based Certification: How to design a valid, defensible, cost-effective program.  In recent meeting discussions about emerging certifications at ISPI, she has mentioned meeting with the Department of Labor, which is now requesting (requiring?) a new criterion for the certifications it accepts.  The criterion?  Does having this certification make a difference in hiring?  That is, will people with this certification get hired over those without it? 

In one way, this is great progress; in another, it is just another kind of shortcut.  Employability as the validation may be just as wonky as learning objectives that have been dumbed down to little more than a list of topics (but that's another blog). 

Look at it this way.  Last year one of my clients was a community college developing programs that it wanted to promote as certifications, though mostly they were certifying course attendance and the passing of both knowledge and hands-on tests.  Their goal, however, was to provide courses to the unemployed.  The courses were the easy part.  The tough part was determining whether any of their programs made a difference in hiring.  Well, some roles are synonymous with their certification (Certified Nursing Assistant, for example) – one must have completed the training program and earned the certificate in order to get hired for the role.  Besides, that program has been around a long time and is well validated. 

However, a direct correlation may say more about completing the program than about doing the work. 

The veterinary equivalent, veterinary assistants, are just now beginning to be certified; in fact, all veterinary assistants in the United States (not just my area) were uncertified as late as 2010, and only a small fraction of one percent were certified in 2011.  A new program and certificate may, in fact, be improving the work of untrained employees who are already doing the work and using a certificate as a stepping stone.  It may also improve the chance that individuals who are currently unemployed will get work.  Both are likely.  However, there is no baseline data and no way to track progress after completing a program, so how can we measure its effectiveness?  In ten years, the situation will likely be like the CNA's, where you can't get hired without a six-week certificate program under your belt. 

Is employability the criterion here?  One certification can show a tight correlation between certification and job, while the other cannot – merely because it’s too new to have data and because the field has a history of employing the untrained.  Which program is better at preparing the learner for the role? 

Okay, another example.  Last week I received a call from a local journalist who was working on an article about the value of volunteering and the way that businesses were using it to promote themselves -- both internally with their employees and externally with their clients.  We might consider this an item on the third line of the triple bottom line -- corporate social responsibility -- with the other two being the top line (revenue) and the bottom line (net profit after taxes and expenses).  Here the social responsibility factors include the work-life balance of employees, giving back to the community, receiving awards for work done, receiving publicity for work done, etc.  Whether employees who participate might keep their jobs when others are laid off, and whether those with volunteer work on their resumes made it through the hiring process or got a promotion -- those were the open questions for this journalist.

The conversation started around a volunteer project that I had headed up as a volunteer with a professional society.  Did it get any of us a job or promotion?  No.  As a professional society, we did promote the project, and it created a lot of interest among our members.  However, when members asked about participating in similar projects, they were unwilling to commit the time required.  In the end, the journalist and I talked about the value of volunteering with a professional society as one way that I provide a ‘gut check’ for my clients about whether my certification by that society is valid and valuable.  That is, having a certification might not get me hired, but having volunteer work with the organization providing it might validate the certification.

Let's try a different look at employability measures.  Not too many months ago, the key indicator of getting hired as an instructional designer was whether you had 3-5 years of Flash and Photoshop experience -- another {X} variable.  For many organizations, the requirement to have that experience was less than astute because the rest of the position description wandered; it wasn’t clear that they had the ability to put someone with five years of high-tech instructional design and development experience to work using that experience. 

It’s still an open question.  Does having {X} – degree, certificate, course, volunteer experience, or specific tool experience – make one more employable?  Maybe we should also ask whether individuals with {X} produce work of higher quality, quantity, speed, and flexibility… and whether that is what the employer really wants.  Our line of defense for these solution sets wobbles, with openly visible gaps in the line.

Actually, the most needed question is whether ‘employability’ is the best test of {X}.  Employability may be only one part of the story, and the measure of whether one is “in” or “out” may need to be more complex than “do you have {X}?”  The line between ‘in’ and ‘out’ gets a bit convoluted when we zoom in to look closely at whether we have the right performance measures at all.  


Let's talk.  What examples do you have of validating a program or accreditation?  How effective are the measures in demonstrating the success of that program or accreditation?  

Photographs by Sharon Gander © Jan 2012