How to make AI for recruitment a strength, not a weakness

May 17, 2023 | How-to, Selection & assessment, You might have missed


There’s growing concern that recent AI developments represent as much risk as they do reward. Robert Newry, CEO at Arctic Shores, offers practical ideas to make AI for recruitment a strength, not a weakness.


Flash back to 2014 and Amazon’s disastrous first foray into AI for recruitment, and the concern over the latest iteration is entirely understandable.


However, ten years on, surely we have learnt, iterated and improved? Are new AI tools like ChatGPT set to become our greatest strength in recruitment? Or an even greater weakness in waiting?


The news and hype justify the question. A lawsuit last year revealed significant issues with using AI to read facial expressions and link them to suitability for a job.


The impact has been that public institutions have started to regulate how AI is used in sensitive areas like recruitment.


Our past experience with AI seems to have been overlooked amid all the excitement over OpenAI’s ChatGPT. Once again AI is being hailed as the answer to big recruiting challenges like large-scale screening. As recruitment teams are increasingly being asked to do more with less, automation is no longer optional.


The reason we keep searching for a better way to automate aspects of recruitment is that humans are just as flawed when doing repetitive tasks like reviewing CVs or interviewing. Years of unconscious bias training have not made the difference we expected.


With concerns emerging over discrimination in the use of AI (see a review of ChatGPT’s role in CV screening), the main question for many in recruitment remains – should we embrace this or keep it at a distance?


This is one of the themes we will be discussing in the ISE webinar on 25 May 2023 ChatGPT and the Impact to Early Careers Recruitment.



If AI is not necessarily the saviour, what should we do? 


Successful recruiters of tomorrow will work out how to leverage advances in AI by understanding its weaknesses as much as its strengths. Learning how to unlock AI’s potential will help you unlock your own.


What we do know is that the last thing anyone in volume hiring wants to do is to go back to more manual screening. So the question is how can we get the best of both worlds – be more human AND more automated?


Here are three tips I have learnt since AI first appeared as a means of automating tasks ten years ago. They will help employers to make AI for recruitment a strength not a weakness.


1. Seek independent AND expert knowledge


Spend time with those who know, before taking the plunge. It seems obvious, but who do you turn to, and where?

There is plenty of advice spread around LinkedIn, but mostly from individuals who are not experts in occupational best practice.

This is why the Amazon team failed: they had great expertise in data science and machine learning, but they didn’t involve business psychologists, who would have insisted on best practice and avoided the basic group adverse impact their model fell foul of.

Since entering the world of psychometric assessment, I have discovered that business psychologists are an overlooked and undervalued expert resource.

They are trained to look at any data relating to people in the workplace with more than just a statistical correlation perspective. They expect representative comparison groups, they check for adverse impact, they know how to interrogate validation data, and most importantly they know what best practice looks like.
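To make one of those checks concrete: a widely used heuristic for adverse impact is the “four-fifths rule”, under which a group’s selection rate falling below 80% of the highest group’s rate is treated as a red flag. The sketch below is illustrative only, with invented numbers, and is a fraction of what a trained business psychologist would actually examine:

```python
# Illustrative adverse impact check using the "four-fifths rule":
# a group's selection rate below 80% of the highest group's rate
# is a conventional red flag. All numbers below are invented.

def selection_rate(selected, applicants):
    """Proportion of a group's applicants who passed the screen."""
    return selected / applicants

def four_fifths_check(groups):
    """groups: dict mapping group name -> (selected, applicants).
    Returns (impact_ratio, flagged), comparing the lowest and
    highest selection rates across groups."""
    rates = {name: selection_rate(s, a) for name, (s, a) in groups.items()}
    impact_ratio = min(rates.values()) / max(rates.values())
    return impact_ratio, impact_ratio < 0.8

# Hypothetical screening outcome: 50/100 selected in one group, 30/100 in another.
ratio, flagged = four_fifths_check({"group_a": (50, 100), "group_b": (30, 100)})
print(f"impact ratio = {ratio:.2f}, flagged = {flagged}")  # 0.30/0.50 = 0.60 -> flagged
```

A provider who cannot show you this kind of analysis for their own algorithm, across representative comparison groups, is asking you to take fairness on trust.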

Like any profession, not all are at the same standard. A good place to start if you want to find someone is the Association of Business Psychologists.

Even before you get expert advice, there are some basic questions to ask AI providers:

  • What is the size and depth of the comparison group used to train the AI algorithm? (Do not be fobbed off with ‘this is proprietary’, because it isn’t, and at a minimum it should be available under a non-disclosure agreement)
  • Can the way the score was calculated be explained to a candidate or a regulatory body?
  • Can the provider supply evidence (which can be independently scrutinised) of the consistency and reliability of their results?

One method to avoid is testing any AI out on yourself or a few members of staff. Any scientist will tell you that if you want to know the validity of something, small sample sizes are not the way to do it.

You wouldn’t expect your doctor to try out a new drug on a couple of his or her friends and family before prescribing it to patients. So why would you do that with AI for recruitment?
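The statistics behind this are unforgiving. A quick sketch, using the standard Fisher z transform and purely illustrative numbers, shows how wide the uncertainty around a validity correlation is at a tiny sample size compared with a proper validation sample:

```python
# Why tiny trials tell you little: the approximate 95% confidence
# interval around an observed validity correlation, via the Fisher z
# transform. The correlation and sample sizes are illustrative only.
import math

def correlation_ci(r, n, z_crit=1.96):
    """Approximate 95% confidence interval for a Pearson
    correlation r observed in a sample of size n (n > 3)."""
    z = math.atanh(r)          # Fisher z transform of r
    se = 1 / math.sqrt(n - 3)  # standard error of z
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

# An observed correlation of 0.30 between assessment score and performance:
print(correlation_ci(0.30, 10))   # tiny pilot: interval spans zero, uninformative
print(correlation_ci(0.30, 500))  # proper validation sample: clearly positive
```

With ten people, a correlation of 0.30 is statistically indistinguishable from zero; with five hundred, it is solid evidence. That is the gap between trying a tool on a few colleagues and a real validation study.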


2. Use AI to improve on what humans do poorly and use humans to do what AI does poorly


AI is not a silver bullet. It can process information far faster and more reliably than a human, and it can now process that information with something approaching human-level reasoning. But don’t use AI to automate a flawed process and then blame AI!

I have come across far too many cases where AI and/or new tech was introduced to automate an already flawed human process, and the organisation then concluded that AI doesn’t work or is dangerous.

Facial recognition in video interviews was one example: no research shows that our facial muscle movement reliably corresponds to our performance in the workplace. Another is removing names from CVs, which just pushes bias into other areas of CV screening, such as university attended or hobbies.

Technology has never been the solution on its own. First, we need to be clear about what we want to hire for, then build an appropriate and objective process around that and then work out how best to automate it.

The problem for TA teams is the vast array of HR tech solutions, all promising great improvements with terms like ‘ethical AI’. Working out which solution to use, and how, can be mind-dizzying.

The answer then is not less automation and more human interaction. It is to redesign your process and apply your expertise selectively – I like the analogy of ‘book-ending’.

Careful human research, interrogation and configuration at the start, validated AI in the middle, and human oversight at the end to check the results align with the hypothesis.

By putting the problem first and designing your process around it, you’ll be able to build a tech stack that is more human in its fairness, not less. For ideas on how this could look, please see our CV-Less Hiring Playbook.


3. Win hearts and minds


For hiring managers, accepting that a CV review does not predict performance is THE biggest challenge. CVs are a comfort blanket – they are built deep into our psyche, and challenging that position feels like challenging a core belief.

Few hiring managers realise (or, worse, acknowledge) that there can be over 150 biases at play when reviewing a CV. The trick is to show why a new approach is better, rather than lambast hiring managers for being biased.

A workshop that brings the flaws of relying on a CV to life is a great way to start. You could also bring in an inspirational external speaker who can show why the CV is a barrier to change rather than an enabler.

You won’t convince everyone at the outset, which is why a pilot showing the benefits of a new approach is so important. Hiring managers need to see how you can bring AI and human oversight together to make better human decisions. We do it all the time in sport, so why not recruitment?



A case study in next generation recruitment


Last year, Siemens Electrification and Automation was experiencing all the problems I highlighted above when looking for project engineers. Some roles had gone unfilled for 200 days.


The managing director, Jon Turner, decided a radical change was needed if his business was not to be held back from growth.


Jon was willing to experiment with a radically new approach to broaden his talent pool, one that blended a human-friendly process with automation. One of the most important things he told his team was that the pilot might not be successful the first time, but if they learnt from their mistakes he was sure they would make real improvements.


The first step was the bold move to scrap the CV as the initial screen. The number of applications jumped from just over 80 in previous campaigns to over 500. Without an online assessment, this would have brought his TA team to a halt.


What appealed to him was that the assessment enabled all applicants to be considered in the same way, and because each one received a feedback report on their strengths, it still respected the candidate.


The big success for Siemens was that the team found eight incredible candidates (four men and four women) from talent pools the business had never considered (or been open to). One had worked at KFC, another at Aldi and another was an internal candidate. They took on two of them and filled the role in just 41 days. The hiring manager was thrilled.


Final thoughts


Be honest with yourself. You are not a tech expert, let alone an AI one. That’s OK; you are not expected to be. Be comfortable with getting advice on the process as much as the technology.


If you do your due diligence right, you will create a foundation to offset your weaknesses with AI’s greatest and growing strengths.


You may also be interested in

Register for our FREE ISE webinar on 25 May 2023: ChatGPT and the Impact to Early Careers Recruitment

How is AI being used to recruit graduates and apprentices?


The good, the bad, and the unclear of AI in recruitment
