Algorithms are getting bad press, from the US elections to the UK's summer 2020 A-level results and, more recently, AI in assessment for hiring. Mistakes have been made in applications of AI, and they have eroded trust. However, a lot of good can be achieved in student assessment when AI is applied thoughtfully and correctly, says Sova's Dr. Alan Bourne.
Despite the use of Artificial Intelligence (AI) in our daily lives, and in many critical decision-making processes, there has been significant and understandable resistance to accepting AI as a tool in hiring.
This is primarily because, in many of the examples people see, it is unclear whether the AI can be shown to be fair. Building trust, and ultimately social legitimacy, is difficult when positive messages about testing, results and benefits are crowded out by stories in the media about AI gone wrong.
To demonstrate to the wider world that AI has a role in student assessment, we need to set clear requirements for how fairness and relevance are proven, alongside communicating openly about why and how these tools help make positive change.
Raising the bar without losing client focus
Firstly, I think we need to consider whether the bar for measuring the success of AI in assessment has been set too low.
Any tool that promises to be 'better than CV screening' isn't promising much! Expectations are high, and any tool we use must be proven better than the alternatives, not just equal to them, if it is to gain social acceptance. Cancer screening is a useful comparison: AI is accepted there because it demonstrably augments the work of a highly skilled radiographer, a far higher bar.
Secondly, I believe it's vitally important to use only techniques that we fully understand and that are underpinned by evidence. It's easy to see why US lawmakers have moved to restrict the use of facial recognition. How did the creators develop the model in the first place? However innovative the concept, at the time of its development there were no models and no training data sets that could have fairly trained a facial recognition algorithm. Launching unproven concepts into the big wide world is a risk, and one that undermines the trust we are trying to build in AI.
Lastly, there is a need for more transparency around AI in assessment for recruitment: transparency for buyers of AI tools about how decisions are made by a provider's models and recorded in an audit log, and transparency for candidates about what goes on in an AI model and how fairness and prediction are built in.
Using AI for good
It’s important to remember that there has always been numbers-based decision making in occupational psychology.
It used to be that those numbers were derived from paper-based assessments rather than by AI, and it was widely accepted that they were fair and reliable. Today, our team of psychologists can still do this, but it takes a really long time! AI streamlines the process without compromising fairness or reliability.
But there has also been a lot of change in AI algorithms over the last five years, resulting in a big difference between AI that is tried and tested and AI that is new and experimental. At Sova, we only use proven AI that has a level of maturity in which we're fully confident.
For example, we can prove that the video interview feature in our platform accurately identifies the right people to go through to assessment centre. We get positive results and feedback because not only is it predictive and statistically fair, it's also easily explainable. If someone asks how it works, the answer is simple.
Our AI tool transcribes an interview and analyses what people have said, irrespective of how they've said it. It's used with human augmentation to make the final decision, and no demographic data is collected. Easily explained and transparent.
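To make that idea concrete, here is a minimal, hypothetical sketch of text-only interview scoring. It is not Sova's model; the simple keyword counting below only illustrates the principle that the scorer sees what was said, never how it was said, and that the output is advisory input for a human decision.

```python
# Illustrative sketch only (not Sova's implementation): score an interview
# transcript purely on its text content against a few competencies.
import re
from collections import Counter

# Hypothetical competency keyword sets for illustration.
COMPETENCY_KEYWORDS = {
    "teamwork": {"team", "collaborated", "we", "together"},
    "problem_solving": {"analysed", "solved", "approach", "root", "cause"},
}

def score_transcript(transcript: str) -> dict[str, float]:
    """Score the transcript text only; tone, accent, appearance and
    demographic data never enter the calculation."""
    words = Counter(re.findall(r"[a-z']+", transcript.lower()))
    return {
        competency: float(sum(words[w] for w in keywords))
        for competency, keywords in COMPETENCY_KEYWORDS.items()
    }

# The transcript would come from a speech-to-text step upstream.
transcript = "We collaborated as a team and analysed the root cause together."
advisory_scores = score_transcript(transcript)
print(advisory_scores)  # a trained assessor reviews these and makes the final call
```

Because the inputs and outputs are just words and scores, the logic is easy to explain to a candidate or a buyer, which is the point being made above.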
We also use AI to continuously monitor and improve the effectiveness and fairness of our assessment solutions. AI helps us to optimise every bit of content so that it's as useful as possible. The result is a highly efficient and predictive solution for our clients, and an enjoyable and relevant experience for candidates.
Six fundamentals of good AI
If you plan to add some AI techniques to your recruitment toolkit, these six fundamental points will provide a solid grounding. Even without the application of AI, these points are central to any best practice assessment journey.
- Is it predictive? Does the tool predict the outcome, and can you demonstrate it? For example, is there correlation with identification of high performers, retention or any other goal you have set?
- Is it fair? If you measure a certain attribute at one point in time and measure it again at a later date, are the results consistent? This reliability, paired with prediction, forms the cornerstone of fairness (the sketch after this list shows how both checks can be quantified).
- Is it accurate? Whether you’re dealing with huge applicant numbers or a small population, accuracy is key to separating out the right candidates.
- Is it a good candidate experience? Do candidates find the experience valuable, engaging and developmental? Boring, lengthy or irrelevant tests will impact your results.
- Is it relevant? Research shows that when candidates perceive the assessment process as fair and job related, they are more likely to perform well in it. There is an appetite to buy innovative technology, something we also promote and support, but this needs to be balanced with the ability to prove that it works for your organisation.
- Is it transparent? Transparency in assessment determines whether or not it’s seen as socially legitimate. This relates back to our original point about perceptions, reputation and trust.
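To show what the first two checks look like in practice, here is a minimal sketch using made-up numbers. It applies standard psychometric measures, correlating assessment scores with a later outcome for predictive validity and with a repeat sitting for test-retest reliability; it is an illustration of the general approach, not any specific vendor's method.

```python
# Quantifying 'Is it predictive?' and 'Is it fair (reliable)?' with toy data.
import numpy as np

# Hypothetical data for ten candidates: assessment scores at hire, the same
# assessment repeated later, and subsequent job-performance ratings.
scores_t1 = np.array([62, 74, 58, 81, 69, 90, 55, 77, 66, 85])
scores_t2 = np.array([60, 76, 61, 79, 71, 88, 57, 75, 68, 83])
performance = np.array([3.1, 3.8, 2.9, 4.2, 3.5, 4.6, 2.7, 3.9, 3.3, 4.4])

# Predictive validity: do scores correlate with the outcome you care about?
predictive_validity = np.corrcoef(scores_t1, performance)[0, 1]

# Test-retest reliability: do scores stay consistent when measured again?
test_retest_reliability = np.corrcoef(scores_t1, scores_t2)[0, 1]

print(f"Predictive validity (score vs performance): {predictive_validity:.2f}")
print(f"Test-retest reliability (time 1 vs time 2): {test_retest_reliability:.2f}")
```

In real deployments these figures would be computed on much larger samples and monitored over time, but the principle is the same: the checks are evidence you can produce, not claims you simply assert.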
Part of our job at Sova is to create educated, informed buyers. Our commitment is to apply AI ethically to make fairer hiring decisions and to alleviate the inefficiencies of the assessment process. We know that when done well, AI has the potential to overhaul the fairness and accuracy of how people are selected for a job.
To learn more about AI in assessment, our whitepaper Ethics, Equality and Empowerment explains more about assessment in the age of AI.
For more insight into the tools and techniques used in student assessment, read ISE's Student Recruitment Survey 2020.