AI has the potential to reduce bias and promote diversity in the recruitment process. But exactly how it works and the extent to which it can positively impact diversity remains a mystery to most HR and recruitment professionals, a new study has found.
We know that Artificial Intelligence (AI) has incredible potential to positively influence diversity of hire as well as reduce bias in recruitment. But, like any tool, it’s not foolproof. How effective is AI at reducing bias? How can human recruiters work with AI to improve diversity outcomes? And how much technical expertise is needed to select and use AI properly?
When we considered these questions, we found little research had been done in this space, so we have partnered with Diversity Council Australia (DCA) and Monash University to undertake a series of ground-breaking studies into AI in recruitment and selection. This research looks at the impact of unconscious bias on recruitment and selection decisions using AI, and what interventions can work to minimise or remove the influence of unconscious bias in recruitment.
Fostering diversity, equity and inclusion is a key part of our recruitment strategy at Hudson RPO, both internally and for our clients. Our clients rely on us to make sure that we positively influence diversity outcomes for them, whether it's in process, technology, training or tool selection. As trusted partners, we have an obligation to use new technology responsibly and to be informed about how it may impact hiring decisions.
Inclusive AI @ Work
Over the last three months, DCA and Monash interviewed a cross-section of professionals, including recruiters and HR professionals, job seekers, HR subject matter experts, AI developers, and AI subject matter experts.
When it comes to HR and recruitment professionals, the preliminary research uncovered a number of interesting findings:
AI has incredible potential. Recruiters and HR professionals have embraced AI and have integrated it into their workflows with encouraging results. AI tools have allowed them to improve efficiencies, elevate the candidate experience, and focus on high-value tasks.
The inner workings of AI remain unknown. Although they’ve embraced these tools, the recruitment and HR professionals surveyed indicated that there was a lack of clarity about what AI does and how it works in practice, describing it as a “DIY Black Box”. Some felt they needed more training to better understand the inner workings of AI.
There's a push for greater customisation. Most wished for more customisation, e.g. improving usability and outcomes by refining various inputs, such as through weighting, but did not know who to contact to achieve this. This left them unsure how these tools might impact D&I.
The key takeaway seems to be that AI is neither friend nor foe for recruitment and HR professionals, and that more training and support is needed to help them use these tools to remove bias and promote diversity.
To explore these preliminary findings further, have a look at this infographic in the report.
Putting AI to work
So, what does this mean for recruitment professionals looking to use AI tools to promote a more diverse hiring environment? Like most technologies, you can’t simply take a plug-and-play approach. There needs to be careful consideration around a number of factors.
Here are our top takeaways for HR and recruitment professionals who are using AI.
Be clear on where AI can add value (and where it can’t)
One thing this research points to is the need to be very clear about your talent acquisition strategy. Like any technology, you shouldn’t use AI for AI’s sake. Recruitment and HR leaders need to focus on driving outcomes and measurement and then work out where AI fits within this framework.
Review the roles that you're recruiting for and consider where in that lifecycle AI can add the greatest value. For example, are you operating in a candidate-rich or candidate-poor environment?
For roles where you don't have a lot of candidates applying, for instance, very senior or technical roles, AI would be best used to source those people. In a candidate-rich environment such as a customer service role, AI would be best used to help screen and assess them.
By understanding where in the recruitment lifecycle you need to enrich the candidate experience, you’ll be able to better determine the type of technology you need to achieve your outcomes.
Carefully research the available tools
AI has infiltrated almost every stage of the recruitment lifecycle - from sourcing to selection to onboarding. By one count there are over 250 different commercial AI-based HR tools available. This proliferation of tools means technology providers are clamouring to gain market share and we are being bombarded with claims of what this technology can do.
But, not all tools are created equal.
Take chatbots, for example. Some use natural language processing and sound so conversational that you may not even realise you’re talking to a robot, whereas others are so stilted that it's immediately clear you’re talking to a computer.
And that’s just one small subset of AI in recruitment. There are tools across the entire recruitment lifecycle that all vary in terms of what they claim to do, so it’s worth doing your research before you make your decision.
Be proactive: AI is dynamic and evolving
Although AI is designed to be intuitive and to learn on its own, it’s not a set-and-forget tool. It’s crucial that you’re not only training people in how to use it, but constantly evaluating and refining it to ensure it’s working optimally.
AI technology itself is constantly evolving, and so should your strategy around it.
Asking some important questions like these will help you critically assess your tool:
Is the algorithm giving us more diverse candidates? Is there bias creeping into the tool?
Has the data been periodically validated? Has any data become obsolete?
Where does it need to be customised to perform better?
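The first of these questions can be made concrete with a simple selection-rate check. The sketch below applies the "four-fifths rule" commonly used in adverse-impact analysis of hiring funnels: compare each group's selection rate against the highest-rate group, and flag ratios below 0.8 for review. The group names and counts are purely illustrative, not real data, and a real audit would use your own funnel metrics and legal guidance.

```python
# Minimal adverse-impact check for a screening stage, using the
# conventional four-fifths (80%) rule. All figures are hypothetical.

def selection_rates(outcomes):
    """outcomes: {group: (selected, applied)} -> {group: selection rate}"""
    return {g: sel / app for g, (sel, app) in outcomes.items()}

def adverse_impact_ratios(outcomes):
    """Ratio of each group's selection rate to the highest-rate group.
    A ratio below 0.8 is a conventional flag for possible adverse impact."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Hypothetical screening outcomes: (candidates advanced, candidates screened)
outcomes = {
    "group_a": (45, 100),
    "group_b": (30, 100),
}

for group, ratio in adverse_impact_ratios(outcomes).items():
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

A check like this won't tell you *why* a tool is filtering one group more heavily than another, but running it periodically on each AI-assisted stage of the funnel gives you an early signal that bias may be creeping in.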
The AI tool’s creator should have established systems for documenting and monitoring its performance. However, like any relationship, there are two sides. If DEI is an organisational priority for you, it's important to work hand in glove with your tech providers and recruitment team to continually validate the data and make sure it’s giving you the outcome you desire.
The question as to who should review your AI tool is an important one.
In our business, we have a manager of the solution working hand in hand with our technology team to make sure that the tool is working optimally. This is who our recruiters go to if they've got questions or issues. That individual then liaises with the technology provider to get those questions answered or those customisations made.
If you don’t have someone internally, it’s a good idea to partner with professionals who can help you manage this relationship. It’s far too easy to neglect the ongoing monitoring and review process, especially when things get busy. And considering last year was one of the busiest recruitment years on record, it’s easy to see why this happens.
Create some space and resources to perform those reviews and ongoing customisations and your AI tools will continue to serve you well.
It’s still early days
Although AI has been embedded into almost every aspect of the recruitment lifecycle, we're still very much at the early stages of our AI journey.
We’re at the very start of what will probably be a multi-generational trend. Just as we all felt a little uncertain when Applicant Tracking Systems started to replace our Excel spreadsheets, AI today remains largely unknown territory.
And that’s why this research is so crucial. The DCA and Monash research program is currently in year two of a three-year program. We’re excited to see what comes out of the next phase of research as we uncover practical recommendations for using AI to reduce bias and improve diversity of hire.