AI in recruitment is a fast-growing hiring practice. With Australia’s spending on AI systems set to grow to $3.6 billion in 2025, AI is clearly here to stay.
Employers have a responsibility to ensure inclusion and diversity are front of mind when designing and implementing AI-powered tools, and recruitment is no exception.
Now, employers have guidance. The Diversity Council Australia (DCA)’s new Inclusive AI at Work in Recruitment: How organisations can use AI to help rather than harm diversity outlines how to embed inclusion and diversity practices into AI-powered recruitment tools from the get-go.
The Role of AI in Recruitment
Uses of AI in recruitment are wide-reaching, appearing at every stage of the recruitment process from resume screening to candidate matching. Employers use predictive AI to source job candidates on websites like LinkedIn, chatbots and video interviews with virtual recruiters, and simulations and games such as AI-powered psychometric testing.
Using AI to aid recruitment holds indisputable organisational benefits. It can reduce hiring times by 90 per cent, reduce screening costs, lower turnover rates, and improve revenue. If used well, AI can also diversify hires, maximise accessibility, fairness, objectivity and transparency, and help employers broaden their talent pool of applicants – but the challenges of hiring inclusively must first be addressed.
Challenges and Concerns
Using AI in recruitment to facilitate a diverse and inclusive workforce can be challenging, and those challenges tend to mirror those seen in traditional hiring processes. Biases like preferring candidates with formal education, thus alienating candidates from lower socio-economic backgrounds, have long played out in traditional hiring methods. As AI is built by humans, the same biases show up in how AI recruitment tools are designed, creating algorithms that are not inclusive.
Remember that to hire truly inclusively, AI should be just one of several recruitment tools. One quarter of Australians face systemic barriers to being online and using technology. This means that, when used in isolation, AI in recruitment can alienate many people from your recruitment pool, including people with limited income, people in regional areas, people with disabilities and many others.
DCA's New Report
DCA’s guidelines show how to create a team to keep your organisation up to speed on how bias plays out in recruitment so that an inclusive AI recruitment plan can be designed and implemented.
Employers must assess the state of their workplaces to carefully plan an inclusive AI recruitment strategy. They can do this by first assessing:
- how inclusive their workplace is and
- how advanced their AI systems are.
Quite often, organisations believe their services are inclusive, but overlook or are unaware of a particular cohort of users who have difficulties with those services. Professionals who have the expertise and know what they are looking for can review systems and provide feedback to improve outcomes. The team at the Centre for Inclusive Design are experts in this area and can help.
Inclusive Design Principles
To create inclusive AI recruitment tools, employers and hirers must use inclusive design principles. Inclusive design connects government, industry, and organisations with communities of people who are traditionally excluded or unable to access products, services, and the built environment – often called ‘edge users.’
Incorporating the lived experience of edge users into the design process will increase the accessibility and usability of AI in recruitment tools. It is a practice of designing with, not for. When hirers and employers engage edge users from the beginning, they create systems that benefit everyone.
Addressing Bias and Fairness
Inclusive design can help address bias in AI recruitment by utilising the skills and insights of edge users to make the process fairer. DCA’s guidelines recommend allocating a diversity and inclusion (D&I) team who can assess AI recruitment tools from an inclusion and diversity lens.
Choosing D&I team members from backgrounds with experience of being marginalised by labour market recruitment trends will be key to your team being equipped to anticipate and identify bias in your AI tool. Your D&I team will still need to be trained to understand how bias in traditional recruitment has worked and how these biases manifest in AI. You might not have people from all walks of life, so you can seek support outside your organisation.
A high level of AI maturity is also needed to successfully deploy AI tools that do not inadvertently disqualify edge users. AI specialists need to train anyone using the AI tool on how to use it to minimise bias, and on how collating data about people through the algorithm can exclude many jobseekers from consideration.
Transparency and Accountability
While AI technologies used in recruitment are new and legislation is yet to catch up, organisations and their leaders must be held accountable for their AI-driven hiring decisions and remember that a monolithic workforce lacking diversity will also lack innovation.
Transparency in AI-driven recruitment will be critical to AI being used ethically and inclusively. Employers and vendors should openly communicate how generative AI tools are used in candidate assessment, so that suitable candidates are not disqualified by the whims and prejudices baked into a human-made AI tool.
Practical Steps for Inclusive AI Recruitment
For organisations to adopt inclusive recruitment processes, leading with the mindset of ‘How can we codesign with community from the beginning to the end of the process?’ is key.
The creation of a D&I team is a practical way of ensuring the right questions are asked before deploying AI. Also critical is the process of continuously monitoring and allowing for adjustment of the AI tools you use. Feedback loops are necessary as is ensuring the system can be amended and adjusted to facilitate inclusive AI recruitment practices.
The adoption of an inclusive design mindset – designing with, not for – is paramount to organisations hiring inclusively and diversely when using AI. So when you are looking to create a new system, think about who it works for, who it doesn’t work for, and how you might find ways to recruit inclusively for everyone.
Diversity Council Australia’s Inclusive AI at Work in Recruitment: How organisations can use AI to help rather than harm diversity helps organisations manage the risks associated with AI in recruitment to ensure we include, not exclude. Find out more here.
Manisha Amin is the CEO, chief strategist and visionary of the Centre for Inclusive Design (CfID), a social enterprise leading the conversation on the power of thinking from the edge. Manisha currently sits on the Boards of Diversity Council Australia, Bambuddah Group, and Nautunki Theatre Company, as well as the SBS Community Advisory Panel. She was a former Board member and Deputy Chair of ADHD Australia.