Neil Johnson, ACCA Careers editor



Studying this article and answering the related questions can count towards your verifiable CPD if you are following the unit route to CPD, and the content is relevant to your learning and development needs. One hour of learning equates to one unit of CPD.

Although many might fear the robots are coming for their jobs, in fact it looks just as likely they’ll seek you out to offer you one.

A survey by the UK’s Institute of Student Employers found that 28% of its members were using AI to recruit graduates and apprentices in 2023, a significant leap from just 9% the previous year. Other research has found that 44% of HR executives in the US have already embraced AI, while 79% of recruiters believe AI will soon be able to make hiring and firing decisions.

If you’re looking for new job opportunities, it’s worth being aware of how recruiters and employers are using the tech so you understand how your application might be viewed and processed.

Automated interactions

Generative AI, which is built on large language models (see the AB article The rise of the LLM), is at the forefront of much of this change. Recruiters and employers particularly value its ability to create job descriptions and conduct automated interactions with potential candidates.

In a recent blog, Mike Smith, chief executive at recruitment company Randstad Enterprise, describes a tool the company uses to create job ads automatically. This frees up recruiters’ time for more value-adding work that AI can’t do and that needs a human touch, such as building meaningful relationships with candidates and other initiatives to make the hiring experience more positive.

Randstad also uses a tool that creates a list of interview questions from job descriptions, generates model answers and ranks candidates on their responses. It can then reach out to potential candidates, who can click a link for a brief introduction to the role, and even conduct an initial screening with those who are interested.

Recruiters seek specific skills as opposed to the more traditional search for education and experience

Furthermore, LLM tools such as ChatGPT can be used as an automated interviewer or assessor for online interactions with candidates. ‘It can ask standardised questions, evaluate responses and provide an objective analysis of candidates’ performance, which helps reduce bias,’ says Bharath Bala, a Singapore-based recruiter specialising in technology.

‘These tools can also administer skills-based tests to evaluate candidates’ knowledge and abilities,’ continues Bala (more on this below). ‘It can simulate real-world scenarios and assess candidates’ problem-solving skills, technical expertise and domain-specific knowledge.’

Tools can then follow up with applicants. ‘After they’ve evaluated candidates, LLM-based applications can generate personalised feedback, highlighting strengths and offering suggestions for professional development, including relevant resources or training programmes,’ says Bala.

Skills-based focus

Skills-based hiring is currently gaining ground in the jobs market, thanks to innovations afforded by AI. Here, recruiters seek specific skills rather than following the more traditional search for education and experience. Before AI, this was time-consuming, requiring recruiters to manually map candidate skills against job descriptions. AI can now match the two almost instantly.
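At its simplest, the mapping described above is a comparison between the skills a candidate offers and the skills a role demands. The sketch below illustrates that idea only; real systems use natural language processing to extract skills from free text, whereas here both lists (and all skill names) are invented for illustration.

```python
# Minimal sketch of skills-based matching: compare a candidate's stated
# skills with those a job description asks for, ignoring case.
def skill_match(candidate_skills, required_skills):
    """Return a 0-1 overlap score and a sorted list of missing skills."""
    candidate = {s.lower() for s in candidate_skills}
    required = {s.lower() for s in required_skills}
    matched = candidate & required
    score = len(matched) / len(required) if required else 0.0
    return score, sorted(required - candidate)

# Hypothetical candidate and role: two of the three required skills match.
score, missing = skill_match(
    ["Excel", "IFRS", "Python", "stakeholder management"],
    ["IFRS", "Python", "audit"],
)
```

A production matcher would also handle synonyms and related skills ("management accounts" versus "management accounting"), which is where the AI does the heavy lifting.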

This can uncover untapped and overlooked talent, and may even go some way to addressing a key AI worry: diversity. Because this approach does not focus on people with specific degrees from ‘the right schools’ or experience at only certain companies, AI can quickly and deeply mine for skills from far and wide.

Blunt tool

Diversity, bias and discrimination are still major concerns in the recruitment tech space – for candidates and employers alike.

One high-profile example is Amazon’s automated recruitment tool, which unwittingly discriminated against female applicants after its CV-screening algorithm taught itself to prefer male candidates, having been trained on hiring data from the preceding 10 years.

Katherine Cooke, senior associate in employment law at Higgs LLP in the UK, points out that businesses using AI to screen CVs would be liable for any discrimination that occurred as a result of using AI.

‘The AI can scan applications and rank them on pre-determined criteria – for example, academic results or keywords used to describe skillsets. While employers might see this as an economic strategy to produce a manageable shortlist, they should be careful,’ she says.
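The screening Cooke describes, scanning applications and ranking them on pre-determined criteria such as keywords, can be sketched very simply. The sketch below is illustrative only (all names, CV texts and keywords are invented), and it deliberately shows why such screening is a blunt tool: a candidate who describes the same skills in different words scores zero.

```python
# Illustrative keyword-based CV screening: score each application by how
# many pre-determined keywords its text contains, then shortlist the top N.
def rank_applications(applications, keywords, shortlist_size=2):
    """applications: dict of name -> CV text. Returns top-N (name, score)."""
    scores = []
    for name, text in applications.items():
        lowered = text.lower()
        score = sum(1 for kw in keywords if kw.lower() in lowered)
        scores.append((name, score))
    # Highest score first; ties broken alphabetically for determinism.
    scores.sort(key=lambda pair: (-pair[1], pair[0]))
    return scores[:shortlist_size]

# Candidate B has relevant experience but uses none of the exact keywords,
# so a keyword screen ranks them last - the bluntness Cooke warns about.
shortlist = rank_applications(
    {
        "A": "Chartered accountant, IFRS reporting, audit experience",
        "B": "Bookkeeping and payroll specialist, statutory accounts",
        "C": "IFRS, audit and tax adviser",
    },
    ["IFRS", "audit", "tax"],
)
```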

‘ChatGPT may struggle to grasp the contextual nuances of certain questions or responses’

In addition, LLMs lack contextual understanding, trust, transparency and emotional intelligence. ‘In certain situations, candidates may require empathy and emotional support that a machine may not be able to provide adequately,’ says Bala. ‘Relying solely on AI systems could lead to a perceived lack of personal touch or a sense of disconnect.

‘In addition, ChatGPT may struggle to grasp the contextual nuances of certain questions or responses. It could misinterpret or misjudge candidate or employee input, leading to inaccurate assessments or inappropriate recommendations,’ he adds.

As with accountancy and many other sectors, recruitment still appears to rely on humans for the important nuance, even if the bots are filtering for skills.