While employees in many industries fear that AI may be coming for their jobs, recent research by King’s College London into the use of AI in professional services has found that human oversight and judgment have not (so far at least) been replaced by generative AI.
AI, in the form of large language models, is sensitive to the precise questions it is asked and the information it is provided with. Because these models may still hallucinate, practitioners are mainly using AI to assist in the middle of a task or workflow, according to the research findings. Framing the problem (for example, composing and ordering prompts) and assessing and judging AI outputs are still largely done by humans with domain expertise.
Clarity and context
In our interviews with senior managers and partners at 11 professional service firms globally – from major global firms to boutique practices – many mentioned that generative AI lacks the contextual understanding and clarity of thought necessary for the provision of professional services. They also noted that much higher-quality drafting can be achieved when prompts are longer and more detailed, and when the AI model is instructed to review the firm's existing documents on a drive while drafting.
Across both large and small firms, many interviewees described using a 20-60-20 rule: the first 20% and final 20% of a task are performed by humans, with the middle 60% completed through AI automation. Many emphasised that human judgment and expertise remain essential in delivering professional services. Generative AI has not yet replaced expert human judgment because of its tendency to hallucinate.
Several interviewees pointed out that if human judgment were indeed not necessary in professional services, then software companies would have already made practices obsolete. AI-generated content is typically generic and unsuitable for high-level professional services.
Clients pay for human insight, context and accountability. High professional fees cannot be justified if the advice presented to clients is generated only or mostly by AI. Participants repeatedly emphasised that, to remain successful, a professional service firm must answer a fundamental question: what can humans provide that AI cannot?
Human factor
Many firms predicted there will be a growing premium on human connection, alongside a continued mistrust of the explosion of AI-generated content. One global firm stated that over the next three to five years there will be a renewed premium on face-to-face connection, similar to the renewed demand for human interaction seen after the Covid-19 pandemic.
This firm argued that platforms such as LinkedIn are already flooded with AI-generated content, leading people to trust online material less and to seek face-to-face conversations instead. As a result, traditional networking and personal connections may become far more important in professional services.
Another concern raised by respondents was the risk of misinformation and the creation of deepfakes. The explosion of AI-generated content, including large volumes of false material, reinforces the premium placed on face-to-face relationships over online interactions. Personally knowing stakeholders and hosting networking events and receptions may therefore become increasingly important.
Trust issues
Overdependence on AI by firms was also identified as a serious risk for their long-term business success. Replacing expertise with automation may erode client trust, which could lead to reputational decline. Firms may also struggle to justify charging high retainers if clients believe tasks are largely being performed by AI. Professional services depend on trust and relationships, neither of which can be automated.
In addition, many firms emphasised that ‘we are not a technology company’, and that remaining successful hinges on ensuring they do not become one. If firms shift too far toward technology-driven models, they risk undermining what makes them valuable as professional service providers.
Staff impact
The reliance on AI by senior partners to perform tasks usually done by junior executives is leading to a flattening of organisational structures and tighter entry routes into the profession. While respondents say junior and middle-level roles are likely to remain, the reduced opportunities for experiential learning may affect the talent pipeline and long-term career development.
Because replacing junior staff with AI endangers firms’ medium-term prospects, some respondents said their recruitment approach now places more emphasis on well-developed soft skills when hiring entry-level graduates.
More information
See ACCA’s talent trends research and other insights on the future of work. See also our technology hub.