It seems you can’t go anywhere without hearing about AI. Around one corner is excitement about its potential, around another is a warning about its risks. If you travel far enough, you’ll hear enough variations of the two to exhilarate and frighten you in equal measure.
This reaction isn’t new. English philosopher Francis Bacon observed centuries ago that people tend either to reject new things as dangerous or to embrace them eagerly without properly understanding them.
We can see this playing out today when it comes to deciding whether to use AI in some steps of creating the annual report. Some are adamant that AI should be prohibited, saying it has no place in crafting the narrative or numbers in a legal document of record. Others think it’s the silver bullet they’ve been waiting for, with the potential to make reporting better, faster and cheaper.
The truth, of course, is somewhere in between.
Understand the limitations
AI doesn’t guarantee a correct answer but predicts the most likely response based on its training. This creative, probabilistic approach is why some people are frustrated when they don’t get the ‘right’ answer from AI. It gives a creative solution to a problem, but it’s not the only (and perhaps not even a correct) solution. This is where human judgment comes in. As the saying goes, a machine can outperform a person, but a person working with a machine beats a machine alone.
So, you need to decide what role you want the machine to play. In corporate reporting, how you use AI depends on whether you want it to be a co-creator (giving ideas and helping you think through them) or a co-worker (giving you the product you’ve asked for). Either way, AI is not a shortcut but a tool to make your report better than it would have been without it.
With its probabilistic, creative nature in mind, AI can support corporate reporting in a number of ways.
Companies produce large volumes of operational, financial and other performance data, which AI can quickly analyse and synthesise, identifying trends and patterns that can take humans days or weeks to find – if at all.
AI can help with research, scanning peers’ reports and identifying practices you may want to emulate or avoid. It can compare your reporting to the requirements. It can bring you up to speed on economic, geopolitical or scientific developments that may affect your business. Such insights can help you identify actions to take.
Using the data and research, AI can also help in drafting performance commentary, reporting policies and disclosures. Whether it’s used at the start or end of the writing process, AI can help with grammar, readability, length, tone and consistency. Make sure it knows who the audience is, what they care about, what the key messages are and how much detail you want to share.
Key principles
Some say that AI has no place in some, or any, of those tasks; others are eager to get started. For those setting out on the journey, there are a number of key principles to bear in mind.
First, you need to ensure that the processes are backed by strong governance. You need to be able to stand behind every single word and number in your annual report – legally and reputationally. Using AI never absolves you from your responsibilities. Use a secure, internal system, manage access carefully and have a clear AI use policy.
AI is not a shortcut. It can make things more efficient, but it requires careful thought, attention and oversight.
Second, remember that AI is only a starting point, never the finished product, wherever you are in the process. Let it give you ideas, but fact-check everything – even when you start to trust it and even when you’re using internal data. Its ‘insights’ are simply its best guess about how to interpret the data. It needs to be combined with your knowledge of your business, the market and your view of the future. You’ll know what’s plausible and what’s nonsense.
Finally, ensure you’re using the right prompts – a skill that takes a lot of practice. They should be specific, detailed and give plenty of context. To reduce the risk of hallucinations (a plausible-sounding but inaccurate response), ask AI to be direct and to admit when it can’t answer your question.
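By way of illustration only, a specific, detailed prompt for drafting performance commentary might be put together along the lines below. The variable names, wording and placeholders are hypothetical – nothing here prescribes a particular tool – and the same structure applies whichever secure, internal system you use.

```python
# A hypothetical sketch of a specific, detailed prompt with plenty of context.
# The names, wording and placeholders are illustrative assumptions only, not a
# recommendation of any particular tool.

audience = "institutional investors and analysts"
key_messages = [
    "what drove this year's performance",
    "how we see the outlook for next year",
]
key_messages_text = "; ".join(key_messages)
data_extract = "<paste the relevant, verified performance data here>"

prompt = f"""
You are helping to draft the performance commentary for an annual report.

Audience: {audience}.
What they care about: the drivers behind the results and the outlook.
Key messages: {key_messages_text}.
Level of detail: a concise overview of around 300 words, using no figures
beyond those supplied in the data extract.

Be direct. If you cannot answer from the information provided, say so
explicitly rather than guessing.

Data extract:
{data_extract}
"""

# The assembled prompt would go to your secure, internal AI system, and the
# output would be fact-checked before any of it reaches the report.
print(prompt)
```

The closing instruction is doing most of the work: asking the model to be direct and to admit when the answer isn’t in the material is what reduces the risk of a plausible-sounding guess.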
Beware bias
The risk of bias is a common and important concern. There are three ways that an AI output can be affected by it, and when creating your internal AI platform you’ll need to decide how to deal with each:
- The data: the information AI learns from shapes its responses. You can limit the sources it’s allowed to use, but ultimately the data is what it is.
- The algorithm: the process AI uses to analyse the data can be neutral or trained to prioritise some data over others.
- The user’s prompt: how you phrase your prompt can lead AI towards a particular answer (a short illustration follows this list).
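As a small, hypothetical illustration of that last point, compare two ways of asking about the same results. Neither is a prescribed wording; the sketch simply shows how framing alone can steer the answer.

```python
# Two prompts about the same data. The first presupposes the conclusion and
# invites a one-sided answer; the second leaves the interpretation open.
# Both are illustrative assumptions rather than recommended wording.

leading_prompt = (
    "Explain why this year's results demonstrate the success of our strategy."
)

neutral_prompt = (
    "Summarise this year's results, covering both favourable and unfavourable "
    "trends, and flag anything in the data that does not support the points "
    "you make."
)
```

The second phrasing doesn’t remove any bias in the data or the algorithm, but it at least avoids adding your own.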
AI is already changing how we work, and it’s only a matter of time before it changes how we approach corporate reporting. Its potential and its risks are significant. It can do things you can’t and complement the things you can. And it can broaden your perspective by revealing what you may not have spotted yourself.
Experiment to see if it does improve your report. If it doesn’t, don’t use it – for now. But don’t stop trying. It may be able to help you do it better, faster and cheaper one day soon.