AI in Audit: Balancing Risk, Innovation and Responsibility

Auditors are adopting AI to boost efficiency, but risk management remains critical. Learn how to balance innovation with governance and human oversight.

Artificial intelligence has already arrived in the audit profession. It is changing how firms test transactions, assess risks and provide assurance. For internal auditors, external practitioners and decision-makers, the critical issue is not whether AI will influence their work, but how to adopt it responsibly while protecting against new risks.

At the recent Caseware Speaker Series, cyber risk leader and author David Gee urged auditors to shift their perspective. “Everybody is rushing into AI. There will be car crashes,” he said. “The challenge is to balance speed of innovation with controls that prevent serious missteps.”

Gee argued that auditors should no longer see themselves as referees who only blow the whistle on errors. Instead, they must become goalkeepers who actively protect their organisations while supporting innovation.

Why AI matters for auditors

AI is already reshaping audit processes.

  • Automation is reducing time spent on routine testing.
  • Predictive analytics is helping auditors detect risks earlier.
  • Continuous monitoring is moving assurance away from annual cycles toward near real-time oversight.

The scale of this shift is clear. Gartner’s 2023 Hype Cycle for Generative AI report predicts that by 2026 more than 80 per cent of enterprises will be using generative AI applications, compared with fewer than 5 per cent in 2023.

Four priorities for balancing risk and innovation

1. Build AI Literacy Across Teams

Most auditors still rate themselves as beginners in AI. Gee emphasised that this gap must close quickly. Audit teams need to understand how models are trained, how data quality affects outcomes and how to challenge vendor claims. Training programs from professional bodies such as ISACA provide practical entry points.

2. Strengthen Governance and Oversight

Traditional frameworks do not cover AI effectively. Boards and audit committees should establish AI ethics committees or appoint a Chief AI Officer. They must require documentation of AI decision-making and maintain strong audit trails. Risk appetite statements also need updating to reflect AI adoption.

3. Move from Reactive to Predictive Auditing

AI can help auditors anticipate risks before they escalate. This requires investment in continuous monitoring systems that detect anomalies in real time. Gee also highlighted the importance of red teaming, where AI models are stress-tested to identify weaknesses before they undermine audit quality.

4. Focus on Human Skills AI Cannot Replace

While AI can accelerate processes, it cannot replicate judgment. Critical thinking, ethics, leadership and communication remain vital. “These are things AI cannot do, and they will only become more important,” Gee said.
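To make the continuous-monitoring idea in priority 3 concrete, here is a minimal sketch of the kind of anomaly screen such a system might run over a batch of transactions. It uses a simple z-score test; real monitoring platforms use far more sophisticated models, and the function name and threshold here are illustrative assumptions, not any specific product's API.

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=3.0):
    """Return amounts more than `threshold` standard deviations
    from the batch mean -- a basic z-score screen of the sort a
    continuous-monitoring pipeline might apply to new transactions."""
    mu = mean(amounts)
    sigma = stdev(amounts)
    if sigma == 0:
        return []  # all amounts identical: nothing to flag
    return [a for a in amounts if abs(a - mu) / sigma > threshold]

# Example: one outsized payment among routine ones
payments = [120, 135, 110, 128, 142, 118, 9800]
print(flag_anomalies(payments, threshold=2.0))  # flags 9800
```

A screen like this only surfaces candidates; as Gee's emphasis on human skills suggests, an auditor still has to judge whether a flagged item is an error, fraud or a legitimate outlier.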

Key risks in AI adoption

AI adoption also carries significant risks.

  • Bias in algorithms can distort audit evidence.
  • Data confidentiality may be compromised when sensitive information is processed externally.
  • AI-driven cybercrime is projected to cost the global economy trillions within three years.
  • Hallucinations and lack of explainability may undermine audit credibility and complicate regulatory defence.

Australia’s Auditing and Assurance Standards Board (AUASB) has already identified AI as an area of concern and is expected to release further guidance.

Why smaller firms may have an advantage

Large firms often lead technology adoption, but Gee suggested that smaller practices may adapt more quickly. With flatter structures and fewer rigid roles, small and mid-sized firms can run pilot projects with less resistance.

Examples include using AI to draft management letters, summarise client data and assist with risk assessments. These smaller initiatives allow teams to build experience and confidence without major disruption.

“The biggest challenge is not the technology. It is convincing people to embrace it,” Gee noted.

Next steps for the profession

Responsible adoption of AI requires a deliberate approach. Auditors should begin with small projects, invest in literacy, embed governance frameworks and double down on the human skills that technology cannot replicate. David Gee has written a practical guide on AI in audit with detailed recommendations. A recording of his full webinar is also available on demand.