Extra Q&A Session
Practical use cases for AI in governance

Your Questions Answered

We often receive more questions than we can answer during our webinars. To address this, we host a follow-up session with our expert panel to unpack and answer these questions, then share the responses with you, our audience. Watch the recording below, read the transcript, or watch the full webinar here.

Q&A with our experts

AI is evolving fast, and there are so many choices out there. How can Boards and executives find the right AI solutions for their organisation? Are AI coaches or other support options available to help?

ANSWER: Boards are increasingly recognising the importance of understanding AI, yet many have not formally discussed it at the board level. Studies by Forbes and Deloitte indicate that while 83% of companies see AI as a priority, nearly half have not addressed it in board meetings.

To approach AI strategically, boards should consider frameworks from directors' institutes in Australia, New Zealand, and the US. Instead of focusing on specific tools, boards must take a governance-first approach, ensuring they have the right strategic oversight and capability.

Additionally, cross-functional involvement from finance, risk, and IT teams is crucial. Boards can also leverage established readiness frameworks, such as Gartner’s AI Maturity Model and IBM’s AI Ladder, to guide AI discussions and strategy.

Are there AI policy templates, draft frameworks, or governance models that boards and executives can use as a starting point to manage AI in their organisations?

ANSWER: This webinar has provided an AI policy framework as a resource. It was developed using detailed, carefully structured prompts rather than a single simple AI-generated request. The framework is intended to be reviewed, adapted, and customised to fit an organisation's specific needs, ensuring buy-in at all levels.

Additional AI governance frameworks worth considering include:

  • The Australian Government's AI Ethics Principles
  • Singapore’s AI Verify and Model AI Governance Framework, which balance governance with innovation
  • OECD’s AI Policy Observatory
  • NIST’s AI Risk Management Framework

A cross-functional approach is essential: beyond compliance, legal, and IT, operational roles should be involved to ensure AI is effectively embedded across the organisation.

For organisations looking to develop a customised AI policy, Google's NotebookLM is recommended. It allows companies to input their specific data, alongside existing frameworks, to generate a tailored policy. However, reviewing and verifying the AI-generated output is crucial to ensure accuracy and alignment with organisational goals.

Data privacy and security are big concerns with AI, especially when dealing with sensitive information like financial reports. What are best practices to ensure data protection, and does AI leave a 'digital paper trail' that could be discoverable in legal proceedings?

ANSWER: Data privacy and security are key concerns in AI governance. Boards should ensure AI governance charters align with regulatory requirements, covering data retention limits, sharing protocols, accountability structures, encryption, and access controls.

For financial data security, organisations must use compliance-ready AI platforms such as Azure or AWS and ensure providers meet standards like ISO 27001. Financial institutions must also comply with regulatory guidelines (e.g., APRA). Additional security solutions, such as LayerX and Zscaler, can enhance AI oversight by providing visibility into AI usage, monitoring data inputs, and sandboxing environments.

AI systems inherently generate digital footprints, including data inputs, decision-making logs, and training data sets. Organisations must establish strong governance frameworks to manage these digital records and ensure compliance. Guidance from the Office of the Australian Information Commissioner (OAIC) stresses restricting sensitive data inputs and training teams on AI policies and acceptable use.

Boards should prioritise auditability in AI governance. A PwC AI legal report highlights key considerations for boards, emphasising the need for thorough research and understanding to ensure proper oversight and compliance.

We've heard about Google's Gemini and other emerging AI tools—what insights can you share on these, and what should Boards be looking for when evaluating AI tools?

ANSWER: The board's role is not to evaluate AI tools—that is the responsibility of management. Instead, the board should focus on governance by asking the right strategic questions, such as:

  • What is the return on investment (ROI)?
  • What are the opportunities and risks?
  • How can AI be effectively deployed within the organisation?
  • Is there a cultural change program in place to support implementation?

While the board can approve management’s recommendations, its primary role is oversight, not selecting specific tools.

One concern with AI is that new recruits may miss out on learning experiences traditionally gained from foundational tasks. How are organisations addressing this issue, and what should boards consider in terms of AI’s impact on talent development?

ANSWER: The nature of work is constantly evolving, and AI is accelerating these changes. While certain tasks traditionally done by junior professionals are being automated, the focus should be on AI-assisted roles rather than full replacement.

A historical perspective shows that just as computers transformed work decades ago, AI is reshaping fields like law and accounting today. For example, in the U.S., the OSCAR legal LLM framework allows junior lawyers to quickly access case law, shifting their focus from manual research to critical thinking and application.

Similarly, in finance and accounting, AI is removing repetitive tasks, enabling junior professionals to develop analytical and strategic skills. The workplace structure will continue evolving, but integrating AI should prioritise learning, mentorship, and skill development rather than eliminating foundational roles.

Boards are often asked about the return on investment (ROI) for AI. How should Boards evaluate the financial impact of AI investments, and what role should they play in encouraging AI adoption—such as sponsoring pilots or requesting AI reports?

ANSWER: At the board level, it is crucial to document baseline operational metrics such as costs, productivity, error rates, and revenue streams. Boards should guide management in pilot-testing AI initiatives on a small scale to measure their impact and extrapolate ROI.

Management should provide regular updates, such as quarterly or biannual reports, detailing AI’s influence on cost savings, revenue growth, productivity, and innovation. Non-financial metrics, including customer satisfaction, improved insights, risk mitigation, and new revenue channels, should also be considered.

However, successful AI adoption requires more than just implementation—it must be supported by cultural change within the organisation. Without the right mindset and adoption strategies, AI projects may fail to deliver their full potential.

What are the major red flags Boards and executives should watch for when implementing AI—such as hallucination, bias, shadow AI, and ethical risks—and how can they manage these risks effectively?

ANSWER: AI can generate inaccurate information, known as hallucinations, because it prioritises providing answers over correctness. Human oversight is essential to verify AI-generated content. Microsoft's "Copilot" branding reinforces that AI should assist rather than replace human decision-making.

Shadow AI occurs when employees use unauthorised AI tools without organisational approval. While this can boost productivity, it also risks exposing sensitive information and bypassing established risk and ethical guidelines. Boards must address this by ensuring oversight and governance of AI usage.

Regulatory compliance is increasingly important, with regulators like New Zealand’s Privacy Commissioner focusing on AI explainability. Boards must understand legal frameworks such as the GDPR (where applicable) and ensure AI decisions are transparent and unbiased. Collaborations, such as working with academic institutions, can help ensure compliance.

A clear, adaptable AI policy is crucial. Organisations should maintain a living document that guides AI usage and aligns with their specific needs. Webinar resources include an AI policy framework as a starting point for organisations to tailor to their requirements.

Governance Made Easy Webinar Series

What our Webinar Guests Have to Say

"thank you for a great year of sharing information to help me understand and make better informed decisions on our board"
Jess Hona
Chairman, Mid-Market (101-500 emp.)
"Everyone of the many BoardPro webinars I have attended have been fantastic presentations, full of valuable tips and useful for not only Board members, but for anyone who sits on any committee. Well done to Sean and his professional team for making a significant difference to my position"
Pam Brand
Chairman, Mid-Market (101-500 emp.)
"I like how the webinars are concise, and deliver what is required, to make modifications to current systems. No pressure and no selling at your events is great. We can simply focus on the great content"
Paula M
Secretary, Small-Business (50 or fewer emp.)
"Thank you - I find your webinars extremely well done and address concerns/solutions I may not have time to consider, focus on, or realize I need"
Susan Sacks
Director, Mid-Market (101-500 emp.)
"Thank you for providing insight into better board governance, I have picked up valuable information from these short and sharp webinars."
Mary A
Company Secretary, SME (51-100 emp.)
"This webinar provided an excellent opportunity to think through what is currently provided to my Boards and opportunities where improvement could be achieved."
Virginia P
Board Secretary, Mid-Market (51-1000 emp.)
"These 45 min sessions are fast becoming my staple brief sessions for topical skills update. Keep them going please."
Andrea M
Human Resource & Compliance, Small-Business (50 or fewer emp.)
"Thank you, this was a helpful session and very timely - our board has just this year planned specific strategic discussion topics into our board calendar, to help us be more strategic."
Peter L
Chairman, Mid-Market (51-1000 emp.)
"Thank you for providing a forum for knowledge sharing, being a CEO is often isolating. This opens the conversation up for sharing experience and positive, solutions based outcomes"
Lester M
Chairman, Mid-Market (101-500 emp.)
View other webinars