How to Build a Business Case for AI Investment in Australia: Calculating ROI for Build vs Buy Scenarios
Why Most Australian AI Investments Fail to Produce Measurable ROI — And How to Structure One That Does
The numbers are sobering. MIT's The GenAI Divide: State of AI in Business 2025 study, based on 52 executive interviews, surveys of 153 leaders, and analysis of 300 public AI deployments, concludes that despite billions in investment, most corporate AI efforts are failing to produce business results — with 95% of pilots delivering no measurable P&L impact. Meanwhile, 42% of companies surveyed by S&P Global in 2025 abandoned most AI initiatives — a dramatic increase from 17% in 2024, suggesting the measurement gap is widening rather than closing.
For Australian businesses, the stakes are tangible. Australian organisations are already achieving a 15% return on their business AI investments, yielding an average return of US$3.2 million on an average US$19.1 million spend, according to the SAP Value of AI Report (2025) undertaken by Oxford Economics — and ROI is expected to nearly double to 29% within two years. But that aggregate picture masks a brutal distribution: a small minority of AI programs capture most of the value, while the majority produce nothing.
The distinguishing factor is not the technology chosen. Alignment with business strategy is ranked the most important factor for optimising AI ROI — yet only 10% of Australian businesses are investing in AI in a strategic and holistic manner, with the majority taking a piecemeal approach (46%) or leaving it to individual departments (32%).
This article provides a step-by-step framework for constructing a credible AI investment business case — one that can survive board scrutiny, anchor the build vs buy decision in financial governance, and give your organisation a measurable path to joining the 5% that succeed.
Why the Build vs Buy Decision Must Be Made Inside the Business Case, Not Before It
One of the most common mistakes Australian executives make is treating the build vs buy decision as a technology strategy question, then constructing a business case to justify the answer they've already reached. This sequence produces precisely the kind of "AI theater" the MIT study documents: projects chosen for their impressiveness rather than their financial return.
The correct sequence is the inverse: define the measurable outcome you need, model the total cost of ownership for each delivery path, and let the financial analysis determine whether you build, buy, or adopt a hybrid approach. The build vs buy choice is an output of the business case, not an input.
(For a grounding in what custom AI development vs off-the-shelf tools actually entail, see our guide on What Is the Build vs Buy AI Decision? A Plain-English Explainer for Australian Business Leaders.)
Step 1: Define the Business Problem in Financial Terms Before Touching the Technology
Every credible AI business case begins with a financially quantified problem statement — not a technology aspiration. The question is not "How can we use AI?" but "What specific business outcome, if improved, would produce measurable financial value, and by how much?"
The Problem Quantification Framework
Work through these four questions before any technology evaluation begins:
What is the current cost of the status quo? Express this in AUD per year: labour hours × average loaded cost, error rates × average cost per error, customer churn rate × average customer lifetime value, and so on.
What is the theoretical maximum improvement? If this process were perfect, what would it cost? The gap between current state and theoretical maximum is your value ceiling.
What improvement is realistically achievable? Use conservative benchmarks. Research indicates 2–4 years for meaningful ROI, not the 7–12 months vendors typically promise. Deloitte reports approximately 12 months just to overcome initial adoption challenges before scaling can begin, with early gains often modest — 5–10% efficiency improvements that compound over time.
What is the minimum ROI threshold required for board approval? Establish this before modelling — it defines whether the project is worth pursuing at all.
Projects often fail because goals are vague, outcomes are not tied to tangible business value, or the expected return on investment was never clearly established. A problem statement like "improve customer service with AI" fails this test. "Reduce first-contact resolution time from 8.2 minutes to 5.5 minutes across our 45-seat contact centre, saving approximately AUD $680,000 per year in labour costs" passes it.
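A problem statement like the one above falls directly out of a baseline calculation. The sketch below reproduces the contact-centre example; the call volume and loaded hourly cost are illustrative assumptions chosen to match the stated saving, not sourced figures.

```python
# Status-quo cost baseline for the contact-centre example.
# CALLS_PER_YEAR and LOADED_COST are hypothetical inputs; replace them
# with your own volumes and ABS/SEEK-benchmarked labour costs.

def annual_handling_cost(calls_per_year: int, minutes_per_call: float,
                         loaded_cost_per_hour: float) -> float:
    """Labour cost of handling a given call volume at a given handle time."""
    hours = calls_per_year * minutes_per_call / 60
    return hours * loaded_cost_per_hour

CALLS_PER_YEAR = 275_000   # assumed volume for a 45-seat centre
LOADED_COST = 55.0         # assumed AUD per agent-hour, fully loaded

current = annual_handling_cost(CALLS_PER_YEAR, 8.2, LOADED_COST)
target = annual_handling_cost(CALLS_PER_YEAR, 5.5, LOADED_COST)
print(f"Annual saving: AUD {current - target:,.0f}")  # ~AUD 680,000
```

Sensitivity-test both inputs: halving the call volume halves the saving, which is exactly the kind of assumption a board will probe.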
Step 2: Build the Total Cost of Ownership Model for Both Paths
The most common cause of AI business case failure is not overstating benefits — it is catastrophically understating costs. A 2025 survey reported by CIO.com found that a majority of organisations misestimate AI costs by more than 10%, with nearly a quarter underestimating costs by 50% or more.
The AI total cost of ownership is shaped less by model training expenses and more by the operational lifecycle that follows: maintenance, data management, integration work, and compliance obligations that accumulate over time. These overruns rarely originate from model costs alone; they typically emerge from indirect operational expenses that become visible only after systems move into production.
TCO is a financial framework that includes all direct and indirect costs associated with acquiring, implementing, operating, and maintaining AI solutions over their full lifecycle. The formula is:
TCO = Acquisition + Implementation + Operating + Upgrade/Enhancement + Downtime/Risk + Opportunity Costs
Apply this formula separately to the build path and the buy path across a consistent 3-year modelling horizon.
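A minimal version of that side-by-side TCO model can be sketched as follows. Every dollar figure and growth rate here is a placeholder assumption to be replaced with your own quotes and the cost categories in the tables below.

```python
# Simplified 3-year TCO comparison for the build and buy paths.
# All figures are illustrative assumptions, not benchmarks.

def three_year_tco(upfront: float, annual_operating: float,
                   annual_growth: float = 0.0) -> float:
    """Upfront cost plus three years of operating cost, optionally escalating."""
    return upfront + sum(annual_operating * (1 + annual_growth) ** y
                         for y in range(3))

build = three_year_tco(upfront=450_000, annual_operating=90_000)  # maintenance ~20%/yr
buy = three_year_tco(upfront=60_000, annual_operating=150_000,
                     annual_growth=0.15)  # assumed licence escalation
print(f"Build 3-yr TCO: AUD {build:,.0f}")
print(f"Buy   3-yr TCO: AUD {buy:,.0f}")
```

In practice each path's `annual_operating` figure should be built up line by line from the category tables below rather than entered as a single number.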
TCO Components: Build Path (Custom AI Development)
| Cost Category | What to Include | Common Underestimate |
|---|---|---|
| Development | Scoping, design, build, testing | Scope creep (typically +20–40%) |
| Data preparation | Cleaning, labelling, pipeline engineering | Often 3–5× more expensive than anticipated |
| Infrastructure | Cloud compute, GPU access, MLOps tooling | Scales non-linearly with usage |
| Talent | ML engineers, data scientists, AI architects | Recruitment premium + retention risk |
| Integration | Connecting to existing systems (ERP, CRM, etc.) | Frequently the largest hidden cost |
| Compliance | Privacy Act, APRA CPS 234, audit readiness | Ongoing, not one-off |
| Maintenance | Model retraining, monitoring, incident response | 15–25% of build cost per year |
| Change management | Training, adoption programs, workflow redesign | Commonly omitted entirely |
(For detailed Australian cost benchmarks across project tiers — from AUD $30,000 chatbots to AUD $1M+ enterprise systems — see our guide on The True Cost of Building Custom AI in Australia: Budgets, Timelines, and Hidden Expenses.)
TCO Components: Buy Path (Off-the-Shelf AI Tools)
| Cost Category | What to Include | Common Underestimate |
|---|---|---|
| Licensing | Subscription fees, per-seat or per-API-call costs | Volume-based price escalation |
| Implementation | Configuration, integration, data migration | Vendor proposals typically understate this |
| Integration | Connecting to Xero, MYOB, local platforms | Especially complex in the Australian market |
| Training | Staff onboarding, ongoing competency | Treated as a one-off; should be recurring |
| Customisation | Workarounds for capability gaps | Grows over time as requirements evolve |
| Vendor dependency | Exit costs, data portability, switching risk | Rarely modelled upfront |
| Compliance gap | Data residency, Privacy Act obligations | May require additional local hosting costs |
The total cost of ownership for AI tools extends far beyond initial licensing fees — encompassing infrastructure, maintenance, training, and often-overlooked integration expenses. Organisations that fail to account for these comprehensive costs risk budget overruns of 30–40% within the first year of implementation.
(For the full buy-side cost landscape including data residency constraints specific to Australian businesses, see our guide on Off-the-Shelf AI Tools for Australian Businesses: What's Available, What It Costs, and Where It Falls Short.)
Step 3: Quantify Benefits — Hard Savings and Soft Value
A board-ready business case distinguishes clearly between hard savings (which directly reduce costs or increase revenue and can be recognised in financial statements) and soft benefits (which are real but require attribution assumptions). Both belong in the model, but they must be labelled honestly.
Hard Savings Categories
- Labour cost reduction: Hours eliminated × loaded cost per hour (include superannuation, leave, and overhead)
- Error rate reduction: Incidents avoided × average cost per incident
- Processing speed improvement: Throughput increase × revenue or cost-per-unit impact
- Outsourcing cost reduction: BPO or agency spend displaced by AI capability
- Compliance cost reduction: Audit hours, remediation costs, regulatory penalty avoidance
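The loaded-cost multiplier in the first category above matters: using base salary alone understates labour savings. The sketch below assumes the 12% superannuation guarantee rate current for 2025-26; the leave and overhead multipliers and the productive-hours figure are illustrative assumptions.

```python
# Hedged sketch of an Australian loaded hourly cost calculation.
# Super rate reflects the 12% guarantee; other multipliers are assumptions.

def loaded_hourly_cost(base_salary: float, super_rate: float = 0.12,
                       leave_loading: float = 0.15, overhead: float = 0.25,
                       productive_hours: float = 1_680) -> float:
    """Fully loaded cost per productive hour, from base salary."""
    loaded_annual = base_salary * (1 + super_rate + leave_loading + overhead)
    return loaded_annual / productive_hours

cost = loaded_hourly_cost(base_salary=75_000)
print(f"Loaded cost: AUD {cost:.2f}/hour")
```

Multiply this figure by hours eliminated to produce the hard-savings line item.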
Back-office functions, while less visible to boards and investors, offer some of the highest returns. Case studies from the MIT report show $2–10M in annual savings by replacing outsourced support and document review, and 30% reduction in external agency spending for marketing and content work.
Soft Benefits (Quantify with Assumptions Stated)
- Decision quality improvement: Faster access to data → reduced decision latency → estimated revenue or risk impact
- Employee experience: Reduced turnover in roles affected by AI augmentation (use industry attrition cost benchmarks)
- Customer experience: NPS improvement → estimated retention uplift → revenue impact
- Competitive positioning: Market share maintenance or capture (hardest to model; use scenario analysis)
Organisations getting good results share common patterns: they commit 20%+ of digital budgets to AI, invest 70% of AI resources in people and processes (not just technology), implement human oversight for critical applications, and expect 2–4 year ROI timelines.
Step 4: Calculate ROI and Payback Period
With TCO and benefit quantification complete, apply the standard AI ROI formula:
AI ROI (%) = [(Total Benefits − TCO) ÷ TCO] × 100
And the payback period:
Payback Period = TCO ÷ Annual Net Benefit
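The two formulas above translate directly into code. The example inputs are placeholders; in a real model, `total_benefits` and `tco` come from the Step 2 and Step 3 workings.

```python
# Direct implementation of the ROI and payback formulas above.
# Example figures are illustrative assumptions.

def ai_roi(total_benefits: float, tco: float) -> float:
    """AI ROI (%) = [(Total Benefits - TCO) / TCO] * 100"""
    return (total_benefits - tco) / tco * 100

def payback_months(tco: float, annual_net_benefit: float) -> float:
    """Payback period, expressed in months."""
    return tco / annual_net_benefit * 12

roi = ai_roi(total_benefits=900_000, tco=600_000)
pb = payback_months(tco=600_000, annual_net_benefit=300_000)
print(f"ROI: {roi:.0f}%   Payback: {pb:.0f} months")  # ROI: 50%, 24 months
```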
Australian Payback Period Benchmarks by Path and Use Case
| Scenario | Typical Payback Period | Notes |
|---|---|---|
| Off-the-shelf SaaS AI (horizontal use case) | 6–18 months | Faster deployment; lower integration complexity |
| Off-the-shelf AI with significant customisation | 12–24 months | Integration and workaround costs extend payback |
| Custom AI (well-scoped, back-office automation) | 18–36 months | Higher upfront cost offset by long-term savings |
| Custom AI (complex, enterprise-grade system) | 24–48 months | Justified only where proprietary advantage is clear |
| Hybrid build + buy | 12–30 months | Depends on sequencing and integration quality |
The payback period measures how long it takes for accumulated benefits to equal the total initial investment. It is a time-based measure that reflects real operational traction. A system that reaches payback within 12–18 months is generally considered strong for operational automation programs; revenue-linked deployments may justify shorter or longer windows depending on deal cycles.
The Build vs Buy Financial Crossover Point
The build path typically has higher upfront costs but lower per-unit operational costs at scale. The buy path has lower upfront costs but subscription costs that compound with usage. The financial crossover — the point at which building becomes cheaper than buying on a cumulative basis — is a critical calculation for any business case.
A simplified crossover model:
- Year 1–2: Buy path almost always cheaper (lower upfront, faster deployment)
- Year 3+: Build path may become cheaper if usage is high, the use case is stable, and maintenance costs are controlled
- Exception: If the off-the-shelf tool has hard capability ceilings that require expensive workarounds, the buy path can become more expensive faster than modelled
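The crossover logic above can be checked numerically by comparing cumulative costs year by year. All dollar figures and the buy-path escalation rate below are illustrative assumptions; the point is the pattern, not the numbers.

```python
# Cumulative-cost crossover check between build and buy paths.
# Inputs are illustrative assumptions, not benchmarks.

def cumulative_costs(upfront: float, annual: float,
                     growth: float, years: int) -> list[float]:
    """Cumulative total cost at the end of each year."""
    total, out = upfront, []
    for y in range(years):
        total += annual * (1 + growth) ** y
        out.append(total)
    return out

YEARS = 5
build = cumulative_costs(upfront=450_000, annual=90_000, growth=0.0, years=YEARS)
buy = cumulative_costs(upfront=60_000, annual=150_000, growth=0.20,  # usage-driven escalation
                       years=YEARS)

# First year in which the build path is cumulatively cheaper, if any.
crossover = next((y + 1 for y in range(YEARS) if build[y] < buy[y]), None)
print(f"Build becomes cumulatively cheaper in year: {crossover}")
```

With these assumed inputs the crossover lands in year four, consistent with the "Year 3+" pattern described above; a lower usage-growth rate pushes it out or eliminates it entirely.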
(For a detailed analysis of when the build path financial case becomes compelling, see our guide on When to Build Custom AI: The Business Signals That Justify In-House Development.)
Step 5: Apply Australian-Specific Adjustments to the Model
Generic ROI models built from global benchmarks systematically misrepresent the Australian cost and benefit environment. Apply these local adjustments:
Cost adjustments:
- AI talent premium: The Australian AI skills shortage is acute. Factors cited as contributing to weaker AI adoption in Australia include a lack of digital readiness, uncertainty about use cases and return on investment, problems integrating legacy systems, and concerns about the cost of AI technology. Senior ML engineers in Sydney and Melbourne command significant salary premiums relative to global benchmarks — adjust talent cost assumptions accordingly.
- Data residency compliance: For regulated industries, offshore-hosted AI tools may require additional local infrastructure investment. This can add AUD $50,000–$200,000+ annually to the buy path TCO depending on data volume and sensitivity. (See our guide on AI Data Privacy and Sovereignty: Why Australian Regulations Change the Build vs Buy Calculus.)
- GST treatment: Software subscriptions from foreign providers attract 10% GST; confirm with your tax adviser how this affects capitalisation vs. expensing treatment.
Benefit adjustments:
- Australian labour cost benchmarks: Use ABS Labour Force data and SEEK salary benchmarks for loaded cost calculations — not US or UK figures.
- Australian market size: Revenue uplift models based on global customer bases will overstate Australian opportunity; scale down proportionally.
- Regulatory risk value: For APRA-regulated entities, healthcare organisations, and businesses subject to the Privacy Act, the compliance risk reduction value of a well-governed AI system is material and should be explicitly modelled.
Step 6: Structure the Board Presentation
Many surveyed Australian firms indicated that their adoption of AI tools to date has been relatively piecemeal and often employee-led rather than employer-led. Firms reported that returns on investment have been mixed to date and that they expect the returns will take time to be realised. A well-structured board presentation changes this dynamic by making the financial case explicit, the assumptions transparent, and the decision criteria clear.
A board-ready AI business case should contain:
- Problem statement — Financially quantified, with current-state baseline
- Solution options — Build, buy, and hybrid paths with TCO for each
- Benefit model — Hard savings and soft benefits, clearly labelled with assumptions
- ROI and payback comparison table — Across all three paths, at conservative, base, and optimistic scenarios
- Risk register — Key risks for each path (technical, vendor, regulatory, talent)
- Governance framework — How outcomes will be measured, who owns accountability, and what the exit criteria are if the initiative underperforms
- Recommended path — With the financial rationale explicitly tied to the analysis, not to technology preference
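The conservative/base/optimistic comparison named in the table item above can be generated mechanically once the base case exists. The scenario multipliers and base figures below are illustrative assumptions; calibrate them to your own downside and upside cases.

```python
# Hedged sketch of a three-scenario ROI table for board presentation.
# Base-case figures and multipliers are illustrative assumptions.

BASE_BENEFITS, TCO = 900_000, 600_000
SCENARIOS = {"conservative": 0.7, "base": 1.0, "optimistic": 1.3}

results = {}
for name, mult in SCENARIOS.items():
    benefits = BASE_BENEFITS * mult
    results[name] = (benefits - TCO) / TCO * 100
    print(f"{name:<12} benefits AUD {benefits:>9,.0f}  ROI {results[name]:+.0f}%")
```

Presenting all three paths (build, buy, hybrid) under all three scenarios gives the board a nine-cell comparison rather than a single point estimate.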
Over two-thirds (68%) of Australian business leaders believe insufficient AI skills are a key reason organisations are not gaining maximum ROI from AI. A business case that includes a clear workforce capability plan — whether through upskilling, hiring, or partnering — addresses one of the board's primary concerns before it is raised.
Key Takeaways
MIT's The GenAI Divide: State of AI in Business 2025 found that 95% of AI pilots deliver no measurable P&L impact — the primary cause is not technology failure but the absence of financially grounded business cases tied to measurable outcomes.
Alignment with business strategy is the most important factor for optimising AI ROI, yet only 10% of Australian businesses are investing in AI in a strategic and holistic manner.
The build vs buy decision should be an output of the business case, not an input — model the TCO for both paths before committing to either.
AI TCO is shaped less by model training expenses and more by the operational lifecycle that follows: maintenance, data management, integration work, and compliance obligations that accumulate over time. Understating these costs is the most common cause of business case failure.
Purchasing AI tools from specialised vendors and building partnerships succeed about 67% of the time, while internal builds succeed only one-third as often — but this does not mean buy always wins; it means the build path requires a more rigorous business case to justify its higher risk profile.
Australian-specific adjustments — talent premiums, data residency compliance costs, and local labour benchmarks — are essential for a credible model; generic global benchmarks will systematically misrepresent your actual cost and benefit position.
Conclusion
The 95% failure rate in enterprise AI is not a technology problem. It is a governance and financial discipline problem. Australian businesses that are winning with AI in 2025–2026 share one distinguishing characteristic: they built their AI programs around measurable outcomes and structured financial cases before selecting a technology path. The build vs buy decision, properly made, is the product of that analysis — not the starting point for it.
For CFOs and boards evaluating AI investment proposals, the question to ask of any business case is simple: "What specific financial outcome does this produce, by when, and how will we know if it isn't working?" If the answer is clear and quantified, the investment is structured for success. If it isn't, the project belongs back at Step 1.
For a complete picture of the build vs buy decision, explore the full series: start with Build vs Buy AI: The Definitive Guide for Australian Businesses (2025–2026), then use the Build vs Buy AI: A Decision Framework Tailored for Australian SMEs or the Australian Industry Sector Guide to apply the analysis to your specific context.
References
MIT Project NANDA. "The GenAI Divide: State of AI in Business 2025." MIT Initiative on the Digital Economy, July 2025. Reported via Fortune, Axios, Legal.io, and Trullion (August–September 2025).
SAP SE / Oxford Economics. "The SAP Value of AI Report." SAP Australia and New Zealand News Center, October 2025. https://news.sap.com/australia/2025/10/10/aussie-business-ai-investment-poised-to-deliver-29-roi-by-2028-sap-study-finds/
Reserve Bank of Australia. "Technology Investment and AI: What Are Firms Telling Us?" RBA Bulletin, November 2025. https://www.rba.gov.au/publications/bulletin/2025/nov/technology-investment-and-ai-what-are-firms-telling-us.html
Department of Industry, Science and Resources (Australia). "AI Adoption in Australian Businesses — 2025 Q1." National AI Centre AI Adoption Tracker, March 2025. https://www.industry.gov.au/news/ai-adoption-australian-businesses-2025-q1
Department of Industry, Science and Resources (Australia). "Capture the Opportunities." National AI Plan, 2025. https://www.industry.gov.au/publications/national-ai-plan/capture-opportunities
Keyhole Software. "AI Software Development Costs 2026: Enterprise Spending, TCO, and ROI Analysis." Keyhole Software Research, 2026. https://keyholesoftware.com/ai-software-development-cost-2026/
Roots.ai. "Total Cost of Ownership Is a Smarter Framework for Evaluating AI Investments in Insurance." Roots.ai Blog, September 2025. https://www.roots.ai/blog/total-cost-ownership-is-smarter-framework-for-evaluating-ai-investments-insurance
Glean. "How to Budget for the Total Cost of Ownership of AI Solutions." Glean Perspectives, December 2025. https://www.glean.com/perspectives/how-to-budget-for-the-total-cost-of-ownership-of-ai-solutions
Acropolium. "AI Agent Unit Economics: TCO, ROI, Payback." Acropolium Blog, December 2025. https://acropolium.com/blog/ai-agent-unit-economics/
S&P Global / Fullview.io. "200+ AI Statistics & Trends for 2025: The Ultimate Roundup." Fullview.io, November 2025. https://www.fullview.io/blog/ai-statistics