A lot has been written and spoken about DeepSeek since the release of its R1 model in January. Soon after, Alibaba, Mistral AI, and Ai2 released updated models of their own, and Manus AI has been touted as the next big thing.
DeepSeek’s lower-cost approach to building its model – using reinforcement learning, a mixture-of-experts architecture, multi-token prediction, group relative policy optimisation, and other innovations – has driven down the cost of LLM development. These methods are already being used by other model developers and are likely to spread further.
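Of these innovations, group relative policy optimisation (GRPO) is the easiest to illustrate. Rather than training a separate critic model to estimate advantages, GRPO samples a group of responses to the same prompt and scores each one against the group’s own mean and spread. The snippet below is a minimal sketch of that normalisation step only, assuming a reward model has already scored each response; the function name and toy numbers are illustrative, not DeepSeek’s implementation.

```python
import numpy as np

def group_relative_advantages(rewards: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Normalise the rewards of a group of responses to the same prompt.

    Each response's advantage is how far its reward sits above or below
    the group average, scaled by the group's spread. Dropping the learned
    critic/value model is one source of the cost savings.
    """
    return (rewards - rewards.mean()) / (rewards.std() + eps)

# Toy example: four sampled answers to one prompt, scored by a reward model.
rewards = np.array([0.2, 0.9, 0.4, 0.7])
print(group_relative_advantages(rewards))
# Answers scoring above the group mean get positive advantages and are
# reinforced; those below the mean are discouraged.
```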
While the cost of AI is a challenge, it’s not the biggest for most organisations. In fact, few GenAI initiatives fail solely due to cost.
The reality is that many hurdles still stand in the way of organisations’ GenAI initiatives, and these hurdles need to be addressed before even considering the business case – and the cost – of a GenAI model.
Real Barriers to GenAI
• Data. The lifeblood of any AI model is the data it’s fed. Clean, well-managed data yields great results, while dirty, incomplete data leads to poor outcomes. Even with Retrieval-Augmented Generation (RAG), the quality of the input data dictates the quality of the results (see the sketch after this list). Many organisations I work with are still discovering what data they have – let alone cleaning and classifying it. Only a handful in Australia can confidently say their data is fully managed, governed, and AI-ready. This doesn’t mean GenAI initiatives must wait for perfect data, but it does explain why Agentic AI is set to boom – focusing on single applications and defined datasets.
• Infrastructure. Not every business can or will move data to the public cloud – many still require on-premises infrastructure optimised for AI. Some companies are building their own environments, but this often adds significant complexity. To address this, system manufacturers are offering easy-to-manage, pre-built private cloud AI solutions that reduce the effort of in-house AI infrastructure development. However, adoption will take time, and some solutions will need to be scaled down in cost and capacity to be viable for smaller enterprises in Asia Pacific.
• Process Change. AI algorithms are designed to improve business outcomes – whether by increasing profitability, reducing customer churn, streamlining processes, cutting costs, or enhancing insights. However, once an algorithm is implemented, changes will be required. These can range from minor contact centre adjustments to major warehouse overhauls. Change is challenging – especially when pre-coded ERP or CRM processes need modification, which can take years. Companies like ServiceNow and SS&C Blue Prism are simplifying AI-driven process changes, but these updates still require documentation and training.
• AI Skills. While IT teams are actively upskilling in data, analytics, development, security, and governance, AI opportunities are often identified by business units outside of IT. Organisations must improve their “AI Quotient” – a core understanding of AI’s benefits, opportunities, and best applications. Broad upskilling across leadership and the wider business will accelerate AI adoption and increase the success rate of AI pilots, ensuring the right people guide investments from the start.
• AI Governance. Trust is the key to long-term AI adoption and success. Being able to use AI to do the “right things” for customers, employees, and the organisation will ultimately drive the success of GenAI initiatives. Many AI pilots fail due to user distrust – whether in the quality of the initial data or in AI-driven outcomes they perceive as unethical for certain stakeholders. For example, an AI model that pushes customers toward higher-priced products or services, regardless of their actual needs, may yield short-term financial gains but will ultimately lose to ethical competitors who prioritise customer trust and satisfaction. Some AI providers, like IBM and Microsoft, are prioritising AI ethics by offering tools and platforms that embed ethical principles into AI operations, ensuring long-term success for customers who adopt responsible AI practices.
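To make the data-quality point from the first barrier concrete, below is a deliberately minimal sketch of the retrieval step that sits in front of a RAG pipeline. The document store, the keyword-overlap scoring, and the function names are all hypothetical; production systems use embeddings and vector search, but the dependency is the same – whatever the retriever surfaces, clean or dirty, becomes the context the model answers from.

```python
# Illustrative retrieval step for a RAG pipeline (hypothetical store and scoring).
# Real systems use vector embeddings, but the dependency is identical:
# the generator can only be as good as the records it is handed.

def score(query: str, doc: str) -> int:
    """Count overlapping words between the query and a document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k highest-scoring documents for the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

# An unmanaged store: a superseded policy, the current policy, and a junk draft.
document_store = [
    "Refund policy (2021, superseded): refunds within 7 days.",
    "Refund policy: refunds within 30 days of purchase.",
    "refund policy refunds refunds draft DO NOT USE",
]

print(retrieve("refund policy", document_store))
# The superseded policy and the junk draft outrank the current one here,
# so they become the 'facts' the model reasons over - garbage in, garbage out.
```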
GenAI and Agentic AI initiatives are far from becoming standard business practice. Given the current economic and political uncertainty, many organisations will limit unbudgeted spending until markets stabilise. However, technology and business leaders should proactively address the key barriers slowing AI adoption within their organisations. As more AI platforms adopt the innovations that helped DeepSeek reduce model development costs, the economic hurdles to GenAI will become easier to overcome.

Barely weeks into 2025, the Consumer Electronics Show (CES) showcased a wave of AI-powered innovations – from Nvidia’s latest RTX 50-series graphics chips with AI-powered rendering to Halliday’s futuristic augmented reality smart glasses. AI has firmly moved from fringe technology to the foundation of industry transformation. According to MIT, 95% of businesses are already using AI in some capacity, and more than half are aiming for full-scale integration by 2026.
But as AI adoption increases, the real challenge isn’t just about developing smarter models – it’s about whether the underlying infrastructure can keep up.
The AI-Driven Cloud: Strategic Growth
Cloud providers are at the heart of the AI revolution, but in 2025, it is not just about raw computing power anymore. It’s about smarter, more strategic expansion.
Microsoft is expanding its AI infrastructure footprint beyond traditional tech hubs, investing USD 300M in South Africa to build AI-ready data centres in an emerging market. Similarly, AWS is doubling down on another emerging market with an investment of USD 8B to develop next-generation cloud infrastructure in Maharashtra, India.
This focus on AI is not limited to the top hyperscalers; Oracle, for instance, is seeing rapid cloud growth, with 15% revenue growth expected in 2026 and 20% in 2027. This growth is driven by deep AI integration and investments in semiconductor technology. Oracle is also a key player in OpenAI and SoftBank’s Stargate AI initiative, showcasing its commitment to AI innovation.
Emerging players and disruptors are also making their mark. For instance, CoreWeave, a former crypto mining company, has pivoted to AI cloud services. It recently secured a USD 12B contract with OpenAI to provide computing power for training and running AI models over the next five years.
The signs are clear – the demand for AI is reshaping the cloud industry faster than anyone expected.
Strategic Investments in Data Centres Powering Growth
Enterprises are increasingly investing in AI-optimised data centres, driven by the need to reduce reliance on traditional data centres, lower latency, achieve cost savings, and gain better control over data.
Reliance Industries is set to build the world’s largest AI data centre in Jamnagar, India, with a 3-gigawatt capacity. This ambitious project aims to accelerate AI adoption by reducing inferencing costs and enabling large-scale AI workloads through its ‘Jio Brain’ platform. Similarly, in the US, a group of banks has committed USD 2B to fund a 100-acre AI data centre in Utah, underscoring the financial sector’s confidence in AI’s future and the increasing demand for high-performance computing infrastructure.
These large-scale investments are part of a broader trend – AI is becoming a key driver of economic and industrial transformation. As AI adoption accelerates, the need for advanced data centres capable of handling vast computational workloads is growing. The enterprise sector’s support for AI infrastructure highlights AI’s pivotal role in shaping digital economies and driving long-term growth.
AI Hardware Reimagined: Beyond the GPU
While cloud providers are racing to scale up, semiconductor companies are rethinking AI hardware from the ground up – and they are adapting fast.
Nvidia is no longer just focused on cloud GPUs – it is now working directly with enterprises to deploy H200-powered private AI clusters. AMD’s MI300X chips are being integrated into financial services for high-frequency trading and fraud detection, offering a more energy-efficient alternative to traditional AI hardware.
Another major trend is chiplet architectures, in which processors are assembled from multiple smaller dies (chiplets) rather than built as single, monolithic, power-hungry chips. Meta’s latest AI accelerator and Google’s custom TPU designs are early adopters of this modular approach, making AI computing more scalable and cost-effective.
The AI hardware race is no longer just about bigger chips – it’s about smarter, more efficient designs that optimise performance while keeping energy costs in check.
Collaborative AI: Sharing the Infrastructure Burden
As AI infrastructure investments increase, so do costs. Training and deploying LLMs requires billions in high-performance chips, cloud storage, and data centres. To manage these costs, companies are increasingly teaming up to share infrastructure and expertise.
SoftBank and OpenAI formed a joint venture in Japan to accelerate AI adoption across enterprises. Meanwhile, Telstra and Accenture are partnering on a global scale to pool their AI infrastructure resources, ensuring businesses have access to scalable AI solutions.
In financial services, Palantir and TWG Global have joined forces to deploy AI models for risk assessment, fraud detection, and customer automation – leveraging shared infrastructure to reduce costs and increase efficiency.
And with tech giants spending over USD 315 billion on AI infrastructure this year alone – plus OpenAI’s USD 500 billion commitment – the need for collaboration will only grow.
These joint ventures are more than just cost-sharing arrangements; they are strategic plays to accelerate AI adoption while managing the massive infrastructure bill.
The AI Infrastructure Power Shift
The AI infrastructure race in 2025 isn’t just about bigger investments or faster chips – it’s about reshaping the tech landscape. Leaders aren’t just building AI infrastructure; they’re determining who controls AI’s future. Cloud providers are shaping where and how AI is deployed, while semiconductor companies focus on energy efficiency and sustainability. Joint ventures highlight that AI is too big for any single player.
But rapid growth comes with challenges: Will smaller enterprises be locked out? Can regulations keep pace? As investments concentrate among a few, how will competition and innovation evolve?
One thing is clear: Those who control AI infrastructure today will shape tomorrow’s AI-driven economy.

As AI evolves, the supporting infrastructure has become a crucial consideration for organisations and technology companies alike. AI demands massive processing power and efficient data handling, making high-performance computing clusters and advanced data management systems essential. Scalability, efficiency, security, and reliability are key to ensuring AI systems handle increasing demands and sensitive data responsibly.
Data centres must evolve to meet the increasing demands of AI and growing data requirements.
Equinix recently hosted technology analysts at their offices and data centre facilities in Singapore and Sydney to showcase how they are evolving to maintain their leadership in the colocation and interconnection space.
Equinix is expanding in Latin America, Africa, the Middle East, and Asia Pacific. In Asia Pacific, they recently opened data centres in Kuala Lumpur and Johor Bahru, with capacity additions in Mumbai, Sydney, Melbourne, Tokyo, and Seoul. Plans for the next 12 months include expanding in existing cities and entering new ones, such as Chennai and Jakarta.
Ecosystm analysts comment on Equinix’s growth potential and opportunities in Asia Pacific.
Small Details, Big Impact
TIM SHEEDY. The tour of the new Equinix data centre in Sydney revealed the complexity of modern facilities. For instance, the liquid cooling system, essential for new Nvidia chipsets, includes backup cold water tanks for redundancy. Every system and process is designed with built-in redundancy.
As power needs grow, so do operational and capital costs. The diesel generators at the data centre, comparable to a small power plant, are supported by multiple fuel suppliers from several regions in Sydney to ensure reliability during disasters.
Security is critical, with some areas enclosed by concrete walls running from floor to ceiling, and access restricted even for Equinix staff.
By focusing on these details, Equinix enables customers to quickly set up and manage their environments through a self-service portal, delivering a cloud-like experience for on-premises solutions.
Equinix’s Commitment to the Environment
ACHIM GRANZEN. Compute-intensive AI applications challenge data centres’ “100% green energy” pledges, prompting providers to seek additional green measures. Equinix addresses this through sustainable design and green energy investments, including liquid cooling and improved traditional cooling. In Singapore, one of Equinix’s top 3 hubs, the company partnered with the government and Sembcorp to procure solar power from panels on public buildings. This improves Equinix’s power mix and supports Singapore’s renewable energy sector.
TIM SHEEDY. Building and operating data centres sustainably is challenging. While the basics – real estate, cooling, and communications – remain, adding proximity to clients, affordability, and 100% renewable energy complicates matters. In Australia, which relies on a mixed-energy grid, Equinix has secured 151 MW of renewable energy from Victoria’s Golden Plains Wind Farm, aiming for 100% renewable energy by 2029.
Equinix leads with A1A-rated data centres that operate in warmer conditions, reducing cooling needs and boosting energy efficiency. Focusing on efficient buildings, sustainable water management, and a circular economy, Equinix aims for climate neutrality by 2030, demonstrating strong environmental responsibility.
Equinix’s Private AI Value Proposition
ACHIM GRANZEN. Most AI efforts, especially GenAI, have occurred in the public cloud, but there’s rising demand for Private AI due to concerns about data availability, privacy, governance, cost, and location. Technology providers that can offer alternative AI stacks (usually built on top of a GPU-as-a-service model) to the hyperscalers are attracting strong interest. Equinix, in partnership with providers such as Nvidia, offers Private AI solutions on global, turnkey AI infrastructure. These solutions are ideal for industries with large-scale operations and connectivity challenges, such as Manufacturing, or for those slow to adopt public cloud.
SASH MUKHERJEE. Equinix’s Private AI value proposition will appeal to many organisations, especially as discussions on AI cost efficiency and ROI evolve. AI unites IT and business teams, and Equinix understands the need for conversations at multiple levels. Infrastructure leaders focus on data strategy and capacity planning; CISOs on networking and security; business lines on application performance; and the C-suite on revenue, risk, and cost considerations. Each has a stake in the AI strategy. For success, Equinix must reshape its go-to-market message to be industry-specific (that is how AI conversations are shaping up) and reskill its salesforce for broader conversations beyond infrastructure.
Equinix’s Growth Potential
ACHIM GRANZEN. In Southeast Asia, Malaysia and Indonesia provide growth opportunities for Equinix. Indonesia holds massive potential as a digitally savvy G20 country. In Malaysia, Equinix established a presence before the hyperscalers, and its data centres can play a vital part in the ongoing MyDIGITAL initiative. The proximity of the Johor Bahru data centre to Singapore also opens additional business opportunities.
TIM SHEEDY. Equinix is evolving beyond being just a data centre real estate provider. By developing their own platforms and services, along with partner-provided solutions, they enable customers to optimise application placement, manage smaller points of presence, enhance cloud interconnectivity, move data closer to hyperscalers for backup and performance, and provide multi-cloud networking. Composable services – such as cloud routers, load balancers, internet access, bare metal, virtual machines, and virtual routing and forwarding – allow seamless integration with partner solutions.
Equinix’s focus over the last 12 months on automating and simplifying its data centre management and interconnection services is certainly paying dividends, and revenue is expected to grow faster than the overall tech market.