Managing AI Costs: The Role of GenAI and Hybrid Cloud

As artificial intelligence (AI) becomes a key part of business transformation, companies are facing an unexpected challenge: rapidly increasing computing costs. According to a recent report by IBM's Institute for Business Value (IBV), titled “The CEO’s Guide to Generative AI: Cost of Compute,” these costs are projected to rise by 89% between 2023 and 2025. With 70% of executives identifying AI as the main reason for these rising expenses, companies are feeling the pressure. In fact, many businesses have already had to cancel or delay AI projects due to financial concerns, which threatens to slow down innovation.

Even the largest AI companies feel these financial pressures. OpenAI, for example, was reportedly generating USD 300 million per month in revenue by August 2024. Despite that impressive growth, the company still faces substantial operating costs; to keep pace with them and meet its ambitious goals, it raised USD 6.6 billion in October 2024 at a valuation of USD 157 billion.

The economics of AI are becoming a decisive factor in how businesses approach adoption. However technically impressive, AI systems can be very costly to run. Jacob Dencik, Research Director at IBV, points out that AI projects may fail if the business case cannot support these high costs: even when AI delivers strong solutions, companies may hesitate to invest fully if it is too expensive. Adnan Masood, Chief AI Architect at UST, highlights a further risk, warning that companies must weigh whether they can afford to keep pushing AI innovation or risk falling behind in the highly competitive "AI arms race."

One way companies are managing these rising costs is through hybrid cloud architectures. A hybrid cloud combines private and public cloud environments, allowing businesses to run AI operations where they are most cost-efficient. By adopting hybrid cloud platforms, companies gain visibility into how data, workloads, and applications are running and where the associated compute costs originate, which helps them place each workload in the most cost-effective environment. Dencik believes that a hybrid cloud approach can help businesses scale their use of AI more affordably.
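To make the idea concrete, here is a minimal sketch of the kind of cost-aware placement logic a hybrid cloud platform enables. The environments, GPU rates, and data-residency constraints are hypothetical placeholders, not figures from the IBM report.

```python
# Minimal sketch of cost-aware workload placement across a hybrid cloud.
# The environments, rates, and constraints below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Environment:
    name: str
    gpu_hourly_cost: float   # USD per GPU-hour (illustrative)
    data_resident: bool      # True if sensitive data may stay here

@dataclass
class Workload:
    name: str
    gpu_hours: float
    needs_data_residency: bool

def place(workload: Workload, environments: list[Environment]) -> Environment:
    """Pick the cheapest environment that satisfies the workload's constraints."""
    eligible = [
        env for env in environments
        if env.data_resident or not workload.needs_data_residency
    ]
    if not eligible:
        raise ValueError(f"No environment satisfies constraints for {workload.name}")
    return min(eligible, key=lambda env: env.gpu_hourly_cost * workload.gpu_hours)

if __name__ == "__main__":
    envs = [
        Environment("on-prem-private", gpu_hourly_cost=2.10, data_resident=True),
        Environment("public-cloud-a", gpu_hourly_cost=1.45, data_resident=False),
        Environment("public-cloud-b", gpu_hourly_cost=1.80, data_resident=False),
    ]
    jobs = [
        Workload("fine-tune-customer-model", gpu_hours=400, needs_data_residency=True),
        Workload("batch-inference", gpu_hours=120, needs_data_residency=False),
    ]
    for job in jobs:
        env = place(job, envs)
        print(f"{job.name}: run on {env.name} "
              f"(~USD {env.gpu_hourly_cost * job.gpu_hours:,.0f})")
```

In practice such decisions also factor in latency, egress fees, and compliance, but the core idea is the same: route each workload to the cheapest environment that meets its constraints.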

However, managing AI costs goes beyond simply cutting expenses. Companies can also make their AI systems more efficient, for example, by improving their coding practices. Dencik notes that using more energy-efficient programming languages and coding techniques can reduce the energy consumption of AI applications by up to 50%. Interestingly, Generative AI (GenAI) can also contribute to this process. It can be used to optimize data center layouts and design more efficient servers, helping to cut down energy usage and improve the overall efficiency of AI systems.
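The coding-level savings Dencik describes can be as simple as replacing interpreter-heavy loops with vectorized operations. The sketch below contrasts the two approaches; it is only an illustration of the general idea, and the actual savings in any given system will vary.

```python
# Illustration of coding-level efficiency: the same computation written as a
# plain Python loop and as a vectorized NumPy call. Vectorized code does far
# less interpreter work per element, which translates into less CPU time and
# energy for the same result. (Illustrative only; real savings vary.)

import time
import numpy as np

def normalize_loop(values):
    total = 0.0
    for v in values:          # per-element interpreter overhead
        total += v * v
    norm = total ** 0.5
    return [v / norm for v in values]

def normalize_vectorized(values):
    arr = np.asarray(values, dtype=np.float64)
    return arr / np.linalg.norm(arr)   # single optimized native call

if __name__ == "__main__":
    data = list(range(1, 1_000_001))

    start = time.perf_counter()
    normalize_loop(data)
    loop_s = time.perf_counter() - start

    start = time.perf_counter()
    normalize_vectorized(data)
    vec_s = time.perf_counter() - start

    print(f"loop: {loop_s:.3f}s, vectorized: {vec_s:.3f}s")
```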

Masood suggests additional ways to manage AI costs, such as large language model (LLM) routing. This method directs AI tasks to the most appropriate models based on factors like complexity and performance, ensuring that resources are used efficiently. Another technique is to reduce the size of LLMs, using processes like quantization, which decreases the memory needed to run these models. This not only lowers the cost of running AI systems but also makes them faster and more affordable to deploy.
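As a rough illustration of these two techniques, the sketch below pairs a heuristic LLM router, which sends only complex prompts to a large model, with a back-of-the-envelope estimate of how quantization shrinks a model's weight footprint. The complexity heuristic, model tiers, and parameter counts are hypothetical examples, not Masood's specific methods.

```python
# Minimal sketch, under simplifying assumptions: (1) a heuristic LLM router
# that sends short, simple prompts to a small model and harder ones to a
# large model, and (2) an approximate calculation of how quantization reduces
# a model's memory footprint. Model tiers and sizes are placeholders.

def route_prompt(prompt: str) -> str:
    """Route a prompt to a model tier based on a crude complexity heuristic."""
    hard_markers = ("explain", "analyze", "prove", "step by step", "compare")
    is_complex = len(prompt.split()) > 80 or any(m in prompt.lower() for m in hard_markers)
    return "large-model" if is_complex else "small-model"

def weight_memory_gb(n_params_billion: float, bits_per_param: int) -> float:
    """Approximate memory for the weights alone (ignores activations, KV cache)."""
    return n_params_billion * 1e9 * (bits_per_param / 8) / 1e9

if __name__ == "__main__":
    prompts = [
        "What is our refund policy?",
        "Analyze this quarter's cost report and explain the main drivers step by step.",
    ]
    for p in prompts:
        print(f"{route_prompt(p)} <- {p[:50]}")

    # A 70B-parameter model at different precisions (weights only, approximate):
    for bits in (16, 8, 4):
        print(f"70B params @ {bits}-bit: ~{weight_memory_gb(70, bits):.0f} GB")
```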

The growing complexity of AI models is also driving up costs. Dencik advises that companies don’t always need to use large, complex models. Smaller, well-trained models can achieve the same, or even better, results depending on the task. He advocates for a multi-model, multimodal strategy where different AI models are used for different tasks. This flexible approach allows companies to be more cost-effective while still taking advantage of AI’s capabilities.
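One simple way to express such a multi-model strategy is as configuration that maps each task to the smallest model that handles it well. The task and model names in the sketch below are hypothetical placeholders.

```python
# Minimal sketch of a multi-model, multimodal setup expressed as configuration:
# each task type is mapped to the smallest model that handles it well, rather
# than sending everything to one large general-purpose model.

MODEL_REGISTRY = {
    "classify-support-ticket": "small-text-model",      # simple, high-volume task
    "summarize-contract":      "medium-text-model",     # longer context needed
    "extract-text-from-scan":  "compact-vision-model",  # image input, narrow task
    "draft-technical-report":  "large-text-model",      # reserve the big model for hard tasks
}

def model_for(task: str) -> str:
    """Look up the designated model for a task, defaulting to the large model."""
    return MODEL_REGISTRY.get(task, "large-text-model")

if __name__ == "__main__":
    for task in ("classify-support-ticket", "extract-text-from-scan", "unlisted-task"):
        print(f"{task} -> {model_for(task)}")
```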

Another concern related to AI costs is its environmental impact. Running AI systems, especially those hosted on cloud platforms, requires significant energy. As a result, companies are becoming more aware of AI's environmental footprint. Dencik mentions emerging practices like "green ops", which focus on reducing the environmental impact of cloud usage. This means optimizing AI systems not just for economic savings but also for sustainability.

In conclusion, rising computing costs are becoming a significant challenge for businesses adopting AI. Companies that can successfully manage these expenses will be better positioned to stay competitive. Those who cannot may fall behind. The key to success in AI might not only be in using the technology but also in managing the costs associated with it. Strategies like adopting hybrid cloud platforms, improving coding efficiency, and optimizing AI models will be essential for companies aiming to stay ahead in the AI race.