Tuesday, April 28, 2026

The Cost of Doing AI Business - PYMNTS.com

While the potential of generative artificial intelligence (AI) may seem limitless, the computing power it requires could present limitations. One estimate places the cost of a ChatGPT query at 1,000 times that of the same question posed to a standard Google search. In the initial development stages, as companies such as OpenAI seek to generate public interest, that may be an acceptable cost, even after adding 100 million active users in a single month.

However, that kind of expense could easily become unsustainable for a more general-use product. Even the White House has weighed in on the question, noting the potential environmental impact of the increased energy consumption and data center space required for extended generative AI applications.

Addressing the Underlying Expense

Before dealing with the cost of running large language models (LLMs), most companies interested in developing their own generative AI solutions will come up against the cost of training them. Training generative AI requires owning or renting time on specialized hardware, significant data storage and intensive energy consumption. The cost of simply training OpenAI’s GPT-3 — the version before the one employed in ChatGPT — was more than $5 million.

However, there has been some progress in lowering the bar for entry into generative AI. One solution developed at the Massachusetts Institute of Technology (MIT) claims to reduce the cost of training an LLM by 50%. In addition, the more efficient training...



Read Full Story: https://news.google.com/rss/articles/CBMiV2h0dHBzOi8vd3d3LnB5bW50cy5jb20vbmV3...