The article examines the escalating costs of developing advanced artificial intelligence systems. Estimates suggest that training a next-generation model could cost billions of dollars, with projections reaching $10 billion for future generations. This escalation is driven by the growing computational power required, with hardware and electricity costs doubling roughly every nine months.
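The nine-month doubling figure implies exponential growth, which compounds quickly. A minimal sketch of what that trend means in practice, assuming a hypothetical $100 million starting cost and horizon chosen purely for illustration (neither figure comes from the study):

```python
def projected_cost(start_cost: float, months: int, doubling_months: float = 9.0) -> float:
    """Cost after `months`, assuming it doubles every `doubling_months` months
    (the growth rate cited in the article)."""
    return start_cost * 2 ** (months / doubling_months)

# Hypothetical example: a $100M training run, projected 54 months out.
# 54 / 9 = 6 doublings, so the cost grows 64-fold.
cost = projected_cost(100e6, 54)
print(f"${cost / 1e9:.1f}B")  # -> $6.4B
```

Under this assumption, a cost in the hundreds of millions today reaches the multi-billion-dollar range within about five years, consistent with the article's end-of-decade projections.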
A study by researchers from Stanford University and Epoch AI found that the cost of the computational power used to train AI models is a major driver of total development costs. This expense stems primarily from the specialized semiconductor chips required and their rapid depreciation. The study indicates that these hardware costs alone could reach billions of dollars by the end of the decade.
Beyond hardware, the article emphasizes the significant role of employee compensation. The study analyzed four AI models (GPT-3, GPT-4, OPT-175B, and Gemini Ultra 1.0), finding that labor costs accounted for 29% to 49% of total development expenses. While hardware costs are increasing rapidly, labor costs are expected to decrease as a proportion of total costs if computational power use continues to grow.
The article concludes by discussing the implications of these escalating costs. The high barriers to entry mean that only a few well-funded organizations will be able to compete: primarily tech giants such as Google and Microsoft, along with the smaller companies they back. This concentration of power raises concerns about responsible AI development and deployment, prompting calls for both developers and policymakers to address these challenges.