The post argues that the cost of operating Large Language Models (LLMs) has decreased significantly, contradicting a common belief that they are expensive to run.
AI inference was initially expensive, but its cost has dropped sharply and is now often lower than that of a traditional search query.
LLM prices vary widely, but even low-end models undercut traditional search APIs while still offering substantial capability.
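The per-query comparison can be sketched with back-of-envelope arithmetic. All dollar figures below are illustrative assumptions chosen to mirror the shape of the argument, not numbers quoted from the post:

```python
# Back-of-envelope cost comparison: one LLM call vs one search-API call.
# All prices are illustrative assumptions, not figures from the post.

LLM_INPUT_PRICE = 0.15    # assumed $ per 1M input tokens (low-end model tier)
LLM_OUTPUT_PRICE = 0.60   # assumed $ per 1M output tokens
SEARCH_API_PRICE = 0.005  # assumed $ per search query (~$5 per 1,000 calls)

def llm_query_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of a single LLM call at the assumed token prices."""
    return (input_tokens * LLM_INPUT_PRICE
            + output_tokens * LLM_OUTPUT_PRICE) / 1_000_000

# A typical prompt + answer: 500 tokens in, 300 tokens out.
cost = llm_query_cost(500, 300)
print(f"LLM call:   ${cost:.6f}")
print(f"Search API: ${SEARCH_API_PRICE:.6f}")
print(f"Ratio: search query costs ~{SEARCH_API_PRICE / cost:.0f}x more")
```

Under these assumed prices a routine LLM call comes out to a small fraction of a cent, well below the assumed per-query rate of a search API, which is the comparison the post leans on.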
The article addresses objections about LLM cost structures, arguing that providers are not heavily subsidizing these costs but are operating on viable margins.
The author predicts that as LLM costs fall, demand for AI applications will rise, countering the belief that pricing is an unsustainable race to the bottom.
There is growing concern about the ecological and societal impacts of LLMs, which raises the question of how broadly 'cost' should be defined.