I always hear the AI companies clamoring for gigawatts of “compute” so they can finally “grow” to meet the immense “demand”. But somehow people can just spin up clawbot and burn through millions of tokens just fine; I never hear about anyone being denied access to LLM usage. The same goes for businesses: they’re being sold AI crap left and right, and there’s never a bottleneck or a queue. In fact, there seems to be plenty of “compute” to go around, far more than needed, really.
Has this ever been pointed out to the AI CEOs? Has it been discussed or explained?


It’s a fairly common theory that demand will never meet “expectations”. I think the most famous proponent is Michael Burry, The Big Short guy.
(My rant: The “future demand” is part of the grift. It’s a boondoggle. These grifters are going to grab as much cash as possible while saying whatever keeps the gravy train rolling. When it all crashes, they’ll be fine, maybe even bailed out, because this is legal and normal under capitalism.)
Aren’t those effectively the same thing? If the hardware can provide a return, then it’s not a bubble.
That was a surprisingly interesting Wikipedia article.