I love how level-headed this article is. The dust will settle and the sooner it does the better.
"Look, it's just arithmetic" is a pretty good way to get my vote. I did not realize that a multi-step AI query has to be re-run from the beginning on every turn. I should have, since the algorithms are stateless. But that whole quadratic-expense thing is a real pisser, particularly when what they're all doing is building chatbots. A chatbot that doesn't remember what it just said is barely better than ELIZA.
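Since it really is just arithmetic, here's a back-of-the-envelope sketch (the per-message token count is a made-up number, not from any real API) showing why resending the full history every turn blows up quadratically:

```python
def total_tokens(turns: int, tokens_per_message: int) -> int:
    """Total tokens processed when each turn resends every prior
    message plus the new one (the stateless-chat pattern)."""
    total = 0
    for turn in range(1, turns + 1):
        # Turn N carries N messages: all the history plus the new one.
        total += turn * tokens_per_message
    return total

# 10 turns at a hypothetical 500 tokens/message:
print(total_tokens(10, 500))   # 27,500 tokens
# 10x the turns costs roughly 92x the tokens -- that's the quadratic part:
print(total_tokens(100, 500))  # 2,525,000 tokens
```

And that's just the input side; attention inside the model is quadratic in context length on top of it.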
There are some optimizations that do better than strictly quadratic. For example, when you start to hit your context window, you can add an in-between step that compresses the tokens before proceeding. And my agentic coding tool of choice doesn't send the entire codebase every time; instead it picks which files to send along with each question. But in practice those optimizations just mean you can ask a few more questions before hitting the Wall of Stupidity, where every AI model gets stuck in some kind of loop or thought pattern or solution. They won't make the law go away.
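That compression step can be sketched in a few lines. This is a toy illustration, not any real tool's implementation: the token counter, window size, and summary stub are all stand-ins.

```python
CONTEXT_WINDOW = 1000   # hypothetical token budget
KEEP_RECENT = 3         # keep the last few messages verbatim

def token_count(msg: str) -> int:
    # Crude stand-in for a real tokenizer: one token per word.
    return len(msg.split())

def compress(history: list[str]) -> list[str]:
    """If the history fits, send it as-is; otherwise collapse the
    older messages into a short summary and keep only recent ones."""
    if sum(map(token_count, history)) <= CONTEXT_WINDOW:
        return history
    old, recent = history[:-KEEP_RECENT], history[-KEEP_RECENT:]
    # A real tool would ask the model to summarize `old`; we fake it.
    summary = f"[summary of {len(old)} earlier messages]"
    return [summary] + recent
```

Note what the summary stub gives away: once compression kicks in, detail from the early conversation is gone, which is exactly where the loops and repeated solutions start.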