Filing this under “mental models for understanding how to utilize LLMs”.
For all the wrong reasons.
Hey! Dan Shipper here. Registration is open for my new course, Maximize Your Mind With ChatGPT.
Fuckin' what the world needs now is EST with buzzwords. Landmark with tech trends. 10/10. No notes.
Time isn’t as linear as you think. It has ripples and folds like smooth silk. It doubles back on itself, and if you know where to look, you can catch the future shimmering in the present.
Not according to thermodynamics, but do go off.
Last week I wrote about how ChatGPT changed my conception of intelligence and the way I see the world. I’ve started to see ChatGPT as a summarizer of human knowledge, and once I made that connection, I started to see summarizing everywhere: in the code I write (summaries of what’s on StackOverflow), and the emails I send (summaries of meetings I had), and the articles I write (summaries of books I read).
Great. We can agree on that. LLMs take a corpus of knowledge, navigate it ad nauseam, build a black-box association lookup table, and then vomit out datapoints that have been stochastically randomized from regular to extra-spicy. If you want the mean, median, and mode of a million color swatches, LLMs will give you a paint chip. Or, run it a dozen times, get a dozen similar paint chips.
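The paint-chip point is easy to demo. A minimal sketch, with everything here (the swatch data, the noise model, the "temperature" knob) being my own illustrative stand-ins, not anything from an actual LLM: average a pile of color swatches and you get one chip; rerun with sampling noise and you get a dozen similar-but-not-identical chips.

```python
# Toy version of the "paint chip" analogy above. Nothing here is real
# LLM machinery; the Gaussian noise stands in for sampling temperature.
import random
import statistics

random.seed(0)  # deterministic for the demo

# A million is overkill for a demo; 10,000 random RGB swatches will do.
swatches = [tuple(random.randint(0, 255) for _ in range(3))
            for _ in range(10_000)]

def central_chip(swatches):
    """The one 'mean' paint chip: per-channel average over every swatch."""
    return tuple(round(statistics.mean(s[c] for s in swatches))
                 for c in range(3))

def sample_chip(swatches, temperature=10.0):
    """One stochastic 'run': the mean chip plus temperature-scaled noise."""
    base = central_chip(swatches)
    return tuple(max(0, min(255, round(ch + random.gauss(0, temperature))))
                 for ch in base)

mean_chip = central_chip(swatches)          # one chip, every time
dozen = [sample_chip(swatches) for _ in range(12)]  # twelve similar chips
```

Every chip in `dozen` hovers near `mean_chip`; crank `temperature` up and you move from "regular" to "extra-spicy," but it's still noise around the same average.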
Summarizing used to be a skill I needed to have, and a valuable one at that. But before it had been mostly invisible, bundled into an amorphous set of tasks that I’d called “intelligence”—things that only I and other humans could do.
Well there's your first mistake. Summarizing is fucking math. What you're talking about is insight and insight is what wins, not "summarizing."
But now that I can use ChatGPT for summarizing, I’ve carved that task out of my skill set and handed it over to AI.
Yup. If you want the mean, median and mode of a large corpus of data without any insights into the data what-so-fucking-ever, ChatGPT is the tool for you. So yeah - if you are basically eliminating insight as a valuable commodity, ChatGPT is fucking amazeballs.
Now, my intelligence has learned to be the thing that directs or edits summarizing, rather than doing the summarizing myself.
Great! You've now put yourself in the position of guessing whether the machine is correct or not without any way to get the machine to show its work.
As Every’s Evan Armstrong argued several months ago, “AI is an abstraction layer over lower-level thinking.” That lower-level thinking is, largely, summarizing.
No, it's fucking PATTERN RECOGNITION.
If I’m using ChatGPT in this way today, there’s a good chance this behavior—handing off summarizing to AI—is going to become widespread in the future. That could have a significant impact on the economy.
Let's be clear - those of us with insight both long for and dread a world where all you chumps have deprecated insight. Long for it because we're going to wipe the floor with you. Dread because the world will be a fucking dumpster fire.
But what happens when that very skill—knowing and utilizing the right knowledge at the right time—becomes something that computers can do faster and sometimes just as well as we can?
I'm sorry but when did we go from "summarizing" to "knowing and utilizing" as if they were the same thing? Because they're not even vaguely the same thing. Here's a whole post about ChatGPT not knowing SHIT.
It means a transition from a knowledge economy to an allocation economy. You won’t be judged on how much you know, but instead on how well you can allocate and manage the resources to get work done.
Does Dan have any employees? 'cuz I judge my employees on whether or not they can do the tasks I've hired them for.
There’s already a class of people who are engaged in this kind of work every day: managers.
That is not what managers do. Managers coordinate people.
They need to know things like how to evaluate talent, manage without micromanaging, and estimate how long a project will take.
I don't think this guy has ever met a manager.
Individual contributors—the people in the rest of the economy, who do the actual work—don't need that skill today.
Or, for that matter, an "individual contributor."
But in this new economy, the allocation economy, they will. Even junior employees will be expected to use AI, which will force them into the role of manager—model manager.
Fucking lol. Every employee I have is "expected to use" Visual Basic, HTML5, VoIP, SSL, and Java. Do they actually know that stuff inside and out? No. And that's fine. What they do is "their jobs," and they do them well. Why the fuck would ChatGPT be any goddamn different?
(500 words of big-think bullshit omitted)