godfuckingdammit

You know how sometimes you read something so stupid, something so many stupid chinstrokers just sit and stroke their chins over while everyone ignores the fact that we're all stroking our chins over something so fucking stupid, that it fucking wakes you up at night?

So my knee-jerk reaction was that this dumb-as-a-sack-of-hair "imagine how calculators changed math class" canard that Mister Chicken just gently waved his hands over must be a misquote, because fuckin' hell, had Sam Altman actually said that, surely someone would have eaten him alive for it. So what did he actually say? Here it is at 15:14:

"What do you tell educators what are misconceptions of what you're working on how can you kind of allay their concerns?"

"We adapted to calculators and changed what we test for in math classes, I IMAGINE"

what do you tell people concerned about plagiarism, allow me to phrase this in the most softball possible way

I tell them to suck it

____________________________________________________________

Let's start from a "changed what we test for" perspective, because this mfer right here grew up with a Casio Melody 80 as the only interesting thing to play with as a small child, barreled right the fuck into trig tables, and was probably in the last class in the school district to be instructed in the use of a slide rule. Yer goddamn right. One year we were being taught how to use a slide rule, the next year we were being encouraged to buy TI-81s. This, of course, was an easy fifteen years after every fucking parent you knew had bought an HP-35, because it was a nuclear weapons lab after all.

So like... we were being taught trig tables two entire goddamn decades after the ability to go home and mash the sin button solved that shit. Because you know what? Trig tables are what the problems were written around, and as soon as you could get a TI-81 for, I think, $90? they started dropping the chapter on how to read trig tables. Eventually. Took years. They were still in the back of the book ten years later.

Thing is, tho, your approach to math does not change appreciably whether your answers come from a calculator, a slide rule, a bunch of trig tables or brute-force calculation. What changes is your source of error and your methodology for computation. THE ANSWER IS THE FUCKING ANSWER.

Gather 'round, children, while I share the tale of the floating point bug. You see, long after the bleeding edge was on their fifth, sixth or seventh computer, but before hippies started making kitsch out of AOL CDs, the world was shocked - SHOCKED! - to discover that one in nine billion computations might, might! fuck up in the fifth decimal place. This of course cost Intel half a billion dollars, because computers aren't supposed to fuck up.

Those were the halcyon days when Team Eternal September were juniors in college, though. When popular conception of computers had gone from Tron to Heartbeeps to Hackers. When having some knowledge about computers was cool rather than a reason to accuse people online of being Russian hackers.

Fuckin' ChatGPT sucks balls at arithmetic. Look at this mealy-mouthed legalspeak: the answer is "approximately" to the eighth decimal place because OpenAI knows their shit sucks ass at math, and if they just wing out to a goddamn calculator every time they might miss a chance to give you the answer in the form of a dragon or some shit.
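For the record, the bug in question was the 1994 Pentium FDIV flaw, and the sanity check that made the rounds back then is short enough to run yourself. Here's a minimal sketch in Python, assuming any modern, non-flawed FPU; the operands are the widely circulated public test case, not numbers taken from anything above. A correct divide gives roughly 1.333820449, while the flawed Pentium famously returned roughly 1.333739068.

```python
# Classic Pentium FDIV sanity check (the widely circulated 1994 test case).
# On a correct FPU the quotient is ~1.333820449136241 and the residual below
# is essentially zero; the flawed Pentium returned ~1.333739068, which pushed
# the residual up to roughly 256.
x = 4195835.0
y = 3145727.0

print(x / y)            # ~1.333820449136241 on any non-flawed FPU
print(x - (x / y) * y)  # ~0.0 here; famously ~256 on a flawed Pentium
```

Either line is enough to tell a broken chip from a working one, which is rather the point: with arithmetic, the answer is checkable, and "approximately" doesn't cut it.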
_________________________________________________________________

The first time I ever heard the phrase "live my truth" was when a sociopathic liar on my TV show was caught in a sociopathic lie, and when he was asked about his sociopathic lie he responded that he was "living his best truth," thereby implying that it wasn't that he lied, it was that there is no truth, there are no lies, what does reality even mean, maaaaan. And everyone was too polite to go "you're a fucking liar" because they were fucking simps. Not "here's my official class photo with a chicken" simps, but simps nonetheless.

This is the reason STEM kids will always and forever be fucking merciless towards any dipshit with a liberal arts degree: sometimes the answer is the answer. These are not the same:

And the difference killed 114 people.

So the answer is the answer is the answer, except in liberal arts, where the answer is a subjective performance in response to prompts that is graded and judged on largely subjective standards. Professor Chicken is all about whether a freshman who has never gotten laid can write a better essay about a snowball fight than a robot, without even beginning to grapple with the difference between the subjective evaluation of creative writing and the objective evaluation of mathematics. Primarily because Saltman told him it was okay to do so. Kind of. Not really. Saltman actually told him to STFU, but he's a fucking simp, so he took that to heart and wrote a lesson plan whereby freshmen can burn off one of their English gen eds feeding tokens to ChatGPT.

And this mfer is so far up his own ass that he can simultaneously say “the upsides for school districts and colleges are clear” and quote one of his students as saying “Reflecting on the fact that 3 credits at UVA costs me $5000 and 2100 minutes, I do not believe I grew enough through this course for it to be worth it.”

THERE'S NO FUCKING SYNTHESIS HERE

It's fuckin' Sam Bankman-Fried logic: what are the odds that Shakespeare is any good? The math says he sucks, so why should I read him?

__________________________________________________________________

Make no mistake - the Eloiification of the human race is going to have winners and losers. Wow, ChatGPT wrote a marginally better essay than a freshman English student, time to tune in the Kick-Me-In-The-Balls channel. The people who can ignore the fishing lure are going to eat everyone else for lunch.

"Nathan?"

"Drew?"

They got everyone else's number. They recognize the sham for what it is and have moved the fuck on. I wonder how much carnage they will leave in their path, because this chucklefuck doesn't even realize how fucking stupid he is.

Speaking about AI in the classroom, OpenAI CEO Sam Altman has described ChatGPT as “a calculator for words.” This analogy indicates the magnitude of change that ChatGPT is poised to bring about—imagine how radically math class must have changed when calculators became widely affordable—but it also indicates that change itself, even radical change, is not necessarily scary. Most AI skeptics would admit that math class survived the advent of the calculator.
ಠ_ಠ

Too bad there wasn't anybody there! With the ability to document it! Or give you an oral history at the drop of a hat! Yep, we're all dead now. Or we have Alzheimer's.

This is like talking to a virgin about sex. What are these... cal-cue-lay-torrs you speak of? What eldritch magic do they perform? Apparently none of them have been in a fucking math class?

I'm sure you can trust those numbers, although fuckin' even the studies on self-reported ethics have been retracted.

well fuck, if Kevin Roose says it's okay

Are these "most people" in the room with us right now? Do they wonder if calculators are banned in math class too?

Holy shit, he sources his knowledge from NPR! Who knew! Also it's a study on the effects of AI on amateur creative writing. If you suck, AI might make you suck eight whopping percent less.

The classroom erupted in a hubbub of disbelief. I was as shocked as anyone. My ability to spot AI-generated text had until now proven so reliable that it wasn't even a point of conscious pride, just another flavor of the disappointment I feel when I start reading bad writing.

The point is both paragraphs ARE BAD. The goal is to make writing that is NOT BAD.

something something virgins, something something sex

Perhaps you're a bunch of incels; it is literally the only fucking thing you clowns talk about.

are they tho

And here I thought you listened to NPR

My niece asks ChatGPT to be her therapist every day. It does a better job than her friends. Her friends are thirteen. Who do you think she'd rather hang out with tho

how did they feel about calculators

Writing students would learn more from other students' critiques than from ChatGPT. Fight me.

Reader, they did not. They taught everyone how to cite a URL and moved the fuck on. The assumption, then as now, was that teachers would be better at sussing out sources of plagiarism than students, and if they weren't, that's on the teacher.

Perhaps it just does it worse and for free

I see you, Nathan

Bitch, I wrote an essay about why the entire fucking class should be abolished and the chair be fired, and the feckless grad student who subjected us to this bullshit broke down crying and pleading for her job. Y'all are pussies.

We discovered over the weekend that none of our employees under 40 know how to grill a hamburger. Nor can they be taught. They must all be busy teaching freshman creative writing at UVA.

Depends - do you have a drill or a screwdriver?

you and me, Nathan. You and me

my god it's full of stars

The gentle art of subtlety

Fukkn.... brethren in the Year of Our Lord 2025. Let's see your pince-nez.

Right, like all parodies do, like when Jonathan Swift said "I'm not actually telling you to eat the Irish"

Take it from someone who mixed over two thousand hours of reality television - humanity thinks in stock phrases and brainless clichés.

Fuckin' lol. Zoey snowed you and you're too busy sniffing out AI to notice she clapped your ass in a platitude.

—imagine how radically math class must have changed when calculators became widely affordable—
At the beginning of the semester, I asked my students to complete a baseline survey registering their agreement with several statements, including “It is unethical to use a calculator in a math class”
In my admittedly small sample, Altman’s analogy didn’t hold up. Calculators were uncontroversial: across my 72 students, one agreed that it was unethical to use a calculator, five chose Neutral, and the rest either disagreed or strongly disagreed.
But plenty of people do things that they believe to be unethical. In my next question, I asked students to indicate, anonymously, whether they had previously used AI in for-credit writing assignments. They confirmed that they had used it for editing first drafts (22%), outlining (28%), interpreting prompts (38%), proofreading (50%), and brainstorming (56%), with smaller pockets using it for finding sources or writing first drafts.
It’s increasingly uncontroversial to use AI to brainstorm, and to affirm that you are doing so: just last week, the hosts of the New York Times’s tech podcast spoke enthusiastically about using AI to brainstorm for the podcast itself, including coming up with interview questions and summarizing and analyzing long documents, though of course you have to double-check AI’s work.
The authors point out that most people, even if they’re not chess fans, have heard of Deep Blue, the chess-playing machine that beat World Chess Champion Garry Kasparov in 1997;
In the following class, I had my students consider a study, covered by an NPR story from 2024, that looked at the effects of AI on creative writing.
You’ve probably guessed where this is going. Max revealed, with a smile that didn’t quite conceal his dismay, that the girl did not exist, because the first paragraph had been written by ChatGPT.
When we talked about it, we reflected on the crucial efficacy of the romance plotline.
More than any single line of prose, it was the girl that had taken us in. She was so beautiful in her vagueness: the snow flecking her hair of unspecified color and texture, the frisson of erotic worldliness that comes from her being older than our narrator, and of course her “kind eyes.” Perhaps we were so deeply programmed by the rom-coms we’d watched that we’d mistaken a rom-com for reality.
In conversations about AI and education, it’s less common to hear about instructors using AI for writing lectures, designing assignments, or grading.
Some students have mixed feelings about the idea of receiving AI instruction or feedback—one student at Northeastern petitioned unsuccessfully for a tuition refund on the basis that her instructor had used AI—but the upsides for school districts and colleges are clear.
Frankly, in the era of DOGE, I’m surprised we haven’t heard more about replacing the left-leaning cadres of public university faculty with cost-efficient, “ideologically diverse” chatbots.
I didn’t realize how irreplaceable I’d believed myself, how like a John Henry of the networked Humanities, until my students shared their findings. Yes, the majority preferred my feedback—it was noted that the AI models demonstrated an unhelpful fixation on “improving transitions,” whatever that means—but even my strongest advocates noted that their AI tutors often gave advice similar to mine, and faster.
In each of these seminars, we had two instructors instead of one, who came from different disciplines: our Medieval colloquium, for example, featured a historian of early modern Rome alongside a softspoken Platonist.
If you accept both of these use cases—if you believe that students and faculty alike can and should use AI—you quickly encounter a scenario that most people would find logically abhorrent: teachers using AI to evaluate and grade AI-generated “student” writing.
Back in 1998, for example, faculty and academic officials panicked about the rise of the internet, expressing concerns that seem both quaint and prescient.
Perhaps ChatGPT has simply democratized this venerable tradition of cheating, thereby reducing the moral trespass we indicate when we use the word “cheating.”
While some students from “different childhoods and levels of education” might need help writing at the college level, Nathan explained that he’d had “an excellent education up to this point,” for which reason he took the “difficult and dangerous” view that “I do not believe that students of The University of Virginia, a top 3 public school in the country, need a first-year writing course such as this one.”
I suppose I feel obliged to correct for the fact that some students might have voted yes simply to spare my feelings; I admire Nathan’s and Sam’s bravery for saying all this to my face, as it were.
And if you have access to an electric drill, why would you insist on using a screwdriver?
In the final essay prompt, I’d invited my students to compare my course to learning “to start a fire with flint and tinder in the age of matches and propane lighters”: was this analogy accurate?
Of the four students who argued that the course wasn’t necessary, another took up this analogy directly. “Reflecting on the fact that 3 credits at UVA costs me $5000 and 2100 minutes,” Drew wrote, “I do not believe I grew enough through this course for it to be worth it.”
Other students disagreed with my analogy. “The analogy is flawed,” Dishi argued, “for unlike fires, all writing is not created equal.”
Carina, a ROTC student who often attended class in full camo, wrote that “there is a reason people still learn to build a fire that way, in case of emergency with no resources.”
In my admittedly small sample of 72 students, I noticed that the students whose essays expressed the strongest doubts about the course, whether or not they voted no, were all men. I didn’t have the opportunity to ask them about this, but I can speculate along identitarian lines as to why my brethren felt this way.
When I pointed out that the joke he intended would have required an “aha” moment where he told the reader that the text was AI-generated
As Misha’s essay indicates, writing about “the power of writing” contains its own stock phrases and brainless clichés.
Writing, wrote Zoey, “is a way to express something that you cannot verbally say out loud,” which made it “a subject as rigorous as science. Everyone can speak, but not everyone can write.”
If ChatGPT were to read Cam’s essay, I doubt it would pause at this line. But her words have lingered with me because Cam spent the last month of the semester on crutches, so I don’t think she used the word crutch lightly.