kleinbl00's profile
kleinbl00


stats
following: 22
followed tags: 80
followed domains: 8
badges given: 362 of 362
hubskier for: 5200 days


recent comments, posts, and shares:
kleinbl00  ·  17 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

Markov chains have nothing to do with words. They arose out of the study of numbers. Their breakthrough commercial success was Robert Mercer applying Markov chains to high-frequency trading - within a short-enough timeline the variation of up or down a penny produced actionable heuristics to probabilistically profit. This launched Renaissance Technologies, whose tax burden prompted him to enlist his daughter Rebekah to enlist Steve Bannon to elect Donald Trump.
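The trick fits in a few lines of toy Python, for the curious - made-up ticks, obviously, nobody's actual trading code: count which direction tends to follow which, then lean whichever way the table says.

    # Toy first-order Markov chain over penny tick directions ("U"/"D").
    # Illustrative sketch of the idea only, not anyone's actual trading model.
    from collections import Counter, defaultdict

    def fit_transitions(ticks):
        counts = defaultdict(Counter)
        for prev, nxt in zip(ticks, ticks[1:]):
            counts[prev][nxt] += 1
        # Normalize counts into conditional probabilities P(next | prev)
        return {s: {t: c / sum(row.values()) for t, c in row.items()}
                for s, row in counts.items()}

    history = "UUDUDUUUDDUUUDUUDU"          # made-up sequence of penny moves
    probs = fit_transitions(list(history))
    last = history[-1]
    best = max(probs[last], key=probs[last].get)
    print(f"P(next | last={last}) = {probs[last]} -> lean {best}")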

Here, check out the sleight of hand:

    In deep learning, the transformer is a neural network architecture based on the multi-head attention mechanism, in which text is converted to numerical representations called tokens, and each token is converted into a vector via lookup from a word embedding table.

"tokenization" is for purposes of Markov chains. "converted into a vector" restricts the math "via lookup." It's all autocomplete all the way down. Is it pure Markov chains? It is not. Is it most closely related to Markov chains? It is.

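Don't take my word for the "lookup" part, either. Here's the whole operation as toy Python - a made-up four-word vocabulary and a random table standing in for a trained one: the token id is a row number, and the "vector" is whatever sits in that row.

    # Minimal sketch of "token -> vector via lookup": ids index rows of a table.
    # Toy vocabulary and sizes; real models use learned tables over huge vocabularies.
    import numpy as np

    vocab = {"the": 0, "cat": 1, "sat": 2, "<unk>": 3}
    rng = np.random.default_rng(0)
    embedding_table = rng.normal(size=(len(vocab), 8))   # (vocab_size, d_model)

    def tokenize(text):
        return [vocab.get(w, vocab["<unk>"]) for w in text.lower().split()]

    ids = tokenize("The cat sat")
    vectors = embedding_table[ids]      # plain row lookup, nothing more
    print(ids, vectors.shape)           # [0, 1, 2] (3, 8)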
    At each layer, each token is then contextualized within the scope of the context window with other (unmasked) tokens via a parallel multi-head attention mechanism, allowing the signal for key tokens to be amplified and less important tokens to be diminished.

That's a multivariate pruning mechanism for solving problems by Markov chains. Probability in a billion directions as determined by what's adjacent. If the data contains the answer to one times ten and one times eleven, it will give you a probabilistic (not mathematical) answer for one times ten point five.
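Strip the jargon and one attention head is a dozen lines of toy Python - no learned projection matrices, toy shapes, nothing trained - but the mechanism is exactly this: score every token against every other token, squash the scores into probabilities, take the weighted average. Amplify, diminish, interpolate.

    # Minimal single-head scaled dot-product attention: each token's output is a
    # probability-weighted average of the other tokens' values. Toy shapes only.
    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        scores = Q @ K.T / np.sqrt(K.shape[-1])   # similarity of each token to each other
        weights = softmax(scores, axis=-1)        # "amplified" vs "diminished" tokens
        return weights @ V                        # weighted average: interpolation, not reasoning

    rng = np.random.default_rng(0)
    X = rng.normal(size=(4, 8))                   # 4 tokens, d_model = 8
    out = attention(X, X, X)                      # self-attention over the same sequence
    print(out.shape)                              # (4, 8)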

What do you think is the difference between a "probability" and a "vector embedding?" Because they mean the same thing. Maybe this is why you think "grasp" is an appropriate word to use for LLMs - they don't "grasp" anything, they have an n-dimensional marble track they run down. Or why you say "inference not autoregression" as if the things were using reason instead of math. Look -

Does it need training data? Then it's a model.

Does it need a lot of training data? Then it's a large model.

Does it need a lot of training data carefully curated and patched so that all hands have five fingers and nobody makes faceswap porn of Taylor Swift? Then it's just a model.

If it were "inferring" and "grasping semantic meaning, grammar and relationships" it wouldn't need training data. It would be able to parse and analyze outside of its extant corpus of knowledge. You could start it small and let it get big. You could dump the world's knowledge into it and it could prune the useless shit. You could pick a fight over how to spell "blueberry" and it would have mechanisms for correction, rather than styles for arguing with you. It could parse and incorporate "hands have five fingers" and then raise a flag about The Princess Bride. Instead what's changed over the past ten years is

- how big is the model

- how many copyright-violating corners are fenced off

- how many times and directions it loops through the model to give you an answer

GPT-5 cost between $2.1b and $2.5b. The Internet tells me that's four hundred times more than GPT-3. And not only can the average normie not tell the difference, but the people who actually use OpenAI howled so much that it was partially rolled back.

Which does the evidence bear out - that the fundamental technology has changed massively but we've had effectively zero progress

or

that the fundamental processes have simply grown more elaborate and we've had effectively zero progress?

kleinbl00  ·  18 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

an addendum

Wavetable synthesis is one of those things nerds love. I am a nerd. The nerdiest company in audio used to be Izotope, which has been doing the be-all end-all noise reduction and mastering plugins for more than 20 years now. And about fifteen years ago, they decided to take a swing at wavetable synthesis. It was fucking awesome.

But you kind of have to know what you're doing? And the presets are mostly useful for making weird shit. So they dumbed it down:

But you still need to know what you're doing.

So it died.

So did Stutter Edit.

So did Trash, although they've dragged that back from the dead, who knows.

So did Breaktweaker.

'cuz the problem is? The market isn't musicians, per se, it's guys who noodle around after they finish filing their TPS reports.

And here's what Izotope sells them. And it sells it to them a lot.

I can't use Neutron. My mixes don't sound bad enough for it to be useful. There is nothing Neutron can do to my shit because before I've even finished laying shit down I do a better job than Neutron. Because I'm a professional.

Who used Iris like a rented mule.

And Trash.

And Stutter Edit.

kleinbl00  ·  18 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

Again - my beef is LLMs, not AI. There are maybe five different statements on this page alone in which I explain and expound my decades-long experiences with AI music generation, as well as my general enthusiasm for AI art.

But we're not talking about that. We're talking about Suno. Which uses current technology, not some-future-technology-otherwise-unspecified that allows you to win your argument.

    So let's be clear about what you are saying. Are you saying AI will never produce music as good as human-produced music?

I am absolutely not.

Look - the artificial intelligence trope has always been "you wake it up, it cogito ergo sums, it overpowers you" going back to Gilgamesh and Enkidu. Going back to Hebrew golems. The first use of the word "robot" was in a story about a robot uprising. Chips are as great as they are because computers have been used to design them for generations now. "Give it a spark and the fire will consume you" is the whole point - technology is supposed to bootstrap and "artificial intelligence" is supposed to bootstrap itself. That's the "intelligence" part. That's "Turing completeness" - a language complex enough to reconstruct itself.

Where we're at with LLMs? Is "give it the sum of all human knowledge and get not-very-dank memes and an inability to spell blueberry, please give me seven trillion dollars."

Prior to LLMs, artificial intelligence research was slow and methodical and advanced at about the speed of fusion research. It wasn't something that scaled, it was something that required insight, experimentation, novel approaches and an integrative approach to computer science, biology, and math. LLMs? Are Cleverbot. And the more people who play with Cleverbot, the better Cleverbot imitates a real conversation. And the more money you throw at it, the closer you are to the Singularity.

Up to a point.

The thing is? Cleverbot is just a double-blind ELIZA and LLMs are just Cleverbot with the humans replaced with human records. That's the whole schtick. Across every dimension. It's a comparator. Nothing more. It has no reasoning. It has no analysis. It parrots. Feed it enough data and it'll parrot anything in its data. And now that it's one seventh of our economy, the research on artificial intelligence of any other kind has ground to a halt.

So I'll say this:

IF artificial intelligence research stops at LLMs

THEN AI, as exemplified by Suno and other LLM-based platforms, will never provide sufficiently consistent, repeatable, adjustable quality to aid musicians in improving their music, let alone supplant the need for artists in music generation.

But I mean FUCKING HELL man.

Those of us who make music? We got exposed to physical modeling in 1994. And you could make actual, honest, new sounds. If you had $8700.

Yeah - mathematical flute, mathematical drum, mathematical horn, mathematical whatever. Or, you could go the cheap way and fucking change music forever:

LLMs, for what it's worth, owe a lot of their basic shape and size to wavetable synthesis, first created in 1958. You'll recognize it:

Problem is wavetable synthesis is expensive and hit its limits in 85-86, about the time another set of algorithms took over

And then made a brief reappearance but was mostly used for cheesy, forgettable reality TV scores and USA Up All Night movies (and niche '90s ambient - I have owned three Wavestations)

Suno is the Wavestation, with a pledged "three Californias" of bragawatts dedicated to it. If you're a dumb kid at American Music you can put your thumb on C1 and get "Ski Jam" and it's fuckin' awesome but take it from a guy who step-programmed Wavestations - it rules at what's in the box, and what's in the box is gonna sound like that. You are never gonna get Firestarter out of a Wavestation.
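If you want to feel the "what's in the box" part in your bones, here's a toy single-cycle wavetable oscillator in Python - grossly simplified, a real Wavestation sequences and crossfades whole banks of these - but the principle is the same: the sound is already in the table, and "playing" it is just reading the table back at different speeds.

    # Toy single-cycle wavetable oscillator. Illustrative only; no interpolation,
    # no envelopes, no wave sequencing - just the lookup at the heart of it.
    import numpy as np

    TABLE_SIZE = 2048
    SAMPLE_RATE = 44100
    table = np.sin(2 * np.pi * np.arange(TABLE_SIZE) / TABLE_SIZE)  # one stored cycle

    def render(freq_hz, seconds=1.0):
        n = int(SAMPLE_RATE * seconds)
        phase = (np.arange(n) * freq_hz * TABLE_SIZE / SAMPLE_RATE) % TABLE_SIZE
        return table[phase.astype(int)]     # nearest-neighbor read straight out of the box

    note = render(440.0)                    # A4, exactly what's in the table and nothing else
    print(note.shape)                       # (44100,)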

Wanna see something cool? I helped build this:

It is, in my not-exactly-amateurish estimation, the first new synthesis algorithm since Physical Modeling because the synthesis industry has been making most of its money emulating the past 50 years in software. It's from a clever guy at Eventide named Dan who decided to apply some of the stuff he'd learned while doing multiband compression for mastering. It allows you to make actual new sounds that nobody has ever heard before because it takes a simple thing - a sound waveform - and tweaks it in ways people haven't thought of yet. I'd have presets in there (I have presets in other Eventide plugins) except I got bumped by Suzanne Ciani.

So that's out here on the frontier.

Where we watch the normies go "if we feed enough money to the Wavestation we'll never need musicians again."

kleinbl00  ·  18 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

https://medium.com/data-science-collective/gpt-4-is-just-a-giant-markov-chain-and-thats-the-genius-of-it-f7818ef2fc0b

Now - as far as Suno and Markov chains

- Having messed around with Suno

- Having messed around with Max

- Having messed around with Karma

- Having messed around with capytalk

- Having messed around with Reaktor

- Having messed around with WWise

- Having messed around with Unity

- Having messed around with sample libraries

- Having messed around with music libraries

- Having been around song construction kits since they were CD-Rs that you burned and swapped over Usenet

Suno has phrases that it can alter the key of. Suno has a text-to-speech generator that it can run through an autotune algorithm. And Suno has all of Gracenote, the service that fingerprints every bit of music ever published with title album genre mood and 30-or-so other characteristics. This is just what Spotify carries around in their public API:

    Track Name, Album Name, Artist Name(s), Release Date, Duration (ms), Popularity, Explicit, Added By, Added At, Genres, Record Label, Danceability, Energy, Key, Loudness, Mode, Speechiness, Acousticness, Instrumentalness, Liveness, Valence, Tempo, Time Signature
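If you want to poke at it yourself, here's a sketch of pulling those fields for a single track from Spotify's Web API - the token and track id are placeholders, and it assumes your app still has access to the audio-features endpoint:

    # Sketch: fetch the per-track audio features listed above from Spotify's Web API.
    # ACCESS_TOKEN and TRACK_ID are placeholders; you need your own OAuth token.
    import requests

    ACCESS_TOKEN = "YOUR_OAUTH_TOKEN"
    TRACK_ID = "YOUR_TRACK_ID"

    resp = requests.get(
        f"https://api.spotify.com/v1/audio-features/{TRACK_ID}",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    features = resp.json()
    for k in ("danceability", "energy", "key", "loudness", "mode", "speechiness",
              "acousticness", "instrumentalness", "liveness", "valence",
              "tempo", "time_signature"):
        print(k, features.get(k))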

And it takes your text prompts...

...and feeds them to an LLM.

And it takes your music prompts...

...and feeds them to an LLM.

And it has a very simple algorithm that mates the key signatures, tempos and moods to which it adds the lyrics.

It's a much simpler problem than regular LLMs. There's less training data. There are more rules. There are substantially fewer possible variations for "song" than there are for "poem" or "story" or "picture." This is why to normies it's more impressive - the job is easier.
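Nobody outside Suno knows what that matching step actually looks like, so treat this as pure speculation about its shape - but "mate the key signatures, tempos and moods" doesn't need to be anything fancier than scoring every library phrase against a prompt-derived target and keeping the closest fit:

    # Toy version of "mate the key signatures, tempos and moods." Entirely
    # speculative; made-up phrase library, made-up scoring, not Suno's pipeline.
    phrases = [
        {"id": "loop_017", "key": "A minor", "tempo": 128, "mood": "dark"},
        {"id": "loop_204", "key": "C major", "tempo": 96,  "mood": "uplifting"},
        {"id": "loop_311", "key": "A minor", "tempo": 126, "mood": "dark"},
    ]

    def score(phrase, target):
        s = 0.0
        s += 1.0 if phrase["key"] == target["key"] else 0.0
        s += 1.0 if phrase["mood"] == target["mood"] else 0.0
        s += max(0.0, 1.0 - abs(phrase["tempo"] - target["tempo"]) / 20.0)
        return s

    target = {"key": "A minor", "tempo": 127, "mood": "dark"}   # derived from the text prompt
    best = max(phrases, key=lambda p: score(p, target))
    print(best["id"])   # loop_311: the closest key/tempo/mood match gets stitched in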

AND YET

I asked it for "nine inch nails song about chickens" and it choked because "Nine Inch Nails" is a no-no. But I asked for "90s industrial song about chickens" and not only did it give me Trent Reznor, it gave me recognizable samples from one of my old sample packs.

kleinbl00  ·  19 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

There's a spectrum - a song that is very specific to one person will be loved by that one person and ignored by everyone else. A song that is universal will be broadly liked by everyone but cherished by few, presuming the songs are about equally good. Broadly speaking, a song specific to experiences will have a smaller, more vociferous following; that was my point with the South Park video. There aren't that many people who want "baby" replaced by "Jesus" but the ones who do? They will heavily favor those songs at the price of quality.

Here's my question: right now, you can say "hey Siri - Play a song to put me in a good vibe" and it will absolutely do that. What leads you to believe that an ad-hoc synthetic composition a la mode will be more impactful for you than a song you already have a somatic experience with? That was written by humans? That you experienced with other humans?

Every mother is an expert on "good art as experienced by me." That doesn't make them art critics, it makes them parents.

kleinbl00  ·  19 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

    Sorry, but your arguments here and in the other thread simply display a lack of vision. You're looking at where AI is now and saying it will never get to human level because "reasons".

I've been very clear about those reasons - LLMs are incapable of synthesis. They use Markov chains and stochastic variation to produce arithmetic means in every dimension. An n-dimensional LLM with values from 1 to 100 in every direction will never reach 101 in any direction. That's not "reasons" that's the fundamental limitation of the technology.
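That's not rhetoric, it's arithmetic. Assume the model is interpolating - taking weighted averages of its training points, which is the assumption I'm making here - and no blend of values between 1 and 100 ever lands on 101:

    # The "never reaches 101" claim as arithmetic: a weighted average (convex
    # combination) of training points can't exceed the data's per-dimension max.
    import numpy as np

    rng = np.random.default_rng(0)
    training = rng.uniform(1, 100, size=(1000, 16))     # values 1..100 in 16 dimensions
    w = rng.random(1000)
    w /= w.sum()                                        # nonnegative weights summing to 1
    blend = w @ training                                # interpolation between examples
    print(blend.max() <= training.max())                # True: never past the data's edge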

    But if you told the guy who invented the printing press that one day people would print lifelike color pictures with tiny drops of ink using a device that fits in your backpack and costs the same as a peasant gets paid shoveling shit for half a day, he'd say it's impossible.

mmmyeah I suggest you take up your beef with the printing press with someone else. For his part, Gutenberg came up with movable type after woodcuts were well known in Europe. Here's Madonna del Fuoco from 1425:

As to what that has to do with inkjet printers, I'm not sure. I'll point out that Gutenberg's movable type and dye sublimation or inkjet printing is about 550 years (or 27 generations) apart, and also mention that pretty much up until the xerox machine? Gutenberg would recognize everything about printing, up to and including the mimeographs you and I grew up with.

    Just because Sumo hasn't improved, does that mean that no other programs will improve?

It means that the technology underpinning it does not experience improvement. There's only so much an LLM can do. Until we retrench to technologies that aren't wholly dependent on LLMs, there will be no substantial improvement.

    Yes, I know you're arguing that AI can only get us 80% of the way to what a human could do. I say it will get 100% there.

Right - and you're doing that by disregarding my expertise. Why? Because it's inconvenient. You haven't even paid lip service to my experiences in this field which definitely makes it easier to establish your opinion to be as grounded as mine.

    We could back and forth forever on this, but the thing is neither of us can prove the other wrong, so we're stuck with me saying you lack vision and you saying that because you have a long resumé you are more qualified to predict the future.

Right - you're arguing I "lack vision" when the fact of the matter is, I have the expertise to understand what I'm observing.

You'll excuse me if I merit your guarantee accordingly.

kleinbl00  ·  20 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

    Ugh, fine, I guess I'll go back to making own music then. I hope you're happy.

LOL

Izotope did this. They'd made super-tweaky heavy-duty learn-your-magic warhorses for noise reduction and we all use them. We use them like rented mules. And they're full of terms we don't use and can make strange artifacts and you beat on them and you get something and you shift it and you get something else and you learn that you stay within these values on these two parameters and you actually make things better and we rely on it. And then they came out with something called "total mix." And they didn't even beta-test it. They just threw it at editors.

And we all hated it.

And the editors all hated it.

And the output was shiiiiiiiiiiiiiiiiiit

Because the idea was that it could use its own training data and its own settings to make your reality TV show sound better without your having to know what you were doing. What it could actually do? Is make your reality TV show sound how Izotope thought it should. And they legit scrubbed the Internet of any mentions of it, because everyone was SO mad that they'd cut out the expertise of 95% of their clientele to better sell products to the other 5% that about 90% of that 95% started shopping for different tools.

Cathy O'Neil made the observation in Weapons of Math Destruction that the first thing an algorithm needs in order to be fair is to be open. The next thing it needs is to be replicable. There are all sorts of ways to design music algorithms to be intelligent but also creative but also adjustable but also replicable but there aren't any ways to do it with LLMs. My beef isn't with AI, it's with LLMs - I think AI is fascinating and I even think LLMs are fascinating when you can manipulate them in directions where you're making new stuff.

A tool is anything that augments expertise - a non-expert can use a tool and it will improve their output but their output will improve more if they learn how to use it. I don't care who you are - you'll get better at shoveling after a few hours digging a ditch. A crutch is anything that covers up a lack of expertise - if you can't walk, a crutch will help you perambulate... but if you're already running, a crutch is just something to carry around.

The entire AI landscape right now is crutches and it's being sold as if it's running shoes.

kleinbl00  ·  21 days ago  ·  link  ·    ·  parent  ·  post: How AGI became the most consequential conspiracy theory of our time

    I have been reporting on artificial intelligence for more than a decade, and I’ve watched the idea of AGI bubble up from the backwaters to become the dominant narrative shaping an entire industry.

Yet he missed the forest for the trees. It's right here:

    Alan Turing asked if machines could think only five years after the first electronic computer, ENIAC, was built in 1945. And here’s Turing a little later, in a 1951 radio broadcast: “It seems probable that once the machine thinking method had started, it would not take long to outstrip our feeble powers. There would be no question of the machines dying, and they would be able to converse with each other to sharpen their wits. At some stage therefore we should have to expect the machines to take control.”

That became Colossus: The Forbin Project

That became Demon Seed

That became Electric Dreams

That became The Terminator

All of which were in heavy rotation on daytime cable TV right about the time a home-schooled nerd named Eliezer could watch them rather than do his homework.

If you've only been reporting for ten years you missed the part where the "Singularity Institute for Artificial Intelligence" became the "Machine Intelligence Research Institute." "Singularity" isn't some weird phrase, it's the phrase coined by Vernor Vinge to set himself apart from every other "the robots are gonna get us" author. It's also the phrase Jaron Lanier used in You Are Not A Gadget to point out that the eschatological tendencies of the Singularity crowd mirror the eschatological tendencies of the Left Behind crowd - they both think Jesus is coming to reward the faithful and punish the wicked, the only difference is one takes god on faith, the other machine superintelligence. And as soon as Lanier gave away the game, "singularity" became a bad word.

I'll say this for the Jesusfreaks - you don't need to worry about the mechanics of an all-powerful god, it's in the name. The Singularists, on the other hand, needed to build out this whole hand-wavey thing about how robots would paint with malicious colors and end the world.

And it wasn't logically consistent, and it isn't possible, and they know it, so they stopped thinking about it, and took it on faith.

That's it. That's the whole enchilada.

IF: machine intelligence

THEN: machine superintelligence

BECAUSE: faith leap.

LLMs are perfect for this faith leap because you physically cannot interrogate them. You cannot know how an input led to an output. You can probe it? You can map its contours? But it is fundamentally unknowable, much like Jesus.

Pretty fuckin' cool how Peter Thiel has decided Greta Thunberg is the antichrist, rather than Grok or some shit, eh? Isn't it great how a rich eschatological dipshit who helped fund our way here is now a biblical scholar?

The part that grinds my gears is every. single. conversation. I have ever had on Hubski about artificial intelligence includes that faith leap. This shit is knowable and discoverable and the fuckin' code's out there, man. Back when this was a serious concept, the idea was a self training algorithm would train itself past our abilities but because people are more interested in the faith leap than the technology we've got Saltman out here selling three Californias worth of power and asking for eight trillion dollars. Before LLMs? The algorithm was gonna feed itself. After LLMs? We must spend our way to the apocalypse.

IF: machine intelligence

THEN: machine superintelligence

BECAUSE: faith leap, send as much as you can or god will call us all home

kleinbl00  ·  21 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

Guaranteed by whom? By you?

I've had friends throwing Suno at me since it debuted. it has not improved. Look - you know games. Do you know video games? In WWise or Unity the way you build up a reactive, responsive score is by creating individual elements with start and loop points based on events. Those events can be progress-based, they can be damage-based, they can be timing-based, they can be location-based. The player essentially creates her own score by wandering around the game and doing stuff, and if done well it's transformative.

Suno essentially does the same thing except instead the player inputs are text prompts. Those text prompts are tied to musical elements and those musical elements aren't original - they're warmed-over midpoints between examples in a training library. More than that, they aren't repeatable. More than that, they're the fat part of the bell curve - everything above 85% and everything below 15% (to conjure numbers out of my ass) are lopped off for being too weird.

Now - I'm literally in the beta for software that takes your music and comes up with AI accompaniment. It requires you to have slightly more of a clue what you're doing, however. What do you think - will it do better than Suno?

And at what point is it "AI" and at what point is it a tool? This is where people who have no professional experience with something like to pipe in. A buddy of mine freaked balls about this last night:

ZOMFG "we're cooked." Never mind that live video replacement has been commercially available for 25 years. Never mind that people are so used to a synthetic 1st down line that they don't even realize they're looking at one. Never mind that all this device does is allow you to Luminar or Photoshop or Figma your TikTok videos without having to loop through a computer. If you've never used video or image-editing software it's obviously the end times for photographers everywhere.

Did photography radically change when everyone started carrying a camera in their pocket? Or did we just end up with a lot more photos nobody looks at? Well, depends on where you look. The stuff the semi-pros crank out has become RADICALLY better because they have better tools. The stuff the amateurs crank out goes through waves of heavy Instagram filters and duckface. Ultimately, quality comes from expertise.

It seems impossible to you because you haven't been in the belly of the beast on music production since 1994. I have. But since it's music, nobody thinks that counts for anything.

kleinbl00  ·  23 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

Your argument is equivalent to putting forth Tesla's Full Self Driving as evidence of AI's prowess in protein folding, though. There is no aspect of Suno that has anything to do with warehousing or logistics. You're also arguing "here is an AI song that is well received, therefore all AI songs are great" when my argument was that the stuff you put forth (under a deliberately throw-away schedule, remember) wasn't impressive, particularly in terms of threats to the passions and professions of artists.

Remember - I've professed again and again that I've been messing about with generative music since the early '90s. It's not a tool that works for me (it's fun though) but that doesn't mean it's not a tool that works for anyone. I find that rezzeJ's work without Suno is better than his work with but that isn't necessarily true for everyone. Again, I've given money to AI artists.

Maybe Xania Monet is the first artificial persona to chart on Billboard. Hatsune Miku is like 22 now, here she is at Wembley:

My beef, to be clear, is that you consistently take the position that expertise is worthless, particularly in domains you have no experience in, and when presented with evidence to the contrary, you switch to other domains you have no experience in. Check it out - here's me defending AI artists three entire years ago. My argument hasn't changed - tools are useful in the hands of the trained and if you make something with a tool, you make something.

You keep doubling down on "expertise is worthless."

No.

kleinbl00  ·  23 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

Let's talk through this horse first, though. This discussion started in chat, which I'm not going to screengrab because it's tacky and flies in the face of chat:

- bfx brings up Amazon's impending 30k layoffs

- mk expresses concern over the layoffs, presumes they're AI-related, asserts that AI-related layoffs are the future of employment for white- and blue-collar workers, puts forth an ai-slop song a friend sent him as proof of the ascendancy of AI

- I shit all over the song, calling it out for being the same mediocre dreck Suno always produces, arguing that Amazon is a logistics company much more affected by tariffs than AI

- bfx politely points out that Amazon has a lot of warehouses and that their automation is a long ways away from AI

- mk throws this very song in my face to prove how rapid the pace of change is, with emphasis on how little effort it took

- steve argues that the warehouse tech is there, he's seen it

- bfx points out he works in logistics and is a decision maker in this field and he has not

- I shit all over the song, the idea that AI had anything to do with layoffs at Amazon, the historical impact of innovation on employment, the historical impact of innovation on logistics in employment and the idea that implementing AI has been a net positive for anyone, with concrete examples of the companies that have had their businesses hurt by implementing AI (Klarna, McDonald's) before rounding into my personal experiences attempting and failing to integrate AI into medical transcription

- mk posts this song

...so really, the discussion here isn't "this song is great" it's "this song is proof that AI will doom us all." The layoffs that started this whole blow-up? "It's culture." The bait and switch on this page is that "this song is great because it talks about us" serves as a proxy for "this song is proof that AI will doom the warehouse industry" - a classic whataboutism. My broad point is that it's a low-effort song that sounds low-effort, much as everything Suno has been doing since launch, but because the straw man has The Face of Hubski I'm a terrible person for picking on the baby.

Now let's talk about your argument because it's much more interesting. I would also like some credit for the fact that a big chunk of stuff posted with the tag "ai" was posted by me, going back more than ten years, and that the entire field was fascinating before OpenAI poisoned it:

Really, that inflection point is where pop culture switched from "AI glitches are interesting" to "AI glitches are bad for business:"

There's a whole domain where "makes shit up" is the entire point. I can't really hear many instances where the described features actually appear - I really wanted to hear it try to do "Rhodes runs through waveshaping and saturation" because it's my kinda thing, but really, the closest it got was deciding "Drums shift between breakbeats and complex IDM programming in 7/4" means "jungle," so it gave you jungle drums.

Searching for "Rhodes runs through waveshaping and saturation" shows that Suno doesn't know what a Rhodes is and doesn't know what waveshaping or saturation are, either - it's got stuff that it does, and it mixes them together. If I click the tag "IDM Jazz Fusion" it initially gives me several songs that sound indistinguishable from yours... but if I try to repeat the process it gives me more stuff like what I clicked so the tags aren't even real. It definitely has hooks it uses a lot, it definitely has beats it uses a lot. It shapes what it gives you based on what it has, not on what you ask for, and really, that's fine.

The Roland XOX-boxes were created to accompany one-man bands and if you try to use them for their intended purpose they sound cheesy AF. It wasn't until they started showing up cheap at pawn shops that music changed forever.

- and here's where it breaks down -

I gave Suno your exact prompts. It came up with four songs that sound nothing like yours. I'm sure I could refine it a little but I really don't want to get it on me; the fact that it starts out with very little to tie the prompt into the song is the whole point. By way of contrast, here's NI Absynth sixteen years ago:

The TB-303 was introduced in 1981 and failed immediately because it was a sophisticated device with a lot of potential designed for a simple job. It wasn't until hackers and pikers got ahold of it to bend it out of its intended purpose that things got interesting. By way of contrast, Electronic tanpuras were introduced in 1979 and haven't done a thing in the years since - they're simple devices with zero potential designed for a simple job.

And I mean, MAX was introduced in 1985. I've written algorithmic music prompts in MAX, Capytalk and Reaktor. Here's Reaktor 11 years ago:

Here's Reaktor 11 months ago:

Now - this isn't really to my taste. But you can hear the human behind it. It gets my chin nodding. It's evidence of a generative tool used in competent hands to create something unique. YOU CAN'T DO THAT IN SUNO. The whole point of Suno is 'make me a song that sounds like other songs' and it uses its training data of other songs to give you a song. Suno is the Tanpura - it lets you play with yourself alone. I've made this argument before - the great thing about XOX-box era Roland is it's open-ended. Here's what a Jupiter 8 arpeggiator wants to do:

Here's what a Jupiter 8 arpeggiator wants to do in the hands of Duran Duran:

What you're not hearing is that you've created an endpoint. It's music, sure. But it'll never be more than what it is, and any attempt to improve it will make it different, not better. That Tim Exile shit? He sweated over that, it's what he wanted. Your Suno example is what you got and I hope I don't offend you by pointing out that the stuff you make is leagues better, I've heard it.

LLMs excel at getting anything to 80%. They are the embodiment of the Pareto Principle. They will always give you the fat part of the bell curve and with the commercially available LLMs, you can't even push them out into the tails - you are forever at a B minus and any of the random brilliance that might get you to an A has been banished to hell along with the stuff that gives you an F. The universe where everyone is coding and feeding their own small models? Yeah that shit gets weird. But that shit doesn't require you to buy tokens. Worse, it is economically impossible for commercial LLMs to do anything else. The only way to get the model to be interesting is to find the holes in it, and every time someone finds a hole in a major model, the company patches it. Their market isn't you, their market isn't Tim Exile, their market is mk, who is quite happy with mediocrity thanks and isn't at all interested in making music for anyone else. He doesn't want to sell potato chips, he wants to eat them.
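The "fat part of the bell curve" isn't a metaphor, it's how the sampling knobs literally work. Toy numbers here, not any particular model's, but low temperature and top-p truncation both do the same thing: pile the probability onto the safest tokens and make sure the tails never get drawn.

    # Toy demonstration of temperature and top-p (nucleus) truncation.
    # Six fake tokens, "safe" ones first; real models do this over huge vocabularies.
    import numpy as np

    logits = np.array([3.0, 2.5, 2.0, 0.5, -1.0, -2.0])

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    for temperature in (1.0, 0.7, 0.3):
        p = softmax(logits / temperature)
        print(f"T={temperature}: top token gets {p.max():.0%} of the mass")

    # Simplified top-p: keep only tokens whose cumulative probability stays under 90%
    p = softmax(logits)
    order = np.argsort(p)[::-1]
    keep = order[np.cumsum(p[order]) <= 0.90]
    print("tokens surviving top-p=0.9:", keep.tolist())   # the tails never get drawn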

Suno can generate music to the level of bland, forgettable stock music. It isn't even adequate, man. Take away the Ikea Effect and it's got nothing. I didn't build your Poang chair so to me? It's just a cheap lounger. I didn't assemble your Tamiya tank model so to me it's just a tchotchke. I didn't rub my hands all over your Hatchimal so to me it's just clutter. Go search Suno for "elevator music" and then hit Youtube for some Percy Faith. Suno doesn't even understand the brief for the same reason it'll never understand what a "Screamer" is - it has to be told, it can't infer.

There's a "tell" to AI-anything. Whatever the tool is, ask yourself - is it for people who know what they're doing? Or is it for people who think they don't have to know what they're doing? Because that last 20% is where skill comes in. Where training comes in. Where experience comes in. And I have yet to see an LLM-based tool that lends itself to people with skill, training or experience... and I've been using AI tools since Sound Forge 3.5 in 1998.

kleinbl00  ·  26 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

I'm being a prick by saying a song that took thirty seconds to prompt isn't impressive?

That seems to be the argument here - we're all doomed because AI is good. But what evidence of good AI is there? This song?

Theoretically I've been beta-testing software for a large company that will build up bass, guitar, drums, whatever around any audio file you feed it. I haven't felt the need, however, because in a closed, NDA beta populated by audio professionals there's been a lot of "mmmmyep" in response to everything it does. The better version of this, run by people who do this shit for a living, is thunderously underwhelming and it kicks the living tar out of this.

The Korg Karma would auto-accompany; it was released in 2001. And yeah - if you were a lone lounge singer at a Holiday Inn out on the turnpike, it was cheaper than hiring bandmates. You can even look at its list of notable users and see that it got a real try-out - but that algorithm was available for about 18 months and then the Karma got remaindered. Korg never bothered trying to do that again.

My argument is and has been that algorithmically-generated music (and poetry, and art, and and and) is un-fucking-impressive and I say that as someone who started messing around with algorithmic music on Kyma in 2002. See - I'm not disputing you on wastewater even though I've got more background in it than any other normie you know. I'm not disputing mk in genetics. I'm not disputing TNG in sales.

But this? This is something I actually know more about than anybody else and since y'all have picked up a guitar once or twice y'all feel more than comfortable going "no no it's great and you're a prick for raining on the parade."

One of the things you have to learn as a screenwriter is distinguishing between "things that you like" and "things that are good." I always had hella more luck churning out dreck on command than I did carefully refining my own ideas - my own ideas appeal largely to me and leave the general audience confused. You can solve by inspection which are more fun for me to work on. If you ever wanted to know what happened to Bill Drummond and Jimmy Cauty, it's the fact that they spent ten years going "you're all listening to shit, stop it" and the audience went "moooooooooorPOOOOOOOOOOOOOP" so they took a million pounds and burned it in a pile.

Everyone here is going "thing that I like" and equating it to "thing that is good" - "while this is a genre of music I would never listen to, it's pretty darn good compared to other music in that genre." That's like saying "while I would never listen to rap music, this is good rap music." The truly baffling thing is I'm sitting here getting ad-hominem attacks for arguing that low-effort AI slop is low-effort AI slop! If my kid spent 30 seconds scribbling something on a page and showed it to me, I would be remiss as a parent if I didn't say "that looks like you spent 30 seconds on it, kid." What's the counter-argument? "What does the fox say" peaked at number six. Fuckin' "We Built This City" made it to number fourteen FOR THE YEAR and it's widely acknowledged as one of the worst songs of all time.

But you know what? It's in your goddamn head right now. It just bumped out "what does the fox say" which was also in your head. Fuckin' Gangnam Style was a goddamn cultural movement and the whole bloody country was doing the Macarena.

Pop quiz - can you recall anything at all about "Hubski Ghosts?"

Back when I was editing and scoring Youtube videos I had access to three different stock music libraries. They were full of forgettable nonsense garbage that nobody ever listened to again, but they were cheap. Those, of course, were the high-water mark of unproduced music and they sounded hella better than Suno ever has. Nowadays of course you just type "royalty free" into Youtube and get whatever you want. Most people use the same music, though, because almost all of it is ass.

NOBODY uses AI-generated music.

And maybe that's the difference - I've spent a career listening to mediocre music across dozens of genres, and I've spent a career getting little gems like mixing Reba McEntire in secret, forgotten little impromptu three-part harmonies. My crap-to-gold meter is practiced and polished in this very domain.

If that makes me a prick? I suggest y'all get used to people being pricks about your AI-generated music.

kleinbl00  ·  28 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

You take it as an article of faith that AI will end up in the Top 40.

- What's your music experience?

- What's your broadcast experience?

- What's your radio experience?

- What's your songwriting experience?

- What's your A&R experience?

Because I paid for college mixing bands in clubs.

And I worked for Paramount and Fox for fifteen years.

And I put together a radio show every week that reports to Billboard - my shit actually gets tabulated by the guys who build the Top 40.

And I've scored two or three web series and a few short films.

And I get about 15 emails a day from various A&R reps trying to get me to play their music.

I'm not even going to be polite about this anymore - you know fuckall about music but since you can type prompts into an AI you presume my expertise is moot. "Their competition is just gonna keep getting better and better" - it's not though. Suno this month is Suno 18 months ago and nobody fucking cares. Look - someone hacked Spotify's algorithm and managed to get into the charts. Here, watch.

Go search "velvet sundown" and you'll hear far more discussion about the fact that they somehow got 370,000 spins than the fact that this is mediocre shit that nobody would listen to on purpose. Basically, Spotify's algorithm is so easily gamed that you can stuff it with AI slop as easily as you can Youtube or TikTok. Spotify responded by going "oops."

Here, I got another one for you

You know who doesn't hate AI music? The alt-right. Because they can force Suno to make them alt-folk covers of music that they aren't allowed to like without it being run through their hatefilter.

That's objectively TERRIBLE. Here's the really dumb thing: an alt-folk cover of Pantera's "Walk" has existed for 12 years:

But clearly that chick has opinions about Gaza so we aren't allowed to listen to it, let's get AI to burn a few million tokens doing a worse version that's just for us.

I listened to the goddamn song, dude. The difference is, I listen to about nine hours of new music a week.

I am an expert.

You are not.

And your arguments reflect that fact.

kleinbl00  ·  28 days ago  ·  link  ·    ·  parent  ·  post: Hubski Ghosts

And it sounds like it!

- What about this is impressive to you?

- What about this is any better than Suno was eighteen months ago when all the musicians spent two minutes freaking out about Suno before dismissing it out of hand?

- What about this sounds like anything more than GPT-4 poetry, Taxi.com stock music and any rando's text-to-speech AI engine?

- Are you ever going to listen to this again?

- So why would you expect anyone else to?

I had a bad day in a private forum on Reddit a dozen years ago and within an hour someone had posted this:

Now - I don't listen to that either. It's every bit a low-effort throw-away song. Nobody will ever bother with it and the fact that it's been racking up 20 views a year is actually kind of impressive.

Everything else to say was said here, applies 100% and it grinds my fucking gears how everyone tech-adjacent stares into the Mirror of Erised and goes "clearly artists are doomed."

kleinbl00  ·  30 days ago  ·  link  ·    ·  parent  ·  post: The dawn of the post-literate society: And the end of civilisation

The lack of meaning, you mean?

My daughter went bra shopping with her mom yesterday. She also got a makeover at the counter - first time she's had eye shadow and mascara on. She also gave me the GenZ Stare for the first time I noticed - now normally I'd link to that but I want you to go through the exercise of seeing the moral panic writ large in every result you get.

Now see - up until like a year ago we just called that a "withering stare." I don't know how far back that goes; I could guess "Shakespeare" and most people would nod and stroke their chins. But then some twitchfuck millennial decided "ZOMFG they're staring at me deadpan AND I DON'T KNOW WHAT IT MEANNNNNNNSSSSSSSS" because they were raised on a steady diet of Instagram and Friends reruns and they have crippling aphasia if no one is making duckface. So rather than go "I am now an Old" they went principalskinner.jpg and presumed it was the children who were a problem.

I can see some people having a problem the first time their kid gave them an "AYFKM" look. For me? It was rather a proud moment. My wife was discussing a self-defense course for young women that was being taught by a friend of a friend -

"sort of like what they taught us to do in camp."

"What did they teach you to do in camp?"

"you sharpen a stick that just barely fits in your hand and leave one end blunt so you can conceal it and stab your attacker if you need to."

"So that camp taught you to make prison shanks?"

"I...guess?"

"Well I guess it's a good thing Bobby wasn't in that camp," my wife opined.

"Oh Bobby took that camp," my daughter said.

"Well at least he's not violent," I said.

- GEN Z STARE -

"LOL Bobby is violent?"

- GEN Z STARE INTENSIFIES -

"What violence does Bobby practice?"

"He likes to throw rocks at girls' boobs"

"Awright I earned that look"

kleinbl00  ·  33 days ago  ·  link  ·    ·  parent  ·  post: The Goon Squad

So... gooning is not my cup of tea? But I'm really uncomfortable with Harper's declaring that nobody should drink it.

Aside from the self-righteous moral scolding ("I could've spent my years of peak brain development romping around a toxic-waste site, slurping sludge and indiscriminately licking circuit boards"), the article describes an archetypal sex-positive group fetish. The Circle Jerks formed in 1979 and everyone knew what they were referring to. Bukkake has been a thing since the mid '80s. The "goonstate" is nothing more than tantric sex, an accepted and celebrated aspect of Hindu spirituality with a 2500-year history.

I mean yeah - older dudes are getting laid more often, who knew. The guys who are really into it are concerned about the health of the guys who are diving too deep. Young men will absolutely celebrate things young women really wish they wouldn't and while I've never felt particularly compelled to build a porn cave, let alone share pictures of one, I'm heartened to see that the controls and cautions around it are about maintaining a non-exploitive community. Pretty much every "nope I'm out" example used by the author is about stuff that breaks the law or crosses a self-harm threshold so what, exactly, are we supposed to be upset about?

    As a card-carrying woke millennial, I felt antique as a hippie washed ashore in Reagan’s eighties. There I was, archaically respecting pronouns while everyone else bought Bitcoin and rediscovered the joys of calling things retarded.

Oh.

Fu-huuuuuuuuuucking LOL

    Using electronic health record data from the multistate, primary care–based American Academy of Pediatrics Comparative Effectiveness Research through Collaborative Electronic Reporting (CER2) network, we defined preguidelines, postguidelines, and postaddendum guidelines cohorts (cohort entry during September 1, 2012, to August 31, 2014; September 1, 2015, to August 31, 2017; and February 1, 2017, to January 31, 2019, respectively). We determined the cumulative incidence of IgE-FA and/or atopic dermatitis (AD) in children aged 0–3 years, observed for either at least 1 or 2 years. Diagnosis rates during pre- vs postguidelines periods were compared using logistic regression, Cox proportional hazards modeling, and interrupted time series analysis.

Let's put that in English, shall we?

- Using the EHR database outside the one all the allergy studies use (my daughter is not in it)

- That only contains data voluntarily added ("do you have a food allergy? click this box")

- That is explicitly not covered to test for food allergy by Blue Cross, Kaiser or any other major insurer

- They successfully demonstrated that there's a hole where they're expecting 60,000 kids to be

Contrary to this bullshit, the conventional wisdom GOING BACK DECADES was "get your kids exposed to allergens so they aren't as bad for you" - the phrase "hygiene hypothesis" was first coined in 1989 but the idea goes back to Rachel Fucking Carson. It's why my kid was given peanut butter at - you guessed it - four months! And promptly went into anaphylaxis at her babysitter's.

What they won't tell you, of course, is if you want a legit allergy diagnosis? You have to wait until your kid is 24 months old. But don't wait too long because insurance won't cover it after 30 months! And since the tests are done at an allergy clinic the results aren't given to your PCP. You can do that, of course. You can ask for the results to be transferred. But EPIC doesn't have a slot for all your allergy shit because your insurance company doesn't have any way to code for it other than "life-threatening allergy."

What you're seeing here? Is the archetypal attempt to blame parents - same as it ever was - while also attempting to cast a nothing bullshit study as a "landmark."