kleinbl00  ·  post: Pubski: May 20, 2020

In the beginning, there was the Motorola 68k and the Intel x86. The 68k was a big, orthogonal chip with a flat memory model; it was designed for synchronous processing by multiple different areas of the chip, which meant it lent itself well to stacking. The x86 was designed around memory segmentation to allow better use of limited resources. One is expensive. The other is cheap. One is refined for flexibility. The other is refined for speed.
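To make "memory segmentation" concrete, here's a minimal sketch of the real-mode 8086 addressing rule versus the 68k's flat addresses (illustrative, not any particular machine's memory map):

```python
# Real-mode 8086: a 16-bit segment register and a 16-bit offset combine
# into a 20-bit physical address — segment * 16 + offset.
def x86_real_mode_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 20 bits on the 8086

# The same byte is reachable through many different segment:offset pairs:
assert x86_real_mode_address(0x1234, 0x0010) == 0x12350
assert x86_real_mode_address(0x1235, 0x0000) == 0x12350

# The 68k has no translation step: a pointer is just the address.
m68k_address = 0x12350
```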

Apple used a 68k processor. This allowed it to handle digital audio natively and successfully. At a time when memory was $40/MB, Apple's 68k machines could successfully stream stereo CD-quality audio at roughly 10.6MB/minute. PC clones used the x86 and no attempts were made at authoring shit. At some point in the late '80s, however, the PC universe decided that maybe games shouldn't sound like 8-bit shit, so things like the Gravis Ultrasound and the Creative Sound Blaster became viable. These specialized sound cards siphoned the bitstream off at an interrupt; the PC/AT's cascaded interrupt controllers gave you sixteen IRQ lines and a sound card could claim one. It allowed for synchronous audio control on an asynchronous chip.
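The data rate is nothing by modern standards, but run the arithmetic against $40/MB and you see why this was brutal:

```python
# CD-quality audio: 44.1 kHz sample rate, 16-bit samples, 2 channels.
sample_rate_hz = 44_100
bytes_per_sample = 2
channels = 2

bytes_per_second = sample_rate_hz * bytes_per_sample * channels  # 176,400 B/s
mb_per_minute = bytes_per_second * 60 / 1_000_000                # ~10.58 MB/min

# At $40/MB, holding one minute of uncompressed stereo in RAM:
cost_per_minute = mb_per_minute * 40                             # ~$423

print(f"{bytes_per_second:,} B/s, {mb_per_minute:.2f} MB/min, ${cost_per_minute:.0f}/min in RAM")
```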

This is about the time Pro Tools was released as Sound Tools, and about the time serious consumer video editing software became available. Pro Tools ran on 68k Macs and the Video Toaster ran on 68k Amigas; both were synchronous systems, while serious video rendering was done on SGI and Sun boxes running other RISC architectures. Eventually the PC universe recognized that there was a possibility to capture some of that money, and assorted platforms rose up that sort of, kind of allowed real-time recording and manipulation of audio and video, but they only worked if they were slow and janky enough for the buffer to do the job.
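"Slow and janky enough for the buffer to do the job" has a concrete meaning: the buffer trades latency for slack. A rough sketch of the tradeoff (the buffer sizes are illustrative, not from any particular card):

```python
# A buffer of N samples gives the machine N / sample_rate seconds to
# produce the next block. Bigger buffer = more slack for an asynchronous
# chip, but also more delay between input and output.
sample_rate_hz = 44_100

for buffer_samples in (64, 512, 4096):
    slack_ms = buffer_samples / sample_rate_hz * 1000
    print(f"{buffer_samples:5d}-sample buffer -> {slack_ms:6.1f} ms to fill it (and that much latency)")
```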

Eventually x86 architecture became fast enough that you could deal with the asynchronous nature of the x86 instruction set, but also, eventually, Intel realized they could eat Sun and DEC's lunch if they came out with server-grade x86 chips — the Xeon line. Now you could do audio and video on a PC. Meanwhile Apple limped PowerPC along until it was an obvious dead end, at which point they switched to Intel. But the hot shit Macs? They've always used server chips on the Intel side — Xeons in the Mac Pro.

So here we are now. If your asynchronous chips are fast enough, they can deal with synchronous data and the slop won't matter. That lets most people stream several channels of audio without things borking, but I'm running 100 channels of I/O on a regular basis. Account for plugins and such and I've probably got a few thousand synchronous streams of data. And when things don't all arrive at the same time, software breaks.
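The failure mode compounds with stream count: every stream has to land inside the same buffer deadline, so the odds that *something* misses grow fast. A toy model of that compounding (the per-stream miss probability is a made-up illustrative number, not a measurement):

```python
# If each stream independently misses its buffer deadline with tiny
# probability p, the chance that at least one stream misses — and the
# whole buffer glitches — climbs quickly with the stream count.
p_miss = 1e-5  # hypothetical per-stream, per-buffer miss probability

for streams in (2, 100, 2000):
    p_glitch = 1 - (1 - p_miss) ** streams
    print(f"{streams:4d} streams -> {p_glitch:.4%} chance a given buffer glitches")
```

And you're filling dozens of those buffers every second, so even a 2% per-buffer glitch rate means constant breakage.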

AMD never served the pro market and never will. Gamers are a much more valuable demographic anyway simply because there's hella more of them.