I wrote an application in HyperCard many years ago that took text files of word parts - prefixes, suffixes, roots, etc. - and combined them to make "words." Then it ran the results through a couple of filters to eliminate words with common oddities that English doesn't like - three consecutive consonants, Q without a following U, X without a vowel on both sides, etc. - and output the final list in a semi-randomized order.
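The combine-then-filter approach could be sketched like this. The word parts and filter details here are assumptions for illustration (the original read its parts from text files, and surely had more rules), but the three filters named above are the ones implemented:

```python
import itertools
import random
import re

# Hypothetical sample word parts; the original app loaded these from text files.
PREFIXES = ["ver", "zan", "qua", "lex"]
ROOTS = ["til", "mox", "ara", "end"]
SUFFIXES = ["ia", "on", "ix", "ent"]

VOWELS = set("aeiou")

def is_pronounceable(word):
    """Reject words with the oddities English doesn't like."""
    # No three consecutive consonants.
    if re.search(r"[^aeiou]{3}", word):
        return False
    for i, ch in enumerate(word):
        # Q must be followed by U.
        if ch == "q" and (i + 1 >= len(word) or word[i + 1] != "u"):
            return False
        # X needs a vowel on both sides.
        if ch == "x":
            if i == 0 or i == len(word) - 1:
                return False
            if word[i - 1] not in VOWELS or word[i + 1] not in VOWELS:
                return False
    return True

def generate_words():
    # Brute-force every prefix+root+suffix combination, then filter.
    candidates = ("".join(parts) for parts in
                  itertools.product(PREFIXES, ROOTS, SUFFIXES))
    words = [w for w in candidates if is_pronounceable(w)]
    random.shuffle(words)  # the "semi-randomized order" step
    return words
```

A blunt instrument, as noted: most of the output is junk, and the occasional keeper only surfaces when a human reads the list.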
It created both fascinating and terrible words.
The combinatorics were interesting to watch, and the edge cases were where things got really weird and unexpected results appeared. That became my favorite part of the program!
That was in, like, 1990, or somewhere around there.
It was a blunt instrument that would occasionally spit out excellent results. (If a client liked one of the words, they could buy it for $60k, or something. Not from me, sadly... from the company that paid me $500 to write the software for them.)
We like to think our computers are so smart, and our tools are so amazing. But honestly, all we have are seriously blunt hammers. We just have millions and millions of them pounding away, like the proverbial monkeys on their typewriters.
All these CGI studios need is for ONE of their videos to "hit", and it pays for the army of monkeys. Generate 10k videos and upload them programmatically, and the numbers pencil out.
But, as KB says about Google's algorithm changes, the smallest tweak can completely disable an entire industry.
It'll happen. And the content farms will iterate the next generation of their content engines.
Over and over.
This is not content for humans. It is content for computers.