BUT WAIT IT GETS WORSE

One developer, who only goes by the name Lore in their communications with the media, described the open-source release of the large language model (LLM) Llama as creating a “gold rush-type of scenario”. He used Llama to build Chub AI, a website where users can chat with AI bots and roleplay violent and illegal acts. For as little as $5 a month, users can access a “brothel” staffed by girls below the age of 15, described on the site as a “world without feminism”. Or they can “chat” with a range of characters, including Olivia, a 13-year-old girl with pigtails wearing a hospital gown, or Reiko, “your clumsy older sister” who is described as “constantly having sexual accidents with her younger brother”.
I do think it's hard to fundamentally change open models to prevent this? My understanding is that with open weights models, you could in theory just put the model back in the oven to train the safety features out of them again. (Although I might be wrong about that.) Facebook is entirely to blame for opening this particular Pandora's box though.
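For context on the "back in the oven" point: with open weights, continued fine-tuning is just more training, and anyone holding the files can do it. Here's a minimal sketch using the Hugging Face transformers and datasets libraries, with a placeholder checkpoint name and placeholder training text; how thoroughly a short fine-tune actually strips out safety behavior is a separate empirical question.

```python
# A rough sketch, not Meta's tooling: it assumes the Hugging Face `transformers`
# and `datasets` libraries and some open-weights causal LM you can download.
# The checkpoint name and training texts are placeholders.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "meta-llama/Llama-2-7b-hf"   # placeholder: any checkpoint your hardware can hold
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Whatever text the person doing the fine-tune picks; nothing in the weight
# files enforces the original vendor's preferences about that text.
texts = ["placeholder fine-tuning document one", "placeholder fine-tuning document two"]
dataset = Dataset.from_dict({"text": texts}).map(
    lambda row: tokenizer(row["text"], truncation=True, max_length=512),
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="retuned", num_train_epochs=1,
                           per_device_train_batch_size=1),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("retuned")   # the re-baked weights are now just another local folder
```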
Yeah, you can't program morality into the models. Once again, we have a new technology that rolls over our norms, expectations, and economic and legal systems, and it's a trend that keeps picking up pace. Human nature is the culprit. We can't help ourselves. We are a medium for the protagonist; we aren't the protagonist.
What an absurd statement! You can absolutely program morality into the models. They rely on training data, and that training data is selected and refined by employees of the company making the model. Every LLM out there is a Swiss-cheese lookup table of "things we won't get sued over." That's why you can no longer get ChatGPT to generate Mario but you can absolutely get it to Miyazakify everything. This isn't something that happened through serendipity; it happened through training. "Don't tell teenagers to dickmax" is the same programming statement as "don't tell journalists the Jews should be gassed."
It's not the training data that does that. What you are seeing is mostly the result of hidden prompt engineering and post-processing of outputs. You can skew the training data somewhat, but when you are talking about open models you can't code morality into the dataset, and you can just as easily ask a model to be evil with a dataset that wasn't optimized for it. GPT won't generate Mario because OpenAI literally tells it not to when you ask.
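To make the layering concrete, here's a toy sketch of the wrapper pattern being described: a hidden system prompt glued onto every request plus a post-filter on the output, both sitting outside the weights. The model, prompt, and blocklist below are stand-ins, not anyone's actual guardrails.

```python
# A toy illustration of the wrapper pattern, not anyone's real guardrails:
# a hidden system prompt is prepended to every request and the output is
# screened afterwards. Model, prompt, and blocklist are all stand-ins.
from transformers import pipeline

generate = pipeline("text-generation", model="gpt2")   # stand-in for any hosted LLM

HIDDEN_SYSTEM_PROMPT = (
    "You are a helpful assistant. Refuse requests involving trademarked "
    "characters or instructions for wrongdoing.\n\n"
)
BLOCKED_TERMS = ["mario"]   # toy stand-in for a real moderation pass

def hosted_chat(user_message: str) -> str:
    # 1) Hidden prompt engineering: the user never sees this prefix.
    prompt = HIDDEN_SYSTEM_PROMPT + "User: " + user_message + "\nAssistant:"
    raw = generate(prompt, max_new_tokens=60)[0]["generated_text"]
    reply = raw[len(prompt):]
    # 2) Post-processing: the reply is screened after generation.
    if any(term in reply.lower() for term in BLOCKED_TERMS):
        return "Sorry, I can't help with that."
    return reply

print(hosted_chat("Draw me Mario."))   # someone running open weights locally just deletes both layers
```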
"that training data being selected and refined by the employees of the company making the model" is pretty unambiguous. So I take my open model and I run it on my own iron and I chunk it down and I get this skinny little thing. Maybe I train it on the most heinous shit imaginable. That's not Meta's fault and it's not Meta's problem and if I wanna take an open model and code it for evil, nobody is going to stop me. But this is an open model running on someone else's cloud with payments processed by someone else's payment processor with Oauth handled by Google and Facebook and Github and whoever. And they are every bit as morally, ethically, legally and practically culpable as Goldman Sachs was for laundering cartel money. If they're all profiting off of evil they get to pay the penalties for profiting off of evil. This is not a gray area.
It is hard! Because any kid can just code up any fuckin' thing he wants and then what do we do?

Thing is, I had friends who would go to the hardware store and buy pipe, then hit the sporting goods store and buy smokeless, and lo and behold they'd spend the afternoon making pipe bombs. I had another friend who used his time in the machine shop to make 9mm pistols. My sister's boyfriend called in bomb threats every day at 1pm for about six weeks to avoid having to take a chemistry midterm. Easy! You just head over to the pay phone and dial! So simple a kid could do it. In all three cases. First pipe bomb I ever saw? I was in fourth fucking grade.

So how do you stop the kids from buying pipe? Obviously you don't because that's stupid. Maybe the kid has a sink to fix, or maybe he's picking up supplies for his dad. You can almost see the Leave it to Beaver episode. One thing you can do, though, is you can make it illegal to manufacture pipe bombs. Then the kids that make pipe bombs? You can actually intervene in their lives before someone gets killed.

Ohhh, but how that will curtail innovation! Ohhh, but how that will stifle free speech! Ohhh, but how that will somehow bring us closer to Communist Russia, which apparently we're fans of, except when Trump is mad at them, but even then...

Fucking look. Tim McVeigh cooked the Oklahoma City bomb up out of nitromethane and fertilizer. Nitromethane was tough to buy even back then - you'd best be a top fuel dragster team. It took McVeigh three tries because the first two people were at "dude where's your dragster" and "what are you trying to do, build a bomb lol?" The fertilizer? Enough for 12 acres of corn, 12 acres being at the time roughly one fifteenth of a typical farming operation. And we've been trying to regulate it ever since. We sure as shit track the nitromethane.

There are legitimate purposes for AI, much like there are legitimate purposes for ammonium nitrate. Nearly everyone fucking around with it will do no harm. There's absolutely no point in regulating things that can be dangerous but are mostly used for innocuous purposes; everything that can be easily used for malfeasance we regulate the shit out of.

I can buy castor beans online right now. I can find online articles telling me what a beautiful plant it is. And everyone agrees you're A-OK to do that - even though the beans should be handled with gloves, apparently - until you try to cook up ricin, which has been illegal since - wait for it - 2019 because a whole bunch of simps take their murder cues from Breaking Bad. Fuckin' Agatha Christie used ricin in one of her books. We didn't need to regulate ricin, much like we didn't need to regulate nitromethane. Is it Vince Gilligan's fault that now you can't cook up deadly toxins at home for relaxation? I mean, yeah, but that's incidental. Vince Gilligan doesn't profit off of people buying castor beans off Amazon.

Facebook? OpenAI? Anthropic? If my malfeasance requires your giant server farm, you have an obligation to the public.
How do you prevent a person from pirating software? I don't think you'd need a super advanced server farm, for someone with money could probably daisy-chain enough computers and run a simple AI instance without needing an entire datacenter. Certainly enough to create something like an AI Twitter bot spreading disinformation to rubes who don't bother to fact-check. Maybe that can convince people that Haitians eat cats on Bastille Day. Maybe it convinces them that Biden is dead and has been replaced by an actor. Depending on how clever the bot is and how credulous the target audience is, you can probably incept someone into becoming a lone wolf terrorist. But what exact thing do you prevent here? Limit the number of GPUs (which are also used for bitcoin mining)? Computers themselves? Hard drives? Given how easy it is to download software, I can't imagine that you could honestly expect to stop people from downloading it - we've been failing at that since Napster in 1999.
What does that have to do with the price of tea in China? Nobody is worried about "someone with money" being whispered sweet jihadi nothings by IsisGPT. The worry is impressionable people being told harmful stuff about comets.

Here, look at the Honda ATC three-wheeler: these were all the rage when I was a kid. They were also dangerous AF. Basic problem is they don't turn like a motorcycle, they don't turn like a car, they turn like a car with a wheel in the middle. This is particularly problematic when what they're mostly good for is doing donuts in vacant lots because your parents own a couple but they have to be trailered 50 miles to get to the trailhead and you don't have a driver's license but you know where the keys are and it's boring this summer and Susie will be impressed by your dust-generating power and and and.

Now - you can kill the shit out of yourself on a 4-wheel ATV, particularly as the southern US allows you to drive them on the street now. Americans are all about the freedom to kill the shit out of yourself. But deaths and injuries of children and teenagers went up like crazy when the ATC was introduced and dropped back down when it went away. Can you still buy an ATC? Absolutely. Can you still ride one? Hundo P. Are they still dangerous? Absolument. But the US made it harder to hurt yourself on Honda's toys by making Honda's toys less likely to surprise you in the corners.

Through legal injunction, you prevent large corporations from receiving revenue for encouraging impressionable incels to hurt themselves. How is this so hard to grasp?

You can stick a cut-down Facebook AI on a Raspberry Pi. It could absolutely spew Andrew Tate quotes all day long and nobody's gonna do shit about it. The important distinction is that a whole lot of effort and labor went into that, as opposed to "hey Siri tell me why girls don't like me."

Bitcoin mining uses ASICs. GPUs were used to mine Ethereum until it went proof-of-stake. AI is increasingly reliant on NPUs and has been for years. This is one of those situations where you're substituting your assumption of knowledge for... knowledge, and it's leading you to make supercilious arguments like "how do you prevent a person from pirating software."

Dozens of ways. I've got an iLok with 400 plugins on it; several of my friends run Pro Tools 10 (released 2012) because it's the last cracked version. I have several business programs that keep a portion of their code on a server I don't own, rendering everything local worthless without a clear license. But that doesn't matter, because this isn't a copy protection issue, it's a negligence issue.
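For the curious, the server-gated flavor of copy protection mentioned above looks roughly like this - a hypothetical sketch in which the local program does nothing until a license server the user doesn't control answers. The endpoint, key, and response fields are made up, not any real vendor's API.

```python
# A hypothetical sketch of the server-gated licensing pattern described above:
# the vendor keeps an essential step on hardware the user doesn't own, so a
# cracked local copy is inert. The endpoint, key, and response fields are made up.
import json
import sys
import urllib.request

LICENSE_SERVER = "https://license.example.com/api/check"   # hypothetical endpoint

def unlock(license_key: str) -> dict:
    """Ask the vendor's server to validate the key and hand back whatever the
    local program needs to run (feature flags, decryption material, etc.)."""
    req = urllib.request.Request(
        LICENSE_SERVER,
        data=json.dumps({"key": license_key}).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return json.load(resp)
    except OSError:
        sys.exit("No license server, no product: the local binary alone does nothing.")

config = unlock("ABCD-1234")   # without a valid answer, nothing past this line runs
```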