

I suddenly feel a lot less bad about him having his game copied and re-sold, because he released it into the public domain. Maybe the ‘left’ in copyleft scared him
Hateful and stupid 🤝
I hate that I saw that same post earlier today
Here’s a quote from the book:
AI already transcends human perception — in a sense, through chronological compression or “time travel”: enabled by algorithms and computing power, it analyzes and learns through processes that would take human minds decades or even centuries to complete.
Glad to know the calculators I had in school were capable of time travel
People are so, so, so bad at telling what’s a bot and what’s real. I know social media is swarming with bots, but if you’re interacting with somebody who’s saying anything more complicated than “P o o s i e I n B i o”, it’s probably not a bot. The same thing happens in online games, and accusing someone of being a bot is usually the excuse people reach for right before harassing them
But damn, the lengths people will go to to avoid admitting they were wrong. This comment chain just keeps going, with somebody who’s convinced {origin="RU"}{faith="bad"}{election_manipulation="very yes"} must be real because something something microservices: https://www.reddit.com/r/interestingasfuck/comments/1dlg8ni/russian_bot_falls_prey_to_a_prompt_iniection/l9pbmrw/ It reads like something straight off /r/programming or the orange site
Then it comes full circle with people making joke responses on Twitter imitating the first post, and then other people taking those joke responses as proof that the first one must be real: https://old.reddit.com/r/ChatGPT/comments/1dimlyl/twitter_is_already_a_gpt_hellscape/l9691c8/
This account kind of kicked up some drama too, basically for the same reason (answering an LLM prompt), but it’s about mushroom ID instead: https://www.reddit.com/user/SeriousPerson9

I’ve seen people like this who use voice-to-text and run their train of thought through ChatGPT or something, like one person notorious on /r/gamedev. But people always assume it’s some advanced autonomous bot with stochastic post delays that mimic a human’s active hours, when, like, it’s usually just somebody copy/pasting prompts and responses.
Sorry if you contract any diseases from those links or comment chains
I’m in the same boat. Markov chains are a lot of fun, but LLMs are way too formulaic. It’s one of those things where AI bros will go, “Look, it’s so good at poetry!!” but they have no taste and can’t even tell that it sucks; LLMs just generate ABAB poems, and getting anything else out of them is like pulling teeth. A Markov chain generator’s output is more garbled and broken, but it’s a lot more interesting in my experience. Interesting content that’s a little rough around the edges always wins over smooth, featureless AI slop in my book.
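For anyone who hasn’t built one: a word-level Markov chain generator fits in a couple dozen lines. A minimal Python sketch, where `groupchat.txt` is just a made-up stand-in for whatever text dump you want to imitate:

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map every `order`-word prefix to the words observed right after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, order=2, length=50):
    """Random-walk the chain; jump to a fresh prefix at dead ends."""
    out = list(random.choice(list(chain)))
    while len(out) < length:
        followers = chain.get(tuple(out[-order:]))
        if followers:
            out.append(random.choice(followers))
        else:
            out.extend(random.choice(list(chain)))  # dead end: restart somewhere new
    return " ".join(out)

if __name__ == "__main__":
    corpus = open("groupchat.txt", encoding="utf-8").read()  # hypothetical corpus
    print(generate(build_chain(corpus)))
```

Order 2 keeps it coherent-ish; order 1 is maximum garble, which is honestly half the fun.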
slight tangent: I was interested in seeing how they’d work for open-ended text adventures a few years ago (back around GPT-2, when AI Dungeon launched), but the mystique did not last very long. Their output is awfully formulaic, and that hasn’t changed at all in the years since. (Of course, the tech-optimist goodthink way of framing this is “small LLMs are really good at creative writing for their size!”)
I don’t think most people can even tell the difference between a lot of these models. There was a snake oil LLM (even more snake oil than usual) called Reflection 70B, and people could not tell it was a placebo; they thought it was higher quality and invented reasons why that had to be true.
Orange site example:
Reddit:
For storytelling or creative writing, I would rather have the more interesting broken-English output of a Markov chain generator, or maybe a tarot deck or a d100 table. Markov chains are also genuinely great for random name generators. I’ve actually laughed out loud with friends when we throw a group chat into one and see what comes out. I can’t imagine ever getting something like that from an LLM.
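If anyone wants to try the name-generator angle, it’s the same trick one level down: a character-level chain over a seed list. Another hedged sketch; the seed names here are made up, feed it whatever list you actually have:

```python
import random
from collections import defaultdict

def name_chain(names, order=2):
    """Character-level chain: map letter n-grams to the letters that follow."""
    chain = defaultdict(list)
    for name in names:
        padded = "^" * order + name.lower() + "$"  # ^ = start pad, $ = end marker
        for i in range(len(padded) - order):
            chain[padded[i:i + order]].append(padded[i + order])
    return chain

def make_name(chain, order=2, max_len=12):
    """Sample letters until the end marker (or a length cap) is hit."""
    name = "^" * order
    while len(name) < max_len + order:
        nxt = random.choice(chain[name[-order:]])
        if nxt == "$":
            break
        name += nxt
    return name[order:].capitalize()

# Made-up seed list; swap in place names, NPC names, whatever.
seeds = ["theodora", "balthazar", "evangeline", "cassander", "rosalind"]
chain = name_chain(seeds)
print([make_name(chain) for _ in range(5)])
```

You get plausible-sounding mashups like “Balthandora” out of it, which is exactly the rough-around-the-edges output I mean.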