The irony of using an AI-generated image for this post…
AI imagery makes any article look cheaper in my view; it makes me more inclined to “judge the book by its cover”.
Why would you slap something so lazy on top of a piece of writing you (assuming it isn’t also written by AI) put time and effort into?
I thought it was intentional AI slop
Yeah, I’m sure they left the spelling mistake in the image on purpose to get increased engagement from pedants like me. I’m sorry, it works on me.
https://defragzone.substack.com/p/run-massive-models-on-crappy-machines
The author doesn’t oppose AI, just programmers being replaced by it.
This post is about programmers being replaced by AI. The writer seems OK with artists being replaced.
Or the picture is a statement for why artists shouldn’t be replaced either. Who can tell.
Considering one of the other posts is about “democratizing AI”, I lean towards my take.
Oh, it is for sure more likely.
I know that it’s a meme to hate on generated images, but people need to understand just how much that ship has sailed.
Getting upset at generative AI is about as absurd as getting upset at CGI special effects or digital images. Both of these things were the subject of derision when they started being widely used. CGI was seen as a second rate knockoff of “real” special effects and digital images were seen as the tool of amateur photographers with their Photoshop tools acting as a crutch in place of real photography talent.
No amount of arguing from film purists, or nostalgia for the old days of puppets and models in movies, was going to stop computer graphics and digital image capture and manipulation. Today those arguments seem so quaint and ignorant that most people are not even aware that there was ever a controversy.
Digital images and computer graphics have nearly completely displaced film photography and physical model-based special effects.
Much like those technologies, generative AI isn’t going away and it’s only going to improve and become more ubiquitous.
This isn’t the hill to die on no matter how many upvotes you get.
People don’t like generated images because the models are trained on copyrighted data, but if you don’t believe in copyright, then it’s a tool like any other.
There are thousands of different diffusion models, and not all of them are trained on copyright-protected work.
In addition, substantially transformative works are allowed to use otherwise copyrighted content under the fair use doctrine.
It’s hard to argue that a model, a file containing the trained weight matrices, is in any way substantially similar to any existing copyrighted work. TL;DR: There are no pictures of Mickey Mouse in a GGUF file.
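If you want to see that for yourself, here’s a minimal sketch (in Python, assuming a GGUF v2/v3 file and a placeholder path called model.gguf) that reads just the GGUF header as laid out in the published spec: a 4-byte magic, a version, a tensor count, and a metadata key/value count. Everything after that header is metadata entries and raw weight tensors, nothing resembling a picture:

```python
import struct

# Rough sketch: read the GGUF header per the published spec (v2/v3 layout).
# Layout: 4-byte magic "GGUF", uint32 version, uint64 tensor count,
# uint64 metadata key/value count, all little-endian.
# "model.gguf" is a placeholder path.
with open("model.gguf", "rb") as f:
    magic = f.read(4)
    assert magic == b"GGUF", f"not a GGUF file: {magic!r}"
    version, tensor_count, kv_count = struct.unpack("<IQQ", f.read(20))

print(f"GGUF v{version}: {tensor_count} tensors, {kv_count} metadata entries")
# The rest of the file is metadata key/values plus the weight tensors themselves.
```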
Fair use has already been upheld in the courts concerning machine learning models trained using books.
For instance, under the precedent established in Authors Guild v. HathiTrust and upheld in Authors Guild v. Google, the US Court of Appeals for the Second Circuit held that mass digitization of a large volume of in-copyright books in order to distill and reveal new information about the books was a fair use.
And, perhaps more pragmatically, the genie is already out of the bottle. The software and weights are already available and you can train and fine-tune your own models on consumer graphics cards. No court ruling or regulation will restrain every country on the globe and every country is rapidly researching and producing generative models.
The battle is already over; the ship has sailed.
Exactly!!
Thank God, you get it. This video (which was trending a while ago) explained it pretty well:
https://www.youtube.com/watch?v=pt7GtDMTd3k
And to add to what you said, people have some huge misunderstandings about how Gen AI works. They think it somehow just copy-pastes portions of the art it was trained on, and that’s it. That’s not the case AT ALL; it’s not even close to that.
AI models should be allowed to be trained on copyrighted data. If they shouldn’t be allowed to do that, then humans shouldn’t be allowed to do it either. Why do we give such advice to upcoming writers and musicians and artists, to consume the kind of content that they want to create in the future? To read the kind of books that they want to write like? To listen to the kind of music that they want to create? To look at the kind of art that they want to create? Should humans ALSO be limited to only public domain content?? I really don’t think so.
Again, Gen AI models don’t just copy-paste stuff from their training data. They understand what makes up that piece of data, just like a human does.
Thankfully, reasoning models like DeepSeek-R1 have started to show the average person how an AI actually reasons and thinks about things, and that they don’t just spew stuff out of nowhere in the hopes that it makes some kind of sense, slapping pieces of their training data together to write something that’s barely comprehensible. The “Think” tags in such models really helped clarify some huge misunderstandings that some people had. Although many, many people are still left with a really messed-up view of how AIs work, and they somehow speak with such confidence about these topics. It drives me nuts.
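To make the “Think” tags concrete, here’s a tiny sketch of splitting an R1-style completion into its reasoning trace and its final answer. The sample response text is made up, but the <think>…</think> convention is what these models actually emit:

```python
import re

# Made-up example of an R1-style completion: reasoning inside <think> tags,
# followed by the answer the user actually sees.
response = (
    "<think>The user wants a small list sorted. The built-in sorted() is "
    "enough here; nothing fancier is needed.</think>"
    "Use Python's built-in sorted() function; it covers this case directly."
)

match = re.match(r"<think>(.*?)</think>(.*)", response, re.DOTALL)
if match:
    reasoning = match.group(1).strip()
    answer = match.group(2).strip()
    print("Reasoning trace:", reasoning)
    print("Final answer:", answer)
```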
It’s hard for people who haven’t experienced the loss of experts to understand. I’m not a programmer, but I worked in aerospace engineering for 35 years. The drive to transfer value to execs and other stakeholders by reducing the cost of the people who literally create that value always ends up costing more.
Well, yeah, but those costs are for tomorrow’s executive to figure out; we need those profits NOW.
Those executives act like parasites. They bring no value and just leech the life from the companies.
executives act like parasites
WE MAED TEH PROFITZ!!!1!!1
Which is ironic, since without them the profits would likely soar. Doing bad shit 101 is to pin the consequences of your actions on others and to falsely claim the gains others have produced as your own achievements.
It’s utterly bizarre. The customers lose out by receiving an inferior product at the same cost. The workers lose out by having their employment terminated. And even the company loses out by having its reputation squandered. The only people who gain are the executives and the ownership.
On a more generic scale (whatever that means), we went from coding serious stuff in Ada with contracts and designs and architectures, to throwing everything in the trash while forgetting any kind of pride and responsibility in less than 50 years. AI is the next step in that global engineering enshittification (I hate that word but it’s appropriate).
Imagine a company that fires its software engineers, replaces them with AI-generated code, and then sits back, expecting everything to just work. This is like firing your entire fire department because you installed more smoke detectors. It’s fine until the first real fire happens.
I don’t know. I look at it like firing all your construction contractors after building out all your stores in a city. You might need some construction trades to maintain your stores, and you might need to relocate a store every once in a while, but you don’t need the same construction staff on hand as you did for the initial build-out.
While true, that is a weak analogy. Software rots and needs the constant attention of competent people, or shit stacks up.
I’m not saying you can fire everyone, but the maintenance team doesn’t need to be the size of the development team if the goal is to only maintain features.
It works for a while. Keep a few seniors and everything will be fine. Then you want new features and that’s when shit hits the fan. Want me to add a few buttons? 1 month because I have to study all the random shit that was generated last week.
Twitter and Tumblr are operating on skeleton crews but are able to make changes.
Craigslist is still around even though it hasn’t changed much since the ’90s.
There is an entire industry of companies that buy old MMOs and maintain them at a low cost for a few remaining players.
Southwest Airlines still runs ticketing on a Windows 95 server.
I think you’ll see more companies accept managed decline as a business strategy.
It’s funny you use Southwest as an example here. I flew with them for the first time this year, and it was easily the worst technical experience, from an IT perspective, that I have ever had. Sure, I got from point A to point B, but everything involved with buying the ticket, getting through security, tracking my flight, boarding time, etc. was worse than on every other flight I’ve been on. The app was awful, and basic features like delay notifications or pulling up the digital ticket made an already expensive-as-hell experience way more stressful. Windows 95 isn’t keeping up.
Literally anybody who thought about the idea for more than ten seconds already realized this a long time ago; apparently this blog post needed to be written for the people who didn’t even do that…
You underestimate the dumbassery of Pencil-Pushers in tech companies (& also how genuinely sub-human they can be)
As a software engineer, I’m perfectly happy waiting around until they have to re-hire all of us at consulting rates because their tech stacks are falling the fuck apart <3
I work for a Fortune 500 company.
We just recently lost a principal engineer who built an entire platform over the last four years.
Just before they left, I noticed they were using AI an awful lot. Like…a lot a lot. Like, “I don’t know the answer on a screen share, so I’ll ask ChatGPT how to solve the problem and copy/paste it directly into the environment until it works” a lot.
They got fired for doing unrelated shit.
It’s taken us three months and hundreds of hours from at least five other principal engineers to try to unravel this bullshit, and we’re still not close.
The contributions and architecture scream AI all over them.
Point is, I’ll happily let idiots destroy the world of software, because I’ll make fat bank later as a consultant fixing their bullshit.
There’s also the tribal knowledge of people who’ve worked somewhere for a few years. There are always a few people who just know where or how a particular thing works and why it works that way. AI simply cannot replace that.
I don’t disagree with that, but there are so many “wtf is this shit” moments that defy all logic and known practices.
Like, for example: six different branches of the same repo that deploy to two different environments in a phased rollout. Branches 1-3 are prod, 4-6 are dev. The phases go 3, 1, 2 for prod and 6, 4, 5 for dev. They are numbered as well.
Also, the pipelines create a new bucket on every build, so there are over 700 S3 buckets with varying versions of the frontend…which then get moved into…another S3 bucket with public access.
My personal favorite is the publicly accessible, non-access-controlled Lambdas with hard-coded Lambda invocation URLs in them. Lambda A has a public invocation URL configured instead of using API Gateway, and Lambda B has that invocation URL hard-coded into the deployed source (a rough sketch for auditing exactly this kind of thing is below).
There’s so much negligent work here that I swear they did it on purpose.
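For anyone wondering how you’d even find that kind of thing after the fact, here’s a rough audit sketch, assuming boto3 with credentials that can read Lambda configuration; the functions and region come from whatever account you point it at. It walks the account’s functions and flags any function URL configured with AuthType NONE, i.e. no access control at all:

```python
import boto3
from botocore.exceptions import ClientError

# Rough sketch: flag Lambda functions exposing a function URL with no auth.
# Assumes boto3 is installed and AWS credentials/region are already configured.
lambda_client = boto3.client("lambda")

paginator = lambda_client.get_paginator("list_functions")
for page in paginator.paginate():
    for fn in page["Functions"]:
        name = fn["FunctionName"]
        try:
            url_cfg = lambda_client.get_function_url_config(FunctionName=name)
        except ClientError as err:
            # No function URL configured for this function; skip it.
            if err.response["Error"]["Code"] == "ResourceNotFoundException":
                continue
            raise
        if url_cfg.get("AuthType") == "NONE":
            print(f"PUBLIC: {name} -> {url_cfg['FunctionUrl']}")
```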
Well, also, if the guy was just dumping AI-generated code arbitrarily into your product, that pretty significantly risks the copyright over the entire product the generated stuff was integrated into (meaning anyone can do whatever the fuck they want with it).
You’re not wrong. Unfortunately, that’s not how legal sees it.
Not sure what they’re snorting, but it must be good shit.
I’m an IP attorney who’s been pretty specialized in ML-enabled technologies for a decade now, and I’ve worked in-house for Fortune 500 companies, so I’m pretty familiar with how these queries are often handled, especially at multinationals. There honestly probably isn’t someone in your legal department with all three of: seniority, an understanding of (and habit of keeping up with) the legal nuances, and an understanding of the underlying technologies. In my experience, that overlap is few and far between.
A reason I didn’t see listed: they are just asking for competition. Yes, by all means, get rid of your most talented people, the ones who know how your business is run.
I wonder if there will eventually be a real Butlerian Jihad
Maybe after Herbert’s idiot son dies and someone else gets the rights
This is prophetic and yet as clear as day to anyone who has actually had to rely on their own code for anything.
I have lately focused all of my tech-learning efforts and home-lab experiments on cloud-less approaches. Sure, the cloud is a good idea for scalable, high-traffic websites, but it also sure seems to enable police-state surveillance and extreme vendor lock-in.
It’s really just a focus on fundamentals. But all those cool virtualization technologies that enable ‘cloud’ are super handy in a local system too. Rolling back container snapshots on specific services while leaving the general system unimpacted is useful anywhere.
But it is all on hardware I control. Apropos of the article, the pendulum will swing back toward more focus on local infrastructure. Cloud won’t go away, but more people are realizing that it also means someone else owns your data/your business.
I think they were also suckered in by the supposed lower cost of running services, which, as it happens, isn’t lower at all and is in fact more expensive. But you laid off the datacenter staff, so… pay up, suckers.
Neat toolsets though.
The cloud provides incredible flexibility, scale, and reliability. It is expensive to run 3+ data centers with datacenter staff. If running your own data centers were such a great deal compared to the many nines of reliability the cloud provides, companies would be shifting back en masse at this point.
Oh, no way. It was a year(s)-long process to get to the cloud; then the devs got hooked on all the toys AWS was giving them and got strapped in even further. They couldn’t get out now if they wanted to, not without huge expense and rewriting a bunch of stuff. No CTO is going to die on that hill.
They jumped into the cloud for the same reason they jumped into AI: massive hype. Only the cloud worked. And now a percentage of the profits is all Amazon’s. No app store needed. MuwAHhahahAhahahaa
I was a mf’ing hardcore rider of the tech boom, was a sought-after consultant, and my colleagues and I rode the razor’s edge of what was possible in online gaming for two decades… and I can tell you now, AI presents to creative individuals who have a clue the greatest opportunity ever handed to them. Look at how AI destroys things and “invent” solutions and you’ll pay yourself well.
Now more than ever, a “programmer” is a guy who can plug other people’s modules together and pray it works. Notice that now and git gud at what you do.
You spend your whole workday in meetings bragging about yourself while never actually doing any work, don’t you?
Look at how AI destroys things and “invent” solutions and you’ll pay yourself well.
Yeah, I’m seeing the absolute deluge of AI shovelware games. I know it generates money through sheer volume, but to me that’s just like all those “how to dropship” online courses. You’re one of the worst literal examples of a “waste of resources”.
None of you can hear it. You’re all so afraid. There is OPPORTUNITY EVERYWHERE, but you’re so locked into your script that there’s no talking to any of you. It’s so sad to see you limit yourselves. But in a way it’s revelatory of the truth I’m speaking… the era of being a “programmer” because ya downloaded other people’s work is over, and the path is open to those ready to work and innovate. Good luck, but you don’t need that, because you’ve already decided you’ve lost.
What most people forget is that as a programmer/designer/etc., your job is to take what your client/customer tells you they want, listen to them, and then try to give them what they ACTUALLY NEED, which is something I think needs to be highlighted. Most people making requests to programmers don’t really even know what they want, or why they want it. They had some meeting and people decided, “Yes, we need the program to do X!” without realizing that what they are asking for won’t actually get them the result they want.
AI will be great at giving people exactly what they ask for…but that doesn’t mean it’s what they actually needed…
Great points. Also:
… AI will be great at giving people exactly what they ask for …
Honestly, I’m not even sure about this. With hallucinations and increasingly complex prompts that it fails to handle, it’s just as likely to regurgitate crap. I don’t even know if AI will get to a better state before all of this dev-firing starts to backfire and sours most companies on even touching AI for most development.
Humans talk with humans and do their best to come up with solutions. AI takes prompts and looks at historical human datasets to try to determine what a human would do. It’s bound to run into something novel eventually, especially if there aren’t more datasets to pull in because human-generated development solutions have become scarce.