I’ve always said that as software developers, our long-term job is to program ourselves out of a job. In fact, in the long term EVERYBODY is “cooked” as automation becomes more and more capable. The eventual outcome is that nobody will have to work. AI in its present state isn’t remotely ready to replace programmers, but it can be a very helpful assistant.
It can, but when things get even slightly more complex, being a fast typist is usually more efficient and results in better code.
I guess it really depends on your aspirations for code quality and the complexity involved (yes, it’s good at generating boilerplate). For a one-time-use script I don’t care much about and that can be quickly described in a prompt, I’ll use it.
Working on a big codebase, it doesn’t even occur to me to ask an AI; you just can’t feed it enough context for it to generate meaningful code…
That’s not a hard limit; for example, Google’s Gemini models can handle a 2-million-token context window: https://ai.google.dev/gemini-api/docs/long-context
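For a back-of-the-envelope sense of whether a codebase even fits in such a window, a rough sketch like the following works. It uses the common ~4-characters-per-token heuristic rather than a real tokenizer, so treat the numbers as an estimate only; the file extensions and window size are assumptions.

```python
import os

# Rough heuristic: ~4 characters per token for English text and code.
# This is an approximation, not a real tokenizer.
CHARS_PER_TOKEN = 4
CONTEXT_WINDOW = 2_000_000  # e.g. a 2M-token long-context model

def estimate_tokens(root, exts=(".py", ".js", ".go", ".java", ".ts")):
    """Walk a source tree and estimate its total token count."""
    total_chars = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(exts):
                path = os.path.join(dirpath, name)
                try:
                    with open(path, encoding="utf-8", errors="ignore") as f:
                        total_chars += len(f.read())
                except OSError:
                    continue  # unreadable file, skip it
    return total_chars // CHARS_PER_TOKEN

if __name__ == "__main__":
    tokens = estimate_tokens(".")
    print(f"~{tokens} tokens; fits in window: {tokens <= CONTEXT_WINDOW}")
```

Even when the raw source fits, that says nothing about whether the model can actually use all that context well, which is the point being debated below.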
Ugh, I tried the Gemini model and I’m not too happy with the code it came up with; there are a lot of intricacies and concepts that the model doesn’t grasp well enough, IMO. That said, I’ll keep reevaluating it; converting large chunks of code often works OK…
I actually don’t write code professionally anymore, so I’m going on what my friend says. According to him, he uses ChatGPT every day and it’s a big help. Once he told it to refactor some code and it used a really novel approach he wouldn’t have thought of. He showed it to another dev, who said the same thing: huh, that’s a weird way to do it, but it worked. In general, though, you really can’t just tell an AI “Create an accounting system” or whatever and expect coherent working code without thoroughly vetting it.
Management can’t blame AI when shit hits the fan, though. We’ll be fine. Either that or everything just collapses back into dust, which doesn’t sound so bad in the current times.
That’s the beauty of AI, though: AI shit rolls uphill, until it hits the manager who imposed the decision to use it (or their manager, or even their manager’s manager).