He should’ve asked ChatGPT instead. That’s how “modern developers” seem to get by.
I’m not a developer, and I don’t work in a technology field anymore, but I used to. I know Linux sysadmin work, security, and a very small amount of Python.
ChatGPT has allowed me to “write” code that I use every day in my business. So, I’m not a developer, but it lets me do things I otherwise would not be able to.
My business is still too small to even consider hiring a developer, so it’s allowing me to grow my business.
I’m just writing this to point out that “devs” are not the only people using ChatGPT to write code.
ChatGPT and other LLMs are fantastic technical task assistants, but, and this is a big but, you need to treat their work the same way you’d treat work from a new intern: verify the output before you trust it.
It’s just making some front-end stuff for other people to use to access PDFs on my server that need some level of protection and access control.
So it’s been pretty easy to verify.
I’m too paranoid about trusting it or even myself to write code that could have irreversible effects.
Thanks for the advice🙏
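For readers curious what that kind of access check boils down to, here is a minimal, stdlib-only sketch. The token set and the in-memory “files” are hypothetical stand-ins for illustration, not the commenter’s actual setup; a real deployment would check sessions or credentials and read files from disk.

```python
# Minimal sketch of access-controlled PDF fetching, as described above.
# VALID_TOKENS and PDF_FILES are hypothetical stand-ins, not real config.
from http import HTTPStatus

VALID_TOKENS = {"example-token"}                  # stand-in for real credentials
PDF_FILES = {"report.pdf": b"%PDF-1.4 example"}   # stand-in for files on a server


def fetch_pdf(name: str, token: str):
    """Return (status, body) for a PDF request, enforcing access control first."""
    if token not in VALID_TOKENS:
        return HTTPStatus.FORBIDDEN, b""   # no valid token: refuse outright
    if name not in PDF_FILES:
        return HTTPStatus.NOT_FOUND, b""   # authorized, but no such file
    return HTTPStatus.OK, PDF_FILES[name]
```

Keeping the authorization check as a small, separate function like this makes it easy to exercise by hand, which is what makes output like this straightforward to verify.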
Could just go full Luddite and pull his tooth by tying string to it like developers who refuse to use AI.
Not using the nonsense machine doesn’t mean we have to abandon the whole rest of the toolbox.
Jumping on the new shiny thing and relying on it over all the other tried and tested tools is the core recurring mistake in development.
What? This fantastical scenario has never happened. Name one other new development tool that has led to the sort of issues you seem to think will happen. Debuggers? Auto-complete? Syntax highlighting? Better build tooling?
Hey man, maybe re-read my comment. It’s not long.
I never said a single tool causes issues. I said abandoning existing tools to only use the new thing is a problem.
See people who want to only use the newest frameworks, to the point of re-building projects when they come out.
See people who fixate on a single design pattern and insist on using it in every application.
And I said - when the hell has that ever happened? Ever?
I’m talking about development tools, not platforms and libraries. An LLM is not replacing a framework. It’s not replacing… anything, really.
So googling how to do a root canal is cool with you?
Go sit in his chair. Moron.
No.
Not even close.