> I keep hearing this assertion, that GPT can be wrong, therefore it’s an unworkable technology.
This is a straw man, I did not say any such thing. I am just pointing out the limitations that people like the author of this article seem to be blissfully unaware of.
Also I would argue that your premise of AI vs a Jr eng is pretty bad. Junior engineers are not writing things to 85% correctness. If they are, they should be let go basically immediately. That's a 15% error rate. I would posit that even the worst human programmers have error rates well below 1% for code that actually ships.
>...even the worst human programmers have error rates well below 1% FOR CODE THAT ACTUALLY SHIPS.
I added that emphasis. You should realize that shipping code is the result of teams, and the kind of engagement ChatGPT is demonstrating will replace most of your fellow human team members; the only human input left may be "putting it all together," which is a top-level job task with limited employment opportunity. I worry that the code I am already able to generate, with real results, as a non-programmer technician, is quite scary. It is sufficient.
Simply put: the UX/UI here is too addictive and too capable not to be earth-shattering. But this is just an amateur opinion, and certainly "creativity" is already (and always was) an "INhuman" attribute, limited to only the rarest minds...
1. Projects with large code bases that I maintain; the configs alone run to some 20k lines of YAML files
2. Prototyping for clients with smaller code bases and varying requests, with projects needing to be adapted for each client
3. Implementing the research results of a research institute
In all that I have seen and worked on, I still don't see a single use case for GPT-assisted programming. Even for snippets, I have a large repository of Lua snippets for Neovim and cookiecutter templates that I have built up over the years.
You are an extremely sophisticated and well-trained crafts-person, obviously.
Now: IMAGINE ACCOMPLISHING THAT without ever having been able to hold an "entry level position" at whichever creative houses / apprenticeships contributed to your current sophistication.
YES: if you are "top tier" educated and know how to communicate with these GPT systems, you can (as a lawyer, for example) replace your entire law clerk staff, save one fast typist who is also a good question-asker.
I have found in my previous six weeks of prompt crafting that it has helped me in so many areas of my life BEYOND TECHNICAL. For whatever odd reason, of the smartest people I know, it's the most technical who seem to have a problem with this rapidly changing world: one in which you no longer need 50% of your labor force for routine entry-level (and probably mid-level, too) work in any job that currently just requires you to sit in front of a computer screen without any executive authority. Literally, every single one of those jobs isn't "GONE"; it's just that for all the stuff you currently pay younger people to do and learn on the job, you don't need nearly as many of them to chew through datasets. I am around 40 and have two lawyer brothers, and what they were able to accomplish in law school pales in comparison with what the ChatGPT systems I've seen, used, and benefited from can do (again: outside of technical areas of expertise). OF COURSE IT GETS THINGS WRONG; so do humans. And it is learning to ask better questions, literally by the thousands, every second of every day.
The elitist mentality of "OH BUT MY JOB IS SECURE" is so silly, because what do you do when nobody else is able to support themselves or help you? I often ask my clients: if Reagan's policies were "so good," then how come it's "so hard to find good help"? That is a question I have yet to ask AI, but it is a good one, and probably not answerable in even just a few paragraphs.
Have a great day; be grateful for what you have; and know that your Social Security (if you are in the US) will be non-existent (for real, this time IS different), because it is already so out of whack that the coming massive unemployment issues WILL affect your life.
The previous few tens of thousands of workers that a few FAANGs disposed of pre-Christmas are just the beginning, folks!