I’m a dev. I’ve been one for a while. My boss does a lot of technology watch. He brings in a lot of cool ideas and information. He’s down to earth. Cool guy. I like him, but he’s now convinced that AI LLMs are about to swallow the world, and the pressure to inject this stuff everywhere in our org is driving me nuts.

I enjoy every part of making software, from discussing with the clients and the future users, to coding, to deployment. I am NOT excited at the prospect of transitioning from designing an architecture and coding it to ChatGPT prompting. This sort of black-box magic irks me to no end. Nobody understands it! I don’t want to read yet another article about how an AI enthusiast is baffled at how good an LLM is at coding. Why are they baffled? They have “AI” twelve times in their bio! If they don’t understand it, who does?!

I’ve based twenty years of my career on being attentive, inquisitive, creative, and thorough. By now, an in-depth understanding of my tools, and more importantly of my work, is basically an urge.

Maybe I’m just feeling threatened, or turning into “old man yells at cloud”. If you ask me, I’m mostly worried about my field becoming uninteresting. Anyway, that was the rant. TGIF, tomorrow I touch grass.

  • Manticore@beehaw.org · 1 year ago

    AI can code assist; it’s quite helpful for that. Predictive text, learning a less familiar language, converting pseudocode, etc.

    But it couldn’t possibly replace senior developers long-term. It just looks new and exciting, especially to people who don’t truly understand how it works. We still need to have human developers capable of writing their own new code.

    1. AI is entirely derivative; it’s just copying the human devs of yesteryear. If AI does the majority of coding, then it becomes incapable of learning, thus necessitating human coders anyway. It will also only generate solutions to broad-strokes problems it already has in its dataset, or convert pseudocode into functional code (which still requires a dev who knows enough to write pseudocode).

    2. It also currently has no way of validating what it writes. It’s trying to replicate what our writing looks like contextually; it doesn’t comprehend it. If it ever starts training on its own output as it ages, it will stagnate and require human review, which means needing humans who understand code. And that’s not counting the poor practices it will pick up because so many devs are inconsistent about things like writing comments, documentation, or unit tests. AI doesn’t have its own bias, but it inevitably learns to imitate ours.

    3. And what about debugging? When the AI writes something that breaks, who do you ask for help? The AI doesn’t comprehend the context of the code it’s reading if you paste it back; it doesn’t remember writing it. You need people who understand how the code works to be able to recognise why it might be breaking.

    AI devs are the fast food of coding. It will never match the quality of work from an experienced professional. But if you’re an awful cook, it still makes it fast and easy to get a sad, flat cheeseburger.

    I’ve worked with devs who are the equivalent of line cooks and are also producing sad, flat cheeseburgers: code of poor quality that still sees production because the client doesn’t know any better. IMO, those are the only devs that need to be concerned, because those are the ones that are easy to replace.

    If AI coding causes any problems within the job market for devs, it will be that it replaces graduate/junior developers so well that fewer devs get the mentoring or experience to become seniors, and the demand for seniors will ramp up significantly. It seems more likely that developers will split into two separate specialisations than that our single track will be replaced.