https://archive.ph/hMZPi

Remember when tech workers dreamed of working for a big company for a few years, before striking out on their own to start their own company that would knock that tech giant over?

Then that dream shrank to: work for a giant for a few years, quit, do a fake startup, get acqui-hired by your old employer, as a complicated way of getting a bonus and a promotion.

Then the dream shrank further: work for a tech giant for your whole life, get free kombucha and massages on Wednesdays.

And now, the dream is over. All that’s left is: work for a tech giant until they fire your ass, like those 12,000 Googlers who got fired six months after a stock buyback that would have paid their salaries for the next 27 years.

We deserve better than this. We can get it.

  • mishimaenjoyer@kbin.social · 1 year ago

    Imagine getting replaced first by some kid out of a garage, then by Indian code farms, and now by AI developed by the grown-up kids from said garage and trained by Indian code farms.

    • expr@programming.dev · 1 year ago

      So tired of this rhetoric. AI isn’t replacing any software engineering jobs, nor could it. It’s a joke, quite frankly.

      • Lexi Sneptaur@pawb.social · 1 year ago

        They set up a ChatGPT-based bot at my work just to help our support agents find information faster. It provides straight-up factually false information 80% of the time. A solid 30% of the time, it says the opposite of the truth. It's completely worthless at all times.

          • Lexi Sneptaur@pawb.social · 1 year ago

            I am seeing adoption being rather slow. Some very creative uses have been found though. It’s very good at summarizing content, for example. Especially in a context where it doesn’t matter too much how accurate it is, like summarizing a speech-to-text transcript of a phone call.
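
            A minimal sketch of that kind of transcript summarization, assuming the OpenAI Python SDK; the model name, prompt, and file path are illustrative placeholders, not details from the comment:

            # Hedged sketch: summarize a speech-to-text transcript of a phone
            # call with an LLM. Assumes the OpenAI Python SDK is installed and
            # OPENAI_API_KEY is set; the model name is a made-up choice.
            from openai import OpenAI

            client = OpenAI()  # picks up OPENAI_API_KEY from the environment

            def summarize_call(transcript: str) -> str:
                """Return a short bullet-point summary of a call transcript."""
                response = client.chat.completions.create(
                    model="gpt-4o-mini",  # illustrative model choice
                    messages=[
                        {"role": "system",
                         "content": "Summarize this call transcript in 3-5 bullet points."},
                        {"role": "user", "content": transcript},
                    ],
                )
                return response.choices[0].message.content

            # Example use (hypothetical file name):
            # print(summarize_call(open("call_transcript.txt").read()))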

      • TeenieBopper@lemmy.world · 1 year ago

        I was listening to a podcast about AI. I think it was one of Ezra Klein's. He was telling a story he heard about those weird virtual reality games from the '90s or early Aughts. People shat on those games because they were awful and clunky and not very good, so that shitting was well deserved. But one guy was like "yeah, that's all true. But this is the worst it's going to be. The next iteration isn't going to be worse than this."

        And that's where AI is now. It's powerful and already a threat to certain jobs. GPT 3/4 may be useless for software engineering jobs now (I'd argue it's not - I work in a related field and use it about daily), but what about GPT 5? 6? 10?

        I'm not as doom-and-gloom on AI as I was six months ago, but I think it's a bit silly to think that AI isn't going to cause massive upheaval across all industries in the medium to long term.

        But also, for the record, I’m less worried about AI than I am about AI in the hands of Capitalism.

        • Shadywack@lemmy.world · 1 year ago

          But also, for the record, I’m less worried about AI than I am about AI in the hands of Capitalism.

          Let’s just say it, AI in the hands of the 1% who use it to become the 0.001%.

        • Lemmington Bunnie@aussie.zone · 1 year ago

          I love how people always argue this point, “oh it sucks and can’t replace x”. Computer animation sucked once as well, but look at what can be done with it now.

          AI sucks in its current state. It will evolve and improve, and put poor little uneducated admin people like myself completely out of work. I’m learning what I can, but I’m not very bright, and neither is my future!

      • FaeDrifter@midwest.social · 1 year ago

        It was impossible for a computer to be smart enough to beat grandmasters at chess, until it wasn't. It was impossible to beat Go masters at Go, until it wasn't.

        No software engineering jobs are getting replaced this year or next year. But considering the rapid pace of AI development, and considering how much code development is just straight-up redundant… looking 20 years out, the picture isn't so bright.

        It would be way better to start putting AI legislation in place this year. That or it’s time to start transitioning to UBI.

        • expr@programming.dev · 1 year ago

          I am an actual (senior) software engineer, with a background in ML to boot.

          I would start to worry if we were anywhere close to even dreaming of how AGI might actually work, but we’re not. It’s purely in the realm of science fiction. Until you meet the bar of AGI, there’s absolutely no risk of software engineering jobs being replaced.

          Go or Chess are games with a fixed and simple ruleset and are very suited to what computers are really good at. Software engineering is the art of making the ambiguous and ill-defined into something entirely unambiguous and precisely defined, and that is something we are so far from achieving in computers it’s not even funny. ML is ultimately just applied statistics. It’s not magic, and it’s far from anything we would consider “intelligence”.

          I do think we need legislation targeting ML, but not because of “omg our jobs”. Rather we need legislation to combat huge tech companies vacuuming any and all data on the general public and using that data to manipulate and control the public.

          Also, LOL at “how much code development is straight up redundant”. If you think development amounts to just writing a bunch of boilerplate as though we were some kind of assembly line putting together the same thing over and over again, you’re sorely mistaken.

          • Not_mikey@lemmy.world · 1 year ago

            I think you overestimate what the average software developer is doing.

            Do I think that in 10 years AI will be patching the Linux kernel or optimizing AWS scaling functions? No. Do I think it will be creating functional CRUD apps with Django or Ruby on Rails? Yes, and I think that's what a large number of software developers are doing. Even if it's not a majority, a lot of the more precarious developers without a CS degree will probably lose their jobs. Not every developer is a senior engineer working on ML.
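
            As a rough picture of the kind of routine Django CRUD code in question, here is a minimal hedged sketch; the Note model, field names, and URL names are invented for illustration, and everything is condensed into one module rather than split across a real app's models.py, views.py, and urls.py:

            # Hypothetical example of the boilerplate-heavy CRUD work described
            # above: a model plus Django's generic class-based views and URL
            # wiring. Names are illustrative only.
            from django.db import models
            from django.urls import path, reverse_lazy
            from django.views import generic

            class Note(models.Model):
                title = models.CharField(max_length=200)
                body = models.TextField(blank=True)
                created = models.DateTimeField(auto_now_add=True)

            class NoteListView(generic.ListView):
                model = Note

            class NoteCreateView(generic.CreateView):
                model = Note
                fields = ["title", "body"]
                success_url = reverse_lazy("note-list")

            class NoteUpdateView(generic.UpdateView):
                model = Note
                fields = ["title", "body"]
                success_url = reverse_lazy("note-list")

            class NoteDeleteView(generic.DeleteView):
                model = Note
                success_url = reverse_lazy("note-list")

            urlpatterns = [
                path("notes/", NoteListView.as_view(), name="note-list"),
                path("notes/new/", NoteCreateView.as_view(), name="note-create"),
                path("notes/<int:pk>/edit/", NoteUpdateView.as_view(), name="note-update"),
                path("notes/<int:pk>/delete/", NoteDeleteView.as_view(), name="note-delete"),
            ]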

          • FaeDrifter@midwest.social · 1 year ago

            It’s purely in the realm of science fiction.

            This isn't proof of anything; I would just like to point out that a lot of science fiction has become reality in the last few decades.

            Go or Chess are games with a fixed and simple ruleset

            At the end of the day, what is a computer except a machine with a fixed and simple ruleset: logic gates?

            ambiguous and ill-defined into something entirely unambiguous and precisely defined, and that is something we are so far from achieving in computers it’s not even funny

            You don't need AI to write you perfect C or JavaScript or HTML. You just need it to create an interface that lets an end user make the computer do what they want. I predict the AI itself won't write the languages; it will tend to replace them. That's many orders of magnitude more computationally expensive, but the hardware is quickly becoming cheaper than paying software engineers.

            If you think development amounts to just writing a bunch of boilerplate as though we were some kind of assembly line putting together the same thing over and over again, you’re sorely mistaken.

            Obviously not; that's why libraries, OOP, and frameworks exist. I'm aware, and I'm not pretending I have anything to teach you about it either.

            And I'll take the L if you have insider knowledge that there's a requirement for massive creativity behind the scenes, in widespread fundamental overhauls of the way software works. But AFAIK, the fundamentals of code haven't changed in decades. The way users interact hasn't changed much since smartphones became standard. I don't see a capitalistic incentive to pay for lots of new creativity instead of just making usable products.

      • primbin@lemmy.one · 1 year ago

        Now that I use GitHub Copilot, I can work more quickly and learn new frameworks with less effort. Even in their current form, LLMs allow programmers to work more efficiently, and thus can replace jobs. Sure, you still need developers, but fewer of them.
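
        A hedged illustration of the workflow being described: the developer writes a comment and a signature, and the assistant fills in the body. The suggestion below is a plausible made-up completion, not captured Copilot output:

        # Illustration of comment-driven completion. The function body is the
        # kind of suggestion an assistant like Copilot might propose; it is a
        # made-up example, not real Copilot output.
        from datetime import date

        # Parse an ISO-8601 date string and return the number of days until
        # that date (negative if the date is in the past).
        def days_until(iso_date: str) -> int:
            target = date.fromisoformat(iso_date)
            return (target - date.today()).days

        # The human's job shifts toward reviewing the suggestion, e.g. deciding
        # how bad input should be handled (date.fromisoformat raises ValueError).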

        • expr@programming.dev · 1 year ago

          Learning frameworks has never been hard, and frankly it does not make up the majority of a developer's job. Maybe you do it while onboarding. Big whoop. Any good developer can do that fairly easily, and LLMs are entirely superfluous. Worse yet, since they are so commonly confidently incorrect, you have to constantly check whether their output is even correct. I'd prefer to just read the documentation, thanks.

          A mature engineering organization is not pumping out greenfield projects in new languages/frameworks all the time. Greenfield is usually pretty rare, and when you do get a greenfield project, it's supposed to be done using established tools that everyone already knows.

          A tiny fraction of a developer's job is actually writing code. Most of it is the soft skills necessary to navigate ambiguous requirements and drive a project to completion. And when we do actually program, it's much more reading code than writing code, generally to gain enough understanding of the system in order to make a minor change.

          LLMs are highly overrated. And even if it does manage to produce something useful, there’s much more to a codebase itself. There’s the socialization of knowledge around it and the thought process that went into it, none of which you gain when using an LLM. It’s adequate for producing boilerplate no one reads anyway, but that’s such a small fraction of what we even do (and hopefully, you can abstract away that boilerplate so you’re not writing it over and over again anyway).

      • Shadywack@lemmy.world · 1 year ago

        Not yet, but would you agree that businesses desire the ability to automate software engineering and reduce developer headcount by demanding an AI-supplemented development workflow?

        • expr@programming.dev · 1 year ago

          Sure, just like businesses have always wanted “no-code” solutions to their problems to cut out the need for software engineers. We all know how that turned out. There was no threat then, and there’s no threat now.

          • AstridWipenaugh@lemmy.world · 1 year ago

            AI coding is just another tool developers have at their disposal now. It will just raise the bar for expected output. I expect within a few years it will be popular to describe a process, have an AI tool spit out some intern-grade hot mess that maybe compiles, then have a junior developer fix it and a senior developer write the custom/complex parts. If the AI is good enough, it'll be a significant time saver for it to get you more than halfway to done.

            It could even be tamed with a test-driven development approach. Write a bunch of good tests and have the AI generate code that passes the tests. What could possibly go wrong… lol
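
            A minimal sketch of that test-first idea, assuming pytest; the slugify function and its spec are invented for illustration, and the implementation stands in for whatever an AI tool would be asked to generate:

            # Hedged sketch of "write the tests, let the AI satisfy them".
            # The tests pin down the behaviour; slugify() is a placeholder for
            # the generated code. Names and spec are invented for illustration.
            import re

            def slugify(title: str) -> str:
                """Candidate implementation (human- or AI-written)."""
                slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
                return slug.strip("-")

            def test_lowercases_and_hyphenates():
                assert slugify("Hello World") == "hello-world"

            def test_strips_punctuation_and_edges():
                assert slugify("  C++ & Rust!  ") == "c-rust"

            def test_empty_input():
                assert slugify("") == ""

            Run with pytest once the tests and the generated code live in the same module; the tests, not the prompt, become the spec the AI output has to satisfy.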

            • expr@programming.dev · 1 year ago

              I find it highly overrated in terms of productivity in general, particularly when writing anything remotely non-trivial/company-specific.

              There's also the massive issue of licensing/IP/etc. Any company that's not full of dumbasses should recognize the risk and liability involved and stay the fuck away.