• pdxfed@lemmy.world · 2 months ago

    Admired AMD since the first Athlon, but never made the jump for various reasons, mostly availability. Just bought my first laptop (or any computer) with an AMD chip in it last year, a Ryzen 7 with a Radeon 680M. There is no discrete graphics card, and the onboard GPU has performance comparable to a discrete Nvidia GTX 1050. In a 13" laptop. The AMD chip far surpassed Intel’s onboard GPU performance, and an equivalent Intel laptop was ~30% more expensive from any company. Fuck right off.

    Why doesn’t this matter to Intel? Part of why they always held mindshare and a near monopoly is their deals with OEM computer makers: HP, Dell, etc. It was almost impossible to find an AMD premade desktop, and laptops were out of the question.

    • trolololol@lemmy.world · 2 months ago

      I believe my first AMD was a desktop Athlon around 2000. I needed a fast machine to crunch my undergraduate thesis, and that was the most cost-effective option.

      In recent years I couldn’t get AMD in a strong machine; I went with an XPS and there were no AMD options. Linux is a requirement for me, so that narrowed down my choices a lot. As you’d expect, battery life is horrible, compounded by being forced to pay for an NVIDIA card I didn’t choose, one with poor drivers and power management to boot.

      x86 and its successor, the AMD64 instruction set, are a Pandora’s box and a polished turd, hiding things like micro-op translation, a full-blown small OS running in parallel with and independently of the BIOS, and other nefarious over-engineering practices that are at the root of Spectre and Meltdown.
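
      To make that concrete, here is a minimal sketch of the classic Spectre v1 bounds-check-bypass gadget (the array names follow the original Spectre paper; it’s purely illustrative):

      ```c
      /* Minimal sketch of the Spectre v1 (bounds-check bypass) gadget.
       * array1/array2 names follow the original Spectre paper. */
      #include <stddef.h>
      #include <stdint.h>

      uint8_t array1[16];
      size_t  array1_size = 16;
      uint8_t array2[256 * 512];   /* probe array: one cache line per byte value */

      void victim_function(size_t x) {
          /* The branch predictor can be trained so that this check is
           * speculatively bypassed for an out-of-bounds x... */
          if (x < array1_size) {
              /* ...and the byte read out of bounds then leaves a cache-timing
               * footprint through the dependent load from array2. */
              volatile uint8_t tmp = array2[array1[x] * 512];
              (void)tmp;
          }
      }
      ```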

      What I mean is I prefer AMD over Intel, but I prefer RISC-V over both.

  • gravitas_deficiency@sh.itjust.works · 2 months ago

    Pretty incisive article, and I agree.

    In retrospect, I think the marketing/sales/finance corporate leadership idiocy that’s intensified over the last couple of decades is the single biggest contributor to the deep sense of frustration and ennui I’ve developed working as a software engineer. It just seems like pretty much fucking nobody in the engineering management sphere these days actually values robust, carefully and thoughtfully designed stuff anymore - or more accurately, if they do, the higher-ups will fire them for not churning out half-finished bullshit.

    • Sneezycat@sopuli.xyz · 2 months ago

      That’s why I like my Steam Deck so much: the design is so thoughtful and adapted to its own needs, and unfortunately that’s a rare sight lately (not just in technology).

      • hark@lemmy.world · 2 months ago

        Would’ve probably turned out differently if Valve were beholden to shareholders and the never-ending hunger for a higher stock price. The push to drive “shareholder value” is one of the most destructive forces, if not the most destructive force, we’re dealing with these days.

  • bruhduh@lemmy.world · 2 months ago

    While I have used AMD since the FX Bulldozer days and am currently using a laptop with a Ryzen 7 5700U and really enjoying it, the downfall of Intel saddens me because they keep GPU prices down. I mean, would AMD and Nvidia offer 16GB GPUs in the $300 price range if Intel hadn’t brought the A770 16GB to the table for $300 first? P.S. AMD always deserved first place and still deserves it now, while Intel is good as the catching-up player that keeps prices down.

  • Possibly linux@lemmy.zip · 2 months ago

    Intel GPUs are still ahead in some ways. They need to work on getting Intel GPUs into datacenters.

    I also like that they are working on creating a more open AI hardware platform.

  • Magister@lemmy.world · 2 months ago

    I tried to always use AMD: 386SX-33, 486DX4/100, Duron 1000, Athlon XP 2200. Then I went into laptop life with Intel, but since COVID/WFH I’ve gone back to AMD; I have a 5600H in a mini PC.

    • sugar_in_your_tea@sh.itjust.works · 2 months ago

      That has pretty much nothing to do with Intel’s decline though. Losing the enthusiast market to AMD was a small blow; the bigger blow was losing a lot of the server market to AMD. And now AMD is starting to dominate in pretty much every CPU market there is, outside of the very low power devices where ARM is dominant and expanding.

    • sugar_in_your_tea@sh.itjust.works · 2 months ago

      Why? AMD doesn’t make phone chips, yet they’re dominating Intel. Likewise for NVIDIA, who is at the top of the chip maker list.

      The problem isn’t what market segments they’re in, the problem is that they’re not dominant in any of them. AMD is better at high-end gaming (X3D chips especially), workstations (Threadripper), and high-performance servers (Epyc), and they’re even better in some cases on power efficiency. Intel is better at the low end generally, but that’s not a big market, and it’s shrinking (e.g. people moving to ARM). AMD has been chipping away at those, one market segment at a time.

      Intel entering phones will end up the same way as them entering GPUs: they’ll have to target the low end of the market to get traction, and they’re going to have a lot of trouble challenging the big players. Also, x86 isn’t a good fit there, so they’d also need to break into the ARM market as well.

      No, what they need is to execute well in the spaces they’re already in.

        • sugar_in_your_tea@sh.itjust.works · 2 months ago

          Yes, and 5 years ago, they had very little of it. I’m talking about the trajectory, and AMD seems to be getting the lion’s share of new sales.

          • sfantu@lemmy.world · 2 months ago

            I hope for the best for AMD … they make great products… I own several AMD machines.

            But ARM and the trust built around it … continues to eat their market … I see Intel as having more of a fighting chance.

            IF THEY RESTRUCTURE … or whatever the fuck their strategy is.

            • sugar_in_your_tea@sh.itjust.works · 2 months ago

              I don’t really see ARM as having an inherent advantage. The main reason Apple’s ARM chips are eating x86’s lunch is because Apple has purchased a lot of capacity on the next generation nodes (e.g. 3nm), while x86 chips tend to ship on older nodes (e.g. 5nm). Even so, AMD’s cores aren’t really that far behind Apple’s, so I think the node advantage is the main indicator here.

              That said, the main advantage ARM has is that it’s relatively easy to license it to make your own chips and not involve one of the bigger CPU manufacturers. Apple has their own, Amazon has theirs, and the various phone manufacturers have their own as well. I don’t think Intel would have a decisive advantage there, since companies tend to go with ARM to save on costs, and I don’t think Intel wants to be in another price war.

              That’s why I think Intel should leverage what they’re good at. Make better x86 chips, using external fabs if necessary. Intel should have an inherent advantage in power and performance since they use monolithic designs, but those designs cost more than AMD’s chiplet approach. Intel should be the premium brand here, with AMD trailing behind, but their fab limitations are forcing them to jack up clock speeds (and thus kill their power efficiency) just to stay competitive.

              In short, I really don’t think ARM is the right move right now, unless it’s selling capacity at their fabs. What they need is a really compelling product, and they haven’t really delivered one recently…

              • mihies@kbin.social · 2 months ago

                It’s not just the node, it’s also the design. If I remember properly, ARM has a fixed instruction length, which helps a lot with decoding. Anyway, Apple’s M CPUs are still way better when it comes to the perf/power ratio.
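
                A rough illustration of that point, with the machine-code bytes written from memory, so treat them as a sketch rather than a reference:

                ```c
                /* Fixed vs. variable instruction length, shown with hand-encoded
                 * machine code. AArch64 instructions are always 4 bytes, so a
                 * decoder always knows where the next one starts; x86-64 encodings
                 * range from 1 to 15 bytes and must be length-decoded first. */
                #include <stdio.h>

                /* AArch64: add x0, x1, x2 - always 4 bytes */
                static const unsigned char a64_add[]  = {0x20, 0x00, 0x02, 0x8b};

                /* x86-64: add rax, rbx (3 bytes) vs. movabs rax, imm64 (10 bytes) */
                static const unsigned char x64_add[]  = {0x48, 0x01, 0xd8};
                static const unsigned char x64_movq[] = {0x48, 0xb8, 1, 2, 3, 4, 5, 6, 7, 8};

                int main(void) {
                    printf("aarch64 add: %zu bytes\n", sizeof a64_add);
                    printf("x86-64 add:  %zu bytes\n", sizeof x64_add);
                    printf("x86-64 mov:  %zu bytes\n", sizeof x64_movq);
                    return 0;
                }
                ```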

                • sugar_in_your_tea@sh.itjust.works · 2 months ago

                  Yes, it’s certainly more complicated than that, but the lithography is a huge part since they can cram more transistors into a smaller area, which is critical for power savings.

                  I highly doubt instruction decoding is a significant factor, but I’d love to be proven wrong. If you know of a good writeup about it, I’d love to read it.