• AA5B@lemmy.world · 1 year ago

    It’s hard to picture “128-bit computing” in a general sense ever becoming a thing. It’s so far beyond anything we can realistically use now, and it would be inefficient/wasteful for most ordinary tasks.
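
    To put some rough numbers on “so far beyond anything we can realistically use” (this is just arithmetic, not a claim about any particular hardware):

    ```python
    # Rough arithmetic only: how big 64-bit vs 128-bit address spaces are.
    addr_64 = 2 ** 64    # bytes addressable with 64-bit pointers (~16 EiB)
    addr_128 = 2 ** 128  # bytes addressable with 128-bit pointers

    print(f"64-bit:  {addr_64 / 2**60:.0f} EiB")
    print(f"128-bit: {addr_128 / addr_64:.2e} times the entire 64-bit space")
    ```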

    Put this together with the physical limits on Moore’s law and the current direction of (at least) mobile computing…

    I picture more use of multi-core, specialty cores, and system-on-a-chip designs. Some workloads, like video, benefit from wide lanes, huge bandwidth, and operating on many elements at once, and we have video cores with architectures better suited to that (toy sketch below). Most workloads can be handled by a standard compute core, where moving up to 128-bit is unnecessary, maybe counterproductive. For efficiency cores, like some mobile chips already have, 128-bit would be the wrong/bad/inefficient choice. We’ll certainly have more AI cores, but I have no idea what they need.
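
    Loosely illustrating the “wide lanes / many things at once” point with a toy NumPy sketch (my own example, nothing vendor-specific, and not a real video pipeline):

    ```python
    import numpy as np

    # Toy sketch of "wide lanes": brighten an 8-bit frame by operating on
    # all pixels at once rather than one scalar at a time. Vectorized math
    # like this is what wide SIMD/GPU data paths are built for; it doesn't
    # require 128-bit general-purpose pointers or integers.
    frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

    brightened = np.clip(frame.astype(np.uint16) + 40, 0, 255).astype(np.uint8)
    ```

    The wide data path lives in the specialty hardware (vector units, GPU, video blocks); the general-purpose core just hands the work off.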

    If you can forgive the Apple boosterism and take this as a general trend, note the focus on fast interconnects between many specialty cores. Each core has a different architecture and different needs.

    https://www.apple.com/newsroom/2023/06/apple-introduces-m2-ultra/

    • esty@lemmy.ca · 1 year ago

      This isn’t even an Apple thing; isn’t it just how SoCs work in general? Definitely something Intel and AMD should be doing, though (if they aren’t already, I honestly don’t know), especially with hardware decoders, ML cores, and the like.

      • AA5B@lemmy.world · 1 year ago
        Yes, this is how SoCs can work. I think it’s a good description of one specific company emphasizing a balance of different cores for different jobs, rather than trying to make many general-purpose cores that attempt to do everything. Just don’t get distracted by the marketing language, or by the fact that this is a company people love to hate.