Panther Lake and Nova Lake laptops will return to traditional RAM sticks

  • Riskable@programming.dev · 98 points · 19 days ago

    Gelsinger said the market will have less demand for dedicated graphics cards in the future.

    No wonder Intel is in such rough shape! Gelsinger is an idiot.

    Does he think that the demand for AI-accelerating hardware is just going to go away? That the requirement of fast, dedicated memory attached to a parallel processing/matrix-multiplying unit (aka a discrete GPU) is just going to disappear in the next five years‽
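
    For a rough sense of scale (the numbers below are illustrative assumptions, not benchmarks): generating a single token from a large model means streaming essentially the whole weight set past the compute units, so memory bandwidth, not raw FLOPS, is usually the ceiling.

        # Back-of-envelope: why AI inference wants fast dedicated memory.
        # All figures are illustrative assumptions, not measurements.
        weights_gb = 14        # e.g. a ~7B-parameter model at 16-bit precision
        pools = {
            "dual-channel DDR5 (system RAM)": 80,   # GB/s, rough
            "mid-range discrete GPU GDDR6": 500,    # GB/s, rough
        }
        # Each generated token streams roughly the whole weight set once,
        # so bandwidth puts an upper bound on tokens per second.
        for name, bw_gbps in pools.items():
            print(f"{name}: ~{bw_gbps / weights_gb:.0f} tokens/s upper bound")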

    The board needs to fire his ass ASAP and replace him with someone who has a grip on reality. Or at least someone who has some imagination of how the future could be.

    • TheGrandNagus@lemmy.world · 71 points · 19 days ago

      Gelsinger said the market will have less demand for dedicated graphics cards in the future.

      Reminds me of decades ago when Intel didn’t bother getting into graphics because they said pretty soon CPUs would be powerful enough for high-performance graphics rendering lmao

      The short-sightedness of Intel absolutely staggers me.

      • Buffalox@lemmy.world · 48 points · 19 days ago

        CPUs would be powerful enough for high-performance graphics rendering lmao

        And then they continued making 4-core desktop CPUs, even after phones were at deca-core. 🤣🤣🤣

        • Trainguyrom@reddthat.com · 11 points · 18 days ago

          To be fair, the ARM SoCs in phones use big.LITTLE core layouts, where the scheduler enables and disables cores on the fly and moves software around so it’s running on either the big high-performance cores or the little low-power cores, depending on the power budget at that moment. So effectively not all of those 6+ cores would be available and in use at the same time on a phone.
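
          For the curious, on most ARM Linux devices you can see that topology straight from sysfs. A quick sketch, assuming a kernel that exposes cpu_capacity (typical for big.LITTLE SoCs); parked cores report online = 0:

              # Sketch: list big.LITTLE cores and whether they're currently online.
              # Assumes an ARM Linux kernel that exposes cpu_capacity in sysfs.
              from pathlib import Path

              cpus = sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*"),
                            key=lambda p: int(p.name[3:]))
              for cpu in cpus:
                  cap = cpu / "cpu_capacity"
                  online = cpu / "online"   # cpu0 often can't be offlined, so no file
                  print(cpu.name,
                        "capacity:", cap.read_text().strip() if cap.exists() else "n/a",
                        "online:", online.read_text().strip() if online.exists() else "1")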

          • Buffalox@lemmy.world · 13 points · 18 days ago

            True, but I use the phone comparison to show how ridiculous it is that Intel stayed on 4 cores for almost 8 years.
            Even the Phenom was available with 6 good cores in 2010, yet Intel stayed at 4 until Coffee Lake arrived in late 2017, and even then only with 6 cores against Ryzen’s 8.
            Intel was pumping money from its near monopoly for 7 years, letting the PC die a slow death of irrelevancy, just because AMD’s FX was so horrible that its 8 Bulldozer cores were worse than 4 Core 2 cores from Intel. They were even worse than AMD’s own previous-gen Phenom.
            It was pretty obvious when Ryzen came out that the market wanted more powerful desktop processors.

      • ChicoSuave@lemmy.world · 27 points · 19 days ago

        It’s been the same “vision” since the late 90s - the CPU is the computer and everything else is peripherals.

    • T156@lemmy.world · 6 points · 19 days ago

      Does he think that the demand for AI-accelerating hardware is just going to go away? That the requirement of fast, dedicated memory attached to a parallel processing/matrix-multiplying unit (aka a discrete GPU) is just going to disappear in the next five years‽

      Maybe the idea is to put it on the CPU/NPU instead? Hence them going so hard on AI processors in the CPU, even though basically nothing uses it.

      • bruhduh@lemmy.world · 11 points · 19 days ago

        But if he wants an NPU, then why not beef up the iGPU too? Memory dedicated to the iGPU is a real boost; look up the Intel Core i7-8709G, where they paired an AMD Radeon Vega iGPU with 4 GB of HBM2 exclusive to that iGPU, and it did wonders. Now that AMD is winning in the APU sector, Intel could reuse the same ideas it had in the past.

        • Trainguyrom@reddthat.com · 7 points · 18 days ago

          Seriously, putting a couple gigs of on-package graphics memory would completely change the game, especially if it does some intelligent caching and spills to system RAM for additional capacity as needed (rough sketch of that idea below).

          I want to see what happens if Intel or AMD seriously let a generation rip with on-package graphics memory for the iGPU. The only real drawback I could see is if the power/thermal budget just isn’t sufficient and it ends up with wonky performance (which I have seen on an overly thin-and-light laptop in my personal fleet. It’s got a Ryzen 2600, if I remember right, that’s so thermally limited it leaves a ton of performance on the table).
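
          The “intelligent caching” part is essentially a two-tier memory pool. A toy sketch of the placement policy (purely illustrative: the pool sizes and the hot/cold hint are made up, and real drivers manage residency per page):

              # Toy two-tier graphics memory pool: a small fast on-package pool
              # backed by larger, slower system RAM. Sizes are hypothetical.
              class TieredVram:
                  def __init__(self, fast_mb=2048, spill_mb=8192):
                      self.fast_free = fast_mb    # on-package memory
                      self.spill_free = spill_mb  # system RAM spill space

                  def alloc(self, size_mb, hot=True):
                      # Hot buffers (render targets, frequently sampled textures)
                      # prefer the fast pool; everything else spills to system RAM.
                      if hot and self.fast_free >= size_mb:
                          self.fast_free -= size_mb
                          return "on-package"
                      if self.spill_free >= size_mb:
                          self.spill_free -= size_mb
                          return "system RAM"
                      raise MemoryError("out of graphics memory")

              vram = TieredVram()
              print(vram.alloc(512))    # fits -> on-package
              print(vram.alloc(3072))   # too big for what's left -> system RAM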

    • ColeSloth@discuss.tchncs.de · 5 points · 18 days ago

      Probably because APUs are getting better and more PC gamers are buying handhelds and APU laptops instead of dedicated desktops. PC gaming has gotten really expensive.

      This is a non-comparison for at least the next 5 years. A dedicated GPU is still hands down the better choice for gaming. Even on a lower-end build, an older GPU will still beat the current best APU by a good amount, but in 10 years it may no longer be necessary to have a GPU over an APU. GPUs are getting too power-hungry and expensive. Gamers gonna game, but they won’t all want to spend an ever-increasing amount of money for better graphics, and Arc would need at least another 5 years to become competitive enough to claim a worthwhile market share from AMD or Nvidia, and even that’s wishful thinking. That’s a long time to bleed money on a maybe.

      • ms.lane@lemmy.world · 4 points · 18 days ago

        You think Intel is going to have 500-850 mm² dies?

        That’s what they need to compete in the GPU space.