Panther Lake and Nova Lake laptops will return to traditional RAM sticks

  • umami_wasabi@lemmy.ml · ↑121 ↓1 · 15 days ago

    Reverting to RAM sticks is good, but they shouldn’t be shutting down the GPU line. The GPU market needs more competitors, not fewer.

    • ChicoSuave@lemmy.world · ↑35 ↓5 · 15 days ago

      Intel can’t afford to keep making GPUs because it doesn’t have the reliable CPU side to soak up the losses. The GPU market has established players, and Intel, besides being a big name, didn’t bring much to the table to carve out a place for itself. Outside of good Linux support (so I’ve heard; I haven’t used it personally), Intel GPUs don’t stand out on price or performance.

      Intel is struggling with its very existence and doesn’t have the money or time to explore new markets when its primary product is cratering its own revenue. Intel has a very deep problem with how it is run and will most likely be unable to survive as-is for much longer.

      • Jay@lemmy.world · ↑26 · edited · 15 days ago

        As a Linux user with an Intel Arc card, I can safely say the support is outstanding. In terms of price to performance, I think it’s pretty good too. I mainly enjoy having 16 GB of VRAM without spending $450-$500+ to get that amount like with Nvidia. I know AMD also has cards around the same price with that much VRAM, though.

        • XTL@sopuli.xyz · ↑3 · 14 days ago

          That’s interesting, thanks. Can I ask what that VRAM is getting used for? Gaming, LLMs, other computing?

          • Jay@lemmy.world · ↑2 · 14 days ago

            The main things that use up a lot of VRAM for me are Blender rendering and shader compilation for things like Unreal Engine. My games would probably use a little more if I had a screen higher than 1080p. The most usage I’ve seen from a game was around 14 GB.

            I haven’t messed around with LLMs on the card just yet, but I know Intel has an extension for PyTorch to do GPU compute. The extra VRAM would definitely help there.
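
            For anyone curious, here’s roughly what that looks like. A minimal sketch, assuming a recent PyTorch build with Intel XPU support (older setups import intel_extension_for_pytorch instead); the model and sizes here are just placeholders:

            ```python
            # Minimal sketch: running a tensor op on an Intel Arc GPU through
            # PyTorch's "xpu" device. Assumes a PyTorch build with XPU support;
            # older setups would `import intel_extension_for_pytorch as ipex` first.
            import torch

            # Fall back to CPU if no XPU device is present.
            device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"

            model = torch.nn.Linear(4096, 4096).to(device)  # placeholder model
            x = torch.randn(64, 4096, device=device)        # placeholder batch

            with torch.no_grad():
                y = model(x)

            print(y.shape, y.device)  # e.g. torch.Size([64, 4096]) xpu:0
            ```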

            • Jay@lemmy.world · ↑6 · edited · 14 days ago

              That’s pretty much the lowest that I’ve found too.

              From what I could find, these are the lowest prices from each GPU manufacturer for a 16 GB card (quick per-gigabyte math below):

              • Intel Arc A770: $260
              • AMD Radeon RX 7600 XT: $320
              • NVIDIA RTX 4060 Ti: $450
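
              Normalizing to price per gigabyte makes the gap obvious. A throwaway sketch using just the prices quoted above:

              ```python
              # Price per GB of VRAM for the 16 GB cards listed above.
              cards = {
                  "Intel Arc A770": 260,
                  "AMD Radeon RX 7600 XT": 320,
                  "NVIDIA RTX 4060 Ti": 450,
              }
              VRAM_GB = 16

              for name, price in sorted(cards.items(), key=lambda kv: kv[1]):
                  print(f"{name}: ${price} -> ${price / VRAM_GB:.2f}/GB")

              # Intel Arc A770: $260 -> $16.25/GB
              # AMD Radeon RX 7600 XT: $320 -> $20.00/GB
              # NVIDIA RTX 4060 Ti: $450 -> $28.12/GB
              ```
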
      • deegeese@sopuli.xyz · ↑23 · 15 days ago

        It boggles the mind that AMD realized the importance of GPUs 20 years ago when they bought ATI, and in all that time Intel still doesn’t have a competitive GPU.

        • ms.lane@lemmy.world · ↑8 ↓1 · 14 days ago

          Intel realized it back then too, but things didn’t pan out the way they wanted.

          Nvidia and AMD were going to merge while ATI was circling the drain. Then Jensen and Hector Ruiz got into their shitfight about who was going to be CEO of the merged AMD/Nvidia (it should have been Jensen; Hector Ruiz is an idiot), which eventually killed the merger.

          AMD, desperately needing a GPU side for their ‘Future is Fusion’ plans, bought the ailing ATI at a massive premium.

          Intel was waiting for ATI to circle the drain a little more before swooping in and buying them cheap; AMD beat them to it.

          • deegeese@sopuli.xyz · ↑5 · 14 days ago

            That’s a slightly revisionist history. ATI was by no means “circling the drain”; they had a promising new GPU architecture about to be released. I remember this because I bought ATI stock about 6 months before the merger.

          • sugar_in_your_tea@sh.itjust.works · ↑4 · 14 days ago

            Intel was waiting for ATI to circle the drain a little more before swooping in and buying them cheap; AMD beat them to it.

            They had strong iGPU performance, a stronger process node, and tons of cash. There’s no reason they couldn’t have built something from the ground up; they were absolutely dominating the CPU market. AMD didn’t catch up until 2017 or so, when they launched the new Zen lineup.

            Intel sat on their hands raking in cash for 10+ years before actually getting serious, and during that time Nvidia was wiping the floor with AMD. There’s absolutely no reason Intel couldn’t have taken over the low-end GPU market with a super strong iGPU and used the same architecture for a mid-range dGPU. I bought Intel laptops without a dGPU because the iGPU was good enough for light gaming. I stopped once AMD’s APUs caught up (I bought the 3500U), and I don’t see a reason why I’d consider Intel for a laptop now.

            Intel lost because they sat on their hands. They were late to make an offer on ATI, late to build their own GPUs, and they’re still late on anything touching AI. They were dominant for well over a decade, but instead of doing R&D in areas near their core competency (CPUs), they messed around with SSDs and other random stuff.

            • ms.lane@lemmy.world · ↑1 ↓1 · 13 days ago

              They needed the IP.

              You can’t just build a 3D accelerator. It’s billions of dollars in licensing basic building blocks.

              Easiest way to get in is to buy your way in.

              • sugar_in_your_tea@sh.itjust.works · ↑1 · 12 days ago

                Yet they were capable of building one over the last decade or so with their Arc GPUs. I’m saying they should have started a decade earlier.

      • atempuser23@lemmy.world · ↑5 · 15 days ago

        Basically, there’s only money at the top of the GPU range. Everything else is a budget card with razor-thin margins.

        AI-specific chips will take off over time, but even then the ship is starting to sail. These are mostly data center projects.

      • Wooki@lemmy.world · ↑4 ↓1 · edited · 15 days ago

        besides being a big name, didn’t bring much.

        Absolutely wrong. A lot of old and dated information in your post.

        They have something no one else has: their own manufacturing, plus very low prices and great performance after the recent driver updates. What they still lack is driver stability, and that has been making leaps and bounds.

        I do not think anyone else can enter the market, let alone with an edge.

      • LavenderDay3544@lemmy.world · ↑1 · 14 days ago

        Intel is too big to fail. And the defense sector needs an advanced domestic foundry. Uncle Sam will bail it out with our tax money.

        • ChicoSuave@lemmy.world · ↑2 ↓1 · 13 days ago

          The United States has a few chip fabs capable of making military-grade hardware. It helps that the defense industry doesn’t use the most advanced chips possible; it wants the reliability mature tech provides. Micron, Texas Instruments, ON Semiconductor - there are a few domestic chip companies with stateside fabs.

          Intel is also a valuable collection of patents, and a huge number of companies would love to get their hands on them. Someone will want to step in before the government takes over.

          • LavenderDay3544@lemmy.world · ↑1 · edited · 13 days ago

            Intel is the only US-based and US-owned foundry on the leading edge of fab process technology, and that’s what the government wants domestically. Defense isn’t just the military; certain intelligence and similar functions need high-performance hardware. I somehow don’t think the NSA is using CPUs made on Northrop Grumman’s 180 nm planar CMOS process. Army radios might use that shit, but the highest-tech defense and intelligence agencies are using modern hardware, and Intel is the best option for manufacturing it.

            TSMC could be an option now with its US-based GIGAFABs, but it would be a much more complex deal with the US government: chips made for it would have to be made entirely in the US, and possibly by a US-domiciled subsidiary rather than TSMC’s Taiwan-based parent company. The same goes for Samsung.