I’m looking for a machine to run OpenGPT, Stable Diffusion, and Blender. I’m on the precipice of buying an Alienware with a Ryzen 9 and a Radeon RX 6850M. I’ve never needed anything near this level on Linux, and I’m scared, TBH. I’d much rather get a System76, but the equivalent hardware has Nvidia and costs more than twice as much. While skimming for issues with current hardware, I saw something about a Legion laptop that could only use Intel RAID for the file system, and that this was a nightmare with generic distro kernels. What other stuff like this is happening with current laptop hardware?

I can barely manage a Gentoo install by following the handbook, understanding a third of it, and taking a few weeks to get sorted.

I spent all of yesterday afternoon sorting through all of the Linux hardware data in this Stable Diffusion telemetry: https://vladmandic.github.io/sd-extension-system-info/pages/benchmark.html

That dataset has just over 5k entries in total, 699 of them valid Linux entries (not including LSFW). It contains no entries for a Radeon RX 6850M. I’m super nervous about buying a laptop that costs as much as my first car, and I never want to run Windows again. What resources can I check to boost my confidence that this is going to work on Fedora WS?

If anyone is interested, the SD GitHub dataset has the following number of entries per AMD card model:

| entries | card | VRAM |
|--------:|------|------|
| 3 | RX 5700 XT | 8GB |
| 2 | RX 580 | 4GB |
| 3 | RX 580 | 8GB |
| 15 | RX 6600 XT | 8GB |
| 1 | RX 6650 XT | 8GB |
| 31 | RX 6700 XT | 12GB |
| 10 | RX 6750 XT | 12GB |
| 10 | RX 6800 | 16GB |
| 19 | RX 6800 XT | 16GB |
| 15 | RX 6900 XT | 16GB |
| 9 | RX 6950 XT | 16GB |
| 7 | RX 7900 XT | 20GB |
| 39 | RX 7900 XTX | 24GB |
| 6 | RX VEGA | 8GB |

Other common cards used in Linux and in this dataset are:

NVIDIA

| entries | card | VRAM |
|--------:|------|------|
| 39 | A100-SXM4 | 79GB |
| 20 | GTX 1070 | 8GB |
| 11 | GTX 1080 Ti | 11GB |
| 13 | H100-PCIe | 79GB |
| 12 | RTX 2070 | 8GB |
| 12 | RTX 2080 Ti | 22GB |
| 31 | RTX 3060 | 12GB |
| 16 | RTX 3070 | 8GB |
| 10 | RTX 3080 | 10GB |
| 39 | RTX 3090 | 24GB |
| 11 | RTX 3090 Ti | 24GB |
| 10 | RTX 4070 Ti | 12GB |
| 87 | RTX 4090 | 24GB |
| 27 | RTX A4000 | 16GB |
| 15 | RTX A5000 | 24GB |

TESLA

| entries | card | VRAM |
|--------:|------|------|
| 26 | T4 | 15GB |
| 11 | V100S-PCIE | 32GB |
  • poVoq@slrpnk.net · 1 year ago

    That works only at very low resolution, and AI upscaling then takes a long time on top. Yes, SD can work with 8GB of VRAM and 12GB is nicer, but the upcoming SDXL will probably require 16GB to work well enough.

    I agree that Nvidia is crap and would love to recommend AMD, but their software for AI stuff is just bad right now, and their business decision to only support the newest data-center GPUs with it is even worse.

    I have an all-AMD Linux system, and it works great for gaming and VR, but I have given up on trying to get SD to work on it despite already spending a lot of time on that. Maybe it would go better with a newer card, but I think the risk is just too high to spend a lot of money on an officially unsupported card that AMD could break at any minute, as they have done in the past.
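    For context, the workaround people usually try on officially unsupported RDNA2 cards is spoofing a supported GFX target via ROCm's `HSA_OVERRIDE_GFX_VERSION` environment variable. This is only a sketch of that unsupported hack, assuming an RX 6000-series card and a PyTorch-with-ROCm install; it can stop working with any ROCm update:

```shell
# Unsupported-card hack: tell ROCm to treat the GPU as gfx1030, the one
# RDNA2 target AMD actually ships kernels for. 10.3.0 is the value
# commonly used for RX 6600-6950 class cards.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Optional: pick the discrete GPU if an iGPU is also present.
export HIP_VISIBLE_DEVICES=0

# Then launch the SD web UI as usual from its checkout, e.g.:
# ./webui.sh
```

    Even when this works, it is exactly the kind of officially unsupported path described above, so it does not change the risk calculus much.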

    • j4k3@lemmy.world (OP) · 1 year ago
      This is the talking-sense that got to me. Thanks. It is why I made the post before pulling the trigger.

      I really hate shopping, and now I’m back to zero. I probably need to focus on an external graphics card solution, but that looks like a messy space to navigate too. There seems to be a good bit of negative feedback on the ASUS ROG external-GPU laptop setup. I have no idea what is or is not possible. I think I saw a headline in passing about USB4 only just getting merged into the kernel, so that doesn’t bode well for support of existing hardware. I’m not sure what kind of bandwidth SD really needs between the GPU and the CPU.
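      On the bandwidth question, a back-of-envelope comparison of nominal link rates gives a feel for the eGPU penalty. These are theoretical ceilings, not measured throughput; real enclosures tunnel PCIe over the link and land below the headline number:

```shell
# Nominal link bandwidth, converted from Gbit/s to GB/s.
# USB4 / Thunderbolt 3: 40 Gbit/s total for the whole link.
# PCIe 4.0 x16 (internal dGPU): 16 lanes * 16 GT/s * 128b/130b encoding.
awk 'BEGIN {
    tb   = 40 / 8;                     # ~5 GB/s ceiling for an eGPU
    pcie = 16 * 16 * 128 / 130 / 8;    # ~31.5 GB/s for a desktop slot
    printf "USB4/TB3 eGPU link: ~%.1f GB/s\n", tb;
    printf "PCIe 4.0 x16:       ~%.1f GB/s\n", pcie;
}'
```

      The saving grace for SD specifically is that the model is loaded into VRAM once, and inference traffic over the link after that is comparatively small, so an eGPU should hurt this workload far less than it hurts gaming.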

      Thanks again for the minor disappointment to avoid a major one later.