• zero_gravitasOP
    2 days ago

    You can run deepseek-r1:8b (one of their smaller distilled models) on a Raspberry Pi 5, though it is slow: https://www.tomshardware.com/raspberry-pi/how-to-run-deepseek-r1-on-your-raspberry-pi-5

    I imagine any Win10 computer that anyone is still using at work would be considerably more powerful than an RPi5, so it should be able to run r1:8b at a more comfortable speed, and possibly even r1:14b.

    You can find guides online listing system requirements for the various models, though I wouldn’t necessarily trust them, as they may have been targeting faster responses than you care about. Once you’ve got ollama set up on your machine, it shouldn’t be much hassle to just try a few of the smaller distilled models yourself and find which one has a quality-to-speed ratio you’re happy with.
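
    Comparing models is quick once ollama is installed; a rough sketch (the model tags are real ollama tags, but the prompt is just an example):

```shell
# Pull a couple of distilled models to compare (downloads are several GB each)
ollama pull deepseek-r1:8b
ollama pull deepseek-r1:14b

# --verbose prints timing stats (e.g. eval rate in tokens/sec) after the
# response, which makes the quality-vs-speed comparison easy
ollama run deepseek-r1:8b --verbose "Summarise the TCP three-way handshake."
ollama run deepseek-r1:14b --verbose "Summarise the TCP three-way handshake."
```

    If a model is too slow or won’t fit in RAM, step down a size; if responses are poor, step up.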