• Larry@lemmy.world
    7 months ago

    Local AI sounds nice. One reason I’m cynical about the current state of AI is because of how many send all your data to another company

    • MacN'Cheezus@lemmy.today
      7 months ago

      It is. Unfortunately it does tend to use up a lot of RAM and requires either a fairly fast CPU or better yet, a decent graphics card. This means it’s at least somewhat problematic for use on lower spec or ultraportable laptops, especially while on battery power.
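      The RAM concern above can be made concrete with a back-of-the-envelope estimate: a model's weights alone take roughly (parameter count × bits per weight ÷ 8) bytes, plus runtime overhead for things like the KV cache. The 1.2× overhead factor below is an illustrative assumption, not a measured figure; actual usage depends on the runtime and context length.

```python
def estimate_model_ram_gb(params_billion: float,
                          bits_per_weight: int,
                          overhead: float = 1.2) -> float:
    """Rough estimate of resident memory (GB) for a quantized local model.

    overhead is an assumed fudge factor for KV cache and runtime buffers.
    """
    weight_gb = params_billion * bits_per_weight / 8  # 1e9 params ~ 1 GB at 8-bit
    return round(weight_gb * overhead, 1)

# A 7B-parameter model quantized to 4 bits per weight:
print(estimate_model_ram_gb(7, 4))   # ~4.2 GB
# The same model at 16-bit precision:
print(estimate_model_ram_gb(7, 16))  # ~16.8 GB
```

      This is why 4-bit quantization is popular for laptops: it brings a 7B model within reach of an 8 GB machine, while full precision does not fit.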

    • sugar_in_your_tea@sh.itjust.works
      7 months ago

      Eh, I don’t particularly care too much either way. It seems to be solving problems with the 80/20 approach: 80% of the benefit for 20% of the effort. However, getting that last 20% is probably way more difficult than just building purpose-built solutions from the start.

      So I’m guessing we’ll see a lot more “decent but not quite there” products, and they’ll never “get there.”

      So it might be fun to play with, but it’s not something I’m interested in using day-to-day. Then again, maybe I’m completely wrong and it’s the best thing since sliced bread, but as someone who has worked on very basic NLP projects in the past (distantly related to modern LLMs), I just find it hard to look past the limitations.