New generative AI breakthroughs seem to be happening almost every week. The big question is: are we ready for it? Noted Science Zaddy Kyle Hill explains ...
If your concern is that we’re “not getting anything” in exchange for the training data AI trainers have gleaned from your postings, then those open-source AIs are what you should be taking a look at. IMO they’re well worth the trade.
I’ve been playing with a locally installed instance of big-AGI. I really like the UI, but it’s missing the RAG part. I’m also cobbling my own together, for fun and not profit, to try to stay relevant in these hard times. LangChain is some wild stuff.
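For the curious, the “cobbling together” part is mostly the standard LangChain retrieve-then-answer loop. Here’s a minimal sketch, assuming an older 0.0.x-era LangChain (newer releases moved these imports to langchain_community) with the chromadb, gpt4all, and sentence-transformers extras installed; the file path, model path, and embedding model name are just placeholders:

```python
# Minimal local RAG sketch with LangChain (0.0.x-era import paths assumed).
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import Chroma
from langchain.llms import GPT4All
from langchain.chains import RetrievalQA

# 1. Load a local document and split it into chunks (path is a placeholder).
docs = TextLoader("notes.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

# 2. Embed the chunks locally and index them in a Chroma vector store.
embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
db = Chroma.from_documents(chunks, embeddings, persist_directory="db")

# 3. Answer questions with a local GPT4All model over the retrieved chunks
#    (the model path is hypothetical; point it at whatever local model you have).
llm = GPT4All(model="./models/ggml-model.bin")
qa = RetrievalQA.from_chain_type(llm=llm, retriever=db.as_retriever())
print(qa.run("What do my notes say about RAG?"))
```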
That’s cool. I haven’t looked at any local/FOSS LLMs or other generators, largely because I don’t have a use case for them.
Agreed. When I feel like playing around and/or have a use case of my own, I’ll be looking at open-source AI.
It’s worth playing around with. This is a good one that packages all the basics, including RAG:
https://github.com/imartinez/privateGPT
Thank you. Gonna save the link for when I have a use case and/or want to play around.