It’s December 23, and if you haven’t got Christmas organized by now, you’re screwed. But think how well you could have done with our chatbot friends to lend a hand! AI is a gift to advertising agen…
This stuff is getting pushed all the time in Obsidian plugins (note taking/personal knowledge management software). That kind of drives me crazy, because the whole appeal of the app is that your notes are just plain text you could easily read in Notepad, but some people are chunking up their notes into tiny, confusing bite-sized pieces so they're better formatted for a RAG (wow, that sounds familiar).
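For anyone wondering what that chunking actually amounts to, here's a rough sketch of the usual approach (my own illustration, not from any particular plugin; the window sizes are arbitrary guesses): each note gets sliced into small overlapping windows so they can be embedded and dropped into a vector index.

    def chunk_note(text, chunk_size=500, overlap=100):
        # Split one plain-text note into overlapping character windows.
        # (chunk_size and overlap are made-up numbers for illustration.)
        chunks = []
        start = 0
        step = chunk_size - overlap
        while start < len(text):
            chunks.append(text[start:start + chunk_size])
            start += step
        return chunks

    # Each chunk then gets embedded and stored in a vector index, so the
    # LLM can be handed "relevant" fragments instead of the whole note.

That's the whole trick, and it's exactly why the notes end up looking like confetti instead of prose.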
Even without a RAG, using LLMs for searching is sketchy. I was digging through a lot of obscure Stack Overflow posts yesterday and kept thinking: how could an LLM possibly help with this? It takes less than a second to type in the search terms, and you only have to scan the titles and snippets of the results to tell if you're on the right track. You have the exact same bottleneck of typing and reading, except with ChatGPT or Copilot you also have to pad your query with a bunch of filler and then read all the filler slop in the answer as it streams in a couple thousand times slower than dial-up. Maybe they're closer to even on simpler questions you don't have to interrogate, but then why even bother? I've seen people say ChatGPT is faster, easier, and more accurate than Stack Overflow, and even two crazy ones who said it's made Stack Overflow completely obsolete, and trying to understand that perspective just causes me psychic damage.