Everyone who’s actually worked a real job knows it’s better for someone to not do a job at all than to do it 75% right.
Because now that you know the LLM is getting basic information wrong, you can’t trust that anything it produces is correct. You need to spend extra time fact-checking it.
LLMs like Bard and ChatGPT/GPT-3/3.5/4 are great at parsing questions and producing results that sound good, but they are awful at giving correct answers.
Sure, let’s fact-check every Google response. Wouldn’t hurt the economy; that’s like a few million new jobs.