- cross-posted to:
- [email protected]
Yes! This is a brilliant explanation of why language use is not the same as intelligence, and why LLMs like ChatGPT are not intelligence. At all.
Yes, now that it has become a marketable product, the term “AI” feels like a buzzword due to overuse. But in actual fact it is still being used (by most vendors) consistently with how it has been used for some 40 years. ML, video game opponents, chess engines: all of these have been referred to as “AI” for at least that long. Anyone who thinks that calling GPT or Stable Diffusion “AI” started “five minutes ago” (or that it is in any way novel) must be someone whose only exposure to the concept of “AI” has been through sci-fi movies, not the actual, real field of AI that has been developing for decades. It is, therefore, a very clear signal that the person knows fuck all about the subject and so cannot possibly form a valid opinion. It’s just a generic angry response.
And frankly, I think ChatGPT would do a better job of it than the author of the article. At least we wouldn’t be wasting actual, valuable human brain resources producing this drivel.