All of Silicon Valley — of Big Tech — is focused on taking large language models and other forms of artificial intelligence and moving them from the laptops of researchers into the phones and computers of average people.
But if I type “show me a picture of Alex Cranz” into the prompt window, Meta AI inevitably returns images of very pretty dark-haired men with beards.
Earlier this year, ChatGPT had a spell and started spouting absolute nonsense, but it also regularly makes up case law, leading to multiple lawyers getting into hot water with the courts.
In a commercial for Google’s new AI-ified search engine, someone asked how to fix a jammed film camera, and it suggested they “open the back door and gently remove the film.” That is the easiest way to destroy any photos you’ve already taken.
An AI’s difficult relationship with the truth is called “hallucinating.” In extremely simple terms: these machines are great at discovering patterns of information, but in their attempt to extrapolate and create, they occasionally get it wrong.
This idea that there’s a kind of unquantifiable magic sauce in AI that will allow us to forgive its tenuous relationship with reality is brought up a lot by the people eager to hand-wave away accuracy concerns.
The original article contains 1,211 words, the summary contains 212 words. Saved 82%. I’m a bot and I’m open source!