It looks like AI has followed crypto chip-wise in going CPU → GPU → ASIC

GPUs, while dominant in training large models, are often too power-hungry and costly for efficient inference at scale. This is opening new opportunities for specialized inference hardware, a market where startups like Untether AI were early pioneers.

In April, then-CEO Chris Walker had highlighted rising demand for Untether’s chips as enterprises sought alternatives to high-power GPUs. “There’s a strong appetite for processors that don’t consume as much energy as Nvidia’s energy-hungry GPUs that are pushing racks to 120 kilowatts,” Walker told CRN. Walker left Untether AI in May.

Hopefully the training side of AI moves to ASICs to reduce cost and energy use, while GPUs keep improving at inference and growing their VRAM to the point that running AI locally requires nothing special

  • iktOP · edited · 6 days ago

    Consider the perspective of the average person: who is gaining value from AI in its current form? It sure as hell isn’t us.

    Ah yeah, the millions of people using it to generate images, generate summaries, translate documents, clean up photos, etc. etc. are getting no value from it at all 🙄

    Perplexity received 780 million queries in May, CEO Aravind Srinivas shared onstage at Bloomberg’s Tech Summit on Thursday. Srinivas said that the AI search engine is seeing more than 20% growth month-over-month.

    edit: why am I having this discussion, if you don’t like AI feel free to go literally anywhere else on Lemmy