• AwkwardLookMonkeyPuppet@lemmy.world

    They mimic their inputs. A few years ago Microsoft made a chatbot named Tay that turned into a hateful Nazi in less than 24 hours, because Microsoft didn't install any safeguards around the kind of inputs it received. The program was scrapped almost immediately.