• LifeInMultipleChoice@lemmy.dbzer0.com

I think we need to make laws pertaining to the use of the term "AI" by businesses. There is nothing intelligent about language models; most of what businesses are using "AI" for is closer to "Automated Instructions" than anything intelligent.

Laws need to dictate that companies MUST provide a reasonable way to reach a human representative, and that they are legally responsible for the responses their automated systems give.

      It’s fine to set up automated systems to assist people within companies, as the majority of issues people have can be solved through automated processes.

      User: “I need access to this network share”

LLM: Okay, submit this form: [link to network share access request form].

      LLM: Can I further assist?

The user submits the form, specifying the network path, choosing read or read/write permissions via radio buttons, and giving the reason for needing access.

The form sends an email with approve/deny buttons to the owner of that specific network share.

The approver clicks approve, the user is added to the required Active Directory group, and they get an email back saying they have been added and should log out and log back in so their group policies update (rough sketch of this step below).

Time taken by the user: about 5 minutes. Many companies have so many requests coming in that things like this often don't reach the approving parties and get completed for weeks.
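To make the approval step concrete, here's a rough Python sketch of what the "approve" handler could do: add the requester to the AD group and send them the log-out/log-in email. The ldap3/smtplib approach, hostnames, DNs, and service account are all made-up placeholders, not any particular company's setup:

```python
# Hypothetical approval handler: server names, DNs, and the service account
# are placeholders. Assumes the ldap3 + smtplib route; plenty of shops would
# do this step in PowerShell instead.
import smtplib
from email.message import EmailMessage

from ldap3 import Server, Connection, NTLM


def on_approve(user_dn: str, group_dn: str, user_email: str) -> None:
    # Bind to a domain controller with a service account (placeholder creds).
    server = Server("dc01.corp.example.com", use_ssl=True)
    conn = Connection(server, user="CORP\\svc_access", password="***",
                      authentication=NTLM, auto_bind=True)

    # Add the requester to the AD group that grants the share permission.
    conn.extend.microsoft.add_members_to_groups([user_dn], [group_dn])
    conn.unbind()

    # Tell the user to log out and back in so the new membership takes effect.
    msg = EmailMessage()
    msg["From"] = "it-automation@corp.example.com"
    msg["To"] = user_email
    msg["Subject"] = "Network share access granted"
    msg.set_content("You have been added to the access group. "
                    "Please log out and log back in so your group policies update.")
    with smtplib.SMTP("smtp.corp.example.com") as smtp:
        smtp.send_message(msg)
```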

But if you set up an internal, non-external-facing LLM that locates forms and processes but cannot access user data or permissions, it can take a significant amount off the workload of managing 60,000 users.
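To illustrate the "forms and processes only" restriction, a minimal sketch: the assistant's single tool is a keyword lookup over a static catalog of forms, so there is simply no path to user data or permissions. The catalog entries and URLs are invented, and the actual chat/LLM backend is left out:

```python
# Minimal sketch: the only tool ever exposed to the LLM is this lookup over a
# static form catalog. No tool reads user data or changes permissions, so the
# model can't do either. Catalog entries and URLs are made-up placeholders.
FORM_CATALOG = [
    {"title": "Network share access request",
     "keywords": {"network", "share", "drive", "folder", "access"},
     "url": "https://intranet.example.com/forms/network-share-access"},
    {"title": "Software install request",
     "keywords": {"software", "install", "application", "license"},
     "url": "https://intranet.example.com/forms/software-install"},
]


def find_form(query: str) -> str:
    """The one tool the assistant can call: map a request to a form link."""
    words = set(query.lower().split())
    best = max(FORM_CATALOG, key=lambda form: len(words & form["keywords"]))
    if not words & best["keywords"]:
        return "No matching form found; please contact the service desk."
    return f"{best['title']}: {best['url']}"


# Whatever chat backend you wire this to, "I need access to this network
# share" resolves to a form link and nothing else.
print(find_form("I need access to this network share"))
```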

      (I’m sure there are a million other uses that could be legitimate, but that’s just a quick one off the top of my head)