I know it’s not even close to that yet. It can tell you to kill yourself or to kill a president. But what about when I finish school in like 7 years? Who would pay for a therapist or a psychologist when you can ask a floating head on your computer for help?

You might think this is a stupid and irrational question. “There is no way AI will do psychology well, ever.” But I think in today’s day and age it’s pretty fair to ask when you are deciding about your future.

  • MagneticFusion@lemm.ee · 35 points · 1 year ago

    A computer will never have emotions the same way a human has emotions. It is not a living creature. True and genuine human connection is something that will only become more valuable with the rise of AI.

    • dog@suppo.fi · 1 point · 1 year ago

      Eh, I give it 5 years.

      Never say never, because everything is possible given enough time. The only question is how much time.

      • MagneticFusion@lemm.ee · 1 point · 1 year ago

        You are being wildly optimistic. AI confidently lies to you about purely objective things: ask it to write a program and it will write it wrong, then insist it’s correct, over and over. Something like psychology and mental health is far from objective, is constantly evolving, and differs from person to person based on a gazillion variables, the most important of them being emotion, something a robot will most likely never have. Even some living animals don’t have a wide range of emotions; snakes only feel fear and anger, not sadness or happiness. What would make you think that artificially created robots will have enough emotional intelligence to replace human psychologists within 5 years?

  • TimewornTraveler@lemm.ee · +31 / −1 · 1 year ago

    homie lemme let you in on a secret that shouldn’t be secret

    in therapy, 40% of positive client outcomes come from external factors changing

    10% come from my efforts

    10% come from their efforts

    and the last 40% comes from the therapeutic alliance itself

    people heal through the relationship they have with their counselor

    not a fucking machine

    this field ain’t going anywhere, not any time soon. not until we have fully sentient general ai with human rights and shit

    • cheese_greater@lemmy.world · 4 points · edited · 1 year ago

      I don’t think there’s harm in allowing people who would never be able to afford life-saving medicine to have life-saving medicine cat-puzzle-feeder style

      Edit: this was me, and access hasn’t changed the fact that I don’t generally derive value from it.

    • dog@suppo.fi · 1 point · 1 year ago

      Interestingly, and somewhat related: it was tested years ago whether a robot could bring comfort and social support to lonely pets and elderly people.

      The results were overwhelmingly positive, and this is going into actual commercial use and development as we speak.

  • Havald@lemmy.world · 26 points · 1 year ago

    I won’t trust a tech company with my most intimate secrets. Human therapists won’t get fully replaced by AI.

  • magnetosphere@kbin.social · 22 points · 1 year ago

    You are putting WAY too much faith in the ability of programmers. Real AI that can do the job of a therapist is decades away, at least - and then there’s the approval process, which will take years all by itself. Don’t underestimate that. AI therapy is uncharted territory, and the approval process will be lengthy, detailed, and incredibly strict.

    Lastly, there’s public acceptance. Even if AI turns out to have measurably better outcomes, if people aren’t comfortable with it, statistics won’t matter. People aren’t rational. I don’t care how “good” Alexa is, or how much evidence you show me - I will never accept that a piece of software can understand what it’s like to grow up as a person. I want to talk about my issues with a flawed, fallible human, not a box plugged into the wall.

    You ask a valid question, just much earlier than necessary. I’d be surprised if AI was a viable alternative by the time you retire.

    • Encode1307@lemm.ee · 2 points · 1 year ago

      There are already digital therapeutic platforms approved for mental health; Orexo deprexis is one such program. The fact is that the vast majority of people who need therapy aren’t getting it now. These AI therapy models will provide services to those people. I’m willing to bet that in a decade, the majority of therapy will be done by AI, with human therapists focused on the most severe behavioral health conditions.

  • Macaroni_ninja@lemmy.world · 21 points · 1 year ago

    I don’t think the AI everyone is so buzzed about today is really a true AI. As someone summed it up: it’s more like a great autocomplete feature but it’s not great at understanding things.

    It will be great as a replacement for Siri and Google Assistant, but it’s not even close to being able to give people professional advice.

    • Zeth0s@lemmy.world · +3 / −13 · edited · 1 year ago

      Not saying an LLM should substitute for a professional psychological consultant, but that someone is clearly wrong and doesn’t understand current AI. Just FYI.

      • Macaroni_ninja@lemmy.world · 12 points · 1 year ago

        Care to elaborate?

        It’s an oversimplified statement from someone (sorry, I don’t have the source), and I’m not exactly an AI expert, but my understanding is that current commercial AI products are nowhere near the “think and judge like a human” definition. They can scrape the internet for information, use it to react to prompts, and do a fantastic job of imitating humans, but the technology simply isn’t there.

        • Zeth0s@lemmy.world · +2 / −3 · edited · 1 year ago

          The technology for human intelligence? Any technology will always be very different from human intelligence. What you’re probably referring to is AGI, artificial general intelligence: an “intelligent” agent that doesn’t excel at any one thing but can handle a huge variety of scenarios and tasks, as humans do.

          LLMs are specialized models for generating fluent text, but they’re very different from autocomplete, because they can work with concepts, semantics, and (pretty surprisingly) rather complex logic.

          As an oversimplification, even humans are fancy autocomplete. They’re just different, as LLMs are different.

  • Bonifratz@feddit.de · 17 points · 1 year ago

    Even if AI did make psychology redundant in a couple of years (which I’d bet my favourite blanket it won’t), what are the alternatives? If AI can take over a field that is focused more than most others on human interaction, personal privacy, thoughts, feelings, and individual perceptions, then it can take over almost any other field before that. So you might as well go for it while you can.

    • intensely_human@lemm.ee · 1 point · 1 year ago

      The fields that will hold out the longest will be selected by legal liability rather than technical challenge.

      Piloting a jumbo jet, for example, has been automated for decades, but you’ll never see an airline skip the pilot.

  • nottheengineer@feddit.de · 15 points · 1 year ago

    It’s just like with programming: The people who are scared of AI taking their jobs are usually bad at them.

    AI is incredibly good at regurgitating information and at translation, but not at understanding. Programming can be viewed as translation, so AIs are good at it. LLMs on their own won’t get much better at understanding; we’re at a point where they’re already trained on all the good data from the internet. Now we’re starting to let AIs collect data directly from the world (ChatGPT being public is just a play to collect more data), but that’s much slower.

    • Cossty@lemmy.world (OP) · 4 points · 1 year ago

      I am not a psychologist yet. I only have a basic understanding of the job description but it is a field that I would like to get into.

      I guess you are right. If you are good at your job, people will find you just like with most professions.

    • Nibodhika@lemmy.world · +4 / −2 · 1 year ago

      I slightly disagree. In general I think you’re on point, but artists especially are actually being fired and replaced by AI, and that trend will continue until there’s a major lawsuit because someone used a trademarked asset from another company.

    • intensely_human@lemm.ee · 1 point · 1 year ago

      The web is one thing, but access to senses and a body that can manipulate the world will be a huge watershed moment for AI.

      Then it will be able to learn about the world in a much more serious way.

  • ???@lemmy.world · +15 / −1 · 1 year ago

    No, it won’t. I don’t think I would have made it here today alive without my therapist. There may be companies that have AI agents doing therapy sessions but your qualifications will still be priceless and more effective in comparison.

  • scorpionix@feddit.de · 10 points · 1 year ago

    Given how little we know about the inner workings of the brain (I’m a materialist, so to me the mind is the result of processes in the brain), I think there is still ample room for human intuition in therapy. Also, I believe there will always be people who prefer talking to a human over a machine.

    Think about it this way: Yes, most of our furniture is mass-produced by IKEA and others like it, but there are still very successful carpenters out there making beautiful furniture for people.

    • intensely_human@lemm.ee · 1 point · 1 year ago

      I was gonna say given how little we know about the inner workings of the brain, we need to be hesitant about drawing strict categorical boundaries between ourselves and LLMs.

      There’s a powerful motivation to believe they are not as capable as us, which probably skews our perceptions and judgments.

  • Nonameuser678 · 9 points · 1 year ago

    Psychotherapy is about building a working relationship. Transference is a big part of this relationship. I don’t feel like I’d be able to build the same kind of therapeutic relationship with an AI that I would with another human. That doesn’t mean AI can’t be a therapeutic tool. I can see how it could be beneficial with things like positive affirmations and disrupting negative thinking patterns. But this wouldn’t be a substitute for psychotherapy, just a tool for enhancing it.

  • 4am@lemm.ee · 8 points · 1 year ago

    AI cannot think; it doesn’t apply logic or reason. It outputs a result from an input prompt. That will not solve psychological problems.

    • baked_tea@lemmy.world · +3 / −2 · 1 year ago

      That’s what AI does at the moment, which may not necessarily be true in a few years, and that’s what OP is asking about.

  • hugz@kbin.social · 7 points · 1 year ago

    The caring professions are often considered to be among the safest professions. “Human touch” is very important in therapy.

  • dumples@kbin.social · 7 points · 1 year ago

    At the end of the day, AI (not just the LLMs we call AI now) is really good at doing boring machine work: tasks that are repetitive, simple, and routine. This includes the LLMs, which can summarize boring text and generate more boring text. They can’t generate anything new, only output and rearrange.

    What there will always be a need for is human work. That includes creativity, emotions, and human interaction, which a machine can’t replace at all. Psychology and therapy are all about emotions and human interaction, so they might be the safest career choice. The same goes for something like haircutting or other careers that involve human wisdom and personal skills.

    Boring jobs like sending and receiving emails might be replaced. The reason businesses are so scared is that the majority of people in an office just do that.