I know it’s not even close there yet. It can tell you to kill yourself or to kill a president. But what about when I finish school in like 7 years? Who would pay for a therapist or a psychologist when you can ask a floating head on your computer for help?

You might think this is a stupid and irrational question. “There is no way AI will do psychology well, ever.” But I think in today’s day and age it’s pretty fair to ask when you are deciding about your future.

  • scorpionix@feddit.de · 1 year ago

    Given how little we know about the inner workings of the brain (I’m a materialist, so to me the mind is the result of processes in the brain), I think there is still ample room for human intuition in therapy. Also, I believe there will always be people who prefer talking to a human over a machine.

    Think about it this way: Yes, most of our furniture is mass-produced by IKEA and others like it, but there are still very successful carpenters out there making beautiful furniture for people.

  • Evilschnuff@feddit.de · 1 year ago

    There is the theory that most therapy methods work by building a healthy relationship with the therapist and using that for growth since it’s more reliable than the ones that caused the issues in the first place. As others have said, I don’t believe that a machine has this capability simply by being too different. It’s an embodiment problem.

  • jabathekek@sopuli.xyz · 1 year ago

I don’t think many people would want to seek psychiatric care from what they might see as a computer. A large part of clinical psychology is creating and maintaining a relationship with patients, and I highly doubt language models will become sophisticated enough to achieve that in seven years, if at all. Remember, these aren’t true AIs; they are language models. They have a long way to go before they can be seen as true intelligences.

  • nottheengineer@feddit.de · 1 year ago

    It’s just like with programming: The people who are scared of AI taking their jobs are usually bad at them.

AI is incredibly good at regurgitating information and at translation, but not at understanding. Programming can be viewed as translation, so it’s good at that. LLMs on their own won’t become much better at understanding; we’re at a point where they are already trained on all the good data from the internet. Now we’re starting to let AIs collect data directly from the world (ChatGPT being public is just a play to collect more data), but that’s much slower.

  • FaceDeer@kbin.social · 1 year ago

    Well, I won’t say I think there’s no risk at all. AI is advancing rapidly and in very surprising ways. But I expect that most of the jobs that AI is currently “replacing” will actually still survive in some related form. When sewing machines were invented it didn’t poof tailors out of existence, they started doing other things. The invention allowed people to be able to own way more clothing than they did before, so fashion design became a bigger thing. Etc.

Even if AIs get really good at psychology, there’ll still be people who are best handled by a human. Heck, you might end up with an AI “boss” that decides which cases those would be and gives you suggestions on how to handle them, but your own training will likely still be useful.

If you want to be really future-proof, set aside some savings and think about alternate careers you might enjoy, keeping abreast of them as hobbies, just in case something truly drastic happens to your primary field.

  • hugz@kbin.social · 1 year ago

The caring professions are often considered to be among the safest professions. “Human touch” is very important in therapy.

  • halcyondays@midwest.social · 1 year ago

    20 years ago the line was “there are no careers in psychology/philosophy”. So I got a comp sci degree, and I do well enough coding, but I could probably be happier with how I spend my days. I still read philosophy in my free time. Less tangible paths have always been demonized, largely because society needs a lot of laborers and engineers, and fewer thinkers and theorists. The potential of AI is just the latest buzzword applied to a century old coercion tactic.

    That said, if we entertain the possibility, I think you’re taking too narrow of a view of the possibilities. Who will advise the training of those therapy AI models? Doctorate psychologists.

    I work for an education tech company, obviously our product is built by an engineering team of comp sci majors that know how to code - but we employ a large number of former teachers and folks with pedagogical degrees to guide how the product actually works in the real world.

    The same will continue to be true for future products, a model to perform a task well doesn’t exist without those that deeply understand the task at hand.

Another example that comes to mind is data science: has any economist ever recommended a theoretical math degree as a career choice? And yet every company racing to implement the latest machine learning models now needs someone who understands Bayesian probability networks and Markov chains. Suddenly a “useless” degree is in high demand.

    If that’s what you want to do, I think you’ll find your way. Minor in comp sci and think about how to implement your psychology learnings in code, if you want to have a contingency plan.

  • livus@kbin.social · 1 year ago

    If you have a talk with the AI called Pi, it talks like a therapist. It’s impressive at first but you can’t escape the knowledge that it dgaf about you.

    And that’s a trait people really don’t want in a therapist.

      • rynzcycle@kbin.social · 1 year ago

        You jest, but honestly this is what helped me. I felt very alone, deeply depressed and held a long rooted belief that I wasn’t important enough to deserve better.

        Knowing that this person was listening because they were being paid, because it was their job, helped me get past the guilt and open up. Likely saved my life. AI would not have given me that.

  • Bonifratz@feddit.de · 1 year ago

    Even if AI did make psychology redundant in a couple of years (which I’d bet my favourite blanket it won’t), what are the alternatives? If AI can take over a field that is focused more than most others on human interaction, personal privacy, thoughts, feelings, and individual perceptions, then it can take over almost any other field before that. So you might as well go for it while you can.

  • 4am@lemm.ee · 1 year ago

    AI cannot think; it does not reason or apply logic. It outputs a result from an input prompt. That will not solve psychological problems.

  • TimewornTraveler@lemm.ee · 1 year ago

    homie lemme let you in on a secret that shouldn’t be secret

    in therapy, 40% of positive client outcomes come from external factors changing

    10% come from my efforts

    10% come from their efforts

    and the last 40% comes from the therapeutic alliance itself

    people heal through the relationship they have with their counselor

    not a fucking machine

    this field ain’t going anywhere, not any time soon. not until we have fully sentient general ai with human rights and shit

  • dumples@kbin.social · 1 year ago

    At the end of the day, AI (or at least the LLMs we call AI now) is really good at doing boring machine work. These tasks are repetitive, simple, and routine. That includes LLMs, which can summarize boring text and generate more boring text. They can’t generate anything truly new; they just output and rearrange.

    What there will always be a need for is human work. That includes creativity, emotions, and human interaction. A machine can’t replace that at all. Psychology and therapy are all about emotions and human interaction, so it might be the safest career choice. The same goes for something like haircutting, or any other career that involves human wisdom and personal skills.

    Boring jobs like sending and receiving emails might be replaced. The reason businesses are so scared is that the majority of people in an office just do that.

  • magnetosphere@kbin.social · 1 year ago

    You are putting WAY too much faith in the ability of programmers. Real AI that can do the job of a therapist is decades away, at least. And then there’s the approval process, which will take years all by itself. Don’t underestimate that. AI therapy is uncharted territory, and the approval process will be lengthy, detailed, and incredibly strict.

    Lastly, there’s public acceptance. Even if AI turns out to have measurably better outcomes, if people aren’t comfortable with it, statistics won’t matter. People aren’t rational. I don’t care how “good” Alexa is, or how much evidence you show me - I will never accept that a piece of software can understand what it’s like to grow up as a person. I want to talk about my issues with a flawed, fallible human, not a box plugged into the wall.

    You ask a valid question, just much earlier than necessary. I’d be surprised if AI was a viable alternative by the time you retire.