AI Mental Health Question

2 points | by diogenix 9 hours ago

4 comments

  • p_ing 9 hours ago

    How will your tool distinguish clinical depression from other potentially more serious conditions such as bipolar disorder or cyclothymia? Or even major depression from 'simple' depression?

    What are you going to do from there? States (generally) require therapists to be licensed. Only doctors can prescribe medication.

    Just having an app track depression seems useless. "You're still depressed, Bob". Thanks, f-ing LLM.

    • diogenix 8 hours ago

      Great feedback, thanks. On day 1, it won't track anything else; presumably it could tackle a litany later. It won't practice medicine (not diagnosing, not informing treatment, not prescribing - just a tool to help). And yes, "you're still depressed" may be what it does. But it does so objectively, and on-demand. Within the world of mental health, objective data is rare.

      As an example of possible utility: in a former role, I worked at a company that designed/manufactured medical devices that treated mental health disorders. Many patients understandably doubted their utility and might discontinue use in the middle of a series of treatments. Many begged providers for data showing it was working, to justify ongoing use. Providers, however, may be perceived as having a conflict of interest in their assessments ("Sure it's working, just keep paying me to continue to treat you"). Having an objective, third-party data stream would have helped in that particular scenario.

      Regardless, more generally, I suspect ongoing objective data may be of interest to some. Not all, but (particularly if it became very accurate, and very passive) perhaps a large number?

      • p_ing 8 hours ago

        I don't understand the utility. If I'm depressed, I can confirm that with a therapist or psychiatrist, and I can simply feel it -- I don't need a Debbie Downer Validator.

        I agree that with medication/devices, the efficacy for the individual is often unknown until you try it. It's the curse of not understanding exactly how these things work, simply that (at scale) they do.

        As for someone entrapping you to keep treating you, that presents serious ethical red flags that can get someone's license yanked (in the US). Such practitioners should be few and far between, especially given how little insurance reimburses them.

        Can you describe the goal of this app in a single sentence? What does it do for the individual/how does it make the individual 'better'? And could this individual get this sort of feedback on their own (anyone can take PHQ-9 at any time they wish)?
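        The point that anyone can self-administer the PHQ-9 is concrete: scoring the instrument is just a sum of nine items rated 0-3, mapped to standard severity bands. A minimal sketch (function names are illustrative, not from any actual app):

```python
def phq9_score(answers):
    """Sum the nine PHQ-9 item responses (each 0-3) into a 0-27 total."""
    if len(answers) != 9 or any(a not in (0, 1, 2, 3) for a in answers):
        raise ValueError("PHQ-9 requires nine answers, each scored 0-3")
    return sum(answers)

def phq9_severity(total):
    """Map a total score to the instrument's published severity bands:
    0-4 minimal, 5-9 mild, 10-14 moderate, 15-19 moderately severe,
    20-27 severe."""
    for cutoff, label in ((5, "minimal"), (10, "mild"), (15, "moderate"),
                          (20, "moderately severe")):
        if total < cutoff:
            return label
    return "severe"

# Example: a respondent answering "1" on every item scores 9 ("mild").
print(phq9_severity(phq9_score([1] * 9)))
```

        Which is the crux of the question above: the scoring is mechanical and free, so the app's value has to come from something beyond computing this number.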

        • diogenix 8 hours ago

          Thanks.

          For those who trust their ability to assess their own depression, it offers minimal value. Sounds like you're firmly in that camp.

          That said, many don't trust their ability to assess their own depression. They may trust their therapists more, but they have to pay for access to them, and my expectation is that this will be more accurate than they are regardless.

          And, if the system is set up to be a passive instrument, it would allow you to monitor evolution of your mental health condition over time. For many people in the course of a treatment, this data could be useful.

          Simple description of value to individual: "Simple, digital, free, objective ability to obtain a depression score that doesn't rely on your own subjective self-assessment."

          And I agree wholeheartedly that there are a ton of ethical questions around many forms of medical treatment and conflicts of interest. The ones that personally scare me the most are those that rely on subjective data.