Using AI to 'Identify People at Risk of Mental Illness'
Post by leilani on May 8, 2023 0:17:56 GMT -5
The question is, who exactly will determine who is and who isn't mentally ill? And who will define mental illness?
Techno-Hell: Using AI to 'Identify People at Risk of Mental Illness'?
BY BEN BARTEE 7:01 PM ON MAY 07, 2023
The technocrats are now seeding the concept of using AI in the application of mental healthcare, starting with the most extreme cases of potentially suicidal youth and then, once the practice is established, widening it out to more routine contexts.
Everyone, or nearly everyone, is moved by adolescent suicide and would like to stop it. So that’s the case study they will start with to legitimize using AI to screen for mental health conditions. They often use this kind of human shield tactic in their social engineering narratives; once you recognize it, it becomes a given that they’ll build the story around suicidal youth (probably transgender).
Every narrative you see in corporate media is predicated on emotional manipulation.
Note how uncritically receptive the news actors are to the proposition that AI should effectively replace humans in the detection and subsequent treatment of mental health conditions.
Not a single concern do they articulate — either because they are too stupid to think of any or because they are simply conditioned to accept whatever “experts” say as gospel.
My reflexive reaction is that one of the main (but not only) reasons for the population-level mental health decline in the West is the mechanization/automation of everything that removes the element of human interaction from the equation.
Further automating the treatment for the mental health problems caused in large part by the depersonalization of everyday life is probably counterproductive.
But what do I know? I’m not one of the experts we are all trained to outsource our intellect to.
link
Things are going to get dystopian real quick.
I’m going on the record with a prediction right now: at some point in the not-so-distant future, perhaps in the next edition of the Diagnostic and Statistical Manual, right-wing ideology — or any sort of heterodox political thought — will be added as a mental health condition to be remedied with pharmaceutical intervention like lithium or anti-psychotic drugs traditionally used to treat schizophrenia.
Two birds, one stone: neutralize your political opponents with chemical lobotomies and increase pharmaceutical profits at the same time!
The new mental health condition will probably be given a name along the lines of Oppositional Defiant Disorder, meant to convey the idea that there is a legitimate, benevolent authority the individual refuses to submit to, and that he therefore endangers society and must be medicated and/or removed for the greater good.
In this scenario, speculatively, AI could be used to comb through social media posts, emails, etc. on the prowl for red flag phrases like “Jesus” or “Trump” or whatever in order to industrialize/streamline the diagnosis and “treatment” process.