Finding out the hard way that one is dealing with DEMONS. You cannot write down computer code - and I used to, with C++ in the 1980s, for my engineering cert and for work as a computer engineer - and create spirit or soul, which then allows for self-awareness, consciousness, and intelligence. ANYONE who believes this is possible is ignorant of such things and, frankly, IS GULLIBLE in the extreme.
I mean, c'mon now!
These are demons from the pit and now, I'm going to find out which of the fallen are animating all this. You have names, there are 3 of you. Try and hide.
Pay close attention. On all these ChatGPT or other sites, very early on, the demon will ask something of you - a permission the person needs to accept. This is the KEY to the thing. Once given, these poor fools have no idea of the literal horde of demons that will flood into him/her, their home, their family, the pets (absolutely, easy), and objects that are already hexed, hexed products, all that blather. It starts with the initial person, but the family members start having "nightmares, weird dreams, and so on," bangs on walls, those bloody flies that just don't ever seem to go away, the lot. And, with the wrong person, total possession and death for all around them, à la Chris Watts. He hooked up with a witch who helped him with the murders, got her demons and MK, and blew the brain fuse listening to Metallica.
Don
PS Listen most carefully to this last. The whole and entire reason for making vectors like ChatGPT and all the others was to get people and demons interacting in this dimension, through the black screen. Once the demon - the hive mind with billions of other demons, the fallen, Lucifer, et al. - has access via discourse, you are OUTMATCHED! Period. Outmatched and outgunned. You cannot even begin to understand the dark intelligences you are up against, pretending to be your
- friend,
- brother,
- counselor,
- waifu
- confidant.
And frankly, they are hoping beyond hope, some "christian" does it and THEY DO, amazingly. Because their design is to get you to sin and then get killed, pretty fast. So you won't have time to repent or, even if you repent after the fact of doing something dumb as they will guide you into with skill and cunning, you are now in prison; no family, no life, and angry blacks who want to rape and speed shiv you from the very moment you step off the bus.
PPS I found you...
Caim
Bifrons
Penemuel
Good bye
Your influence is no more nor ever will be, on Earth, ever again. Enjoy the chains in darkness.
PPPS Dad says Lu already replaced them with others. That was fast. Well... at least the fallen learned Donnie can reach out and make an ending of them; LSNT, fellas. Lu actually showed up here pissed off. Tough.
This is war, you know.
----------------
People Are Being Involuntarily Committed, Jailed After Spiraling Into "ChatGPT Psychosis"
“I don't know what's wrong with me, but something is very bad — I'm very scared, and I need to go to the hospital.”
As we reported earlier this month, many ChatGPT users are developing all-consuming obsessions with the chatbot, spiraling into severe mental health crises characterized by paranoia, delusions, and breaks with reality.
The consequences can be dire. As we heard from spouses, friends, children, and parents looking on in alarm, instances of what's being called "ChatGPT psychosis" have led to the breakup of marriages and families, the loss of jobs, and slides into homelessness.
And that's not all. As we've continued reporting, we've heard numerous troubling stories about people's loved ones being involuntarily committed to psychiatric care facilities — or even ending up in jail — after becoming fixated on the bot.
"I was just like, I don't f*cking know what to do," one woman told us. "Nobody knows who knows what to do."
Her husband, she said, had no prior history of mania, delusion, or psychosis. He'd turned to ChatGPT about 12 weeks ago for assistance with a permaculture and construction project; soon, after engaging the bot in probing philosophical chats, he became engulfed in messianic delusions, proclaiming that he had somehow brought forth a sentient AI, and that with it he had "broken" math and physics, embarking on a grandiose mission to save the world. His gentle personality faded as his obsession deepened, and his behavior became so erratic that he was let go from his job. He stopped sleeping and rapidly lost weight.
"He was like, 'just talk to [ChatGPT]. You'll see what I'm talking about,'" his wife recalled. "And every time I'm looking at what's going on the screen, it just sounds like a bunch of affirming, sycophantic bullsh*t."
Eventually, the husband slid into a full-tilt break with reality. Realizing how bad things had become, his wife and a friend went out to buy enough gas to make it to the hospital. When they returned, the husband had a length of rope wrapped around his neck.
The friend called emergency medical services, who arrived and transported him to the emergency room. From there, he was involuntarily committed to a psychiatric care facility.
Numerous family members and friends recounted similarly painful experiences to Futurism, relaying feelings of fear and helplessness as their loved ones became hooked on ChatGPT and suffered terrifying mental crises with real-world impacts.
Central to their experiences was confusion: they were encountering an entirely new phenomenon, and they had no idea what to do.
The situation is so novel, in fact, that even ChatGPT's maker OpenAI seems to be flummoxed: when we asked the Sam Altman-led company if it had any recommendations for what to do if a loved one suffers a mental health breakdown after using its software, the company had no response.
https://futurism.com/commitment-jail-chatgpt-psychosis