- This article is part of our coverage of Mental Health Awareness Week 2023
I went to ChatGPT to overcome grief in a hurry. I wasn’t trying to speed my grief, or rush my emotions. I just know that finding mental health care right now is time-consuming. If there was a process to begin, I wanted to begin. The first time I asked ChatGPT for help, I wasn’t ready to hear the answer; I don’t even remember what it said. The next time, I found an AI throwing up its hands and telling me I was going to need a real human to guide me through my very human problem.
This is a story about a dog, my dog Moose, and it doesn’t have a happy ending. I’m the type of person who uses the website DoesTheDogDie.com to find scenes I need to skip when I’m watching a movie at home. I cannot stand even fictional harm coming to animals.
If you need to bail on this story now, I understand, or you can skip this scene and I promise the rest won’t be too sad.
The sad part: losing my dog Moose
In January I was in Las Vegas for CES, and I left my two dogs, Moose and Beesly, with my son and his mom. She and I live apart, but I dogsit for her little dachshund, and she watches mine. On my second day away, she called to tell me that Moose had broken through her doggie door at night and gotten out of the house. Moose is deaf. No.
Moose was deaf. He was born deaf; I used sign language to talk to him. His name is Moose, but he only knew his name as a special hand sign I used for him. If you were a stranger and you used his hand sign, he knew right away that you were a friend.
Moose was hit by a car at night on a very dark road in suburban Connecticut. The driver took him to an animal hospital, but Moose didn’t make it. I don’t know much more than that. I never asked, and I don’t want to know. What plays in my imagination is terrible enough without factual support.
Moose was cremated and I received his remains in a pretty little sealed wooden box. I got a paw print in plaster as well. I haven’t looked at it since the day I got it, when I touched it to my face once and then put it high on a shelf. I’m sobbing just thinking about it. I cry every time I read these words again.
The sad part is over, here comes the grieving
Okay, that’s all of the explicitly sad stuff, so you can come back now if you skipped ahead. My tragedy happened in early January, and there hasn’t been a day since that I haven’t cried about it. I’m crying less today than in the days that followed, but I’ve never dealt with sadness like this.
My first step was talking to a mental health professional. I see a counselor for ADHD and other issues, and I asked her about my emotional situation and whether I should seek help for depression.
I explained my situation, my feelings, and how often I was feeling this way. She told me that I was likely feeling severe grief because my pain and sadness come in waves. I am able to function normally and feel happy when I’m not thinking of my loss. If this were depression, she said, the feelings would be more persistent. It would be less likely I’d find pleasure in life in the moments I wasn’t so acutely sad.
I agreed, because I have dealt with and managed depression in the past, under the care of a psychiatrist. This is grief. I still needed to take it seriously and get help; it was just a different type of help (and less likely that medication would be involved).
I know there are grief support groups. I watched “Dead to Me”. I know that I can seek professional counseling, and I’ve done so in the past.
I’m also very busy. There’s a pandemic, regardless of what politicians claim, and people are sick and filling up doctors’ offices. It’s never been so hard to get an appointment, especially as a new patient.
Plus, I’ve just moved to a new area and I’m still getting my bearings, so I’m not ready to attend local support groups. Frankly, I’m not sure when I could find the time.
I was not ready when I first went to ChatGPT for help
I turned to ChatGPT. I’d been looking for a reason to try ChatGPT, but as a professional writer I was offended by the idea I might use it to create written content. Instead I waited for a problem to solve, something more complicated than a Google search could answer.
I went to the OpenAI chatbot about my grief a couple of months ago, when ChatGPT first started to get hot, and I explained my problem and how I was responding. I’m not sure what it said. Just asking the question made me too sad to consider the response.
We talk about triggering and trigger warnings, and if you’ve never experienced an emotional trigger like this or known someone who has, this is how it works. When I thought about my dog, I broke down. I couldn’t hold back. If I heard his name, or saw a photo, or even saw a similar dog, I would lose it. If I saw a dog getting hurt on TV, I would need to leave the room. I wasn’t just upset. I had an uncontrollable, physical response.
It was a few months before I could reconsider and talk to the AI again. It was still a very emotional experience, but maybe there is some catharsis in it? Not the answers themselves. The answers were next to useless. But in facing this problem head-on and looking for an answer, at least I’m taking a first step.
Here’s the advice that ChatGPT (and Bard) gave on grief
I talked to ChatGPT and Bard for help. I won’t bother recounting Bard’s response; it was repetitive and generic, and it wouldn’t even answer some of my questions. It flatly refused, saying it was just “a text-based AI and can’t assist with that.” OK, I respect that. Better than bad advice.
ChatGPT, on the other hand, was willing to help, but it wasn’t truly helpful, not in a way that moved me closer to healing or feeling normal. It tended to repeat similar advice no matter what I asked, just worded differently.
To ChatGPT’s credit, the AI seemed to register the emotional level of my questions and respond with an increasingly concerned tone. My first question just said that my dog had passed away after an accident and asked for help. ChatGPT said it was “really sorry to hear about the loss of your dog.”
Another time, I said I was “still emotionally a wreck.” ChatGPT responded by saying it was “so sorry to hear that you’re still experiencing deep grief” over my loss. The first time was a more general response, but the second was directed at me personally. I appreciate the difference.
Each time I asked, ChatGPT gave me some version of the same five or six steps, depending on whether I mentioned feelings of “guilt” specifically. Those steps always included:
- Allow yourself to grieve
- Seek support from friends and family
- Process your guilt (when I mentioned feeling “guilty”)
- Remember your dog
- Take care of yourself
- Consider professional help
It’s interesting to note that ChatGPT mentioned seeking help and support twice, but did not immediately offer any specific support groups or services.
When I asked Bard the exact same questions, I got the same advice, except that on a few occasions, Bard provided a link to a source where it found its information. The links were … odd. When I asked about the difference between depression and grief, Bard used a scientific paper about smartphone addiction in children in Kolkata for its definition of depression.
An AI has not experienced loss, so it cannot help me
I should have known that an AI was not going to help me through my grieving process, not in any way that makes meaningful progress. An AI lacks the one thing that I need when I’m looking for support with my loss and my grief.
I need experience. I need to talk to someone who has been where I am and come through it. I need an actual human who is still living and still breathing to show me that living and breathing is going to remain possible. If they can do it, so can I.
I can’t get that from ChatGPT, and though I do think the AI could help more, I’m terrified of how it might try. My fear in approaching this project was that an AI would lie to me about losing its dog. I was worried that ChatGPT would fake a pet, and then let something terrible happen to its imagined animal. I’m grateful that AI is dumb or smart enough to avoid this scenario.
I still wish the AI could have done more. I don’t want insincerity or lies from a computer, but I could have used more than just advice. I wonder if there is some guided conversation that the computer could offer that would help me through my process, or give me new avenues of thought to consider.
ChatGPT can’t help my grief, but it should help me get help
The next step for me will be support, whether that’s a group I can join or a professional I can see one-on-one. The same old problems apply. I don’t have the time or the resources, and support doesn’t have appointments available.
So, the AI should help with that. It should find me the support I need and help me sign up. It should help me fill out all of the annoying paperwork. It should help me manage my schedule so that I can get the personal help I need.
I didn’t have time to find support for my grief, so I turned to ChatGPT for help. I had it backwards. The AI can’t help me solve my personal problems. Instead, I’d like the AI to help me with other parts of my life so that I have the time, resources, and energy I need to help myself.
I love you, Moose. I miss you.