You can, but in my experience it is resistant to custom instructions.
I spent an evening messing around with ChatGPT once, and fairly early on I gave it special instructions via the options menu to stop being sycophantic, among other things. It ignored those instructions for the next dozen or so prompts, even though I followed up every response with a reminder. It finally came around after a few more prompts, by which point I was bored of it, and feeling a bit guilty over the acres of rainforest I had already burned down.
I don’t discount user error on my part, particularly that I may have asked too much at once, since I wanted it to dramatically alter its output with so many customizations. But it’s still a computer, and I don’t think it was unreasonable to expect it to follow instructions the first time. Isn’t that what computers are supposed to be known for, unfailingly following instructions?
All my grandparents save one have passed on, and I have no actual relationship with the one who is still living. So I’m not terribly concerned about her, beyond how much I’d care for any other random senior citizen.
To answer the spirit of the question, though, we can talk about my parents, who are grandparents now.
Both are educated and about as tech proficient as I am. Even so, my mom nearly got caught in a gift card scam a few years ago, where someone was posing as one of her friends. She had even bought the cards, but insisted on handing them over in person, which exposed the scam.
Because of that, I think my parents are actually pretty safe, as they are now extra vigilant about the messages they receive, and know to follow up anything suspicious using an alternate communication method.
I know “once bitten, twice shy” isn’t the best defence, but alternate communication methods are. Stress to your loved ones that if they ever receive a message from someone asking for money, they should follow up using a different medium.