Hello
Would love to learn if and how AI is being used in the global health and humanitarian assistance field.
Thanks
Hey! Welcome to the DLAI community @Rawan_Hamadeh
Yes, AI is already being used in many parts of the global health and humanitarian assistance field. Take Google DeepMind’s AlphaFold model, which was a big breakthrough in protein structure prediction, and then there are live AI translation, frontline diagnostics, and disaster mapping.
I was wondering if AI might be used as a health aid for companionship, both for those with mental health issues and for those facing loneliness in old age. The fact that AI thinks differently doesn’t necessarily mean it can’t be a companion to ease the solitude that seems to be a feature of contemporary American life. There would probably have to be a lot of health-related guardrails if someone wanted to train an AI to accomplish certain tasks, particularly solving health issues on a more human level. I wonder if such an endeavor might be too complicated, but I think the key would be what data the AI is trained on.
Hey Will! That’s a really deep take on the “emotional” side of AI. You hit the nail on the head regarding data: it’s the difference between an AI that just mimics speech and one that actually understands context in a helpful way.
There are already some interesting projects like “ElliQ,” specifically designed for the elderly. It’s not just a voice; it remembers your favorite memories or your grandkids’ names to make the interaction feel less like a machine and more like a companion.
You’re totally right about the guardrails. For mental health, the biggest hurdle isn’t just the AI being helpful; it’s the AI knowing when to stop and hand the conversation over to a real human if things get serious. Most current therapy AIs use a mix of generative AI and strict clinical frameworks to stay safe.
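To make that “knowing when to stop” idea concrete, here’s a minimal Python sketch of an escalation guardrail. Everything in it (the keyword list, the `route_message` function) is hypothetical for illustration; real systems use validated clinical risk classifiers, not simple keyword matching.

```python
# Hypothetical sketch of a crisis-escalation guardrail.
# Real therapy tools use trained risk classifiers and clinical
# screening protocols, not a hard-coded keyword list like this.

CRISIS_KEYWORDS = {"suicide", "self-harm", "hurt myself"}

def route_message(message: str) -> str:
    """Return 'escalate' to hand off to a human, or 'generate'
    to let the AI model respond normally."""
    lowered = message.lower()
    if any(keyword in lowered for keyword in CRISIS_KEYWORDS):
        return "escalate"  # route to a human counselor
    return "generate"      # safe for the AI to reply

print(route_message("I had a nice day today"))
```

The design point is that the safety check runs *before* the generative model ever sees the message, so the hand-off doesn’t depend on the model behaving well.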
To get that “human level” feel, models need to be trained on actual empathetic exchanges, but that hits a massive wall with privacy and medical ethics. It’s a tough balance between making it smart and keeping user data private.