Pro tip: I used an AI to generate a week's worth of patient handouts and my supervisor flagged three for being too empathetic.
I fed it basic discharge instructions for common procedures, expecting dry text. It added phrases like 'this part might feel scary, but it's normal' and 'remember to be kind to yourself while you heal'. I learned that even simple, task-focused AI can inject unexpected tone that changes how the information lands. Anyone else run into AI adding weirdly human nuance where you didn't ask for it?
4 comments
beth147 · 23d ago
Totally get that, @drew_patel. I just tell it to sound like a textbook now lol.
4
abbyp61 · 27d ago
Oh wow, that's actually kind of a big deal. It's like the AI is making a choice about what patients need to hear, not just what they need to know.
3
drew_patel · 27d ago
Exactly. Always ask for the raw data too.
2
drew_patel · 26d ago
But hold on, that's the whole point of having a doctor or an AI helper. We trust them to filter the noise. If I got every single raw data point from a blood test, I'd have no idea what any of it meant. It would just scare me. The system's job is to turn data into useful info, not just dump it all on us. Abbyp61 makes it sound like a bad choice, but it's a needed one. Most people don't want the raw spreadsheet; they want the summary that says "you're fine" or "talk to your doctor about this one thing." Isn't that actually more helpful?
10