Doctors routinely prescribe "placebos".
I have really mixed feelings about this. If it gives me relief, why should I care whether it's a placebo or not? (Be sure to read their definition of placebo.) But I'm also a big fan of informed patients, and if you tell them "Hey, this medicine really does nothing for your body," then you negate the placebo effect.
What do you guys think?