How did the misconception that Western medicine is the ONLY correct form of healing become so popular?
Why is it considered stupid if a patient prefers meditation to surgery?
Why do people think it is better to opt for the most expensive, extensive, and invasive form of treatment?
I don’t think that Western medicine is defunct. I just don’t agree with doctors who think that they’re the only ones with the right answers.
Because, actually, doctors don’t have all the answers. No one does. No single method or school can, on its own, fix everything in the human body.
Remember Abraham’s conversation with the Lord?
“Lord, where does illness come from?”
He said, “From Me.”
“And where does wellness come from?”
“Yeah, I send that too.”
Abe was confused. “Then why do we seek health from doctors and shamans and astrologers?”
The Lord replied, “Because, like all things worldly, health needs vessels.”