Western Medicine Dismisses Nature to Its Own Detriment

Count me among a growing number of Americans skeptical of Western medicine. After decades of interacting with the American healthcare system, I am now more receptive to alternative treatments, plant-based medicines, and Eastern practices. Western medicine has largely let me down, primarily because it is so dismissive of nature.

I don’t go so far as to worship Mother Earth as some sort of deity who controls my physical health. I do not look at the planet and everything on it as superior to humanity. But I also don’t see nature as a utility that exists solely to support human beings. Nature is far more complex than either of those simplistic views.

Humanity Is Part of Nature

One of Western medicine’s biggest failures is separating humanity from nature. That is a mistake. Human beings may live in brick-and-mortar homes and drive cars powered by internal combustion engines; we may have an affinity for technology and all things digital. But at our core, we are part of nature. We are as much a part of the natural world as any plant or animal.

Stepping back and looking at how the rest of nature deals with its health reveals something as simple as it is startling: the natural world maintains good health by relying on nature.

Western medicine is more likely to rely on lab-manufactured pharmaceuticals, medical devices, invasive surgical procedures, and trendy health fads to maintain good health. Meanwhile, the rest of the natural world simply takes advantage of what nature provides.

Medicine Before There Was a West

What we now consider the Western world was not always as dominant as it is today. Long before there was such a thing as Western medicine, ancient cultures practiced their own versions of healthcare. We now refer to what they did as Eastern medicine. Whatever term we choose, the key point is that Eastern medicine is more closely aligned with the natural world.

Today we talk about things like herbal and plant-based medicine. Organizations like Utah’s KindlyMD assist patients in obtaining state-issued medical cards so that they can access heavily regulated plant-based medicines. Yet neither herbal nor plant-based medicine is new. Both are ancient practices.

Herbal medicine takes a broad approach, treating disease and injury with whole plants or plant parts. Plant-based medicine is a narrower take on the same idea, seeking to isolate specific plant compounds for medicinal purposes. Humans have been practicing both since the beginning of recorded history.

Treatments That Don’t Involve Medicine

It is tough to convince Western practitioners to accept herbs and plants as medicines. It’s even tougher to convince them to try therapies that don’t involve medicine of any type. Western medicine is all about drugs, medical devices, and surgical procedures. It is quick to dismiss things like yoga and acupuncture.

It is strange to me that a doctor would be okay with physical therapy but not yoga. It is equally strange that an insurance carrier would cover massage therapy but not acupuncture. Just because a therapy has roots in ancient medicine doesn’t make that therapy useless or unworthy of consideration. And yet, that seems to be the attitude of Western medicine more often than not.

I admit to growing more skeptical of Western medicine as I get older. No, it is not all bad. Western medicine has accomplished some amazing things. But it isn’t the be-all and end-all of good health. There are plenty of very good treatments and therapies well outside of Western medicine’s scope.
