Still, many Americans—and their physicians—have come to think that every symptom, every hint of disease requires a drug, says Vinay Prasad, M.D., an assistant professor of medicine at Oregon Health & Science University. “The question is, where did people get that idea? They didn’t invent it,” he says. “They were spoon-fed that notion by the culture that we’re steeped in.”
It’s a culture, say the experts we consulted, encouraged by intense marketing from drug companies and by an increasingly harried healthcare system in which dashing off a prescription is the easiest way to address a patient’s concerns.