15 Oct If food is medicine, why isn’t it taught at medical schools?
Culturally and politically, we increasingly acknowledge that what we eat plays a major role in our health, which makes it especially strange that healthcare providers are taught so little about it.