Until recently I was taking separate vitamin and mineral supplements, but one of my colleagues has advised me to switch to whole food vitamin supplements, claiming they are more beneficial and nutritious than vitamin and mineral supplements taken together. Before making the switch, I would like to hear from a health expert about the health benefits of whole food vitamin supplements compared with conventional vitamin and mineral supplements. I would be really thankful for a prompt and well-informed answer to my query!