
If You Take Vitamins, Read This
A new study funded by the National Institutes of Health found no benefit from taking vitamin supplements and suggests we should get our vitamins the old-fashioned way: by eating healthy food.