The question is: “Do you really need those vitamin supplements you are taking?”
I am certainly no expert on this topic, but I do have a strong opinion based on quite a few articles I’ve read over the years.
Logic tells us it is certainly in our best interest to acquire our daily nutritional requirements naturally through the food we eat, especially vegetables.
Supplements are big business: sales will likely top $20 billion this year in the U.S. alone.
Many individuals are wasting their money… and, worse, harming their health.
The marketing for these pills implies that they are medical treatments… but they aren’t regulated like medicines. No one is checking the health claims, many of which are pure marketing fiction. Worse still, some supplements contain ingredients that differ from what’s on the label.
In most cases, you’re far better off dumping your supplements and meeting your nutritional needs with healthy, whole foods – starting with vegetables.
Many vegetables are packed with phytochemicals – molecules that can help protect our bodies and preserve our health. Whole foods are the best way to get what you need… a supplement just can’t replace the nutrients in real food.
Much of the recent research indicates that eating a variety of fruits and vegetables provides the most benefit, so skip those pills.
Have a discussion with your doctor, and by all means have your blood work done annually to check your vitamin levels.