I grew up eating red meat. It's one of those culturally programmed behaviors that I've never, ever questioned. But lately I've encountered more and more people in my life who have told me how much better they feel after becoming... dare I say the word... vegan.
So I just finished watching the movie "What the Health" and I'm starting to wonder, what if...?
Since I believe people on this site are generally progressive, I'm seeking some feedback. Have you tried it?