I live in Cali and have been hearing some people speak badly about the state. It doesn't bother me much because, frankly, I'd much rather live up north anyway, but I'm curious: has there been anything about California in the news that strikes you the wrong way? Or, if you've visited, did you like or dislike your visit? Did you once live here and now live somewhere else? What are your thoughts and opinions?