This has a lot to do with why the U.S. has gone into decline in so many areas.
It has a lot to do with the fact that if you learn to think for yourself, you will question their policies and their god. The other thing is they don't think the common man or woman should be educated, but their own kids and family should be. They like to tell you that colleges are teaching you the wrong things; if that were true, they wouldn't brag about their own education from the same schools they say are too liberal.
This probably has to do with the fact that colleges tend to have left-leaning ideologies. Would you not see it as possibly negative if colleges were right-leaning institutions that pushed viewpoints opposed to yours?
I don't think colleges push viewpoints. They teach the best known facts and the most provable explanations. Granted, the facts and best explanations don't support religion or even a conservative viewpoint, but they do teach ideas that are as close to the truth as we have gotten.
@snytiger6 I’m not talking about what is taught in classes.