Telling women not to do things because "it's not safe" is a big pet peeve of mine. I am aware that there are real dangers out in the world for men AND women but this attitude smacks of benevolent paternalism.
I spent most of my high school career slipping out of my window and wandering my hometown (a midsize town in Florida) armed with nothing more than a sketchbook and a handful of pencils. The scariest thing that ever happened to me on these jaunts was being chased by a flying cockroach.
I have walked, driven and biked through even the "scariest" parts of town alone, and common sense and a little awareness were all it ever took to keep me safe.
I HATE that women are told not to go out on their own, not to go certain places or be about at certain times. Comparing my experiences with all the warnings I've heard over the years has convinced me that it's just one more form of control men exert over women.
I know SO many women who say things like "I'd love to do 'blank' but I'm scared to do it by myself". Why do they think this way? Because they've heard this alarmist trope their entire lives: to go out alone, or after dark, or to the wrong places is to invite rape or violence. It fosters this horrible idea about the nature of most people, and it clips women's wings and their ability to feel like autonomous adults.
Please, for Pete's sake, stop telling women they can't do things simply because it's "dangerous" for them as women. If you wouldn't caution your son against the same behaviour, don't lay that weight on your daughters. Can we please let women be people and stop acting like they're children or invalids simply because they're female?