This whole idea has made me wonder: what exactly makes a particular claim more feminine or masculine? Why must a feminine science place so much emphasis on nature? That framing rests on the assumption that women should care about nature, which seems awfully stereotypical to me, so why do some feminists actually encourage this viewpoint?
I think that existing gender roles have purely social causes and have almost nothing to do with the innate nature of one's gender. I think men and women would have largely the same interests if certain social stigmas didn't tell us "you're a woman, so you have to act this way," or "you're a man, so here is a list of things you are forbidden to like." The whole idea troubles me, even though it is so deeply ingrained in our society.
But then again, maybe I have a limited viewpoint, so I ask anyone reading: what do you think? Do you think men and women are innately drawn toward or away from certain interests? Or are these preferences shaped by social stigma? Do you have a different view entirely? Please let me know your opinions, because I would really enjoy reading them!
-Christopher Hoef