Women don't like their own instincts theory
Women don't like their own instincts theory is the theory that all women have a hardwired, immutable, biological instinct to reward men who do nice things for them with sexual favors, and that most women hate this about themselves. According to the theory, most women therefore attempt to earn as much money and acquire as much status as possible so that they can turn down favors and nice things from men they are not physically attracted to.