Women Should Center Men In Order To Heal. Yeah, I Said It.
It sucks that women would rather disconnect from their own feminine energy than engage with it in ways that don’t result in quick fixes, social approval, or positive validation from the opposite sex.
As for centering men: no matter your sexual orientation or identity, you cannot escape the fact that men have been presented to us as a pivotal part of our lives and of our development as feminine beings. Men have been enormously influential, in both positive and negative ways, in shaping how we move through the world as women.
This is why I believe that stepping into one’s authenticity as a woman involves a deep and thorough analysis of men through both a positive and a negative lens, not merely accepting the narrative that you are childish unless you refrain from sharing any thoughts or opinions that paint the opposite sex in a negative light, all because you are scared they won’t want to marry you and give you babieeesssss.
Women who pretend to care about each other as multifaceted people, not merely as future incubators and housewives, enjoy policing the voices of women who are sincerely trying to make sense of the heavy conditioning to love and serve the same group of people who have negatively impacted their lives since birth, all while allowing men with podcasts to speak their minds freely…