Every passing year in the Western world creates higher stakes and a more extreme environment for the average man. The younger generation of men and boys is getting pummelled by misandrist propaganda on TV and social media. It truly is a dystopian nightmare unfolding before our eyes in real time.
What does this mean for men and society at large?
Ultimately, all of these shenanigans in the West are creating a vortex of doom for everyone. No society in the history of our species has survived WITHOUT masculine men. If you are red-pilled to any degree, you can already see the deterioration all around you. Some of you with more wealth may be insulated from the social decay, but for the average man on the street with half a brain, it's now pretty obvious. The thing that shocks me most is the speed of the decline. This only reinforces how important men are to a stable, healthily functioning society.