Thanks to social, political, economic and entertainment* trends, Western women now feel a greater level of independence than ever before.
For possibly the first time in civilisation, large numbers of women no longer believe they need men in order to enjoy a full and happy life.
Because men have an inherent need to maintain their gender position within society, they now place far greater importance on their physical, mental and aesthetic 'offering'. They feel this is what is needed to stand out from the crowd, attract a partner and fulfil what they believe is one of their gender's fundamental 'roles'.
With the attitudes and roles of men and women slowly but surely merging, could this be best described as 'Gender Blending'?
* This could be anything from the Spice Girls right through to Sex and the City.
