• PolarPerspective
            1 year ago

            You say that, but I think bikinis are objectifying af. I’m perfectly happy for women to dress however they want. But from the perspective of a man, almost-naked women are the option that benefits me more than it benefits them.

            Western culture seems to push women into degrading positions through social pressure rather than legal means. Just look at how common skin-tight leggings and sports bras are as gym attire.

            Is this actually benefiting women, or is it just another way to take advantage of them?