Is it just me, or should a person with the designation “Dr.” in front of her name know the difference between sex and gender? True, language changes all the time, and words gain new meanings. So, as American society becomes increasingly uptight about discussing sex (the act), sex (the anatomical distinction) has been replaced in our vernacular by a word that refers to the classic social roles assigned to each sex: gender. (It would seem we’re the new Victorians.)
This ambiguity isn’t the only issue I have with Dr.