Ladies, does it bother you when someone refers to women as females?
I can understand using the word as an adjective (e.g., a female doctor), but why as a noun? It has always sort of bothered me, and I've never been able to put my finger on exactly why. When I was a teacher, the biology teacher would always tell our students, "Animals are females; humans are women." Thoughts, anyone?