I really want this part of human culture that's terribly afraid of being honest about the body to go away. There seems to be plenty of evidence that it has caused far more harm and difficulty than good. My mom never made any effort to teach me about sex, and even to this day she only makes passing comments about not getting any women pregnant. My father did better. He explained all the slang and code words and such, but in an honest way. It was always clear that sex was for making babies and that it felt good. That was the good part, but we never got into any specifics. Everything I learned about anatomy basically came from my own research on the internet and biology texts in school. Speaking of which, because I went to Catholic school straight through high school, there was never any real sex ed.