I really want this part of human culture that's terribly afraid of being honest about the body to go away. There seems to be plenty of evidence that it has caused far more harm than good. My mom never made any effort to teach me about sex, and even to this day she only makes passing comments about not getting any women pregnant. My father did better. He explained all the slang and code words and such, but in an honest way. It was always clear that sex was for making babies and that it felt good. That was the good part, but we never got into any specifics. Everything I learned about anatomy came from my own research on the internet and from biology texts in school. Speaking of which, because I went to Catholic school straight through high school, there was never any real sex ed.
There is a great PSA to be made along the lines of "talk to your children or the Internet will."
I went to a Catholic school in the UK. There was pretty much no sex education apart from two areas:

1. In science class, looking at male and female anatomy.
2. In R.E. class, being taught (in a semi-impartial manner) that all contraception aside from the rhythm method is considered sinful by the Catholic Church.
Point number 2 is the strangest part. They were basically encouraging a bunch of newly sexually active young men not to use condoms. It doesn't help that many of these young men came from poor urban environments.