Tukka wrote: Coming in a bit late on the discussion, but I agree with a lot of the sentiments that have been expressed already. I think that in schools, sex education should just be treated as a subset of biology and physical education.
Teach the facts. Describe to students what sexual intercourse is and what the possible and probable consequences of it are. Explain how pregnancy occurs and how and why sex (and other sexual activity) is an effective vector for the spread of a variety of diseases; teach the anatomy of the sexual organs, what orgasm is (in both sexes) and how and why it occurs, and what sex does to your body (release of endorphins, etc.).
Such a thorough education might get rather explicit at times, but that's the whole point. You can't teach something well by beating around the bush (no pun intended). You have to want to teach the subject, have no qualms about answering any pertinent questions your students might ask, and engage in such a way that they don't feel ashamed about asking whatever questions they might have.
However, I think it is pretty clear that those who are bent on teaching "abstinence only" are not interested in educating students so much as they are interested in pushing a specific moral (not educational) agenda.
I don't think abstinence-only programs are really there to prevent teen pregnancies or the spread of STDs. No amount of scientific evidence one way or the other is going to make the people who push abstinence-only education the hardest change their tune, because pregnancies and STDs aren't their primary concern.
Their primary concern is keeping young people, especially women, ignorant about their options when it comes to contraception and abortion. It's a cultural thing, a taboo. It's to keep people (again, especially women) as ignorant and ashamed about their bodies as possible.
Hopefully the results of this study will precipitate a change in education policy around the country toward a fuller, broader treatment of a subject that all of us have to deal with at some point in our lives, but for the reasons I've mentioned, I doubt that'll happen.
Is this all opinion or do you have some support for these views? Especially the part where people are trying to "keep women as ignorant and ashamed about their bodies as possible." I'd like to know who's pushing that agenda.