Originally posted by JaTo: the fact that most Western religions have excluded any guidance or education about sex for so long due to the "uncomfortable" nature of the topic is partly to blame for at least some of the stigma that Christians usually receive in today's society.
Absolutely. The Bible contains depictions of sexuality that would probably make some of the ultra-conservative religious types cringe if they were to read them. Interesting how mainstream Christianity doesn't discuss them either ...