darkaniken2
Guest
I don't know about elsewhere in the world, but our school never even taught us about condoms and the pill and whatnot. They had a kinda-sorta sex-ed class, but it's been so long I can't remember what it taught exactly. So obviously it didn't leave a very good impression.
As far as media goes, I'm inclined to agree. However, I'd have to say that sex is not only promoted and advertised, it's actually encouraged. It's become the social norm. I'm the odd teen out at the table when talking about sex, as I'm one of the few who hasn't "done it". I don't know if it's American, Western, or global society that's causing it, but sex has become something that has no meaning, because everyone does it. It's become like breathing, or eating. It just is.