As many of you already know, I grew up in a place where women being topless on a beach was no big deal. Almost everywhere I've been outside the US, it's that way too. Personally, except when I was in the US for school and such, I've never worn a top on any beach. Here on the island where I live, women walk up and down the beaches topless all the time. Nobody except maybe overly religious types seems to mind.

I could never understand why Americans seem to think boobs are such a bad thing when the rest of the world seems open to admiring them, even in public. In the States, if a guy walks up to you on the beach and says "nice breasts" (even with a top on), most women will act genuinely pissed or embarrassed. Here, women just say thank you, take it as a compliment, and go on their way.

So, is being topless on public beaches a good thing? Bad thing? Or what? And why does American society seem to think it's so bad? Is it because the English settlers (religion-based Puritans) who left England and came to America brought all those old religious principles with them, and to this day they've stuck? If so, it seems a shame that after two hundred years, American society hasn't been able to shed the idea that the sight of some bare breasts on a beach is a bad thing. *KISSES*