The other day, Blogger Buddy Steve (All Natural and More) included the following original Dear Abby news clipping in one of his posts.
I wonder when, and why, our society went from accepting guys being naked together to the attitudes we have now. As Abby pointed out, guys were - and still should be - "free to relax and enjoy the sun and water as nature intended": naked.
Boners or not.
Naked and unashamed. Just as Abby stated, it used to be the norm.
When did guys being naked together, once thought to be so natural and normal, turn into something lewd?
Can someone, anyone, explain this to me?