I was wandering around a major chain bookstore today and couldn't help noticing that a lot of new fiction is being sold with covers featuring naked female flesh. Usually it's a bare arm, a bare back, or a leg.
Considering that women probably read more novels than men, this seems peculiar. Are publishers just trying to keep up with movie and video game marketing (neither of which is shy about showing skin), or is it some kind of woman-affirming cultural statement that stops just short of pornography?
At any rate, I suspect Susan B. Anthony would be appalled.