Women take sex too seriously, IMO. You make life what it is. Adults should be able to fuck without getting feelings involved, and women should be able to say what they want. If the man doesn't want you anymore after you fuck, then why do you care? Why do you care if people label you a whore? Are we still in high school? No one can take your dignity and your self-respect. You lose it all on your own. I tell my man very clearly: I WANT YOU TO MAKE LOVE TO ME... OR FUCK ME... so simple.
Now if you feel funny about it, then don't do it. If you can't respect yourself afterward, then don't do it. But I've never felt disrespected because I had sex with someone. Sounds ridiculous to me. Society tells us we should be ashamed of having sex.

It's crazy.