Just like Liberalism and Conservatism, Feminism seems to be often misunderstood. As with the former two, the stereotype associated with Feminism is that of the extremist; people whom I wouldn't even call Feminists. I'll explain why.
Feminism is a Liberal philosophy that fights for freedom and equality between the sexes. It is called Feminism because, for that equality to happen, women would need to gain rights and privileges to match those already held by men. This means that Feminism is not just a viewpoint for women to hold; men can be Feminists as well. It also means that anyone who actually fits the stereotype of a Feminist (a misandrist who wants to treat men as inferior) is not really a Feminist at all.
Believe it or not, we still have progress to make in the United States (striking, considering it is a country founded on the concept of freedom, yet inconsistencies persist), and most of these problems seem to exist in the workplace. The problems also seem to persist because our culture leans on nature over nurture far more heavily than the evidence on the subject warrants. That said, the situation here is not nearly as bad as in places like the Middle East and Africa, where women have little to no rights.
Also, even though I understand why it was named so, I don't like the term Feminism, and a change would be good nowadays. I'd prefer something gender-neutral, but I don't know what. If anybody has any great ideas, let me know (if you don't want to leave a comment, you can email me through my profile).