Fighting for feminism
The Merriam-Webster dictionary defines feminism as "the theory of the political, economic, and social equality of the sexes," which often entails "organized activity on behalf of women's rights and interests." Over the past decade, however, society and the media have greatly distorted feminism's connotation in the minds of many.