In the News
Weigh in: Does Facebook have a responsibility to police hate speech?
Earlier this week, activists took to social media and online petition sites to fight hate speech--particularly the variety that arises on thematic Facebook pages, many of which promote violence against women. Yesterday, after several large advertisers refused to keep spending money on ads that might appear alongside controversial pages, Facebook vowed to review the way it discovers and handles hate speech. That got us wondering what you think about hate speech on Facebook and other social media sites.
What is hate speech? In the '90s, the National Telecommunications and Information Administration defined hate speech as "speech that advocates or encourages violent acts or crimes of hate" and "speech that creates a climate of hate or prejudice, which may in turn foster the commission of hate crimes." The Supreme Court has held that when it comes to hate speech, it's not so much what you say as how you say it--as in, "I don't like so-and-so" versus "so-and-so should be assaulted."
From offensive groups and pages (like the one titled "Violently raping your friend just for laughs") to discriminatory tweets, we've seen hate speech cropping up more and more frequently as social media grows increasingly popular--especially when users can hide their true identities behind anonymous usernames.
While we're all about free speech, we hate seeing these sorts of offensive words and actions anywhere, online or off. But should it be up to social media sites like Facebook and Twitter to find and ban hate speech? Tell us in the comments what YOU think the limits of free speech online should be, cuties.