Facebook has recently come under attack for failing to enforce its own guidelines on hate speech and violent imagery. Is it a website’s job to moderate the content its users post, or should users have complete freedom? Is there a happy medium? If so, how would you structure it?
There is no doubt that hate speech and violent imagery appear on Facebook; pages exist on the site that should not be there. Facebook has guidelines prohibiting hate speech and violent imagery, and it should police the pages on its website and pull down those that violate its policies. If a website has guidelines, as Facebook does, it is the moderator’s responsibility to monitor its users’ pages and to take down those who violate its policy. Facebook already removes content that violates copyright laws; it has the same obligation to remove offensive pages when it comes to hate speech and violent images and content.
Facebook obviously cannot monitor everything all the time. But if it receives a complaint about a particular page or group posting hate messages or violent content, it can investigate and, if the complaint proves true, ban or block that group. Such content is against Facebook’s own policy, and the company has an obligation to act for the protection of the rest of its users. It cannot let the freedom of a hate group override the protection of everyone else.
It is incumbent on Facebook to do spot checks of the pages of groups, organizations, and individuals, looking for hate speech and violent content. It is also necessary that Facebook take seriously any complaint from users about hate speech or violent imagery and address it immediately. It has an obligation to keep its users safe and to provide a safe environment for them. It can choose to issue a warning first and then monitor the abuser closely; if the abuser uses hate speech again, he or she should be banned. Pages that promote hate speech or violent images should not be allowed. Period. That is Facebook’s policy, and it is for the good of Facebook’s users. The good of the many outweighs the freedom of speech of the few, particularly when that speech is hate speech and is detrimental to the majority.
Facebook is a private organization with a policy limiting hate speech, and it can therefore ban those who engage in this type of speech. These people and groups have no right to use Facebook; they have violated its rules. Therefore they are out of the club, so to speak. It’s as if they hadn’t paid their dues. They don’t deserve to belong.
Facebook has a duty to its users to police its website and make sure it is safe. That includes monitoring it for hate speech and violent imagery.