
The Horrifying Job of Facebook Content Moderators

world fears human fears


#11 Ghost in the Machine


Posted 05 February 2018 - 07:13 PM


Working for a corporation makes it easier for moderators to know what to flag. The company sets stringent guidelines all employees must adhere to, and training on social do's and don'ts is constantly hammered into the brain, with the politics from on high always looking over your shoulder.
Independent site owners have more leeway in their approach. Working within a specific subject or genre keeps the content growing in variety. I think independent forums have an opportunity to push themselves forward by showing personality and character in their presentation. They're like Facebook groups, only with more freedom to set their own rules of conduct, and some of the best ones have great personality. Independents who can gain a decent audience with great content, along with members who create entertaining chemistry, will stand out. Being educational and interesting at the same time helps to elevate the audience and hold their attention.



Easier said than done, especially with the changing atmosphere from above concerning cyberspace censorship. ISPs themselves could limit what you can see in the future. They could even decide your website is worthless for any reason and limit or erase all your ad revenue, possibly even 'sending' your traffic into that phantom zone of accounting they so love and admire.


Finding contrary information on just about any subject may become a persistent problem as well. Some countries already limit what their citizens can see and hear...



#12 status - Guest


Posted Yesterday, 04:58 PM


Granted, it's a given that those who post such heinous trash should be punished, that everything they post should be deleted, and that moderators should be compensated for the negative effects far better than they are. But moderating a social network isn't just about cleaning up the trash. Moderators also have to deal with strong opinions across different ideologies and how they clash with one another. They have to deal with trolls of every kind. These trolls cast nets filled with baited language to hook people into an emotional connection to an issue: inciting content designed to inflame others into verbally lashing out at anyone who disagrees. Most of it is sold to people like a product on the shelf, packaged in bright colors with big bold copy and headlines to catch the eye, always hiding the small print on the back of the box, blurring the facts in type so small it's impossible to read without a magnifying glass.
On smaller forums, it's possible for owners and administrators to set their own policies on the censoring issue. Most of these smaller social networks focus on a specific range of topics. They can break up into different cliques, but for the most part they're filled with an interesting array of ideas on any topic of interest, and personalities tend to emerge in these little discussions. Moderators in this realm mainly have to deal with spammers, drunks, and trolls, usually adhering to whatever policy of behavior the owner sets forth.
Larger social sites, political forums, and religious sites tend to attract their fair share of paid trolls. Facebook and the Russian agenda during the last election is a good example. Whether true or not, it's an effective method for spreading ideas into the public ether. Spreading malicious content on a grand scale takes a lot of resources and planning. It's marketed like any product, to get people to buy into an agenda without really looking into the details of the story. Sometimes going along with the flow can be dangerous; it can distract from more important issues facing all of society.




Does Facebook delete all of its objectionable content, or does it store it away in some forbidden data box? What about smaller forums and their content? Backups of all postings are gathered one way or another; if you don't make your own, others will make them for themselves.

Deleting data from a server (or storing it someplace else, like a hard drive) is a cost-cutting measure.

As for seeing the objectionable content described in the article above: I agree that those who sift through the muck should be compensated justly, not only monetarily but for their health and service. A job of that nature is more than a public service; it also serves a higher, moral one. Offering one's services to clean the sewers of the social mind is commendable, and underappreciated.


But the sewers aren't the only places where admins and mods run into problems. Moderating drama and ideologies gets really heated. Name-calling and argument-shifting are prevalent. Professional argumentation markets issues to the forefront of the public mind, making it difficult to see which should take precedence over another. Many issues are designed to inflict damage on one group or another. These 'top stories' cause much angst on the social forums, and emotional outbursts are common. How should a moderator proceed? Which comments should be deleted or flagged?


Those are questions one learns along the way...


...along with many others...


Do you think admins and mods could be used as psychological study groups as a whole? Divided into groups like politics, conspiracy, etc., each with its own sub-groups of study, making it easier to figure out which forums are easier to trigger into certain kinds of conflict, then sending in the trolls and bots to do what their programming dictates.

