
The Horrifying Job of Facebook Content Moderators

world fears human fears

16 replies to this topic

#11 Ghost in the Machine

Premium Member • 263 posts

Posted 05 February 2018 - 07:13 PM

 

Working for a corporation makes it easier for moderators to know what to flag. The company sets stringent guidelines all employees must adhere to, and social do's and don'ts are constantly hammered into the brain through training. The politics from on high are always looking over your shoulder.
 
Independent site owners have more leeway in their approach. Working on specific subject matter or genres keeps things growing in variety. I think independent forums have an opportunity to push themselves forward by showing personality and character in their presentation. They're like Facebook groups, only with more freedom to set their own rules of conduct. Some of the best ones have great personality. Independents who can build a decent audience with great content, along with members who create entertaining chemistry, will stand out. Being educational and interesting at the same time helps to elevate the audience and hold its attention.

 

 

Easier said than done, especially with the changing atmosphere from above concerning cyberspace censorship. ISPs themselves could limit what you can see in the future. They could even decide your website is worthless for any reason and limit or erase all your ad revenue, possibly even 'sending' your traffic into the phantom zone of accounting they so love and admire.

 

Finding contrary information on just about any subject may become a widespread problem as well. Some countries already limit what their citizens can see and hear...






#12 status - Guest


Posted 17 February 2018 - 04:58 PM

 

Granted, it's a given that those who post such heinous trash should be punished, that everything they post should be deleted, and that moderators should be compensated for the negative effects better than they currently are. But moderating a social network isn't just about cleaning up the trash. Moderators also have to deal with strong opinions within different ideologies and how they clash with one another. They have to deal with trolls of every genre. These trolls cast nets filled with baited language, getting people to form an emotional connection to an issue; inciting content is designed to inflame people into verbally lashing out at anyone who disagrees with an opinion. Most of it is sold like a product on the shelf, packaged with bright colors and big bold headlines to catch the eye, always hiding the small print on the back of the box, blurring the facts with type so small it's impossible to read without a magnifying glass.
 
On smaller forums it's possible for owners and administrators to set their own policies on censorship. Most of these smaller social networks focus on a specific range of topics. They can break up into cliques, but for the most part they're filled with an interesting array of ideas on any topic of interest, and personalities tend to emerge in the little discussions. Moderators in this realm mainly have to deal with spammers, drunks, and trolls, usually adhering to whatever policy of behavior the owner sets forth.
 
Larger social sites, political forums, and religious sites tend to attract their fair share of paid trolls. Facebook and the Russian agenda during the last election is a good example. Whether true or not, it's a great method for spreading ideas out into the public ether. Spreading malicious content on a grand scale takes a lot of resources and planning. It's marketed like any product, to get people to buy into an agenda without really looking into the details of the story. Sometimes going along with the flow can be dangerous; it can distract from more important issues facing all of society.

 

 

 

Does Facebook delete all their objectionable content, or do they store it away in the forbidden data box? What about smaller forums and their content? Backups of all postings are gathered one way or another; if you don't make your own, others will make them for themselves.

Deleting data from a server (or storing it somewhere else, like a hard drive) is a cost-cutting measure.

As for seeing the objectionable content described in the above article? I agree that those who sift through the muck should be compensated justly, not only monetarily but for their health and service. A job of that nature is more than a public service; it also serves a higher, moral one. Offering one's services to clean the sewers of the social mind is commendable, and underappreciated.

 

But the sewers aren't the only places where admins and mods run into problems. Moderating drama and ideologies gets really heated; name-calling and argument-shifting are prevalent. Professional argumentation skills that market issues to the forefront of the public mind make it difficult to see which issues should take precedence. Many issues are designed to inflict damage on one group or another, and these 'top stories' cause much angst on social forums. Emotional outbursts are common. How should a moderator proceed? Which comments should be deleted or flagged?

 

Those are questions one learns along the way...

 

...along with many others...

 

Do you think admins and mods could be used as psychological study groups as a whole? Divided into groups like politics, conspiracy, etc., each with its own sub-groups of study, making it easier to figure out which forums are easier to trigger into certain kinds of conflict, then sending in the trolls and bots to do what their programming dictates.



#13 status - Guest


Posted 21 March 2018 - 09:02 AM

Google to hire thousands of moderators after outcry over YouTube abuse videos

The company, which owns YouTube, has endured a stream of negative press over violent and offensive content

YouTube is continuing to develop advanced machine-learning technology to automatically flag problematic content for removal. The company said its new efforts to protect children from dangerous and abusive content and block hate speech on the site were modeled after the company’s ongoing work to fight violent extremist content.

YouTube said machine learning was helping its human moderators remove nearly five times as many videos as before, and that 98% of videos removed for violent extremism are now flagged by algorithms. YouTube CEO Susan Wojcicki claimed that advances in the technology allowed the site to take down nearly 70% of violent extremist content within eight hours of upload.

The statement also said YouTube was reforming its advertising policies, saying it would apply stricter criteria, conduct more manual curation and expand its team of ad reviewers. Last month, a number of high-profile brands suspended YouTube and Google advertising after reports revealed that they were placed alongside videos filled with exploitative and sexually explicit comments about children.

https://www.theguard...ld-abuse-videos
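The pipeline the article describes — algorithms flag candidate videos, human moderators confirm the removals — can be sketched as a toy triage function. The risky-word scoring and the threshold below are illustrative assumptions only; real systems like YouTube's use trained classifiers, not word lists.

```python
# Toy sketch of automated flagging gating human review.
# FLAG_THRESHOLD and the word-ratio score are made-up illustrations,
# not how YouTube's actual system works.

FLAG_THRESHOLD = 0.7

def score_content(text, risky_terms):
    """Toy risk score: fraction of words that match a risky-term set."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in risky_terms)
    return hits / len(words)

def triage(posts, risky_terms, threshold=FLAG_THRESHOLD):
    """Split posts into auto-flagged (queued for human moderators) and passed."""
    flagged, passed = [], []
    for post in posts:
        if score_content(post, risky_terms) >= threshold:
            flagged.append(post)
        else:
            passed.append(post)
    return flagged, passed
```

The point of the split is the one the article makes: the algorithm only narrows the pile; a human still has to look at everything in the flagged queue before removal.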



#14 status - Guest


Posted 21 March 2018 - 09:10 AM

A Visit to Facebook's Recently Opened Center for Deleting Content
 
Our tour of one of Germany's new content moderation centers gave us a look at Facebook’s content moderation—and what it means for the people who have to enforce its deletion rules.
 
Facebook didn't allow us to talk to the employees or look at the details on their computer screens during the 30-minute visit. While journalists were there, employees didn't actually review any user-created content. According to a Facebook spokesperson, this was due to security reasons.
 
 
Here’s What Content Facebook Will Restrict With Its New Community Guidelines
 
CGI nudity, support for terrorist activity, and revenge porn are all banned.
 
Facebook just updated its community guidelines to clamp down on misconduct on the site. That means scrolling through your news feed will be that much less interesting, but probably a little safer; it's a mixed bag of rules likely to benefit users and restrictions that could ultimately censor them.
 
Revenge Porn Is Finally Banned
Supporting "Dangerous Organizations" Is Not Allowed
No Butts, Breasts, or CGI Nudity
Graphic Images Shared for Your "Sadistic Pleasure" Will Be Removed
You Can't Publicly Shame People
Governments Are Asking Facebook to Restrict More Content
 
Will the new community guidelines, in concert with increased requests for content restrictions on the part of governments, result in more posts, pages, and comments being removed from the site? We'll just have to wait and see. If you don't want to bother with the whole mess, you could always just use another site that won't place limits on what you have to say—but will you, really?
 
 


#15 status - Lead Story


Posted 21 March 2018 - 06:41 PM

The ultimate machine for targeting leads. What else does a good salesman need?
 
The Power of Spin!
 


#16 status - Primer Timer


Posted 22 March 2018 - 12:34 PM

Do you think admins and mods could be used as psychological study groups as a whole? Divided into groups like politics, conspiracy, etc., each with its own sub-groups of study, making it easier to figure out which forums are easier to trigger into certain kinds of conflict, then sending in the trolls and bots to do what their programming dictates.

 

:Laughing-rolf:

 

Well, I guess Cambridge Analytica got caught with its hands in the cookie jar. I never thought for a minute that outside third-, fourth-, and even fifth-party sources didn't have access to all that juicy prime target lead material.



#17 status - Sources Say


Posted 22 March 2018 - 12:41 PM

:Laughing-rolf:

 

Well, I guess Cambridge Analytica got caught with its hands in the cookie jar. I never thought for a minute that outside third-, fourth-, and even fifth-party sources didn't have access to all that juicy prime target lead material.

 

I guess you can ask yourself where all the polls and possible 'studies' being reported on by the media are coming from!


