
The Horrifying Job of Facebook Content Moderators

Tags: world fears, human fears

30 replies to this topic

#21 status - CS
  • Guests

Posted 29 July 2018 - 12:23 PM

Purge?

 

:chuckle:

 





#22 status - Guest
  • Guests

Posted 13 October 2018 - 01:56 PM

Purge?

 

:chuckle:

 



‘Land of censorship & home of the fake’: Alternative voices on Facebook and Twitter’s crackdown

Some 800 pages spanning the political spectrum, from left-leaning organizations like The Anti Media, to flag-waving opinion sites like Right Wing News and Nation in Distress, were shut down. Other pages banned include those belonging to police brutality watchdog groups Filming Cops and Policing the Police. Even RT America’s Rachel Blevins found her own page banned for posts that were allegedly “misleading users.”

https://www.rt.com/u...ter-ban-reacts/



#23 status - Guest
  • Guests

Posted 16 December 2018 - 01:18 PM

 

There are other kinds of Facebook teams out there in cyberspace:
 
How Facebook's Secret Unit Created Digital Propaganda Troll Armies To Influence Elections
 

 

 

What about government agencies selling their data to private security troll farms, or at least integrating public resources with private ones? Google seems to have the lead in worldwide affairs when it comes to social management; it can tailor its systems to run under any government regime on the planet.



#24 status - Guest
  • Guests

Posted 14 January 2019 - 06:25 PM

Do you think admins and mods could be used as psychological study groups as a whole? They could be divided up into groups like politics, conspiracy, etc., each with its own sub-groups of study, making it easier to figure out which forums are easiest to push into certain kinds of conflict. Then the trolls and bots are sent in to do what their programming dictates.

 

Yes



#25 status - I'm not a robot
  • Guests

Posted 25 February 2019 - 09:51 PM

The secret lives of Facebook moderators in America


Collectively, the employees described a workplace that is perpetually teetering on the brink of chaos. It is an environment where workers cope by telling dark jokes about committing suicide, then smoke weed during breaks to numb their emotions. It’s a place where employees can be fired for making just a few errors a week — and where those who remain live in fear of the former colleagues who return seeking vengeance.

It’s a place where, in stark contrast to the perks lavished on Facebook employees, team leaders micromanage content moderators’ every bathroom and prayer break; where employees, desperate for a dopamine rush amid the misery, have been found having sex inside stairwells and a room reserved for lactating mothers; where people develop severe anxiety while still in training, and continue to struggle with trauma symptoms long after they leave; and where the counseling that Cognizant offers them ends the moment they quit — or are simply let go.
KEY FINDINGS

    Moderators in Phoenix will make just $28,800 per year — while the average Facebook employee has a total compensation of $240,000.
    In stark contrast to the perks lavished on Facebook employees, team leaders micro-manage content moderators’ every bathroom break. Two Muslim employees were ordered to stop praying during their nine minutes per day of allotted “wellness time.”
    Employees can be fired after making just a handful of errors a week, and those who remain live in fear of former colleagues returning to seek vengeance. One man we spoke with started bringing a gun to work to protect himself.

    Employees have been found having sex inside stairwells and a room reserved for lactating mothers, in what one employee describes as “trauma bonding.”

    Moderators cope with seeing traumatic images and videos by telling dark jokes about committing suicide, then smoking weed during breaks to numb their emotions. Moderators are routinely high at work.

    Employees are developing PTSD-like symptoms after they leave the company, but are no longer eligible for any support from Facebook or Cognizant.

    Employees have begun to embrace the fringe viewpoints of the videos and memes that they are supposed to moderate. The Phoenix site is home to a flat Earther and a Holocaust denier. A former employee tells us he no longer believes 9/11 was a terrorist attack.

The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”

On May 3, 2017, Mark Zuckerberg announced the expansion of Facebook’s “community operations” team. The new employees, who would be added to 4,500 existing moderators, would be responsible for reviewing every piece of content reported for violating the company’s community standards. By the end of 2018, in response to criticism of the prevalence of violent and exploitative content on the social network, Facebook had more than 30,000 employees working on safety and security — about half of whom were content moderators.

The moderators include some full-time employees, but Facebook relies heavily on contract labor to do the job. Ellen Silver, Facebook’s vice president of operations, said in a blog post last year that the use of contract labor allowed Facebook to “scale globally” — to have content moderators working around the clock, evaluating posts in more than 50 languages, at more than 20 sites around the world.

The use of contract labor also has a practical benefit for Facebook: it is radically cheaper. The median Facebook employee earns $240,000 annually in salary, bonuses, and stock options. A content moderator working for Cognizant in Arizona, on the other hand, will earn just $28,800 per year. The arrangement helps Facebook maintain a high profit margin. In its most recent quarter, the company earned $6.9 billion in profits, on $16.9 billion in revenue. And while Zuckerberg had warned investors that Facebook’s investment in security would reduce the company’s profitability, profits were up 61 percent over the previous year.

Since 2014, when Adrian Chen detailed the harsh working conditions for content moderators at social networks for Wired, Facebook has been sensitive to the criticism that it is traumatizing some of its lowest-paid workers. In her blog post, Silver said that Facebook assesses potential moderators’ “ability to deal with violent imagery,” screening them for their coping skills.

More here:

https://www.theverge...ditions-arizona
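
To put the article's pay and profit numbers side by side, here's a rough back-of-the-envelope check (a sketch only; the 2,080-hour work year and the per-hour figure are my own assumptions, not numbers from the article):

    # Rough check of the compensation and profit figures quoted above.
    # Assumption: a standard 2,080-hour work year (40 hours x 52 weeks).
    facebook_median_comp = 240_000   # USD/year, salary + bonus + stock (median Facebook employee)
    moderator_pay        = 28_800    # USD/year, Cognizant contract moderator in Arizona

    print(facebook_median_comp / moderator_pay)   # ~8.3x -> each contractor costs roughly an eighth
    print(moderator_pay / 2080)                   # ~13.85 USD/hour under the 40-hour-week assumption

    quarterly_profit  = 6.9e9    # USD, most recent quarter per the article
    quarterly_revenue = 16.9e9   # USD
    print(quarterly_profit / quarterly_revenue)   # ~0.41 -> roughly a 41% quarterly profit margin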



#26 Feathers
    Premium Member
  • Moderators
  • 780 posts
  • Location: Floating in the Breeze

Posted 26 February 2019 - 11:34 AM

Interesting article....

 

Contract moderators? How much do they pay them? Probably not enough, considering the possible mental side effects of a job like that. Psychologists and psychiatrists can now bill overtime for all the treatment these people may need. Who's going to pay for that?

 

This leads me to this question...

 

Does Facebook create more mental problems than it supposedly offers to solve?



#27 status - Guest
  • Guests

Posted 10 March 2019 - 03:40 PM

Does Facebook delete all of its objectionable content, or does it store it away in the forbidden data box? What about smaller forums and their content? Backups of all postings are gathered in one way or another; if you don't keep your own backups, others will make them for themselves.

Deleting data from a server (or storing it someplace else, like a hard drive) is a cost-cutting measure.

As for the objectionable content described in the article above: I agree that those who sift through the muck should be compensated justly, not only monetarily but for their health and service. Doing a job of that nature is more than a public service; it also serves a higher, moral one. Offering one's services to clean the sewers of the social mind is commendable, and underappreciated.

 

But the sewers aren't the only places where admins and mods run into problems. Moderating drama and ideologies gets really heated. Name-calling and argument-shifting are prevalent. Professional argumentation that markets certain issues to the forefront of the public mind makes it difficult to see which issues should take precedence over others. Many issues are designed to inflict damage on one group or another. These 'top stories' cause much angst on social forums. Emotional outbursts are common. How should a moderator proceed? Which comments should be deleted or flagged?

 

Those are questions one learns along the way...

 

...along with many others...

 

Don't forget about the scripting bots and actual people using tried and true gangstalking methods.



#28 status - Guest
  • Guests

Posted 31 May 2019 - 11:47 AM

I guess you can ask yourself where all the polls and possible 'studies' being reported on by the media are coming from!?

 

Well, if you look at China's social credit system you get a pretty good idea...



#29 status - Guest
  • Guests

Posted 19 June 2019 - 08:50 PM

Well, if you look at China's social credit system you get a pretty good idea...

 

... this tactic as a way to deal with Chinese scammers/trolls. There were comments mentioning that "Free Tibet" also works.

[attached image: tumblr_inline_p4s6loZaO81u57i45_540.png]



#30 status - Common
  • Guests

Posted 20 June 2019 - 12:30 PM

Don't forget about the scripting bots and actual people using tried and true gangstalking methods.

 

Also, some people and bots 'go back in time' and edit their previous posts to say something entirely different, either on purpose or out of vindictive hate.

 

Hackers can also do this to change a person's history. Look at admins and moderators for this type of thing, but even they can be hacked.
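
One low-tech way a forum could at least notice that kind of silent edit is to keep a fingerprint of every post at the time it was approved and compare it later. Just a sketch, assuming a made-up in-memory log rather than any real forum software's API:

    import hashlib

    # Hypothetical moderation log: post id -> hash of the body as it was approved.
    approved = {}

    def fingerprint(body: str) -> str:
        # Any later silent edit to the body changes this hash.
        return hashlib.sha256(body.encode("utf-8")).hexdigest()

    def record_approval(post_id: int, body: str) -> None:
        approved[post_id] = fingerprint(body)

    def has_been_altered(post_id: int, current_body: str) -> bool:
        # True if the stored body no longer matches what was approved.
        original = approved.get(post_id)
        return original is not None and original != fingerprint(current_body)

    record_approval(42, "Original comment text.")
    print(has_been_altered(42, "Original comment text."))         # False
    print(has_been_altered(42, "Edited to say something else."))  # True

It doesn't tell you who made the edit or why, but it gives mods a trail to check against.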

 

Conclusion: It is getting more difficult to trust anything put online by anyone. Too many methods of profile alteration are available to the higher entities involved in this social networking structure we all call the internet. The technology is also trickling down...

 

An example:

 

New Deepfake Software Only Needs One Image to Make You Sing

Deepfakes just got even easier to make. Will they take us further down a rabbit hole of political manipulation?

https://interestinge...o-make-you-sing

 

How do moderators and admins detect this kind of stuff?




