The Horrifying Job of Facebook Content Moderators

Tags: world fears, human fears


#1 Ghost in the Machine

    Premium Member

  • Members
  • 263 posts

Posted 23 December 2017 - 03:03 PM

The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed
 
 
Baybayan is part of a massive labor force that handles “content moderation”—the removal of offensive material—for US social-networking sites. As social media connects more people more intimately than ever before, companies have been confronted with the Grandma Problem: Now that grandparents routinely use services like Facebook to connect with their kids and grandkids, they are potentially exposed to the Internet’s panoply of jerks, racists, creeps, criminals, and bullies. They won’t continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video. Social media’s growth into a multibillion-dollar industry, and its lasting mainstream appeal, has depended in large part on companies’ ability to police the borders of their user-generated content—to ensure that Grandma never has to see images like the one Baybayan just nuked.
 
This work is increasingly done in the Philippines. A former US colony, the Philippines has maintained close cultural ties to the United States, which content moderation companies say helps Filipinos determine what Americans find offensive. And moderators in the Philippines can be hired for a fraction of American wages. Ryan Cardeno, a former contractor for Microsoft in the Philippines, told me that he made $500 per month by the end of his three-and-a-half-year tenure with outsourcing firm Sykes. Last year, Cardeno was offered $312 per month by another firm to moderate content for Facebook, paltry even by industry standards.
 
When I asked Microsoft, Google, and Facebook for information about how they moderate their services, they offered vague statements about protecting users but declined to discuss specifics. Many tech companies make their moderators sign strict nondisclosure agreements, barring them from talking even to other employees of the same outsourcing firm about their work.
 
“I think if there’s not an explicit campaign to hide it, there’s certainly a tacit one,” says Sarah Roberts, a media studies scholar at the University of Western Ontario and one of the few academics who study commercial content moderation. Companies would prefer not to acknowledge the hands-on effort required to curate our social media experiences, Roberts says. “It goes to our misunderstandings about the Internet and our view of technology as being somehow magically not human.”
 
A list of categories, scrawled on a whiteboard, reminds the workers of what they’re hunting for: pornography, gore, minors, sexual solicitation, sexual body parts/images, racism. When Baybayan sees a potential violation, he drills in on it to confirm, then sends it away—erasing it from the user’s account and the service altogether—and moves back to the grid. Within 25 minutes, Baybayan has eliminated an impressive variety of dick pics, thong shots, exotic objects inserted into bodies, hateful taunts, and requests for oral sex.
 
More difficult is a post that features a stock image of a man’s chiseled torso, overlaid with the text “I want to have a gay experience, M18 here.” Is this the confession of a hidden desire (allowed) or a hookup request (forbidden)? Baybayan—who, like most employees of TaskUs, has a college degree—spoke thoughtfully about how to judge this distinction.
 
“What is the intention?” Baybayan says. “You have to determine the difference between thought and solicitation.” He has only a few seconds to decide. New posts are appearing constantly at the top of the screen, pushing the others down. He judges the post to be sexual solicitation and deletes it; somewhere, a horny teen’s hopes are dashed. Baybayan scrolls back to the top of the screen and begins scanning again.
 
While a large amount of content moderation takes place overseas, much is still done in the US, often by young college graduates like Swearingen was. Many companies employ a two-tiered moderation system, where the most basic moderation is outsourced abroad while more complex screening, which requires greater cultural familiarity, is done domestically. US-based moderators are much better compensated than their overseas counterparts: A brand-new American moderator for a large tech company in the US can make more in an hour than a veteran Filipino moderator makes in a day. But then a career in the outsourcing industry is something many young Filipinos aspire to, whereas American moderators often fall into the job as a last resort, and burnout is common.
 
The worst was the gore: brutal street fights, animal torture, suicide bombings, decapitations, and horrific traffic accidents. The Arab Spring was in full swing, and activists were using YouTube to show the world the government crackdowns that resulted. Moderators were instructed to leave such “newsworthy” videos up with a warning, even if they violated the content guidelines. But the close-ups of protesters’ corpses and street battles were tough for Rob and his coworkers to handle. So were the videos that documented misery just for the sick thrill of it.
 
“If someone was uploading animal abuse, a lot of the time it was the person who did it. He was proud of that,” Rob says. “And seeing it from the eyes of someone who was proud to do the fucked-up thing, rather than news reporting on the fucked-up thing—it just hurts you so much harder, for some reason. It just gives you a much darker view of humanity.”
 
Given that content moderators might very well comprise as much as half the total workforce for social media sites, it’s worth pondering just what the long-term psychological toll of this work can be. Jane Stevenson was head of the occupational health and welfare department for Britain’s National Crime Squad—the UK equivalent of the FBI—in the early 2000s, when the first wave of international anti-child-pornography operations was launched. She saw investigators become overwhelmed by the images.
 
“From the moment you see the first image, you will change for good,” Stevenson says. But where law enforcement has developed specialized programs and hires experienced mental health professionals, Stevenson says that many technology companies have yet to grasp the seriousness of the problem.
 
 
Underpaid and overburdened: the life of a Facebook moderator 
 
“There was literally nothing enjoyable about the job. You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off.”
 
That’s how one man, who wished to remain anonymous, characterized his job as a content moderator at Facebook.
 
“We were underpaid and undervalued,” said the man, who earned roughly $15 per hour removing terrorist content from the social network after a two-week training course.
 
Pictures, videos, profiles and groups would be flagged for review either by users or by algorithms, and his team would decide whether they needed to be removed or escalated.
 
“Every day people would have to visit psychologists. Some couldn’t sleep or they had nightmares.”
 
Psychologists say that the emotional toll of looking at extreme imagery, whether violent terrorist acts or child sexual abuse, can be punishing. Workers exposed to such content should have extensive resiliency training and access to counsellors, akin to the support that first-responders receive. However, testimony from those working to keep beheadings, bestiality and child sexual abuse images off Facebook indicates that the support provided isn’t enough.
 
“The work entails looking at the underbelly of human nature, including betrayal and cruelty. We need to offset the dark side with human connections,” said Naheed Sheikh, co-founder of the Workplace Wellness Project.
 
If moderators have not been trained effectively to handle a particular type of content, it can be especially distressing.
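
The excerpt above describes a triage flow: items get flagged by users or by algorithms, and each review ends with the content being kept, removed, or escalated. Here is a minimal sketch of that flow in Python, purely as an illustration; the type names, fields, and decision rule below are my own assumptions, not Facebook's actual tooling.

from dataclasses import dataclass
from enum import Enum

class FlagSource(Enum):
    USER = "user"            # reported by another user
    ALGORITHM = "algorithm"  # caught automatically by a classifier

class Decision(Enum):
    KEEP = "keep"            # no violation found, leave it up
    REMOVE = "remove"        # clear violation, delete the content
    ESCALATE = "escalate"    # ambiguous or severe, pass to a senior reviewer

@dataclass
class FlaggedItem:
    item_id: str
    kind: str          # "picture", "video", "profile" or "group"
    source: FlagSource
    reason: str        # e.g. "terrorist content"

def review(item: FlaggedItem, looks_violating: bool, is_ambiguous: bool) -> Decision:
    """One reviewer's call on one queued item (illustrative only)."""
    if is_ambiguous:
        return Decision.ESCALATE
    return Decision.REMOVE if looks_violating else Decision.KEEP

# Example: an algorithmically flagged video that clearly violates the rules.
item = FlaggedItem("v123", "video", FlagSource.ALGORITHM, "terrorist content")
print(review(item, looks_violating=True, is_ambiguous=False))  # Decision.REMOVE

Real queues obviously carry far more metadata and policy nuance than this, but the keep/remove/escalate shape is what the moderators quoted here are describing.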
 
 
 





#2 status - Florian

  • Guests

Posted 24 December 2017 - 03:14 PM

They don't pay these moderators enough to put up with all the problems that looking at that kind of garbage can cause.
 
What about any A.I. detection?
 
Will AI be able to moderate online discussions like humans?
 
Some artificial intelligence products have become so advanced in online discussion moderation that they will no longer be confused by colloquial language, neologisms or spelling mistakes. AI is able to take on routine human tasks, but cannot fully replace human intelligence.
 
Sites have their own policies on what they consider inappropriate.
 
 
The technology behind Utopia Moderator is based on statistical distributions: unlike a traditional spam filter or word list, it examines a comment as a whole and can identify its semantic content.
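
To make that concrete: a word-list filter only checks for banned tokens, so a single misspelling slips past it, while a statistical classifier scores the whole comment. Here's a toy sketch in Python; the use of scikit-learn, the four training sentences, and the 0.5 threshold are all invented stand-ins for illustration, not how Utopia Moderator actually works.

# Word-list filtering vs. scoring the whole comment statistically.
# Everything here (library, toy training data, threshold) is invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

BANNED_WORDS = {"idiot", "trash"}

def word_list_filter(comment: str) -> bool:
    """Flags a comment only if a banned token appears verbatim, which is easy to evade."""
    return any(word in BANNED_WORDS for word in comment.lower().split())

# Tiny toy training set: 1 = abusive, 0 = acceptable.
train_texts = [
    "you are a complete idiot",
    "go away, nobody wants your trash opinions",
    "thanks for sharing, that is an interesting point",
    "i disagree, but that is a fair argument",
]
train_labels = [1, 1, 0, 0]

classifier = make_pipeline(CountVectorizer(), MultinomialNB())
classifier.fit(train_texts, train_labels)

def statistical_filter(comment: str, threshold: float = 0.5) -> bool:
    """Scores the comment as a whole, so one obfuscated word matters less."""
    return classifier.predict_proba([comment])[0][1] >= threshold

print(word_list_filter("you are an idi0t"))    # False: the misspelling evades the word list
print(statistical_filter("you are an idi0t"))  # may still flag it from the surrounding words

A production system would be trained on millions of labelled comments and tuned to each site's policy, but the structural difference between the two approaches is the point.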
 
Big data, artificial intelligence, machine learning and algorithms are part of the spearhead of technological development. They are already affecting all of us indirectly, but in the future, their role in an increasingly digitized society will be tremendous.
 
 


#3 status - Wilma

  • Guests

Posted 25 December 2017 - 02:57 PM

There are other kinds of Facebook teams out there in cyberspace:
 
How Facebook's Secret Unit Created Digital Propaganda Troll Armies To Influence Elections
 


#4 status - Funny Faces

  • Guests

Posted 27 December 2017 - 10:42 AM

Granted, those who post such heinous trash should be punished, everything they post should be deleted, and the moderators should be compensated for the damage far better than they are. But moderating a social network isn't just about cleaning up the trash. Moderators also have to deal with strong opinions from different ideologies and how they clash with one another. They have to deal with trolls of every genre. These trolls cast nets filled with baited language, getting people to form an emotional attachment to an issue, with inciting content designed to inflame them into lashing out verbally at anyone who disagrees. Most of it is sold to people like a product on the shelf: packaged in bright colors with big bold headlines to catch the eye, always hiding the small print on the back of the box, blurring the facts in type so small it's impossible to read without a magnifying glass.

On the smaller forums it's possible for owners and administrators to set their own policies on censoring. Most of these smaller social networks focus on a specific range of topics. They can break up into cliques, but for the most part they're filled with an interesting array of ideas on any topic of interest. Personalities tend to emerge in these little discussions. Moderators in this realm mainly have to deal with spammers, drunks, and trolls, usually adhering to whatever policy of behavior the owner sets forth.

Larger social sites, political forums and religious sites tend to attract their fair share of paid trolls. Facebook and the Russian agenda during the last election is a good example. True or not, it's a great method for spreading ideas out into the public ether. Spreading malicious content on a grand scale takes a lot of resources and planning, and it's marketed like any other product, to get people to buy into an agenda without really looking into the details of the story. Sometimes going along with the flow can be dangerous; it can distract from more important issues facing all of society.
 


#5 status - No. 1 Troll

  • Guests

Posted 27 December 2017 - 10:58 AM

[image: never_popuar_at_school.jpg]



#6 Feathers

    Premium Member

  • Moderators
  • 666 posts

Posted 30 December 2017 - 02:14 PM

All the admins and mods on the social networking sites are like fishermen. Each has their own job to do on the boat. Some work the ocean, trawling their nets deep in the seas. Others fish the rivers and streams in fresh water. The lakes are nice too. Keeping it all fresh and clean is the hard part. Too bad so many visitors come and throw in their garbage, polluting the waters with poisons of all shapes and sizes.



#7 Jesse Jimmie

    Admin

  • Administrators
  • 1,299 posts

Posted 01 January 2018 - 04:03 PM

 

Funny Faces, on 27 December 2017 - 10:42 AM, said: (quoted in full in post #4 above)

 

 

Does Facebook delete all their objectionable content, or do they store it away in the forbidden data box? What about the smaller forums and their content? Backups of all postings get gathered one way or another; if you don't make your own backups, others will make them for themselves.

Deleting data from a server (or moving it somewhere else, like an offline hard drive) is a cost-cutting measure.

As for the objectionable content described in the article above? I agree that those who sift through the muck should be compensated justly, not only monetarily but also for their health and service. Doing a job of that nature is more than a public service. It also serves a higher, moral one. Offering one's services to clean the sewers of the social mind is commendable, and underappreciated.

But the sewers aren't the only places where admins and mods run into problems. Moderating drama and ideologies gets really heated. Name-calling and argument-shifting are prevalent. Professional argumentation skills, marketing issues to the forefront of the public mind, make it difficult to see which issues should take precedence over others. Many issues are designed to inflict damage on one group or another. These 'top stories' cause a lot of angst on the social forums. Emotional outbursts are common. How should a moderator proceed? Which comments should be deleted or flagged?

 

Those are questions one learns along the way...

 

...along with many others...



To Cluck or not to Cluck, that is the question...


#8 status - Christopher Marlowe

  • Guests

Posted 01 January 2018 - 04:26 PM

Working for a corporation makes it easier for moderators to know what to flag. The company sets stringent guidelines that all employees must adhere to, and the social do's and don'ts are constantly hammered into the brain through training. The politics from on high are always looking over your shoulder.

Independent site owners have more leeway in their approach. Working within a specific subject or genre keeps things growing in variety. I think the independent forums have an opportunity to push themselves forward by showing personality and character in their presentation. It's like the Facebook groups, only they have more freedom to set their own rules of conduct. Some of the best ones have great personality. Independents who can build a decent audience with great content, along with members who create entertaining chemistry, will stand out. Being educational and interesting at the same time helps elevate the audience and hold their attention.
 
 
 
 


#9 Ghost in the Machine

    Premium Member

  • Members
  • 263 posts

Posted 02 January 2018 - 11:08 PM

Jesse Jimmie, on 01 January 2018 - 04:03 PM, said: (quoted in full in post #7 above)

 

I ran into this article that is relevant:
 
Are Toxic Political Conversations Changing How We Feel about Objective Truth?
 
As political polarization increases in the U.S., the kind of antagonistic exchange exemplified by the Trump-Clinton debate is occurring with increasing frequency—not just among policy makers but among us all. In interactions such as these, people may provide arguments for their views, but neither side is genuinely interested in learning from the other. Instead the real aim is to “score points,” in other words, to defeat the other side in a competitive activity. Conversations on Twitter, Facebook and even YouTube comment sections have become powerful symbols of what the combativeness of political discourse looks like these days. We refer to this kind of discussion as “arguing to win.” 
 
The divergence of Americans’ ideology is accompanied by an animosity for those across the aisle. Recent polls show that partisan liberals and conservatives associate with one another less frequently, have unfavorable views of the opposing party, and would even be unhappy if a family member married someone from the other side. At the same time, the rise of social media has revolutionized how information is consumed—news is often personalized to one’s political preferences. Rival perspectives can be completely shut out from one’s self-created media bubble. Making matters worse, outrage-inducing content is more likely to spread on these platforms, creating a breeding ground for clickbait headlines and fake news. This toxic online environment is very likely driving Americans further apart and fostering unproductive exchanges. 
 
In this time of rising tribalism, an important question has arisen about the psychological effects of arguing to win. What happens in our minds—and to our minds—when we find ourselves conversing in a way that simply aims to defeat an opponent? Our recent research has explored this question using experimental methods, and we have found that the distinction between different modes of argument has some surprisingly far-reaching effects. Not only does it change people’s way of thinking about the debate and the people on the opposing side, but it also has a more fundamental effect on our way of understanding the very issue under discussion. 
 
Are we objectivists or relativists?
 



#10 Feathers

    Premium Member

  • Moderators
  • 666 posts

Posted 02 January 2018 - 11:18 PM

Great article, Ghost! I think this kind of thing happens right on down the line within the system. The ancient art of sophism takes precedence even in the courtrooms.

 

Never mind the social clouds out in cyberspace. Big storms, huge floods, and giant earthquakes abound.

 

Sometimes even a volcano will blow...

 

Too bad issues like this aren't higher up in the media exposure pyramid.

 

Nobody gets a fair shake with this form of argumentation. Only those at the top get the full benefit from this mighty machine, and I believe it's kept in the 'family'. The economic factor is the number one contributor to this morass of communication.




