
These Tampa workers see disturbing Facebook posts so you don’t. But it takes a toll.

Content moderators have one of the toughest jobs on the internet. They say they’re losing the fight — and paying a price in the process.
[RON BORRESEN   |   Times]
Published Aug. 21, 2019 | Updated Oct. 4, 2021

Editor’s note: This story contains descriptions of explicit and violent acts and images.

TAMPA — In a Carrollwood office park, deep inside a red brick building, workers sit in front of monitors and fight to clean up the world’s most popular social media platform.

Welcome to the front lines of Facebook.

Most days, it feels like they’re losing.

One day last fall, they got crushed.

“It was aborted babies in buckets, aborted babies lying on the side of the road,” said former content moderator Michele Bennetti, 43. “I was like, ‘Oh my gosh, I can’t take this anymore.’”

Disturbing images filled rows of computer screens. Never before had the workers seen so much graphic content pour in so quickly.

Some walked away from their screens. Some vomited into trash cans at their desks so they wouldn’t be punished for taking bathroom breaks. Others cried.

Managers apologized. They ordered pizza to help employees get through the day.

Cognizant Technology Solutions is paid millions to help Facebook clean up its platform. But several employees told the Tampa Bay Times that the incident in the fall of 2018, and management’s response to it, illustrated the company’s dysfunction.

“Buying someone pizza doesn't help them cope with something like that,” said Herbert Wright, 29, a former moderator.

Most of the content the workers inspect is relatively harmless. The rest can be violent, explicit or hateful. Content moderators decide what will stay in people’s feeds and what has to go. But the job exacts a toll.

The grueling conditions at Cognizant’s Tampa operation were the subject of a story published earlier this summer by the Verge, an online magazine that covers technology.

The Times spoke with more than a dozen current and former employees who described a bleak workplace. They said they were poorly trained to take on a dreadful job and received little help to deal with the consequences.

Working conditions were oppressive and the turnover frequent. At work, many said they felt in constant danger — psychologically, and at times even physically.

Not only did Cognizant fail them, they told the Times, but the company was failing at its primary mission: making Facebook a better place.

Related: Oct. 31, 2019 update: Cognizant ending content moderation for Facebook, may cut Tampa jobs

• • •

Facebook’s 2.41 billion users make it the largest social media platform in the world. Now it’s attempting content moderation on an unprecedented scale. The ever-evolving experiment comes as the company forges policies and practices while reacting to crisis after crisis.

“What happens in the world makes its way onto Facebook and that, unfortunately, often includes things that don’t belong on our platform and go against our Community Standards,” Facebook spokesman Drew Pusateri said in an email to the Times. “This content therefore needs to come down.”

The company said the standards are designed to prevent users from inciting and coordinating violence. They ban hate speech, bullying and threats, “dangerous individuals and organizations” and material that could encourage suicide.

Facebook said it is developing artificial intelligence tools to help it meet those standards. But right now, the work must be done by humans.

“The content that one might see can be difficult, and we take our responsibility to support the people that do this job very seriously,” Pusateri said.

Cognizant disputed that its workplace is dangerous.

“We have hundreds of people working in this facility who find this work, though challenging, to be quite fulfilling because of the important role they play in protecting people and keeping harmful content off the Internet,” said company spokesman Rick Lacroix in an email to the Times.

In March 2018, the logo for Facebook appears on screens at the Nasdaq MarketSite in New York's Times Square. [AP Photo/Richard Drew]

Facebook started outsourcing content moderation in 2009 to places like the Philippines and Ireland. Then came the 2016 presidential election, when “malicious actors” created “fake personas” to spread misinformation, according to a Facebook report. The company faced a backlash. It decided the subtleties of culture and slang were better handled by locals.

Cognizant Technology Solutions won a contract and launched content moderation sites in Phoenix and Tampa in 2017.

The Tampa employees who spoke to the Times said the Verge’s report this summer had little impact. Managers routinely ignored their concerns and instead reminded them how easily they could be let go.

Pressure from Facebook trickled down to Cognizant’s managers, who tried to squeeze the most out of moderators, said a current team leader. The leader’s identity is being withheld by the Times because that person fears speaking out could cost their job.

“Every day we find a new way to threaten them or make them feel like we’re going to fire them,” the team leader said.

• • •

Content moderators start their shifts by leaving their phones in lockers, a rule meant to protect the privacy of the Facebook users they monitor.

Then they take seats at rows of workstations on an open office floor.

The Cognizant office operates 24 hours a day, on four shifts. The pay is $15 an hour — more if they work the overnight shift.

Each moderator sits at a desk with two monitors. They click through a stream of content that users reported for violating Facebook’s policies, deciding whether to “ignore” or “delete” each image or video. They must also explain what policy was violated and apply filters like “cruel” or “disturbing.”

Facebook says it does not use a quota system. But moderators say they feel pressure to get through an average of 270 reported violations a day.

As content flashes onto their screens, there is no warning of what images, videos or messages the moderators will see next. Nearly every content moderator who spoke to the Times recalled disturbing videos that haunt them: a child flayed alive; the rape of an infant; pigs doused with gas and set on fire.

Sometimes people quit on the spot.

In other cases, moderators couldn’t help but watch with morbid curiosity. Bennetti recalled her co-workers standing around her desk, rewatching a video of a man having sex with roadkill.

Hate speech bothered some employees more than the graphic violence. Most workers in the Tampa office are black.

“You have them working in a queue where people are saying, ‘We should kill all black people, murder all black babies, rape all black women,’” the team leader said.

Some former employees said they didn’t find the content or workplace too distressing. The benefits were pretty good. For some, it beat working at McDonald’s.

But they weren’t sorry to leave, either. Former employees say two years seemed long enough.

Former Cognizant moderator Afia Amfo, 25, put it this way:

“If you have a weak stomach, if you care a lot about others, this is not the right job for you.”

• • •

Wright, the former moderator, said he was one of the first employees to join Cognizant’s Tampa office in 2017. It was a different place then.

Initially, his boss told him moderators would work 35 hours a week, with enough “wellness time” to recover. Graphic content training days were kept short so as not to overload anyone. Psychological counselors were always on site.

A few months later, he said, that management team was pushed out.

The office was apparently performing poorly, Wright said. New executives came in and cracked down like “drill sergeants.” They extended the work week to 40 hours and whittled “wellness time” down to a few minutes a day.

Supervisors strictly monitored lunch and bathroom breaks. The counselors’ hours were cut. The office was often left without any counselor, especially on the overnight shift, when the worst content streamed in. Instead, managers encouraged moderators to call a hotline.

Cognizant spokesman Lacroix said the company will be expanding on-site counseling to 24 hours a day.

Wright’s biggest frustration was that a job requiring human analysis was ruthlessly reduced to metrics.

In a matter of minutes, each moderator had to settle tough questions: Is a joke about autism “cruel”? Is a post denigrating immigrants hate speech? Should a viral video of children eating Tide Pods stay online? What if someone threatens suicide while streaming a video live on Facebook?

Supervisors kept track of moderators’ “accuracy score,” but the guidelines are long and complicated. If employees asked too many questions, they feared supervisors would accuse them of wasting time. If they were too slow or made mistakes, their scores dipped and they could be fired.

Facebook says each moderator receives at least 80 hours of instructor-led training and hands-on practice. But Wright, who worked as a trainer in 2018, said instruction was insufficient in the Tampa office and many employees failed at the job because they weren’t well-prepared.

Facebook constantly updated its community standards, adding to the confusion. The quizzes used to train workers were often outdated and contained wrong answers.

To many employees, pressure to make fast decisions conflicted with the nuances of the job. Content moderation was an art, they said, not a science.

• • •

Lacroix said job candidates receive “detailed and specific information” throughout the hiring process. Yet employees the Times spoke to said they were unprepared for the nature of the job.

For Shawn Speagle, the job seemed like a good opportunity. He graduated from college in 2017 and believed this was his chance to eventually build a career at Facebook.

During his job interview, he said the hiring manager showed him a picture of a baby that had been killed in a chemical attack in Syria. He might see content like that from time to time, the manager said.

But Speagle said he didn’t realize how viewing such brutal images would deepen his depression. He started getting panic attacks and coped by overeating. The on-site counselor had no answers for him.

“Even he was shocked by what he was seeing,” said Speagle, now 26.

Wright said he started having nightmares after joining Cognizant. His girlfriend was pregnant at the time. Watching videos of babies being sexually assaulted gutted him.

He tried to keep what he saw to himself, to shield his family and because of his non-disclosure agreement. But his feelings never stayed buried.

“I used to wonder why I was always angry and upset,” Wright said.

When employees brought up mental health concerns, the team leader said supervisors politely encouraged them to consider leaving. They might not be a “good fit.”

Psychotherapist Kathleen Heide, a University of South Florida professor who specializes in treating trauma survivors, said repeated exposure to disturbing content can lead to post-traumatic stress disorder.

Emotional trauma worsens the longer someone works with disturbing content.

“If somebody sees something really sadistic or torturous, they should be able to shut that down immediately and say, ‘I’ve seen enough,’” Heide said.

Facebook said it is working on tools that could help moderators by allowing them to blur graphic images, display them in black and white and choose whether to listen to the audio.

Shawn Speagle used to work as a content moderator at Cognizant, a Tampa company that has a contract to remove disturbing content from Facebook. [SCOTT KEELER   |   Times]

For now, the employees who spoke to the Times complained that Cognizant didn’t give them the resources they needed to deal with their traumas — or even enough time to get away from their screens.

They were allowed to use “tranquility rooms,” but breaks had to be kept short. Some said they could take nine-minute breaks. Others said it was just seven minutes.

Lacroix disputed that, saying Cognizant offers resources to its employees and doesn't limit breaks. Facebook reviews and approves those policies.

“We allow employees to use Wellness time at their discretion and understand that at times they may need more time away from moderating than others,” he said in an email.

But at least eight employees the Times spoke with said managers would penalize moderators if they spent too much time away from their desks. Some said supervisors even went looking for workers who were on breaks.

Wright recalled seeing lists that showed who was over the allotted break time.

The team leader who spoke to the Times said managers were expected to discuss long breaks with employees. That was a warning.

“They're not asking because they care,” the team leader said. “They are asking because you are negatively impacting their metrics.”

• • •

The content wasn’t the only traumatic thing about Cognizant, employees told the Times.

Fights broke out on the floor. People had sex in the tranquility rooms and took drugs in the parking lot. Sexual harassment was common. The bathrooms were filthy, at times smeared with feces. A manager called them “vile” in an internal email.

Employees were also surprised by some hiring decisions. Wright remembers a military veteran who had a service dog to help him deal with post-traumatic stress disorder.

Some employees worried that a co-worker might snap — some spoke openly about guns — or that someone angry at Facebook might target their office. They were never briefed on an emergency plan, such as what to do in case of an active shooter. Cognizant says it has armed security personnel on site 24 hours a day.

But the team leader said that level of security started in July, after an argument between two employees outside led to deputies being called to the office.

One night in March 2018, a man working the overnight shift suffered a heart attack and died. There is no indication his death was job-related, but multiple employees told the Times that management’s reaction disturbed them.

Some said they only heard about the incident through word of mouth. Others remember attending a town hall meeting where managers focused on reminding everyone to keep the employee’s death quiet.

“They didn’t even send an email,” the team leader told the Times. “That just showed how much you didn’t value this individual.”

Lacroix said employees “were communicated with and provided support following this incident.”

A May 2018 photo of Facebook CEO Mark Zuckerberg delivering the keynote speech at F8, Facebook's developer conference, in San Jose, Calif. [AP Photo/Marcio Jose Sanchez]

The employees also told the Times they worked amid a constant state of job insecurity. Dismissals came without warning. They even had a name for it: “red bag days,” because managers used red bags to collect an employee’s belongings as they were whisked out the door.

Workers also believe executives targeted those they didn’t like, pushing them out. Supervisors reprimanded employees for small things, like dress code violations, standing up to stretch or writing an email expressing concerns. Too many “occurrences,” as they were called, could cost someone their job.

“Retaliation is very real,” said Wright. Moderators had to be escorted inside the human resources department, which sits behind a locked door.

Promotions were promised, then rescinded. Paycheck errors weren’t fixed for months.

Lacroix said Cognizant has a zero-tolerance approach to all forms of discrimination, harassment and bullying.

Facebook says its attrition rate is significantly lower than at other content moderation sites. The global average for the industry is 40 to 60 percent, the company said. However, neither Cognizant nor Facebook said what the attrition rate is for the Tampa office, which employs about 600 people.

Content moderators have started suing over their work conditions. A California woman who worked for a subcontractor sued Facebook, alleging she suffered psychological damage.

Locally, 14 current and former employees have sought legal help from labor attorney K.C. Hopkinson. She said she has taken complaints of discrimination and harassment to the Equal Employment Opportunity Commission and the Florida Commission on Human Relations.

• • •

Ultimately, content moderators are being asked to fight a war that no one seems to be winning.

“We were told our job was super important, cleaning up Facebook to make it safe for kids,” said Melynda Johnson, 39, a former content moderator for Cognizant.

But no matter how hard the moderators worked, she said, they could never truly rid the platform of graphic, obscene content. Eventually she realized, “You are not cleaning anything up, you are just wasting time.”

Wright remembers discovering private Facebook pages filled with people trading child pornography and animal abuse photos.

He could shut the pages down, delete the photos, but they always seemed to pop up again.

“For every page they are deleting,” Wright said, “there are probably three or four taking its place.” It haunted him.

Facebook and WhatsApp icons sit side-by-side in an iPhone in Gelsenkirchen, Germany. [AP Photo/Martin Meissner]

There are no established best practices for content moderation jobs, said Sarah Roberts, an assistant professor of information studies at UCLA. Each social media company — Facebook, Twitter and YouTube — came up with its own ways of figuring out what to ban, she said.

Last year, YouTube announced its content moderators would only be on duty for four hours at a time. Roberts said it’s unclear whether that was based on any kind of study or analysis.

“The problem is there is more content to be processed than there are people to fill the seats,” she said. She is skeptical of the effectiveness of outsourcing this kind of labor. The Tampa office, she said, sounds like “straight-up call center hiring.”

Johnson believes Cognizant’s true purpose isn’t to effectively moderate content — it’s to make it look like Facebook is doing something about the problem:

“I think they could care less that this stuff is on their platform as long as people keep using this platform and selling ads and making money.”

Times senior news researcher Caryn Baird contributed to this report. Contact Kavitha Surana at ksurana@tampabay.com or (727) 893-8149. Follow @ksurana6. Contact Dan Sullivan at dsullivan@tampabay.com. Follow @TimesDan.