Social media content moderators lead the charge for better rights

Content moderators for platforms like Facebook and Instagram have some of the toughest jobs in social media, screening illegal, violent and harmful content, often through long shifts and for little pay.

These workers are now leading the charge for better working conditions, putting pressure on platforms that have largely failed to recognize formal union representation.

A workers’ group in Germany last week drew up a list of working standards for employers, including giving staff the right to bargain collectively and to join unions or works councils.

“We believe this is just the beginning,” said Hikmat Al-Hamouri of the German union Verdi. “More and more content moderators are now joining us and learning about the power they can access through trade unions and works councils.

“For too long, the big social media companies have behaved as if they didn’t have to respond to the labor movement in Germany. They are going to get a big wake-up call.”

Against a backdrop of brutal job losses across the technology sector, many employees are reluctant to demand better rights.

“People are afraid of retaliation and are worried that, because of the recent lay-offs, they cannot complain as it may affect their employment,” said a TikTok employee.


Hikmat Al-Hamouri of the German union Verdi: ‘The big social media companies have behaved as if they don’t have to respond to the labor movement in Germany. They’re about to get a big wake-up call’ © Christian von Pollentz

But deteriorating conditions and high living costs are encouraging workers to stand up to their managers and employers.

“Workers are starting to talk more about unionization. . . It’s an example of how the bright futuristic world of employment that tech companies tried to project has been completely stripped away,” said Bruce Daisley, Twitter’s former European vice-president turned consultant on workplace culture.

However, he warned: “Increasing job cuts mean there is very little leverage for workers to organize.”

TikTok’s employment contract includes a clause prohibiting employees from discussing their salaries with each other, according to staff, who see it as a measure to prevent fair and equal pay.

TikTok said it fully supports the rights of employees and complies with “collective employment regulations, including in relation to trade unions”.

The average salary for a content moderator in the UK is around £25,000 per year, according to jobs website Glassdoor. Content moderators hired by third-party contractors are reported to be paid close to minimum wage and often review the most disturbing footage. Social networks including Meta, TikTok and YouTube hire external contractors to do this work.

“We are treated like machines, not humans,” said one moderator who reviewed content for Meta through a third-party company. He said he suffered from health problems after repeated exposure to terrorist and suicide content on the platform.

“I thought the job would never affect me, but after a while, it affected my mental health. First, it was a lot of nightmares, and then I got a lot of disturbing physical symptoms.”

Among the demands listed by the German workers’ group is a call for independent and appropriate mental health support for moderators. Workers are sometimes asked to sign a waiver expressly acknowledging the health risks of their role.

Meta said the third-party companies it uses are required to pay above industry standards and are audited twice a year. These companies must provide 24/7 on-site support with trained practitioners and access to private healthcare, the company added.

A former TikTok moderator disputed this support: “They have yoga videos, [and] random people you can go to who aren’t properly qualified and aren’t even confidential.”

TikTok said it provided psychological support to all content moderators, including independent and qualified mental health support.

Another issue is how staff are monitored internally, with moderators for TikTok and Meta telling the FT that their work is closely tracked, including how many seconds it takes to review each piece of content.

“The process gave me a headache because I had to review around 1,000 videos per day,” said the former TikTok moderator. “The system is tracking every second of our activity [and] we are judged on our performance . . . prioritizing speed over quality.”

Another former TikTok moderator said that even when the content was innocuous, it was too repetitive and viral songs kept getting stuck in his head, affecting his sleep.

Meta and TikTok are facing formal worker representation in their European offices through staff works councils, legally enforceable bodies that exist to represent staff at large companies on matters including wages, hours and working conditions.

Franziska, chair of the German TikTok works council, which was set up in late 2022, said she had negotiated an additional €50 monthly payment for employees who work from home at least three times a week.

She is broadly positive about the measures TikTok has introduced for content moderators but believes they should be better paid.

“Although social media companies will state that content moderation is the front line and the most important element in these companies, that is not reflected in pay,” she said.

“We do this not only to protect children and users on the platform but . . . we also protect freedom of speech and democracy and protect people from misinformation. It is in fact a very important cog in this world that is not respected that way.”
