Today, let’s talk about some of the front-line workers at Facebook and Google during the pandemic: the content moderators who keep those sites running day in and day out. Like most stories about content moderators, it’s a tale of difficult tradeoffs. And actions taken over the past few days by Facebook and YouTube will have significant implications for the future of the business.
First, though, some history.
At first, content moderation on social networks was a business problem: let in the nudity and the Nazis, and the community collapses. Later, it was a legal and regulatory problem: despite the protections afforded by Section 230, companies have a legal obligation to remove terrorist propaganda, child abuse imagery, and other forms of content. As services like YouTube and Facebook grew user bases into the billions, content moderation became more of a scale problem: how do you review the millions of posts a day that get reported for violating your policies?
The solution, as I explored last year in a series of pieces for The Verge, was to outsource the job to large consulting companies. In the wake of the 2016 election, which revealed a deficit of content moderators at all the big social networks, tech companies hired tens of thousands of moderators around the world through firms including Accenture, Cognizant, and Genpact. This, though, created a privacy problem. When your moderators work in house, you can apply strict controls to their computers to monitor the access they have to user data. When they work for third parties, that user data is at much greater risk of leaking to the outside world.
The privacy issues surrounding the hiring of moderators generally haven’t gotten much attention from journalists like me. (Instead we have been paying attention to their generally awful working conditions and the fact that a subset of workers are developing post-traumatic stress disorder from the job.) But inside tech companies, fears over data leaks ran strong. For Facebook in particular, the post-2016 election backlash had arisen partly over privacy concerns — once the world learned how Cambridge Analytica intended to use information gleaned from people’s Facebook use, trust in the company plunged precipitously.
That’s why outsourced content moderation sites for Facebook and YouTube were designed as secure rooms. Employees can work only on designated “production floors” that they must badge in and out of. They are not allowed to bring in any personal devices, lest they take surreptitious photos or attempt to smuggle out data another way. This can create havoc for workers — they are often fired for inadvertently bringing phones onto the production floor, and many of them have complained to me about the way that the divide separates them from their support networks during the day. But no company has been willing to relax those restrictions for fear of the public-relations crisis a high-profile data loss might spark.
Fast-forward to today, when a pandemic is spreading around the world at frightening speed. We still need just as many moderators working to police social networks, if not more — usage is clearly surging. If you bring them to the production floor to continue working normally, you almost certainly contribute to the spread of the disease. And yet if you let them work from home, you invite in a privacy disaster at a time when people (especially sick people) will be hyper-sensitive to misuses of their personal data.
Say you’re Facebook. What do you do?
Until Monday, the answer looked a lot like business as usual. Sam Biddle broke the story in The Intercept last week. (Incidentally, the publication that The Interface is most frequently mistaken for.)
Discussions from Facebook’s internal employee forum reviewed by The Intercept reveal a state of confusion, fear, and resentment, with many precariously employed hourly contract workers stating that, contrary to statements to them from Facebook, they are barred by their actual employers from working from home, despite the technical feasibility and clear public health benefits of doing so.
The discussions focus on Facebook contractors employed by Accenture and Wipro at facilities in Austin, Texas, and Mountain View, California, including at least two Facebook offices. (In Mountain View, a local state of emergency has already been declared over the coronavirus.) The Intercept has seen posts from at least six contractors complaining about not being able to work from home and communicated with two more contractors directly about the matter. One Accenture employee told The Intercept that their entire team of over 20 contractors had been told that they were not permitted to work from home to avoid infection.
In fairness, Facebook was far from alone in not having deployed a full plan for its contractors last Thursday. Some American companies are still debating what to do with their full-time workforces this week. But as Biddle notes, Facebook wasn’t one of those: it was already encouraging employees to work from home. This prompted justified criticism from contract workers — some of whom petitioned Facebook to act, Noah Kulwin reported in The Outline. (Googlers are circulating a similar petition on behalf of their own contract coworkers, Rob Price reported at Business Insider.)
On Monday night, Facebook did act. As of Tuesday, it began to inform all contract moderators that they should not come into the office. Commendably, Facebook will continue to pay them during the disruption. Here’s the announcement:
For both our full-time employees and contract workforce there is some work that cannot be done from home due to safety, privacy and legal reasons. We have taken precautions to protect our workers by cutting down the number of people in any given office, implementing recommended work from home globally, physically spreading people out at any given office and doing additional cleaning. Given the rapidly evolving public health concerns, we are taking additional steps to protect our teams and will be working with our partners over the course of this week to send all contract workers who perform content review home, until further notice. We’ll ensure that all workers are paid during this time.
The news followed a similar announcement from Google on Sunday. It was followed by a joint announcement from Facebook, Google, LinkedIn, Microsoft, Reddit, Twitter, and YouTube that they “are working closely together on COVID-19 response efforts,” including a commitment to remove fraud and misinformation related to the virus and promote “authoritative content.” (I’m told the announcement is unrelated to the shift in content moderation strategies, but it points to a future where companies collaborate more on removing harmful posts.)
OK, so the content moderators have mostly been sent home. How does stuff get … moderated? Facebook allowed some moderators who work on less sensitive content — helping to train machine-learning systems for labeling content, for example — to work from home. More sensitive work is being shifted to full-time employees. But the company will also begin to lean more heavily on those machine-learning systems in an effort to automate content moderation.
It’s the long-term goal of every social network to put artificial intelligence in charge. But as recently as December, Google was telling me that the day when such a thing would be possible was still quite far away. And yet on Monday the company — out of necessity — changed its tune. Here’s Jake Kastrenakes at The Verge:
YouTube will rely more on AI to moderate videos during the coronavirus pandemic, since many of its human reviewers are being sent home to limit the spread of the virus. This means videos may be taken down from the site purely because they’re flagged by AI as potentially violating a policy, whereas the videos might normally get routed to a human reviewer to confirm that they should be taken down. […]
Because of the heavier reliance on AI, YouTube basically says we have to expect that some mistakes are going to be made. More videos may end up getting removed, “including some videos that may not violate policies,” the company writes in a blog post. Other content won’t be promoted or show up in search and recommendations until it’s reviewed by humans.
YouTube says it largely won’t issue strikes — which can lead to a ban — for content that gets taken down by AI (with the exception of videos it has a “high confidence” are against its policies). As always, creators can still appeal a video that was taken down, but YouTube warns this process will also be delayed because of the reduction in human moderation.
All that represents a huge bet on AI at a time when, as the company itself notes, it is still quite error-prone. And on Monday evening, both Facebook and Twitter followed suit. Here’s Paresh Dave in Reuters:
Facebook also said the decision to rely more on automated tools, which learn to identify offensive material by analyzing digital clues for aspects common to previous takedowns, has limitations.
“We may see some longer response times and make more mistakes as a result,” it said.
Twitter said it too would step up use of similar automation, but would not ban users based solely on automated enforcement, because of accuracy concerns.
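None of the platforms have published their actual enforcement logic, but the tiered approach described above — act automatically only when the model is confident, and skip strikes when it isn’t — can be sketched as a simple confidence-threshold triage. Everything here is illustrative: the function name, the thresholds, and the outcome labels are hypothetical, not any platform’s real system.

```python
# Illustrative sketch of tiered, confidence-based moderation, loosely
# following YouTube's description: high-confidence AI flags can carry a
# strike, lower-confidence flags are acted on without one, and uncertain
# flags wait for a human. All names and threshold values are hypothetical.

def triage_flag(confidence: float,
                high: float = 0.95,
                low: float = 0.60) -> str:
    """Route an AI policy flag based on the model's confidence score."""
    if confidence >= high:
        # Model is very sure: remove the content and issue a strike.
        return "remove_with_strike"
    elif confidence >= low:
        # Plausible violation: remove it, but skip the strike and leave
        # the door open for a human-reviewed appeal.
        return "remove_no_strike"
    else:
        # Too uncertain to act automatically: withhold from search and
        # recommendations until a human reviewer can look at it.
        return "hold_for_review"

print(triage_flag(0.99))  # remove_with_strike
print(triage_flag(0.75))  # remove_no_strike
print(triage_flag(0.30))  # hold_for_review
```

The appeal of this design is that the cost of a mistake varies by tier: a wrongly withheld video is recoverable, while a wrongful strike can end a creator’s channel — which is why, per the announcements, strikes are reserved for the highest-confidence cases.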
So many of tech platforms’ troubles with regulators and elected officials over the past couple years have come down to content moderation. Which posts did they allow to stay up? Which did they wrongfully take down? Which posts did they amplify, and which did they suppress?
At global scale, the companies were making plenty of mistakes even with the benefit of human judgment. As of Tuesday, they will be entrusting significantly more to the machines. The day one result was not great. Here’s Josh Constine in TechCrunch:
Facebook appears to have a bug in its News Feed spam filter, causing URLs to legitimate websites including Medium, BuzzFeed, and USA Today to be blocked from being shared as posts or comments. The issue is blocking shares of some coronavirus-related content, while some unrelated links are allowed through, though it’s not clear what exactly is or isn’t tripping the filter. Facebook has been trying to fight back against misinformation related to the outbreak, but may have gotten overzealous or experienced a technical error.
I’m sure that bug will be fixed before too long. (Facebook says it’s not related to changes in content moderation.) In the meantime, my thoughts are with the moderators who kept showing up to work every day for the past week even as they knew it put them in physical danger. One Facebook moderator working for Accenture recalled how the company began putting out more hand sanitizer in February as the threat worsened, but waited until Tuesday to tell him to stay home. This came after days, if not weeks, of employees telling Accenture that their partners and roommates had been exposed to the disease.
“We were working with people who were exposed, definitely,” the moderator told me. “I think they have moved too late, and the actions initially taken were clearly insufficient.”
Today in news that could affect public perception of the big tech platforms.
Trending up: Facebook plans to award $100 million in cash grants and ad credits to as many as 30,000 small businesses in 30 countries around the world. The money is aimed at helping them deal with the economic impact of the coronavirus outbreak.
Trending up: Facebook partnered with the International Fact-Checking Network to give $50,000 grants to organizations working on fact-checking misinformation related to COVID-19. The total budget for the partnership is $1 million. (Poynter)
Trending up: A group of the biggest tech companies in the US has banded together to fight coronavirus-related fraud and misinformation. The group includes Facebook, Google, LinkedIn, Microsoft, Reddit, Twitter, and YouTube.
Here’s the latest in the United States:
Here’s a map of where coronavirus cases have been confirmed so far across the United States. (Sara G. Miller and Jiachuan Wu / NBC News)
Why has the rollout of COVID-19 testing been so slow in the United States? The tests we have generally need to be sent to a lab, and the process is slow. (Nicole Wetsman / The Verge)
Here’s what’s going on with the big companies:
Alphabet’s health-care company Verily ran tests for about 20 people on its first day of screening for the coronavirus. The firm said it is working with the state of California to expand the program. (Gerrit De Vynck / Bloomberg)
Facebook is adding $1,000 to its employees’ next paychecks to deal with the coronavirus fallout. It’s also giving everyone their full bonus for the quarter regardless of their performance.
Google is delaying the rollout of its informational coronavirus website to “later this week.” The site was at the center of the controversy we talked about yesterday. (Dieter Bohn / The Verge)
Apple is keeping its retail stores outside mainland China closed indefinitely as the global spread of the coronavirus continues. The iPhone maker was originally targeting March 27th to reopen locations in the US and elsewhere around the world. (Nick Statt / The Verge)
Demand for Amazon delivery is soaring as more people are forced to stay home. Some Amazon workers worry the situation is creating a potential health crisis, and say the company isn’t doing enough to protect them. (Caroline O’Donovan and Ken Bensinger / BuzzFeed)
Amazon is prioritizing the shipment of “household staples, medical supplies and other high-demand products” due to the coronavirus pandemic. The company is also suspending some of its “Fulfillment by Amazon” program, which typically provides warehouse and shipping services for products from third-party sellers. (Darrell Etherington / TechCrunch)
At least five workers at Amazon warehouses in Europe have contracted the coronavirus. It’s a sobering development for a company already struggling to hire enough people to deal with the spike in orders. (Matt Day, Daniele Lepido, Helene Fouquet and Macarena Munoz Montijano / Bloomberg)
CVS’s Chief Medical Officer sent employees an email with tips on how to stay safe during the coronavirus pandemic. It included misinformation strikingly similar to the fake Stanford tips we debunked here. Not a good look for a pharmacy!
Uber expanded its previously announced policy on sick pay for drivers during the coronavirus pandemic. Now, drivers who test positive for COVID-19 or have their Uber accounts suspended as the result of public health advice will be eligible for up to 14 days of paid sick leave. (Andrew J. Hawkins / The Verge)
Uber and Lyft suspended Uber Pool and shared rides due to the worsening outbreak of COVID-19. UberX and Uber Eats are still running. (Ryan Broderick / BuzzFeed)
Coronavirus has prompted a wave of direct donations for individuals and businesses hardest hit by the crisis. The giving campaigns are often organized on social media. (Nicholas Kulish / The New York Times)
With millions of people working and learning from home during the pandemic, internet networks are being pushed to the limit. Many providers are rolling out new policies to help people who can’t pay their bills, and preparing to increase capacity on the networks if needed. (Davey Alba and Cecilia Kang / The New York Times)
Coronavirus testing shouldn’t be this hard, but limited investment in the necessary technology means the US is lagging behind other countries in terms of getting fast, reliable tests out the door. (Nicole Wetsman / The Verge)
Coronavirus is making Instagram more intimate. Without a steady stream of brunch photos and beach-vacation selfies, the platform has mutated into close-up scrapbooks of days spent cooped up inside. (Kaitlyn Tiffany / The Atlantic)
⭐The Justice Department dropped its two-year-long prosecution of a Russian company indicted in the Mueller election interference probe. The company was one of three businesses indicted for allegedly carrying out a long-running scheme to criminally interfere with the 2016 election. This seems like a disaster. Here’s Spencer S. Hsu at The Washington Post:
Assistants to U.S. Attorney Timothy Shea of Washington and Assistant Attorney General for National Security John C. Demers cited an unspecified “change in the balance of the government’s proof due to a classification determination,” according to a nine-page filing accompanied by facts under seal.
Prosecutors also cited the failure of the company, Concord Management and Consulting, to comply with trial subpoenas and the submission of a “misleading, at best” affidavit by Yevgeniy Prigozhin, a co-defendant and the company’s founder. Prigozhin is a catering magnate and military contractor known as “Putin’s chef” because of his ties to Russian President Vladimir Putin.
Facebook’s misinformation problem is rooted in its business model: data-targeted ads and algorithmically optimized content. In a new report, researchers at Ranking Digital Rights lay out a prescription for fixing the company. Here’s a good interview with one of the report’s co-authors. (Russell Brandom / The Verge)
⭐Zoom has become the place where we work, go to school and party these days. And while the company was prepared to grow when the coronavirus started to spread, nothing could have prepared it to become a cultural phenomenon. Here’s Taylor Lorenz at The New York Times:
A Facebook group for young people trapped at home called Zoom Memes for Self Quaranteens, founded less than a week ago, has already grown to more than 150,000 members.
College students across the country are going on Zoom blind dates. Parents of sixth-graders at Rosenbaum Yeshiva Of North Jersey organized a Zoom “recess” for their children. Ethel’s Club, a wellness platform, is conducting Zoom tarot card readings, breath work and cannabis hangouts.
It is a high-stakes moment for Zoom, which was founded in 2011 by Eric Yuan, a former Cisco Systems executive. Its sudden cultural cachet also brings new concerns over privacy, security, content moderation, safety for young people and sensitivity to the seriousness of the pandemic. There’s also the tiny matter of keeping the service up and running.
A day in the life of a bike messenger who rides for DoorDash, Uber Eats, and Postmates, in the time of coronavirus. “I thought I’d be getting fat-ass tips. I’m not getting fat-ass tips,” he said. People — the time to give fat-ass tips is now. (Matt DeCaro / Vice)
Sleep gadgets — like the Oura ring worn by Jack Dorsey — are everywhere. My main takeaway here is that Dorsey is still sleeping very well. (Ruth Reader / Fast Company)
Nintendo’s online services temporarily went down. It could be a reflection of high demand during the pandemic. (Michael McWhertor / Polygon)
Cameo turned D-list celebrities into an addictive monetization machine by allowing them to charge for shout-outs. Now, people pay them anywhere from $5 to $2,500 to send short videos, delivered via text or email. And you can do it from home. The perfect pandemic business! (Patrick J. Sauer / Marker)
Stuff to occupy you online during the quarantine.
Browse this list of projects that can help with the COVID-19 response. It’s geared toward software engineers, but anyone can make a contribution. Designers and product managers would also probably be useful here, as would anyone who can donate.
Give Local. Support a local business by buying a gift card online. Lots to choose from in San Francisco, Austin, Chicago, New York, and more.
Save Our Faves is a similar idea focused on San Francisco from Mike and Kaitlyn Krieger. (You may remember Mike from such previous projects as Instagram.)
Here are 450 free Ivy League courses you can take at home. Get smart while you pass the time!
Make one of these toasts. A definitive ranked list by a former Verge staffer.
Play Kingdom Rush Frontiers (iOS and Android) and Kingdom Rush Origins (iOS and Android), two of the best tower-defense strategy games ever made. I have spent days of my life playing these games, and now they’re free for a week.
Subscribe to Shudder, a streaming service for horror movies. It’s free for the next 30 days. Distract yourself from terror with a more entertaining form of terror!
If you’re going to the hospital for a COVID-19 test, make sure you bring with you a valid form of identification along with a printout of your IMDb page and/or your Basketball Reference stats.