I'll believe it when I see it. YouTube (Google) doesn't care much anymore.
Like most big tech companies these days, YT/Google is primarily interested in automation (in this case, of content flagging). It'd be interesting to know whether the new staff will be flagging content manually (doubtful), reviewing content that has already been flagged by the algorithms (most likely), or refining the automated flagging algorithms themselves (also possible). “Today, 98 percent of the videos we remove for violent extremism are flagged by our machine-learning algorithms,” Wojcicki wrote. “Our advances in machine learning let us now take down nearly 70 percent of violent extremist content within eight hours of upload and nearly half of it in two hours and we continue to accelerate that speed,” she added.
Sounds like the most useful place for them would be the "manual appeal" station: it's where creators whose content got flagged send requests for manual review, after which most videos lose their flag. Also sounds like Google is massive on algorithms. They're the technoshamans of the big stage.
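Purely for illustration, here's a toy sketch of the flag → appeal → human-review loop described above; every name in it is made up, not anything YouTube actually exposes:

```python
from dataclasses import dataclass

@dataclass
class Video:
    id: str
    flagged: bool = False  # set True by the automated flagging pass

# Toy appeal queue: creators contest automated flags, a human reviews them,
# and (per the comment above) most flags end up getting lifted.
class AppealQueue:
    def __init__(self):
        self.pending = []  # videos awaiting a human reviewer

    def appeal(self, video):
        # Creator contests an automated flag.
        if video.flagged:
            self.pending.append(video)

    def run_manual_reviews(self, is_actual_violation):
        # is_actual_violation stands in for human judgment.
        for video in self.pending:
            if not is_actual_violation(video):
                video.flagged = False  # appeal succeeds, flag removed
        self.pending.clear()

# Example: the algorithm flags two videos, a human upholds only one flag.
q = AppealQueue()
a, b = Video("a", flagged=True), Video("b", flagged=True)
q.appeal(a)
q.appeal(b)
q.run_manual_reviews(lambda v: v.id == "b")
print(a.flagged, b.flagged)  # False True
```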
So, they're only doing the policing when inaction would lead to an uproar. The lines that define their actions are starting to get blurry for me.
Cool stats from the same thread: a thousand days' worth of content is uploaded to YouTube every hour. That's 24,000 days' worth per day. If, say, 10% of that needed human review: 10% × 24,000 days = 2,400 days, and 2,400 days × 24 = 57,600 hours. At a guessed $10,000,000/day budget, $10,000,000 ÷ 57,600 hours ≈ $174/hr. of content. (There's a better way to write the whole equation, but I really wanted to use the multiplication sign.)
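If anyone wants to poke at those numbers, here's the same back-of-the-envelope math as a runnable snippet (the 10% review fraction and the $10M/day budget are guesses from the thread, not real figures):

```python
# Back-of-the-envelope check of the numbers above.
days_per_hour = 1_000                 # ~1,000 days of video uploaded per hour
days_per_day = days_per_hour * 24     # 24,000 days' worth per day

review_fraction = 0.10                # assume 10% needs human review
review_hours = days_per_day * review_fraction * 24  # 2,400 days -> 57,600 hours

budget_per_day = 10_000_000           # hypothetical $10M/day
print(f"{review_hours:,.0f} hours/day, ≈ ${budget_per_day / review_hours:.0f}/hr")
# -> 57,600 hours/day, ≈ $174/hr
```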