illu45  ·  2326 days ago  ·  post: YouTube promises to increase content moderation staff to over 10K in 2018

    “Today, 98 percent of the videos we remove for violent extremism are flagged by our machine-learning algorithms,” Wojcicki wrote. “Our advances in machine learning let us now take down nearly 70 percent of violent extremist content within eight hours of upload and nearly half of it in two hours and we continue to accelerate that speed,” she added.

Like most big tech companies these days, YT/Google is primarily interested in automation (in this case, of content flagging). It'd be interesting to know whether the new staff will be flagging content manually (doubtful), reviewing content that has already been flagged by their algorithms (most likely), or refining their automated flagging algorithms (also possible).

FirebrandRoaring  ·  2325 days ago

Sounds like they'd be most useful at the "manual appeal" station. That's where creators whose content got flagged send requests for manual review, after which most videos lose their flagging.

Also sounds like Google is big on algorithms. They're the technoshamans of the big stage.