Following the January 6 attack on the U.S. Capitol, a heated debate has ensued over how platforms moderate content and what constitutes protected free expression.
Facebook spends billions of dollars a year reviewing millions of pieces of content, a cumbersome and costly process. While TikTok employs content moderators directly, Facebook, Twitter, and YouTube outsource most of this laborious task to thousands of workers at third-party firms.
Many moderators in the United States and abroad say they need better pay, working conditions, and mental health support because of the horrific material they witness while sorting through hundreds or thousands of posts each day.
As a result, some companies are relying increasingly on algorithms, in the belief that automated systems can handle most of the dirty work. But experts say machines cannot catch everything, including the subtleties of hate speech and disinformation. A number of alternative social networks, such as Parler and Gab, gained popularity by promising minimal content moderation. That approach got Parler temporarily banned from Apple's and Google's app stores and dropped from Amazon Web Services' hosting.
Other networks, such as Nextdoor and Reddit, rely almost entirely on volunteer moderators.