Meta chief Mark Zuckerberg has ended eight years of independent fact-checking on Facebook and Instagram, claiming it was stifling “too many voices”. The decision has prompted fears of an unchecked spread of propaganda and fakery. Mint explains the facts behind the row.
What did Zuckerberg announce last week?
Zuckerberg said Meta’s platforms will phase out fact-checks in favour of “community notes”, an approach made popular by Elon Musk’s X (formerly Twitter). Community notes rely on a consensus mechanism among a platform’s users, who append comments when a piece of information is inaccurate. Zuckerberg also pulled back filters on permissible posts and promised more political posts on Facebook, Instagram and Threads. He said there will be fewer automatic bans of posts and accounts because “machines, they make mistakes… of late they’re stifling too many voices.” The move begins in the US but will eventually expand worldwide.
Why does the move matter?
Experts say fact-checking is crucial to handling nationalistic, polarizing content around the world. Plus, intermediaries such as Meta’s platforms aren’t meant to opine on subjects; their “safe harbour” protection rests on this logic. Political agendas that fuel misinformation are widespread today. However, some point to bias among fact-checkers: though inevitable, it can lead them to make qualitative judgment calls. Various political ideologies have claimed over the past decade that tech firms typically lean toward liberals, citing this as a reason they shouldn’t be called “intermediaries”.
Does India have its own fact-checking units?
Meta has 11 fact-checking partners in India. India is Meta’s most complicated fact-checking geography, with teams having to check facts in 18 languages, the most worldwide. Alt News does independent third-party fact-checks, too. The Centre created its own fact-check unit, but it was stayed by the courts last year. For most fact-checkers, Big Tech is a key revenue source.
Will this end the filtering of misinformation?
Community notes check for agreement between users who typically do not agree. However, the process has often failed to filter misinformation; domestic experts said the lack of human fact-checkers is one reason for the waning influence of X in India, one of the world’s largest online economies. That said, Meta could tweak its efforts to add some degree of human intervention over the next year. Doing so could be critical to Meta’s goal of selling its platforms as a hub for business and policy ads to the online world.
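To illustrate the underlying idea, here is a minimal sketch in Python of a consensus check of the kind described above: a note attached to a post is surfaced only when raters from viewpoint clusters that usually disagree both mark it helpful. The function name, cluster labels and thresholds are assumptions for illustration, not the actual algorithm used by X or Meta.

from collections import defaultdict

# Toy "community notes" style consensus check (illustrative only).
# Assumption: each rater belongs to a viewpoint cluster, e.g. inferred
# from past rating behaviour, and a note is shown only when raters
# from at least two different clusters agree it is helpful.
def should_show_note(ratings, min_per_cluster=2, min_helpful_share=0.7):
    """ratings: list of (cluster_id, is_helpful) tuples for one note."""
    by_cluster = defaultdict(list)
    for cluster, is_helpful in ratings:
        by_cluster[cluster].append(is_helpful)
    # Count clusters with enough raters and a high share marking the note helpful.
    supporting = [
        votes for votes in by_cluster.values()
        if len(votes) >= min_per_cluster
        and sum(votes) / len(votes) >= min_helpful_share
    ]
    return len(supporting) >= 2

# Raters from clusters "A" and "B", which usually rate differently,
# both find the note helpful, so it would be displayed.
print(should_show_note([("A", True), ("A", True), ("B", True), ("B", True)]))  # True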
How do Indian laws define fact-checking?
The IT Rules, 2021 put the onus on social media intermediaries to do due diligence and rid platforms of misinformation. Lawyers say “due diligence” can be subjective; thus, the community notes mechanism could work just fine, as it does for X. There is, however, no legal requirement for human fact-checking. Many said this may work against any ruling political party in the states or at the Centre, as community notes automatically filter agendas and suggest other political views as a potential middle ground.
