Two Rohingya refugees, Mohammad Hamim and Kawsar Mohammed, have filed a Public Interest Litigation (PIL) in the Delhi High Court, seeking intervention to address hateful and inflammatory content against the Rohingya community on Facebook, the platform owned by Meta.
The PIL calls for directions to Meta to stop the spread of such content and dismantle algorithms that promote hate speech and violence against minority communities. Advocate Kawalpreet Kaur, representing the petitioners, has alleged that misinformation and harmful content originating in India target Rohingya refugees on Facebook, and that the platform appears to deliberately fail to act against such content.
The PIL says that Facebook’s algorithms actively amplify such harmful content.
The petition stresses the highly politicised nature of the Rohingya refugee presence in India, stating that the community is disproportionately targeted with content portraying them as a threat, often using terms such as ‘terrorists’ and ‘infiltrators.’
The plea refers to a 2019 study by Equality Labs, which found that a significant share of Islamophobic posts on Facebook in India specifically targeted Rohingya, despite their minimal representation within India’s Muslim population.
The PIL argues that Facebook’s failure to act against hate speech poses a threat to the lives of Rohingyas, violating their right to life under Article 21 of the Constitution.
The petitioners further contend that Meta is in violation of Section 79(3) of the Information Technology Act read with Rule 3 of the Information Technology (Intermediaries Guidelines) Rules, 2011, which sets out the due diligence to be observed by intermediaries.
Hamim and Mohammed seek court directions to Meta to suspend accounts promoting hate against the Rohingya community and to report transparently on how its content moderation policies are applied to flagged content.
The plea further demands an India-specific report on hate speech content moderation, specifying removal decisions, appeals, and outcomes.