NEW YORK - Meta's independent oversight board on Tuesday criticized the social media giant for removing posts that showed human suffering in the Middle East conflict.
The board, set up by Meta in 2020 as a supreme court of sorts for the social media titan, overturned two post removal decisions, and urged the company to respond more quickly to changing circumstances in the war between Hamas and Israel.
One case involved the removal by Instagram of a video showing what appeared to be the aftermath of a strike on or near Al-Shifa Hospital in Gaza City during a ground offensive by Israel.
The video showed Palestinians, including children, injured or killed, according to the board.
A second case involved Facebook's decision to remove a video of an Israeli woman begging her kidnappers not to kill her as she is taken hostage during Hamas raids on Israel on October 7, the board said.
"These decisions were very difficult to make," oversight board co-chair Michael McConnell said in a release.
"The board focused on protecting the right to the freedom of expression of people on all sides about these horrific events, while ensuring that none of the testimonies incited violence or hatred."
The board urged Meta to preserve any removed posts that might contain evidence of human rights violations.
Meta told the board that it temporarily lowered thresholds for automatic removal of posts with potentially harmful content after the Hamas-led attack on Israel, according to the overseers.
The use of automated tools for content moderation at Facebook and Instagram increases the likelihood of removing posts showing the harsh reality of what is happening in the conflict, according to the board.
"These testimonies are important not just for the speakers, but for users around the world who are seeking timely and diverse information about ground-breaking events," McConnell said.
"Some of which could be important evidence of potential grave violations of international human rights and humanitarian law."
Content decisions by the oversight board are binding, but its recommendations are not, according to Meta.
The conflict between Israel and Hamas has claimed many lives and stirred intense emotions around the world.
Social networks have been flooded with violent imagery along with fabricated content intended to misinform, in a challenge to online platforms.
The European Union in October sent Meta a request for information about the dissemination of violent and terrorist content on its platforms.
Similar investigations are targeting TikTok, owned by China-based ByteDance, and X, formerly known as Twitter.