Kenyan Court Allows Suit Against Meta Over Ethiopian Conflict Content
The Meta logo is displayed at the company's temporary stand ahead of the World Economic Forum in Davos, Switzerland, on January 18, 2025. REUTERS/Yves Herman
A Kenyan court has ruled that a lawsuit against Facebook's parent company, Meta, can proceed over claims that the platform promoted content that fueled ethnic violence in Ethiopia. The ruling sets a precedent: Meta can be held accountable in Kenya even though it is not registered there.
The case stems from the conflict that gripped Ethiopia's Tigray region between 2020 and 2022. The plaintiffs allege that Facebook amplified hateful content during the fighting, raising a broader question: what responsibility do global technology companies bear when their platforms become tools for hate?
Meta argued that Kenyan courts lack jurisdiction over it because it is not registered in the country. Kenya's High Court dismissed that claim. The plaintiffs, who include the Katiba Institute and two Ethiopian researchers, welcomed the decision. Nora Mbagathi, executive director of the Katiba Institute, said: "The court here has refused to shy away from determining an important global matter, recognising that homegrown issues must be addressed directly in our courts."
The personal stakes are stark. Plaintiff Abrham Meareg says his father, Meareg Amare, was killed in 2021 after threats against him spread through Facebook posts. Fellow plaintiff Fisseha Tekle, a human rights researcher for Amnesty International, says he faced online vitriol over his work on Ethiopia. Their accounts underscore the real-world consequences of online content.
The plaintiffs want Meta to establish a restitution fund for victims of online hate and violence, and to change Facebook's algorithms, which they say prioritize incendiary speech. The demand, they argue, is not about stifling free expression but about protecting it from manipulation.
Meta says it has invested heavily in content moderation and has removed hateful content from its platform. Those assurances will now be tested in court against the experiences of the plaintiffs.
The lawsuit is the third legal challenge Meta has faced in Kenya in recent years. Content moderators employed through a local contractor previously sued over poor working conditions and dismissals they say followed attempts to unionize. Meta has said it requires its partners to provide industry-leading conditions.
Despite its stated investments in moderation, Meta ended its U.S. fact-checking program earlier this year. It now addresses some rule violations reactively rather than proactively, reviewing only content that users themselves flag.
In a world increasingly bound together by digital platforms, each such lawsuit and precedent helps define how much power, and how much responsibility, technology companies truly bear.
Reported by Hereward Holland; Edited by Aaron Ross and Tomasz Janowski
Edited by Ali Musa, Axadle Times International – Monitoring