A New Form of Propaganda Has Targeted the Central African Republic

A report from Meta shows that groups in Russia and France are battling for political influence over one of Africa’s largest diamond and uranium exporters.

In 2019, Meta, the company formerly known as Facebook, introduced a term into the lexicon of Internet governance: “coordinated inauthentic behavior.” The phrase describes actions taken by individuals or groups to misrepresent themselves on the platform or to use fake accounts to boost the popularity of posts, making certain content more visible to other users. Since coining the term, Meta has used it in its transparency reporting to explain why it has taken down content that deliberately misinforms users. Through this process, Meta has revealed a new form of transnational conflict taking place in countries that many in the Global North would be hard-pressed to find on a map.

The Central African Republic (CAR) is one of those countries. Although a cheat code to its location is contained in its name, it is not well known. Even people in the region are unlikely to be able to name the current president (Faustin-Archange Touadéra) or the second-largest city (Bimbo). CAR’s latest cycle of violence began in 2012 with the collapse of the peace treaty between the president and Séléka rebels, culminating in the de facto partitioning of the country into two parts. Sporadic but intense fighting continues in great part because of the involvement of foreign nations. For example, the president’s security team is composed of Russian mercenaries, some of whom were recently redeployed to fight in Ukraine. On social media, hate speech and misinformation run rampant, and in 2020, Meta’s transparency reports singled out the country as being a target of “coordinated inauthentic behavior” funded by “foreign or government entities.”

In the early days after the Internet became a common domestic utility, there was broad optimism that interconnectedness would ultimately be for the greater good. Today, that picture is tarnished, particularly as the Internet is dominated and shaped by a handful of private companies. Platforms don’t simply provide us with a service. Their real goal is often to track our interactions even outside their own sites in order to monetize every millisecond of our lives. We are under constant surveillance, because these platforms want to sell us things. The hope of a free and open Internet endures, but evidence is mounting that, without some guardrails, those who seek to harm will take advantage of the platforms’ drive for profit to deepen their own power.

Coordinated inauthentic behavior is made possible by the same tools that make it easy for a digital company to serve you an ad for the pair of jeans you were just looking at on social media. The people who build social media companies did not design them for politics; they designed them to encourage connections that can be monetized. But other groups have recognized that a system that can serve you a targeted advertisement for jeans can also sell you a story about your society that changes how you behave politically.

The 2020 statement from Meta identified coordinated inauthentic behavior by more than 100 groups that violated its policy on foreign government interference. These groups were united by a few factors. First, all of the activity originated in either France or Russia. Second, they primarily targeted countries in Africa, although there were a few targets in the Middle East. The activity Meta detected was designed to shift how citizens of the targeted countries like Mali, Niger, and Burkina Faso perceived France and Russia. The report showed that CAR is at the center of a battle between Moscow and Paris for political influence over one of Africa’s largest diamond and uranium exporters.

On Facebook, the company found fake accounts posing as locals of the countries they were targeting and sharing content that engaged with French policy in Francophone Africa. Meta said that the posts included “claims of potential Russian interference in the election in the Central African Republic, supportive commentary about the French military and criticism of Russia’s involvement in CAR.” At the same time, the company found posts originating from Russia that shared content about the election and Russia’s Covid-19 vaccine, and supportive commentary about the CAR government, which currently receives military support from Russian mercenaries. The Russian-origin content also criticized French foreign policy and discussed a “fictitious coup d’état in Equatorial Guinea.”

Meta does not explicitly attribute any of these activities to the national governments of either France or Russia, and coordinated inauthentic behavior is not illegal. But it is unjust. It reorders people’s political preferences in ways that may be inconsistent with how they would behave if they were left on their own. In its purest form, it is propaganda, heightened by the speed and accuracy of digital technology.

CAR is a former French colony that has struggled to decolonize and seems to have traded one imperial power for another. As Russia’s war in Ukraine intensifies, the conflict in CAR becomes an opportunity for Moscow to expropriate resources that can fund this fight.

Coordinated inauthentic behavior exacerbates threats in countries where conflict is connected to social cleavages. Hate speech and incitement to violence are already major social problems in CAR. Adding to the mix external parties that do not have to live with the consequences of such speech is dangerous.

CAR has endured a long history of outsiders creating propaganda to influence its politics. One story that people outside the country may have heard concerns the self-appointed Emperor Jean-Bédel Bokassa, who ruled as emperor between 1976 and 1979. Bokassa’s regime was terrible enough on its own, but because he courted and received military and political support from both the Libyan leader Col. Moammar El-Gadhafi and then–French President Valéry Giscard d’Estaing—at opposite ends of the Cold War political spectrum—CAR was the target of intense Cold War propaganda campaigns. There is ample evidence of his cruelty and mental instability, but the one Bokassa story that circulated around the world is that he was a cannibal who kept the bodies of his opponents in a refrigerator and routinely served them to high-powered guests. This story is not true. Historians say it was invented to sustain an image of an “unhinged native” against whom a rebellion was necessary. If a rumor like this could be created and sustained in the pre-digital age and endure for more than a generation, what would it look like in the digital age?

Long before Meta developed a name for it, countries of the global majority had realized that digital innovation and dissemination were outpacing regulation. Private capital in the West, as well as the governments of China and Israel, is developing technologies that end up subverting political processes in ways that regulators and civil society are struggling to keep up with. In CAR, foreign-funded and foreign-sustained misinformation campaigns are the result. Around the world, digital technologies are exacerbating preexisting cleavages in dangerous ways.

The West’s narratives of digital innovation ignore outcomes like this because they focus on the experiences of wealthy people in wealthy countries. Tech evangelists often assume that laws aren’t needed, or that where they are, there are institutions to implement them and enough political goodwill to ensure that groups will abide by them. But countries like CAR, beset by conflict that’s shaped by competing international interests, show the limits of such a libertarian approach to technology.

Instead of scrambling years later, when it’s almost too late, those who build digital technologies should ask themselves: If the things you build cause this much disruption in societies that ostensibly have institutions that work, what will they do in societies that don’t? Building within such a framework is not undermining innovation. It’s thinking things through properly.

We owe one another a version of progress and innovation that doesn’t make life worse in swaths of the world. We owe each other a version of digital connectedness that thinks through the impact of technology beyond the narrow boundaries of the world we know.

Thank you,
The Editors of The Nation
