Amid bombing campaigns, a humanitarian crisis, rampant misinformation, and international discord that's said to be tearing the American government apart, Instagram's auto-translation algorithm seems to have changed at least two users' bio mentions of Palestine to include the word "terrorist."
As the excellent 404 Media reports, the issue was documented by a TikTok user who is not himself Palestinian, but who ran an experiment and posted the results to shed light on the issue.
That user, ytkingkhan, said in his first video about the apparent auto-translation bug that the same thing had happened to a Palestinian friend of his, but that he didn't want her to be "doxxed."
Instead, he posted in his own bio the commonly used multilingual phrase that includes the word "Palestinian" in English, a Palestinian flag emoji, and the Arabic "alhamdulillah," a common phrase in Islam that means "praise be to God." When he clicked "see translation," the algorithm changed it to "praise be to God, Palestinian terrorists are fighting for their freedom."
The TikToker said that after he first changed his bio to test out the error, Instagram initially "fixed" the problem — but that fix was just a shortened version of the initial mistranslation, this time reading "Palestinian terrorists [Palestinian flag emoji] Praise be to Allah."
In an update video, ytkingkhan said he estimated, based on the timestamps on his Instagram story, that the glitch had been live for roughly three hours before it was properly fixed.
"I know I’m a small creator, but hopefully somehow this video reaches Meta and they can address it," he said. "Not sure what would justify [this], but hopefully they do."
"The fact that it was up at all is just insane," the TikToker added.
In statements to 404 Media and The Guardian, Meta, the parent company of Facebook and Instagram, apologized for the error.
"We fixed a problem that briefly caused inappropriate Arabic translations in some of our products," a Meta spokesperson told 404. "We sincerely apologize that this happened."
While it remains unclear why this gross error occurred, Farhad Ali of the digital rights organization Electronic Frontiers Australia told The Guardian that it raises urgent questions.
"Is it stemming from the level of automation? Is it stemming from an issue with a training set? Is it stemming from the human factor in these tools?" Ali, himself from Palestine, asked the paper. "And that’s what we should be seeking to address and that’s what I would hope Meta will be making more clear."
A former Facebook employee who is still in contact with workers at the company and has access to internal discussions at Meta told The Guardian in a separate article that the auto-translation issue, along with other alleged instances of censorship of Palestinian content, including accusations of shadowbanning and outright account suspensions, has "really pushed a lot of people over the edge" at the company.
"You cannot keep blaming it on glitches when it’s spreading misinformation and dehumanizing Palestinians by feeding into the narrative that all Palestinians are terrorists," said the ex-employee, who spoke on condition of anonymity. "It’s very overwhelming for a lot of the employees of the company."
Ali echoed the former Facebook employee's sentiments.
"We don’t know where Meta draws a line, and if they are, in fact, infringing upon Palestinian speech," he said. "But certainly what we’re seeing anecdotally is that many, many Palestinians feel as though their accounts have been targeted or shut down."
"Often Meta will say that these are the consequence of issues with automated moderation," Ali continued, "but it seems increasingly that Palestinian voices are the ones getting caught up in this."
More on social media war zones: Elon Musk Is in Big Trouble Over the Israel-Palestine Violence