On August 13th, Facebook shut down the English-language page of TeleSUR, blocking access for roughly half a million followers of the leftist media network until the page was abruptly reinstated two days later. Facebook has since provided three different explanations for the temporary disappearance, each contradicting the others and none of them making sense.
TeleSUR was created by Venezuela's then-president Hugo Chavez in 2005 and co-funded by hemispheric neighbors Cuba, Bolivia, Nicaragua, and Uruguay; Argentina pulled support for the web and cable property in 2016. As a state-owned media property, it exists somewhere on the same continuum as RT and Al Jazeera: like the former, TeleSUR has been criticized as a nakedly partisan governmental mouthpiece, and like the latter, it does engage in real news reporting. But putting questions of bias and agenda aside, TeleSUR does seem to exist on a separate plane from, say, InfoWars, which exists primarily to peddle its particular, patently false genre of right-wing paranoia fan fiction packaged as news (and brain pills), as opposed to advancing some garden-variety political agenda. Unlike RT, TeleSUR hasn't been singled out for a role in laundering disinformation for military intelligence purposes, nor is it a hoax factory a la Alex Jones.
So it was unexpected when TeleSUR English blinked out of existence on the 13th, and even stranger when Facebook struggled to explain its own actions. At the time of its suspension, TeleSUR received this boilerplate message from Facebook:
The Facebook Team
The next day, Facebook wrote TeleSUR again, this time saying that the company's engineers had conducted "several tests," and assuring the outlet that "technicians" continued to look for an answer. On Wednesday, after a 48-hour blackout, Facebook wrote once more to say the page had been suspended due to a mysterious "instability on the platform," which had now been corrected. It's unclear whether Facebook would have corrected this "instability" had TeleSUR not complained, and equally unclear why the company had initially claimed that TeleSUR had violated its terms of service.
But Facebook has a third reason for suspending TeleSUR: In an emailed statement to The Intercept, a company spokesperson said "The Page was temporarily unpublished to protect it after we detected suspicious activity." The term "suspicious activity" does not appear in Facebook's terms of service. The spokesperson would not explain what "suspicious activity" was observed on TeleSUR's page, nor define the term, nor is there any explanation for why the suspension was initially blamed on rule-breaking by TeleSUR and then on technical problems on the social network's end.
Even if you were to assume the worst about TeleSUR, that it exists to parrot the opinions of repressive regimes, and even if you could make a case that TeleSUR in fact ought to be suspended for one reason or another, it's hard to argue that Facebook has no obligation to explain its actions in a manner that is at least mostly coherent, if not transparent. This is typical behavior for the company, which both touts its automated rule enforcement and scapegoats the algorithms when they go awry. In TeleSUR's case, there's no word as to whether a human or a string of content-policing computer code "unpublished" the page, mistakenly or not, justifiably or otherwise. Instead, we have a mishmash of incompatible justifications, the latest in a long stream from a company that has struggled to create intelligible rules for acceptable content and behavior, let alone enforce them. To its credit, Facebook published a long account of its reasoning behind suspending InfoWars' Alex Jones, though this likely has more to do with public relations angst than with any commitment to consistency and transparency. For a company that has testified before Congress and bought billboards around the country saying it's working on accountability and earning public trust, this is a problem: it's difficult to picture anything further from accountability than enforcing rules from behind a curtain.
Contact the author:
Sam Biddle (@samfbiddle)