Brazil’s Far Right Plots Its Own January 6

Telegram did not respond to a request for comment.

Telegram was used heavily by organizers of the January 6 riot in the US. The platform is largely unmoderated, with narrow exceptions for pornographic and terrorist content, making it a hub for conspiracy theories and disinformation that might otherwise be removed from platforms like Facebook, Instagram, and Twitter.

Many of these Telegram channels are publicly searchable and have thousands of members who share tens of thousands of pieces of content a month. Members often refer to Bolsonaro’s opponent, Luiz Inácio Lula da Silva, as a communist and allege that any outcome that doesn’t favor Bolsonaro will be the result of a corrupted electoral process.

But Telegram does not operate in a vacuum. “In Brazil, the center of misinformation is not Telegram itself, it’s YouTube,” claims Leonardo Nascimento, a professor at Universidade Federal da Bahia and researcher at the Internet Lab. Telegram, he says, is often used as a conduit to spread links to YouTube videos. According to Nascimento’s research, the most popular videos are often clips or interviews with Bolsonaro himself, shared hundreds of times across multiple groups. Bolsonaro has repeatedly cast doubt on the validity of Brazil’s elections, prompting a federal police investigation into his claims about the country’s voting systems.

“On one side you have honest soldiers without any accusation of corruption. On the other side, you have two thieves. Which one would you invite into your house?” asked one video from the YouTube channel PodVoxx that was recently shared in a Telegram group with more than 15,000 users. Nascimento’s research showed that in just 90 days, more than 300,000 YouTube links were shared in the right-wing groups in Brazil that he monitors.

According to research from the Internet Lab, the most common misinformation links on Telegram direct users to unlisted YouTube videos, meaning they cannot be found through on-platform search and can only be accessed by those who have the URL. That makes such links hard for outsiders to find, but not for YouTube itself, Nascimento argues. “[YouTube] knows that these links are being shared,” he says. He also claims that the platform tends to be slower than Meta or Twitter when it comes to removing hateful and extremist content.

YouTube spokesperson Cauã Taborda says there is no difference in its moderation practices for listed and unlisted videos. But Nascimento says that because platforms enforce policies differently—if at all—harmful content can continue to circulate in one way or another. “The problem is not Twitter itself, or YouTube itself, or other platforms,” says Nascimento. “The problem is the whole system.”

Additional reporting by Priscila Bellini. 
