
OPINION: Telegram’s hands-off approach to content faces a reckoning

The recent arrest of Telegram founder and Chief Executive Officer Pavel Durov at Le Bourget airport near Paris has sent shockwaves through the tech world. Elon Musk called on France to “free Pavel” to avert a threat to democracy; Paul Graham, the co-founder of leading Silicon Valley accelerator Y Combinator, suggested it would hurt the country’s chances of being “a major startup hub.” Yet while some cast the arrest as a French-led assault on free speech and innovation, the reality is more nuanced.

Durov’s detention is not a shocking act of government overreach but the culmination of years of tension between his ultra-lax approach to oversight and growing concern about Telegram’s role in enabling criminal activity. The charges are extensive and serious, covering Telegram’s complicity in the distribution of child sexual-abuse material, or CSAM, as well as drug trafficking and money laundering. The likes of Meta Platforms Inc., TikTok and Alphabet Inc.’s YouTube enforce much stricter bans on such activities, but Durov’s arrest should also be taken as a sign that the “no consequences” era for social media is fading as governments push to make companies more accountable for what happens on their apps.

Telegram is one of the world’s biggest social media platforms with an estimated 900 million monthly users, many of whom follow popular channels that broadcast content to thousands of people. It’s also unique in its approach to overseeing all that activity: It doesn’t. While its peers invest heavily in content moderation and cooperate with law enforcement, Telegram has a minimal-intervention policy that has contributed to its low operational costs. Durov once told the Financial Times that each Telegram user cost the company just 70 cents a year to support.

His platform has been linked to conspiracy-theory groups, the spread of CSAM and terrorism, with ISIS having reportedly used the app as a communication hub for nearly a decade. Such groups are drawn not just by the app’s supposed secrecy but by its “anything goes” approach to moderation. During the recent UK riots, calls to violence proliferated on the platform even though they broke the app’s rules; one such post was taken down only after I contacted the company about it. Despite all this, Telegram has proudly maintained a stance of non-cooperation. In its FAQs, the company states that “to this day, we have disclosed 0 bytes of user data to third parties, including governments.”

Now, in response to the arrest, Telegram has said it’s “absurd to claim that a platform or its owner are responsible for abuse of that platform. Telegram abides by EU laws, including the Digital Services Act — its moderation is within industry standards and constantly improving.”

But it’s far from “absurd” for a company to be held accountable for criminal activity on its platform. Telegram is in this position because of its choice to avoid content moderation — and not because of an encroaching effort by a government to conduct surveillance on its supposedly secret chats. Cryptography experts have long pointed out that Telegram is not fully end-to-end encrypted. Most chats on the app use client-server encryption, meaning Telegram could access message contents if it chose to — and much of the content on the platform is on public channels anyway. The company’s “Secret Chats” feature does offer end-to-end encryption, but that’s not the default and it isn’t always used for regular communication. In essence, Telegram has created an illusion of total privacy while retaining the technical means to monitor content — a capability it chooses not to use.

France’s move against Durov marks a reckoning for that choice, and the involvement of specialized units such as the country’s Centre for the Fight against Cybercrime and the Anti-Fraud National Office highlights the gravity of his app’s alleged offenses. Musk and other critics may argue that his arrest threatens free speech, but Telegram’s hands-off approach to much of the activity on its platform doesn’t grant it freedom from consequences. The digital world requires as much governance as the physical one, and when a platform becomes a tool for widespread criminal activity, turning a blind eye isn’t a defense of liberty but a dereliction of duty.


One lesson the tech industry can glean from this week’s developments is that social media giants can no longer expect to operate in a regulatory vacuum. Europe is on track to take a harsher line on harms that occur on social media as laws like the EU’s Digital Services Act and Britain’s Online Safety Act take effect. The charges brought by French prosecutors aren’t connected to the new EU law, but they are part of a broader shift toward more aggressive enforcement. Tech’s leading players aren’t as untouchable as they thought they were.

Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is author of “We Are Anonymous.” This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

