Monday, August 26, 2024

Telegram Founder Arrested

Nadeem Badshah and Reuters (Hacker News, MacRumors, The Verge):

Pavel Durov, billionaire co-founder and chief executive of the Telegram messaging app, was arrested at the Bourget airport outside Paris on Saturday evening, TF1 TV said, citing an unnamed source.

[…]

Telegram offers end-to-end encrypted messaging and users can also set up “channels” to disseminate information quickly to followers.

John Gruber:

One-on-one chats in Telegram are not encrypted by default and group chats never are. Telegram employees have access to every single message ever sent to every group.

Nick Heer:

I believe it is best to wait until there is a full description of the crimes French authorities are accusing Durov of committing before making judgements about the validity of this arrest.

[…]

One can quibble with Telegram’s choices. How appealing it is to be using an app which does not support end-to-end encryption by default is very much a user’s choice. But one can only make that choice if Telegram provides accurate and clear information. I have long found Apple’s marketing of iMessage deceptive. Telegram’s explanation of its own privacy and security is far more exploitative of users’ trust.

Matthew Green (Hacker News):

This post is inspired by the recent and concerning news that Telegram’s CEO Pavel Durov has been arrested by French authorities for its failure to sufficiently moderate content. While I don’t know the details, the use of criminal charges to coerce social media companies is a pretty worrying escalation, and I hope there’s more to the story.

There are some details here (Hacker News), but I don’t have a good understanding of the charges. Some questions that come to mind:

Matthew Gault (Hacker News):

Telegram is a lot of things—a great place for open-source intelligence about war, a possible vector for child sex abuse material, and a hub for various scams and crimes—but it is absolutely not an encrypted chat app. Does Telegram provide an encrypted chat option? Yes, but it’s not on by default and turning it on isn’t easy.

[…]

Telegram is mostly about big group chats and channels where people share information with their fans.

Maybe the encryption issue is a distraction; it’s mostly a Facebook-style social network. How would E2EE make sense there?

Update (2024-09-09): Preston Byrne (via Hacker News):

Most countries do not have such a permissive regime. France is part of that group. In 2020, for example, France enacted the Loi Lutte Contre la Haine sur Internet (Law against hate speech on the Internet), in relation to which global Internet companies can be fined $1.4 million per instance, and up to 4% of their total worldwide revenue, for failing to restrict “hate speech” (which in the United States constitutes “protected speech”) from their websites. Similarly, Germany has its own law, the Netzwerkdurchsetzungsgesetz or “Network Enforcement Act” (sometimes referred to as the “Facebook-gesetz” but more commonly known by its acronym, the NetzDG), in relation to which politically inflammatory content must come down or the government has the power to impose fines north of EUR 50 million.

[…]

If, however, the French are simply saying that Durov’s failure to police his users or respond promptly to French document requests is the crime (which I suspect is the case), then this represents a dramatic escalation in the online censorship wars. What it means is that European states are going to try to extraterritorially dictate to foreign companies what content those companies can and cannot host on foreign-based webservers.

If correct, this would represent a major departure from the U.S.-compliant approach most U.S.-headquartered social companies currently take, which has generally governed the global compliance strategies of most non-China social media companies, including any which offer greater or lesser degrees of full encryption on their services (Telegram’s “Secret Chats” feature, WhatsApp, and Signal among them). In brief, platforms thought that if they didn’t specifically intend their platforms to be put to criminal use, they were unlikely to find themselves on the receiving end of criminal charges. That’s not true anymore, apparently.

[…]

Facebook’s popular encrypted messaging app WhatsApp has, famously, been used for years by the erstwhile non-state terror organization in, and now rulers of, Afghanistan, the Taliban. This fact was widely known by NATO generals and reported in the press during the Afghan war, and was even reported on again in the New York Times as recently as last year[…]

Zlatti71:

Pavel Durov said that he was lured to France by President Macron.

It turns out that the French president had invited Durov to dine with him. Pavel mentioned this during interrogation by the police, the French newspaper Le Canard enchaîné claims.

But instead of lunch, Durov was met in France by local police.

Victor Goury-Laffont (Slashdot):

President Emmanuel Macron said Monday that the French government was not involved in the arrest of Telegram founder and CEO Pavel Durov.

Jon Brodkin (Slashdot):

On Monday, prosecutor Laure Beccuau issued a statement saying Durov was arrested “in the context of a judicial investigation” into a “person unnamed.” The wording leaves open the possibility that the unnamed person is someone else, but the prosecutor’s statement listed a raft of potential charges that may indicate what Durov could be charged with.

Barbara Surk and Angela Charlton (via Hacker News):

Preliminary charges under French law mean magistrates have strong reason to believe a crime was committed but allow more time for further investigation.

Telegram (via Hacker News):

Telegram abides by EU laws, including the Digital Services Act — its moderation is within industry standards and constantly improving.

[…]

It is absurd to claim that a platform or its owner are responsible for abuse of that platform.

Telegram:

Establishing the right balance between privacy and security is not easy. You have to reconcile privacy laws with law enforcement requirements, and local laws with EU laws. You have to take into account technological limitations. As a platform, you want your processes to be consistent globally, while also ensuring they are not abused in countries with weak rule of law. We’ve been committed to engaging with regulators to find the right balance. Yes, we stand by our principles: our experience is shaped by our mission to protect our users in authoritarian regimes. But we’ve always been open to dialogue.

Sometimes we can’t agree with a country’s regulator on the right balance between privacy and security. In those cases, we are ready to leave that country. We’ve done it many times. When Russia demanded we hand over “encryption keys” to enable surveillance, we refused — and Telegram got banned in Russia. When Iran demanded we block channels of peaceful protesters, we refused — and Telegram got banned in Iran. We are prepared to leave markets that aren’t compatible with our principles, because we are not doing this for money. We are driven by the intention to bring good and defend the basic rights of people, particularly in places where these rights are violated.

Jason Koebler:

We at 404 Media have seen and reported on much of the illegal activity on Telegram with our own eyes. Telegram is widely and blatantly used in the open by drug dealers who advertise their products on Facebook and Instagram, hackers who sell credit cards in public groups, hacking crews that have begun to commit physical violence against each other, widespread fraud rings, and people who make and sell nonconsensual, AI-generated sexual content of celebrities, ordinary people, and minors.

Crucially, much of this content is not encrypted, because group chats on Telegram are not encrypted and because encryption is not enabled by default. It would be more accurate to call Telegram a messaging app on which a version of encryption can be enabled for certain chats if you want. It is not really an “encrypted messaging app.” Many of these devices and groups are advertised in the open, and many of these groups have thousands of users. In our experience, Telegram does very little to remove this sort of activity, and in many years of reporting on them, we can think of only one instance in which Telegram actually banned a group we sent to them.

[…]

It can be simultaneously true that Pavel Durov has enabled some of the worst things on the internet via Telegram but that his arrest partially on the grounds of “providing cryptology services” should be more broadly concerning.

Mike Masnick:

The problem is, without more details, we have no idea what is actually being charged and what his alleged responsibility is. After all, we’ve seen other cases where people have been charged with sex trafficking, when the reality was that was just how law enforcement spun a refusal to hand over data on users.

On top of that, leaping to criminal charges against an exec over civil penalties for a company… seems strange. For that to make any sense, someone should need to show actual criminal behavior by Durov, and not just “his service hosted bad stuff.”

[…]

The other interesting point is how central Telegram has been to Russia’s war in Ukraine, for both sides.

Of course, Europol has also said that Telegram cooperates with its request for dealing with terrorism online. And other reports have talked about Telegram cooperating with German officials and handing over data on users.

[…]

Also, I have to remind folks that a little over two decades ago, France also put out an arrest warrant on Yahoo CEO Tim Koogle, charging him as a war criminal, because Yahoo’s auction site in the US (notably, not the French version) allowed people to sell Nazi memorabilia. Eventually he was acquitted.

jgarzik:

Most people really, really do not understand the large amount of military traffic on Telegram, and the consequence of that during wartime... and how valuable that is to multiple nation-states around the world.

Strategic comms, soldier command and control, battlefield drone command and control, intel asset management.

ProPublica:

Telegram’s ease of use, its huge public channels and the ability to encrypt private conversations have helped fuel its global appeal. Ukrainian President Volodymyr Zelensky used the app to rally his compatriots to repel the Russian invasion. Activists in Hong Kong turned to Telegram to organize demonstrations against a repressive law. In Belarus, pro-democracy forces used the platform to fight back against election fraud.

Mike Masnick:

I would bucket the list of charges into four categories, each of which raises concerns.

[…]

It says there was a “refusal to communicate, at the request of competent authorities, information or documents necessary for carrying out and operating interceptions allowed by law.” This could be about encryption, and a refusal to provide info they didn’t have, or about not putting in a backdoor. If it’s either of those, that would be very concerning. However, if it’s just “they didn’t respond to lawful subpoenas/warrants/etc.” that… could be something that’s more legitimate.

[…]

In the end, though, a lot of this does seem potentially very problematic. So far, there’s been no revelation of anything that makes me say “oh, well, that seems obviously illegal.” A lot of the things listed in the charge sheet are things that lots of websites and communications providers could be said to have done themselves, though perhaps to a different degree.

Ronny Reyes:

Among the charges was an allegation that he refused to help French authorities wiretap users of the site who were suspected of crimes, Paris prosecutors said.

[…]

Prosecutor Laure Beccuau accused Durov, 39, of showing a “near-total absence” of replies to legal demands from officials looking for Telegram to help crack down on crime tied to its services.

Lindsay Clark:

He has since been released on €5 million bail, is not allowed to leave France, and must report to the police twice a week.

Albert Wenger:

How much moderation should there be on social networks? What are the mechanisms for moderation? Who should be liable for what?

The dialog on answering these questions about moderation is broken because the most powerful actors are motivated primarily by their own interests.

Mike Rockwell:

Why should we allow governments to force companies to moderate the content shared through their services? Why should we be treating speech online any differently than speech spoken in person?

Should restaurants be forced to moderate the speech of their patrons? Should they be forced by their government to install microphones at each table to ensure their customers aren’t sharing misinformation or engaging in illegal activity? Of course not.

Should customers be told that they are only allowed to speak in a restaurant if they do so in code? Of course not.

Nick Heer:

It is important to more fully contextualize Telegram’s claim since it does not seem to be truthful. In 2022, Der Spiegel reported Telegram had turned over data to German authorities about users who had abused its platform. However, following an in-app user vote, it seems Telegram’s token willingness to cooperate with law enforcement on even the most serious of issues dried up.

I question whether Telegram’s multi-jurisdiction infrastructure promise is even real, much less protective against legal demands, given it says so in the same FAQ section as its probably wrong “0 bytes of user data” claim. Even so, Telegram says it “can be forced to give up data only if an issue is grave and universal enough” for several unrelated and possibly adversarial governments to agree on the threat. CSAM is globally reviled. Surely even hostile governments could agree on tracking those predators. Yet it seems Telegram, by its own suspicious “0 bytes” statistic, has not complied with even those requests.

Durov’s arrest presents an internal conflict for me. A world in which facilitators of user-created data are responsible for their every action is not conducive to effective internet policy. On the other hand, I think corporate executives should be more accountable for how they run their businesses. If Durov knew about severe abuse and impeded investigations by refusing to cooperate with information the company possessed, that should be penalized.

Max Read (via Hacker News):

But there are also limits to the “media company” as an analogy to explain them, or to understand their place in the world. More newsy coverage of Durov’s arrest has seemed to imply that the complaint underlying the specific charges is less the content viewable on Telegram and more his and his company’s unwillingness to assist French (and European) law enforcement in tracking down the people posting it, as the Times writes[…] If you accept this reporting of events, I suppose in somewhat indirect sense Durov’s arrest is a “free speech” issue, but it’s not really a “censorship” issue, as Carlson would have it.

Sean Hollister:

Twelve days after he was arrested in France, Telegram CEO Pavel Durov has broken his silence with a 600-word statement on his Telegram account that blames “growing pains that made it easier for criminals to abuse our platform.”

[…]

While the vast majority of his statement today paints his arrest as surprising and unfair, he also admits that policing Telegram has become harder. Durov says it’s now his “personal goal” to “significantly improve things in this regard.”

Mia Sato (Hacker News):

Telegram has quietly removed language from its FAQ page saying private chats were protected and that “we do not process any requests related to them.”

[…]

In response, Telegram spokesperson Remi Vaughn says the app’s source code has not changed.

[…]

Earlier on Thursday evening, Durov issued his first public statement since his arrest, promising to moderate content more on the platform, a noticeable change in tone after the company initially said he had “nothing to hide.”

Emma Roth:

Durov says the service has stopped new media uploads to its standalone blogging tool, Telegraph, because it was “misused by anonymous actors.”

[…]

Telegram has also removed its People Nearby feature, which lets you find and message other users in your area. Durov says the feature has “had issues with bots and scammers” and was only used by less than 0.1 percent of users. Telegram will replace this feature with “Businesses Nearby” instead, allowing “legitimate, verified businesses” to display products and accept payments.

Pavel Durov:

While 99.999% of Telegram users have nothing to do with crime, the 0.001% involved in illicit activities create a bad image for the entire platform, putting the interests of our almost billion users at risk.

That’s why this year we are committed to turn moderation on Telegram from an area of criticism into one of praise.

All this sounds like he was forced into some kind of deal, but it’s hard to say what’s changing in practice. Haroun Adamu notes a similar case with Telegram and Brazil in 2022.

Update (2024-09-25): Emma Roth (via Hacker News):

Telegram will now turn over a user’s phone number and IP address if it receives a request from authorities, according to its just-updated privacy policy.

Update (2024-10-03): Nick Heer:

I do not know what to make of this. There is a vast difference, in my mind, between “0 bytes of user data” — which would include things like IP addresses and phone numbers — and “0 bytes of user messages”. Perhaps this was just poor wording in the earlier version — if so, it feels misleading. If I were some crime lord, I would see that as reassurance Telegram reveals nothing, especially with its reputation.

[…]

I do not know whether I can believe him. From the outside, it looks like Telegram was habitually uncooperative with law enforcement on legitimate investigative grounds. It turned over some data to German authorities but realized users hated that, so it did one of two things: it deceived authorities, or it deceived users. […] I understand being skeptical of charges like these and I am not condemning Durov without proof. But I do not believe Durov either.



I can add some context to this beyond knee-jerk libertarianism.

Organised crime is out of control in Sweden due to the increasing privatisation of public welfare and growing class divides. The police can see hit contracts being published publicly on Telegram, Instagram, and TikTok, but they have no backchannel to get any information about actual crimes taking place in the open.

This arrest might be related to that.


Telegram is not just a messenger, but also a portal like X. Because the messages on the server are not encrypted, even newcomers in a group can access the entire conversation.
Portals such as X are obliged in the EU to moderate content against hate speech. Hate is not an opinion, so moderating it is not a restriction on freedom of expression. The two can be differentiated, and it has nothing to do with censorship. Telegram has refused to cooperate with the authorities. Why have laws if anyone can simply ignore them?


Pierre Lebeaupin

One of the issues here is that we do not have "habeas corpus" in France: while limits have been added to arbitrary deprivations of liberty, mostly thanks to the ECHR lately, a member of the armed forces with minimal legal training can still arrest you and keep you in custody for 48 hours, even 96 hours for terrorism (and a few other…) charges, without formal notification of charges or a judge even being involved.

(The equivalent of the district attorney is notified and has authority over this deprivation of liberty, but that is still not a judge.)

The outcome is that law enforcement can cover the arrest and custody with charges that have not even been reviewed by any independent judge. They may even announce the charges as being such and such, only to later drop some of them once the police custody is over.


Hard to form an opinion about what's going on with so little information, but comparing initial reactions to Apple's unwillingness to assist the FBI in unlocking the San Bernardino shooter's phone is noteworthy. You'll remember that the "tech community" mostly praised Apple.

There are those pointing out that Telegram is "not encrypted by default" and "group chats are not encrypted at all" etc. etc. But do we know specifically why he was arrested? Does it relate to cases that do in fact use the "encrypted chat"? Do they want a backdoor and Telegram is refusing? Is it not "mathematically possible" for Telegram to assist authorities in these cases because these users did use the hard to find end-to-end encryption feature?


Telegram is not just a messaging app that refuses to moderate bad chats.
First of all, it is a great app that works better for many people than its competitors, and it is freely available. Those great features and options are not limited to just large groups: notifications, folders, file uploads, media playback, translations, stickers - all superior to the competition.

Yes, it is not E2EE, but that does not mean that any Telegram employee can read any message or look at any uploaded file. Telegram has keys, and those keys are guarded. But, as we all know, it is not technically impossible to look inside those encrypted messages, because Telegram has the keys. So the privacy expectations are based on reputation and trust in the company and the team. But the same goes for Apple and Meta - we trust them not to include secret backdoors in their E2EE protection. It's not the same level of trust, but there is no evidence that any private info in TG was ever outed. Gmail is not E2EE, and we trust Google.

Moderation is not clear-cut: doing it means repressed people will lose their freedom to communicate just as much as criminals do. These are obvious points, but it feels like many people just don't care.

Societal problems like out-of-control crime are not solved by restricting free speech.
Of course Pavel Durov is not a saint, and his track record is spotty with regard to what TG allows or does not allow. But at the same time, TG is a very, very important tool for many people around the world, who would suffer under local tyrants if moderation were enforced. Criminals will be able to move to another platform far faster than ordinary repressed folks.


Telegram is not widely used in the US, and I’ve seen a lot of misconceptions about it.
For starters, I get the impression that analogies with Signal, iMessage, and Facebook Messenger are invalid.
My limited understanding is that this is about public channel moderation, which is more analogous to Twitter’s or Facebook’s legal moderation responsibilities. None of this is encrypted.


Old Unix Geek

The head of Protonmail finds this arrest crazy, and suggests people would be crazy to build a startup in France.

A French lawyer says these charges make operating any messaging service criminal if the French state wishes it to be so, even extraterritorially. Only messaging services that give the French government full access to monitor all communications will have peace of mind.

As a reminder, Rumble is banned in France, China, and Russia, and its CEO kept his mouth shut until he left Europe.

Durov is being held and questioned for 96 hours by a team of police officers. Presumably the idea is to wear him down. Old movies in which sleep deprivation was used as torture come to mind, but hopefully they're being more humane.

This is not the only antidemocratic thing going on in France right now. Elections were held 100 days ago, yet Macron is preventing a new government from forming. A constitutional lawyer on French Radio pointed out that this is illegal.

At this point, it seems that France has given up on "Liberté" and "Égalité", and on its self-description as "le pays des droits de l'Homme". Oh well, another one bites the dust.


Old Unix Geek

The arrest warrant was signed as he was landing on the tarmac, according to the French TV channel BFMTV.

Here are the charges.

Telegram has an E2EE private messaging service, but it's a pain to use, and it's unclear how safe it really is. This suggests that this is really not the key problem, and that instead they want something else... like forcing Telegram to submit to the Western narrative, since it's an uncensored source of information about events in the Ukraine war and Gaza.

Those who take the CSAM allegations seriously must also wonder why none of Epstein's clientele have been arrested, and why Zuckerberg has not been charged, since he's living in the US.

Al Jazeera is apparently reporting the UAE has frozen a contract for 80 Rafales as a result of the arrest of their citizen Durov.


Worth noting he arrived in France when he had the option to stay away, and chose to enter on a French passport when he could have entered on one of his others (Turkish or Russian).

There may be more going on behind the scenes: being in a French jail might put him safely out of the way of Russian agents.

Also worth realising that because it doesn’t use end-to-end encryption, and stores messages on its servers in the clear, Telegram was legally in possession of CSAM, which in many jurisdictions is a strict liability offence.

And since group chats are never encrypted, it was in a great position to tell authorities who was actually trading CSAM, yet it refused.

Generally, it’s positive to see the world’s most insecure and inaccurately marketed (as private) social network put under scrutiny: advocates of true free speech always hated how it misled users. Signal, WhatsApp, and iMessage (without backups) were always more private.


> Al Jazeera is apparently reporting the UAE has frozen a contract for 80 Rafales as a result of the arrest of their citizen Durov.

Why am I not surprised that an autocracy would act as if a national authority has direct control over all judicial proceedings in that country?


Old Unix Geek

"being in a French jail might put him safely out of the way of Russian agents."

Being released on 5 million euros bail presumably falsifies this hypothesis?


"Worth noting he arrived in France when he had the option to stay away"

He didn't know he'd be arrested.

"Telegram was legally in possession of CSAM"

This is a highly problematic claim, because it implies that a hosting company owns the data its customers put on its servers. This feels one step removed from saying Samsung owns my documents because I'm storing them on a Samsung SSD.

"it doesn’t use end to end encryption"

That's the main issue. Telegram should never have known what its users are saying on its platform.


@Plume: I think there is a reasoning error behind your Samsung argument. The SSD is in your possession, not Samsung's. The hosting servers are in the providers' possession.

Though I feel strongly about encryption and am wholly aware of all the dangers of outlawing it, recently I've become more open to that thought, if only to combat the torrent of CSAM, drug, and terrorism-related activities that are facilitated by it.

Telegram is, by the nature of its current design, in a position to act in a way that is moral and ethical (by moderating for this kind of content); not doing so could be perceived as unethical. Parties such as Facebook and X also moderate content after being forced (by the public and the law) to do so. Of course there will be other dark places for this kind of behaviour, but increasing the complexity and risk of getting to it should be an aim. Having it so freely and easily accessible should be frowned upon.

Though the human cost of this content moderation (imagine having to stare at those kinds of material all day, every day) is high, this is a separate discussion to be had.

/rant ;)


Seriously though, this is most likely the French government saying
"We have the DSA, you are not in compliance. You have to give us a solid channel of communication with a team that can take actions that are legally required. Here, have a taste of this rubber hose."


"The hosting servers are in the providers' possession."

That does not mean that the providers possess the content on the servers. If somebody sends a letter containing child porn, the mailman is not guilty of possessing child porn.

"I've become more open to that thought if only to combat the torrent of CSAM, drugs and terrorism related activities that are facilitated by it."

Given the state of the US government, I think a more pressing issue than somebody ordering drugs on a Telegram channel is the fact that some states (and, depending on who is elected next, the federal government) will absolutely use any information they have on their citizens to put them in jail for getting an abortion or ordering the wrong children's book.

If anything, we need more privacy, not less.

"We have the DSA, you are not in compliance"

Probably, let's see.


Old Unix Geek

It doesn't seem to be an actual DSA violation...


Old Unix Geek

It seems that the large social media platforms are losing their liability shields in the US too.

If I understood the article correctly, it appears to be saying that if a platform owner infers what a user wants to see next, that is his "speech", and as such he's liable for it. So if TikTok automatically shows you videos about a self-asphyxiation challenge, that's their choice and therefore on them, and they can be sued for it. Apparently the US allows one to sue libraries for "distributing" books one doesn't like, since the library chose to make these books available rather than others, and they are liable for making that choice. In this case TikTok would be the library.


Old Unix Geek

Apparently Durov may be required to stay in France until the investigation is over, which could take a number of years. During that time he "may continue his work".

Sounds like the process is the punishment, and that France wants control over Telegram.


@OUG Beat me to it, citing Craig Murray. A refreshing and welcome take.

And yes, the rise and rise and rise of authoritarianism in western democracies really is a problem.
