Tel-a-lie-a-gram
Zak Steventon-Barnes Science Writer
Telegram is perhaps the most surprising success story in online messaging.
It has successfully pitched itself as a privacy-centric messenger, even though it lacks end-to-end encryption (E2EE) by default, one of the most important privacy technologies of recent years and one that is rapidly becoming standard.
E2EE by default is already found in WhatsApp, Snapchat, iMessage, and Google Messages (as well as in security-focused options such as Signal, Wire, Session, and Matrix), with Facebook planning to roll it out for Facebook Messenger and Instagram DMs by the end of this year.
Even when Telegram does use E2EE (in its optional secret chats, for example), the system it uses is regarded by many cryptography experts as nonstandard and potentially unreliable.
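To illustrate why the default matters, here is a minimal sketch of the end-to-end model using the PyNaCl library; the users and message are made up, and this is not how Telegram itself is implemented. The point is simply that with E2EE only the recipient's device can recover the plaintext, whereas without it the operator's servers can.

```python
# A minimal sketch of what E2EE changes, using the PyNaCl library.
# Alice, Bob, and the message are illustrative; this is not Telegram's code.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# End-to-end encrypted: Alice encrypts directly to Bob's public key.
# The ciphertext can pass through any server, but only the endpoints'
# private keys (which never leave their devices) can decrypt it.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at 6")

# Bob decrypts on his own device; the server in the middle only ever saw ciphertext.
receiver_box = Box(bob_key, alice_key.public_key)
print(receiver_box.decrypt(ciphertext))  # b'meet at 6'

# In a non-E2EE "cloud chat", by contrast, messages are only encrypted
# between each device and the server, so the operator holds keys that let
# it read, store, and hand over the plaintext.
```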
Messages without the protection of E2EE can be viewed by Telegram and by anyone it shares that data with, such as governments. Telegram has long sought to reassure users by stating in its FAQ and Privacy Policy that it has high standards for handing over data to governments, and both state clearly that it has never done so.
But in separate cases in Germany and India, Telegram has reportedly handed over user data to police and courts. Despite this, its privacy policy and FAQ remain unchanged, raising the question: can we really trust Telegram?
The German government has had a long-running struggle to remove what it views as extremism on Telegram. However, according to information seen by Der Spiegel, a German weekly magazine described by the New York Times as “the standard-bearer for investigative reporting in Germany”, Telegram has been doing more than just removing channels.
Der Spiegel wrote “Contrary to what has been publicly stated so far, the operators of the messenger app Telegram have released user data to the Federal Criminal Police Office (BKA) in several cases. According to SPIEGEL information, this was data from suspects in the areas of child abuse and terrorism. In the case of violations of other criminal offences, it is still difficult for German investigators to obtain information from Telegram, according to security circles.”
I’m not saying that this is the wrong decision; I would rather Telegram complied with lawful police orders. My problem is that Telegram continues to say that this has never happened.
At the time of writing, the FAQ still states: “to this day, we have disclosed 0 bytes of user data to third parties, including governments.” The privacy policy, meanwhile, still says: “If Telegram receives a court order that confirms you’re a terror suspect, we may disclose your IP address and phone number to the relevant authorities. So far, this has never happened. When it does, we will include it in a semi-annual transparency report.”
And the Telegram channel meant to host that transparency report still appears to be empty; in the words of one privacy commentator, “that’s what makes it a transparency report: it is transparent, there is nothing there.” Indeed, it is usual to publish a transparency report even when nothing has been disclosed, simply to show users that they are being kept informed.
Furthermore, while I do think Telegram should comply in cases of child abuse, its privacy policy only says it would hand over the data of terror suspects, and then only with a court order, which Der Spiegel does not mention being required.
When asked for comment by Der Spiegel and Android Central, Telegram did not reply.
In India, the case revolves around the piracy of study materials created by Neetu Singh, who is suing the people distributing them as well as Telegram. Telegram attempted to argue that it was unable to provide the court with IP addresses, as doing so would violate its privacy policy and the laws of Singapore, where the data is stored. This lines up with Telegram’s claim in its FAQ that it protects user data by splitting it between jurisdictions.
It appears, however, not to have been very well protected. The court decided that “merely because Telegram chooses to locate its server in Singapore, the same cannot result in the Plaintiffs’ – who are copyright owners of course materials – being left completely remediless against the actual infringers.”
As a result, the court directed Telegram to hand over the “phone numbers, IP addresses, email addresses” of the creators and operators of the channels that Singh said were distributing her study materials.
Telegram appears to have complied where possible, with a later order in the case (published by Live Law India) recording that “the data as was available with Telegram has been handed over in a sealed cover”.
Some of the data appears not to have been available, as Telegram limited the time for which it kept it. However, the data handed over contains, according to the order, “the names of the admins, the phone numbers, and IP addresses of some of the channels [listed by Singh].”
While there was a court order in this case, the targets weren’t accused of terrorism but rather copyright infringement, a far less serious offence. When asked for comment by TechCrunch, Telegram replied that they “can’t confirm that any private data has been shared in this instance.”
An obvious question is whether this is deliberate dishonesty or a simple failure to update its FAQ and privacy policy. At minimum, Telegram is not doing what it said it would: it has handed over user data in a copyright case and has failed to disclose any of this in a transparency report. Giving assurances that you do not keep, when you are clearly able to (at least in the case of the transparency report), is dishonesty.
Furthermore, I struggle to believe that it has not occurred to Telegram, at some point while handing over data to the Indian court or the German police, that it would need to update its FAQ or Privacy Policy. That would seem to me to be dishonesty by inaction.

Next is the question of why Telegram first disclosed user data in Germany and India, which are hardly at the top of the list of nations that privacy advocates worry about, and in the Indian case over something as minor as a copyright violation rather than the terrorism Telegram had said it would share data over.
This is especially notable given that Telegram was based for several years in the UK, a country listed by Reporters Sans Frontières as an ‘enemy of the internet’ for its surveillance capabilities. An optimist might suggest that Telegram was simply more confident those requests were legitimate. A pessimist might suggest that requests from the UK and USA are generally made in secret.

Another question is whether message contents are safe. The report by Der Spiegel did not state what data was shared, and the Indian case listed only names, phone numbers, and IP addresses.
If Telegram is handing over data in cases it said it wouldn’t, and to organisations it said it wouldn’t, what is to stop it from handing over data it said it wouldn’t? And given that we have only heard about cases that came out in spite of Telegram, not because of it, how do we know it hasn’t already done so?