Encrypted Communication Apps

I have discussed this idea in the past, but normally I’ve only gotten excitement about encrypted communication from my fellow libertarians and netsec friends. But with the current Presidential situation, there seems to be more interest in communicating without being overheard by the government, even among my government-loving left-wing friends. And this is excellent! Even if you don’t need privacy, by communicating securely all the time, you make it less notable when you do have to communicate securely, and you create more encrypted traffic that other government targets of surveillance can blend into.

First, let’s go over a very quick summary of encryption. If you’re already familiar with encryption, skip down past this section and the pictures to the list.

Public Key Encryption in 5 Minutes

An encryption algorithm takes information, like text, numbers, or picture data (it's all just 0s and 1s to computers), and outputs scrambled data. A good encryption algorithm produces output that looks randomly generated, so that no information can be gained about the source. That output is then sent in the clear (over the internet, where people might be spying) to the recipient. The recipient then reverses the process, decrypting the message and getting the original text, numbers, picture data, etc. However, if an algorithm always produced the same output from the same input, bad guys could figure out what you were saying pretty quickly. This introduces the idea of keys. A key is a number the algorithm uses to change the output in a predictable way. If both the sender and the recipient have the same secret key, they can use their keys and the algorithm to send messages that only they can read (without the right key, the algorithm won't reverse the encryption):

Symmetric key encryption. Public domain image.
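To make this concrete, here is a toy sketch of the symmetric idea in Python. This is illustration only, with made-up names: a repeating-key XOR is NOT a secure cipher (real apps use vetted algorithms like AES), but it shows the shape of the scheme, where the same shared key both encrypts and decrypts:

```python
import hashlib
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the (repeating) key.

    Toy illustration only: repeating-key XOR is trivially breakable.
    The point is the symmetry: applying the same key twice
    returns the original data."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

# Both parties derive the same key bytes from a shared secret.
shared_key = hashlib.sha256(b"our shared secret").digest()

ciphertext = xor_cipher(b"meet at noon", shared_key)  # looks like noise
plaintext = xor_cipher(ciphertext, shared_key)        # same op reverses it
print(plaintext)  # b'meet at noon'
```

The catch, as the next paragraph explains, is that both sides must somehow agree on `shared_key` in the first place.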

But we can do better! In our previous scenario, we need to somehow communicate the secret key separately from our message. That's a problem, since we are likely using encryption precisely because we can't communicate openly. The solution is something called public key encryption. In this system, each person has two keys, one public and one private. To send someone a message, you encrypt the message with their public key and then send it to them. They alone can then decrypt the message with their private key.

Public key cryptography. Public domain image.
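The classic example of this is RSA. Here is a "textbook RSA" toy in Python with tiny primes, for illustration only: real keys are thousands of bits, use padding schemes like OAEP, and come from vetted libraries, never from hand-rolled code like this:

```python
# Textbook RSA with tiny primes -- a teaching toy, NOT secure.
p, q = 61, 53              # two secret primes
n = p * q                  # modulus, shared by both keys
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e

public_key = (n, e)        # published to the world
private_key = (n, d)       # kept secret

def encrypt(m: int, key) -> int:
    modulus, exp = key
    return pow(m, exp, modulus)

def decrypt(c: int, key) -> int:
    modulus, exp = key
    return pow(c, exp, modulus)

message = 42                                   # must be < n in this toy
ciphertext = encrypt(message, public_key)      # anyone can do this step
assert decrypt(ciphertext, private_key) == message
```

Anyone holding `(n, e)` can encrypt, but recovering `d` from the public key requires factoring `n`, which is infeasible when the primes are large.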

The reality of the mathematics is slightly more complicated, but for our purposes, what matters is how the public and private keys exist in each messaging app. Managing these keys is difficult and confusing for users, but if a private key is lost or leaked, communication is no longer secure. Therefore, when using encrypted messaging, it's important to be aware of how the app uses and manages the keys.
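One common safeguard is comparing key fingerprints out-of-band (Signal calls these "safety numbers"). Here is a hypothetical sketch of the idea, with assumed names and a simplified derivation compared to what real apps do: hash the public key into a short digest, then read it to your contact over another channel to confirm nobody has swapped keys:

```python
import hashlib

def fingerprint(public_key_bytes: bytes) -> str:
    """Derive a short, human-comparable fingerprint from a public key.

    Hypothetical sketch: real apps derive safety numbers in a more
    involved way, but the idea is the same -- hash the key material
    and compare the digest out-of-band."""
    digest = hashlib.sha256(public_key_bytes).hexdigest()
    # Group the first 32 hex characters for easier reading aloud.
    return " ".join(digest[i:i + 4] for i in range(0, 32, 4))

alice_key = b"\x04" + b"\x01" * 64   # stand-in for real key bytes
print(fingerprint(alice_key))        # e.g. "ab12 cd34 ..." style groups
```

If the fingerprint you compute differs from the one your contact reads to you, someone may be intercepting the key exchange.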

The Best Apps

The following is my ranked order of preferred secure communication:

1. Signal. This is the gold standard encrypted communication app. It's open source, free, has group chat, works on mobile and desktop, and of course is end-to-end encrypted. It even has encrypted voice calls. The one significant drawback is that it requires a phone number, which it uses to distribute your public key to everyone who needs to contact you. Because of this, it offers excellent encryption (requiring no security knowledge!), but no anonymity. If you want that, check the next entry.

2. PGP encrypted email. This one is a bit complicated. OpenPGP (based on PGP, which stands for Pretty Good Privacy) is an open protocol for sending encrypted messages. Unlike the other entries on this list, PGP isn't an app, so it requires you to produce and manage your own keys. The tools you can find at the link will let you generate a private and public key pair. To send a message to someone, you must obtain that person's public key from them, use the software to encrypt the message with their public key, and then send it to them. Because it is so much work, I have this method second on the list, but there is no better way to communicate securely and anonymously. To better distribute your public key, I recommend keybase.io (use that link to send me encrypted emails!). The good thing about PGP is that it can be used with any email, or really any other method of insecure communication. Additionally, it's open source, free, and strongly encrypted.

Both Signal and PGP are very secure methods of communication. The following apps are good, but they are not open source and thus are not as provably secure. They are still better than just using unencrypted methods like SMS text, email, etc.

3. WhatsApp. WhatsApp is pretty good. It's free, widely used, works on mobile and desktop, has group chat and encrypted phone calls, and is encrypted by default (though, like Signal, it requires a phone number). In fact, Moxie Marlinspike, the creator of Signal, the number one app on this list, implemented the same Signal protocol in WhatsApp. That's great, but unfortunately, WhatsApp isn't open source, so while Moxie vouches for WhatsApp now, we don't know what could happen in the future. WhatsApp could push out an update that does sneaky but bad things, like quietly turning off the encrypted-by-default setting. It's also important to acknowledge that WhatsApp's implementation already isn't perfect, though it's not broken. If you use WhatsApp, make sure notifications are turned on for key changes. Otherwise, it's an excellent, widely used texting substitute.

4. Threema. Threema has an advantage in that it isn't based in the U.S., and it's more security-focused than WhatsApp. Threema is fairly feature-rich, including group chat, but it isn't free, it's limited to mobile, and it isn't open source. Threema uses the open source library NaCl, and they have a validation procedure which provides some comfort, although I haven't looked at it in depth and can't tell if it proves the cryptography was implemented perfectly. This paper seems to indicate that there's nothing obviously wrong with their implementation. Nonetheless, it cannot be higher on this list while remaining closed source.

5. FB Messenger secret conversations. Facebook Messenger is a free app, and when using its Secret Conversations option, it uses the Signal protocol. The app is also widely used, but it takes effort to switch a conversation to secret, and an encrypted app that isn't encrypted by default doesn't do much good. FB Messenger does let you look at your keys, but checking them isn't as easy as in WhatsApp, and since it isn't open source, keys could be mismanaged or defaults changed without us knowing. It also lacks other features like group chat and desktop versions.

6. iMessage. Apple has done a good job with an excellent secure protocol for iMessage. It's also feature-rich, with group chat and more, but it's only "free" if you are willing to shell out for Apple products. While Apple documents its protocols well, iMessage is not open source, which means we can't verify how the protocol was implemented. Moreover, we cannot view our own keys in the app, so we don't know if they change, and we don't know how Apple manages those keys. It is therefore possible that Apple could either loop government spying into its system (by encrypting all messages with an extra master key) or simply turn over specific keys to the government. How much you are willing to use iMessage for secure communication should be determined by how much you trust Apple to withstand government attempts, both legal and technological, to access its security system.

Things I have deliberately not listed:

  1. Don't use SMS. It's unencrypted and insecure. Ideally, don't even use it for 2-factor authentication if you have a better option.
  2. Don't use email. It's likewise unencrypted and insecure.
  3. Don't use Telegram. They rolled their own "homemade" crypto, which you should NEVER EVER DO. Their protocol is insecure, and their encryption is not on by default. In fact, there are at least two known vulnerabilities.
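As a small illustration of why homemade crypto goes wrong, here is a hypothetical "homemade" scheme (invented for this sketch) that encrypts every block with the same keystream. Each block looks scrambled on its own, yet identical plaintext blocks produce identical ciphertext blocks, leaking the message's structure to any eavesdropper:

```python
import hashlib

def naive_block_encrypt(data: bytes, key: bytes, block: int = 8) -> bytes:
    """Hypothetical broken scheme: every block is XORed with the same
    key-derived stream, so equal plaintext blocks yield equal
    ciphertext blocks -- a classic mistake vetted designs avoid."""
    stream = hashlib.sha256(key).digest()[:block]
    out = b""
    for i in range(0, len(data), block):
        chunk = data[i:i + block]
        out += bytes(b ^ k for b, k in zip(chunk, stream))
    return out

ct = naive_block_encrypt(b"ATTACK!!ATTACK!!RETREAT!", b"secret")
# The repeated plaintext block is visible in the ciphertext:
print(ct[0:8] == ct[8:16])   # True  -- structure leaks!
print(ct[0:8] == ct[16:24])  # False
```

Real protocols avoid this with per-message nonces and authenticated modes; getting such details right is exactly why you use an audited library instead of inventing your own.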


Model-Breaking Observations in the Senate

It’s rare when an idea, or piece of evidence, comes along that is so impressive, it forces you to rethink your entire model of the world. The recently released Feinstein-Burr encryption bill has done just that.

It has been described as "technically illiterate", "chilling", "ridiculous", "scary", and "dangerous".  Not only are the issues with the bill fairly obvious to anyone with a cursory understanding of encryption, but the problems are of such magnitude that they thwart any attempt to understand the Senators' actions.  Let's look at the effects of the hypothetical law.

The biggest issue is that this bill would significantly damage the United States' national security. We live in a highly insecure world where cyberattacks, both foreign and domestic, are omnipresent. The Feinstein-Burr bill would fundamentally reduce the security of all technology infrastructure in the country. Jonathan Zdziarski, in a blog post linked above, gives some details:

Due to the backdooring of encryption that this legislation implies, American electronics will be dangerously unsafe compared to foreign versions of the same product. Diplomats, CEOs, scientists, researchers, politicians, and government employees are just a few of the people whose data will be targeted by foreign governments and hackers both while traveling, but also whenever they’re connected to a network.

That’s awful, and even if you have the most America-first, protect-American-lives mentality, weakening American encryption is the worst thing you could do; it literally endangers American lives.

I think there's also a strong case to be made that this bill would do very little to combat terrorism. Unbreakable, strong encryption is widely available on the internet for free, forever; if bad people want to use it, they will.  Moreover, terrorism, as awful as it is, is relatively rare; Americans are about 1,000x more likely to die in a non-terrorism-related homicide. And many of these more "common" homicides occur in heat-of-the-moment arguments, which means there would be no encrypted messages detailing conspiracies. All this bill does is remove the ability of average, non-technically-inclined Americans to secure their data.

And the people whose data will be most at risk are those consumers who are less educated or less technically adept. Better-informed consumers might have the ability to install foreign encryption software on their phones to keep their data safe, but most uninformed consumers just use default settings.  Thus, criminals who commit identity theft would greatly benefit from this legislation; they wouldn't usually bother targeting knowledgeable users anyway, and with security stripped away from phones, it would be much easier to steal data from susceptible users. The people most in need of help protecting their data would be disproportionately harmed by this legislation.

On the other hand, most companies are not uninformed users. They have IT departments who understand the value of encrypting their data, and they will continue to purchase strong security software, even if it is no longer sold in the United States; foreign-produced software works just as well.  Banning strong encryption would debilitate the American technology sector, one of the biggest and most important parts of the economy.  This would cost Americans jobs and diminish America's influence on the future of the world, as technological innovation moves overseas.  But this isn't just bad for Americans: it's not easy to simply move an entire company or product overseas. These companies have made huge capital investments that will not be available in other countries immediately, if ever, and this would set back the global technology industry billions if not trillions of dollars.

So this really raises the question of why Senators Dianne Feinstein and Richard Burr introduced this bill; given their stated obsession with national security, and given the horrific effect this bill would have on it, there is no good way to reconcile their stated beliefs with their actions. Here are a few theories to explain their behavior, and some discussion of why each theory is unsatisfying.

The Senators are actually foreign spies purposefully trying to weaken American national security.  Obviously, if this theory is true, it's self-evidently very bad that our elected officials not only don't represent us, but actually represent foreign governments likely trying to harm Americans. Sure, it's quite unlikely, since it's very difficult to become a U.S. Senator at all, and no spy agency would send in agents with a plan to become a U.S. Senator.  Whether they were turned into foreign agents after being elected, I can only speculate, but it strikes me as improbable. Nonetheless, it is true that this legislation is exactly what foreign security agencies would want introduced to make the United States more vulnerable.  I was curious, so I checked the constitutional definition of treason as well as the Espionage Act, but it seems that you need to literally give secrets to other people, not just make them easier to obtain. But there is that one case where a high-ranking official is in trouble for storing documents insecurely…

They're power-hungry politicians. The idea of the Senators being foreign spies is a bit far-fetched.  But what we know for sure is that they are politicians, which means they chose a career path that would give them more power to change things. Maybe Burr and Feinstein are sick of technology companies telling the FBI that they can't assist its investigations, and they wanted to put those companies in their place.  If this theory is true, it's pretty self-evidently evil; people in power using their power indiscriminately to harm citizens is the exact problem Thomas Jefferson identified in the Declaration of Independence.  Of course, it's not usually a big problem, because James Madison helped construct a whole host of ways to check the power of government. The most important check in our situation is that senators are voted in by the people. So as long as people know about this dumb bill, they'll kick these guys out…right?

They're stupid. Hanlon's Razor (origin disputed) states that one should "never attribute to malice that which is adequately explained by stupidity."  This theory would mean that two sitting, highly experienced U.S. Senators are too stupid to realize the ill effects this bill would have on national and economic security.  Obviously, Congress has to make laws in areas that its members are not always familiar with…but Burr and Feinstein are the chair and vice chair, respectively, of the Intelligence Committee. If anyone knows about intelligence, they do. And Feinstein is even on the Judiciary Subcommittee on Privacy, Technology, and the Law! If even these people are too stupid to understand the effects of their own policies, we might as well stop sending representatives to a legislature at all and just have run-of-the-mill uneducated voters pass everything directly through referendum. Sure, they'd have no idea what they're doing, but apparently neither do Senators!

What I think is most likely, and most terrifying, is that American democracy incentivizes members of Congress to make bad policy if it's politically beneficial. With all the aides and staff Senators have, plus the amount of pressure they receive from outside groups, it seems unlikely they never heard about the bad effects of the bill. Yet they did it anyway. Given that they don't work for law enforcement, there is no Frank Underwood endgame for passing this bill; banning encryption doesn't directly allow Burr and Feinstein to look at their political enemies' phones (…probably), only the police to look at criminals'.  So maybe their incentive was simply to appear tough on crime and terrorism, consequences be damned. Richard Burr is in a reelection year in North Carolina, so let's look at the effect this horrible bill has had on his chances to win, according to PredictIt.org:

Primary was in mid-March, bill introduced in early April

As you can see, the bill had very little effect on his perceived chances. Now, it could be that voters have already factored in Senator Burr's position on destroying, er, defending American national security, and he needed to introduce this legislation to maintain his position. But it looks identical to a situation where North Carolina voters couldn't care less about Senator Burr's position on encryption, and his introduction of the legislation consequently had no effect on his reelection chances. If it's the former, then we are in serious trouble, because our legislative representatives are incentivized to make horrible policies when voters aren't well informed.  If it's the latter, then we have to dismiss this explanation and go back to one of the other three.

Whatever the explanation is, it reflects poorly on how the government constructs policy, and it reflects poorly on American democracy. Moreover, if any of the theories discussed above is true, it implies massive issues that will be difficult or impossible to solve.  Reforming democracy through campaign finance restrictions, as many progressives would like, wouldn't even address these issues; it is technology corporations and privacy NGOs that have been advocating for more privacy and making unbreakable encryption more accessible, while law enforcement and other government agencies have been advocating for less security.  And as far as I can tell, even those agencies haven't demanded anything like this bill.  Thus, more campaign spending by private groups would help, not hinder, good policy.

No matter how you look at it, this bill represents a big failure of democratic government and illustrates the dangers of discretionary state power.

Photo credit: Caïn venant de tuer son frère Abel, by Henry Vidal in Tuileries Garden in Paris, France, photo by Alex E. Proimos, licensed under CC-BY-2.0.

Banning Unbreakable Smartphone Encryption is Stupid

At least two states, New York and California, have introduced legislation that would ban the sale of smartphones in those states if those smartphones could not be searched upon request from law enforcement.  This would likely mean no phones would be sold with unbreakable encryption, although I suppose Apple or Samsung could manufacture two types of phones and then just sell all the encrypted ones from New Hampshire or something. These bills are still somewhat controversial, and as the issue has gotten press coverage, a House bill has been introduced that would preempt state legislation like the bills in New York and California.