RSA was developed in 1978 by Ronald Rivest, Adi Shamir, and Leonard Adleman. Mathematically, it is arguably one of the simplest public key algorithms to understand and implement. The RSA algorithm obtains its strength from the belief that it is difficult to factor large numbers. The RSA algorithm is patented, although the last patent believed to cover the algorithm expires on 20 September 2000. When implemented in software, RSA with 1024-bit keys is about 100 times slower than DES and gets much slower as the keys get larger.
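The mathematical simplicity is easy to see in a textbook-style sketch. The primes, exponents, and message below are tiny demonstration values, not anything resembling a secure key; real RSA uses primes hundreds of digits long plus a padding scheme.

```python
# Toy textbook RSA with tiny primes -- illustrative only, wildly insecure.
p, q = 61, 53              # two small demonstration primes
n = p * q                  # modulus, part of both keys: 3233
phi = (p - 1) * (q - 1)    # Euler's totient of n: 3120

e = 17                     # public exponent, chosen coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)    # c = m^e mod n

def decrypt(c: int) -> int:
    return pow(c, d, n)    # m = c^d mod n

m = 65                     # a message encoded as an integer smaller than n
assert decrypt(encrypt(m)) == m
```

The "strength from factoring" claim is visible here: anyone who can factor n back into p and q can recompute phi and therefore d, recovering the private key.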
[191]See the book Cracking DES: Secrets of Encryption Research, Wiretap Politics and Chip Design, by the Electronic Frontier Foundation (O'Reilly & Associates, 1998).

Triple DES (3DES) is a way of combining three uses of single DES with two keys, making the key size 112 bits. Because of the increase in key size, 3DES is believed to be much more secure than ordinary DES. 3DES runs about three times slower than single DES.
The DES algorithm was developed by IBM in the 1970s and was standardized by the U.S. National Bureau of Standards (the organization is now called the National Institute of Standards and Technology, or NIST). It was intended for encrypting unclassified data. Some of the development process for this algorithm is shrouded in mystery, and the NSA had input into the design process that resulted in a number of modifications. The fact that nobody has been able to show that DES has significant weaknesses suggests that the influence of the NSA actually increased the strength of the cipher. It is not clear that the designers intended for the algorithm to be released to the public.
These algorithms were developed by Ronald Rivest and are trade secrets of RSA Data Security. The RC4 algorithm was disclosed by a 1994 anonymous Usenet posting; RC2 met the same fate in 1996. Both algorithms seem to be reasonably strong. When implemented in software they are about ten times faster than DES.
Skipjack was initially available only in a hardware implementation called the Clipper chip. The algorithm was declassified in 1998 and can now be implemented in software (in this form, it does not necessarily include the provisions for law enforcement access). Research into the strength of Skipjack itself is inconclusive, but it has been shown that a slightly modified version of Skipjack can be broken without using an exhaustive key search. Skipjack does not seem to be a very popular algorithm to implement, possibly because of its history, and as such there is little comparative timing data.
IDEA was invented by Xuejia Lai and James Massey and released in 1992. It has undergone some extensive cryptanalysis and seems to be a strong cipher. The IDEA algorithm is patented in Europe and the United States and must be licensed for commercial use. IDEA is the symmetric block cipher used in Phil Zimmermann's original PGP, one of the most widespread programs for exchanging arbitrary encrypted data across the Internet. Its 128-bit key size presents no problems. When implemented in software, IDEA runs at about half the speed of DES, but one and a half times the speed of 3DES.
Blowfish was invented by Bruce Schneier and released in 1994. The algorithm appears to be strong. It is designed to be used on 32-bit microprocessors using simple mathematical operations. It does have larger memory requirements than other algorithms, which makes it less attractive for use in smart cards and other small devices. Blowfish is not patented, and implementations in C are in the public domain. When implemented in software, Blowfish runs at about five times the speed of 3DES.
The DSS is a U.S. NIST standard, issued in 1994, that standardizes the use of DSA; in an official sense, the DSA and the DSS are separate objects (one of them is an algorithm, and the other is an official U.S. government document), but in practice, the terms are often used interchangeably. The DSA is between 10 and 40 times slower to verify signatures than RSA. Some elements of the design of the DSA make it difficult to use for encryption, but it is possible with a full implementation of the algorithm.
The patent situation in regard to implementations using the DSA is unclear; there is some possibility that parts of it are patented and thus cannot be used without paying license fees. The U.S. government has indicated that they would indemnify companies that were sued by possible patent holders, but only if they were implementing the DSS as part of a government contract.
This algorithm was designed by Ronald Rivest and released in 1990 as RFC 1186. In 1996 some significant flaws in MD4 were discovered. As a result, MD4 should not be used.
This algorithm was designed by Ronald Rivest as an improvement to MD4 and was released in 1992 as RFC 1321. Research on MD5 has indicated that one of the design goals of a cryptographic hash (collision resistance) has been violated. A general way to generate collisions has not been found, but the research is troubling. Current cryptographic wisdom also suggests that a cryptographic hash should be at least 160 bits long in order to be resistant to birthday[192] attacks. Given these issues, MD5 should probably be avoided in situations where long-term digital signatures are required.
[192]A birthday attack is based on probability and is related to the question: How many people have to be in a room for there to be two people with the same birthday? The surprising answer to this is that there is more than a 50 percent chance of having two people with the same birthday if there are at least 23 people in the room. This footnote is too small to contain an explanation, but we encourage you to look this up in any good book on probability.
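The footnote's claim is easy to verify directly. The probability that n people all have distinct birthdays is the product (365/365) × (364/365) × ... × ((365−n+1)/365); a shared birthday is the complement of that:

```python
# Probability that at least two of n people share a birthday,
# assuming 365 equally likely birthdays and ignoring leap years.
def shared_birthday_probability(n: int) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365
    return 1.0 - p_all_distinct

assert shared_birthday_probability(23) > 0.5   # about 0.507
assert shared_birthday_probability(22) < 0.5   # about 0.476
```

The same square-root effect is why a hash of 2^b possible outputs offers only about 2^(b/2) work against collision attacks, and hence why 160-bit hashes are preferred over 128-bit ones.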
The SHA algorithm is a U.S. NIST standard and was first issued in 1992 for use with the DSA. In 1995 NIST released a technical update to SHA called SHA-1. This update supersedes the previous version and is thought to address a collision problem similar to the one discussed previously in MD5. SHA-1 should now be used instead of SHA. As this algorithm calculates a 160-bit value, it is more resistant to birthday attacks.
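The difference in output size between the two hash families is easy to check with Python's standard hashlib module (digest_size is reported in bytes):

```python
import hashlib

# Output sizes of MD5 and SHA-1 in bits; hashlib reports bytes.
md5_bits = hashlib.md5(b"").digest_size * 8    # 128
sha1_bits = hashlib.sha1(b"").digest_size * 8  # 160

assert md5_bits == 128
assert sha1_bits == 160
```

The extra 32 bits raise the cost of a birthday attack from roughly 2^64 trial hashes to roughly 2^80.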
This algorithm was invented by Whitfield Diffie and Martin Hellman in 1976. It uses exponentiation and modular arithmetic as the basis of its calculations; this is believed to be secure, but it involves very large numbers and relatively slow calculations. One of the most important features of Diffie-Hellman is that it can be used to generate a secret that has perfect forward secrecy. Diffie-Hellman was patented, but the patent expired in 1997, so the algorithm can now be freely used.
Diffie-Hellman is a pure key exchange algorithm, which cannot be used to encrypt data. People who claim to be using Diffie-Hellman to encrypt data are extremely confused (unfortunately, because Diffie-Hellman is commonly used for key exchange with bulk encryption schemes, such people are not as rare as one would hope).
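The distinction is clear in a sketch of the exchange itself: nothing in it carries a message, only a shared secret that can then key a bulk cipher. The prime and generator below are demonstration values far too small for real use, where the prime should be at least 1024 bits:

```python
import secrets

# Toy Diffie-Hellman key exchange -- illustrative only; real exchanges
# use a vetted generator and a prime of 1024 or more bits.
p = 2**127 - 1   # a Mersenne prime (still far too small for real use)
g = 3            # generator (demo value, not validated)

a = secrets.randbelow(p - 2) + 2   # Alice's secret exponent
A = pow(g, a, p)                   # Alice sends A over the open network

b = secrets.randbelow(p - 2) + 2   # Bob's secret exponent
B = pow(g, b, p)                   # Bob sends B over the open network

# Each side combines its own secret with the other's public value;
# both arrive at the same shared secret, g^(a*b) mod p.
assert pow(B, a, p) == pow(A, b, p)
```

An eavesdropper sees p, g, A, and B but not a or b; recovering the shared secret from those is the discrete logarithm problem. The agreed value is then fed to a symmetric cipher, which is where the actual data encryption happens.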
Elliptic curve algorithms are much newer than Diffie-Hellman, which gives them two significant drawbacks. First, their mathematical foundations, including their cryptographic strengths and weaknesses, are not as well understood. Second, elliptic curve algorithms are still subject to patent protection (and there are significant arguments about who owns what patents). As of this writing, elliptic curve algorithms are still considered a relatively risky choice; this may change as more mathematical results are found. It is beginning to look as if they may become a widely accepted choice, but it is also possible that they will become unusable, if there turns out to be a general solution to the underlying problem that makes them difficult to break.
Purpose | Size (in bits) | Acceptable Algorithms |
---|---|---|
Symmetric encryption | 128 | IDEA |
Symmetric encryption | 112 | 3DES |
Cryptographic hashes | 160 | SHA-1 |
Cryptographic hashes | 128 | MD5 |
Key exchange | 1400 | Diffie-Hellman |
Key exchange | 1024 | RSA |
Digital signatures | 1024 | RSA |
In fact, in most cases, all you need is the suspicion. Cryptography is a difficult business: it's hard to come up with good cryptographic algorithms; there are trade-offs between the speed of an algorithm, the memory requirements of an algorithm, and the strength of an algorithm; and no algorithm is perfectly unbreakable. Therefore, any product that advertises a magic new algorithm that runs really fast on small devices and can never be broken is at best over-optimistic and at worst fraudulent.
If you need to evaluate an algorithm, here are some questions you should ask: