Cryptanalysis:


The two traditional areas of cryptography are symmetric ciphers and public-key cryptosystems. See SSH Tech Corner: Cryptographic Algorithms for an excellent introduction.
Breaking a cipher refers to finding a property (or fault) in the design or implementation of the cipher that reduces the number of keys that must be tried in a brute force attack (that is, simply trying every possible key until the correct one is found). For example, assume that a symmetric cipher implementation uses a key length of 128 bits: a brute force attack then needs to try up to 2^128 possible keys to convert the ciphertext into plaintext, which is far beyond current and foreseeable computers or computer farms. However, a cryptanalytic result that reduces the effective key length to, say, 16 bits allows the plaintext to be found in at most 2^16 trials, which is entirely feasible.
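To make the brute-force arithmetic concrete, here is a minimal sketch in Python. The "cipher" is a made-up toy (repeating two-byte XOR, so the keyspace is exactly 2^16); nothing here is a real algorithm, it only shows what exhausting a 16-bit keyspace looks like:

```python
def toy_encrypt(plaintext: bytes, key: int) -> bytes:
    # Toy "cipher": repeating two-byte XOR, i.e. a 16-bit keyspace
    ks = key.to_bytes(2, "big")
    return bytes(b ^ ks[i % 2] for i, b in enumerate(plaintext))

def brute_force(ciphertext: bytes) -> list[tuple[int, bytes]]:
    # Try every one of the 2**16 keys; keep decryptions that look like text
    hits = []
    for key in range(2 ** 16):
        guess = toy_encrypt(ciphertext, key)  # XOR is its own inverse
        if all(32 <= b < 127 for b in guess):
            hits.append((key, guess))
    return hits

ct = toy_encrypt(b"attack at dawn", 0xBEEF)
candidates = brute_force(ct)
assert (0xBEEF, b"attack at dawn") in candidates
```

With 2^16 keys this loop finishes instantly; at 2^128 the same loop would outlive the universe, which is the whole point above. Note also that the "printable ASCII" filter stands in for the genuinely hard part of a ciphertext-only attack: recognizing valid plaintext when you see it.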

There are several classic techniques for performing cryptanalysis, depending on what level of access the cryptanalyst has to the plaintext, ciphertext, secret key, or other aspects of the cryptosystem. The most common types of attack are:
 A ciphertext-only attack: the analyst has access only to intercepted ciphertext.
 A known-plaintext attack: the analyst also has some plaintext-ciphertext pairs.
 A chosen-plaintext attack: the analyst can obtain the ciphertext for plaintexts of his own choosing.
 A chosen-ciphertext attack: the analyst can obtain the plaintext for ciphertexts of his own choosing.
In addition to the above, other techniques are available. Even more important, the boundaries between a ciphertext-only attack, a known-plaintext attack, and a chosen-plaintext attack are not necessarily rigid in practice. One of the techniques used to mount a ciphertext-only attack on a message is the probable-word method, where a guess about possible plaintext is tried as a hypothesis. This is how Enigma messages were decrypted, despite the fact that Enigma was a very secure cipher machine for its time. In this case human negligence and the excessive bureaucracy of German reports (standard headers and footers were used) were successfully exploited by the British. This trick, at the cost of some wrong guesses and additional trials, in effect turns the ciphertext-only case into the known-plaintext case. In the case of Enigma, the guessed plaintext of an intercepted cipher message was called a crib.
In 1939-1940 Alan Turing and another Cambridge mathematician, Gordon Welchman, designed the most famous deciphering machine of all time, the British Bombe. The basic property of the Bombe was that it could break any Enigma-enciphered message, provided that the hardware of the Enigma was known and that a plaintext 'crib' of about 20 letters could be guessed accurately.
Similarly, if a sufficient amount of known plaintext is available, that quantity will include plaintexts with different properties, including some that are desirable to the cryptanalyst; hence, at least in extreme cases, the known-plaintext case blurs into the chosen-plaintext case.
Compression of the plaintext, with salt added and standard headers removed before encryption, destroys the notion of a predictable plaintext. This makes breaking the cipher more difficult, or even impossible, and the "known plaintext" notion is no longer valid.
It is important to understand that the plaintext need not represent just the message you need to transmit. A very important way of increasing the security of messages is adding random noise, or changing the language in which the message is written, or changing the alphabet used (for example, using digrams instead of single letters, which increases the size of the alphabet dramatically).
By using a mixture of two languages or alphabets (for example, Latin + Cyrillic) you weaken the statistical properties of the text, which is an important way of increasing message security. One important weakness of German communication during WW2 was the formal structure of messages, which should be avoided at all costs. Compression is a perfect way to hide such formal structure and should be the standard preprocessing technique in such cases.
Transformation of the text before encryption is another way to make the ciphertext less "breakable". There are several major text transformation methods:
The cryptanalysis of single-key cryptosystems depends on one simple fact: some traces of the original structure of the plaintext may be preserved or somehow reflected in the ciphertext. For example, in a monoalphabetic substitution cipher, where each letter of the plaintext is always replaced by the same ciphertext letter, a simple statistical analysis of a sizeable portion of ciphertext can be used to retrieve most of the plaintext, due to the differences in letter frequencies in natural languages. That is why block ciphers are generally preferable.
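A tiny sketch of why a monoalphabetic substitution leaks: the letter-frequency ranking of the ciphertext is just a renamed copy of the plaintext's ranking (the substitution alphabet below is an arbitrary example):

```python
from collections import Counter
import string

# A monoalphabetic substitution key: an arbitrary permutation of the alphabet
KEY = str.maketrans(string.ascii_lowercase, "qwertyuiopasdfghjklzxcvbnm")

def freq_rank(text: str) -> list[str]:
    # Letters of the text ordered from most to least frequent
    counts = Counter(c for c in text.lower() if c.isalpha())
    return [c for c, _ in counts.most_common()]

plain = ("cryptanalysis of a monoalphabetic substitution cipher rests on "
         "the fact that letter frequencies survive the substitution intact")
cipher = plain.translate(KEY)

# The frequency ranking is merely renamed, not destroyed; matching the
# ciphertext ranking against known language statistics recovers the key:
assert freq_rank(cipher) == [c.translate(KEY) for c in freq_rank(plain)]
```

Given enough ciphertext, lining this ranking up against the known frequency table of the language (e, t, a, o, ... in English) recovers most of the substitution key.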
Compression is a classic method of defeating cryptanalysis based on the statistical properties of the text. When using standard compression algorithms, you need to add salt and rearrange the parts of the compressed text before applying the cipher (to avoid the compressed text starting with a predictable header, for example the zip file header).
But the beauty of compression is that you can use non-standard algorithms based on some classic published dictionary. In this case words are replaced by indexes into this dictionary, and then a standard compression algorithm is applied to the resulting sequence of numbers.
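A minimal sketch of that idea; the eight-word "dictionary" below is of course a stand-in for a real published one shared by both parties:

```python
import zlib

# A tiny stand-in for a "classic published dictionary" shared by both parties
DICTIONARY = ["the", "and", "attack", "at", "dawn", "retreat", "hold", "position"]
INDEX = {w: i for i, w in enumerate(DICTIONARY)}

def encode(message: str) -> bytes:
    # Replace each word with its one-byte dictionary index, then compress
    ids = bytes(INDEX[w] for w in message.split())
    return zlib.compress(ids)

def decode(blob: bytes) -> str:
    # Decompress and map indexes back to dictionary words
    return " ".join(DICTIONARY[i] for i in zlib.decompress(blob))

packed = encode("attack at dawn and hold the position")
assert decode(packed) == "attack at dawn and hold the position"
```

The index stream carries none of the letter-level statistics of the original language, which is exactly what the surrounding text argues for.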
By destroying the notion of plaintext, compression dramatically increases the strength of even weak ciphers, bringing them closer to the strength of a one-time pad.
Instead of the standard alphabet you can use codes for digrams, as the number of digrams in a natural language is fairly limited. In this case the text will have different statistical properties.
Another simple trick is doubling of the alphabet, which happens when, for example, you transliterate odd words or sentences from Russian into Latin script. In this case the same letter (say "a") is represented by two symbols instead of one in the plaintext submitted to the crypto algorithm.
Striping involves splitting the text into stripes (for example, taking each n-th character produces n stripes) and then encoding each stripe separately, possibly with a different key. Striping reduces the effective length of each ciphertext and distorts the statistical properties of the plaintext, and thus makes cryptanalysis a lot more difficult. But again, in most cases compression is so effective in obscuring the plaintext that the need for other methods becomes much smaller.
Deliberate inclusion of noise in the plaintext is a very effective method of distorting the statistical properties of the text. Gibberish at the beginning and end of the plaintext also makes detection of the start of the plaintext more difficult, and is a simple and effective way to defeat attempts to obtain "known plaintext" from the predictable header and footer of a given type of communication.
Here is a nice illustration of the classic striping technique, with the "plaintext" consisting of every second (odd) line, obfuscated by "noise" in the even lines.
Jokes Magazine
Employee Review, January 25, 2000

My boss asked me for a letter describing my partner Bob Smith, and this is what I wrote:
Bob Smith, my assistant programmer, can always be found
hard at work in his cubicle. Bob works independently, without
wasting company time talking to colleagues. Bob never
thinks twice about assisting fellow employees, and he always
finishes given assignments on time. Often Bob takes extended
measures to complete his work, sometimes skipping
coffee breaks. Bob is a dedicated individual who has absolutely no
vanity in spite of his high accomplishments and profound
knowledge in his field. I firmly believe that Bob can
be classed as a high-caliber employee, the type which cannot
be dispensed with. Consequently, I duly recommend that Bob
be promoted to executive management, and a proposal will
be executed as soon as possible.
S.D.  Project Leader
Shortly afterward I sent the following follow-up note: That bastard Bob was reading over my shoulder while I wrote the report sent to you earlier today. Kindly read only the odd-numbered lines (1, 3, 5, etc.) for my true assessment. Regards,
This can actually also be viewed as an example of steganography, so steganographic methods are applicable here.
Successful cryptanalysis is a combination of mathematics, inquisitiveness, intuition, persistence, powerful computing resources, available human intelligence about the adversary, and, more often than many would like to admit, luck. It is often strictly a game for governments' three-letter agencies: enormous resources and long-term focus are usually required. The breaking of the German Enigma traffic during WWII was probably the most famous such case. The Venona project is most probably (at least partially) a PR exercise, as much of the information about it is contradictory.
Today, cryptanalysis is practiced by a wider range of organizations, including companies developing security products. It is this constant battle between cryptographers trying to secure information and cryptanalysts trying to break cryptosystems that moves cryptology forward. My impression is that cryptographers are now in a much better position than ever before to defeat the attempts of cryptanalysts to break their ciphers.
The reader needs to be very careful in judging the security of algorithms. IMHO the weak-keys problem is especially important, much more so than resistance to differential cryptanalysis, the currently fashionable area. One needs to understand that people are probably the weakest link, and many attacks were possible only because of poor selection of keys or other blunders (Enigma again).
But with a good (or even decent) cipher, even in this case you need a genius working on the problem (Alan Turing) and huge resources (those of the UK government) to succeed. Having spies in the enemy camp does not hurt either.
Generally, attacks on well-studied crypto algorithms are very difficult, and in this sense popular papers often distort the truth by hinting that if an algorithm contains a particular weakness it is easy to break. That is not true.
Moreover, even a slight variation of the classic scheme of applying a cipher (one secret key to a monolithic, undistorted plaintext) completely changes the rules of the game and may substantially increase the security of even a very simple algorithm. That means that restrictions on exporting products that contain cryptographic algorithms are not as effective as one might think.
For example, if the key is artificially limited to, say, 56 bits to make a brute-force attack possible, there are still multiple cheap ways to foil such attempts. Compressing the plaintext before applying DES (with rearranging of header and tail to avoid exploitation of the standard header of archive formats like zip) makes this weak cipher by and large unbreakable, for a very simple reason: you can no longer easily guess what the unencrypted plaintext looks like.
Many software and hardware systems use DES, and there are multiple suspicions that regular DES can be broken by state players (which actually never operate without a layer of human intelligence to guide the decoding efforts). As of 2012, for private players DES is still probably a challenge.
Please note that a brute-force attack on plain-vanilla DES is not an easy task and the cost of a specialized computer is substantial (one million dollars in 1998, probably around $100K now). As the RSA FAQ states:
No easy attack on DES has been discovered, despite the efforts of researchers over many years. The obvious method of attack is a brute-force exhaustive search of the key space; this process takes 2^{55} steps on average. Early on, it was suggested [DH77] that a rich and powerful enemy could build a special-purpose computer capable of breaking DES by exhaustive search in a reasonable amount of time. Later, Hellman [Hel80] showed a time-memory trade-off that allows improvement over exhaustive search if memory space is plentiful. These ideas fostered doubts about the security of DES. There were also accusations that the NSA (see Question 6.2.2) had intentionally weakened DES. Despite these suspicions, no feasible way to break DES faster than exhaustive search (see Question 2.4.3) has been discovered. The cost of a specialized computer to perform exhaustive search (requiring 3.5 hours on average) has been estimated by Wiener at one million dollars [Wie94]. This estimate was recently updated by Wiener [Wie98] to give an average time of 35 minutes for the same cost machine.
The first attack on DES that is better than exhaustive search in terms of computational requirements was announced by Biham and Shamir [BS93a] using a new technique known as differential cryptanalysis (see Question 2.4.5). This attack requires the encryption of 2^{47} chosen plaintexts (see Question 2.4.2); that is, the plaintexts are chosen by the attacker. Although it is a theoretical breakthrough, this attack is not practical because of both the large data requirements and the difficulty of mounting a chosen-plaintext attack. Biham and Shamir have stated they consider DES secure.
More recently Matsui [Mat94] has developed another attack, known as linear cryptanalysis (see Question 2.4.5). By means of this method, a DES key can be recovered by the analysis of 2^{43} known plaintexts. The first experimental cryptanalysis of DES, based on Matsui's discovery, was successfully achieved in an attack requiring 50 days on 12 HP 9000/735 workstations. Clearly, this attack is still impractical.
Most recently, a DES cracking machine was used to recover a DES key in 22 hours; see Question 2.4.4. The consensus of the cryptographic community is that DES is not secure, simply because 56-bit keys are vulnerable to exhaustive search. In fact, DES is no longer allowed for U.S. government use; triple-DES (see Question 3.2.6) is the encryption standard until AES (see Section 3.3) is ready for general use.
With minor additional preprocessing it is not that easy to break even very weak ciphers, algorithms that are often dismissed by the popular press as insecure. This is even more true of DES. Let's assume that it is incorporated into some hardware or black-box program and you need to enhance the security of the communication. You can use one (or a combination) of the following very simple methods:
Bijective Huffman encoding. Preprocess the plaintext with, say, bijective Huffman encoding. This is probably the most logical way to enhance the strength of stream ciphers. What you lose in encoding speed you get back in transmission speed. Moreover, any type of compression effectively shortens the message, which reduces the amount of ciphertext produced per key.
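A sketch of ordinary (not bijective) Huffman coding, just to show the compression step; a production system would use a bijective variant, in which every bit string decodes to some plaintext:

```python
import heapq
from collections import Counter

def huffman_code(data: bytes) -> dict[int, str]:
    # Build a prefix code from symbol frequencies (plain Huffman, for illustration)
    heap = [(n, i, [sym]) for i, (sym, n) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    code = {sym: "" for sym in Counter(data)}
    tick = len(heap)  # tie-breaker so tuples never compare lists
    while len(heap) > 1:
        n1, _, syms1 = heapq.heappop(heap)
        n2, _, syms2 = heapq.heappop(heap)
        for s in syms1:               # merged subtree goes left: prepend "0"
            code[s] = "0" + code[s]
        for s in syms2:               # merged subtree goes right: prepend "1"
            code[s] = "1" + code[s]
        heapq.heappush(heap, (n1 + n2, tick, syms1 + syms2))
        tick += 1
    return code

data = b"attack at dawn attack at dusk"
code = huffman_code(data)
bits = "".join(code[b] for b in data)
# The encoded message is shorter than the 8-bits-per-symbol original
assert len(bits) < 8 * len(data)
```

Frequent symbols get short codes and rare ones get long codes, so the letter boundaries no longer fall at fixed 8-bit positions: exactly the code-length manipulation discussed later in this page.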
These are just random thoughts, and there are definitely other simple ways to enhance any well-studied cipher like DES or GOST. The area is definitely more complex than this, and I am not a specialist in it, but still the level of hype and distortion is enough to raise red flags even for me.
My impression is that even in the best case brute-force attacks on DES are mostly impractical. So "other" methods are typically used. Infecting computers with a sniffing Trojan has recently become the method of choice for three-letter agencies that want access to secret documents. See the discussion of the Trojans (Duqu, Flame) that were detected in attacks on Iran's enrichment activities as an example of the methods used.
As many attacks depend on the length of the ciphertext encrypted with a single key, the amount of plaintext encrypted with a given key should generally be limited. That dictates the necessity of some formal method of generating a sequence of "good" keys.
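One simple way to generate such a sequence is to chain a hash function, so each message (or each fixed amount of traffic) gets its own key. This is only a sketch; a real design would use a standard key-derivation function such as HKDF:

```python
import hashlib

def key_sequence(master: bytes, count: int, keylen: int = 16) -> list[bytes]:
    # Derive a chain of per-message keys by repeated hashing, so that only a
    # limited amount of traffic is ever encrypted under any single key.
    # (Sketch only; a real design would use a standard KDF such as HKDF.)
    keys, state = [], master
    for _ in range(count):
        state = hashlib.sha256(state).digest()
        keys.append(state[:keylen])
    return keys

keys = key_sequence(b"master secret", 5)
assert len(set(keys)) == 5   # five distinct per-message keys
```

Both parties can derive the same sequence from the shared master secret, and compromising one derived key does not directly reveal the master.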
All in all, I feel that the availability of powerful computers has radically changed this field, not in favor of cryptanalysis specialists but in favor of those who want to protect the text from decryption. Methods such as compression, steganography, new types of one-time pads, and code-length manipulation (with Huffman encoding the length of a letter becomes variable) generally favor the defender, not the attacker, of the encrypted text. All these methods provide a lot of interesting opportunities and IMHO can significantly enhance the level of security of the ciphers used. Unfortunately, these areas are usually ignored in traditional cryptography textbooks.
I would like to reiterate my impression that with the current level of sophistication of crypto algorithms, the chances that a particular message will be decrypted are very small, and outside of the well-known three-letter agencies are minimal even with ciphers that are considered "weak" (e.g. DES). All discussions of the weaknesses of DES assume that it is the single level of defense and that the party involved does not understand the limitations of the technology used. The reality is more complex, and it is more realistic to assume that in all practical cases the relative weakness of DES with short keys (as implemented, for example, in the export version of Lotus Notes) is known to both parties.
And it might well be that further progress can be achieved not by creating a stronger cipher, but by using new capabilities provided by computers, including but not limited to striping (multiple-pipeline encoding), compression, and steganography (with adding noise as the simplest form).
While compression of the plaintext (with rearranging to hide the prefix of the compressed file if standard compression utilities like zip are used) is the simplest and most effective method of increasing the strength of a cipher, steganography generally makes it possible to avoid using a cipher at all.
It is a very attractive method when the length of the transmitted text is not critical and can be at least doubled. The simplest examples include:
Please note that while compression is a powerful method of increasing the security of messages, the use of off-the-shelf compression utilities is a mixed blessing: they reduce the overall redundancy of the text, but they often put a predictable header at the beginning of the compressed stream, which facilitates known-plaintext attacks. That problem can be solved by several different methods and at several different levels, and even the simplest communication device now contains enough memory and CPU power to use bijective Huffman encoding, which was designed to nullify this disadvantage. For more information, see
It has been argued that the prevention of brute-force attacks is actually one of the purposes of compression. As for chosen-plaintext attacks, compression does not add much: you can always compress the chosen plaintext and compare results not with the plaintext itself but with its compressed encoding. But the limitation is that all of the plaintext needs to be known; partial knowledge is problematic and applies only to the initial bytes of the message, because the compressed image of the "internal text" depends on the previous context. Compression can also be based not on bytes but on larger entities, for example digrams. Another classic defense is replacing any frequently used word with a word randomly chosen from a set of languages, or (for articles) with a random letter combination that does not occur in the given natural language and can therefore be automatically eliminated. In the paper Compression and Security the author aptly summarized the advantages of this method:
It has long been appreciated that there are advantages to eliminating regularities in the plaintext before encrypting.
The primary advantages to doing this are:
 The opponents get less ciphertext to analyze;
 What they do get has a corresponding plaintext with fewer redundancies and regularities.
The advantage of the first point should be obvious enough: the less data the enemy has to analyze, the fewer clues they have about the internal state of your cipher, and thus its key. The advantage of the second point is that it hinders cryptanalytic attacks. "Fewer redundancies and regularities" may be translated into more formal terms as "greater entropy per bit". The more closely the statistical properties of the file approach those of a random data stream, the fewer regularities the cryptanalyst has to go on. All this should be uncontroversial.
That compression aids encryption was realized by those who first employed "codewords" in their ciphers. By replacing frequently-used words like "the" and "and" with otherwise little-used symbols before encrypting they succeeded in reducing the volume of the text based on known regularities in the English language. This type of cipher was employed, for example, by Mary Queen of Scots.
That eliminating patterns in the frequency of occurrence of particular symbols in the text before enciphering is desirable was clearly realized by the time homophones were employed in conjunction with monoalphabetic substitution ciphers.
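The "greater entropy per bit" point above can be measured directly: compressing English text pushes its byte distribution much closer to a random stream:

```python
import math
import zlib
from collections import Counter

def entropy_per_byte(data: bytes) -> float:
    # Shannon entropy of the byte distribution, in bits per byte (8.0 = random)
    total = len(data)
    return -sum(n / total * math.log2(n / total) for n in Counter(data).values())

text = b"the quick brown fox jumps over the lazy dog " * 50
packed = zlib.compress(text)

# Compressed output is both shorter and much closer to a random stream,
# which is exactly the two advantages the quoted paper lists
assert len(packed) < len(text)
assert entropy_per_byte(packed) > entropy_per_byte(text)
```

English text typically measures around 4 bits of entropy per byte, while compressed output approaches the 8-bit ceiling of random data.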
Dr. Nikolai Bezroukov

2002/185 (PDF) Turing, a fast stream cipher. Greg Rose and Philip Hawkes
2002/182 (PS, PS.GZ) Oblivious Keyword Search. Wakaha Ogata and Kaoru Kurosawa
2002/142 (PDF) On the Applicability of Distinguishing Attacks Against Stream Ciphers. Greg Rose and Philip Hawkes
2002/131 (PS, PS.GZ, PDF) An Improved Pseudorandom Generator Based on Hardness of Factoring. Nenad Dedic and Leonid Reyzin and Salil Vadhan
Since graduating in theoretical physics and electrical engineering some 30+ years ago I have had an interest in cryptography and this has developed with the advent of progressively more powerful home computers. In recent years I have played with a number of algorithms where I have taken a particular interest in the techniques involved in making algorithms go as fast as possible.
IMPORTANT
Classical Cryptography Course (Lanaki)
CSE207C Lattices in Cryptography and Cryptanalysis
Cryptanalysis of CipherSaber1
Cryptanalysis of Contents Scrambling System
Links to papers about cryptanalysis of block ciphers
FM 34-40-2 Basic Cryptanalysis
Cryptography, Encryption and Steganography
Springer LINK Lecture Notes in Computer Science 2133
Public-Key Cryptosystems Using Symmetric-Key Cryptoalgorithms
Bruce Christianson, Bruno Crispo, and James A. Malcolm
Abstract. The prospect of quantum computing makes it timely to consider the future of public-key cryptosystems. Both factorization and discrete logarithm correspond to a single quantum measurement, upon a superposition of candidate keys transformed into the Fourier domain. Accordingly, both these problems can be solved by a quantum computer in a time essentially proportional to the bit-length of the modulus, a speedup of exponential order.
At first sight, the resulting collapse of asymmetric-key cryptoalgorithms seems to herald the doom of public-key cryptosystems. However for most security services, asymmetric-key cryptoalgorithms actually offer relatively little practical advantage over symmetric-key algorithms. Most of the differences popularly attributed to the choice of cryptoalgorithm actually result from subtle changes in assumptions about hardware or domain management.
In fact it is straightforward to see that symmetric-key algorithms can be embodied into tamper-proof hardware in such a way as to provide equivalent function to a public-key cryptosystem, but the assumption that physical tampering never occurs is too strong for practical purposes. Our aim here is to build a system which relies merely upon tamper-evident hardware, but which maintains the property that users who abuse their cryptographic modules through malice or stupidity harm only themselves, and those others who have explicitly trusted them.
LNCS 2133, p. 182 ff.
AES and Beyond: The IETF and Strong Crypto. Nortel slides about some crypto issues in networking. Pretty basic, but still useful.
Crypto Scientists Crack Prime Problem
Recently, a group of Indian scientists made news by announcing an algorithm that appears to be able to tell quickly whether a number is prime or not.
If you're mathematically minded, the actual downloadable primality.pdf is worth reading.
So what does this actually mean for cryptography? First, a little background.
Many of the popular common crypto algorithms work because of "something to do with prime numbers". Most security books are about that vague. So math research about primes could have interesting effects on our field. But is being able to determine whether a number is prime quickly going to be able to help or hinder us? Let's look at the RSA algorithm as an illustrative example. (It lost its patent a few years back, so it's okay to discuss now.)
... ... ...
Public key crypto algorithms such as RSA depend on there being two keys used to encrypt and decrypt a message. (Hence the "generate a key pair" step you see when setting up many applications that use cryptography.) Every user has a complementary set made up of a private key and a public key. Anything encrypted with the private key can be decrypted with the public key, and anything encrypted with the public key can be decrypted with the private key. Only you should have a copy of your private key, but anyone can have your public key because it's, well, public. If someone encrypts traffic with your public key, it doesn't matter to you because only you can decrypt it.
So, you're probably thinking, if I have a message to send to Jane, I want to encrypt it. I can't encrypt it with my public key, because she doesn't have my private key to decrypt it. So I'll encrypt it with my private key, and she can decrypt it with my public key. Right? Not quite, but this is a really common mistake. Sure, Jane can decrypt the message with your public key. But so can anyone else. What you need to do is encrypt the message with Jane's public key, so that only Jane's private key (which only Jane should have) can decrypt it.
So, the RSA algorithm says this:
 Take two large prime numbers, p and q.
 When multiplied together, they have a product N.
 Find two numbers E and D, such that:
 When E is multiplied by D, the result should be equal to one mod (p-1)(q-1).
 What this boils down to is that E and (p-1)(q-1) have to be relatively prime.
 They can't share any common factors.
8 and 9 are relatively prime. When broken down as much as possible:
8 = 2 x 2 x 2
9 = 3 x 3
Nothing in common.
8 and 20 are not relatively prime:
8 = 2 x 2 x 2
20 = 2 x 2 x 5
They have 2 in common, so they're not relatively prime.
If E and D are chosen correctly, then let C be the ciphertext and M the plaintext:
C = M to the E power, mod N
M = C to the D power, mod N
So, something encrypted with N and E (the public key) can be decrypted back into the plaintext M with the private key, and something encrypted with N and D (the private key) can be decrypted with the public key. And since E and D fit together in a defined mathematical relationship as above, you cannot automatically deduce one from the other, but you can encrypt and decrypt. The beauty of the modulus is that it's a one-way operation. You know what the remainder is, but you'll have to try brute-forcing it to figure out whether it's one times the modulus with a remainder of three, two times with a remainder of three... forty thousand times with a remainder of three... [grin] That takes a lot of time.
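The whole scheme can be worked through with the classic toy primes p = 61 and q = 53 (numbers this small are trivially factorable, which is precisely why real keys use enormous primes):

```python
# Toy RSA with tiny primes; illustrative only
p, q = 61, 53
N = p * q                    # modulus: 3233
phi = (p - 1) * (q - 1)      # 60 * 52 = 3120
E = 17                       # public exponent, relatively prime to phi
D = pow(E, -1, phi)          # private exponent: E * D == 1 (mod phi)
assert (E * D) % phi == 1

M = 65                       # the "plaintext", a number smaller than N
C = pow(M, E, N)             # encrypt with the public key (N, E)
assert pow(C, D, N) == M     # decrypt with the private key (N, D)
```

Anyone who can factor N = 3233 back into 61 and 53 can recompute phi and hence D, which is why the security of RSA rests on the difficulty of factoring.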
If you want to see an example of this worked out with numbers, there's a clear one at http://math.kennesaw.edu/maa/talks/RSAEncryptionAlgorithm.htm
So, back to our original point. Being able to quickly determine whether a number is prime: what effect does that have on all this? Well, one of the weakest points of RSA and other public key algorithms is that their large prime numbers are only probably prime. It's really hard to tell whether a number with eight zillion digits is actually prime or not; naively, you have to try dividing it by every prime number up to its square root. That's very time-consuming. Since those of us who use PGP, etc., don't want to wait too long for our keys to be generated, RSA implementations pick values for P and Q that are very likely to be prime, but that's not known for certain.
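What "very likely to be prime" means in practice is a probabilistic test. A sketch of Miller-Rabin, the kind of test key generators actually run (each random round that fails to expose a witness roughly quarters the chance of a composite slipping through):

```python
import random

def probably_prime(n: int, rounds: int = 20) -> bool:
    # Miller-Rabin probabilistic primality test; a "True" answer means
    # n is composite with probability at most 4**-rounds.
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    d, r = n - 1, 0
    while d % 2 == 0:          # write n - 1 as d * 2**r with d odd
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False       # a witness proves n is composite
    return True

assert probably_prime(104729)   # the 10000th prime
assert not probably_prime(561)  # Carmichael number that fools the Fermat test
```

A deterministic polynomial-time test, like the one the article discusses, removes even that tiny residual doubt.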
If those numbers aren't actually prime, then there may be solutions to the equations other than the ones that are supposed to work. So someone might be able to decrypt a message without having the matching key; they'd just need a matching key, if there were more than one. (That's what could happen if P and Q aren't prime.) If the new algorithm can determine whether P and Q are really prime, and for a given key pair they're not, that could lead to a weakness in RSA. But in that case RSA and other algorithm authors could modify their software to use the new algorithm to ensure that P and Q really are prime, and that would defeat that sort of attack.
There's a lot of sound and fury at the moment about this article, and many people are freaking out about it, but I don't think it's anything to worry about. Mathematicians haven't fully satisfied themselves yet that it's a good tester for primes  I don't think we'll be seeing exploit code in the near future.
Lectures for Computer Security
These lectures contain the base introductory material used for this course. After these lectures, the student will be familiar with the underlying concepts of advanced operating systems.
Crypto Lectures
Information on cryptography useful collection of links:
[Oct 20, 2002] Crypto++ Library 5.0  a Free C++ Class Library of Cryptographic Schemes
[Oct 20, 2002] Speed Comparison of Popular Crypto Algorithms
Here are speed benchmarks for some of the most popular hash algorithms and symmetric and asymmetric ciphers. All were coded in C++ or ported to C++ from C implementations, compiled with Microsoft Visual C++ 6.0 SP4 (optimize for speed, blend code generation), and run on a Celeron 850 MHz processor under Windows 2000 SP1. Two assembly routines were used for multiple-precision addition and subtraction.
Algorithm                        Bytes Processed   Time (s)   MB (2^20 bytes)/s
CRC32                            1073741824        8.682      117.945
Adler32                          2147483648        6.970      293.831
MD2                              8388608           11.276     0.709
MD5                              1073741824        10.165     100.738
SHA-1                            536870912         10.565     48.462
SHA-256                          268435456         10.345     24.746
SHA-512                          67108864          7.761      8.246
HAVAL (pass=3)                   536870912         7.922      64.630
HAVAL (pass=4)                   536870912         12.337     41.501
HAVAL (pass=5)                   268435456         7.090      36.107
Tiger                            268435456         10.325     24.794
RIPEMD-160                       268435456         8.332      30.725
Panama Hash (little endian)      1073741824        7.401      138.360
Panama Hash (big endian)         1073741824        11.797     86.802
MDC/MD5                          268435456         9.884      25.900
Luby-Rackoff/MD5                 67108864          8.402      7.617
DES                              134217728         9.945      12.871
DES-XEX3                         134217728         11.716     10.925
DES-EDE3                         33554432          6.740      4.748
IDEA                             134217728         11.286     11.341
RC2                              33554432          7.912      4.044
RC5 (r=16)                       536870912         12.988     39.421
Blowfish                         134217728         7.091      18.051
Diamond2                         67108864          11.086     5.773
Diamond2 Lite                    67108864          9.403      6.806
3-WAY                            201326592         12.728     15.085
TEA                              134217728         12.799     10.001
SAFER (r=8)                      67108864          10.565     6.058
GOST                             134217728         12.829     9.977
SHARK (r=6)                      268435456         12.878     19.879
CAST-128                         134217728         7.090      18.054
CAST-256                         134217728         9.995      12.806
Square                           268435456         7.801      32.816
SKIPJACK                         67108864          12.017     5.326
RC6                              268435456         7.871      32.524
MARS                             268435456         8.503      30.107
Rijndael                         268435456         8.442      30.325
Twofish                          268435456         9.974      25.667
Serpent                          134217728         10.505     12.185
ARC4                             536870912         8.122      63.039
SEAL                             1073741824        8.672      118.081
WAKE                             1073741824        13.029     78.594
Panama Cipher (little endian)    1073741824        8.512      120.301
Panama Cipher (big endian)       536870912         7.091      72.204
Sapphire                         134217728         12.868     9.947
MD5-MAC                          1073741824        12.078     84.782
XMACC/MD5                        1073741824        11.096     92.286
HMAC/MD5                         1073741824        10.254     99.863
CBC-MAC/RC6                      268435456         8.713      29.381
DMAC/RC6                         268435456         8.642      29.623
Blum-Blum-Shub 512               524288            10.766     0.046
Blum-Blum-Shub 1024              262144            12.668     0.020
Blum-Blum-Shub 2048              65536             8.903      0.007
[Oct 20, 2002] Cryptographic Algorithms  discussion of several popular algorithms
[Aug 3, 2002] Useful links
O'Reilly Java Center  News  An Interview with Jonathan Knudsen
Java Cryptography  Sample chapter Authentication
The first challenge of building a secure application is authentication. Let's look at some examples of authentication from everyday life:
 At an automated bank machine, you identify yourself using your bank card. You authenticate yourself using a personal identification number (PIN). The PIN is a shared secret, something that both you and the bank know. Presumably, you and the bank are the only ones who know this number.
 When you use a credit card, you identify yourself with the card. You authenticate yourself with your signature. Most store clerks never check the signature; in this situation, possession of the card is authentication enough. This is true when you order something over the telephone, as well; simply knowing the credit card number is proof of your identity.
 When you rent a movie at a video store, you prove your identity with a card or by saying your telephone number.
Authentication is tremendously important in computer applications. The program or person you communicate with may be in the next room or on another continent; you have none of the usual visual or aural clues that are helpful in everyday transactions. Public key cryptography offers some powerful tools for proving identity.
In this chapter, I'll describe three cryptographic concepts that are useful for authentication:
 Message digests produce a small "fingerprint" of a larger set of data.
 Digital signatures can be used to prove the integrity of data.
 Certificates are used as cryptographically safe containers for public keys.
A common feature of applications, especially custom-developed "enterprise" applications, is a login window. Users have to authenticate themselves to the application before they use it. In this chapter, we'll examine several ways to implement this with cryptography.[1] In the next section, for instance, I'll show two ways to use a message digest to avoid transmitting a password in cleartext from a client to a server. Later on, we'll use digital signatures instead of passwords.
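The message-digest idea mentioned above can be sketched with the standard java.security.MessageDigest API. The scheme below is a generic challenge-response, not necessarily the book's exact protocol, and all class and method names are illustrative: the server issues a random challenge, and the client returns a digest of the challenge concatenated with the password, so the password itself never crosses the wire.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;

public class DigestLogin {
    // Server side: issue a fresh random challenge so each login attempt differs
    // (this also defeats simple replay of a captured response).
    public static byte[] newChallenge() {
        byte[] c = new byte[16];
        new SecureRandom().nextBytes(c);
        return c;
    }

    // Both sides compute digest(challenge || password); the password itself
    // is never transmitted.
    public static byte[] response(byte[] challenge, String password)
            throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        md.update(challenge);
        md.update(password.getBytes(StandardCharsets.UTF_8));
        return md.digest();
    }

    // Server side: recompute the expected response and compare in constant time.
    public static boolean verify(byte[] challenge, byte[] clientResponse, String storedPassword)
            throws NoSuchAlgorithmException {
        return MessageDigest.isEqual(response(challenge, storedPassword), clientResponse);
    }

    public static void main(String[] args) throws Exception {
        byte[] challenge = newChallenge();
        byte[] resp = response(challenge, "secret");          // computed by the client
        System.out.println(verify(challenge, resp, "secret")); // true
        System.out.println(verify(challenge, resp, "guess"));  // false
    }
}
```

Note the trade-off: the server must store the password (or a digest the client can also compute), which is why the chapter goes on to digital signatures as a stronger alternative.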
aesutil 1.0.1 (Stable) by Tim Tassonis  Friday, July 19th 2002 13:20 EDT 
About: aesutil is a small library and command line program to encrypt or decrypt data using the Rijndael algorithm in CBC mode.
Changes: A Windows port of the commandline utility, and better option handling.
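For comparison, the same Rijndael-in-CBC operation that aesutil performs is available in Java through the standard JCE API (Rijndael is the algorithm standardized as AES). This sketch illustrates the mode, not aesutil's own interface; in CBC mode each plaintext block is XORed with the previous ciphertext block, so a fresh random IV makes identical plaintexts encrypt differently.

```java
import java.security.SecureRandom;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;

public class CbcDemo {
    // Encrypt under AES/CBC with PKCS#5 padding; the IV need not be secret,
    // but must be unpredictable and sent along with the ciphertext.
    public static byte[] encrypt(SecretKey key, byte[] iv, byte[] plaintext) throws Exception {
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        return c.doFinal(plaintext);
    }

    public static byte[] decrypt(SecretKey key, byte[] iv, byte[] ciphertext) throws Exception {
        Cipher c = Cipher.getInstance("AES/CBC/PKCS5Padding");
        c.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        return c.doFinal(ciphertext);
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] iv = new byte[16];                 // AES block size
        new SecureRandom().nextBytes(iv);
        byte[] msg = "attack at dawn".getBytes("UTF-8");
        byte[] ct = encrypt(key, iv, msg);
        System.out.println(Arrays.equals(decrypt(key, iv, ct), msg)); // true
    }
}
```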
Google matched content 
Ten most useful resources:
Directories and Portals
Best metalink collections:
Associations and public organizations
Government:
People:
Companies:
Prime numbers and random number generators:
Etc:
***** Cryptology  an excellent introduction to cryptology
***** SSH  Tech Corner  Cryptographic Algorithms  a very good intro text. The best I have found on the Web.
Packet Storm  http://packetstormsecurity.org
Counterpane Labs SelfStudy Course in Block Cipher Cryptanalysis
Cryptography tutorial (Peter Gutmann)
The Cryptography API, or How to Keep a Secret
Learning about Cryptography by Terry Ritter
Basic
VB Helper Tutorial Cryptography  basic
Cryptography Tutorial, Math 4 ME 2000 Programming Workshop  very basic
Encryption and Security Tutorial
Welcome to the Elliptic Curve Cryptosystem Classroom. This site provides an intuitive introduction to elliptic curves and how they are used to create a secure and powerful cryptosystem. The first three sections introduce and explain the properties of elliptic curves. A background understanding of abstract algebra is required, much of which can be found in the Background Algebra section. The next section describes the factor that makes elliptic curve groups suitable for a cryptosystem through the introduction of the Elliptic Curve Discrete Logarithm Problem (ECDLP). The last section brings the theory together and explains how elliptic curves and the ECDLP are applied in an encryption scheme. This classroom requires a Java-enabled browser for the interactive elliptic curve experiments and animated examples.
Elliptic curves as algebraic/geometric entities have been studied extensively for the past 150 years, and from these studies has emerged a rich and deep theory. Elliptic curve systems as applied to cryptography were first proposed in 1985 independently by Neal Koblitz from the University of Washington, and Victor Miller, who was then at IBM, Yorktown Heights.
Many cryptosystems require the use of algebraic groups. Elliptic curves may be used to form elliptic curve groups. A group is a set of elements with custom-defined arithmetic operations on those elements. For elliptic curve groups, these specific operations are defined geometrically. Introducing more stringent properties to the elements of a group, such as limiting the number of points on the curve, creates an underlying field for an elliptic curve group. In this classroom, elliptic curves are first examined over the real numbers in order to illustrate the geometrical properties of elliptic curve groups. Thereafter, elliptic curve groups are examined with the underlying fields F_p (where p is a prime) and F_2^m (a binary representation with 2^m elements).
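The geometric group law described above reduces to simple modular arithmetic in affine coordinates: the chord (or tangent) through two points determines a slope s, from which the sum is computed. A minimal sketch over the toy curve y^2 = x^3 + 2x + 2 over F_17 (parameters chosen only for illustration; real systems use fields with on the order of 2^160 or more elements):

```java
import java.math.BigInteger;

public class TinyCurve {
    // Toy curve y^2 = x^3 + 2x + 2 over F_17 -- far too small for real use.
    static final BigInteger P = BigInteger.valueOf(17);
    static final BigInteger A = BigInteger.valueOf(2);   // coefficient of x

    // Affine point addition; null represents the point at infinity (identity).
    public static BigInteger[] add(BigInteger[] p1, BigInteger[] p2) {
        if (p1 == null) return p2;
        if (p2 == null) return p1;
        BigInteger s;
        if (p1[0].equals(p2[0])) {
            // Same x: either P + (-P) = O, or doubling via the tangent slope
            // s = (3x^2 + a) / (2y).
            if (!p1[1].equals(p2[1]) || p1[1].signum() == 0) return null;
            s = p1[0].pow(2).multiply(BigInteger.valueOf(3)).add(A)
                 .multiply(p1[1].shiftLeft(1).modInverse(P)).mod(P);
        } else {
            // Chord through two distinct points: s = (y2 - y1) / (x2 - x1).
            s = p2[1].subtract(p1[1])
                 .multiply(p2[0].subtract(p1[0]).modInverse(P)).mod(P);
        }
        BigInteger x3 = s.pow(2).subtract(p1[0]).subtract(p2[0]).mod(P);
        BigInteger y3 = s.multiply(p1[0].subtract(x3)).subtract(p1[1]).mod(P);
        return new BigInteger[] { x3, y3 };
    }

    public static void main(String[] args) {
        BigInteger[] g = { BigInteger.valueOf(5), BigInteger.ONE };  // (5,1) is on the curve
        BigInteger[] g2 = add(g, g);
        System.out.println(g2[0] + "," + g2[1]);   // 6,3
    }
}
```

Division becomes multiplication by a modular inverse, which is what makes the same formulas work over any finite field.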
Cryptography for encryption, signatures and authentication
Cryptography  mainly PGP related...
Sample Chapters
CRC Press has generously given us permission to make all chapters available for free download.
Please read this copyright notice before downloading any of the chapters.
 Chapter 1  Overview of Cryptography (48 pages): Postscript file, 554k; PDF file, 343k.
 Chapter 2  Mathematics Background (38 pages): Postscript file, 472k; PDF file, 301k.
 Chapter 3  Number-Theoretic Reference Problems (46 pages): Postscript file, 543k; PDF file, 397k.
 Chapter 4  Public-Key Parameters (36 pages): Postscript file, 497k; PDF file, 331k.
 Chapter 5  Pseudorandom Bits and Sequences (22 pages): Postscript file, 330k; PDF file, 206k.
 Chapter 6  Stream Ciphers (32 pages): Postscript file, 484k; PDF file, 274k.
 Chapter 7  Block Ciphers (60 pages): Postscript file, 783k; PDF file, 491k.
 Chapter 8  Public-Key Encryption (36 pages): Postscript file, 434k; PDF file, 303k.
 Chapter 9  Hash Functions and Data Integrity (61 pages): Postscript file, 690k; PDF file, 482k.
 Chapter 10  Identification and Entity Authentication (40 pages): Postscript file, 444k; PDF file, 316k.
 Chapter 11  Digital Signatures (64 pages): Postscript file, 748k; PDF file, 526k.
 Chapter 12  Key Establishment Protocols (53 pages): Postscript file, 532k; PDF file, 400k.
 Chapter 13  Key Management Techniques (48 pages): Postscript file, 536k; PDF file, 340k.
 Chapter 14  Efficient Implementation (44 pages): Postscript file, 547k; PDF file, 371k.
 Chapter 15  Patents and Standards (27 pages): Postscript file, 296k; PDF file, 212k.
 Appendix  Bibliography of Papers from Selected Cryptographic Forums (40 pages): Postscript file, 363k; PDF file, 331k.
 References (52 pages): Postscript file, 521k; PDF file, 459k.
 Index (26 pages): Postscript file, 236k; PDF file, 160k.
Dynamical systems are often described as "unpredictable" or "complex" as aspects of their behavior may bear a cryptic relationship with the simple evolution laws which define them. Some theorists work to quantify this complexity in various ways. Others try to turn the cryptic nature of dynamical systems to a practical end: encryption of messages to preserve their secrecy. Here some previous efforts to engineer cryptosystems based on dynamical systems are reviewed, leading up to a detailed proposal for a cellular automaton cryptosystem.
Cryptosystems constructed from cellular automaton primitives can be implemented in simply constructed massively parallel hardware. They can be counted on to deliver high encryption/decryption rates at low cost. In addition to these practical features, cellular automaton cryptosystems may help illuminate some foundational issues in both dynamical systems theory and cryptology, since each of these disciplines rests heavily on the meanings given to the intuitive notion of complexity.
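The best-known concrete instance of this idea is Wolfram's proposal to use cellular automaton rule 30, sampling the center cell of the evolving configuration as a keystream. The sketch below illustrates that generic construction, not necessarily the scheme proposed in the paper above; the ring size and key bits are arbitrary, and a toy of this size is not secure.

```java
public class Rule30Stream {
    // One synchronous update of a circular cellular automaton under rule 30:
    // new cell = left XOR (center OR right).
    static boolean[] step(boolean[] cells) {
        int n = cells.length;
        boolean[] next = new boolean[n];
        for (int i = 0; i < n; i++) {
            boolean l = cells[(i + n - 1) % n], c = cells[i], r = cells[(i + 1) % n];
            next[i] = l ^ (c | r);
        }
        return next;
    }

    // Keystream: the center cell sampled once per CA step; the initial
    // configuration plays the role of the secret key.
    static byte[] keystream(boolean[] key, int nBytes) {
        boolean[] state = key.clone();
        byte[] out = new byte[nBytes];
        for (int i = 0; i < nBytes; i++)
            for (int bit = 0; bit < 8; bit++) {
                state = step(state);
                if (state[state.length / 2]) out[i] |= (byte) (1 << bit);
            }
        return out;
    }

    // As with any stream cipher, encryption and decryption are the same XOR.
    static byte[] xor(byte[] data, byte[] ks) {
        byte[] out = new byte[data.length];
        for (int i = 0; i < data.length; i++) out[i] = (byte) (data[i] ^ ks[i]);
        return out;
    }

    public static void main(String[] args) {
        boolean[] key = new boolean[31];
        key[3] = key[11] = key[17] = key[28] = true;   // a toy secret key
        byte[] msg = "hello".getBytes();
        byte[] ct = xor(msg, keystream(key, msg.length));
        byte[] pt = xor(ct, keystream(key, msg.length));
        System.out.println(new String(pt)); // hello
    }
}
```

Each cell update is local, which is why such schemes map naturally onto the massively parallel hardware the abstract mentions.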
Prime Numbers  University of Tennessee
Mercy  block encryption algorithm. Mercy is a fast block cipher operating on 4096-bit blocks, designed specifically around the needs of disk sector encryption. It takes a 128-bit parameter representing the block number being encrypted, so that saving the same plaintext to different blocks results in different ciphertexts. Mercy was presented at Fast Software Encryption 2000.
SSH  Tech Corner  Cryptographic Algorithms
Cryptography, SSH, prime numbers, factorisation, vigenere, crypto ...
[FW1] Crypto Algorithms, need to know the crypto algorithm for ...
Digital Code Signing StepbyStep Guide
Differential cryptanalysis is a chosen-plaintext/chosen-ciphertext cryptanalytic attack. Cryptanalysts choose pairs of plaintexts such that there is a specified difference between the members of each pair. They then study the difference between the members of the corresponding pair of ciphertexts. Statistics of the plaintext-pair/ciphertext-pair differences can yield information about the key used in encryption. Differential cryptanalysis should be seen as mostly a "white hat" method, since such an attack would be very hard to mount in a real-world situation. But a possible weakness may allow more practical attacks in the real world. Since differential cryptanalysis became public knowledge, it has become an essential tool of cipher designers. No cipher will be taken seriously unless there is reason to believe it has strong resistance to this attack.
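The statistic at the heart of the attack is the difference distribution table of a cipher's S-boxes: for each input difference dx, count how often each output difference dy occurs. Large, uneven entries are the leverage the attacker exploits. A sketch for a single 4-bit S-box (the values are the first row of DES's S-box S1, treated here as a standalone toy S-box for illustration):

```java
public class DiffTable {
    // First row of DES S-box S1, used as a standalone 4-bit S-box.
    static final int[] SBOX = { 0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
                                0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7 };

    // ddt[dx][dy] = number of inputs x for which S(x) XOR S(x XOR dx) = dy.
    // An ideal S-box would keep these counts low and flat; each row sums to 16.
    static int[][] ddt() {
        int[][] t = new int[16][16];
        for (int dx = 0; dx < 16; dx++)
            for (int x = 0; x < 16; x++)
                t[dx][SBOX[x] ^ SBOX[x ^ dx]]++;
        return t;
    }

    public static void main(String[] args) {
        int[][] t = ddt();
        int max = 0, bestDx = 0, bestDy = 0;
        for (int dx = 1; dx < 16; dx++)          // skip the trivial dx = 0 row
            for (int dy = 0; dy < 16; dy++)
                if (t[dx][dy] > max) { max = t[dx][dy]; bestDx = dx; bestDy = dy; }
        System.out.printf("best differential: %x -> %x holds for %d/16 inputs%n",
                          bestDx, bestDy, max);
    }
}
```

A differential that holds for far more than the average 1 in 16 inputs can be chained across rounds, which is exactly the statistical signal the paragraph above describes.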
See also Differential cryptanalysis  Wikipedia
AES is a block cipher standard developed as a successor to DES. Several interesting algorithms were submitted as candidates to become AES. The selected algorithm, called Rijndael (one suggested pronunciation: "rain doll"), is a variant of an algorithm called Square.
The Enigma was one of the best of the new electromechanical cipher machines produced for the commercial market in the 1920s. Hugo Koch, a Dutchman, conceived of the machine in 1919. Arthur Scherbius first produced it commercially in 1923. Impressed by its security, which was based on statistical analysis, the German government acquired all rights to the machine and adapted it to the needs of its new, modern military forces. It became the standard cipher machine of the military services, of German agents, and of the secret police. It was also used at all echelons from high command to frontline tactical units including individual airplanes, tanks, and ships. An ordinary three-wheel Enigma with reflector and six plug connections generated the following number of coding positions:
3,283,883,513,796,974,198,700,882,069,882,752,878,379,955,261,095,623,685,444,055,315,226,006,433,616,627,409,666,933,182,371,154,802,769,920,000,000,000
Given this statistical capability, proper communications procedures and practices, and the fact that solving the Enigma on a timely basis would require rapid analytic machinery which did not exist, the Germans regarded the Enigma as impenetrable even if captured. The Germans, however, did not always practice proper communications security, and, more importantly, the Allies, even in 1938-39, were on the verge of creating the necessary cryptanalytic machinery which would unlock the Enigma's secrets. The evolution of this technology and its application were major contributing factors to the ultimate Allied victory in World War II.
Alan Turing and the Battle of the Atlantic
The Bombe was used with success from the summer of 1940 onwards, to break messages enciphered on the simpler Enigma system used by the German Air Force. But the most important messages were those to and from the U-boat fleet, and these were enciphered on a much more secure Enigma system.
Alan Turing took on this problem, going against the prevailing view that it would prove unbreakable. Although he had crucial new ideas at the end of 1939, not much practical progress could be made until there was a breakthrough in the spring of 1941, with the capture of some vital missing information. After that, the U-boat communications were effectively mastered. Alan Turing headed the cryptanalysis of all German Naval signals in Hut Eight.
The naval Enigma was more complicated than those of the other German services, using a stock of eight rather than five rotors. For the Bombe to work in a practical time it was necessary to find ways of cutting down the number of possibilities. Alan Turing developed 'Banburismus,' a statistical and logical technique of great elegance, to find the identity of the rotors of the enciphering Enigma before using the Bombe. Turing made major developments in Bayesian statistical theory for this work. Tony Sale has a sequence of pages on Naval Enigma explaining in considerable detail what Alan Turing did and how Banburismus worked.
See also Humor
The viral marketing campaign revolves around a Flash game infested with techie
throwaway words in which the user must guide a "worm" through a "computer system"
to collect "nodes" and "crack" a password within 60 seconds.
Jokes Magazine Employee Review January 25, 2000 (The classic crypto joke)
My boss asked me for a letter describing my partner Bob Smith, and this is what I wrote:
Bob Smith, my assistant programmer, can always be found
hard at work in his cubicle. Bob works independently, without
wasting company time talking to colleagues. Bob never
thinks twice about assisting fellow employees, and he always
finishes given assignments on time. Often Bob takes extended
measures to complete his work, sometimes skipping
coffee breaks. Bob is a dedicated individual who has absolutely no
vanity in spite of his high accomplishments and profound
knowledge in his field. I firmly believe that Bob can
be classed as a highcaliber employee, the type which cannot
be dispensed with. Consequently, I duly recommend that Bob
be promoted to executive management, and a proposal will
be executed as soon as possible.
S.D.  Project Leader
Shortly afterward I sent the following followup note: That bastard Bob was reading over my shoulder while I wrote the report sent to you earlier today. Kindly read only the odd numbered lines (1, 3, 5, etc.) for my true assessment. Regards,
S.D.
NSA declassifies crypto algorithms
The Metaphor Is the Key: Cryptography, the Clipper Chip, and the Constitution  HTMLized version of the 180-page University of Pennsylvania Law Review article by U. Miami School of Law Prof. A. Michael Froomkin. http://www.law.miami.edu/~froomkin/articles/clipper.htm
Cypherpunks, Cryptography & Hackers
The Last but not Least  Technology is dominated by two types of people: those who understand what they do not manage and those who manage what they do not understand. ~ Archibald Putt, Ph.D.
Copyright © 19962018 by Dr. Nikolai Bezroukov. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) in the author free time and without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided by section 107 of the US Copyright Law according to which such material can be distributed without profit exclusively for research and educational purposes.
This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links as it develops like a living tree...
You can use PayPal to make a contribution, supporting development of this site and speeding up access. In case softpanorama.org is down you can use the mirror at softpanorama.info
Disclaimer:
The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the author present and former employers, SDNP or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.
The site uses AdSense, so you need to be aware of Google's privacy policy. If you do not want to be tracked by Google, please disable JavaScript for this site. This site is perfectly usable without JavaScript.
Last modified: March 11, 2018