CRYPTOGRAPHIC ALGORITHMS



This paper has two major purposes. The first is to define some of the terms and concepts behind basic cryptographic methods, and to offer a way to compare the myriad cryptographic schemes in use today. The second is to provide some real examples of cryptography in use today. See Section A.

Several companies, products, and services are mentioned in this tutorial. Such mention is for example purposes only and, unless explicitly stated otherwise, should not be taken as a recommendation or endorsement by the author.

Some experts argue that cryptography appeared spontaneously sometime after writing was invented, with applications ranging from diplomatic missives to war-time battle plans. It is no surprise, then, that new forms of cryptography came soon after the widespread development of computer communications. In data and telecommunications, cryptography is necessary when communicating over any untrusted medium, which includes just about any network, particularly the Internet.

In cryptography, we start with the unencrypted data, referred to as plaintext. Plaintext is encrypted into ciphertext, which will in turn usually be decrypted back into usable plaintext.

The encryption and decryption are based upon the type of cryptography scheme being employed and some form of key. For those who like formulas, this process is sometimes written as C = E_k(P) and P = D_k(C), where P is the plaintext, C is the ciphertext, E is the encryption function, D is the decryption function, and k is the key. In many of the descriptions below, two communicating parties will be referred to as Alice and Bob; this is the common nomenclature in the crypto field and literature to make it easier to identify the communicating parties. If there are third and fourth parties to the communication, they will be referred to as Carol and Dave, respectively.
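To make the notation concrete, here is a minimal (and deliberately insecure) Python sketch in which E and D are a one-byte XOR standing in for a real cipher; the key value and message are arbitrary examples:

    # Toy illustration of C = E_k(P) and P = D_k(C) using a one-byte XOR "cipher".
    # This is only to make the notation concrete; it is NOT a secure algorithm.
    def E(k: int, p: bytes) -> bytes:
        return bytes(b ^ k for b in p)

    def D(k: int, c: bytes) -> bytes:
        return bytes(b ^ k for b in c)   # XOR is its own inverse

    key = 0x5A
    plaintext = b"ATTACK AT DAWN"
    ciphertext = E(key, plaintext)
    assert D(key, ciphertext) == plaintext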

A malicious party is referred to as Mallory, an eavesdropper as Eve, and a trusted third party as Trent. Finally, cryptography is most closely associated with the development and creation of the mathematical algorithms used to encrypt and decrypt messages, whereas cryptanalysis is the science of analyzing and breaking encryption schemes. Cryptology is the term referring to the broad study of secret writing, and encompasses both cryptography and cryptanalysis. There are several ways of classifying cryptographic algorithms.

For purposes of this paper, they will be categorized based on the number of keys that are employed for encryption and decryption, and further defined by their application and use. The three types of algorithms that will be discussed are (Figure 1):

Secret Key Cryptography (SKC): Uses a single key for both encryption and decryption; also called symmetric encryption. Primarily used for privacy and confidentiality.

Public Key Cryptography (PKC): Uses one key for encryption and another for decryption; also called asymmetric encryption. Primarily used for authentication, non-repudiation, and key exchange.

Hash Functions: Uses a mathematical transformation to irreversibly "encrypt" information, providing a digital fingerprint. Primarily used for message integrity.

Figure 1: Three types of cryptography.

Secret Key Cryptography

Secret key cryptography methods employ a single key for both encryption and decryption. As shown in Figure 1A, the sender uses the key to encrypt the plaintext and sends the ciphertext to the receiver. The receiver applies the same key to decrypt the message and recover the plaintext.

Because a single key is used for both functions, secret key cryptography is also called symmetric encryption. With this form of cryptography, it is obvious that the key must be known to both the sender and the receiver; that, in fact, is the secret. The biggest difficulty with this approach, of course, is the distribution of the key (more on that later in the discussion of public key cryptography). Secret key cryptography schemes are generally categorized as being either stream ciphers or block ciphers.

Stream ciphers operate on a single bit (byte or computer word) at a time and implement some form of feedback mechanism so that the key is constantly changing.

Stream ciphers come in several flavors, but two are worth mentioning here (Figure 2). Self-synchronizing stream ciphers calculate each bit in the keystream as a function of the previous n bits in the keystream. It is termed "self-synchronizing" because the decryption process can stay synchronized with the encryption process merely by knowing how far into the n-bit keystream it is.

One problem is error propagation; a garbled bit in transmission will result in n garbled bits at the receiving side. Synchronous stream ciphers generate the keystream in a fashion independent of the message stream but by using the same keystream generation function at sender and receiver. While stream ciphers do not propagate transmission errors, they are, by their nature, periodic so that the keystream will eventually repeat. A block cipher is so-called because the scheme encrypts one block of data at a time using the same key on each block.
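The synchronous-keystream idea can be sketched in a few lines of Python; here a hash in counter mode stands in for the keystream generator (this illustrates the structure only and is not a vetted cipher; the key and nonce values are arbitrary):

    import hashlib

    def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
        # Synchronous keystream: depends only on key, nonce, and counter, never on the message.
        out = bytearray()
        counter = 0
        while len(out) < length:
            out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(out[:length])

    def xor_crypt(key: bytes, nonce: bytes, message: bytes) -> bytes:
        ks = keystream(key, nonce, len(message))
        return bytes(m ^ k for m, k in zip(message, ks))

    msg = b"stream ciphers XOR a keystream with the plaintext"
    ct = xor_crypt(b"secret key", b"nonce-01", msg)
    assert xor_crypt(b"secret key", b"nonce-01", ct) == msg   # the same operation decrypts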

In general, the same plaintext block will always encrypt to the same ciphertext when using the same key in a block cipher whereas the same plaintext will encrypt to different ciphertext in a stream cipher.

The most common construct for block encryption algorithms is the Feistel cipher, named for cryptographer Horst Feistel (IBM). As shown in Figure 3, a Feistel cipher combines elements of substitution, permutation (transposition), and key expansion; these features create a large amount of "confusion and diffusion" (per Claude Shannon) in the cipher. One advantage of the Feistel design is that the encryption and decryption stages are similar, sometimes identical, requiring only a reversal of the key operation, thus dramatically reducing the size of the code (software) or circuitry (hardware) necessary to implement the cipher.
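The encrypt/decrypt symmetry of a Feistel construction is easy to see in a toy sketch; the round function F and the round keys below are hypothetical placeholders, not any standardized cipher:

    import hashlib

    def F(half: bytes, round_key: bytes) -> bytes:
        # Hypothetical round function: any keyed mixing function works for the illustration.
        return hashlib.sha256(round_key + half).digest()[:len(half)]

    def feistel(block: bytes, round_keys) -> bytes:
        L, R = block[:len(block) // 2], block[len(block) // 2:]
        for rk in round_keys:
            L, R = R, bytes(l ^ f for l, f in zip(L, F(R, rk)))
        return R + L                      # final swap

    keys = [b"k1", b"k2", b"k3", b"k4"]
    pt = b"16-byte-block!!!"              # block split into two 8-byte halves
    ct = feistel(pt, keys)
    # Decryption is the same routine with the round keys in reverse order.
    assert feistel(ct, list(reversed(keys))) == pt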

One of Feistel's early papers describing this operation is "Cryptography and Computer Privacy" (Scientific American, May 1973). DES is a Feistel block cipher employing a 56-bit key that operates on 64-bit blocks. DES has a complex set of rules and transformations that were designed specifically to yield fast hardware implementations and slow software implementations, although this latter point is not significant today since the speed of computer processors is several orders of magnitude faster today than even twenty years ago.

DES was based somewhat on an earlier cipher from Feistel called Lucifer which, some sources report, had a 112-bit key. This was rejected, partially in order to fit the algorithm onto a single chip and partially because of the National Security Agency (NSA).

DESX: A variant devised by Ron Rivest. By combining 64 additional key bits with the plaintext prior to encryption, DESX effectively increases the key length to 120 bits. Rijndael (AES): The algorithm can use a variable block length and key length; the latest specification allowed any combination of key lengths of 128, 192, or 256 bits and blocks of length 128, 192, or 256 bits.

They, too, have approved a number of cipher suites for various applications. Also available internationally. RC1: Designed on paper but never implemented. RC2: A 64-bit block cipher using variable-sized keys, designed to replace DES. Its code has not been made public, although many companies have licensed RC2 for use in their products. Described in an RFC. RC3: Found to be breakable during development. RC4: A stream cipher using variable-sized keys; it is widely used in commercial cryptography products. More detail about RC4 (and a little about Spritz) can be found below in Section 5.

RC5: A block cipher supporting a variety of block sizes (32, 64, or 128 bits), key sizes, and number of encryption passes over the data. Blowfish: Key lengths can vary from 32 to 448 bits in length. Twofish: A 128-bit block cipher using 128-, 192-, or 256-bit keys. Designed to be highly secure and highly flexible, well-suited for large microprocessors, 8-bit smart card microprocessors, and dedicated hardware. Camellia: Has some characteristics in common with AES, including a 128-bit block size and support for 128-, 192-, and 256-bit key lengths. Also described in an RFC. MISTY1: Developed at Mitsubishi Electric Corp.

Designed for hardware and software implementations, and is resistant to differential and linear cryptanalysis. SAFER: A series of block ciphers designed by James Massey for implementation in software and employing a 64-bit block. KASUMI: The intended confidentiality and integrity algorithm for both message content and signaling data for emerging mobile communications systems.

SEED: A block cipher using 128-bit blocks and 128-bit keys. ARIA: A 128-bit block cipher employing 128-, 192-, and 256-bit keys to encrypt 128-bit blocks in 12, 14, and 16 rounds, depending on the key size. Developed by a large group of researchers from academic institutions, research institutes, and federal agencies in South Korea, and subsequently named a national standard. CLEFIA: One of the new-generation lightweight block cipher algorithms designed after AES, offering high performance in software and hardware as well as a lightweight implementation in hardware.

SMS4: A 128-bit block cipher using 128-bit keys and 32 rounds to process a block. Skipjack: SKC scheme proposed, along with the Clipper chip, as part of the never-implemented Capstone project.

Although the details of the algorithm were never made public, Skipjack was a block cipher using an 80-bit key and 32 iteration cycles per 64-bit block.

Capstone, proposed by NIST and the NSA as a standard for public and government use, met with great resistance by the crypto community largely because the design of Skipjack was classified coupled with the key escrow requirement of the Clipper chip.

TEA: A family of block ciphers developed by Roger Needham and David Wheeler. TEA was originally developed in 1994, and employed a 128-bit key, a 64-bit block, and 64 rounds of operation. GSM mobile phone systems use several stream ciphers for over-the-air communication privacy. Use of this scheme is reportedly one of the reasons that the National Security Agency (NSA) can easily decode voice and data calls over mobile phone networks.

Described in an RFC, KCipher-2 is a stream cipher with a 128-bit key and a 128-bit initialization vector. Using simple arithmetic operations, the algorithm offers fast encryption and decryption by use of efficient implementations. KCipher-2 has been used for industrial applications, especially for mobile health monitoring and diagnostic services in Japan. Salsa and ChaCha: Salsa20 uses a pseudorandom function based on 32-bit whole-word addition, bitwise addition (XOR), and rotation operations, aka add-rotate-xor (ARX) operations.

Salsa20 uses a 256-bit key, although a 128-bit key variant also exists.

In 2008, Bernstein published ChaCha, a new family of ciphers related to Salsa20. Format-preserving encryption (FPE) schemes are used for such purposes as encrypting social security numbers, credit card numbers, limited-size protocol traffic, etc. FFX can theoretically encrypt strings of arbitrary length, although it is intended for message sizes smaller than that of AES (2^128 points). Simon and Speck: Simon and Speck are a pair of lightweight block ciphers proposed by the NSA in 2013, designed for highly constrained software or hardware environments.

While both cipher families perform well in both hardware and software, Simon has been optimized for high performance on hardware devices and Speck for performance in software. Both are Feistel ciphers and support ten combinations of block and key size. TWINE's design goals included maintaining a small footprint in a hardware implementation.

Designed in 2011, LED is a lightweight, 64-bit block cipher supporting 64- and 128-bit keys. LED is designed for RFID tags, sensor networks, and other applications with devices constrained by memory or compute power.

There are several other references that describe interesting algorithms and even SKC codes dating back decades; see, for example, Savard's (albeit old) A Cryptographic Compendium page. Public key cryptography has been said to be the most significant new development in cryptography in the last 300-400 years.

Modern PKC was first described publicly by Stanford University professor Martin Hellman and graduate student Whitfield Diffie in 1976. Their paper described a two-key crypto system in which two parties could engage in a secure communication over a non-secure communications channel without having to share a secret key. PKC depends upon the existence of so-called one-way functions, or mathematical functions that are easy to compute whereas their inverse function is relatively difficult to compute. Let me give you two simple examples: multiplying two large numbers is easy while factoring their product is hard, and raising a number to a power is easy while computing the corresponding (discrete) logarithm is hard. While the examples above are trivial, they do represent two of the functional pairs that are used with PKC; namely, the ease of multiplication and exponentiation versus the relative difficulty of factoring and calculating logarithms, respectively.
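A small sketch of that asymmetry, using toy primes (real RSA moduli are hundreds of digits long, where trial division is hopeless):

    # Multiplying two primes is instant; recovering them by trial division is already slower,
    # and becomes computationally infeasible as the numbers grow.
    p, q = 100003, 100019
    n = p * q                      # the easy direction

    def trial_factor(n: int) -> int:
        d = 2
        while n % d:
            d += 1
        return d                   # smallest prime factor

    assert trial_factor(n) == p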

The mathematical "trick" in PKC is to find a trap door in the one-way function so that the inverse calculation becomes easy given knowledge of some item of information.

Generic PKC employs two keys that are mathematically related although knowledge of one key does not allow someone to easily determine the other key. One key is used to encrypt the plaintext and the other key is used to decrypt the ciphertext. The important point here is that it does not matter which key is applied first, but that both keys are required for the process to work (Figure 1B).

Because a pair of keys are required, this approach is also called asymmetric cryptography. In PKC, one of the keys is designated the public key and may be advertised as widely as the owner wants. The other key is designated the private key and is never revealed to another party. It is straight-forward to send messages under this scheme. Suppose Alice wants to send Bob a message.

Alice encrypts some information using Bob's public key; Bob decrypts the ciphertext using his private key. This method could be also used to prove who sent a message; Alice, for example, could encrypt some plaintext with her private key; when Bob decrypts using Alice's public key, he knows that Alice sent the message authentication and Alice cannot deny having sent the message non-repudiation. Public key cryptography algorithms that are in use today for key exchange or digital signatures include:.

RSA today is used in hundreds of software products and can be used for key exchange, digital signatures, or encryption of small blocks of data. RSA uses a variable size encryption block and a variable size key.

The key-pair is derived from a very large number, n, that is the product of two prime numbers chosen according to special rules; these primes may be 100 or more digits in length each, yielding an n with roughly twice as many digits as the prime factors.

The public key information includes n and a derivative of one of the factors of n ; an attacker cannot determine the prime factors of n and, therefore, the private key from this information alone and that is what makes the RSA algorithm so secure. Some descriptions of PKC erroneously state that RSA's safety is due to the difficulty in factoring large prime numbers.

In fact, large prime numbers, like small prime numbers, only have two factors! The ability of computers to factor large numbers, and therefore attack schemes such as RSA, is rapidly improving, and systems today can find the prime factors of numbers with more than 200 digits. Nevertheless, if a large number is created from two prime factors that are roughly the same size, there is no known factorization algorithm that will solve the problem in a reasonable amount of time; one test to factor a 200-digit number reportedly took about 1.5 years.

Regardless, one presumed protection of RSA is that users can easily increase the key size to always stay ahead of the computer processing curve. As an aside, the patent for RSA expired in September 2000, which does not appear to have affected RSA's popularity one way or the other.
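The mechanics described above can be seen with a toy RSA key pair built from the classic textbook primes; these numbers are far too small to be secure and are shown only to illustrate the math:

    p, q = 61, 53                  # toy primes
    n = p * q                      # 3233, the public modulus
    phi = (p - 1) * (q - 1)        # 3120
    e = 17                         # public exponent, coprime to phi
    d = pow(e, -1, phi)            # private exponent (2753); requires Python 3.8+

    m = 65                         # "message" encoded as an integer smaller than n
    c = pow(m, e, n)               # encrypt with the public key (e, n)
    assert pow(c, d, n) == m       # decrypt with the private key (d, n)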

A detailed example of RSA is presented below in Section 5. Diffie-Hellman (D-H) is used for secret-key key exchange only, and not for authentication or digital signatures.
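A sketch of the exchange with a deliberately small modulus (assumed prime here); real deployments use standardized 2048-bit or larger groups, or elliptic curves:

    import secrets

    p = 0xFFFFFFFB                     # small toy prime modulus
    g = 5                              # generator

    a = secrets.randbelow(p - 2) + 1   # Alice's private value
    b = secrets.randbelow(p - 2) + 1   # Bob's private value

    A = pow(g, a, p)                   # Alice sends A over the open channel
    B = pow(g, b, p)                   # Bob sends B over the open channel

    # Each side combines its own private value with the other's public value.
    assert pow(B, a, p) == pow(A, b, p)   # identical shared secret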

More detail about Diffie-Hellman can be found below in Section 5. Digital Signature Algorithm (DSA): Described in FIPS 186. Elliptic Curve Cryptography (ECC): A PKC algorithm based upon elliptic curves. More detail about ECC can be found below in Section 5. The PKCS (Public-Key Cryptography Standards) documents are no longer easily available; all links in this section are from archive.org. PKCS 1: RSA Cryptography Standard. PKCS 2 and 4: Incorporated into PKCS 1. PKCS 3: Diffie-Hellman Key-Agreement Standard. PKCS 5: Password-Based Cryptography Standard. PKCS 6: Extended-Certificate Syntax Standard (being phased out in favor of X.509v3).

Cramer-Shoup: A public key cryptosystem proposed by R. Cramer and V. Shoup of IBM in 1998. LUC: A public key cryptosystem designed by P. Smith and based on Lucas sequences.

Can be used for encryption and signatures, using integer factoring. McEliece: A public key cryptosystem based on algebraic coding theory. (See also the Handbook of Applied Cryptography by A. Menezes, P. van Oorschot, and S. Vanstone, CRC Press.) A digression: Who invented PKC? I tried to be careful in the first paragraph of this section to state that Diffie and Hellman "first described publicly" a PKC scheme. Although I have categorized PKC as a two-key system, that has been merely for convenience; the real criterion for a PKC scheme is that it allows two parties to exchange a secret even though the communication with the shared secret might be overheard.

As shown in Section 5. And, indeed, it is the precursor to modern PKC which does employ two keys. Their method, of course, is based upon the relative ease of finding the product of two large prime numbers compared to finding the prime factors of a large number. Diffie and Hellman and other sources credit Ralph Merkle with first describing a public key distribution system that allows two parties to share a secret, although it was not a two-key system, per se. A Merkle Puzzle works where Alice creates a large number of encrypted keys, sends them all to Bob so that Bob chooses one at random and then lets Alice know which he has selected.

An eavesdropper (Eve) will see all of the keys but can't learn which key Bob has selected because he has encrypted the response with the chosen key. In this case, Eve's effort to break in is the square of the effort of Bob to choose a key. While this difference may seem small, it is often sufficient. Merkle apparently took a computer science course at UC Berkeley in 1974 and described his method, but had difficulty making people understand it; frustrated, he dropped the course.

Merkle's method certainly wasn't published first, but he is often credited with having had the idea first.


Some sources credit researchers at the UK's Government Communications Headquarters (GCHQ) with having invented PKC first. An interesting question, maybe, but who really knows? Because of the nature of the work, GCHQ kept the original memos classified. In 1997, however, GCHQ changed its posture when it realized that there was nothing to gain by continued silence.

Documents show that a GCHQ mathematician named James Ellis started research into the key distribution problem in 1969 and that by 1975, James Ellis, Clifford Cocks, and Malcolm Williamson had worked out all of the fundamental details of PKC, yet couldn't talk about their work. They were, of course, barred from challenging the RSA patent! Hash functions, also called message digests and one-way encryption, are algorithms that, in essence, use no key (Figure 1C).

Instead, a fixed-length hash value is computed based upon the plaintext that makes it impossible for either the contents or length of the plaintext to be recovered. Hash algorithms are typically used to provide a digital fingerprint of a file's contents, often used to ensure that the file has not been altered by an intruder or virus.

Hash functions are also commonly employed by many operating systems to encrypt passwords. Hash functions, then, provide a mechanism to ensure the integrity of a file. This is an important distinction.
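The fingerprint property is easy to see with Python's hashlib; changing a single byte of the input yields a completely different digest:

    import hashlib

    data = b"The quick brown fox jumps over the lazy dog"
    print(hashlib.sha256(data).hexdigest())
    print(hashlib.sha256(data + b".").hexdigest())   # one extra byte, unrelated-looking hash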

Suppose that you want to crack someone's password, where the hash of the password is stored on the server. Indeed, all you then need is a string that produces the correct hash and you're in! However, you cannot prove that you have discovered the user's password, only a "duplicate key." Message Digest (MD) algorithms: A series of byte-oriented algorithms that produce a 128-bit hash value from an arbitrary-length message.

MD2: Described in an RFC and designed for systems with limited memory, such as smart cards. MD2 has been relegated to historical status. MD4: Also described in an RFC; developed by Rivest, similar to MD2 but designed specifically for fast processing in software.

MD4 has been relegated to historical status. MD5: Also developed by Rivest after potential weaknesses were reported in MD4; this scheme is similar to MD4 but is slower because more manipulation is made to the original data.

MD5 has been implemented in a large number of products, although several weaknesses in the algorithm were demonstrated by German cryptographer Hans Dobbertin in "Cryptanalysis of MD5 Compress." In 2012, NIST announced that, after reviewing 64 submissions, the winner of the SHA-3 competition was Keccak (pronounced "catch-ack"), a family of hash algorithms based on sponge functions.

The NIST version can support hash output sizes of 224, 256, 384, and 512 bits. HAVAL: Designed by Y. Zheng, J. Pieprzyk, and J. Seberry, a hash algorithm with many levels of security. HAVAL can create hash values that are 128, 160, 192, 224, or 256 bits in length. Whirlpool: Designed by V. Rijmen (co-inventor of Rijndael) and P. Barreto. Whirlpool operates on messages less than 2^256 bits in length and produces a message digest of 512 bits. The design of this hash function is very different than that of MD5 and SHA-1, making it immune to the same attacks as on those hashes.

A root hash is used on peer-to-peer file transfer networks, where a file is broken into chunks; each chunk has its own MD4 hash associated with it and the server maintains a file that contains the hash list of all of the chunks. The root hash is the hash of the hash list file. A digression on hash collisions. Hash functions are sometimes misunderstood and some sources claim that no two files can have the same hash value.

This is, in theory if not in fact, incorrect. Consider a hash function that provides a 128-bit hash value. There are, then, 2^128 possible hash values, but there is an essentially unlimited number of possible input messages, so many different inputs must map to the same hash value. Now, while even this is theoretically correct, it is not true in practice because hash algorithms are designed to work with a limited message size, as mentioned above.

Nevertheless, hopefully you get my point.


The difficulty is not necessarily in finding two files with the same hash, but in finding a second file that has the same hash value as a given first file.

Consider this example: a human head has, at most, a few hundred thousand hairs. Since there are more than 7 billion people on earth, we know that there are a lot of people with the same number of hairs on their head. Finding two people with the same number of hairs, then, would be relatively simple.

The harder problem is choosing one person say, you, the reader and then finding another person who has the same number of hairs on their head as you have on yours.
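The same counting argument applies to hash values; the rough (birthday-bound) arithmetic for an n-bit hash looks like this, where the figures are order-of-magnitude estimates rather than exact attack costs:

    n = 128                             # hash output size in bits
    second_preimage_work = 2 ** n       # ~2^128 trials to match one *given* hash
    collision_work = 2 ** (n // 2)      # ~2^64 trials to find *some* colliding pair
    print(f"second preimage: ~{second_preimage_work:.2e} trials")
    print(f"any collision:   ~{collision_work:.2e} trials")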

The hair-counting example is somewhat similar to the Birthday Problem. Alas, researchers have since found that practical collision attacks can be launched on MD5, SHA-1, and other hash algorithms. Readers interested in this problem should read the following: For historical purposes, take a look at the situation with hash collisions, circa 2005, in the RFC on the topic. In October 2015, the SHA-1 Freestart Collision was announced; see a report by Bruce Schneier and the developers of the attack, as well as the paper above by Stevens et al.

See also the paper by Stevens et al. (M. Stevens, A. Lenstra, and B. de Weger). Finally, note that certain extensions of hash functions are used for a variety of information security and digital forensics applications. So, why are there so many different types of cryptographic schemes? Why can't we do everything we need with just one? The answer is that each scheme is optimized for some specific cryptographic application(s). Hash functions, for example, are well-suited for ensuring data integrity because any change made to the contents of a message will result in the receiver calculating a different hash value than the one placed in the transmission by the sender.

Since it is highly unlikely that two different messages will yield the same hash value, data integrity is ensured to a high degree of confidence. Secret key cryptography, on the other hand, is ideally suited to encrypting messages, thus providing privacy and confidentiality. The sender can generate a session key on a per-message basis to encrypt the message; the receiver, of course, needs the same session key in order to decrypt the message.

Key exchange, of course, is a key application of public key cryptography (no pun intended). Asymmetric schemes can also be used for non-repudiation and user authentication; if the receiver can obtain the session key encrypted with the sender's private key, then only this sender could have sent the message. Public key cryptography could, theoretically, also be used to encrypt messages, although this is rarely done because secret key cryptography values can generally be computed about 1,000 times faster than public key cryptography values.

Figure 4 puts all of this together and shows how a hybrid cryptographic scheme combines all of these functions to form a secure transmission comprising a digital signature and digital envelope. In this example, the sender of the message is Alice and the receiver is Bob.

A digital envelope comprises an encrypted message and an encrypted session key. Alice uses secret key cryptography to encrypt her message using the session key , which she generates at random with each session.

Alice then encrypts the session key using Bob's public key. The encrypted message and encrypted session key together form the digital envelope. Upon receipt, Bob recovers the session secret key using his private key and then decrypts the encrypted message. The digital signature is formed in two steps. First, Alice computes the hash value of her message; next, she encrypts the hash value with her private key.

Upon receipt of the digital signature, Bob recovers the hash value calculated by Alice by decrypting the digital signature with Alice's public key.

Bob can then apply the hash function to Alice's original message, which he has already decrypted (see previous paragraph). If the resultant hash value is not the same as the value supplied by Alice, then Bob knows that the message has been altered; if the hash values are the same, Bob should believe that the message he received is identical to the one that Alice sent. This scheme also provides nonrepudiation since it proves that Alice sent the message; if the hash value recovered by Bob using Alice's public key proves that the message has not been altered, then only Alice could have created the digital signature.
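The sign-and-verify flow can be sketched with the toy RSA numbers used earlier (n = 3233, e = 17, d = 2753); a real signature scheme would also apply padding such as PSS:

    import hashlib

    n, e, d = 3233, 17, 2753                      # Alice's toy RSA key pair

    def toy_hash(msg: bytes) -> int:
        # Reduce a real hash modulo n so it fits the toy key size.
        return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

    msg = b"from Alice to Bob"
    signature = pow(toy_hash(msg), d, n)          # Alice "encrypts" the hash with her private key

    recovered = pow(signature, e, n)              # Bob "decrypts" with Alice's public key
    assert recovered == toy_hash(msg)             # match -> unaltered and signed by Alice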

Bob also has proof that he is the intended receiver; if he can correctly decrypt the message, then he must have correctly decrypted the session key meaning that his is the correct private key. This diagram purposely suggests a cryptosystem where the session key is used for just a single session. Even if this session key is somehow broken, only this session will be compromised; the session key for the next session is not based upon the key for this session, just as this session's key was not dependent on the key from the previous session.

This is known as Perfect Forward Secrecy; you might lose one session key due to a compromise but you won't lose all of them. In a 1998 article in the industry literature, a writer made the claim that 56-bit keys did not provide as adequate protection for DES at that time as they did in 1975 because computers were 1,000 times faster in 1998 than in 1975. Therefore, the writer went on, we needed 56,000-bit keys in 1998 instead of 56-bit keys to provide adequate protection.

The conclusion was then drawn that because 56,000-bit keys are infeasible (true), we should accept the fact that we have to live with weak cryptography (false!). The major error here is that the writer did not take into account that the number of possible key values doubles whenever a single bit is added to the key length; thus, a 57-bit key has twice as many values as a 56-bit key because 2^57 is two times 2^56. In fact, a 66-bit key would have 1,024 times more values than a 56-bit key.
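The arithmetic is easy to check directly:

    assert 2 ** 57 == 2 * 2 ** 56              # one extra bit doubles the key space
    assert 2 ** 66 == 1024 * 2 ** 56           # ten extra bits multiply it by 1,024
    print(f"56-bit key space: {2 ** 56:.2e} keys")
    print(f"128-bit key space: {2 ** 128:.2e} keys")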

But this does bring up the question, "What is the significance of key length as it affects the level of protection?" In cryptography, size does matter. The larger the key, the harder it is to crack a block of encrypted data. The reason that large keys offer more protection is almost obvious; computers have made it easier to attack ciphertext by using brute force methods rather than by attacking the mathematics (which are generally well-known anyway).

With a brute force attack, the attacker merely generates every possible key and applies it to the ciphertext. Any resulting plaintext that makes sense offers a candidate for a legitimate key. Until the mids or so, brute force attacks were beyond the capabilities of computers that were within the budget of the attacker community.
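A brute-force search against the toy one-byte XOR cipher from the earlier sketch shows the idea; with only 256 possible keys, exhaustive search is instantaneous, which is exactly why real key spaces must be astronomically larger:

    ciphertext = bytes(b ^ 0x5A for b in b"ATTACK AT DAWN")

    for k in range(256):                              # try every possible key
        guess = bytes(b ^ k for b in ciphertext)
        if guess.isupper() and b" " in guess:         # crude "makes sense" test
            print(k, guess)                           # candidate keys and plaintexts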

By that time, however, significant compute power was typically available and accessible. General-purpose computers such as PCs were already being used for brute force attacks. Distributed attacks, harnessing the power of up to tens of thousands of powerful CPUs, are now commonly employed to try to brute-force crypto keys. This information was not merely academic; one of the basic tenets of any security system is to have an idea of what you are protecting and from whom you are protecting it!

The table clearly shows that a 40-bit key was essentially worthless against even the most unsophisticated attacker. On the other hand, 56-bit keys were fairly strong unless you might be subject to some pretty serious corporate or government espionage.

But note that even 56-bit keys were clearly on the decline in their value and that the times in the table were worst cases. So, how big is big enough? DES, invented in 1975, was still in use at the turn of the century, nearly 25 years later. If we take that to be a design criterion (i.e., a 20-plus-year lifetime), key sizes have to stay ahead of the steady growth in attackers' computing power. The DES proposal suggested 56-bit keys; by 1995, a 70-bit key would have been required to offer equal protection, and an 85-bit key would be necessary by 2015. A 256- or 512-bit SKC key will probably suffice for some time because that length keeps us ahead of the brute force capabilities of the attackers.

Note that while a large key is good, a huge key may not always be better; for example, expanding PKC keys beyond the current 2048- or 4096-bit lengths doesn't add any necessary protection at this time.

Weaknesses in cryptosystems are largely based upon key management rather than weak keys. (See the report on minimal key lengths by M. Blaze, W. Diffie, R. Rivest, B. Schneier, T. Shimomura, E. Thompson, and M. Wiener.)

The most effective large-number factoring methods today use a mathematical Number Field Sieve to find a certain number of relationships and then use a matrix operation to solve a linear equation to produce the two prime factors. The sieve step actually involves a large number of operations that can be performed in parallel; solving the linear equation, however, requires a supercomputer.

Some years ago, Shamir (of RSA fame) described a new machine that could increase factorization speed by several orders of magnitude. There still appear to be many engineering details that have to be worked out before such a machine could be built. Furthermore, the hardware improves the sieve step only; the matrix operation is not optimized at all by this design, and the complexity of this step grows rapidly with key length, both in terms of processing time and memory requirements.

Nevertheless, this plan conceptually puts 512-bit keys within reach of being factored. It is also interesting to note that while cryptography is good and strong cryptography is better, long keys may disrupt the nature of the randomness of data files. Shamir and van Someren ("Playing hide and seek with stored keys") have noted that a new generation of viruses can be written that will find files encrypted with long keys, making them easier to find by intruders and, therefore, more prone to attack.

Finally, a word about U.S. export controls on cryptography. Until the mid-1990s, export outside of North America of cryptographic products using keys greater than 40 bits in length was prohibited, which made those products essentially worthless in the marketplace, particularly for electronic commerce; today, crypto products are widely available on the Internet without restriction. The U.S. Department of Commerce Bureau of Industry and Security maintains an Encryption FAQ web page with more information about the current state of encryption registration.

Without meaning to editorialize too much in this tutorial, a bit of historical context might be helpful. In the mid-1990s, the U.S. Department of Commerce still classified cryptography as a munition and limited the export of any products that contained crypto. For that reason, browsers of that era, such as Internet Explorer and Netscape, had a domestic version with 128-bit encryption downloadable only in the U.S.

Many cryptographers felt that the export limitations should be lifted because they only applied to U.S. products. Those restrictions were lifted in the late 1990s, but there is still a prevailing attitude, apparently, about U.S. crypto products.

On a related topic, public key crypto schemes can be used for several purposes, including key exchange, digital signatures, authentication, and more. In those PKC systems used for secret key exchange, the length of the secret keys exchanged via that system has to have at least the same level of attack resistance.

Secure use of cryptography requires trust. While secret key cryptography can ensure message confidentiality and hash codes can ensure integrity, none of this works without trust.

PKC solved the secret distribution problem, but how does Alice really know that Bob is who he says he is? Just because Bob has a public and private key, and purports to be "Bob," how does Alice know that a malicious person Mallory is not pretending to be Bob? There are a number of trust models employed by various cryptographic schemes.


While both cipher families perform well in both hardware and software, Simon has been optimized for high performance on hardware devices and Speck for performance in software. The initial permutation is performed on plain text. TDES is a strongest encryption algorithm.

The disadvantage of 4. Rijndael cipher. AES is a symmetric-key algorithm, meaning 6. The result of this process produces bit cipher text. Each of the 16 rounds, in turn, consists of the data.

The number of internal rounds of the cipher is a function broad level steps and shown in Figure 3. The number of rounds for bit key is Feistel networks do not encrypt an entire block per iteration, on the other hand, AES encrypts all bits in one iteration.

This is one reason why it has a comparably small number of rounds. In this standard the encryption Figure 6: It is a known fact that 3DES is slower than other block round involves four steps: RC2 is vulnerable to a related-key encryption is fast and flexible. It can be implemented attack using chosen plaintexts [7][14].

RC4 D. Blowfish RC4 is a stream cipher, symmetric key encryption algorithm. Blowfish is one of the most common public domain The same algorithm is used for both encryption and encryption algorithms. It contains two parts Subkey decryption. The data stream is simply XORed with the series Generation: This process converts the key upto bits long of generated keys. The key stream does not depend on to subkeys to totaling bits and Data Encryption: This plaintext used at all.

A variable length key from 1 to bit is process involves the iteration of a simple function 16 times. Vernam stream cipher is Each round contains a key dependent permutation and key- the most widely used stream cipher based on a variable key- and data dependent substitution. Blowfish suits the size. It is popular due to its simplicity.

It is often used in file applications where the key remain constant for a long time encryption products and secure communications, such as e. It was also used by many other email encryption products. The cipher can E. Two fish be expected to run very quickly in software. It was considered It is a symmetric key block cipher and was one of the five secure until it was vulnerable to the BEAST attack [14].

Two fish is related to the I. RC5 earlier block cipher Blowfish. Two fish's distinctive features RC5 is a symmetric-key block cipher notable for its simplicity. One half of an n-bit key is based on RC5. Unlike many schemes, RC5 has a variable used as the actual encryption key and the other half of the n-bit block size 32, 64 or bits , key size 0 to bits and key is used to modify the encryption algorithm key-dependent number of rounds 0 to The original suggested choices of S-boxes.

As a A key feature of RC5 is the use of data-dependent rotations; result, the Two fish algorithm is free for anyone to use without one of the goals of RC5 was to prompt the study and any restrictions whatsoever. It is one of a few ciphers included evaluation of such operations as a cryptographic primitive. The general structure of the algorithm available longer [6]. The encryption and decryption routines can be specified in a few lines of code. The key F.The IV is placed in the first bytes of the encrypted file and is appended to the user-supplied key which, in turn, can only be up to bytes in length.
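For illustration only, here is a compact RC4-style keystream generator in Python; RC4 is shown because it is simple, not because it should be used (it is no longer considered secure for new designs):

    def rc4_keystream(key: bytes):
        S = list(range(256))
        j = 0
        for i in range(256):                          # key-scheduling algorithm (KSA)
            j = (j + S[i] + key[i % len(key)]) % 256
            S[i], S[j] = S[j], S[i]
        i = j = 0
        while True:                                   # pseudo-random generation (PRGA)
            i = (i + 1) % 256
            j = (j + S[i]) % 256
            S[i], S[j] = S[j], S[i]
            yield S[(S[i] + S[j]) % 256]

    def rc4_crypt(key: bytes, data: bytes) -> bytes:
        return bytes(b ^ k for b, k in zip(data, rc4_keystream(key)))

    msg = b"Attack at dawn"
    ct = rc4_crypt(b"Secret", msg)
    assert rc4_crypt(b"Secret", ct) == msg            # XOR with the same keystream decrypts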
