Entropy is a measure of the randomness in a system. The more random the system, the less predictable it is and the higher its entropy.
Entropy and cryptanalysis
Entropy is useful in a variety of different fields, including cryptography. Measuring the randomness of data provides a useful way to differentiate between strong encryption and weak or nonexistent encryption.
One way to determine whether an encryption algorithm is effective is to check whether the ciphertexts it produces can be distinguished from random binary strings. A fully random binary string has maximal entropy, meaning that it exposes no information.
This is desirable in an encryption algorithm because it means that the ciphertext leaks no information about the corresponding plaintext. Therefore, calculating the entropy of data can help to differentiate between ciphertext created by a strong encryption algorithm and data protected by weak or broken encryption.
Entropy can be calculated in a number of different ways. In cryptography, the most commonly used type of entropy is Shannon entropy, which was introduced by Claude Shannon, the father of information theory.
Shannon entropy can be calculated based upon the observed probability that a particular symbol (for example, a given byte value) appears in the data. For a data source with symbol probabilities p(x), the Shannon entropy is H = -Σ p(x) log₂ p(x), summed over all observed symbols x. For byte-oriented data, H ranges from 0 bits per byte (completely predictable) up to 8 bits per byte (indistinguishable from uniform randomness).
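As a minimal sketch of this calculation, the following Python function estimates the Shannon entropy of a byte string from its observed byte frequencies. The function name and the sample inputs are illustrative, not taken from any particular library; the comparison between repetitive text and output from `os.urandom` (used here as a stand-in for strong ciphertext) shows the kind of gap an analyst would look for:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Estimate Shannon entropy in bits per byte: H = -sum(p * log2(p))."""
    if not data:
        return 0.0
    total = len(data)
    counts = Counter(data)  # frequency of each byte value observed
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Repetitive plaintext-like data: entropy well below 8 bits per byte.
text = b"the quick brown fox jumps over the lazy dog " * 100

# Uniformly random bytes, standing in for strong ciphertext:
# entropy close to the 8 bits-per-byte maximum.
random_bytes = os.urandom(4096)

print(f"text:   {shannon_entropy(text):.3f} bits/byte")
print(f"random: {shannon_entropy(random_bytes):.3f} bits/byte")
```

In practice, a large entropy gap like this is only a heuristic: a short sample of even perfectly random data can show reduced entropy simply because not every byte value has had a chance to appear.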