What's New

+ Fixed issues
+ Minor improvements
+ Added Hamlet text source

App Description

The entropy of an information source is a measure of the uncertainty, or surprise, of the information it transmits.

If our source emits a set of symbols (called the 'source alphabet'), the greatest surprise occurs when all the symbols are independent (uncorrelated) and equiprobable. In this case the entropy of the source is maximal (log₂ N bits per symbol for an N-symbol alphabet, i.e. a normalized entropy of 1) because the receiver cannot predict the next symbol sent by the source.
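As a minimal sketch (not part of the app), Shannon entropy can be computed from symbol frequencies; `entropy_bits` below is a hypothetical helper name:

```python
import math
from collections import Counter

def entropy_bits(text):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Four independent, equiprobable symbols: H = log2(4) = 2 bits/symbol.
print(entropy_bits("abcdabcdabcdabcd"))  # → 2.0
```

Any correlation or imbalance between the symbols lowers this value below log₂ N, which is what makes compression possible.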

Text sources do not have independent or equiprobable symbols. For this reason we can predict some letters or words in an incomplete text, and such sources can be compressed using source-coding and data-compression algorithms.

Huffman coding (David A. Huffman, 1925-1999) is an entropy-encoding algorithm used for lossless data compression, based on assigning short binary codes to high-probability symbols and longer codes to low-probability ones. With this app you can use different text sources to generate Huffman codes and see the reduction in the size of the text source.
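The idea behind the app's Huffman trees can be sketched in a few lines: repeatedly merge the two least-probable subtrees, prefixing their codes with 0 and 1. This is an illustrative sketch, not the app's implementation, and it assumes a text with at least two distinct symbols:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Map each symbol to a binary code; frequent symbols get shorter codes."""
    freq = Counter(text)
    # Heap entries: (weight, tiebreak, {symbol: code-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        # Merge the two lightest subtrees, extending their codes by one bit.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("this is an example of a huffman tree")
# A frequent symbol such as ' ' gets a code no longer than a rare one like 'x'.
```

Encoding the text with these codes uses fewer bits than a fixed 8-bit character encoding, which is the size reduction the app visualizes.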

Enter your own text, press the “Text Source” button, and generate your own Huffman probability tree to study this data-compression algorithm.

iPhone Screenshots


iEntropy screenshot 1 iEntropy screenshot 2 iEntropy screenshot 3 iEntropy screenshot 4

iPad Screenshots


iEntropy screenshot 5 iEntropy screenshot 6 iEntropy screenshot 7 iEntropy screenshot 8 iEntropy screenshot 9

App Changes

  • June 14, 2014 Initial release