

Information entropy explained

How much information is in the result of a coin toss? That’s an odd question to ask. It doesn’t sound right: you can’t measure information the way you would water in a bucket, right? Well, Claude Shannon begs to differ. In 1948, Shannon published his paper “A Mathematical Theory of Communication,” and in doing so laid the foundation of information theory. The theory became a really big deal, contributing to many fields of science, especially data compression. Information theory states that we can quantify the information contained in a random…
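To make the coin-toss question concrete, here is a minimal Python sketch (my illustration, not code from the article) of Shannon entropy, H(X) = −Σ p(x)·log₂ p(x), which measures information in bits. A fair coin comes out to exactly 1 bit.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information.
print(entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so its outcome carries less information.
print(entropy([0.9, 0.1]))  # ~0.469
```

Note how the biased coin scores lower: the more predictable an outcome is, the less information its result conveys, which is exactly the intuition Shannon’s measure captures.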