
Tag: Entropy

The story of the student who created the most famous compression algorithm because he tried to avoid an exam

This is the third in a series of articles where I go over the basics of common types of compression. You can find the first article here. When you take pictures, he’s there. When you listen to music, he’s there. When you surf the web, he’s there. This article is dedicated to him. *** Drawing lesson Question: How would a basic drawing program which allows you to draw in just eight colors encode that information into bits? A common way to do this is to predefine a specific bit sequence, of a fixed length, per color. For eight colors we…
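The teaser above stops mid-sentence, but the idea it sets up is easy to sketch: with eight equally likely colors, a fixed-length code needs ceil(log2(8)) = 3 bits per color. The palette and function names below are hypothetical, just to make the scheme concrete:

```python
import math

# Hypothetical eight-color palette for a basic drawing program.
COLORS = ["black", "white", "red", "green", "blue", "yellow", "cyan", "magenta"]

# A fixed-length code for 8 symbols needs ceil(log2(8)) = 3 bits per symbol.
BITS_PER_COLOR = math.ceil(math.log2(len(COLORS)))

# Assign each color its index, written as a zero-padded 3-bit binary string.
CODE = {color: format(i, f"0{BITS_PER_COLOR}b") for i, color in enumerate(COLORS)}

def encode(pixels):
    """Concatenate the fixed-length code of each pixel's color."""
    return "".join(CODE[p] for p in pixels)

def decode(bits):
    """Split the bit string into 3-bit chunks and look each one up."""
    reverse = {v: k for k, v in CODE.items()}
    return [reverse[bits[i:i + BITS_PER_COLOR]]
            for i in range(0, len(bits), BITS_PER_COLOR)]
```

Because every code word has the same length, decoding never needs delimiters; the trade-off, which the series goes on to explore, is that frequent and rare colors cost the same number of bits.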

Information entropy explained

How much information is in the result of a coin toss? That’s an odd question to ask. It doesn’t sound right. You cannot think of information as if it were water in a bucket, right? Well, Claude Shannon begs to differ. In 1948, Shannon published his paper “A Mathematical Theory of Communication,” and by doing so laid the foundation of information theory. This theory became a really big deal, contributing to many fields of science, especially to the field of data compression. Information theory states that we can quantify the information that is contained in a random…
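Shannon's answer to the coin-toss question can be sketched in a few lines. The formula is his entropy, H = -Σ p·log2(p), measured in bits; the function name below is my own choice, not from the article:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin toss carries exactly one bit of information.
fair = entropy([0.5, 0.5])      # 1.0 bit

# A biased coin is more predictable, so each toss carries less information.
biased = entropy([0.9, 0.1])    # about 0.469 bits

# A coin that always lands heads tells you nothing at all.
certain = entropy([1.0])        # 0.0 bits
```

The less surprising an outcome is, the fewer bits it carries, which is exactly the lever compression algorithms pull on.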