Friday, May 18, 2012

The Nature of Information

George Hrabovsky
MAST

Introduction

Information can be thought of as what we do not already know. We acquire information when we learn, or perhaps the act of learning is the acquisition of information. In either case, we can consider information in the abstract to be a physical quantity, and as such it can be measured.

Bits

How do we measure information? As with any measurement of a physical quantity, we begin by choosing a fundamental unit. Then we express the measurement as a number of those units: something can be five meters long, weigh 4.3 pounds, have a mass of 6.1 kilograms, and so on. For information the fundamental unit is the bit. Thus we can say that a piece of information is composed of some number of bits.
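To make the counting of bits concrete, here is a minimal Python sketch (my own illustration, not from the post itself; `bits_to_specify` is a hypothetical helper name): singling out one of $N$ equally likely alternatives requires $\log_2 N$ bits.

```python
import math

def bits_to_specify(n_alternatives):
    """Bits needed to single out one of n equally likely alternatives."""
    return math.log2(n_alternatives)

# A coin flip has two equally likely outcomes, so it carries exactly 1 bit.
print(bits_to_specify(2))   # 1.0

# Naming one card drawn from a 52-card deck takes about 5.7 bits.
print(bits_to_specify(52))  # 5.700439718141092
```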

Characterizing Information

We will use symbols to represent bits of information. A set of symbols comprising a single bit will be represented by $a_1$, two bits by $a_1, a_2$, and so on, so that a set of $n$ such symbols is the sequence

$$a_1, a_2, \ldots, a_n.$$

Schemes

If we also associate a probability that a given symbol will be sent, we have a corresponding set of probabilities

$$p_1, p_2, \ldots, p_n.$$

The symbols of such a set must be mutually exclusive, so that the total of all probabilities is 1,

$$\sum_{k=1}^{n} p_k = 1.$$

We can combine them into a single symbol,

$$A = \begin{pmatrix} a_1 & a_2 & \cdots & a_n \\ p_1 & p_2 & \cdots & p_n \end{pmatrix}.$$

This is called a finite scheme.
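As a minimal sketch (my own Python illustration, not part of the original post; `make_scheme` is a hypothetical name), a finite scheme can be modeled as a pairing of symbols with probabilities that total 1:

```python
def make_scheme(symbols, probabilities):
    """Pair symbols with probabilities, checking that they form a finite scheme."""
    if len(symbols) != len(probabilities):
        raise ValueError("need exactly one probability per symbol")
    if any(p < 0 for p in probabilities):
        raise ValueError("probabilities must be non-negative")
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("probabilities must total 1")
    return dict(zip(symbols, probabilities))

# A fair coin as a finite scheme: two mutually exclusive symbols, each with probability 1/2.
coin = make_scheme(["heads", "tails"], [0.5, 0.5])
print(coin)  # {'heads': 0.5, 'tails': 0.5}
```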

Entropy

Entropy is, among other things, a measure of the disorder of a system of symbols. In the case of a finite scheme the entropy is given by

$$H(p_1, p_2, \ldots, p_n) = -\sum_{k=1}^{n} p_k \log_2 p_k.$$
We will discuss these things in more detail later.
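As a minimal sketch of evaluating the formula above (my own Python, assuming a scheme represented as a symbol-to-probability mapping like the earlier example; `entropy` is an illustrative name):

```python
import math

def entropy(scheme):
    """Entropy in bits of a finite scheme given as a symbol -> probability mapping."""
    return -sum(p * math.log2(p) for p in scheme.values() if p > 0)

# A fair coin has the largest possible entropy for two symbols: exactly 1 bit.
print(entropy({"heads": 0.5, "tails": 0.5}))  # 1.0

# A heavily biased coin is far more predictable, so its entropy is lower.
print(entropy({"heads": 0.9, "tails": 0.1}))  # about 0.469
```

A uniform scheme maximizes the entropy for a given number of symbols, which matches the reading of entropy as disorder: the more evenly spread the probabilities, the less predictable the next symbol.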
