Tuesday, May 22, 2012

The Nature of Information II

George Hrabovsky
MAST

Introduction

The previous article was concerned primarily with sequences of events that were mutually exclusive. What happens when more than one such sequence, or scheme, is combined?

Combining Schemes—Mutually Independent Sets

Let’s say that we have two finite schemes,
$$A = \begin{pmatrix} A_1 & A_2 & \cdots & A_n \\ p_1 & p_2 & \cdots & p_n \end{pmatrix}$$
and
$$B = \begin{pmatrix} B_1 & B_2 & \cdots & B_m \\ q_1 & q_2 & \cdots & q_m \end{pmatrix}.$$
Say that the probability of the joint occurrence of events $A_i$ and $B_j$ is
$$P(A_i B_j) = p_i\, q_j.$$
When this holds for every pair, the schemes are called mutually independent. In fact the set of events $A_i B_j$ forms another finite scheme $AB$. The entropy for such a scheme is
$$H(AB) = H(A) + H(B).$$
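As a quick sanity check of this additivity, here is a minimal Python sketch (the two schemes and their probabilities below are invented purely for illustration) that builds the product scheme and compares H(A) + H(B) with H(AB):

from math import log2

def entropy(probs):
    """Entropy of a finite scheme, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Two made-up independent schemes A and B.
p = [0.5, 0.3, 0.2]   # probabilities of A_1, A_2, A_3
q = [0.6, 0.4]        # probabilities of B_1, B_2

# The joint scheme AB: every pair (A_i, B_j) with probability p_i * q_j.
joint = [pi * qj for pi in p for qj in q]

print(entropy(p) + entropy(q))   # H(A) + H(B)
print(entropy(joint))            # H(AB) -- the same, up to rounding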

Combining Schemes—Mutually Dependent Sets

So what if A and B are not mutually independent? Here we say that the probability that event $B_j$ of the scheme B occurs, assuming that event $A_i$ occurs from scheme A, is
$$q_{ij} = P(B_j \mid A_i).$$
This gives us the scheme $AB$, whose events $A_i B_j$ have probabilities $p_i\, q_{ij}$, with entropy
$$H(AB) = H(A) + H_A(B), \qquad H_A(B) = \sum_{i} p_i H_i(B), \qquad H_i(B) = -\sum_{j} q_{ij} \log_2 q_{ij}.$$
It turns out that $H_i(B)$ is a random variable in the scheme A, and its expectation, $H_A(B)$, is called the expectation of H(B) in the scheme A. It is also written $E_A[H(B)]$.
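Here is a similarly minimal Python sketch of the dependent case (the scheme A and the conditional probabilities q_ij are made up for illustration), computing the H_i(B), their expectation H_A(B), and checking that H(AB) = H(A) + H_A(B):

from math import log2

def entropy(probs):
    """Entropy of a finite scheme, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A made-up scheme A and conditional probabilities q[i][j] = P(B_j | A_i).
p = [0.7, 0.3]            # probabilities of A_1, A_2
q = [[0.9, 0.1],          # P(B_1 | A_1), P(B_2 | A_1)
     [0.2, 0.8]]          # P(B_1 | A_2), P(B_2 | A_2)

# Conditional entropies H_i(B) and their expectation H_A(B).
H_i = [entropy(row) for row in q]
H_A_B = sum(pi * Hi for pi, Hi in zip(p, H_i))

# The joint scheme AB: events A_i B_j with probabilities p_i * q_ij.
joint = [pi * qij for pi, row in zip(p, q) for qij in row]

print(entropy(p) + H_A_B)   # H(A) + H_A(B)
print(entropy(joint))       # H(AB) -- the same, up to rounding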

What Does It All Mean?

The amount of information delivered by a scheme never decreases unless another scheme is realized beforehand.
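If I am reading the result correctly, this is the standard inequality for the conditional entropy: realizing A beforehand can only reduce the uncertainty of B, and hence the information delivered when B is realized,
$$H_A(B) \le H(B), \qquad \text{so that} \qquad H(AB) = H(A) + H_A(B) \le H(A) + H(B),$$
with equality exactly when the two schemes are mutually independent.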

Friday, May 18, 2012

The Nature of Information

George Hrabovsky
MAST

Introduction

Information can be thought of as what we do not already know. We acquire information when we learn, or perhaps it is that the act of learning is acquiring information. In either case we can consider information in the abstract to be a physical quantity, and it can be measured.

Bits

How do we measure information? Like any measurement of a physical quantity, we choose a fundamental unit. Then we say that something has a quantity of units, a numerical result. Something can be five meters long, or it can weigh 4.3 pounds, or have a mass of 6.1 kilograms, and so on. For information we use the bit as the fundamental unit. Thus we can say that some information is composed of a number of bits.
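For instance, singling out one symbol from n equally likely possibilities takes log_2 n bits. A tiny Python illustration (the values of n are arbitrary examples):

from math import log2

# Bits needed to single out one symbol from n equally likely possibilities.
for n in (2, 8, 26, 256):
    print(n, log2(n))   # 2 -> 1 bit, 8 -> 3 bits, 26 -> ~4.7 bits, 256 -> 8 bits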

Characterizing Information

We will use symbols to represent bits of information. A set of symbols comprising a single bit will be represented by $A_1$, two bits by $A_1, A_2$, and so on, so that a set of n such symbols is the sequence
$$A_1, A_2, \ldots, A_n.$$

Schemes

If we also associate a probability that a given symbol will be sent, $p_1, p_2, \ldots, p_n$, then the symbols of such a set must be mutually exclusive, so that the total of all probabilities is 1,
$$\sum_{i=1}^{n} p_i = 1.$$
We can combine them into a single symbol,
$$A = \begin{pmatrix} A_1 & A_2 & \cdots & A_n \\ p_1 & p_2 & \cdots & p_n \end{pmatrix}.$$
This is called a finite scheme.
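One simple way to sketch a finite scheme in code is as a mapping from symbols to probabilities that sum to 1; here is a minimal Python version (the symbols and probabilities are invented for illustration):

# A finite scheme: mutually exclusive symbols with probabilities summing to 1.
# The symbols and numbers here are made up purely as an example.
scheme = {"A1": 0.5, "A2": 0.25, "A3": 0.25}

assert abs(sum(scheme.values()) - 1.0) < 1e-12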

Entropy

Entropy is, among other things, a measure of the disorder of a system of symbols. In the case of a scheme the entropy is given by
$$H(A) = -\sum_{i=1}^{n} p_i \log_2 p_i.$$
We will discuss these things in more detail later.
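As a first taste, here is a minimal Python version of this formula (using base-2 logarithms, so the answer is in bits), applied to the made-up scheme sketched above:

from math import log2

def entropy(scheme):
    """Entropy of a finite scheme given as {symbol: probability}, in bits."""
    return -sum(p * log2(p) for p in scheme.values() if p > 0)

scheme = {"A1": 0.5, "A2": 0.25, "A3": 0.25}
print(entropy(scheme))   # 1.5 bits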

Thursday, May 17, 2012

A New Inquiry—Information Theory

Diving into Information Theory

I have begun to delve into the mysteries of information theory. I plan to post the results here. I cannot predict where it will lead, but it will be great fun. For now I plan the following:

1) The nature of information.
2) Channel capacity.
3) Gaussian channels.
4) Fisher information.
5) The thermodynamics of information.
6) The statistical mechanics of information.
7) Ergodic sources.
8) Entropy.
9) Detecting information.
10) Measure of information.
11) Encoding.
12) Noiseless coding.
13) Discrete channels.
14) Error-correcting codes.
15) Channels with memory.
16) Continuous channels.

It is possible that I will get to quantum information and black holes.