The Nature of Information II
George Hrabovsky
MAST
Introduction
The previous article was concerned primarily with sequences of mutually exclusive events. What happens when more than one such sequence, or scheme, is combined?
Combining Schemes—Mutually Independent Sets
Let's say that we have two finite schemes,

$$A = \begin{pmatrix} A_1 & A_2 & \cdots & A_n \\ p_1 & p_2 & \cdots & p_n \end{pmatrix}$$

and

$$B = \begin{pmatrix} B_1 & B_2 & \cdots & B_m \\ q_1 & q_2 & \cdots & q_m \end{pmatrix}.$$

Say that the probability of the joint occurrence of events $A_i$ and $B_j$ is

$$p_i q_j.$$

This property is called mutual independence. In fact, the set of events $A_i B_j$ forms another finite scheme, which we write $AB$. The entropy for such a scheme is

$$H(AB) = H(A) + H(B).$$
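To make the additivity concrete, here is a minimal numerical sketch in Python (using NumPy; the scheme probabilities are invented purely for illustration, and entropy is measured in bits). It builds the joint scheme from two independent schemes and checks that H(AB) = H(A) + H(B).

```python
import numpy as np

def entropy(probs):
    """Shannon entropy -sum p log2(p), skipping zero-probability events."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Two hypothetical independent finite schemes A and B.
p = np.array([0.5, 0.3, 0.2])   # probabilities p_i of scheme A
q = np.array([0.6, 0.4])        # probabilities q_j of scheme B

# Joint scheme AB: the event A_i B_j has probability p_i * q_j.
joint = np.outer(p, q)

print(entropy(p) + entropy(q))   # H(A) + H(B)
print(entropy(joint.ravel()))    # H(AB) -- the same number for independent schemes
```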
Combining Schemes—Mutually Dependent Sets
So what if A and B are not mutually independent? Here we say that the probability that event $B_j$ of the scheme $B$ occurs, assuming that event $A_i$ occurs from scheme $A$, is

$$q_{ij}.$$
This gives us the scheme $AB$, whose events $A_i B_j$ have probabilities $p_i q_{ij}$, with entropy

$$H(AB) = H(A) + \sum_{i} p_i H_i(B), \qquad \text{where} \quad H_i(B) = -\sum_{j} q_{ij} \log q_{ij}.$$
It turns out that $H_i(B)$ is a random variable in the scheme $A$; its expectation,

$$\sum_{i} p_i H_i(B),$$

is called the expectation of $H(B)$ in the scheme $A$. It is also written $H_A(B)$, so that

$$H(AB) = H(A) + H_A(B).$$
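The same kind of numerical check works for dependent schemes. The following sketch (again Python with NumPy, and again with made-up probabilities) stores the conditional probabilities in a table q[i, j] and verifies that H(AB) = H(A) + H_A(B).

```python
import numpy as np

def entropy(probs):
    """Shannon entropy -sum p log2(p), skipping zero-probability events."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical dependent schemes: p_i for A, and a row-stochastic table
# q[i, j] = probability that B_j occurs given that A_i has occurred.
p = np.array([0.5, 0.3, 0.2])
q = np.array([[0.9, 0.1],
              [0.4, 0.6],
              [0.2, 0.8]])

joint = p[:, None] * q                           # probability of the event A_i B_j
H_i_B = np.array([entropy(row) for row in q])    # H_i(B), one value for each A_i
H_A_of_B = np.sum(p * H_i_B)                     # H_A(B), the expectation of H(B) in A

print(entropy(joint.ravel()))    # H(AB)
print(entropy(p) + H_A_of_B)     # H(A) + H_A(B) -- the same number
```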
What Does It All Mean?
The amount of information delivered by a scheme never decreases unless another scheme is realized beforehand. In symbols, $H_A(B) \le H(B)$: on average, knowing the outcome of the scheme $A$ can only reduce the uncertainty remaining in $B$, never increase it.
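As a sanity check of that inequality, here is a small self-contained sketch (the same invented dependent schemes as above) comparing H_A(B) with H(B), where the unconditional probabilities of B are q_j = sum_i p_i q_ij.

```python
import numpy as np

def entropy(probs):
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# The same made-up dependent schemes as in the previous sketch.
p = np.array([0.5, 0.3, 0.2])
q = np.array([[0.9, 0.1],
              [0.4, 0.6],
              [0.2, 0.8]])

H_B = entropy(p @ q)                                   # H(B), from q_j = sum_i p_i q[i, j]
H_A_of_B = np.sum(p * [entropy(row) for row in q])     # H_A(B)

print(H_A_of_B, H_B)        # about 0.67 vs 0.97 bits for these numbers
print(H_A_of_B <= H_B)      # True: realizing A first never adds uncertainty to B
```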