## Wednesday, December 5, 2012

### New Information Theory Post and Mathematica 9

Mathematica 9 has come out and has some amazing new capabilities. I have produced a new information theory post using some of the new functions. This is at http://www.madscitech.org/information/information.html

## Sunday, November 25, 2012

### For Those Wanting to Study on Their Own

I have a book on Lulu called "Self-Instruction and Teaching: Science Education for the New Millennium". It is all about how to learn effectively and how to teach others. You can find it here: http://www.lulu.com/spotlight/george1053

Labels:
Education,
George Hrabovsky,
Science Teaching

## Wednesday, November 21, 2012

### Starting a new Writing Project

I will be working on a textbook for University Physics, called University Physics with Modern Physics. I know, terribly original title. This book will be freely available as a series of downloads from the web site www.madscitech.org/upmp starting today. I envision this as a collection of Mathematica-based CDFs (Computable Document Format files) and PDFs. I will also be able to put together custom textbooks to suit the needs of specific classes. These will be available at Lulu (www.lulu.com) and through the web site above. At this stage I am working on the structure of the contents. In general, here are the contents:

- Introduction to Physics
- Mathematics
- Mathematical Physics
- Theoretical Physics
- Computational Physics
- Experimental Physics
- Classical Physics
- Modern Physics
- Using Calculus to Model Motion
- Introduction to Mathematica
- Error Analysis
- Newtonian Mechanics
- Thermodynamics and Energy
- Fields and Waves
- Problems with Classical Physics
- Einstein's Relativity
- Quantum Physics
- Applications of Relativity and Quantum Physics
- Differential Equations

### The Web Page for The Theoretical Minimum is Up!

The web page for The Theoretical Minimum is up and you can see it at: www.madscitech.org/tm.

## Thursday, November 8, 2012

### Book Update

The book, *The Theoretical Minimum*, will be published by Basic Books at the end of January 2013. There will be a companion web site at www.madscitech.org/tm; it is not yet up! Don't go there and then tell me there is no link!
Labels:
Leonard Susskind,
Madison Area Science and Technology,
The Theoretical Minimum,
Theoretical Physics

## Monday, October 15, 2012

### New Information Theory Site on MAST

I have formalized my notes on information theory and placed them on the MAST website. You can find them at http://www.madscitech.org/information/information.html

## Friday, September 28, 2012

### More Information on the Book

I just finished working over the proofs for the publisher. It is now in their hands again. The next step is the galleys, and then back to the publisher again. The book is scheduled to be out in January 2013.

## Thursday, June 28, 2012

### The Nature of Information V

In this installment I demonstrate how to determine some of the quantities we have been looking into using Mathematica 8. You can find it at this website:

http://www.madscitech.org/csg/it.html

under Information Theory V.


## Sunday, June 17, 2012

### The Nature of Information IV

In this post I examine what coding is and introduce Shannon's First Coding Theorem and the Shannon-McMillan Theorem. See Information Theory IV.

## Wednesday, June 13, 2012

### The Nature of Information III

In this post I explore the probability of finding strings and establish the principle that the chance of finding a given string is independent of the particular elements of the string. This result is called the *Asymptotic Equipartition Property*. For details go to The Nature of Information III.

## Friday, June 1, 2012

### New place for formatted information theory.

Due to the difficulty in placing the mathematical symbols into the blog, I will encapsulate the post and provide a link to the formatted version.

In general these will be located at: http://www.madscitech.org/csg/it.html.


## Tuesday, May 22, 2012

### The Nature of Information II

George Hrabovsky

MAST

### Introduction

The previous article was concerned primarily with a sequence of mutually exclusive events. What happens when more than one such sequence, or scheme, is combined?

### Combining Schemes—Mutually Independent Sets

Let’s say that we have two finite schemes,

A = (A_1, A_2, ..., A_n; p_1, p_2, ..., p_n)

and

B = (B_1, B_2, ..., B_m; q_1, q_2, ..., q_m).

Say that the probability of the joint occurrence of events A_i and B_j is p_i q_j. This property is called a mutually independent probability. In fact, the set of events A_i B_j forms another finite scheme, AB. The entropy for such a scheme is

H(AB) = H(A) + H(B).

### Combining Schemes—Mutually Dependent Sets

So what if A and B are not mutually independent? Here we say that the probability that event B_j of the scheme B occurs, assuming that event A_k occurs from scheme A, is q_k(j). This gives us the scheme

(B_1, B_2, ..., B_m; q_k(1), q_k(2), ..., q_k(m)),

with entropy

H_k(B) = -(q_k(1) log2 q_k(1) + ... + q_k(m) log2 q_k(m)).

It turns out that

H_A(B) = p_1 H_1(B) + p_2 H_2(B) + ... + p_n H_n(B)

is the expectation of the random variable H_k(B) in the scheme A. It is also written H(B | A).

### What Does It All Mean?

The amount of information delivered by a scheme never decreases unless another scheme is realized beforehand: once scheme A has been realized, the expected entropy of B can only be less than or equal to H(B).
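The additivity rule for combined schemes is easy to check numerically. Below is a minimal Python sketch (my own, not from the original post; the joint probabilities are made up for illustration) that computes the joint, marginal, and conditional entropies of two dependent schemes and verifies that the joint entropy equals the entropy of A plus the expected entropy of B given A:

```python
import math

def H(probs):
    """Shannon entropy, in bits, of a list of probabilities."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A made-up joint scheme: rows are events A_k of scheme A,
# columns are events B_j of scheme B; entries are joint probabilities.
joint = [[0.3, 0.1],
         [0.2, 0.4]]

pA = [sum(row) for row in joint]                      # marginal scheme A
H_AB = H([p for row in joint for p in row])           # entropy of the joint scheme
# Expectation over A of the entropy of B given that A_k occurred
H_B_given_A = sum(pk * H([p / pk for p in row]) for pk, row in zip(pA, joint))

print(abs(H_AB - (H(pA) + H_B_given_A)) < 1e-12)      # True: the additivity rule holds
```

The same functions reproduce the independent case: if every joint entry factors as p_i q_j, the conditional term reduces to the plain entropy of B.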

## Friday, May 18, 2012

### The Nature of Information

George Hrabovsky

MAST

### Introduction

Information can be thought of as what we do not already know. We acquire information when we learn, or perhaps the act of learning is the acquisition of information. In either case we can consider information in the abstract to be a physical quantity, and it can be measured.

### Bits

How do we measure information? Like any measurement of a physical quantity, we choose a fundamental unit. Then we say that something has some number of those units, a numerical result. Something can be five meters long, or 4.3 pounds of weight, or 6.1 kilograms of mass, and so on. For information we use the bit as the fundamental unit. Thus we can say that some information is composed of a number of bits.
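As a quick illustration (my own example, not from the original post), the number of bits needed to single out one of N equally likely possibilities is log2 N:

```python
import math

# Picking one card from a shuffled 52-card deck takes log2(52) bits of
# information: the number of yes/no questions needed to pin the card down.
outcomes = 52
bits = math.log2(outcomes)
print(round(bits, 2))   # 5.7
print(math.ceil(bits))  # 6: six yes/no questions always suffice
```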

### Characterizing Information

We will use symbols to represent bits of information. A set of symbols comprising a single bit will be represented by a_1, two bits by a_1 a_2, and so on, so that a set of n such symbols is the sequence a_1 a_2 ... a_n.

### Schemes

If we also associate a probability that a given symbol will be sent, p_i for symbol a_i, the symbols of such a set must be mutually exclusive, so that the total of all probabilities is 1. We can combine them into a single symbol,

A = (a_1, a_2, ..., a_n; p_1, p_2, ..., p_n).

This is called a finite scheme.

### Entropy

Entropy is, among other things, the measure of the disorder of a system of symbols. In the case of a scheme the entropy is given by

H(A) = -(p_1 log2 p_1 + p_2 log2 p_2 + ... + p_n log2 p_n).

We will discuss these things in more detail later.
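The entropy of a finite scheme is easy to compute. Here is a minimal Python sketch (mine, not part of the original post), using the base-2 logarithm so the result comes out in bits:

```python
import math

def entropy(probs):
    """Shannon entropy H(A) = -sum p_i log2 p_i of a finite scheme, in bits."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0: a fair coin flip carries one bit
print(entropy([0.25] * 4))   # 2.0: four equally likely symbols carry two bits
print(entropy([1.0]))        # 0.0: a certain outcome carries no information
```

Note that entropy is largest when the symbols are equally likely, which is why a fair coin is the natural one-bit scheme.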

## Thursday, May 17, 2012

### A New Inquiry—Information Theory

Diving into Information Theory

I have begun to delve into the mysteries of information theory. I plan to post the results here. I cannot predict where it will lead, but it will be great fun. For now I plan the following:

1) The nature of information.

2) Channel capacity.

3) Gaussian channels.

4) Fisher information.

5) The thermodynamics of information.

6) The statistical mechanics of information.

7) Ergodic sources.

8) Entropy.

9) Detecting information.

10) Measure of information.

11) Encoding.

12) Noiseless coding.

13) Discrete channels.

14) Error-correcting codes.

15) Channels with memory.

16) Continuous channels.

It is possible that I will get to quantum information and black holes.


## Wednesday, March 14, 2012

### New paper published.

I just had a paper published in The Mathematica Journal. You can find it here.

## Sunday, March 4, 2012

### On the nature of theoretical physics.

If you are reading this, you are interested in theoretical physics. But what is theoretical physics? I have come up with several definitions, based on how you approach the subject.


1) **The modeling approach to theoretical physics.** Another name for this would be the phenomenon-centered approach, whose goal is to understand a specific phenomenon by developing either a mathematical or computational model. You begin by choosing a phenomenon to study. Then you choose an approach to representing it: can you represent it as a particle? A field? Some continuous distribution of matter? Then you choose a mathematical formulation. Examples of mathematical formulations are Newtonian mechanics, Maxwell's equations, Lorentz covariance, the Maxwell-Boltzmann distribution, and so on. Such formulations constitute much of the material of most textbooks and courses on physics. You then adapt your approach to the mathematical formulation, thus developing a mathematical representation of your phenomenon. You then use physical, mathematical, and/or computational arguments and methods to make predictions in the form of tables, plots, and/or formulas. By studying these results in different circumstances you can extend your understanding of the phenomenon. This is the most direct method of doing theoretical physics; it is a straight application of mathematical or computational methods. It is certainly the most structured way of doing theoretical physics.

2) **The constructive approach to theoretical physics.** This can be thought of as the method of developing a new formulation of a physical theory. Examples are the Lagrangian formulation of mechanics, the Lagrangian formulation of electrodynamics, the Eulerian formulation of fluid dynamics, the path-integral formulation of quantum mechanics, and so on. You begin by choosing how you represent objects in your developing theory. Then you choose some quantity, or set of quantities, to base your construction on. Then you choose an argument to base your construction on. Are you seeking symmetries? Are you arguing from some conserved quantity? Are you assuming that your quantity is minimized? For example, in the Lagrangian formulation you create a new quantity called the Lagrangian and then work out the consequences when the integral of the Lagrangian, the action, is minimized. This leads to the Euler-Lagrange equations of motion, a new formulation of classical mechanics. This is a much more difficult, but powerful, method: you build the formulation yourself. The difficulty stems from the lack of structural guidelines for creating a new formulation.

3) **The abstract approach to theoretical physics.** This mode is where you take a number of specific cases and generalize their results. For example, knowing that when a derivative is zero the quantity is unchanged, you take the zero derivatives of momentum in many cases and generalize that into the law of conservation of momentum. This sort of activity is very difficult, since there are few guidelines for how to proceed beyond what is already known.

4) **The unification approach to theoretical physics.** This is based on the idea that it would be nice if there were a single theory to govern a wide range of phenomena. There is no real reason to believe that this is true generally, which is one difficulty with its practical application. Another difficulty is that all of our equations are, to one degree or another, approximations of reality. So the fact that equations in different fields look alike is another way of saying that the approximations are similar. Does that mean the phenomena are also similar? Sometimes. Isaac Newton unified gravity at the surface of the Earth and gravity away from the Earth. James Maxwell unified electricity, magnetism, and light. Abdus Salam, Sheldon Glashow, and Steven Weinberg unified electromagnetism and the weak nuclear force. The work of unifying the electroweak theory with the strong interaction is still in progress. Even less success has been had in unifying gravity.

So this, then, is the general nature of theoretical physics.
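As a small illustration of the least-action idea mentioned in the constructive approach, here is a numerical sketch (my own, not from the original post): for a harmonic oscillator, the true trajectory gives a smaller discretized action than a nearby wiggled path with the same endpoints.

```python
import math

def action(path, dt, m=1.0, k=1.0):
    """Discretized action S = sum (kinetic - potential) * dt along a sampled path."""
    S = 0.0
    for i in range(len(path) - 1):
        v = (path[i + 1] - path[i]) / dt          # finite-difference velocity
        S += (0.5 * m * v * v - 0.5 * k * path[i] ** 2) * dt
    return S

n, T = 200, 1.0
dt = T / n
ts = [i * dt for i in range(n + 1)]
true_path = [math.cos(t) for t in ts]             # solves m x'' = -k x for m = k = 1
# perturb the interior of the path while keeping both endpoints fixed
bump = [0.1 * math.sin(math.pi * t / T) for t in ts]
wiggle = [x + b for x, b in zip(true_path, bump)]

print(action(true_path, dt) < action(wiggle, dt))  # True: the true path minimizes S
```

Demanding that every such perturbation increase the action is exactly what leads to the Euler-Lagrange equations.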

## Tuesday, February 14, 2012

### Work of the last two weeks

Hello everyone. I have been working hard on the book with Leonard Susskind over the last two weeks. We are about two weeks away from presenting the final manuscript to the publishers at Basic Books. It is scheduled for release in January 2013.

Last week I published, on the MAST web site, a paper on the Oppenheimer-Volkoff equations. These can be found here.

During the Saturday meeting we had a lot of fun writing a program to calculate the arc length of a curve and then tested lots of different curves. This work can be found here.


## Thursday, February 2, 2012

### New Material from MAST

New Post on the MAST Web Site

We have several Mathematica-based documents on our web site.

http://www.madscitech.org/mathematica/consulting/consulting.html

