
ENTROPY

In this post, I will try to explain the concept of entropy, in particular information entropy, in as simple terms as possible, and discuss how it can be used to quantify uncertainty in the financial markets.

First let us start with a simple example: a flask containing gas at a certain temperature. We can measure the temperature, volume, number of particles (molecules), etc. of the gas (we call these macroscopic variables). The gas is made up of millions of molecules, each moving around with some velocity, hitting each other, hitting the walls of the flask and so on. The macroscopic properties are the result of all this molecular activity. But there is no way of knowing the individual positions and velocities (let us call the position and velocity together a state) of the tiny molecules. This degree of unknownness in the microscopic configuration of the system (in this example, the gas) is what is known as entropy. (Great, if you don't know something, give it a jazzy name.)

Now let us put this flask in a bigger container and isolate this system from the rest of the world. If E1 and E2 are the original energies of the flask and the second container, the total energy E of the system is

E = E1 + E2

Note that the unknownness of the total system has also increased. Entropy, like energy, is additive: the number of possible microscopic configurations of the combined system is the product of the individual counts, and entropy is logarithmic in that count. So if we call the entropy of the total system S and the original individual entropies S1 and S2, then

S = S1 + S2

If one of the two systems is at a higher temperature than the other, then energy flows from the hotter system to the colder one until the macroscopic properties of both systems are the same. We call this the steady or equilibrium state. This state has the highest entropy, and in the absence of external influence the change in entropy is zero. Mind you, the total energy E is still the same.

The entropy discussed above is called thermodynamic entropy. In general, entropy crops up anywhere there is a lack of information. In 1948, Shannon, who was at Bell Labs at that time working on problems connected with communication, introduced the concept of information entropy, described in his famous paper 'A Mathematical Theory of Communication'. A general communication system consists of 1) a message sender, 2) a transmitter, which converts the message into a signal, 3) a channel that carries the signal to 4) a receiver, which decodes the signal back into a message. To keep the load small, the messages are usually compressed or encoded. Without going into the details of communication theory, suffice it to say that when a source sends a message through the system described above, uncertainties arise at various stages, so that the receiver has only partial information about the message. This means there are many possibilities as to what the message could be. Shannon introduced the term information entropy to quantify this uncertainty in the received message. If x1, x2, x3, … are the possible messages at the receiver and p(xi) is the probability that message xi is the intended one, then Shannon defined the information entropy as

         S = -Σ_i p(xi) log p(xi)

Here log can be taken to any base: base 2 gives the entropy in bits, the natural logarithm gives nats, and base 10 gives hartleys. Changing the base only rescales S by a constant.

When there is no other information, all messages are equally likely; this situation has maximum entropy (uncertainty). However, if some information is available, the entropy decreases. For example, if the two possible messages are 'I work here' and 'I worm here', the first has a higher probability of being correct. So any information reduces entropy.
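
As a quick sanity check, here is a minimal Python sketch of the formula above (the specific probability values are made up purely for illustration):

    import math

    def entropy(probs):
        # Shannon entropy in nats: S = -sum_i p(xi) log p(xi)
        return -sum(p * math.log(p) for p in probs if p > 0)

    # Four equally likely messages: maximum uncertainty, S = log(4)
    print(entropy([0.25, 0.25, 0.25, 0.25]))   # ~1.386 nats

    # Extra information makes one message far more likely: entropy drops
    print(entropy([0.85, 0.05, 0.05, 0.05]))   # ~0.588 nats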

Now let us come to the problem at hand: financial markets. We could consider the stock market as a many-particle system, the particles here being the traders, whose actions change the market. If x1, x2, x3, … are the possible changes in the market (which we call returns) and p(x1), p(x2), p(x3), … are the corresponding probabilities (likelihoods), then we can use Shannon entropy to compute the uncertainty in the market. Now, how do we know these probabilities? Here is where we use the information from previous times. Looking at the returns over a certain preceding period, we can compute a histogram (how often a particular return has occurred), which gives us the probabilities. Hence we can compute the Shannon entropy, i.e. the uncertainty, in the market.
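
A minimal sketch of this recipe in Python might look as follows; the bin count, the 500-day window and the synthetic normal returns are all assumptions chosen for illustration, not part of any standard recipe:

    import numpy as np

    def market_entropy(returns, bins=30):
        # Estimate Shannon entropy from the histogram of past returns
        counts, _ = np.histogram(returns, bins=bins)
        p = counts / counts.sum()          # empirical probabilities p(xi)
        p = p[p > 0]                       # empty bins contribute nothing
        return -np.sum(p * np.log(p))      # entropy in nats

    # Synthetic daily returns standing in for real price data
    rng = np.random.default_rng(seed=42)
    returns = rng.normal(loc=0.0005, scale=0.02, size=500)
    print(f"Estimated market entropy: {market_entropy(returns):.3f} nats")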

Normally the standard deviation is used to measure risk in financial systems. However, several studies indicate that entropy may be more appropriate for quantifying market uncertainty.
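
To see how the two measures can disagree, here is a contrived comparison: two return distributions tuned to have the same standard deviation but different entropies (the numbers are invented purely to make the point):

    import numpy as np

    def entropy(probs):
        p = np.asarray(probs)
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def std_dev(values, probs):
        values, probs = np.asarray(values), np.asarray(probs)
        mean = np.sum(values * probs)
        return np.sqrt(np.sum(probs * (values - mean) ** 2))

    # A: a coin flip between -1% and +1%
    # B: usually flat, with occasional larger moves, tuned to the same std
    a_vals, a_probs = [-0.01, 0.01], [0.5, 0.5]
    big = 0.01 / np.sqrt(0.2)
    b_vals, b_probs = [-big, 0.0, big], [0.1, 0.8, 0.1]

    print(std_dev(a_vals, a_probs), std_dev(b_vals, b_probs))  # both 0.01
    print(entropy(a_probs), entropy(b_probs))  # ~0.693 vs ~0.639 nats

Distribution B is more predictable (it usually does nothing), and the entropy registers this while the standard deviation does not.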

The entropy discussed above is suitable for a system where there are no long-range interactions between the microscopic constituents of the system; they essentially follow a random walk (like the gas in a flask). Such a model has long been used to analyse stock markets (see the Efficient Market Hypothesis). There are questions as to whether the stock market can really be considered random, since crashes and bubbles, which are the result of interactions between agents, cannot happen in such a system. We will discuss a different entropy, called Tsallis entropy, which characterises the uncertainty when long-range interactions (other than just collisions) are present and may be more appropriate for characterising risk in the financial markets. See A New Risk Measure.
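
As a preview, Tsallis entropy is defined as S_q = (1 - Σ_i p(xi)^q) / (q - 1), which reduces to the Shannon formula as q → 1. A small sketch, where the distribution and the values of the entropic index q are arbitrary choices for illustration:

    import numpy as np

    def tsallis_entropy(probs, q):
        # S_q = (1 - sum_i p(xi)**q) / (q - 1); Shannon entropy as q -> 1
        p = np.asarray(probs)
        p = p[p > 0]
        if np.isclose(q, 1.0):
            return -np.sum(p * np.log(p))  # Shannon limit
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    probs = [0.7, 0.2, 0.1]
    for q in (0.5, 1.0, 1.5, 2.0):
        print(f"q = {q}: S_q = {tsallis_entropy(probs, q):.3f}")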
