A few words about ENTROPY and its applications

  • The entropy of an isolated system never decreases.
  • Entropy represents the diversity of the internal distribution of a system.
  • Entropy is a function of state: its change depends only on the initial and final states of the system.
  • Information theory provides results that are independent of the mechanism.

Problem: The CIA wants to apprehend the leader of a gang of 256 people, so it sends two undercover agents, A and B, to join the gang.
A goes deep into the gang and reaches the inner circle. A's position makes him privy to a lot of information, including the leader's identity. B stays in a petty position so that he can still communicate with the CIA, but even that position gives him some information about the leadership. A knows the leader; B has narrowed the leader down to 2 suspects. A cannot talk freely with B, as that would arouse suspicion, so he does not know the contents of B's list. They must communicate through a very special setup in which each message is a string of 0's and 1's, and each symbol costs $10K. Delivery is wildly unpredictable in time: a message will arrive in full, but the exact moment cannot be guaranteed. Find a way for A and B to communicate. You must fix the strategy before sending the agents. The total allocated budget is $80K. How do you do it, and what is the minimum cost?

Solution: An initial strategy: A transmits the leader's position in a single message. Each gang member can be uniquely represented using 8 bits, since 2^8 = 256. Total cost = 8 bits × $10K = $80K to uniquely identify the leader. No funds left.
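As a quick sanity check of that encoding (member number 173 is just an arbitrary example):

```python
# Every one of the 256 members gets a unique 8-bit label, since 2**8 == 256.
# Member number 173, for example, gets the 8-bit string:
print(format(173, '08b'))  # -> 10101101
```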

But we have not made any use of B. B suspects, say, X and Y, but A does not know B's list. Recall the classic array question: given a stream of numbers in which every number occurs twice except one, which occurs only once, how do you find the number that occurs once? Example: 1 2 3 1 2 → answer 3. We use XOR.
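A minimal sketch of that trick (the function name is illustrative):

```python
def find_single(stream):
    """Return the one number that appears once when all others appear twice."""
    acc = 0
    for x in stream:
        acc ^= x  # pairs cancel out, since n ^ n == 0 and x ^ 0 == x
    return acc

print(find_single([1, 2, 3, 1, 2]))  # -> 3
```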

But what if every number in the stream occurs twice except two, which occur only once? Can we still use XOR? Example: 1 2 3 4 1 2. The XOR of the whole stream is 7. How do we recover 3 and 4?
Recall the property of bitwise XOR: a result bit is 1 if the corresponding input bits differ, and 0 if they are the same.

What information does the XOR result 7 = 111 convey? Observe that its LSB is set, i.e. the two non-repeating numbers differ in their LSB. We now go through the original stream: if a number's LSB is set we XOR it into set 1, otherwise into set 2 (both sets start at value 0).
Stream = 1 2 3 4 1 2
SET 1 = 1^3^1 =3
SET 2 = 2^4^2 =4

So we have recovered the 2 numbers in the stream that do not repeat.
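The two-set partition above can be sketched as follows (names are illustrative):

```python
def find_two_singles(stream):
    """Return the two numbers that appear once when all others appear twice."""
    xor_all = 0
    for x in stream:
        xor_all ^= x                 # pairs cancel; this leaves a ^ b
    lowest = xor_all & -xor_all      # isolate the lowest set bit: a and b differ here
    set1 = set2 = 0
    for x in stream:
        if x & lowest:
            set1 ^= x                # SET 1: numbers with that bit set
        else:
            set2 ^= x                # SET 2: numbers with that bit clear
    return set1, set2

print(find_two_singles([1, 2, 3, 4, 1, 2]))  # -> (3, 4)
```

Within each set every repeated number again cancels itself out, leaving exactly one survivor per set.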

We are going to use this idea. B has two numbers denoting the positions of the people he suspects. He finds the lowest bit index at which the two bit vectors differ and sends it to A. The message is 3 bits long, since there are at most 8 distinct positions, which can be uniquely represented in 3 bits. A receives the message and sends back the value of the leader's bit at that index, letting B identify the leader. A total of 4 bits were used in the process. We have gone from 8 bits to 4 bits, a huge improvement, with $40K left to spend.
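The 4-bit exchange can be simulated end to end; the leader and suspect values below are made-up examples:

```python
LEADER = 0b10110110                    # known only to A (example value)
SUSPECTS = (0b10110110, 0b10110100)    # B's two candidates (example values)

# B's 3-bit message: the lowest bit index where his two suspects differ.
diff = SUSPECTS[0] ^ SUSPECTS[1]
index = (diff & -diff).bit_length() - 1  # an index in 0..7 fits in 3 bits

# A's 1-bit reply: the leader's bit at that index.
leader_bit = (LEADER >> index) & 1

# B keeps the suspect whose bit at `index` matches A's reply.
identified = next(s for s in SUSPECTS if (s >> index) & 1 == leader_bit)
print(index, leader_bit, identified == LEADER)  # -> 1 1 True
```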

Can we do better ?

INFORMATION THEORY results are independent of mechanism. Observe that there are 8 possible bit indices, which we encoded with 3 bits each in the fixed-length scheme we are all familiar with.

We can design a new mechanism for denoting the bit indices.
There are 8 equally likely indices, so the entropy is log2(8) = 3 bits; the information needed can be exchanged in 3 bits as follows.

Working: B looks at his list and finds the lowest bit index at which the two bit vectors differ.

If the index lies in 0 to 3: B sends the 2 bits needed to mark the index; A responds with the leader's bit value at that index, either 0 or 1.

If the index lies in 4 to 5: B sends a 1-bit message, 0; A responds with a 2-bit message (00, 01, 10, or 11) giving the leader's bit pattern at positions 4 and 5.

If the index lies in 6 to 7: B sends a 1-bit message, 1; A responds with a 2-bit message (00, 01, 10, or 11) giving the leader's bit pattern at positions 6 and 7.

So with a total of 3 bits exchanged in every case, we have identified the leader.
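The whole variable-length protocol can be sketched in a few lines (a hypothetical simulation; as in the story, one of B's two suspects is assumed to actually be the leader):

```python
def run_protocol(leader, s1, s2):
    """Simulate the exchange; return (total_bits_sent, identified_leader)."""
    diff = s1 ^ s2
    index = (diff & -diff).bit_length() - 1   # lowest differing index, 0..7
    if index <= 3:
        bits = 2 + 1                          # B: 2-bit index; A: 1-bit value
        leader_bit = (leader >> index) & 1
    else:
        base = 4 if index <= 5 else 6         # B's 1 bit picks pair 4-5 or 6-7
        bits = 1 + 2                          # A: 2-bit pattern at that pair
        pattern = (leader >> base) & 0b11
        leader_bit = (pattern >> (index - base)) & 1
    found = s1 if (s1 >> index) & 1 == leader_bit else s2
    return bits, found

print(run_protocol(0b11001000, 0b11001000, 0b10001000))  # -> (3, 200)
```

Every branch costs exactly 3 bits, matching the log2(8) = 3-bit entropy of the index.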

Total improvement: from 8 bits to 4 bits to 3 bits, a cost of $30K, leaving $50K of the budget unspent.

Reach for the sky