2 equally probable outcomes

Entropy can be seen as the minimum number of binary questions (answered yes or no) needed to figure out which outcome it is.

For example, for an unbiased coin the entropy is 1. It means it would take just one question, like “Is it Heads?” or “Is it Tails?”, to know the result.
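This is easy to check numerically. Here is a minimal Python sketch (the `entropy` helper is just for illustration, not from the article itself) applying the standard −Σ p log₂ p formula to a fair coin:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin -> 1.0 bit, i.e. one yes/no question
```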

4 equally probable outcomes

Now consider 4 outcomes (A, B, C, D), each with probability 1/4. The entropy formula, the sum of −p log₂ p over the outcomes, gives 2. And indeed, 2 questions are enough to know which outcome it is (a short sketch after the question tree below checks this):

  1. Is it A or B?

If the answer is Yes, then:

  2. Is it A?

If instead the answer to the first question is No, then:

  2. Is it C?

Either way, the second answer settles it: two yes/no questions, matching the entropy of 2.
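That tree can be walked mechanically. Below is a small sketch (Python assumed, as above; the `identify` helper is hypothetical) that follows the two questions and confirms every outcome takes exactly 2 of them:

```python
def identify(outcome):
    """Walk the question tree for A, B, C, D; return (guess, questions asked)."""
    n = 1  # Q1: "Is it A or B?"
    if outcome in ("A", "B"):
        n += 1  # Q2: "Is it A?"
        guess = "A" if outcome == "A" else "B"
    else:
        n += 1  # Q2: "Is it C?"
        guess = "C" if outcome == "C" else "D"
    return guess, n

for o in "ABCD":
    print(o, identify(o))  # each outcome identified in exactly 2 questions
```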

3 unequal probabilities

Say there are 3 outcomes, with probabilities:

A - 1/2

B - 1/4

C - 1/4

The formula gives an entropy of 1.5. It means that, on average, 1.5 questions are required. How?
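A quick numeric check (a plain Python sketch; treating “Is it A?” as the first question is my assumption about the natural strategy here, not a quote from the article) confirms the 1.5 figure both ways:

```python
import math

# Entropy: H = -(1/2*log2(1/2) + 1/4*log2(1/4) + 1/4*log2(1/4))
h = -(0.5 * math.log2(0.5) + 0.25 * math.log2(0.25) + 0.25 * math.log2(0.25))
print(h)  # 1.5

# If the first question is "Is it A?", A is found in 1 question (prob 1/2),
# while B and C each take 2 questions (prob 1/4 each):
print(0.5 * 1 + 0.25 * 2 + 0.25 * 2)  # average = 1.5 questions
```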