This will give you an entropy score between 0.0 and 1.0:
You might want to look into Shannon entropy, which is a measure of the entropy of data and information. It is in fact an almost direct analogue of the physical formula for entropy, as defined by the most widely accepted interpretations of thermodynamics.
In particular, since you are working with a binary string, you can use the binary entropy function, which is the special case for randomness in binary data.
This is calculated using
H(p) = -p*log(p) - (1-p)*log(1-p)
(logarithms in base 2; by convention, 0*log(0) is taken to be 0)

where p is the fraction of 1s in the string (or of 0s; the function is symmetric, so you get the same answer either way).
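A minimal Python sketch of that formula (the helper name binary_entropy is mine, not from any library):

```python
import math

def binary_entropy(bits: str) -> float:
    """Shannon entropy of a binary string, in bits per symbol (0.0 to 1.0).
    Only the fraction of 1s matters, not their order."""
    if not bits:
        return 0.0  # empty string: nothing to measure
    p = bits.count("1") / len(bits)
    if p == 0.0 or p == 1.0:
        return 0.0  # convention: 0 * log(0) == 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
```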
Here is what the function looks like:

[plot of H(p) against p: an inverted-U curve peaking at H = 1.0 for p = 0.5 and falling to 0 at p = 0 and p = 1]
As you can see, if p is 0.5 (equal numbers of 1s and 0s), the entropy is at its maximum (1.0). If p is 0 or 1.0, the entropy is 0.
It looks like what you want, right?
The only exception is your length-1 case, which you can simply exclude. (Strings that are 100% 0s or 100% 1s don't seem very entropic to me anyway, but handle them however you like.)
In addition, this takes into account only the counts of the bits, not their ordering, so repeats and palindromes get no special treatment. You can add extra heuristics for that; see the sketch below for one possibility.
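For instance, one simple order-sensitive heuristic (my own sketch, not part of the formula above) is to measure the entropy of adjacent bit pairs, which penalizes strictly periodic strings like 0101...:

```python
from collections import Counter
import math

def bigram_entropy(bits: str) -> float:
    """Entropy over adjacent bit pairs, normalized to [0, 1].
    A 4-symbol alphabet carries at most 2 bits of entropy, hence the / 2."""
    pairs = [bits[i:i + 2] for i in range(len(bits) - 1)]
    if not pairs:
        return 0.0  # strings shorter than 2 bits have no pairs
    total = len(pairs)
    h = -sum((n / total) * math.log2(n / total)
             for n in Counter(pairs).values())
    return h / 2.0
```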
Here are your other examples:
00:   -0*log(0) - (1-0)*log(1-0) = 0.0
01:   -0.5*log(0.5) - (1-0.5)*log(1-0.5) = 1.0
010:  -(1/3)*log(1/3) - (2/3)*log(2/3) ≈ 0.92
0100: -0.25*log(0.25) - (1-0.25)*log(1-0.25) ≈ 0.81
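With a helper like the binary_entropy sketch above, you can reproduce these values directly:

```python
for s in ["00", "01", "010", "0100"]:
    print(s, round(binary_entropy(s), 2))
# 00 0.0
# 01 1.0
# 010 0.92
# 0100 0.81
```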