This is the Shannon (information) entropy, defined as usual but with the natural logarithm (base $e$):
$$H(X)=-\sum_{x}p(x)\log_{e}p(x).$$
You can see it clearly by computing
Entropy[{a, b, b}]
$$\frac{2}{3} \log \left(\frac{3}{2}\right)+\frac{\log (3)}{3}$$
Here the probability of $b$ is $P(b)=2/3$ and that of $a$ is $P(a)=1/3$. Similarly with
Entropy[{a, b, b, c, c, c, d}]
$$\frac{3}{7} \log \left(\frac{7}{3}\right)+\frac{2}{7} \log \left(\frac{7}{2}\right)+\frac{2 \log(7)}{7}$$
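As a quick numerical cross-check (a sketch in Python rather than Mathematica; the helper name `entropy` is mine), computing $-\sum_x p(x)\ln p(x)$ from the element frequencies reproduces both results above:

```python
from collections import Counter
from math import log

def entropy(seq):
    """Shannon entropy in nats (natural log), from empirical frequencies."""
    n = len(seq)
    counts = Counter(seq)
    return -sum(c / n * log(c / n) for c in counts.values())

# First example: {a, b, b} -> (2/3) log(3/2) + log(3)/3 ≈ 0.6365
print(entropy("abb"))

# Second example: {a, b, b, c, c, c, d}
# -> (3/7) log(7/3) + (2/7) log(7/2) + (2/7) log(7) ≈ 1.2770
print(entropy("abbcccd"))
```

Both values agree with the closed forms, confirming that Mathematica's `Entropy` uses base $e$ by default (a base can be supplied as a first argument, e.g. `Entropy[2, list]` for bits).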