The following properties of entropies, as measures of expected information, seem natural. The amount of information expected from an experiment does not change if we add outcomes of zero probability (expansibility). The expected information is symmetric in the (probabilities of the) outcomes. The information expected from a combination of two experiments is at most the sum of the amounts of information expected from the single experiments (subadditivity); equality holds here if the two experiments are independent (additivity).
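In symbols (the notation is introduced here for illustration; the abstract itself fixes none): writing $H_n(p_1,\dots,p_n)$ for the entropy of an experiment with outcome probabilities $p_1,\dots,p_n$, the properties read
\[
H_{n+1}(p_1,\dots,p_n,0) = H_n(p_1,\dots,p_n) \qquad \text{(expansibility)},
\]
\[
H_n(p_{\sigma(1)},\dots,p_{\sigma(n)}) = H_n(p_1,\dots,p_n) \quad \text{for every permutation } \sigma \qquad \text{(symmetry)},
\]
\[
H_{mn}(r_{11},\dots,r_{mn}) \le H_m(p_1,\dots,p_m) + H_n(q_1,\dots,q_n) \qquad \text{(subadditivity)},
\]
where $p_i = \sum_j r_{ij}$ and $q_j = \sum_i r_{ij}$ are the marginals of the combined experiment, with equality whenever $r_{ij} = p_i q_j$, that is, when the experiments are independent (additivity).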
In this paper it is shown that the linear combinations of the Shannon and Hartley entropies, and only these, have the above properties. The Shannon and the Hartley entropies are also characterized individually.
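For reference, standard forms of the two entropies (restated here under the usual conventions, not quoted from the paper): the Shannon entropy is
\[
H^{\mathrm S}_n(p_1,\dots,p_n) = -\sum_{i=1}^{n} p_i \log p_i \qquad (0 \log 0 := 0),
\]
and the Hartley entropy, taken over the outcomes of positive probability as expansibility requires, is
\[
H^{\mathrm H}_n(p_1,\dots,p_n) = \log \#\{\, i : p_i > 0 \,\},
\]
so the characterized family consists of the entropies of the form $a H^{\mathrm S}_n + b H^{\mathrm H}_n$.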