Minimal subject and minimal action
That the complexity of meanings increases as they evolve may be intuitively obvious, but it was only in the mid-20th century that the concepts of the amount of information and of information complexity were given rigorous foundations, in the work of Claude Shannon and Andrey Kolmogorov.
Shannon introduced the concept of information entropy. In his formulation, entropy H is a measure of the uncertainty, unpredictability, surprise, or randomness of a message, event, or phenomenon. In terms of culture, such a message or event is a counterfact. In the absence of information loss, Shannon entropy equals the average amount of information per message symbol. The amount of information itself is determined by the degree of surprise inherent in a particular message:
“According to this way of measuring information, it is not intrinsic to the received communication itself; rather, it is a function of its relationship to something absent—the vast ensemble of other possible communications that could have been sent, but weren’t. Without reference to this absent background of possible alternatives, the amount of potential information of a message cannot be measured. In other words, the background of unchosen signals is a critical determinant of what makes the received signals capable of conveying information. No alternatives = no uncertainty = no information. Thus Shannon measured the information received in terms of the uncertainty that it removed with respect to what could have been sent” (Deacon 2013, p. 379).
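For concreteness, Shannon's measure can be stated in two short formulas (a standard formulation, supplied here rather than quoted from the source). A message $x$ received with probability $p(x)$ carries a surprisal of

$$I(x) = -\log_2 p(x)$$

bits, and the entropy of a source is the average surprisal over the whole ensemble of possible messages:

$$H = -\sum_{x} p(x)\,\log_2 p(x).$$

A fair coin toss thus conveys one bit per outcome, while a message that was certain in advance conveys nothing.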
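Deacon's chain "no alternatives = no uncertainty = no information" can also be checked numerically. The sketch below is an illustration added here, not part of the source, and its ensembles and probabilities are invented for the example:

```python
import math

def surprisal(p: float) -> float:
    """Bits of information conveyed by receiving an outcome of probability p."""
    return -math.log2(p)

def entropy(ensemble: dict[str, float]) -> float:
    """Average surprisal, in bits per message, over an ensemble of alternatives."""
    return sum(p * surprisal(p) for p in ensemble.values() if p > 0)

# Two equiprobable alternatives: receiving either message removes one bit of uncertainty.
coin = {"heads": 0.5, "tails": 0.5}
print(entropy(coin))          # 1.0

# Four equiprobable alternatives: a richer background of unchosen signals, more information.
suit = {"clubs": 0.25, "diamonds": 0.25, "hearts": 0.25, "spades": 0.25}
print(entropy(suit))          # 2.0

# A single possible message: no alternatives, no uncertainty, no information.
certainty = {"the_only_possible_message": 1.0}
print(entropy(certainty))     # 0.0
```

In Deacon's terms, the coin and the card suit differ only in the size of their absent background of alternatives, and the information each received message conveys differs accordingly.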