(Re-edited and blogged from earlier post)
After looking into entropy, I wanted to try building a theory of what I see as its cosmological opposite: significance. In Multisense Realism, this kind of opposite is different from merely saying high entropy = low significance; it is an opposition of an orthogonal sort - a horizontal exterior to a vertical interior.
To begin with, in my framework, Significance is not a process generated by the brain whereby emotion is attached to functional semantic groups. In truth, consciousness and significance are not processes at all, but rather the intrinsic affective stillness (solace, solitude, stasis, solid) through which all processes are realized.
In Figure 1 above, the labeled drawing of the house illustrates how the one-to-many ratio is inverted from that found in entropy. A simple line drawing, which can be compressed to a very compact digital encoding, can nonetheless be understood to host a rich variety of human psychological associations. The associations themselves are of different qualities and strengths (represented by the use of colored and shaded text), and we can assume that these will vary from person to person and culture to culture, with variations by age, gender, and other social factors. I think this concept of significance as associative promiscuity is critical to bridging the gap between functionalist models of consciousness and subjectively experienced realism.
Without significance, we are left with information-theoretic assumptions about meaning, which are rooted in purely quantitative measures of the resources required for processing. This model, I believe, is a problem because it flattens all meaningful appearances and interactions into rates of data exchange, such that all forms of information, whether convulsively beautiful or mind-numbingly banal, are identical forms of noise, discernible only by matching algorithms to prescribed formulas. There is no sense here, only large swaths of true-false branching tree algorithms.
In my conception of Significance, quantitative density of data is intentionally juxtaposed with quality. Rather than conceiving of information as isolated texts, a separate measure which honors contextual connectivity is invoked. It isn’t merely a shallow accounting of the number of associations, or even number plus strength, but a vast qualitative summation of cultural, social, and personal positions and momenta which cannot be quantitatively reduced. By figurative extension, each local instance of significance recapitulates in some sense the experiences and attitudes, the afferent and efferent charges, of every being in the history of the universe.
Using this system of measurement, we can honor the reality of the difference between the high-significance, simple line art in the top picture and the high-data-content but low-significance grey cloud in the bottom. Claude Shannon’s work shows us how to model statistical entropy (which I have tried to illustrate below in Figure 2) as a measure of how difficult it is to predict the remainder of a message given a particular fragment. If you have the letter ‘Q’ in an English-language transmission, it is a good bet that the letter ‘u’ will follow it. Exploiting many such statistical regularities can dramatically improve data compression, speeding up transmission and lowering resource requirements. It’s the same principle as predictive text.
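Shannon's point can be sketched in a few lines of Python. This is only a toy illustration of my own (the sample strings are hypothetical, and `zlib` stands in for whatever compressor a transmission system might use): a repetitive English phrase and a scrambled copy of the very same characters have identical per-symbol entropy, yet the predictable version compresses far smaller, because the compressor can exploit its statistical regularities.

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Per-character Shannon entropy in bits, estimated from symbol frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A highly regular message (the next symbol is easy to predict)...
regular = "the quick brown fox " * 50
# ...versus a scrambled permutation of the exact same characters.
rng = random.Random(0)
scrambled = "".join(rng.sample(regular, len(regular)))

# Same symbol frequencies, so the zeroth-order entropy is identical...
print(round(shannon_entropy(regular), 3), round(shannon_entropy(scrambled), 3))
# ...but the regular text compresses to far fewer bytes than the scrambled one.
print(len(zlib.compress(regular.encode())), len(zlib.compress(scrambled.encode())))
```

The gap between the two compressed sizes is exactly the "good bet" at work: the repeated phrase lets the compressor finish the message from a fragment, just as 'u' can be supplied after 'Q'.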
Information Entropy vs Thermodynamic Entropy
This sense of ‘Information Entropy’, as the degree of difficulty in predicting information statistically, is distinct from (despite the objections of information theorists) thermodynamic entropy. Thermodynamic entropy is rooted in physics, and even though it relates to quantum decoherence, the two are not constrained together because pattern recognition is dependent on perception. If we make a movie of a glass of ice melting and compare it to a movie of the same glass of water after it has melted, the MPEG-compressed movie of the former will require far more kilobytes to store than the latter, even though the molecules of warm water are in a higher entropy state than the molecules of ice. If we recorded a movie of the molecules on a microscopic level, this discrepancy would be reversed. This example illustrates how the presentation and representation of patterns fundamentally changes the so-called information content, and I suggest that it be used to understand that all realities, public and private, are presentations and representations of meaning rather than context-independent realities.
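The reversal can be demonstrated with synthetic "frames" rather than actual movies. In this toy sketch of mine (the lattice-for-ice and random-bytes-for-water encodings are hypothetical stand-ins, not real physics), the microscopic view of ice compresses to almost nothing while the microscopic view of warm water barely compresses at all, yet the macroscopic views of both glasses - near-uniform grey blurs - are equally tiny. The measured "information content" belongs to the representation, not to the glass.

```python
import random
import zlib

rng = random.Random(0)
SIDE = 200  # a hypothetical 200x200 single-channel frame

# Microscopic view: ice as a regular lattice of molecules, warm water as
# disordered positions (modeled crudely as incompressible random bytes).
ice_micro = bytes(255 if (x % 10 == 0 and y % 10 == 0) else 0
                  for y in range(SIDE) for x in range(SIDE))
water_micro = bytes(rng.randrange(256) for _ in range(SIDE * SIDE))

# Macroscopic view: both glasses render as a nearly featureless grey field.
ice_macro = bytes([120]) * (SIDE * SIDE)
water_macro = bytes([128]) * (SIDE * SIDE)

for name, frame in [("ice micro", ice_micro), ("water micro", water_micro),
                    ("ice macro", ice_macro), ("water macro", water_macro)]:
    print(name, len(zlib.compress(frame)))
```

Only the disordered microscopic frame resists compression; change the level of description and the ranking of "information content" flips, which is the point of the melting-glass example.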
The approach of science in recent years has been to force a choice between reducing consciousness to the known quantities of physics and computation, or jumping off the cliff into speculation and professional suicide. With this new measure of Significance, I propose a phenomenal character that is the opposite in every way to the high-entropy/low-entropy axis of physics and statistics. Where information theories begin with sets of data fragments and use stochastic processes, such as Markov chains, to anticipate future variations within a digitized set of finite possibilities, I propose an experiential, sense-based model which is semantically open but spatiotemporally fixed. This is to say that significance accumulates through concrete experiences in this world - the universe - rather than within any phase-space simulation. Significance is based in trans-rational experience through time rather than topological functions across space. To understand this, I propose, is to solve the hard problem of consciousness and bridge the explanatory gap.
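To make the contrast concrete, here is the kind of Markov-chain anticipation I am setting Significance against - a minimal first-order character model of my own devising (the toy corpus is hypothetical). It can only ever redistribute probability over a finite, digitized set of symbols it has already seen; nothing in the table can mean anything.

```python
from collections import Counter, defaultdict

def bigram_model(text):
    """For each character, count which characters follow it in the corpus."""
    followers = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        followers[a][b] += 1
    return followers

corpus = "the quiet queen quickly questioned the quaint quartet"
model = bigram_model(corpus)

# The chain's "anticipation" after 'q' - in English, overwhelmingly 'u'.
print(model["q"].most_common(1))  # [('u', 6)]
```

However large the corpus, the model remains a closed phase space of observed transitions - which is the sense in which I call it semantically closed, against the open accumulation of significance.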
Significance is a richness and quality of signal that translates as the power to inspire motivation through imitation. Not merely low information entropy, where mathematical relations can clip along smoothly, but phenomenological beacons of iconic power. Ordinary features blessed by fate or luck to become super-signifying powerhouses. Celebrities who confer access to privilege and godliness just through association with their name or personal attention… a lock of hair, an autograph. With entropy, we must chase meaning and work against resistance to drive purpose, but with significance, its meanings and sense qualities call to us and drive us of themselves.