Entropy is a measure of disorder or uncertainty. The term has been used qualitatively in environmental science to describe the correlation of entropy with pollution. In this research, three distinct entropies were defined and characterized in order to quantify this previously qualitative notion. The newly defined entropies are derived from the Shannon entropy of information theory and reflect the concentrations of three major greenhouse gases
represented as probability variables. The first entropy evaluates the total entropy arising from the concentration difference of each greenhouse gas across three periods, corresponding to the industrial revolution, the post-industrial revolution, and the information revolution, respectively. Next,
the second entropy evaluates the entropy whose logarithm base increases along with the accumulated time unit. Lastly, the third entropy evaluates the entropy with the logarithm base fixed at 2 as a function of time. The analytical results are as follows.
The first entropy shows the degree of prediction reliability with respect to the variation of the greenhouse gases. As the second entropy increases, the concentration variation becomes stabilized, so that it follows a linear correlation.
The second entropy is a valid indicator for the mutual comparison of the greenhouse gases. Although the third entropy varies locally within specific periods, it eventually follows a logarithmic curve, a pattern similar to that observed in thermodynamic entropy.
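The underlying computation can be illustrated with a minimal Python sketch: Shannon entropy of normalized gas-concentration shares, evaluated once with a fixed base of 2 and once with a base that grows with the period index. The gas shares and the base-growth rule below are hypothetical placeholders for illustration, not data or definitions from this study.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H = -sum(p * log_base(p)), skipping zero terms."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Hypothetical concentration shares of three greenhouse gases
# per period, each normalized to sum to 1 (placeholder values).
periods = {
    "industrial": [0.85, 0.10, 0.05],
    "post-industrial": [0.80, 0.13, 0.07],
    "information": [0.76, 0.15, 0.09],
}

for t, (name, probs) in enumerate(periods.items(), start=1):
    h_fixed = shannon_entropy(probs, base=2)       # fixed logarithm base 2
    h_growing = shannon_entropy(probs, base=2 + t) # base grows with time (illustrative rule)
    print(f"{name}: fixed-base H = {h_fixed:.3f}, growing-base H = {h_growing:.3f}")
```

Note that growing the logarithm base only rescales the entropy (log change of base), so the fixed-base and growing-base values differ by a time-dependent factor rather than in shape.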