November 2013 · KCI-listed · service discontinued (viewing restricted)
Entropy is a measure of disorder or uncertainty. In environmental science, the term has been used qualitatively to describe its correlation with pollution. In this research, three entropies were defined and characterized in order to quantify that previously qualitative usage. The newly defined entropies E1, E2, and E3 are derived from Shannon entropy in information theory and reflect the concentrations of three major greenhouse gases, CO2, N2O, and CH4, treated as probability variables. First, E1 evaluates the total entropy arising from the concentration difference of each greenhouse gas across three periods: the industrial revolution, the post-industrial revolution, and the information revolution. Next, E2 evaluates the entropy with a logarithm base that increases with the accumulated time unit. Lastly, E3 evaluates the entropy over time with the logarithm base fixed at 2. The analytical results are as follows. E1 indicates the reliability of predictions of greenhouse gas variation: as E1 increases, the concentration variation stabilizes and follows a linear correlation. E2 is a valid indicator for mutual comparison among the greenhouse gases. Although E3 varies locally within specific periods, it eventually follows a logarithmic curve, a pattern similar to that observed in thermodynamic entropy.
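
For illustration, a minimal sketch of the kind of calculation the abstract describes, assuming the gas concentrations are normalized into a probability distribution and Shannon entropy is taken either with a fixed base-2 logarithm (in the spirit of E3) or with a base that grows with an accumulated time index (in the spirit of E2). The concentration values and the base-growth rule below are assumptions for illustration only; the paper's actual definitions of E1, E2, and E3 are not reproduced here.

import math

def shannon_entropy(values, base=2.0):
    # Normalize the values into a probability distribution and
    # compute H = -sum(p * log_base(p)).
    total = sum(values)
    probs = [v / total for v in values]
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Hypothetical concentration values for CO2, N2O, and CH4
# (illustration only; not taken from the paper).
concentrations = [395.0, 0.325, 1.80]

# Fixed logarithm base 2, in the spirit of E3 as described above.
print("base-2 entropy:", shannon_entropy(concentrations, base=2.0))

# Logarithm base growing with an accumulated time index,
# in the spirit of E2 as described above (assumed growth rule).
for t in range(1, 5):
    base = 2.0 + t
    print(f"t={t}, base={base}: H={shannon_entropy(concentrations, base):.4f}")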