
The purpose of this paper is to present the evolution of the concept of entropy from engineering to knowledge management, passing through information theory, linguistic entropy, and economic entropy. The concept of entropy was introduced into thermodynamics by Rudolf Clausius in 1865 as a measure of heat transfer between two bodies at different temperatures. As a natural phenomenon, heat flows from the body with the higher temperature toward the body with the lower temperature. However, Clausius defined only the change in the entropy of a system, not its absolute entropy. Ludwig Boltzmann later defined absolute entropy by studying the behavior of gas molecules in a thermal field. The computational formula defined by Boltzmann relates the microstates of a thermal system to its macrostates: the more uniform the probability distribution of the microstates, the higher the entropy. The second law of thermodynamics states that in an isolated system, with no intervention from outside, entropy increases continuously. The concept of entropy proved so powerful that many researchers have tried to extend its semantic scope and domain of application. In 1948, Claude E. Shannon introduced the concept of information entropy, with a computational formula of the same form as Boltzmann's but with a different interpretation. This concept solved many problems in communications engineering and is used extensively in information theory. Nicholas Georgescu-Roegen applied the concept of entropy and the second law of thermodynamics to economics and business. Today, many researchers in economics use the concept of entropy to analyze different phenomena. The present paper explores the possibility of using the concept of knowledge entropy in knowledge management.
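The claim that a more uniform probability distribution yields higher entropy can be illustrated with Shannon's formula H = −Σ pᵢ log₂ pᵢ. The following minimal sketch is an illustration added here, not code from the paper itself:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over four outcomes attains the maximum entropy
# for four outcomes (log2(4) = 2 bits); a skewed one is always lower.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.70, 0.10, 0.10, 0.10]

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(skewed))   # < 2.0 bits
```

Boltzmann's formula has the same mathematical form, with the pᵢ interpreted as probabilities of the system's microstates rather than of message symbols.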

eISSN:
2558-9652
Language:
English