Entropy is a thermodynamic property that can be used to determine the energy unavailable for useful work in a thermodynamic process, such as in energy-conversion devices, engines, or machines. Such devices can only be driven by convertible energy, and they have a theoretical maximum efficiency when converting energy to work. During this conversion, entropy accumulates in the system and must then be carried away in the form of waste heat.
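As one standard illustration of this theoretical maximum: an ideal (Carnot) heat engine operating between a hot reservoir at absolute temperature T_H and a cold reservoir at T_C can never exceed the efficiency

    η_max = 1 − T_C / T_H

For example, an engine working between T_H = 600 K and T_C = 300 K can convert at most 1 − 300/600 = 50% of the absorbed heat into work; the remainder must be rejected as waste heat.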
In classical thermodynamics, the concept of entropy is defined phenomenologically through the second law of thermodynamics, which states that the entropy of an isolated system never decreases: it either increases or remains constant.
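In Clausius's formulation, the entropy change of a system is defined through reversible heat exchange at absolute temperature T,

    dS = δQ_rev / T

and the second law can then be written compactly as ΔS ≥ 0 for any isolated system, with equality holding only for reversible processes.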
Thus, entropy is also a measure of the tendency of a process, such as a chemical reaction, to proceed spontaneously in a particular direction. It implies that thermal energy always flows spontaneously, as heat, from regions of higher temperature to regions of lower temperature. Such processes reduce the degree of order of the initial systems, and entropy is therefore often described as an expression of disorder or randomness.
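A quick check with the Clausius definition above shows why heat flows in this direction: if a small amount of heat δQ passes from a hot body at T_H to a cold body at T_C < T_H, the total entropy change of the pair is

    ΔS = −δQ/T_H + δQ/T_C > 0

which is positive, whereas the reverse flow would require ΔS < 0 and is therefore forbidden for an isolated pair of bodies.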
This picture is the basis of the modern microscopic interpretation of entropy in statistical mechanics, where entropy is defined as the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. The second law is then a consequence of this definition and the fundamental postulate of statistical mechanics.
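This statistical definition is made quantitative by Boltzmann's formula

    S = k_B ln Ω

where Ω is the number of microscopic states compatible with the macroscopic (thermodynamic) description and k_B is the Boltzmann constant: the more microstates are consistent with what is known macroscopically, the more additional information is needed to pin down the exact state, and the larger the entropy.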
The term entropy was coined in 1865 by Rudolf Clausius from the Greek εντροπία [entropía], meaning "a turning toward", formed from εν- [en-] (in) and τροπή [tropē] (turn, conversion).
It is possible (in a thermal context) to regard entropy as an indicator or measure of the effectiveness or usefulness of a particular quantity of energy.