The Second Law of thermodynamics conventionally describes physical systems. "An important law of physics, the second law of thermodynamics, states that the entropy of any system cannot decrease except insofar as it flows outward across the boundary of the system. As a corollary, in an isolated system the entropy cannot decrease. By implication, the entropy of the whole universe, assumed to be an isolated system, cannot decrease; in fact, the entropy of the universe is always increasing." It has been speculated that the universe is fated to die a heat death in which all of the energy ends up as a homogeneous distribution of thermal energy, so that no work can be extracted from any source.
"However, the role of entropy in cosmology remains a controversial subject. Recent work has cast extensive doubt on the heat-death hypothesis and the applicability of a simple thermodynamic model to the universe in general. Although entropy does increase in the model of an expanding universe, the maximum possible entropy rises much more rapidly thus entropy density is decreasing with time. This results in an"entropy gap" that pushes the system further away from equilibrium. Other complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamic models, making any predictions of large-scale thermodynamics extremely difficult."
Entropy has often been associated with the amount of order, disorder and/or chaos in a thermodynamic system. Entropy serves as a measure of how close a system is to equilibrium, that is, to perfect internal disorder. The entropy of the distribution of atoms and molecules in a thermodynamic system is a measure of the disorder in the arrangements of its particles. Solids, which are typically ordered on the molecular scale, usually have less entropy than liquids; liquids have less entropy than gases; colder gases have less entropy than hotter gases. At absolute zero, crystalline structures are approximated to have perfect "order" and zero entropy.
Mathematically, entropy S is defined as
S = −K Σᵢ pᵢ ln pᵢ     (1)
The sum runs over all microstates consistent with the given macrostate, pᵢ is the probability of the ith microstate, and K is a constant. According to this definition, highly ordered states have low entropy, while disordered states may or may not have high entropy. For a microcanonical system, where all accessible microstates have the same probability, equation (1) gives
S = K ln W     (2)

where W is the number of possible states in which the system can be found.
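The relationship between equations (1) and (2) can be checked numerically: a minimal Python sketch (with the constant K set to 1 for simplicity, and `gibbs_entropy` a hypothetical helper name, not from any library) computes entropy from a list of microstate probabilities and confirms that the uniform (microcanonical) case reduces to ln W, while a more "ordered", sharply peaked distribution yields lower entropy.

```python
import math

def gibbs_entropy(probs, k=1.0):
    """Entropy S = -k * sum(p_i * ln(p_i)) over microstate probabilities."""
    if not math.isclose(sum(probs), 1.0):
        raise ValueError("probabilities must sum to 1")
    # Terms with p = 0 contribute nothing (p ln p -> 0), so skip them.
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 8  # number of equally likely microstates

# Microcanonical case: equation (1) reduces to S = k * ln(W).
uniform = [1.0 / W] * W
s_uniform = gibbs_entropy(uniform)
assert math.isclose(s_uniform, math.log(W))

# A peaked distribution (one microstate far more likely) has lower entropy.
peaked = [0.9] + [0.1 / (W - 1)] * (W - 1)
assert gibbs_entropy(peaked) < s_uniform
```

The uniform distribution maximizes the entropy for a fixed number of microstates, which is why equation (2) represents the maximum-uncertainty case discussed below.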
Some scientists have questioned the relationship between entropy and disorder. If entropy is associated with disorder, and if the entropy of the universe is headed towards the maximum, then many are puzzled as to the nature of the "ordering" process and the operation of evolution. In the recent book Sync: The Emerging Science of Spontaneous Order, Steven Strogatz writes: "Scientists have often been baffled by the existence of spontaneous order in the universe. The laws of thermodynamics seem to dictate the opposite; nature should inexorably degenerate towards a state of greater disorder, greater entropy. Yet all around us we see magnificent structures like galaxies, cells, ecosystems, and human beings that have all somehow managed to assemble themselves."
The most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables; maximizing the entropy maximizes our ignorance about the details of the system. This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and interpretive model.
Locally, the entropy can be lowered by external action. This applies to machines such as a refrigerator, where the entropy in the cold chamber is reduced, and to living organisms. This local decrease is, however, only possible at the expense of entropy increase in the surroundings.