The second law of thermodynamics, one of the fundamental laws of physics, states that the entropy of a system cannot decrease except insofar as entropy flows outward across the boundary of the system. As a corollary, the entropy of an isolated system cannot decrease. By implication, the entropy of the whole universe, assumed to be an isolated system, can never decrease; in fact, the entropy of the universe is always increasing. It has been speculated that the universe is therefore fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source.
However, the role of entropy in cosmology remains a controversial subject. Recent work has cast extensive doubt on the heat death hypothesis and on the applicability of any simple thermodynamic model to the universe as a whole. Although entropy does increase in the model of an expanding universe, the maximum possible entropy rises much more rapidly, so the entropy density actually decreases with time. The result is an "entropy gap" that pushes the system further away from equilibrium. Other complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamic models, making predictions of large-scale thermodynamics extremely difficult.
Entropy has often been associated with the amount of order, disorder, or chaos in a thermodynamic system. Entropy serves as a measure of how close a system is to equilibrium, that is, to perfect internal disorder. The entropy of a distribution of atoms and molecules in a thermodynamic system is a measure of the disorder in the arrangement of its particles. Solids, which are typically ordered on the molecular scale, usually have lower entropy than liquids; liquids have lower entropy than gases; and colder gases have lower entropy than hotter gases. At absolute zero temperature, crystalline structures are taken to have perfect "order" and zero entropy.
Mathematically, entropy S is defined as
S = -K Σ P_i ln P_i   (1)
The sum runs over all microstates consistent with the given macrostate, P_i is the probability of the ith microstate, and K is a constant. According to this definition, highly ordered states have low entropy, while disordered states may or may not have high entropy. For a microcanonical ensemble, where all accessible microstates have the same probability, equation (1) reduces to
S = K ln W   (2)
where W is the number of possible microstates in which the system can be found.
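The reduction of equation (1) to equation (2) can be checked numerically: for a uniform distribution over W microstates, each term contributes -(1/W) ln(1/W), and the sum is K ln W. A minimal sketch in Python (the function name and example values are illustrative, and K is set to 1 for simplicity):

```python
import math

def gibbs_entropy(probs, k=1.0):
    """Equation (1): S = -k * sum(p_i * ln p_i), skipping zero-probability states."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 8
uniform = [1.0 / W] * W           # microcanonical case: all microstates equally likely
print(gibbs_entropy(uniform))     # matches equation (2): K ln W = ln 8
print(math.log(W))

# A sharply peaked ("ordered") distribution over the same W states has lower entropy.
peaked = [0.9] + [0.1 / (W - 1)] * (W - 1)
print(gibbs_entropy(peaked))
```

The peaked distribution illustrates the qualitative claim above: concentrating probability on one microstate (more "order") lowers the entropy below the uniform maximum of ln W.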
Some scientists have questioned the relationship between entropy and disorder. If entropy is associated with disorder, and if the entropy of the universe is headed towards a maximum, then many are puzzled as to the nature of the "ordering" process and the operation of evolution. In the recent book Sync: The Emerging Science of Spontaneous Order, Steven Strogatz writes: "Scientists have often been baffled by the existence of spontaneous order in the universe. The laws of thermodynamics seem to dictate the opposite, that nature should inexorably degenerate toward a state of greater disorder, greater entropy. Yet all around us we see magnificent structures (galaxies, cells, ecosystems, human beings) that have all somehow managed to assemble themselves."
The most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables; maximizing the entropy maximizes our ignorance about the details of the system. This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and the interpretive model.
Locally, the entropy can be lowered by external action. This applies to machines such as refrigerators, where the entropy in the cold chamber is reduced, and to living organisms. This local decrease is, however, only possible at the expense of an entropy increase in the surroundings.