Oct 22, 2020

# WHAT DO YOU UNDERSTAND BY ENTROPY?

There are two related definitions of entropy: the statistical-mechanics definition and the thermodynamic definition. The classical thermodynamic definition was developed first.

Disorder and irreversibility are the two main concepts that come to mind when thinking about entropy. One way to define entropy is as the number of ways in which a system can be arranged and rearranged. Equivalently, higher entropy corresponds to a more disordered system: a positive change in entropy shifts the system toward disorder.

Entropy can be defined at the cosmological scale, at the thermodynamic level, at the macroscopic level, and so on. But the basic idea is the same: entropy describes the tendency of the universe to move toward disorder and randomness.
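The counting-of-arrangements idea can be made precise with Boltzmann's formula, S = k_B ln W, where W is the number of microstates. A minimal sketch in Python (the box-of-molecules numbers below are illustrative assumptions, not measured values):

```python
import math

# Boltzmann's formula relates entropy to the number of microstates W:
#   S = k_B * ln(W)
K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    """Entropy (J/K) of a system with the given number of microstates."""
    return K_B * math.log(num_microstates)

# Toy example: 10 distinguishable molecules, each free to sit in either
# half of a box, can be arranged in 2**10 = 1024 ways.
ordered = boltzmann_entropy(1)         # a single arrangement -> zero entropy
disordered = boltzmann_entropy(2**10)  # many arrangements -> higher entropy
print(ordered, disordered)
```

More arrangements means higher entropy, which is exactly the "number of ways a system can be arranged" definition above.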

For example, when a perfume bottle is sprayed in one corner of a room, the smell does not simply stay in that corner; the perfume molecules spread out to fill the entire space. The system shifts from order to disorder, spreading in a random manner. A related concept is irreversibility: when you drop a vase and it breaks into pieces, there is simply no way to turn back time and save the vase.

##### ENTROPY AND THE LAWS OF THERMODYNAMICS

The first law of thermodynamics is concerned with the conservation of energy in a process. It says nothing, however, about the direction of a process, such as why heat flows from a hot body to a cool body and never the reverse on its own. For that, there is the second law of thermodynamics.

The second law of thermodynamics states that the entropy (a measure of a system's thermal energy per unit temperature that is unavailable for doing useful work) of an isolated system increases in a natural thermodynamic process. It also implies that heat cannot flow from a body at low temperature to a body at high temperature unless energy is added from outside.
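The second law's preferred direction can be checked numerically using ΔS = Q/T for each body. A small sketch, assuming two reservoirs large enough that their temperatures stay fixed while the heat flows (the 1000 J and the temperatures are made-up example values):

```python
# Total entropy change when heat q (J) flows from a hot reservoir
# at t_hot (K) to a cold reservoir at t_cold (K).
def entropy_change(q, t_hot, t_cold):
    ds_hot = -q / t_hot   # hot body loses heat, its entropy drops
    ds_cold = q / t_cold  # cold body gains the same heat
    return ds_hot + ds_cold

# 1000 J flowing from a 400 K body to a 300 K body:
total = entropy_change(1000, 400, 300)
print(total)  # positive, so the second law allows this direction
```

Running the flow in the opposite direction just flips the sign, giving a negative total entropy change, which is why heat never flows cold-to-hot spontaneously.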

Entropy is associated with randomness and spontaneity. Like internal energy, entropy is a state function. The entropy change of a system depends on the heat transferred and the temperature at which the transfer occurs. In a spontaneous process, the total entropy of an isolated system increases.

Some everyday examples of increasing entropy are ice melting, sugar dissolving, water boiling, and popcorn popping.
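The melting-ice example can be made quantitative with ΔS = Q_rev/T, since melting happens at a constant temperature. A sketch using standard textbook values for water (the 0.1 kg mass is an arbitrary example):

```python
# Entropy change when ice melts at 0 degrees C, using dS = Q_rev / T.
LATENT_HEAT_FUSION = 334_000  # J/kg, latent heat of fusion of water (textbook value)
T_MELT = 273.15               # K, melting point of ice

def melting_entropy(mass_kg):
    """Entropy gained (J/K) when mass_kg of ice melts at 0 degrees C."""
    q = mass_kg * LATENT_HEAT_FUSION  # reversible heat absorbed by the ice
    return q / T_MELT

print(melting_entropy(0.1))  # entropy gain for 0.1 kg of ice, in J/K
```

The result is positive, as expected: the rigid, ordered crystal becomes liquid water whose molecules can be arranged in many more ways.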