Word of the Week

 Entropy 

It's part of our community's name, so why not get into it?

Entropy, in simple terms, is a measure of randomness. Let's understand this with an example: if a jar full of marbles is shaken aggressively and the contents are let out all at once, measuring how randomly the marbles spill out of the jar would give you a sense of entropy. Of course, this is only a loose interpretation of the actual concept.

Let's look at this more technically.

By definition, Entropy is the thermal energy per unit temperature that is unavailable to do useful work. 

The illustration above shows our picture of atoms in a high-entropy state and a low-entropy state, respectively. High entropy signifies greater uncertainty in a system, and vice versa for a low-entropy system.

Entropy is denoted by the letter "S". 
The formula is as follows:
S = k_B ln Ω

Where k_B is the Boltzmann constant, and Ω is the number of microscopic configurations (microstates) of the system.
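Just to make the formula concrete, here's a tiny Python sketch (the function name and structure are our own, not part of any standard library) that computes S = k_B ln Ω for a given number of microstates:

```python
import math

# Boltzmann constant in joules per kelvin (CODATA 2018 value)
K_B = 1.380649e-23

def boltzmann_entropy(omega: int) -> float:
    """Return the entropy S = k_B * ln(Omega) for Omega microstates."""
    if omega < 1:
        raise ValueError("Omega must be a positive number of microstates")
    return K_B * math.log(omega)

# A system with exactly one possible configuration has zero entropy
print(boltzmann_entropy(1))  # 0.0

# More possible configurations means more entropy
print(boltzmann_entropy(10**6) > boltzmann_entropy(10))  # True
```

Notice how slowly the logarithm grows: even a million microstates gives an entropy of only about 1.9 × 10⁻²² J/K, which is why everyday entropy values involve astronomically large Ω.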

Rudolf Clausius developed the concept in the 1850s and coined the term "entropy" in 1865. Entropy became an expression of the Second Law of Thermodynamics: the entropy of an isolated system never decreases, and it increases during any spontaneous, irreversible process.


To conclude, don't forget to use the term Entropy when you feel like you're being chaotic!


