"The future belongs to those who can manipulate entropy; those who understand but energy will be only accountants. The early industrial revolution involved energy, but the automatic factory of the future is an entropy revolution." - Frederic Keffer

Ludwig Boltzmann, a physicist from Vienna, discovered the nature of entropy. His first major contribution to science was a derivation of the ideal gas law Pv = RT from purely statistical arguments - no measurements involved, just free atoms modeled as billiard balls in a container with a statistical distribution of properties. A condensed version of the kinetic-theory argument appears just after this passage. The setup is illustrated in Figure 1.

[Figure 1: Molecular ratchet for work extraction with atom distributions. Left: identical atoms traveling at identical velocities. Right: statistically random atoms at different velocities. Each container drives a ratcheting wheel for work extraction.]

Boltzmann's next discovery was even more earth-shattering: he realized that the statistical distribution of the molecular properties matters. A thought experiment like the one shown in Figure 1 helps. Which molecular ratchet do you think is more capable of extracting work from the atoms in the system? If you chose the one on the left, you are probably correct. Which system do you think is more ordered, with lower entropy? Again, the one on the left is probably the right answer.

This line of thinking led Boltzmann to realize that the microscopic entropy of a system is related to the statistical distribution of particle energies. The more ways an atom, molecule, or particle can occupy a space (different velocities, speeds, vibrations, rotations, etc.), the higher the entropy of the system. This eventually led him to a mathematical expression for the entropy of a statistical system:

S = k log(#ways)

In general, his conclusion for a theoretical or statistical system was that the entropy, or disorder, of the system is a constant multiplied by the logarithm of the probability - the number of ways - that something can occur.

Although heat and temperature were used to derive entropy from a classical perspective, starting from a statistical point of view allows entropy to be applied to many more things than heat and thermodynamics. Boltzmann's equation continues to come up in many unexpected places. Claude Shannon found the same equation when studying the transfer of information during communication. Information entropy is behind the guessing game everyone plays as a kid, where each question narrows down an unknown answer - the classroom example below illustrates the same idea.
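As promised above, here is a condensed sketch of the standard kinetic-theory route to the ideal gas law. This is the textbook billiard-ball argument with intermediate steps compressed, not Boltzmann's original derivation:

```latex
% N billiard-ball atoms of mass m in a container of volume V.
% Averaging the momentum transferred to the walls over the velocity
% distribution gives the pressure:
\[ P = \frac{N m \langle v^2 \rangle}{3V} \]
% Equipartition assigns an average kinetic energy of (3/2)kT per atom:
\[ \tfrac{1}{2} m \langle v^2 \rangle = \tfrac{3}{2} k T \]
% Substituting gives PV = NkT; for one mole (N = N_A, R = N_A k):
\[ P v = R T \]
```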
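And here is a minimal numerical sketch of S = k log(#ways) applied to the two boxes of Figure 1. The atom count and the number of velocity bins are assumptions chosen for illustration, not values from the source; note that the formula conventionally uses the natural logarithm:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_ways: int) -> float:
    """S = k * ln(#ways) for a system whose microstates are equally likely."""
    return k_B * math.log(num_ways)

# Left box of Figure 1: identical atoms at identical velocities.
# There is essentially one way to assign the velocities, so S = 0.
print(boltzmann_entropy(1))        # 0.0 J/K

# Right box: suppose 10 atoms, each free to occupy any of 100
# velocity bins (a hypothetical discretization): 100**10 ways.
print(boltzmann_entropy(100**10))  # ~6.4e-22 J/K
```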
Imagine you are standing outside a classroom filled with students, and you are asked to give the exact position of Student 'B' (a particular student chosen for reference). The conditions are:

1. You cannot see inside the classroom.
2. You have no prior knowledge of which student is sitting where.

You cannot answer the question, which means the entropy is very high. Now suppose you are inside the classroom instead. You can tell exactly which student is sitting where, which means the entropy is zero.
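In Shannon's terms, the two situations correspond directly to maximum and zero information entropy. Here is a minimal sketch; the class size of 30 is an assumption for illustration, not from the source:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 30  # assumed number of seats/students

# Outside the classroom: every seat is equally likely for Student B,
# so the uncertainty is at its maximum.
outside = [1.0 / N] * N
print(shannon_entropy(outside))  # log2(30) ~= 4.91 bits

# Inside the classroom: Student B's seat is known with certainty,
# so there is no remaining uncertainty.
inside = [1.0] + [0.0] * (N - 1)
print(shannon_entropy(inside))   # 0.0 bits
```

Each well-chosen yes-or-no question ("Is Student B in the front half?") removes at most one bit of uncertainty, which is why about five such questions suffice to locate one student among 30 seats.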
