Another way of stating this: start with some precisely testable information, or stated prior data, about a probability distribution. Consider the set of all trial probability distributions that would encode that prior data. According to this principle, the distribution with maximal information entropy is the appropriate choice.
In many practical cases, the testable information or prior data is given by a set of conserved quantities (average values of some moment functions) associated with the probability distribution in question. This is how the maximum entropy principle is most often employed in statistical thermodynamics.
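As a hedged illustration of the principle (the specific numbers below are assumptions, not from the text): maximizing entropy over distributions on the faces of a die, subject only to a prescribed mean, yields an exponential-family distribution p_i ∝ exp(λx_i), with λ chosen to satisfy the mean constraint. A minimal sketch:

```python
import math

# Sketch: maximum-entropy distribution over die faces {1..6} subject to a
# prescribed mean of 4.5 (values chosen for illustration). The constrained
# maximization gives p_i ∝ exp(lam * x_i); lam is fixed by the mean.

xs = [1, 2, 3, 4, 5, 6]
target_mean = 4.5

def mean_for(lam):
    """Mean of the exponential-family distribution with multiplier lam."""
    weights = [math.exp(lam * x) for x in xs]
    z = sum(weights)  # partition function
    return sum(x * w for x, w in zip(xs, weights)) / z

# The mean is monotonically increasing in lam, so bisection suffices.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

weights = [math.exp(lam * x) for x in xs]
z = sum(weights)
p = [w / z for w in weights]
print([round(q, 4) for q in p])  # the max-entropy probabilities
print(round(sum(x * q for x, q in zip(xs, p)), 4))  # matches the mean 4.5
```

Because the target mean exceeds the uniform mean of 3.5, λ comes out positive and the resulting probabilities increase toward the higher faces, while remaining as "spread out" as the constraint allows.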
In information theory, entropy is a measure of the uncertainty in a random variable. In this context, the term usually refers to Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans. Shannon entropy is the average unpredictability of a random variable, which is equivalent to its information content.
A single toss of a fair coin possesses an entropy of one bit. A series of two fair coin tosses possesses an entropy of two bits; more generally, a series of n independent fair tosses has an entropy of n bits. Such a sequence of random selections between two outcomes over time, whether or not the outcomes are equally probable, is known as a Bernoulli process.
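The coin-toss figures above follow directly from the definition H = -Σ p_i log₂ p_i. A short sketch that reproduces them, plus a biased Bernoulli case for contrast:

```python
import math

# Shannon entropy in bits: H = -sum(p_i * log2(p_i)).
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One fair coin toss: two equally likely outcomes -> 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# Two fair tosses: four equally likely outcomes -> 2 bits.
print(shannon_entropy([0.25] * 4))   # 2.0
# A biased coin (Bernoulli with p = 0.9) is more predictable,
# so it carries less than one bit per toss.
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

The biased case shows why entropy measures unpredictability rather than just the number of outcomes: a coin that almost always lands heads conveys far less information per toss than a fair one.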
In classical statistical mechanics, the thermodynamic entropy introduced earlier by Clausius is reinterpreted as a statistical entropy using probability theory. The statistical point of view on entropy was introduced in 1870 with the work of the Austrian physicist Ludwig Boltzmann.
Entropy in Statistical Mechanics
The macroscopic state of a system is described by a distribution over the microstates that are accessible to the system in the course of its thermal fluctuations. Entropy is therefore defined over two different levels of description of the given system.
In Boltzmann's definition, entropy is a measure of the number of possible microscopic states (microstates) of a system in thermodynamic equilibrium, consistent with its macroscopic thermodynamic properties (its macrostate). To understand what macrostates and microstates are, consider the example of a gas in a box.
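The gas-in-a-box example can be sketched numerically (the particle count below is an assumption for illustration). Split the box into left and right halves: a macrostate is "n particles on the left", its microstate count W is the binomial coefficient C(N, n), and Boltzmann's entropy is S = k_B ln W:

```python
import math

# Sketch: microstates vs. macrostates for N distinguishable gas particles
# in a box divided into left/right halves. The macrostate "n_left particles
# on the left" is realized by C(N, n_left) microstates, and Boltzmann's
# entropy is S = k_B * ln(W).

K_B = 1.380649e-23  # Boltzmann constant, in J/K

def multiplicity(n_total, n_left):
    """Number of microstates with exactly n_left particles in the left half."""
    return math.comb(n_total, n_left)

def boltzmann_entropy(w):
    return K_B * math.log(w)

N = 100  # illustrative particle count
for n_left in (0, 25, 50):
    w = multiplicity(N, n_left)
    print(n_left, w, boltzmann_entropy(w))
```

The even split (n_left = 50) is realized by vastly more microstates than any lopsided arrangement, which is why it is the equilibrium macrostate: it is the one of maximal entropy, tying Boltzmann's counting back to the maximum entropy principle above.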