1212.5142 (A. N. Gorban)
A. N. Gorban
The entropy maximum approach (Maxent) was developed as a minimization of the subjective uncertainty measured by the Boltzmann--Gibbs--Shannon entropy. Many new entropies were invented in the second half of the 20th century, and there is now a rich choice of entropies to fit various needs. This diversity of entropies gave rise to a Maxent "anarchism". The Maxent approach is now the conditional maximization of an appropriate entropy for the evaluation of the probability distribution when our information is partial and incomplete. The rich choice of non-classical entropies raises a new problem: which entropy is better for a given class of applications? We understand entropy as a {\em measure of uncertainty which increases in Markov processes.} In this work, we describe the most general ordering of the distribution space with respect to which all continuous-time Markov processes are monotonic (the Markov order). For inference, this approach results in a {\em set} of conditionally "most random" distributions. Each distribution from this set is a maximizer of its own entropy. This "uncertainty of uncertainty" is unavoidable in the analysis of non-equilibrium systems. Surprisingly, a constructive description of this set of maximizers is possible. Two decomposition theorems for Markov processes provide a tool for this description.
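The following is a minimal numerical sketch, not taken from the paper, of the point that conditional maximizers of different entropies generally disagree: under the same moment constraint, the Boltzmann--Gibbs--Shannon maximizer and a Tsallis-entropy maximizer are different distributions. The toy 3-state variable, the target mean, and all function names are illustrative assumptions.

```python
# Illustrative sketch: two entropies, one constraint, two different "most random" distributions.
import numpy as np
from scipy.optimize import minimize

states = np.array([0.0, 1.0, 2.0])   # values of a hypothetical 3-state random variable
target_mean = 0.7                     # the assumed partial information (a mean constraint)

def shannon(p):
    p = np.clip(p, 1e-12, None)
    return -np.sum(p * np.log(p))     # Boltzmann--Gibbs--Shannon entropy

def tsallis(p, q=2.0):
    return (1.0 - np.sum(p**q)) / (q - 1.0)   # Tsallis entropy with q = 2

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},           # normalization
    {"type": "eq", "fun": lambda p: p @ states - target_mean},  # mean constraint
]
bounds = [(0.0, 1.0)] * 3
p0 = np.full(3, 1.0 / 3.0)            # start from the uniform distribution

def conditional_maximizer(entropy):
    # Maximize the entropy subject to the constraints (minimize its negative).
    res = minimize(lambda p: -entropy(p), p0, bounds=bounds,
                   constraints=constraints, method="SLSQP")
    return res.x

print("Shannon maximizer:", np.round(conditional_maximizer(shannon), 4))
print("Tsallis maximizer:", np.round(conditional_maximizer(tsallis), 4))
# The two conditional maximizers differ, which is the practical face of the
# "which entropy?" question discussed in the abstract.
```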
View original:
http://arxiv.org/abs/1212.5142