unified $(r,s)$-entropy measure; order statistics; Shannon entropy; logistic distribution.
K. M. Wong and S. Chen analyzed the Shannon entropy of a sequence of random variables under order restrictions. Using the $(r,s)$-entropies of I. J. Taneja, these results are generalized. Upper and lower bounds on the entropy reduction obtained when the sequence is ordered are derived, together with conditions under which they are achieved. Theorems are presented relating the average entropy of the individual order statistics to the entropy of a member of the original independent identically distributed (i.i.d.) population. Finally, the entropies of the individual order statistics are studied when the probability density function (p.d.f.) of the original i.i.d. sequence is symmetric about its mean.
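As a numerical illustration (not part of the original results), the relation between the parent entropy and the average entropy of the individual order statistics can be checked for the standard logistic distribution. The sketch below uses the standard density $f_{k:n}(x) = k\binom{n}{k}F^{k-1}(x)(1-F(x))^{n-k}f(x)$ of the $k$-th order statistic and plain trapezoidal integration; the sample size $n = 5$ and the integration grid are illustrative choices.

```python
import math

def logistic_pdf(x):
    """p.d.f. of the standard logistic distribution."""
    return math.exp(-x) / (1.0 + math.exp(-x)) ** 2

def logistic_cdf(x):
    """c.d.f. of the standard logistic distribution."""
    return 1.0 / (1.0 + math.exp(-x))

def order_stat_pdf(x, k, n):
    """p.d.f. of the k-th order statistic of an i.i.d. sample of size n."""
    F, f = logistic_cdf(x), logistic_pdf(x)
    return k * math.comb(n, k) * F ** (k - 1) * (1.0 - F) ** (n - k) * f

def diff_entropy(pdf, lo=-30.0, hi=30.0, steps=20000):
    """Differential entropy -integral of p*log(p) (in nats), trapezoidal rule."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        p = pdf(lo + i * h)
        term = -p * math.log(p) if p > 0.0 else 0.0
        total += term if 0 < i < steps else 0.5 * term
    return total * h

n = 5
H_parent = diff_entropy(logistic_pdf)  # the exact value for the logistic is 2 nats
H_order = [diff_entropy(lambda x, k=k: order_stat_pdf(x, k, n))
           for k in range(1, n + 1)]
H_avg = sum(H_order) / n  # average entropy over the n individual order statistics
print(f"H(X) = {H_parent:.4f},  mean H(X_(k)) = {H_avg:.4f}")
```

The printed average falls below the parent entropy, in line with the Wong–Chen result that ordering reduces the average entropy of the individual order statistics.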
 S. Arimoto: Information-theoretical considerations on estimation problems
. Information and Control 19 (1971), 181–194. MR 0309224
 N. Balakrishnan and A. C. Cohen: Order Statistics and Inference: Estimation Methods
. Academic Press, 1991. MR 1084812
 I. S. Gradshteyn and I. M. Ryzhik: Table of integrals, series and products
. Academic Press, 1980. MR 1398882
 J. Havrda and F. Charvát: Quantification method of classification processes: concept of structural $\alpha $-entropy
. Kybernetika 3 (1967), 30–35. MR 0209067
 A. Rényi: On measures of entropy and information
. Proc. 4th Berkeley Symp. Math. Statist. and Prob. 1 (1961), 547–561. MR 0132570
| Zbl 0106.33001
 C. E. Shannon: A mathematical theory of communication
. Bell Syst. Tech. J. 27 (1948), 379–423. MR 0026286
 B. D. Sharma and D. P. Mittal: New nonadditive measures of entropy for discrete probability distributions
. J. Math. Sci. 10 (1975), 28–40. MR 0539493
 I. J. Taneja: On generalized information measures and their applications. Adv. Electronics and Electron Phys. 76 (1989), 327–413.
 K. M. Wong and S. Chen: The entropy of ordered sequences and order statistics
. IEEE Transactions on Information Theory 36(2) (1990), 276–284. MR 1052779