Abstract
The significance of quantile-based entropies lies in their ability to capture characteristics that the distribution-function approach misses. This article examines the quantile-based Rényi entropy and its time-dependent versions for record statistics. The additional parameter \(\mu\) in this entropy permits a broader interpretation, with Shannon entropy as a special case, allowing the entropy measure to be tailored to the precise properties of the data under study. The study investigates these entropies across various generalized models that lack a closed-form probability density function or cumulative distribution function, and further introduces an alternative representation of these entropy quantities in terms of the expected values of other random variables. This, in turn, leads to a bound on the quantile Rényi residual entropy of the \(n^{th}\) record in terms of the mode of the truncated Gamma distribution. Additionally, the study presents a unique characterization result for Rényi entropy using its residual form and the hazard quantile function.
1 Introduction
A key aspect of statistical research is the evaluation of record values; the groundwork was laid by Chandler (1952) and developed further by Nagaraja (1977, 1978, 1982, 1988) and Arnold et al. (1992). Consider a sequence of independent and identically distributed (i.i.d.) random variables with cumulative distribution function (cdf) G(y), survival function \(\overline{G}(y) = 1-G(y)\), and probability density function (pdf) g(y). Then \(R_{n}\), the \(n^{th}\) upper record value, is defined as the \(n^{th}\) largest value in the sequence that has not been exceeded before. The pdf of \(R_{n}\) can be expressed as follows:
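In the standard notation of Arnold et al. (1992), this density reads:

```latex
g_{R_{n}}(y) \;=\; \frac{\left[-\ln \overline{G}(y)\right]^{n-1}}{\Gamma(n)}\, g(y), \qquad y > 0, \tag{1.1}
```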
with the gamma function denoted by \(\Gamma (\cdot )\). We emphasize upper record statistics here, as these have traditionally been the focus of most of the literature. In practical settings, however, lower records also serve as useful metrics, for instance in measuring temperature lows or race durations; simple modifications handle these lower records.
The utility of record statistics extends to several fields, including sports events, quality control, finance, and risk management, where they can help identify outliers and unusual patterns in order to analyze extreme events. Leading researchers, such as Arnold et al. (1992); Ahsanullah (2004); David and Nagaraja (2003) have provided valuable insights into the behavior of the tails of the distributions.
In addition, entropy, a fundamental concept introduced by Shannon (1948) in information theory, provides a quantitative measure of the uncertainty and randomness of a system. This concept is particularly helpful in the analysis, modeling, and optimization of complex systems, see Cover (1999); Kumar et al. (2024) and the cited references for additional insights.
Further enhancing the value of entropy in statistical analysis, Rényi (1961) proposed an additive generalization of order \(\mu\) of Shannon entropy for a non-negative absolutely continuous random variable Y with pdf g(y) as given by
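For \(\mu > 0\), \(\mu \ne 1\), this generalization takes the familiar form

```latex
\phi_{\mu}(Y) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{\infty} g^{\mu}(y)\,\textrm{d}y\right], \qquad \mu > 0,\ \mu \neq 1. \tag{1.2}
```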
The parameter \(\mu\) controls the sensitivity of the entropy to different parts of the distribution. Shannon entropy, \(-\int _{0}^{\infty }g(y)\ln (g(y))dy\), is recovered as \(\mu \rightarrow 1\) in \(\phi _{\mu }(Y)\).
In reliability and life-testing studies, dynamic measures of entropy must also be considered, especially when assessing the age of a component or system, where an accurate picture of the remaining uncertainty supports well-informed decisions. Abraham and Sankaran (2006) extended Rényi entropy of order \(\mu\) to the residual lifetime \(Y_{t} = [Y-t|Y>t]\), defined below, making it a valuable tool for studying dynamic measures of entropy.
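In the form given by Abraham and Sankaran (2006), the residual Rényi entropy is

```latex
\phi_{\mu}(Y; t) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\int_{t}^{\infty} \left(\frac{g(y)}{\overline{G}(t)}\right)^{\mu} \textrm{d}y\right], \qquad t \ge 0. \tag{1.3}
```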
For more details related to dynamic forms of this generalized entropy, refer to Asadi et al. (2005). Rényi entropy for past lifetime \(_{t}Y = [t-Y|Y\le t]\) was extensively studied by Gupta and Nanda (2002) and is given as
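Replacing the residual lifetime by the inactivity time gives the past version,

```latex
\overline{\phi}_{\mu}(Y; t) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{t} \left(\frac{g(y)}{G(t)}\right)^{\mu} \textrm{d}y\right], \qquad t > 0. \tag{1.4}
```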
Analogous to (1.2), Rényi entropy for the \(n^{th}\) upper record is given by
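Substituting the record density (1.1) into the Rényi functional gives

```latex
\phi_{\mu}(R_{n}) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{\infty} \frac{\left[-\ln \overline{G}(y)\right]^{(n-1)\mu}}{(\Gamma(n))^{\mu}}\, g^{\mu}(y)\,\textrm{d}y\right]. \tag{1.5}
```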
Several authors namely Nanda and Maiti (2007); Zarezadeh and Asadi (2010) and many more have extensively researched dynamic forms of Rényi entropy and their properties.
A probability model can be specified equally well through a distribution function or a quantile function, but the two tools serve different purposes. Quantile functions provide summary measures of a distribution that are less susceptible to extreme values than measures such as the mean or standard deviation. They are often used to construct confidence intervals and to conduct hypothesis tests when the null hypothesis does not specify a particular distribution. In several models, such as the power-Pareto, the Govindarajulu, and the various lambda families of distributions, where the distribution function is unknown or not easily expressed in closed form, traditional approaches may be insufficient or overly complicated. In such cases, quantile-based tools provide a flexible and robust way of summarizing a dataset's distribution without relying on particular distributional assumptions; refer to Gilchrist (2000) and Nair et al. (2013). This alternative approach, via the quantile function (QF), is defined by
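Formally, the quantile function is

```latex
Q(p) \;=\; G^{-1}(p) \;=\; \inf\{\, y : G(y) \ge p \,\}, \qquad 0 \le p \le 1, \tag{1.6}
```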
where G(y) is the continuous cdf of a non-negative random variable Y and \(q(p) = \frac{d}{dp}(Q(p))\) is the quantile density function.
The Shannon quantile entropy and its residual form were studied by Sunoj and Sankaran (2012). The quantile interpretation of Shannon’s past entropy was investigated by Sunoj et al. (2013). Nanda et al. (2014) proposed Rényi residual entropy in quantile form, defined as
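In the notation of Nanda et al. (2014), this quantile-based residual Rényi entropy is

```latex
\Phi_{\mu}(p) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\frac{1}{(1-p)^{\mu}}\int_{p}^{1} \left(q(u)\right)^{1-\mu} \textrm{d}u\right]. \tag{1.7}
```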
When \(\mu \rightarrow 1\), the measure (1.7) reduces to
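Passing to the limit inside (1.7) yields

```latex
\Phi(p) \;=\; \ln(1-p) \;+\; \frac{1}{1-p}\int_{p}^{1} \ln q(u)\,\textrm{d}u, \tag{1.8}
```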
which is a quantile-based residual form of Shannon entropy.
The study of records in domains such as climatology, sports, medicine, traffic, and industry is of great significance, as it offers a natural and comprehensive account of scientific and technological progress and enables us to examine the evolution of human achievements in different fields of activity. A significant amount of record data has accordingly been collected over time, and numerous mathematical models, such as the Weibull, Gumbel, and generalized extreme value distributions, have been developed to capture the nature of the underlying record processes and to project future statistics and probabilities of record-related events. At the same time, researchers have recently become interested in quantile-based information quantities as a more natural alternative to the routine distribution-function approach, since they can resolve problems that are intractable in the latter approach by focusing on quantile-based characteristics of the underlying distribution, which are more precise and less susceptible to the influence of outliers. One intriguing open question is whether the volume of information in a stream of record values arising from a sequence of i.i.d. random variables can be quantified by means of residual or past forms of various information measures and their quantile versions. The quantile-based Shannon entropy for records and its properties have been studied by Kumar and Dangi (2023). The quantile-based Rényi entropy measure, however, appears more promising, as it generalizes the Shannon quantile entropy and yields more accurate and robust results while characterizing the underlying distribution uniquely.
Considering the importance of Rényi entropy and its measures, quantile functions, and record statistics, we are greatly motivated to investigate the quantile variant of the Rényi entropy measure for record statistics.
This communication analyzes record statistics using the quantile variant of the Rényi entropy measure. We study its application to a variety of models with simple or tractable quantile functions or quantile density functions but no closed-form expression for the pdf or cdf. Section 2 introduces the quantile variant of the Rényi entropy of the \(n^{th}\) upper record, which is then investigated for different extended models. The dynamic versions (past and residual) of the Rényi quantile entropy of the \(n^{th}\) upper record are illustrated in Sect. 3, along with an alternative expression for them in terms of truncated Gamma distribution expectations. In addition, an important result on the bound of the residual version of this entropy, based on the mode of the truncated Gamma distribution, is derived. In Sect. 4, we provide a characterization result on uniqueness, followed by the concluding remarks of the paper in the last section.
2 Quantile-based Rényi entropy of record statistics
Using Eq. (1.6), when the function G is continuous, it follows that \(G(Q(p)) = p\). Taking the derivative with respect to p gives \(q(p)g(Q(p)) = 1\). These results allow us to express the pdf of the \(n^{th}\) upper record as defined in Eq. (1.1) in terms of quantile as below
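In quantile terms, the record density becomes:

```latex
g_{R_{n}}(Q(p)) \;=\; \frac{\left(-\ln(1-p)\right)^{n-1}}{\Gamma(n)\, q(p)}. \tag{2.1}
```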
Similar to Eq. (1.2), the quantile-based Rényi entropy of \(n^{th}\) upper record value can be defined as
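Under the change of variable \(y = Q(p)\), this is

```latex
\Phi_{\mu}(R_{n}) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{1} \left(\frac{\left(-\ln(1-p)\right)^{n-1}}{\Gamma(n)\,q(p)}\right)^{\!\mu} q(p)\,\textrm{d}p\right] \tag{2.2}
\;=\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{1} \frac{\left(-\ln(1-p)\right)^{(n-1)\mu}}{(\Gamma(n))^{\mu}}\,\left(q(p)\right)^{1-\mu} \textrm{d}p\right]. \tag{2.3}
```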
When \(n=1\), Eq. (2.3) simplifies to the quantile Rényi entropy of the parent distribution, and as \(\mu\) approaches 1, Eq. (2.3) reduces to the quantile-based Shannon entropy of the \(n^{th}\) upper record.
There may be probability models in real-world situations without a closed-form distribution function, but the quantile function still remains available. Based on this framework, we take into consideration the following model and get the quantile-based Rényi entropy \(\Phi _{\mu }(R_n)\) of the \(n^{th}\) upper record value, where q(.) exists.
2.1 Rényi quantile entropy of nth record for a generalized model
Consider an i.i.d. random variable Y with a quantile density function given as
where \(\eta\), \(\delta\) and \(\nu\) are real valued parameters in this model. Then using Eq. (2.2), Rényi quantile entropy of \(n^{th}\) upper record is obtained as
Substituting different parameter values into Eq. (2.4) yields diverse lifetime distributions; Table 1 summarizes the corresponding Rényi quantile entropies of the \(n^{th}\) record value. In practical applications, quantiles can be employed to determine the values of these parameters. Cook (2010) developed a set of algorithms for determining the parameters of commonly used distributions from quantiles, accompanied by a software tool called ParameterSolver, a valuable resource for researchers and practitioners who require reliable and efficient computation of these parameters.
In certain situations, the quantile-function approach can prove more effective than the distribution-function approach, as it is less affected by extreme observations. Even though certain models have no closed-form expressions for the pdf or cdf, they do possess simple quantile or quantile density functions; for such models, see Hankin and Lee (2006); van Staden and Loots (2009) and Nair et al. (2013). Here, we present some distributions for which \(q(\cdot )\) exists and obtain \(\Phi _{\mu }(R_n)\).
Example 2.1
Consider the distribution proposed by Govindarajulu which does not have any closed-form formulations for its density or distribution functions. However, its quantile and quantile density functions are given respectively by \(Q(p) = \theta + \sigma \{(\lambda +1)p^\lambda -\lambda p^{\lambda +1}\}\) and \(q(p) = \sigma \lambda (\lambda +1)(1-p)p^{\lambda -1}; \theta , \sigma , \lambda >0.\) Then Rényi quantile entropy of \(n^{th}\) record for this distribution is derived as
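Substituting this \(q(p)\) into Eq. (2.3) gives

```latex
\Phi_{\mu}(R_{n}) \;=\; \ln\!\big(\sigma\lambda(\lambda+1)\big) \;+\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{1} \frac{\left(-\ln(1-p)\right)^{(n-1)\mu}}{(\Gamma(n))^{\mu}}\, \left((1-p)\,p^{\lambda-1}\right)^{1-\mu} \textrm{d}p\right],
```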
which is readily tractable for further numerical approximation.
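Although the integral has no elementary closed form, it is straightforward to evaluate numerically. The following sketch (function names are ours, and the parameter choices \(\sigma =1\), \(\lambda =2\), \(n=2\), \(\mu =0.5\) are illustrative) applies a midpoint rule to the quantile-entropy integral:

```python
import math

# Govindarajulu quantile density: q(p) = sigma*lam*(lam+1)*(1-p)*p**(lam-1)
def q_gov(p, sigma=1.0, lam=2.0):
    return sigma * lam * (lam + 1.0) * (1.0 - p) * p ** (lam - 1.0)

# Midpoint-rule evaluation of the record-entropy integral of Eq. (2.3)
def phi_record(n, mu, q, grid=200_000):
    total = 0.0
    for i in range(grid):
        p = (i + 0.5) / grid  # midpoints avoid the endpoint singularities
        total += (-math.log(1.0 - p)) ** ((n - 1) * mu) * q(p) ** (1.0 - mu)
    total /= grid * math.gamma(n) ** mu
    return math.log(total) / (1.0 - mu)

phi = phi_record(n=2, mu=0.5, q=q_gov)
print(round(phi, 3))
```

The midpoint rule is chosen because the integrand can be singular at the endpoints for some parameter values; for \(0<\mu <1\) the integral is finite here.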
Example 2.2
Consider the Davis Distribution, a lambda family of distributions that was proposed by Hankin and Lee (2006) with quantile density function
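For the power-Pareto form \(Q(p) = C\,p^{\lambda_1}(1-p)^{-\lambda_2}\) of Hankin and Lee (2006), differentiation gives the quantile density

```latex
q(p) \;=\; C\, p^{\lambda_1 - 1} (1-p)^{-\lambda_2 - 1} \left\{\lambda_1 (1-p) + \lambda_2\, p\right\}, \qquad C>0,\ \lambda_1, \lambda_2 \ge 0.
```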
This family of distributions offers a decent approximation to the Weibull, Gamma, Exponential, and Lognormal distributions and is best fitted to the right-skewed and non-negative data. Next, for this distribution, the Rényi quantile entropy of \(n^{th}\) record is determined as
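With this \(q(p)\), substitution into Eq. (2.3) yields

```latex
\Phi_{\mu}(R_{n}) \;=\; \ln C \;+\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{1} \frac{\left(-\ln(1-p)\right)^{(n-1)\mu}}{(\Gamma(n))^{\mu}} \left(p^{\lambda_1 - 1}(1-p)^{-\lambda_2 - 1}\{\lambda_1(1-p)+\lambda_2 p\}\right)^{1-\mu} \textrm{d}p\right], \tag{2.7}
```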
which can be easily estimated numerically. When \(\lambda _1 = \lambda _2 >0\), Eq. (2.7) corresponds to the log-logistic distribution. As \(\lambda _{1} \rightarrow 0\) and \(\lambda _{2} \rightarrow 0\), Eq. (2.7) reduces to that of the Pareto I distribution and the power distribution, respectively.
Example 2.3
Take into consideration, respectively, the quantile function and quantile density function of the van Staden and Loots (2009) distribution as
Then using Eqs. (2.3) and (2.8), the quantile variant of Rényi’s entropy of \(n^{th}\) record for van Staden-Loots distribution is calculated as
which can be further estimated numerically. As \(\lambda _{1}\longrightarrow 0,\) Eq. (2.9) reduces to that of the exponential distribution; as \(\lambda _{4}\longrightarrow 1,\) Eq. (2.9) reduces to \(\ln \lambda _2 + \frac{1}{1-\mu }\ln \left[ \frac{\Gamma ((n-1)\mu + 1)}{(\Gamma (n))^{\mu }}\right] ,\) corresponding to the uniform distribution; and when \(\lambda _2 = 2, \lambda _3 = 1/2, \lambda _{4}=0,\) Eq. (2.9) reduces to \(\frac{1}{1-\mu }\ln \left[ \int _{0}^{1} \frac{(-\ln (1-p))^{(n-1)\mu }}{(p(1-p))^{1-\mu }(\Gamma (n))^{\mu }} \textrm{d}p \right] ,\) coinciding with the logistic distribution with parameter 1.
Example 2.4
Consider the quantile density function of the Generalized Lambda distribution as defined by
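In the Ramberg–Schmeiser parameterization, \(Q(p) = \lambda_1 + \left[p^{\lambda_3} - (1-p)^{\lambda_4}\right]/\lambda_2\), so that

```latex
q(p) \;=\; \frac{\lambda_3\, p^{\lambda_3 - 1} + \lambda_4\, (1-p)^{\lambda_4 - 1}}{\lambda_2}, \qquad 0< p< 1. \tag{2.10}
```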
Then using equations (2.3) and (2.10), we calculate Rényi quantile entropy of \(n^{th}\) record as
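Substitution of (2.10) into (2.3) gives

```latex
\Phi_{\mu}(R_{n}) \;=\; -\ln \lambda_2 \;+\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{1} \frac{\left(-\ln(1-p)\right)^{(n-1)\mu}}{(\Gamma(n))^{\mu}} \left(\lambda_3\, p^{\lambda_3 - 1} + \lambda_4\, (1-p)^{\lambda_4 - 1}\right)^{1-\mu} \textrm{d}p\right], \tag{2.11}
```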
which can be further estimated numerically. When \(\lambda _{2} = \lambda _{3} = \lambda _{4} = \lambda\), Eq. (2.11) reduces to \(-\frac{\mu }{1-\mu }\ln \Gamma (n) + \frac{1}{1-\mu }\ln \left[ \int _{0}^{1}(-\ln (1-p))^{(n-1)\mu } \left\{ p^{\lambda - 1}+(1-p)^{\lambda -1}\right\} ^{1-\mu } \textrm{d}p\right] ,\) that of the lambda distribution. As \(\lambda \longrightarrow 1\) and \(\lambda \longrightarrow 0\), it further reduces to \(\ln 2 + \frac{1}{1-\mu }\ln \left[ \frac{\Gamma ((n-1)\mu + 1)}{(\Gamma (n))^{\mu }} \right] ,\) that of the uniform distribution on \([-1,1]\), and \(-\frac{\mu }{1-\mu }\ln \Gamma (n) + \frac{1}{1-\mu }\ln \left[ \int _{0}^{1}\frac{(-\ln (1-p))^{(n-1)\mu }}{ (p(1-p))^{1-\mu }} \textrm{d}p\right] ,\) that of the logistic distribution with parameter 1, respectively.
Example 2.5
Consider the Lambda family of distribution with five parameters as developed by Gilchrist (2000) with quantile density function given as
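A form consistent with the reductions below (cf. Gilchrist 2000) is

```latex
q(p) \;=\; \frac{\lambda_2}{2}\left[(1-\lambda_3)\, p^{\lambda_4 - 1} + (1+\lambda_3)\, (1-p)^{\lambda_5 - 1}\right], \qquad 0<p<1. \tag{2.12}
```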
Tarsitano (2005) offered very close approximations to a range of symmetric and asymmetric distributions using this model. Furthermore, he advocated for using this model in cases where a specific distributional form cannot be inferred from the prevailing physical context. Then Rényi quantile entropy of \(n^{th}\) record for this family can be derived as
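Substituting (2.12) into (2.3) gives

```latex
\Phi_{\mu}(R_{n}) \;=\; \ln\frac{\lambda_2}{2} \;+\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{1} \frac{\left(-\ln(1-p)\right)^{(n-1)\mu}}{(\Gamma(n))^{\mu}} \left((1-\lambda_3)\, p^{\lambda_4 - 1} + (1+\lambda_3)(1-p)^{\lambda_5 - 1}\right)^{1-\mu} \textrm{d}p\right], \tag{2.13}
```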
which can be further estimated numerically. As \(\lambda _3\longrightarrow 0\), this family corresponds to the generalized Tukey lambda family of distributions, with Eq. (2.13) reduced to \(\ln \frac{\lambda _{2}}{2} -\frac{\mu }{1-\mu }\ln \Gamma (n)+\frac{1}{1-\mu }\ln \left[ \int _{0}^{1}(-\ln (1-p))^{(n-1)\mu } \{ p^{\lambda _4-1} +(1-p)^{\lambda _{5}-1} \}^{1-\mu } \textrm{d}p\right]\). This family also includes the exponential distribution when \(\lambda _4\longrightarrow \infty , \lambda _5 \longrightarrow 0\), the generalized Pareto distribution when \(\lambda _4\longrightarrow \infty\) and \(|\lambda _5| < \infty\), and the power distribution when \(\lambda _5\longrightarrow \infty\) and \(|\lambda _4| < \infty\).
The expression obtained in this section for the Rényi quantile entropy of the \(n^{th}\) upper record value is in terms of the random variable's quantile density function. In the analysis that follows, we present an alternative expression based on the expectation of another random variable. This approach provides a different perspective on quantile entropy calculation and offers further insight into the behavior of the distribution.
Theorem 2.1
Assume a sequence \(\{Y_i: i\ge 1\}\) of i.i.d. continuous random variables with Q(p) and q(p) as quantile function and quantile density function respectively with Rényi entropy \(\Phi _{\mu }(\cdot ) < \infty\). Then for all \(i\ge 1\), an alternate formulation for Rényi’s quantile entropy of \(n^{th}\) upper record, \(R_n\), can be derived as
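In expectation form, the result reads

```latex
\Phi_{\mu}(R_{n}) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\frac{\Gamma\big((n-1)\mu + 1\big)}{(\Gamma(n))^{\mu}}\; E\!\left\{\Big(q\big(1 - e^{-R_{n}^{*}}\big)\Big)^{1-\mu}\right\}\right],
```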
where E denotes expectation and \(R_{n}^{*}\) follows the gamma distribution with shape parameter \((n-1)\mu +1\) and scale parameter 1.
Proof
The gamma distribution’s quantile version of the pdf, with parameters \((n-1)\mu +1\) and 1 is \(\frac{(1-p)(-\ln (1-p))^{(n-1)\mu }}{\Gamma ((n-1)\mu + 1)}\) and hence
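Recognizing this gamma weight inside the entropy integral,

```latex
\int_{0}^{1} \left(-\ln(1-p)\right)^{(n-1)\mu} \left(q(p)\right)^{1-\mu} \textrm{d}p \;=\; \Gamma\big((n-1)\mu+1\big)\; E\!\left\{\Big(q\big(1-e^{-R_{n}^{*}}\big)\Big)^{1-\mu}\right\}.
```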
Substituting these values in Eq. (2.2), we obtain the needed outcome. \(\square\)
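Theorem 2.1 is easy to verify numerically. The sketch below (our own helper names; the standard exponential distribution, \(q(p)=1/(1-p)\), is assumed for illustration) compares direct midpoint-rule integration of the entropy with the Monte Carlo expectation form:

```python
import math
import random

# Rényi quantile entropy of the n-th upper record, Eq. (2.3), by midpoint rule
def phi_direct(n, mu, q, grid=400_000):
    total = 0.0
    for i in range(grid):
        p = (i + 0.5) / grid
        total += (-math.log(1.0 - p)) ** ((n - 1) * mu) * q(p) ** (1.0 - mu)
    total /= grid * math.gamma(n) ** mu
    return math.log(total) / (1.0 - mu)

# Alternate form of Theorem 2.1: expectation over R* ~ Gamma((n-1)*mu + 1, 1)
def phi_expectation(n, mu, q, draws=200_000, seed=0):
    rng = random.Random(seed)
    shape = (n - 1) * mu + 1.0
    mean = sum(q(1.0 - math.exp(-rng.gammavariate(shape, 1.0))) ** (1.0 - mu)
               for _ in range(draws)) / draws
    return math.log(math.gamma(shape) * mean / math.gamma(n) ** mu) / (1.0 - mu)

q_exp = lambda p: 1.0 / (1.0 - p)  # exponential(1): q(p) = 1/(1-p)
a = phi_direct(3, 0.7, q_exp)
b = phi_expectation(3, 0.7, q_exp)
print(a, b)  # the two forms should agree closely
```

For the exponential case the integral is also available in closed form, \(\Gamma(2.4)/0.7^{2.4}\) here, which makes the agreement easy to confirm.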
3 Quantile-based dynamic Rényi entropy of record statistics
The concept of dynamic quantile entropy bridges the gap between traditional entropy measures and the dynamic behavior of time-varying data. By considering quantiles and their evolution, it offers a nuanced and contextually relevant understanding of how data distributions change over time, leading to improved modeling, prediction, and decision-making.
Time plays a pivotal role in studying quantile entropies, as it introduces the temporal dimension into the analysis of information complexity and uncertainty. Its role is to capture the changing distributional characteristics of data over different quantiles and time intervals, providing insight into evolving uncertainty, risks, and patterns, and making quantile entropies an essential tool across fields involving time-varying data. Consequently, the exploration of dynamic variants of these statistical quantities holds increasing importance and relevance. Numerous researchers have studied the significance of residual entropy as an uncertainty measure in order and record statistics; see, for example, Zarezadeh and Asadi (2010); Baratpour et al. (2007a, 2007b); Sunoj et al. (2013), and Kumar (2015, 2016). The quantile-based Rényi residual entropy of the \(n^{th}\) record is expressed as
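mirroring (1.7), with the record density and record survival function in quantile form:

```latex
\Phi_{\mu}(R_{n}; p) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\int_{p}^{1} \left(\frac{\left(-\ln(1-u)\right)^{n-1}}{\Gamma(n)\,q(u)\,\overline{H}_{n}(p)}\right)^{\!\mu} q(u)\,\textrm{d}u\right], \tag{3.1}
```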
where \(\overline{H}_n(p) = \frac{\Gamma (n; -\ln (1-p))}{\Gamma (n)}\) is the survival function of the \(n^{th}\) upper record in terms of the quantile and \(\Gamma (n;-\ln (1-p))\) denotes the truncated upper gamma function. It can also be written as
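Expanding the logarithm gives

```latex
\Phi_{\mu}(R_{n}; p) \;=\; \frac{-\mu}{1-\mu}\,\ln \Gamma\big(n; -\ln(1-p)\big) \;+\; \frac{1}{1-\mu}\,\ln\!\left[\int_{p}^{1} \left(-\ln(1-u)\right)^{(n-1)\mu} \left(q(u)\right)^{1-\mu} \textrm{d}u\right]. \tag{3.2}
```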
Table 2 presents a summary of the quantile-based Rényi residual entropy of the \(n^{th}\) record for various lifetime distributions, utilizing the generalized model detailed in Sect. 2.1 with Eq. (3.2) and incorporating a range of parameter values.
It is noteworthy that in certain practical scenarios, uncertainty may be associated with the past lifetime rather than with future events. Accordingly, we also examine the \(n^{th}\) record's Rényi quantile entropy for the past lifetime (inactivity time). The quantile-based Rényi past entropy of the \(n^{th}\) record value is
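writing \(\overline{\Phi}_{\mu}\) for the past version,

```latex
\overline{\Phi}_{\mu}(R_{n}; p) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{p} \left(\frac{\left(-\ln(1-u)\right)^{n-1}}{\Gamma(n)\,q(u)\,H_{n}(p)}\right)^{\!\mu} q(u)\,\textrm{d}u\right], \tag{3.3}
```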
where \(H_n(p) = \frac{\gamma (n; -\ln (1-p))}{\Gamma (n)}\) is the distribution function of the \(n^{th}\) upper record in terms of the quantile and \(\gamma (n;-\ln (1-p))\) denotes the truncated lower gamma function. As with the Rényi residual quantile entropy of the \(n^{th}\) upper record, it can be simplified to yield
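namely

```latex
\overline{\Phi}_{\mu}(R_{n}; p) \;=\; \frac{-\mu}{1-\mu}\,\ln \gamma\big(n; -\ln(1-p)\big) \;+\; \frac{1}{1-\mu}\,\ln\!\left[\int_{0}^{p} \left(-\ln(1-u)\right)^{(n-1)\mu} \left(q(u)\right)^{1-\mu} \textrm{d}u\right]. \tag{3.4}
```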
Remark 3.1
When \(n=1\), Eqs. (3.2) and (3.4) correspond to the Rényi quantile entropy for residual and inactive lifetimes of underlying distribution, respectively. Moreover, as \(p \rightarrow 0\) and \(p \rightarrow 1\) in Eqs. (3.2) and (3.4) respectively, they both correspond to the Rényi quantile entropy of \(n^{th}\) record of the underlying distribution.
In our upcoming analysis, we will delve into the dynamic forms of quantile-based Rényi entropy of the \(n^{th}\) upper record values, expressed in relation to the expectation of the truncated Gamma distribution. This investigation is poised to yield essential insights regarding the boundaries of these entropies with respect to the mode of the truncated Gamma distribution. This could further lead to a better understanding of the behavior of these entropies under certain conditions and potentially inform decision-making processes in relevant industries and fields.
Theorem 3.1
Assume a sequence \(Y_{i}, i \ge 1\) of i.i.d. continuous random variables with Q(p) and q(p) as the quantile function and quantile density function, respectively. Then the quantile-based Rényi residual entropy of the \(n^{th}\) upper record value, \(R_{n}\), can be expressed as
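in the expectation form

```latex
\Phi_{\mu}(R_{n}; p) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\frac{\Gamma\big((n-1)\mu+1;\, z\big)}{\big(\Gamma(n;\, z)\big)^{\mu}}\; E\!\left\{\Big(q\big(1-e^{-S_{z}}\big)\Big)^{1-\mu}\right\}\right], \tag{3.5}
```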
where \(z = -\ln (1-p)\) and \(S_{z} \sim \Gamma (\mu (n-1) +1; -\ln (1-p))\).
Proof
Rewriting Eq. (3.2) as
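we have

```latex
\Phi_{\mu}(R_{n}; p) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\int_{p}^{1} \left(-\ln(1-u)\right)^{(n-1)\mu}\left(q(u)\right)^{1-\mu}\textrm{d}u\right] \;-\; \frac{\mu}{1-\mu}\,\ln \Gamma\big(n; -\ln(1-p)\big). \tag{3.6}
```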
The pdf of truncated upper gamma distribution with parameter \((n-1)\mu + 1\) in quantile form is \(\frac{(1-p)(-\ln (1-p))^{(n-1)\mu }}{\Gamma (\mu (n-1) +1; -\ln (1-p))}\) and hence
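the integral can be expressed as the truncated-gamma expectation

```latex
\int_{p}^{1} \left(-\ln(1-u)\right)^{(n-1)\mu} \left(q(u)\right)^{1-\mu} \textrm{d}u \;=\; \Gamma\big((n-1)\mu+1;\, z\big)\; E\!\left\{\Big(q\big(1-e^{-S_{z}}\big)\Big)^{1-\mu}\right\}.
```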
Substituting this value in Eq. (3.6), we get the desired result. \(\square\)
Theorem 3.2
Under the presumptions of theorem 3.1, the quantile-based Rényi past entropy of the \(n^{th}\) upper record value, \(R_{n}\), can be derived as
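in the analogous expectation form

```latex
\overline{\Phi}_{\mu}(R_{n}; p) \;=\; \frac{1}{1-\mu}\,\ln\!\left[\frac{\gamma\big((n-1)\mu+1;\, z\big)}{\big(\gamma(n;\, z)\big)^{\mu}}\; E\!\left\{\Big(q\big(1-e^{-S_{z}}\big)\Big)^{1-\mu}\right\}\right],
```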
where \(z = -\ln (1-p)\) and \(S_{z} \sim \gamma (\mu (n-1) +1; -\ln (1-p))\).
Proof
Using Eq. (3.4), where \(z=-\ln (1-p)\), it can be proved similarly to theorem 3.1. \(\square\)
Theorem 3.3
An important result on bounds: let \(Y_{i}, i\ge 1\) be a sequence of i.i.d. continuous random variables whose \(n^{th}\) upper record \(R_{n}\) has quantile Rényi residual entropy \(\Phi _{\mu }(R_n; p) < \infty\). Then this entropy is bounded as follows: for \(\mu > 1\) \((0< \mu < 1)\),
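with \(m_{n} = \max \{\mu (n-1), -\ln (1-p)\}\) the mode of the truncated Gamma distribution, the bound reads

```latex
\Phi_{\mu}(R_{n}; p) \;\ge\; (\le)\;\; \frac{1}{1-\mu}\,\ln\!\left[\frac{m_{n}^{(n-1)\mu}\, e^{-m_{n}}}{\big(\Gamma(n; -\ln(1-p))\big)^{\mu}}\right] \;+\; A(p),
```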
where \(A(p) = \frac{1}{1-\mu } \ln \left( \int _{p}^{1} \frac{(q(s))^{1-\mu }}{1-s} \textrm{d}s\right)\).
Proof
Suppose \(\Phi _{\mu }(R_n; p) < \infty\) and let \(m_{n} = \max \{ \mu (n-1), -\ln (1-p)\}\) denote the mode of the truncated Gamma distribution with parameter \((n-1)\mu + 1\), at which the quantile version of its density attains its maximum value \(M_{n} = \frac{m_{n}^{(n-1)\mu } e^{-m_{n}}}{\Gamma ((n-1)\mu + 1; -\ln (1-p))}\) (say). Now, we write, for \(\mu > 1\) \((0< \mu < 1)\),
Using these values, Eq. (3.5) reduces to
Hence the proof. \(\square\)
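The bound can be checked numerically. The following sketch (our own helper names; the standard exponential distribution with \(n=2\), \(\mu =2\), \(p=0.3\) is assumed for illustration) evaluates both sides:

```python
import math

# Check of the Theorem 3.3 bound for the exponential distribution,
# q(u) = 1/(1-u), with n = 2, mu = 2 (the mu > 1 branch), p = 0.3.
n, mu, p = 2, 2.0, 0.3
z = -math.log(1.0 - p)

def q(u):
    return 1.0 / (1.0 - u)

def midpoint(f, a, b, grid=200_000):
    h = (b - a) / grid
    return sum(f(a + (i + 0.5) * h) for i in range(grid)) * h

# Upper incomplete gamma Gamma(2; z) = (1 + z) * exp(-z) for integer n = 2
gamma_upper = (1.0 + z) * math.exp(-z)

# Residual entropy of the 2nd record via Eq. (3.2)
integral = midpoint(lambda u: (-math.log(1.0 - u)) ** ((n - 1) * mu)
                    * q(u) ** (1.0 - mu), p, 1.0)
phi = (math.log(integral) - mu * math.log(gamma_upper)) / (1.0 - mu)

# Bound: mode of the truncated gamma plus the A(p) term
m = max(mu * (n - 1), z)
A = math.log(midpoint(lambda s: q(s) ** (1.0 - mu) / (1.0 - s), p, 1.0)) / (1.0 - mu)
bound = math.log(m ** ((n - 1) * mu) * math.exp(-m) / gamma_upper ** mu) / (1.0 - mu) + A
print(phi >= bound)  # expected True for mu > 1
```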
4 The problem of unique characterization
This section shows how our problem reduces to an initial value problem (IVP) whose solution uniquely specifies the parent distribution through the hazard quantile function appearing in the quantile-based Rényi residual entropy of the \(n^{th}\) upper record.
Our first task is to state sufficient criteria for the unique solution of an IVP. Given a function f(x, y) of two variables defined in a region \(D \subset R^2\), a particular point \((x_0,y_0)\) in D, and an unknown function y such that \(y' = f(x,y)~,~ y(x_0)=y_0\), a function \(\varphi (x)\) is a solution of this IVP on an interval \(I \subset R\) if and only if (i) the graph of \(\varphi\) lies in D, (ii) \(\varphi\) is differentiable on I, (iii) \({\varphi }^{'}(x) = f(x,\varphi (x))\) for all \(x \in I\), and (iv) \(\varphi (x_0) = y_0\).
We prove our characterization result using the following theorem and lemma.
Theorem 4.1
Suppose the function f is defined and continuous in a domain \(D \subset R^2\) and satisfies a Lipschitz condition with respect to y in D, namely \(|f(x,y_1)-f(x,y_2)| \le K|y_1 - y_2|, K > 0,\) for all points \((x,y_1)\) and \((x,y_2)\) in D. Then the solution \(y = \varphi (x)\) of the initial value problem \(y' = f(x,y)\), \(\varphi (x_0) = y_0\), \(x \in I\), is unique.
We now provide a sufficient condition under which any function f(x, y) of two variables defined in \(D \subset R^2\) will satisfy the Lipschitz condition.
Lemma 4.1
The function f satisfies the Lipschitz condition in D if f is continuous in a convex region \(D \subset R^2\) and \(\frac{\partial f}{\partial y}\) exists and is continuous in D.
For the proof and other findings concerning the aforementioned theorem and lemma, see, Gupta and Kirmani (2008). The hazard quantile function is another significant quantile measure that is comparable to the widely recognized measure of hazard rate and is expressed as
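In quantile terms (Nair et al. 2013),

```latex
\wedge(p) \;=\; \zeta(Q(p)) \;=\; \frac{g(Q(p))}{\overline{G}(Q(p))} \;=\; \frac{1}{(1-p)\,q(p)},
```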
where \(\zeta (t) = \frac{g(t)}{\overline{G}(t)}\) is the hazard rate of a random variable Y. Then the hazard quantile function of the \(n^{th}\) upper record can be expressed as follows:
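Combining the quantile forms of the record density and the record survival function gives

```latex
\wedge_{n}(p) \;=\; \frac{\left(-\ln(1-p)\right)^{n-1}}{q(p)\,\Gamma\big(n; -\ln(1-p)\big)} \;=\; -\frac{\psi\big(n; -\ln(1-p)\big)}{q(p)},
```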
where prime \('\) denotes the differentiation with respect to p and \(\psi (n; -\ln (1-p)) = \frac{\Gamma '(n; -\ln (1-p))}{\Gamma (n; -\ln (1-p))}\) is the truncated upper digamma function. We will now proceed to establish the proof for our characterization result.
Rewriting Eq. (3.2) as
Using hazard quantile function of \(n^{th}\) upper record,
Differentiating it with respect to p, we obtain
Differentiating it again with respect to p, we get
where \(\Gamma (\cdot ) = \Gamma (n;-\ln (1-p)), \Gamma '(\cdot ) = \Gamma '(n;-\ln (1-p)), \psi (\cdot ) = \psi (n;-\ln (1-p))\),
and \(\psi _{1}(\cdot ) = \psi _{1}(n;-\ln (1-p))= \psi '(n; -\ln (1-p))\), is the truncated upper trigamma function.
Taking \(\wedge _n(p) = y\) and \(\Phi _{\mu }(R_{n}; p) =f(p)\), we get
where,
Theorem 4.1 and Lemma 4.1 together show that the initial value problem (4.1) possesses a unique solution \(y = \wedge _n(p)\), which therefore uniquely characterizes the parent distribution.
5 Concluding remarks
Quantile variations of Rényi entropy and its dynamic forms in record statistics analyze the quantiles of the distribution instead of the original data, offering insights into the dynamics of extreme events and their underlying distributions. These dynamic variations could be valuable in fields such as environmental science, finance, and reliability engineering, where understanding extreme events is critical.
In this research, we have formulated expressions for quantile-based Rényi entropy and its dynamic versions (residual and past) for the \(n^{th}\) upper record value and examined them for various generalized and lifetime distributions commonly used in life testing. Additionally, we have derived an alternative expression for this Rényi entropy in terms of the expectation of the Gamma distribution, and for its dynamic forms in terms of the expectations of the truncated upper and lower Gamma distributions, demonstrating their connection with other distributions. These expressions have been utilized to establish a bound on the quantile Rényi residual entropy of the \(n^{th}\) upper record in terms of the mode of the truncated Gamma distribution. We have also presented a unique characterization result for the underlying distribution using the residual form of Rényi entropy for the \(n^{th}\) upper record and the hazard quantile function.
The potential applications of quantile analysis of Rényi entropy for records are extensive and interdisciplinary, with implications across various fields where analyzing extreme events and quantifying uncertainty are crucial for decision-making and risk management. Ongoing research in this area is likely to yield valuable insights and innovations with significant practical implications. If one is open to using a more intricate model with greater flexibility, the current work can be expanded to encompass more generalized models and entropies.
References
Abraham B, Sankaran PG (2006) Renyi’s entropy for residual lifetime distribution. Stat Pap 47:17–29
Ahsanullah M (2004) Record values-theory and applications. University Press of America, New York
Arnold BC, Balakrishnan N, Nagaraja HN (1992) A First Course in Order Statistics (Vol. 54). SIAM
Asadi M, Ebrahimi N, Soofi ES (2005) Dynamic generalized information measures. Stat Probab Lett 71(1):85–98
Baratpour S, Ahmadi J, Arghami NR (2007) Entropy properties of record statistics. Stat Pap 48:197–213
Baratpour S, Ahmadi J, Arghami NR (2007) Some characterizations based on entropy of order statistics and record values. Commun Stat Theory Methods 36(1):47–57
Chandler K (1952) The distribution and frequency of record values. J Roy Stat Soc: Ser B (Methodol) 14(2):220–228
Cook J (2010) Determining distribution parameters from quantiles. UT MD Anderson Cancer Center Department of Biostatistics Working Paper Series. Working Paper 55
Cover TM (1999) Elements of information theory. Wiley, Amsterdam
David HA, Nagaraja HN (2003) Order statistics (No. 24180). Wiley, Amsterdam
Gilchrist W (2000) Statistical modelling with quantile functions. Chapman and Hall/CRC, New York
Gupta RC, Kirmani SNUA (2008) Characterization based on convex conditional mean function. J Stat Plan Inference 138(4):964–970
Gupta RD, Nanda AK (2002) \(\alpha\)-and \(\beta\)-entropies and relative entropies of distributions. J Stat Theory Appl 1(3):177–190
Hankin RK, Lee A (2006) A new family of non-negative distributions. Aust N Z J Stat 48(1):67–78
Kumar V (2015) Generalized entropy measure in record values and its applications. Stat Probab Lett 106:46–51
Kumar V (2016) Some results on Tsallis entropy measure and k-record values. Phys A 462:667–673
Kumar V, Dangi B (2023) Quantile-based Shannon entropy for record statistics. Commun Math Stat 11(2):283–306
Kumar K, Kumar I, Ng HKT (2024) On estimation of Shannon’s entropy of Maxwell distribution based on progressively first-failure censored data. Stats 7(1):138–159
Nagaraja HN (1988) Some characterizations of continuous distributions based on regressions of adjacent order statistics and record values. Sankhyā: The Indian Journal of Statistics, Series A, 70-73
Nagaraja HN (1977) On a characterization based on record values. Aust N Z J Stat 19:70–73
Nagaraja HN (1978) On the expected values of record values. Aust N Z J Stat 20:176–182
Nagaraja HN (1982) Record values and extreme value distributions. J Appl Probab 19(1):233–239
Nair NU, Sankaran PG, Balakrishnan N (2013) Quantile-based reliability analysis. Birkhauser, Basel
Nanda AK, Maiti SS (2007) Rényi information measure for a used item. Inf Sci 177(19):4161–4175
Nanda AK, Sankaran PG, Sunoj SM (2014) Renyi’s residual entropy: a quantile approach. Stat Probab Lett 85:114–121
Rényi A (1961) On measures of entropy and information. In: Proceedings of the fourth Berkeley symposium on mathematical statistics and probability, volume 1: contributions to the theory of statistics (Vol. 4, pp. 547–562). University of California Press, California
Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27(3):379–423
Sunoj SM, Sankaran PG (2012) Quantile based entropy function. Stat Probab Lett 82(6):1049–1053
Sunoj SM, Sankaran PG, Nanda AK (2013) Quantile based entropy function in past lifetime. Stat Probab Lett 83(1):366–372
Tarsitano A (2005) Estimation of the generalized lambda distribution parameters for grouped data. Commun Stat Theory Methods 34(8):1689–1709
van Staden PJ, Loots MT (2009) Method of L-moment estimation for the generalized lambda distribution. In: Proceedings of the 3rd Annual ASEARC Conference (pp 7–8)
Zarezadeh S, Asadi M (2010) Results on residual Rényi entropy of order statistics and record values. Inf Sci 180(21):4195–4206
Acknowledgements
The authors would like to express their gratitude to the two anonymous reviewers and the editor-in-chief for their valuable suggestions, which have considerably improved the earlier version of the article.
Ethics declarations
Conflict of interest
On behalf of all the authors, the corresponding author states that there is no conflict of interest.
Dangi, B., Kumar, I. Rényi quantile entropy and its dynamic forms for record statistics. Life Cycle Reliab Saf Eng (2024). https://doi.org/10.1007/s41872-024-00275-5