1 Introduction

A key aspect of statistical research is the evaluation of record values; the groundwork for this was laid by Chandler (1952) and later developed by Nagaraja (1977, 1978, 1982, 1988) and Arnold et al. (1992). Consider a sequence of random variables that are independent and identically distributed (i.i.d.) with cumulative distribution function (cdf) G(y), survival function \(\overline{G}(y) = 1-G(y)\), and probability density function (pdf) g(y). Then \(R_{n}\), the \(n^{th}\) upper record value, is the \(n^{th}\) observation in the sequence that exceeds all preceding observations. The pdf of \(R_{n}\) can be expressed as follows:

$$\begin{aligned} h_{n}(y) = \frac{\{-\ln (\overline{G}(y))\}^{n-1}}{\Gamma (n)} g(y);~~~-\infty< y < \infty , \end{aligned}$$
(1.1)

with the gamma function denoted by \(\Gamma (.)\). We emphasize upper records here, as these have traditionally been the focus of most of the literature. In practical settings, however, lower extremes also serve as useful metrics, for instance when measuring temperature lows or race durations; simple modifications of the results handle lower records.

The utility of record statistics extends to several fields, including sports events, quality control, finance, and risk management, where they help identify outliers and unusual patterns when analyzing extreme events. Leading researchers, such as Arnold et al. (1992), Ahsanullah (2004), and David and Nagaraja (2003), have provided valuable insights into the behavior of the tails of distributions.

In addition, entropy, a fundamental concept introduced by Shannon (1948) in information theory, provides a quantitative measure of the uncertainty and randomness of a system. This concept is particularly helpful in the analysis, modeling, and optimization of complex systems, see Cover (1999); Kumar et al. (2024) and the cited references for additional insights.

Further enhancing the value of entropy in statistical analysis, Rényi (1961) proposed an additive generalization of order \(\mu\) of Shannon entropy for a non-negative absolutely continuous random variable Y with pdf g(y) as given by

$$\begin{aligned} \phi _{\mu }(Y) = \frac{1}{1-\mu }\ln \left( \int _{0}^{\infty }g^{\mu }(y)dy\right) ; \mu > 0, \mu \ne 1. \end{aligned}$$
(1.2)

The parameter \(\mu\) controls the sensitivity of the entropy to different parts of the distribution. Shannon entropy, given by \(-\int _{0}^{\infty }g(y)\ln (g(y))dy\), is recovered when \(\mu \rightarrow 1\) in \(\phi _{\mu }(Y)\).
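For a concrete illustration of Eq. (1.2) (a numerical sketch of ours, not part of the original development), the Rényi entropy of a unit exponential density can be approximated by a midpoint rule and compared with the known closed form \(-\ln \mu /(1-\mu )\):

```python
import math

def renyi_entropy(g, mu, a=0.0, b=40.0, m=200_000):
    # phi_mu(Y) = (1/(1-mu)) * ln( integral of g(y)^mu dy ),
    # approximated by a midpoint rule on the finite window [a, b]
    h = (b - a) / m
    integral = sum(g(a + (i + 0.5) * h) ** mu for i in range(m)) * h
    return math.log(integral) / (1.0 - mu)

g = lambda y: math.exp(-y)            # unit exponential pdf (our example)
mu = 2.0
numeric = renyi_entropy(g, mu)
exact = -math.log(mu) / (1.0 - mu)    # closed form for Exp(1)
```

The truncation window \([0, 40]\) and grid size are arbitrary choices; for heavier-tailed densities the window must be widened.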

Dynamic measures of entropy are also needed for accuracy in reliability and life-testing studies, especially when the age of a component or system must be taken into account. Abraham and Sankaran (2006) extended Rényi entropy of order \(\mu\) to the residual lifetime \(Y_{t} = [Y-t|Y>t]\), as defined below, making it a valuable tool for studying such dynamic measures of uncertainty.

$$\begin{aligned} \phi _{\mu }(Y;t) = \frac{1}{1-\mu }\ln \left( \int _{t}^{\infty }\frac{g^{\mu }(y)}{\overline{G}^{\mu }(t)}\textrm{d}y\right) ; \mu , t > 0, \mu \ne 1. \end{aligned}$$
(1.3)

For more details related to dynamic forms of this generalized entropy, refer to Asadi et al. (2005). Rényi entropy for past lifetime \(_{t}Y = [t-Y|Y\le t]\) was extensively studied by Gupta and Nanda (2002) and is given as

$$\begin{aligned} \overline{\phi }_{\mu }(Y;t) = \frac{1}{1-\mu }\ln \left( \int _{0}^{t}\frac{g^{\mu }(y)}{G^{\mu }(t)}\textrm{d}y\right) ; \mu , t > 0, \mu \ne 1. \end{aligned}$$
(1.4)

Analogous to (1.2), Rényi entropy for the \(n^{th}\) upper record is given by

$$\begin{aligned} \phi _{\mu }(R_{n})= & \frac{1}{1-\mu }\ln \left( \int _{0}^{\infty }\{-\ln (\overline{G}(y))\}^{(n-1)\mu }g^{\mu }(y)\textrm{d}y\right) \nonumber \\ & \quad -\frac{\mu }{1-\mu }\ln (\Gamma (n)); \mu > 0, \mu \ne 1. \end{aligned}$$
(1.5)

Several authors, notably Nanda and Maiti (2007) and Zarezadeh and Asadi (2010), have extensively studied dynamic forms of Rényi entropy and their properties.

A probability model can be specified equally well through a distribution function or a quantile function, though the two tools serve different purposes. Quantile functions provide summary measures of a distribution that are less susceptible to extreme values than measures such as the mean or standard deviation. They are often used to construct confidence intervals and to conduct hypothesis tests when the null hypothesis does not specify a particular distribution. In several models, such as the power-Pareto, the Govindarajulu, and various lambda families of distributions, the distribution function is unknown or not easily modeled, and traditional approaches may be insufficient or too complicated. In such cases, tools based on quantile functions provide a flexible and robust way of summarizing a dataset’s distribution without relying on particular distributional assumptions; see Gilchrist (2000) and Nair et al. (2013). The quantile function (QF) is defined by

$$\begin{aligned} Q(p)=G^{-1}(p)=\inf \{ y\;|\; G(y)\ge p\}; \;\;0\le p \le 1, \end{aligned}$$
(1.6)

where G(y) is the continuous cdf of a non-negative random variable Y and \(q(p) = \frac{d}{dp}(Q(p))\) is the quantile density function.
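As a small numerical sketch of definition (1.6) (ours, taking the unit exponential as an assumed example), the infimum in Eq. (1.6) can be computed by bisection, since G is continuous and non-decreasing, and checked against the closed forms \(Q(p) = -\ln (1-p)\) and \(q(p) = 1/(1-p)\):

```python
import math

# Unit exponential (our assumed example): G(y) = 1 - exp(-y), y >= 0
G = lambda y: 1.0 - math.exp(-y)

def Q_numeric(p, lo=0.0, hi=100.0, iters=200):
    # Q(p) = inf{ y : G(y) >= p }, located by bisection (G is continuous
    # and non-decreasing, so the infimum is where G first reaches p)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if G(mid) >= p:
            hi = mid
        else:
            lo = mid
    return hi

p = 0.37
Q_exact = -math.log(1.0 - p)      # closed-form quantile function
q_exact = 1.0 / (1.0 - p)         # q(p) = dQ(p)/dp
Q_err = abs(Q_numeric(p) - Q_exact)
eps = 1e-6
q_numeric = (Q_numeric(p + eps) - Q_numeric(p - eps)) / (2.0 * eps)
```

The finite-difference step for \(q(p)\) is a convenience here; in practice \(q(p)\) is usually available analytically.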

The Shannon quantile entropy and its residual form were studied by Sunoj and Sankaran (2012). The quantile interpretation of Shannon’s past entropy was investigated by Sunoj et al. (2013). Nanda et al. (2014) proposed Rényi residual entropy in quantile form, defined as

$$\begin{aligned} \Phi _{\mu }(Y; p) = \frac{1}{1-\mu }\ln \left( \int _{p}^{1}\frac{(q(s))^{1-\mu }}{(1-p)^{\mu }}\textrm{d}s\right) ; \mu > 0, \mu \ne 1. \end{aligned}$$
(1.7)

When \(\mu \rightarrow 1\), the measure (1.7) reduces to

$$\begin{aligned} {\Phi (Y;p) = \ln (1-p)+\frac{1}{1-p}\left( \int _{p}^{1}\ln (q(s))\textrm{d}s\right) ,} \end{aligned}$$
(1.8)

which is a quantile-based residual form of Shannon entropy.

The study of records in domains such as climatology, sports, medicine, traffic, and industry is of great significance, as it offers a natural account of scientific and technological progress and lets us examine the evolution of human achievements in different fields of activity. A substantial amount of record data has accordingly been collected over time, and numerous mathematical models, such as the Weibull, Gumbel, and generalized extreme value distributions, have been developed to capture the nature of the underlying record processes and to project probabilities of future record events. At the same time, academics have recently taken an interest in information quantities based on quantiles as a natural alternative to the routine distribution function approach, since quantile-based characteristics of the underlying distribution can resolve issues intractable in the latter approach and are less susceptible to the influence of outliers. One intriguing open question is whether the volume of information in a sequence of record values from i.i.d. random variables can be quantified by means of residual or past forms of various information measures and their quantile versions. Shannon quantile entropy for records has recently been studied by Kumar and Dangi (2023). The quantile-based Rényi entropy measure, however, appears more promising, as it generalizes Shannon quantile entropy and provides more accurate and robust results while characterizing the underlying distribution uniquely.
Considering the importance of Rényi entropy and its measures, quantile functions, and record statistics, we are greatly motivated to investigate the quantile variant of the Rényi entropy measure for record statistics.

This communication analyzes record statistics using the quantile variant of the Rényi entropy measure. We study its application to a variety of models that have simple or tractable quantile functions or quantile density functions but no closed-form expression for the pdf or cdf. Section 2 introduces the quantile variant of the Rényi entropy of the \(n^{th}\) upper record and investigates it for several extended models. The dynamic versions (residual and past) of the Rényi quantile entropy of the \(n^{th}\) upper record are illustrated in Sect. 3, along with an alternative expression for each in terms of expectations under truncated Gamma distributions. In addition, a bound on the residual version of this entropy, based on the mode of the truncated Gamma distribution, is derived. In Sect. 4, we provide a characterization result on uniqueness, followed by concluding remarks in the last section.

2 Quantile-based Rényi entropy of record statistics

Using Eq. (1.6), when the function G is continuous, it follows that \(G(Q(p)) = p\). Differentiating with respect to p gives \(q(p)g(Q(p)) = 1\). These results allow us to express the pdf of the \(n^{th}\) upper record defined in Eq. (1.1) in quantile terms as

$$\begin{aligned} {\c {h}}_{n}(p) = \frac{\{-\ln (1-p)\}^{n-1}}{\Gamma (n) q(p)};~~~0\le p \le 1. \end{aligned}$$
(2.1)

Similar to Eq. (1.2), the quantile-based Rényi entropy of \(n^{th}\) upper record value can be defined as

$$\begin{aligned} \Phi _{\mu }(R_n)&=\frac{1}{1-\mu } \ln \left( \int _0^1 ({\c h}_{n}(p))^{\mu } q(p) \textrm{d}p\right) \nonumber \\&= \frac{1}{1-\mu } \ln \left[ \int _0^1 \left( \frac{\{-\ln (1-p)\}^{n-1}}{\Gamma (n) q(p)}\right) ^{\mu } q(p) \textrm{d}p\right] \end{aligned}$$
(2.2)
$$\begin{aligned}&=\frac{-\mu \ln (\Gamma (n)) }{1-\mu }\nonumber \\&\quad + \frac{1}{1-\mu }\ln \left[ \int _0^1 (-\ln (1-p))^{(n-1)\mu } (q(p))^{1-\mu } \textrm{d}p\right] . \end{aligned}$$
(2.3)

When \(n=1\), Eq. (2.3) simplifies to the quantile Rényi entropy of the parent distribution, and as \(\mu\) approaches 1, Eq. (2.3) reduces to the quantile-based Shannon entropy of the \(n^{th}\) upper record.
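A useful sanity check on Eq. (2.1): since q(p) cancels in \({\c h}_{n}(p)q(p)\), the integral \(\int _0^1 {\c h}_{n}(p)q(p)\textrm{d}p = \int _0^1 \{-\ln (1-p)\}^{n-1}/\Gamma (n)\,\textrm{d}p\) equals 1 for every parent model. The following sketch (our illustration) verifies this numerically for \(n=3\):

```python
import math

def record_quantile_mass(n, m=200_000):
    # midpoint rule for the integral over (0,1) of h_n(p) * q(p) dp
    # = {-ln(1-p)}^(n-1) / Gamma(n) dp: q(.) cancels, so the result
    # should be 1 regardless of the parent distribution
    h = 1.0 / m
    total = 0.0
    for i in range(m):
        p = (i + 0.5) * h
        total += (-math.log(1.0 - p)) ** (n - 1) / math.gamma(n)
    return total * h

mass = record_quantile_mass(3)   # n = 3 records
```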

In real-world situations there may be probability models without a closed-form distribution function for which the quantile function is nevertheless available. With this in mind, we consider the following model and obtain the quantile-based Rényi entropy \(\Phi _{\mu }(R_n)\) of the \(n^{th}\) upper record value wherever q(.) exists.

2.1 Rényi quantile entropy of nth record for a generalized model

Consider a sequence of i.i.d. random variables with common quantile density function given as

$$\begin{aligned} q(p) = \eta (-\ln (1-p))^{\delta }(1-p)^{\nu }, \end{aligned}$$

where \(\eta\), \(\delta\) and \(\nu\) are real valued parameters in this model. Then using Eq. (2.2), Rényi quantile entropy of \(n^{th}\) upper record is obtained as

$$\begin{aligned} \Phi _{\mu }(R_n)&= \frac{1}{1-\mu }\ln \left[ \int _{0}^{1}\frac{\eta ^{1-\mu }}{(\Gamma (n))^{\mu }}\right. \nonumber \\&\quad \left. (-\ln (1-p))^{(n-1)\mu +\delta (1-\mu )}(1-p)^{\nu (1-\mu )}\textrm{d}p \right] \nonumber \\&= \ln \eta + \frac{1}{1-\mu} \ln \left[ \frac{\Gamma (n\mu + (\delta +1)(1-\mu ))}{(\Gamma (n))^{\mu } (\nu (1-\mu )+1)^{n\mu + (\delta +1)(1-\mu )}} \right] . \end{aligned}$$
(2.4)

Substituting different parameter values in Eq. (2.4) yields diverse lifetime distributions; Table 1 presents the corresponding Rényi quantile entropies of the \(n^{th}\) record value. In practical applications, quantiles can be employed to estimate these parameters. Cook (2010) developed a set of algorithms for determining the parameters of commonly used distributions, accompanied by a software tool called ParameterSolver, a valuable resource for researchers and practitioners who require reliable and efficient computation of these parameters.
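Eq. (2.4) can also be cross-checked numerically by evaluating the integral in Eq. (2.3) directly for the model's quantile density; the sketch below (ours, with an illustrative parameter choice satisfying \(\nu (1-\mu )+1>0\)) does exactly this:

```python
import math

def phi_closed(n, mu, eta, delta, nu):
    # Eq. (2.4): Phi = ln(eta) + (1/(1-mu)) ln[ Gamma(a) / (Gamma(n)^mu c^a) ]
    # with a = n*mu + (delta+1)(1-mu) and c = nu*(1-mu) + 1 (must be > 0)
    a = n * mu + (delta + 1.0) * (1.0 - mu)
    c = nu * (1.0 - mu) + 1.0
    return math.log(eta) + (math.lgamma(a) - mu * math.lgamma(n)
                            - a * math.log(c)) / (1.0 - mu)

def phi_direct(n, mu, eta, delta, nu, m=200_000):
    # direct midpoint-rule evaluation of the integral in Eq. (2.3) for
    # q(p) = eta * (-ln(1-p))^delta * (1-p)^nu
    h = 1.0 / m
    s = 0.0
    for i in range(m):
        p = (i + 0.5) * h
        u = -math.log(1.0 - p)
        q = eta * u ** delta * (1.0 - p) ** nu
        s += u ** ((n - 1) * mu) * q ** (1.0 - mu)
    return (math.log(s * h) - mu * math.lgamma(n)) / (1.0 - mu)

params = (2, 0.5, 1.5, 0.5, 1.0)   # n, mu, eta, delta, nu -- illustrative only
closed_val = phi_closed(*params)
direct_val = phi_direct(*params)
```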

Table 1 Rényi quantile entropy of nth record for different lifetime distributions

In certain situations, employing the quantile functions approach can prove to be more effective than using the cumulative distribution functions approach, as it is less affected by extreme statistical observations. Even though certain models don’t have closed-form equations for the pdf or cdf, they do have quantile density or simple quantile functions. For such models, see, Hankin and Lee (2006); van Staden and Loots (2009) and Nair et al. (2013). Here, we present some distributions for which \(q(\cdot )\) exists, and obtain \(\Phi _{\mu }(R_n)\).

Example 2.1

Consider the Govindarajulu distribution, which has no closed-form expressions for its density or distribution functions. However, its quantile and quantile density functions are given respectively by \(Q(p) = \theta + \sigma \{(\lambda +1)p^\lambda -\lambda p^{\lambda +1}\}\) and \(q(p) = \sigma \lambda (\lambda +1)(1-p)p^{\lambda -1}; \theta , \sigma , \lambda >0.\) The Rényi quantile entropy of the \(n^{th}\) record for this distribution is then derived as

$$\begin{aligned} \Phi _{\mu }(R_n) =&\ln [\sigma \lambda (\lambda +1)]-\frac{\mu }{1-\mu }\ln \Gamma (n)\nonumber \\&+\frac{1}{1-\mu }\ln \left[ \int _0^1 (-\ln (1-p))^{(n-1)\mu }p^{(\lambda -1)(1-\mu )}\right. \nonumber \\&\qquad \left. (1-p)^{1-\mu }dp\right] , \end{aligned}$$
(2.5)

which is readily tractable for numerical approximation.
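To illustrate that tractability (a sketch of ours, not from the original), Eq. (2.5) can be evaluated by a simple midpoint rule; as a check, for \(n=1\) and \(\lambda =1\) the quantile density reduces to \(q(p)=2\sigma (1-p)\), whose Rényi quantile entropy has the closed form \(\ln (2\sigma )+\frac{1}{1-\mu }\ln \frac{1}{2-\mu }\) (for \(\mu <2\)):

```python
import math

def phi_govindarajulu(n, mu, sigma, lam, m=200_000):
    # midpoint-rule evaluation of Eq. (2.5) for the Govindarajulu model,
    # whose quantile density is q(p) = sigma*lam*(lam+1)*(1-p)*p^(lam-1)
    h = 1.0 / m
    s = 0.0
    for i in range(m):
        p = (i + 0.5) * h
        s += ((-math.log(1.0 - p)) ** ((n - 1) * mu)
              * p ** ((lam - 1.0) * (1.0 - mu))
              * (1.0 - p) ** (1.0 - mu))
    return (math.log(sigma * lam * (lam + 1.0))
            - mu * math.lgamma(n) / (1.0 - mu)
            + math.log(s * h) / (1.0 - mu))

# check against the closed form available when n = 1 and lam = 1
mu, sigma = 0.5, 1.0
value = phi_govindarajulu(1, mu, sigma, 1.0)
closed = math.log(2.0 * sigma) + math.log(1.0 / (2.0 - mu)) / (1.0 - mu)
```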

Example 2.2

Consider the Davies distribution, a lambda family of distributions proposed by Hankin and Lee (2006), with quantile density function

$$\begin{aligned} q(p) = C p^{\lambda _{1} -1}(1-p)^{-\lambda _2 -1}\{ \lambda _1 (1-p) +\lambda _2 p\}; C,\lambda _{1}, \lambda _{2}\ge 0. \end{aligned}$$
(2.6)

This family of distributions offers a good approximation to the Weibull, Gamma, Exponential, and Lognormal distributions and is well suited to right-skewed, non-negative data. For this distribution, the Rényi quantile entropy of the \(n^{th}\) record is

$$\begin{aligned} \Phi _{\mu }(R_n) =&\ln C-\frac{\mu }{1-\mu }\ln \Gamma (n)\nonumber \\&+ \frac{1}{1-\mu }\ln \left[ \int _0^1 (-\ln (1-p))^{(n-1)\mu } \left\{ \lambda _{1}p^{\lambda _{1}-1}\right. \right. \nonumber \\&\qquad \left. \left. (1-p)^{-\lambda _{2}}+\lambda _{2}p^{\lambda _{1}}(1-p)^{-(1+\lambda _{2})} \right\} ^{1-\mu }dp\right] , \end{aligned}$$
(2.7)

which can easily be estimated numerically. When \(\lambda _1 = \lambda _2 >0\), Eq. (2.7) corresponds to the Log-logistic distribution. As \(\lambda _{1}\) and \(\lambda _{2}\) approach 0, Eq. (2.7) reduces to the Pareto I distribution and the Power distribution, respectively.

Example 2.3

Consider the quantile function and quantile density function of the van Staden and Loots (2009) distribution, given respectively by

$$\begin{aligned} Q(p)&= \lambda _{1}+ \lambda _{2}\left[ \left( \frac{1-\lambda _{3}}{\lambda _{4}}\right) (p^{\lambda _{4}}-1)\right. \nonumber \\&\quad \left. - \left( \frac{\lambda _{3}}{\lambda _{4}}\right) \{(1-p)^{\lambda _{4}}-1\}\right] , \, \text {where} \, \lambda _{i}> 0 \nonumber \\&\quad \, \text {for}\, i = 1,2,3,4,\nonumber \\ \text { and }~~~q(p)&= \lambda _{2}[(1-\lambda _{3})p^{\lambda _{4}-1} + \lambda _{3}(1-p)^{\lambda _{4}-1}]. \end{aligned}$$
(2.8)

Then using Eqs. (2.3) and (2.8), the quantile variant of Rényi’s entropy of \(n^{th}\) record for van Staden-Loots distribution is calculated as

$$\begin{aligned} \Phi _{\mu }(R_n) =&\ln \lambda _2-\frac{\mu }{1-\mu }\ln \Gamma (n)\nonumber \\&+ \frac{1}{1-\mu }\ln \left[ \int _0^1 (-\ln (1-p))^{(n-1)\mu } \left\{ (1-\lambda _3)p^{\lambda _4 - 1}\right. \right. \nonumber \\&\quad \left. \left. +\lambda _3 (1-p)^{\lambda _4 -1}\right\} ^{1-\mu }\textrm{d}p\right] , \end{aligned}$$
(2.9)

which can be further estimated numerically. As \(\lambda _{3}\longrightarrow 1\) and \(\lambda _{4}\longrightarrow 0,\) Eq. (2.9) reduces to the Exponential distribution. As \(\lambda _{4}\longrightarrow 1,\) Eq. (2.9) reduces to \(\ln \lambda _2 + \frac{1}{1-\mu }\ln \left[ \frac{\Gamma ((n-1)\mu + 1)}{(\Gamma (n))^{\mu }}\right] ,\) corresponding to the Uniform distribution. When \(\lambda _2 = 2, \lambda _3 = 1/2, \lambda _{4}=0,\) Eq. (2.9) reduces to \(\frac{1}{1-\mu }\ln \left[ \int _{0}^{1} \frac{(-\ln (1-p))^{(n-1)\mu }}{(p(1-p))^{1-\mu }(\Gamma (n))^{\mu }} \textrm{d}p \right] ,\) coinciding with the standard Logistic distribution.

Example 2.4

Consider the quantile density function of the Generalized Lambda distribution as defined by

$$\begin{aligned} q(p) = \frac{1}{\lambda _{2}}[\lambda _{3}p^{\lambda _{3}-1} + \lambda _{4}(1-p)^{\lambda _{4}-1}]. \end{aligned}$$
(2.10)

Then using Eqs. (2.3) and (2.10), we calculate the Rényi quantile entropy of the \(n^{th}\) record as

$$\begin{aligned} \Phi _{\mu }(R_n) =&-\ln {\lambda _{2}}-\frac{\mu }{1-\mu }\ln \Gamma (n) \nonumber \\&+ \frac{1}{1-\mu } \ln \left[ \int _0^1 (-\ln (1-p))^{(n-1)\mu } \left\{ \lambda _3 p^{\lambda _3 - 1}\right. \right. \nonumber \\&\quad \left. \left. +\lambda _4(1-p)^{\lambda _4 -1}\right\} ^{1-\mu } \textrm{d}p\right] , \end{aligned}$$
(2.11)

which can be further estimated numerically. When \(\lambda _{2} = \lambda _{3} = \lambda _{4} = \lambda\), Eq. (2.11) reduces to \(-\frac{\mu }{1-\mu }\ln \Gamma (n) + \frac{1}{1-\mu }\ln \left[ \int _{0}^{1}(-\ln (1-p))^{(n-1)\mu } \left\{ p^{\lambda - 1}+(1-p)^{\lambda -1}\right\} ^{1-\mu } \textrm{d}p\right] ,\) corresponding to the Lambda distribution. As \(\lambda \longrightarrow 1\), this further reduces to \(\ln 2 + \frac{1}{1-\mu }\ln \left[ \frac{\Gamma ((n-1)\mu + 1)}{(\Gamma (n))^{\mu }} \right] ,\) corresponding to the Uniform distribution on \([-1,1]\); as \(\lambda \longrightarrow 0\), it reduces to \(-\frac{\mu }{1-\mu }\ln \Gamma (n) + \frac{1}{1-\mu }\ln \left[ \int _{0}^{1}\frac{(-\ln (1-p))^{(n-1)\mu }}{ (p(1-p))^{1-\mu }} \textrm{d}p\right] ,\) corresponding to the standard Logistic distribution.

Example 2.5

Consider the five-parameter lambda family of distributions developed by Gilchrist (2000), with quantile density function given as

$$\begin{aligned} q(p) = \lambda _2\left[ \frac{1-\lambda _3}{2}p^{\lambda _4-1} + \frac{1+\lambda _3}{2}(1-p)^{\lambda _{5}-1}\right] . \end{aligned}$$
(2.12)

Tarsitano (2005) offered very close approximations to a range of symmetric and asymmetric distributions using this model. Furthermore, he advocated for using this model in cases where a specific distributional form cannot be inferred from the prevailing physical context. Then Rényi quantile entropy of \(n^{th}\) record for this family can be derived as

$$\begin{aligned} \Phi _{\mu }(R_n) =&\ln \lambda _{2} -\frac{\mu }{1-\mu }\ln \Gamma (n)\nonumber \\&+\frac{1}{1-\mu }\ln \left[ \int _{0}^{1}(-\ln (1-p))^{(n-1)\mu } \left\{ \frac{1-\lambda _3}{2}p^{\lambda _4-1}\right. \right. \nonumber \\&\left. \left. + \frac{1+\lambda _3}{2}(1-p)^{\lambda _{5}-1} \right\} ^{1-\mu } \textrm{d}p \right] , \end{aligned}$$
(2.13)

which can be further estimated numerically. As \(\lambda _3\longrightarrow 0\), this family corresponds to the Generalized Tukey Lambda family of distributions, and Eq. (2.13) reduces to \(\ln \frac{\lambda _{2}}{2} -\frac{\mu }{1-\mu }\ln \Gamma (n)+\frac{1}{1-\mu }\ln \left[ \int _{0}^{1}(-\ln (1-p))^{(n-1)\mu } \{ p^{\lambda _4-1} +(1-p)^{\lambda _{5}-1} \}^{1-\mu } \textrm{d}p\right]\). This family also includes the exponential distribution when \(\lambda _4\longrightarrow \infty , \lambda _5 \longrightarrow 0\), the generalized Pareto distribution when \(\lambda _4\longrightarrow \infty\) and \(|\lambda _5| < \infty\), and the power distribution when \(\lambda _5\longrightarrow \infty\) and \(|\lambda _4| < \infty\).

The expression obtained in this section for the Rényi quantile entropy of the \(n^{th}\) upper record value is in terms of the random variable’s quantile density function. We next present an alternative expression based on the expectation of another random variable. This approach provides a different perspective on the quantile entropy and can offer valuable insight into the behavior of the distribution.

Theorem 2.1

Assume a sequence \(\{Y_i: i\ge 1\}\) of i.i.d. continuous random variables with quantile function Q(p) and quantile density function q(p), and with Rényi entropy \(\Phi _{\mu }(\cdot ) < \infty\). Then for all \(n\ge 1\), an alternative formulation of the Rényi quantile entropy of the \(n^{th}\) upper record, \(R_n\), can be derived as

$$\begin{aligned} \Phi _{\mu }(R_n) = \frac{1}{1-\mu }\ln \left[ \frac{\Gamma ((n-1)\mu +1)E[\{q(R_{n}^{*})\}^{1-\mu }]}{(\Gamma (n))^{\mu }}\right] , \end{aligned}$$

where E denotes expectation and \(R_{n}^{*}\) follows the gamma distribution with parameters \((n-1)\mu +1\) and 1.

Proof

The quantile version of the pdf of the gamma distribution with parameters \((n-1)\mu +1\) and 1 is \(\frac{(1-p)(-\ln (1-p))^{(n-1)\mu }}{\Gamma ((n-1)\mu + 1)}\) and hence

$$\begin{aligned} E[\{q(R_{n}^{*})\}^{1-\mu }] = \int _0^1 (q(p))^{1-\mu } \frac{(-\ln (1-p))^{(n-1)\mu }}{\Gamma ((n-1)\mu + 1)} \textrm{d}p. \end{aligned}$$

Substituting these values in Eq. (2.2), we obtain the required result. \(\square\)
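Theorem 2.1 can be probed by simulation (our sketch below): draw \(R_{n}^{*}\) from a Gamma\(((n-1)\mu +1, 1)\) distribution, pass it to the probability scale through \(p = 1-e^{-R_{n}^{*}}\) as in the proof, and average \(\{q(p)\}^{1-\mu }\). For an assumed unit exponential parent, \(q(p)=1/(1-p)\), the expectation has the exact value \(\mu ^{-a}\) with \(a = (n-1)\mu +1\), which gives a closed form to compare against:

```python
import math, random

random.seed(7)
n, mu = 3, 0.8
a = (n - 1) * mu + 1.0     # gamma shape appearing in Theorem 2.1

# Monte Carlo estimate of E[{q(P)}^(1-mu)] with P = 1 - exp(-R),
# R ~ Gamma(a, 1).  For the unit exponential parent q(p) = 1/(1-p),
# so {q(1 - e^(-r))}^(1-mu) = e^((1-mu) r).
N = 200_000
acc = 0.0
for _ in range(N):
    r = random.gammavariate(a, 1.0)
    acc += math.exp((1.0 - mu) * r)
mc_expectation = acc / N

phi_mc = (math.lgamma(a) + math.log(mc_expectation)
          - mu * math.lgamma(n)) / (1.0 - mu)
# exact value for this parent: E[{q(P)}^(1-mu)] = mu^(-a)
phi_exact = (math.lgamma(a) - a * math.log(mu)
             - mu * math.lgamma(n)) / (1.0 - mu)
```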

3 Quantile-based dynamic Rényi entropy of record statistics

The concept of dynamic quantile entropy bridges the gap between traditional entropy measures and the dynamic behavior of time-varying data. By considering quantiles and their evolution, it offers a nuanced and contextually relevant understanding of how data distributions change over time, leading to improved modeling, prediction, and decision-making.

Time plays a pivotal role in studying quantile entropies, as it introduces the temporal dimension into the analysis of information complexity and uncertainty. Studying quantile entropies over time captures the changing distributional characteristics of data across quantiles and time intervals, providing insight into evolving uncertainty, risks, and patterns, and making quantile entropies an essential tool in fields involving time-varying data. Consequently, the exploration of dynamic variants of these quantities is of increasing importance. Numerous researchers have studied the significance of residual entropy as an uncertainty measure in order and record statistics; see, for example, Zarezadeh and Asadi (2010); Baratpour et al. (2007a, 2007b); Sunoj et al. (2013), and Kumar (2015, 2016). The quantile-based Rényi residual entropy of the \(n^{th}\) record is expressed as

$$\begin{aligned} \Phi _{\mu }(R_n; p)=\frac{1}{1-\mu } \ln \left[ \int _p^1 \left( \frac{{\c h}_{n}(s)}{\overline{{\c H}}_n(p)}\right) ^{\mu } q(s) \textrm{d}s\right] , \end{aligned}$$
(3.1)

where \(\overline{{\c H}}_n(p) = \frac{\Gamma (n; -\ln (1-p))}{\Gamma (n)}\) is the survival function of the \(n^{th}\) upper record in quantile terms and \(\Gamma (n;-\ln (1-p))\) denotes the upper incomplete gamma function. It can also be written as

$$\begin{aligned} \Phi _{\mu }(R_n; p)&=\frac{-\mu }{1-\mu } \ln (\overline{{\c H}}_n(p))\nonumber \\&\quad + \frac{1}{1-\mu } \ln \left[ \int _p^1 ({\c h}_{n}(s))^{\mu } q(s) \textrm{d}s\right] \nonumber \\&= \frac{-\mu }{1-\mu } \ln \left( \frac{\Gamma (n; -\ln (1-p))}{\Gamma (n)} \right) \nonumber \\&\quad + \frac{1}{1-\mu } \ln \left[ \int _p^1 \left( \frac{(-\ln (1-s))^{n-1}}{\Gamma (n) q(s)} \right) ^{\mu } q(s) \textrm{d}s\right] \nonumber \\&= \frac{-\mu }{1-\mu } \ln [\Gamma (n; -\ln (1-p))] \nonumber \\&\quad + \frac{1}{1-\mu } \ln \left[ \int _p^1 (-\ln (1-s))^{(n-1)\mu } (q(s))^{1-\mu } \textrm{d}s\right] . \end{aligned}$$
(3.2)

Table 2 presents a summary of the quantile-based Rényi residual entropy of the \(n^{th}\) record for various lifetime distributions, utilizing the generalized model detailed in Sect. 2.1 with Eq. (3.2) and incorporating a range of parameter values.

Table 2 Rényi residual quantile entropy of \(n^{th}\) record for different lifetime distributions

It is noteworthy that in certain practical scenarios, uncertainty may be associated with past lifetime rather than with future events. Accordingly, we examine the Rényi quantile entropy of the \(n^{th}\) record for past lifetime (inactivity time). The quantile-based Rényi entropy of the \(n^{th}\) record value for past lifetime is

$$\begin{aligned} \overline{\Phi }_{\mu }(R_{n}; p) =\frac{1}{1-\mu } \ln \left[ \int _0^p \left( \frac{{\c h}_{n}(s)}{{\c H}_n(p)}\right) ^{\mu } q(s) \textrm{d}s\right] , \end{aligned}$$
(3.3)

where \({\c H}_n(p) = \frac{\gamma (n; -\ln (1-p))}{\Gamma (n)}\) is the distribution function of the \(n^{th}\) upper record in quantile terms and \(\gamma (n;-\ln (1-p))\) denotes the lower incomplete gamma function. As with the Rényi residual quantile entropy of the \(n^{th}\) upper record, it can be simplified to yield

$$\begin{aligned} \overline{\Phi }_{\mu }(R_{n}; p)&= \frac{-\mu }{1-\mu } \ln [\gamma (n; -\ln (1-p))] \nonumber \\&\quad + \frac{1}{1-\mu } \ln \left[ \int _0^p (-\ln (1-s))^{(n-1)\mu } (q(s))^{1-\mu } \textrm{d}s\right] . \end{aligned}$$
(3.4)

Remark 3.1

When \(n=1\), Eqs. (3.2) and (3.4) correspond to the Rényi quantile entropy for residual and inactive lifetimes of underlying distribution, respectively. Moreover, as \(p \rightarrow 0\) and \(p \rightarrow 1\) in Eqs. (3.2) and (3.4) respectively, they both correspond to the Rényi quantile entropy of \(n^{th}\) record of the underlying distribution.
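The limiting claim in Remark 3.1 can be verified numerically; the sketch below (ours, assuming a unit exponential parent so that \(q(s)=1/(1-s)\)) evaluates Eq. (3.2) at \(p=0\) and compares it with the closed-form Rényi quantile entropy of the \(n^{th}\) record for this parent, \(\frac{1}{1-\mu }[\ln \Gamma (a)-a\ln \mu -\mu \ln \Gamma (n)]\) with \(a=(n-1)\mu +1\):

```python
import math

def upper_inc_gamma(s, x, m=200_000, cap=60.0):
    # numeric upper incomplete gamma: integral from x to cap of t^(s-1) e^(-t) dt
    h = (cap - x) / m
    return sum((x + (i + 0.5) * h) ** (s - 1) * math.exp(-(x + (i + 0.5) * h))
               for i in range(m)) * h

def phi_residual_exp(n, mu, p, m=400_000):
    # Eq. (3.2) for a unit exponential parent, q(s) = 1/(1-s)
    h = (1.0 - p) / m
    s = 0.0
    for i in range(m):
        t = p + (i + 0.5) * h
        s += (-math.log(1.0 - t)) ** ((n - 1) * mu) * (1.0 - t) ** (mu - 1.0)
    z = -math.log(1.0 - p)
    return (-mu * math.log(upper_inc_gamma(n, z)) + math.log(s * h)) / (1.0 - mu)

n, mu = 3, 0.8
a = (n - 1) * mu + 1.0
phi_p0 = phi_residual_exp(n, mu, 0.0)   # residual entropy at p = 0
phi_record = (math.lgamma(a) - a * math.log(mu)
              - mu * math.lgamma(n)) / (1.0 - mu)
```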

In our upcoming analysis, we express the dynamic forms of the quantile-based Rényi entropy of the \(n^{th}\) upper record value in terms of expectations under the truncated Gamma distribution. This yields an important bound on these entropies involving the mode of the truncated Gamma distribution, leading to a better understanding of their behavior and potentially informing decision-making in relevant fields.

Theorem 3.1

Assume a sequence \(Y_{i}, i \ge 1\) of i.i.d. continuous random variables with quantile function Q(p) and quantile density function q(p). Then the quantile-based Rényi residual entropy of the \(n^{th}\) upper record value, \(R_{n}\), can be derived as

$$\begin{aligned} \Phi _{\mu }(R_n; p)= & \frac{1}{1-\mu } \ln \left[ \frac{\Gamma (\mu (n-1) +1; -\ln (1-p))}{\{\Gamma (n; -\ln (1-p))\} ^{\mu }} \right] \nonumber \\ & \quad + \frac{1}{1-\mu } \ln [E\{ q^{1-\mu }(S_{z})\}], \end{aligned}$$
(3.5)

where \(z = -\ln (1-p)\) and \(S_{z} \sim \Gamma (\mu (n-1) +1; -\ln (1-p))\).

Proof

Rewriting Eq. (3.2) as

$$\begin{aligned} \Phi _{\mu }(R_n; p) =&\frac{1}{1-\mu } \ln \left[ \frac{\Gamma (\mu (n-1) + 1; -\ln (1-p))}{\{\Gamma (n; -\ln (1-p))\} ^{\mu }} \right] \nonumber \\&+ \frac{1}{1-\mu } \ln \left[ \int _{p}^{1} \frac{(-\ln (1-s))^{(n-1)\mu }q^{1-\mu }(s)}{\Gamma (\mu (n-1) +1; -\ln (1-p))} \textrm{d}s \right] . \end{aligned}$$
(3.6)

The pdf of the truncated gamma distribution with parameter \((n-1)\mu + 1\), in quantile form, is \(\frac{(1-p)(-\ln (1-p))^{(n-1)\mu }}{\Gamma (\mu (n-1) +1; -\ln (1-p))}\) and hence

$$\begin{aligned} E\{ q^{1-\mu }(S_{z})\} = \int _p^1 \frac{(-\ln (1-s))^{(n-1)\mu }q^{1-\mu }(s)}{\Gamma (\mu (n-1) +1; -\ln (1-p))} \textrm{d}s. \end{aligned}$$

Substituting this value in Eq. (3.6), we get the desired result. \(\square\)

Theorem 3.2

Under the assumptions of Theorem 3.1, the quantile-based Rényi past entropy of the \(n^{th}\) upper record value, \(R_{n}\), can be derived as

$$\begin{aligned} \overline{\Phi }_{\mu }(R_{n}; p)= & \frac{1}{1-\mu } \ln \left[ \frac{\gamma (\mu (n-1) +1; -\ln (1-p))}{\{\gamma (n; -\ln (1-p))\}^{\mu } } \right] \nonumber \\ & \quad + \frac{1}{1-\mu } \ln [E\{ q^{1-\mu }(S_{z})\}], \end{aligned}$$
(3.7)

where \(z = -\ln (1-p)\) and \(S_{z} \sim \gamma (\mu (n-1) +1; -\ln (1-p))\).

Proof

Using Eq. (3.4) with \(z=-\ln (1-p)\), the proof is similar to that of Theorem 3.1. \(\square\)

Theorem 3.3

An Important Result on Bounds: Let \(Y_{i}, i\ge 1\) be a sequence of i.i.d. continuous random variables for which the quantile Rényi residual entropy of the \(n^{th}\) upper record \(R_{n}\) satisfies \(\Phi _{\mu }(R_n; p) < \infty\). Then this entropy is bounded as follows: for \(\mu > 1\) (respectively \(0< \mu < 1\)),

$$\begin{aligned} \Phi _{\mu }(R_n; p)> & (<) \frac{-\mu }{1-\mu } \ln [\Gamma (n;-\ln (1-p))]\nonumber \\ & \quad +\frac{1}{1-\mu } \ln [(1-m_{n})(-\ln (1-m_{n}))^{(n-1)\mu }]\nonumber \\ & \quad +A(p), \end{aligned}$$
(3.8)

where \(A(p) = \frac{1}{1-\mu } \ln \left( \int _{p}^{1} \frac{(q(s))^{1-\mu }}{1-s} \textrm{d}s\right)\).

Proof

Suppose \(\Phi _{\mu }(R_n; p) < \infty\) and let \(m_{n} = \max \{1-e^{-\mu (n-1)}, p\}\) be the mode of the truncated Gamma distribution with parameter \((n-1)\mu + 1\) in quantile form, so that the maximum value of its quantile-form density is \(\frac{(1-m_{n})(-\ln (1-m_{n}))^{(n-1)\mu }}{\Gamma ((n-1)\mu + 1; -\ln (1-p))} = M_{n}\) (say). Now, for \(\mu > 1\) (respectively \(0< \mu < 1\)), we write

$$\begin{aligned}&\frac{1}{1-\mu } \ln [E(q^{1-\mu }(S_{z}))] \\&= \frac{1}{1-\mu }\ln \left( \int _p^1 \frac{(-\ln (1-s))^{(n-1)\mu }(q(s))^{1-\mu }}{\Gamma (\mu (n-1) +1; -\ln (1-p))} \textrm{d}s\right) \\&{>(<) \frac{1}{1-\mu }\ln (M_{n}) + \frac{1}{1-\mu } \ln \left( \int _{p}^{1} \frac{(q(s))^{1-\mu }}{1-s} \textrm{d}s\right) .} \end{aligned}$$

Using this bound, Eq. (3.5) reduces to

$$\begin{aligned} \Phi _{\mu }(R_n; p)&>(<) \frac{1}{1-\mu } \ln \left[ \frac{\Gamma (\mu (n-1) +1; -\ln (1-p))}{\Gamma ^{\mu } (n; -\ln (1-p))} \right] \\&\quad + \frac{1}{1-\mu }\ln (M_{n}) + \frac{1}{1-\mu } \ln \left( \int _{p}^{1} \frac{(q(s))^{1-\mu }}{1-s} \textrm{d}s\right) \\&= -\frac{\mu }{1-\mu } \ln [\Gamma (n; -\ln (1-p))] \\&\quad + \frac{1}{1-\mu }\ln [(1-m_{n})(-\ln (1-m_{n}))^{(n-1)\mu }]\\&\quad +A(p). \end{aligned}$$

Hence the proof. \(\square\)
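To see the bound in action (a numerical sketch of ours), consider the \(\mu >1\) direction of Eq. (3.8) for an assumed unit exponential parent, \(q(s)=1/(1-s)\), with the mode point on the probability scale, \(m_{n} = \max \{1-e^{-\mu (n-1)}, p\}\) (equivalently \(-\ln (1-m_{n}) = \max \{\mu (n-1), -\ln (1-p)\}\)); for this parent \(A(p)\) reduces to \(\frac{1}{1-\mu }\ln \frac{(1-p)^{\mu -1}}{\mu -1}\):

```python
import math

def upper_inc_gamma(s, x, m=200_000, cap=60.0):
    # numeric upper incomplete gamma
    h = (cap - x) / m
    return sum((x + (i + 0.5) * h) ** (s - 1) * math.exp(-(x + (i + 0.5) * h))
               for i in range(m)) * h

def phi_residual_exp(n, mu, p, m=200_000):
    # Eq. (3.2) for a unit exponential parent, q(s) = 1/(1-s)
    h = (1.0 - p) / m
    s = 0.0
    for i in range(m):
        t = p + (i + 0.5) * h
        s += (-math.log(1.0 - t)) ** ((n - 1) * mu) * (1.0 - t) ** (mu - 1.0)
    z = -math.log(1.0 - p)
    return (-mu * math.log(upper_inc_gamma(n, z)) + math.log(s * h)) / (1.0 - mu)

n, mu, p = 3, 2.0, 0.3
z = -math.log(1.0 - p)
m_n = max(1.0 - math.exp(-mu * (n - 1)), p)   # mode point on the p-scale
A = math.log((1.0 - p) ** (mu - 1.0) / (mu - 1.0)) / (1.0 - mu)
bound = ((-mu / (1.0 - mu)) * math.log(upper_inc_gamma(n, z))
         + math.log((1.0 - m_n) * (-math.log(1.0 - m_n)) ** ((n - 1) * mu))
         / (1.0 - mu)
         + A)
lhs = phi_residual_exp(n, mu, p)   # should exceed the bound since mu > 1
```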

4 The problem of unique characterization

This section shows how our problem reduces to an initial value problem (IVP) and then uniquely determines the parent distribution through the hazard quantile function appearing in the quantile-based Rényi residual entropy of the \(n^{th}\) upper record.

Our first task is to state sufficient criteria for the unique solution of an IVP. Given a function f(x, y) of two variables defined in a region \(D \subset R^2\), let \((x_0,y_0)\) be a particular point in D and let y be the unknown function such that \(y' = f(x,y)~,~ y(x_0)=y_0\). Then a function \(\varphi (x)\) is a solution of this IVP on an interval \(I \subset R\) if and only if (i) the graph of \(\varphi\) lies in D, (ii) \(\varphi\) is differentiable on I, (iii) \({\varphi }^{'}(x) = f(x,\varphi (x))\) for all \(x \in I\), and (iv) \(\varphi (x_0) = y_0\).

We prove our characterization result using the following theorem and lemma.

Theorem 4.1

Suppose the function f is defined and continuous in a domain \(D \subset R^2\) and satisfies a Lipschitz condition (with respect to y) in D, namely \(|f(x,y_1)-f(x,y_2)| \le K|y_1 - y_2|, K > 0,\) for every pair of points \((x,y_1)\) and \((x,y_2)\) in D. Then the solution \(y = \varphi (x)\) of the initial value problem \(y' = f(x,y)\), \(\varphi (x_0) = y_0\), \(x \in I\), is unique.

We now provide a sufficient condition under which a function f(x, y) of two variables defined in \(D \subset R^2\) satisfies the Lipschitz condition.

Lemma 4.1

The function f satisfies the Lipschitz condition in D if f is continuous in a convex region \(D \subset R^2\) and \(\frac{\partial f}{\partial y}\) exists and is continuous in D.

For the proof and other findings concerning the above theorem and lemma, see Gupta and Kirmani (2008). The hazard quantile function is another significant quantile measure, analogous to the widely used hazard rate, and is expressed as
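To illustrate why the Lipschitz condition matters, the following minimal sketch runs Picard iteration on the toy IVP \(y' = y\), \(y(0)=1\), whose right-hand side is Lipschitz in y with constant \(K=1\); the theorem then guarantees the iterates converge to the unique solution \(y = e^x\). (This example is ours, not part of the paper's derivation.)

```python
import math

def picard_step(coeffs):
    # Given y_k as polynomial coefficients [a0, a1, ...], return
    # y_{k+1}(x) = 1 + integral_0^x y_k(t) dt, also as coefficients.
    integral = [0.0] + [a / (i + 1) for i, a in enumerate(coeffs)]
    integral[0] = 1.0   # enforce the initial value y(0) = 1
    return integral

def poly_eval(coeffs, x):
    return sum(a * x**i for i, a in enumerate(coeffs))

y = [1.0]                  # y_0(x) = 1 (the constant initial guess)
for _ in range(15):
    y = picard_step(y)     # iterates are the Taylor partial sums of e^x

print(poly_eval(y, 1.0), math.e)   # the iterate at x = 1 approaches e
```

Each iterate adds one Taylor term of \(e^x\); the Lipschitz bound is what makes the iteration a contraction and hence the limit unique.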

$$\begin{aligned} \wedge (p) = \zeta (Q(p))= \frac{g(Q(p))}{\overline{G}(Q(p))} = \frac{1}{(1-p)q(p)}, \end{aligned}$$

where \(\zeta (t) = \frac{g(t)}{\overline{G}(t)}\) is the hazard rate of a random variable Y. Then the hazard quantile function of the \(n^{th}\) upper record can be expressed as follows:

$$\begin{aligned} \wedge _n(p)= & \frac{h_{n}(Q(p))}{\overline{H}_n(Q(p))} = \frac{(-\ln (1-p))^{n-1}}{q(p) \Gamma (n; -\ln (1-p))} \\= & \frac{-\Gamma '(n; -\ln (1-p))}{q(p) \Gamma (n; -\ln (1-p))} = -\frac{\psi (n; -\ln (1-p))}{q(p)}, \end{aligned}$$

where the prime \('\) denotes differentiation with respect to p and \(\psi (n; -\ln (1-p)) = \frac{\Gamma '(n; -\ln (1-p))}{\Gamma (n; -\ln (1-p))}\) is the truncated upper digamma function. We now proceed to establish our characterization result.
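Before doing so, the identity \(\wedge _n(p) = -\psi (n; -\ln (1-p))/q(p)\) can be checked numerically. The sketch below assumes an exponential parent with rate `lam` (our illustrative choice), and uses the closed form \(\Gamma (n; x) = (n-1)!\, e^{-x} \sum _{k=0}^{n-1} x^k/k!\) valid for integer n; \(\psi\) is approximated by a central difference in p:

```python
import math

def upper_gamma(n, x):
    # Upper incomplete gamma for integer n via its closed form.
    return math.factorial(n - 1) * math.exp(-x) * sum(x**k / math.factorial(k) for k in range(n))

def hazard_record(n, p, lam):
    # Direct formula: (-ln(1-p))^(n-1) / (q(p) * Gamma(n; -ln(1-p))).
    x = -math.log(1.0 - p)
    q = 1.0 / (lam * (1.0 - p))   # exponential quantile density
    return x ** (n - 1) / (q * upper_gamma(n, x))

def hazard_record_via_psi(n, p, lam, h=1e-6):
    # Alternative formula: -psi(n; -ln(1-p)) / q(p), with
    # psi = Gamma'/Gamma and the prime (d/dp) taken numerically.
    g = lambda t: upper_gamma(n, -math.log(1.0 - t))
    dgamma = (g(p + h) - g(p - h)) / (2.0 * h)
    q = 1.0 / (lam * (1.0 - p))
    return -(dgamma / g(p)) / q

n, p, lam = 3, 0.4, 2.0
print(hazard_record(n, p, lam), hazard_record_via_psi(n, p, lam))  # the two should agree
```

Agreement of the two values reflects the fact that \(\Gamma '(n; -\ln (1-p)) = -(-\ln (1-p))^{n-1}\) when the derivative is taken with respect to p.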

Rewriting Eq. (3.2) as

$$\begin{aligned} (1-\mu )\Phi _{\mu }(R_{n}; p)= & -\mu \ln [\Gamma (n; -\ln (1-p))]\\ & +\ln \left[ \int _p^1\{-\ln (1-s)\}^{(n-1)\mu } (q(s))^{1-\mu } \textrm{d}s\right] . \end{aligned}$$

Using the hazard quantile function of the \(n^{th}\) upper record, this becomes

$$\begin{aligned} (1-\mu )\Phi _{\mu }(R_{n}; p)= & -\mu \ln [\Gamma (n; -\ln (1-p))]\\ & +\ln \left[ \int _p^1\{\wedge _n(s)\Gamma (n;-\ln (1-s)) \}^{\mu }q(s) \textrm{d}s\right] . \end{aligned}$$
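The substitution used above, \((-\ln (1-s))^{n-1} = \wedge _n(s)\, q(s)\, \Gamma (n; -\ln (1-s))\), can be verified pointwise. The sketch below again assumes an exponential parent with rate `lam` (an illustrative choice, not part of the derivation):

```python
import math

def upper_gamma(n, x):
    # Upper incomplete gamma for integer n via its closed form.
    return math.factorial(n - 1) * math.exp(-x) * sum(x**k / math.factorial(k) for k in range(n))

def integrands(n, mu, s, lam):
    # Compare the two integrand forms at a single point s.
    x = -math.log(1.0 - s)
    q = 1.0 / (lam * (1.0 - s))                       # exponential quantile density
    haz_n = x ** (n - 1) / (q * upper_gamma(n, x))    # hazard quantile of the record
    original = x ** ((n - 1) * mu) * q ** (1.0 - mu)
    rewritten = (haz_n * upper_gamma(n, x)) ** mu * q
    return original, rewritten

a, b = integrands(n=3, mu=1.5, s=0.3, lam=2.0)
print(a, b)   # the two integrand forms agree
```

Since the integrands agree at every s, the two integral representations of \((1-\mu )\Phi _{\mu }(R_{n}; p)\) coincide.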

Differentiating it with respect to p, we obtain

$$\begin{aligned} (1-\mu )\Phi '_{\mu }(R_{n}; p)= & -\mu \frac{\Gamma '(n;-\ln (1-p))}{\Gamma (n;-\ln (1-p))}\\ & -\frac{\{\wedge _n(p)\Gamma (n;-\ln (1-p))\}^{\mu }q(p) }{ \int _p^1\{\wedge _n(s)\Gamma (n;-\ln (1-s))\}^{\mu }q(s)\, \textrm{d}s}\\ = & -\mu \psi (n; -\ln (1-p))\\ & -\frac{\{\wedge _n(p)\Gamma (n;-\ln (1-p))\}^{\mu }q(p) }{ e^{[(1-\mu )\Phi _{\mu }(R_{n}; p)+\mu \ln \Gamma (n; -\ln (1-p)) ]}}, \end{aligned}$$

so that

$$\begin{aligned} & e^{[(1-\mu )\Phi _{\mu }(R_{n}; p)+\mu \ln \Gamma (n; -\ln (1-p)) ]}\\ & \quad \times [(1-\mu )\Phi '_{\mu }(R_{n}; p)+\mu \psi (n; -\ln (1-p))] \\ & \qquad = [\wedge _n(p)\Gamma (n;-\ln (1-p))]^{\mu - 1}\Gamma ' (n; -\ln (1-p)). \end{aligned}$$

Differentiating it again with respect to p, we get

$$\begin{aligned} & e^{[(1-\mu )\Phi _{\mu }(R_{n}; p)+\mu \ln \Gamma (\cdot ) ]}\left[ \{(1-\mu )\Phi '_{\mu }(R_{n}; p)+\mu \psi (\cdot ) \}^{2} \right. \\ & \qquad \left. + (1-\mu )\Phi ''_{\mu }(R_{n}; p)\right. \left. +\mu \psi _{1}(\cdot )\right] \\ & \quad = (\wedge _{n}(p))^{\mu - 1} (\Gamma (\cdot ))^{\mu }[\psi _{1}(\cdot )\\ & \qquad + (\psi (\cdot ))^{2}+(\mu - 1)\Gamma ' (\cdot )(\psi (\cdot ) \wedge _{n}(p)+\wedge '_{n}(p))], \end{aligned}$$

where \(\Gamma (\cdot ) = \Gamma (n;-\ln (1-p)), \Gamma '(\cdot ) = \Gamma '(n;-\ln (1-p)), \psi (\cdot ) = \psi (n;-\ln (1-p))\),

and \(\psi _{1}(\cdot ) = \psi _{1}(n;-\ln (1-p))= \psi '(n; -\ln (1-p))\) is the truncated upper trigamma function. Solving for \(\wedge '_n(p)\), we obtain

$$\begin{aligned}&\wedge '_n(p)\\ &\quad = \frac{e^{[(1-\mu )\Phi _{\mu }(R_{n}; p)+\mu \ln \Gamma (\cdot ) ]}\left[ \{(1-\mu )\Phi '_{\mu }(R_{n}; p)+\mu \psi (\cdot ) \}^{2}+ (1-\mu )\Phi ''_{\mu }(R_{n}; p)+\mu \psi _{1}(\cdot )\right] }{(\mu - 1)(\wedge _{n}(p))^{\mu - 1}(\Gamma (\cdot ))^{\mu + 1}\psi (\cdot )}\\ &\qquad -\frac{\psi _{1}(\cdot )+(\psi (\cdot ))^{2}}{(\mu - 1)\Gamma ' (\cdot )} -\psi (\cdot ) \wedge _{n}(p) . \end{aligned}$$

Taking \(\wedge _n(p) = y\) and \(\Phi _{\mu }(R_{n}; p) =f(p)\), we get

$$\begin{aligned} y' = F(p,y), \end{aligned}$$
(4.1)

where,

$$\begin{aligned}&F(p,y) \\ &\quad = \frac{e^{[(1-\mu )f(p)+\mu \ln \Gamma (\cdot ) ]}\left[ \{(1-\mu )f'(p)+\mu \psi (\cdot ) \}^{2}+ (1-\mu )f''(p)+\mu \psi _{1}(\cdot )\right] }{(\mu - 1)y^{\mu - 1}(\Gamma (\cdot ))^{\mu + 1}\psi (\cdot )}\\ &\qquad -\frac{\psi _{1}(\cdot )+(\psi (\cdot ))^{2}}{(\mu - 1)\Gamma ' (\cdot )} -\psi (\cdot )y. \end{aligned}$$

Theorem 4.1 and Lemma 4.1 together show that the initial value problem (IVP) (4.1) possesses a unique solution \(y = \wedge _n(p)\), which uniquely characterizes the parent distribution.

5 Concluding remarks

Quantile versions of Rényi entropy and its dynamic forms for record statistics work with the quantiles of the distribution rather than the original data, offering insight into the dynamics of extreme events and their underlying distributions. These dynamic versions can be valuable in fields such as environmental science, finance, and reliability engineering, where understanding extreme events is critical.

In this research, we have derived expressions for quantile-based Rényi entropy and its dynamic versions (residual and past) for the \(n^{th}\) upper record value and examined them for various generalized and lifetime distributions commonly used in life testing. Additionally, we have obtained an alternative expression for Rényi entropy in terms of an expectation under the Gamma distribution, and for the dynamic forms in terms of expectations under the upper and lower truncated Gamma distributions, demonstrating their connection with other distributions. These expressions have been used to establish a bound on the quantile Rényi residual entropy of the \(n^{th}\) upper record in terms of the mode of the truncated Gamma distribution. We have also presented a unique characterization result for distributions using the residual form of Rényi entropy of the \(n^{th}\) upper record and the hazard quantile function.

The potential applications of quantile analysis of Rényi entropy for records are extensive and interdisciplinary, spanning fields where analyzing extreme events and quantifying uncertainty are crucial for decision-making and risk management. Ongoing research in this area is likely to yield valuable insights with significant practical implications. For those willing to adopt a more intricate and flexible model, the present work can be extended to more generalized models and entropies.