Reliability theory is a branch of statistics that deals with the general regularities of failure. Its importance lies in its treatment of the lifetimes of human beings, organisms, structures, materials, and so on, and it is widely used in the biological, engineering, and medical sciences. Foundational contributions to reliability theory are given in [1, 2]. Statisticians and reliability analysts have used classifications of life distributions, based on aspects of aging, to model survival data. Aging notions describe how a population of units or systems improves or deteriorates with age.
Aging properties are the means by which classes of life distributions are defined. The exponential distribution is central to such classifications, as it is a member of each class. The notion of aging is of great importance in many reliability analyses, and many test statistics have been developed for testing exponentiality against different aging alternatives.
The main classes of life distributions are based on the notions new better than used (NBU), new better than used failure rate (NBUFR), new better than average failure rate (NBAFR), new better than used renewal failure rate (NBURFR), new better than renewal used (NBRU), and exponentially better than used in Laplace transform order (EBUL). Many researchers have introduced tests of exponentiality against such classes of life distributions. For testing exponentiality versus the NBU class, see [3]. A new class of life distributions, named NBUCL, was introduced in [4]. The classes NBAFR, NBARFR, NBURFR, and RNBRUE were proposed in [5,6,7,8], respectively.
Renewal classes
Consider a device (system or component) with lifetime T and continuous life distribution F(t) that is put into operation. When a failure occurs, the device is replaced by one of a sequence of mutually independent spare devices, each independent of the first device and identically distributed with the same life distribution F(t). In the long run, the remaining life distribution of the unit in operation at time t is given by the stationary renewal distribution:
$$ {W}_F(t)={\mu_F}^{-1}\underset{0}{\overset{t}{\int }}\overline{F}(u) du,0\le t<\infty, $$
where \( {\mu}_F=\mu =\underset{0}{\overset{\infty }{\int }}\overline{F}(u) du \). The corresponding renewal survival function is
$$ {\overline{W}}_F(t)={\mu_F}^{-1}\underset{t}{\overset{\infty }{\int }}\overline{F}(u) du,0\le t<\infty, $$
For details, see [5, 9].
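As a numerical illustration (not part of the original references), the stationary renewal survival function can be evaluated by quadrature. For the unit exponential, memorylessness gives \( {\overline{W}}_F(t)=\overline{F}(t)={e}^{-t} \); the Python sketch below, with hypothetical helper names, confirms this.

```python
import math

def renewal_survival(sf, t, upper=60.0, n=200_000):
    """W_F-bar(t) = mu^{-1} * int_t^inf sf(u) du, by the trapezoidal rule
    on [0, upper]; sf is assumed negligible beyond `upper`."""
    h = upper / n
    vals = [sf(i * h) for i in range(n + 1)]
    def tail(k):  # trapezoidal integral of sf from grid point k to upper
        return (0.5 * (vals[k] + vals[-1]) + sum(vals[k + 1:-1])) * h
    mu = tail(0)                      # mu = int_0^inf sf(u) du
    k = min(int(round(t / h)), n)     # nearest grid index to t
    return tail(k) / mu

# Unit exponential: survival e^{-u}; W_F-bar should coincide with it.
sf_exp = lambda u: math.exp(-u)
print(renewal_survival(sf_exp, 1.0))  # ≈ exp(-1) ≈ 0.3679
```

For a non-exponential survival function, the same routine gives the stationary renewal survival used throughout this section.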
The NRBU, RNBU, NRBUE, and HNRBUE classes of life distributions were introduced in [10], where the relations among them were also studied. Testing exponentiality versus the NRBU class based on the TTT-transform was investigated in [11]. A test statistic for exponentiality against the RNBU class of life distributions based on a U-statistic was studied in [12].
Definition 1
If X is a random variable with survival function \( \overline{F}(x) \), then X is said to have the renewal new better (worse) than used property, denoted by RNBU (RNWU), if
$$ {\overline{W}}_F\left(x+t\right)\le \left(\ge \right){\overline{W}}_F(x){\overline{W}}_F(t),x\ge 0,t\ge 0. $$
(1.1)
A new class of life distributions, called renewal new better than used in Laplace transform order (RNBUL), has been defined in [13].
Definition 2
X is said to be renewal new better than used in Laplace transform order (RNBUL) if:
$$ \underset{0}{\overset{\infty }{\int }}{e}^{- sx}{\overline{W}}_F\left(x+t\right) dx\le {\overline{W}}_F(t)\underset{0}{\overset{\infty }{\int }}{e}^{- sx}{\overline{W}}_F(x) dx,s\ge 0,t\ge 0. $$
(1.2)
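For the exponential distribution, the inequality in (1.2) holds with equality, since then \( {\overline{W}}_F(x+t)={\overline{W}}_F(x){\overline{W}}_F(t) \). A minimal Python sketch (illustrative helper names, trapezoidal quadrature) checks this:

```python
import math

def lt_tail(wbar, t, s, upper=40.0, n=100_000):
    """Trapezoidal evaluation of int_0^inf e^{-s x} * wbar(x + t) dx."""
    h = upper / n
    total = 0.5 * (wbar(t) + math.exp(-s * upper) * wbar(upper + t))
    for i in range(1, n):
        x = i * h
        total += math.exp(-s * x) * wbar(x + t)
    return total * h

# Unit exponential: W_F-bar(u) = e^{-u}; both sides of (1.2) equal e^{-t}/(1+s).
wbar = lambda u: math.exp(-u)
s, t = 1.0, 0.7
lhs = lt_tail(wbar, t, s)
rhs = wbar(t) * lt_tail(wbar, 0.0, s)
print(lhs, rhs)  # both ≈ exp(-0.7)/2 ≈ 0.2483
```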
Testing exponentiality against RNBUL
In this section, we test the null hypothesis H0 : F is exponential with mean μ, against the alternative H1 : F belongs to the RNBUL class and is not exponential.
The following lemma is essential for the development of our test statistic.
Lemma 2.1.
Let X be an RNBUL random variable with distribution function F; then
$$ \frac{\mu }{s^3}-\frac{\mu }{s^3}E\left({e}^{- sX}\right)-\frac{\mu^2}{s^2}\le \frac{\mu_2}{2{s}^2}E\left({e}^{- sX}\right)-\frac{\mu_2}{2{s}^2}, $$
(2.1)
where
$$ E\left({e}^{- sX}\right)=\underset{0}{\overset{\infty }{\int }}{e}^{- sx} dF(x). $$
Proof.
Since F belongs to the RNBUL class, integrating both sides of the inequality in Definition 2 with respect to t over [0, ∞) gives
$$ \underset{0}{\overset{\infty }{\int }}\underset{0}{\overset{\infty }{\int }}{e}^{- sx}{\overline{W}}_F\left(x+t\right) dxdt\le \underset{0}{\overset{\infty }{\int }}{\overline{W}}_F(t)\underset{0}{\overset{\infty }{\int }}{e}^{- sx}{\overline{W}}_F(t) dxdt, $$
(2.2)
setting
$$ I=\underset{0}{\overset{\infty }{\int }}\underset{0}{\overset{\infty }{\int }}{e}^{- sx}{\overline{W}}_F\left(x+t\right) dxdt. $$
Therefore,
$$ I=\frac{1}{\mu }E\underset{0}{\overset{X}{\int }}\left[\frac{1}{s}X+\frac{1}{s^2}{e}^{-s\left(X-t\right)}-\frac{1}{s^2}-\frac{1}{s}t\right] dt, $$
Then,
$$ I=\frac{\mu_2}{2 s\mu}-\frac{1}{s^3\mu }E\left({e}^{- sX}\right)-\frac{1}{s^2}+\frac{1}{s^3\mu }. $$
(2.3)
Similarly,
$$ II=\underset{0}{\overset{\infty }{\int }}{\overline{W}}_F(t)\underset{0}{\overset{\infty }{\int }}{e}^{- sx}{\overline{W}}_F(x) dxdt, $$
Then,
$$ II=\frac{\mu_2}{2 s\mu}-\frac{\mu_2}{2{s}^2{\mu}^2}+\frac{\mu_2}{2{s}^2{\mu}^2}E\left({e}^{- sX}\right). $$
(2.4)
Substituting (2.3) and (2.4) in (2.2), we get (2.1). This completes the proof.
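As a sanity check of (2.3) and (2.4) (added for illustration): for the unit exponential, μ = 1, μ2 = 2, and E(e−sX) = 1/(1 + s), so both closed forms reduce to 1/(1 + s) and (2.2) holds with equality, as memorylessness requires. In Python:

```python
def I_closed(s, mu=1.0, mu2=2.0, lap=None):
    """Right-hand side of (2.3); defaults are the unit-exponential values."""
    lap = 1.0 / (1.0 + s) if lap is None else lap   # E[e^{-sX}] for Exp(1)
    return mu2 / (2 * s * mu) - lap / (s**3 * mu) - 1 / s**2 + 1 / (s**3 * mu)

def II_closed(s, mu=1.0, mu2=2.0, lap=None):
    """Right-hand side of (2.4); defaults are the unit-exponential values."""
    lap = 1.0 / (1.0 + s) if lap is None else lap
    return mu2 / (2 * s * mu) - mu2 / (2 * s**2 * mu**2) + mu2 * lap / (2 * s**2 * mu**2)

# For Exp(1), the double integrals I and II both evaluate directly to 1/(1+s).
for s in (0.5, 1.0, 2.0):
    print(s, I_closed(s), II_closed(s), 1 / (1 + s))
```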
Empirical test statistic for RNBUL
Let X1, X2, …, Xn be a random sample from a population with distribution function F in the RNBUL class. The measure of departure from exponentiality δ(s) is determined from the previous lemma, where
$$ \delta (s)=\frac{\mu_2}{2{s}^2}E\left({e}^{- sX}\right)-\frac{\mu_2}{2{s}^2}+\frac{\mu }{s^3}E\left({e}^{- sX}\right)+\frac{\mu^2}{s^2}-\frac{\mu }{s^3}. $$
(2.5)
Note that under H0, δ(s) = 0, while under H1, δ(s) > 0. Let \( \hat{\delta}(s) \) be the empirical estimate of δ(s), where
$$ \hat{\delta}(s)=\frac{1}{n\left(n-1\right)}\sum \limits_{i=1}^n\sum \limits_{j\ne i}\left[\frac{{X_i}^2}{2{s}^2}{e}^{-s{X}_j}-\frac{{X_i}^2}{2{s}^2}+\frac{X_i}{s^3}{e}^{-s{X}_j}+\frac{X_i{X}_j}{s^2}-\frac{X_i}{s^3}\right]. $$
To make the test scale invariant under H0, we use \( \hat{\Delta }(s)=\frac{\hat{\delta}(s)}{{\overline{X}}^2} \), where \( \overline{X}=\frac{1}{n}\sum \limits_{i=1}^n{X}_i \) is the sample mean. Then,
$$ \hat{\Delta }(s)=\frac{1}{n\left(n-1\right){\overline{X}}^2}\sum \limits_{i=1}^n\sum \limits_{j\ne i}\left[\frac{{X_i}^2}{2{s}^2}{e}^{-s{X}_j}-\frac{{X_i}^2}{2{s}^2}+\frac{X_i}{s^3}{e}^{-s{X}_j}+\frac{X_i{X}_j}{s^2}-\frac{X_i}{s^3}\right]. $$
(2.6)
Setting
$$ \phi \left({X}_1,{X}_2\right)=\frac{{X_1}^2}{2{s}^2}{e}^{-s{X}_2}-\frac{{X_1}^2}{2{s}^2}+\frac{X_1}{s^3}{e}^{-s{X}_2}+\frac{X_1{X}_2}{s^2}-\frac{X_1}{s^3}, $$
(2.7)
and defining the symmetric kernel
$$ \psi \left({X}_1,{X}_2\right)=\frac{1}{2!}\left[\phi \left({X}_1,{X}_2\right)+\phi \left({X}_2,{X}_1\right)\right], $$
where ψ averages φ over the 2! arrangements of its two arguments; then \( \hat{\delta}(s) \) is equivalent to the U-statistic
$$ {U}_n=\frac{1}{\left(\genfrac{}{}{0pt}{}{n}{2}\right)}\sum \limits_{1\le i<j\le n}\psi \left({X}_i,{X}_j\right). $$
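The statistic in (2.6)–(2.7) can be computed directly; the following Python sketch (toy sample values are hypothetical, for illustration only) evaluates \( \hat{\Delta }(s) \) by a double sum over ordered pairs with i ≠ j:

```python
import math

def phi(x1, x2, s):
    """Kernel phi(X1, X2) of (2.7)."""
    return (x1**2 / (2 * s**2)) * math.exp(-s * x2) - x1**2 / (2 * s**2) \
         + (x1 / s**3) * math.exp(-s * x2) + x1 * x2 / s**2 - x1 / s**3

def delta_hat(xs, s):
    """Scale-adjusted statistic (2.6): the average of phi over ordered
    pairs i != j, divided by the squared sample mean."""
    n = len(xs)
    xbar = sum(xs) / n
    total = sum(phi(xs[i], xs[j], s)
                for i in range(n) for j in range(n) if i != j)
    return total / (n * (n - 1) * xbar**2)

sample = [0.3, 1.1, 0.7, 2.4, 0.9, 1.6]   # hypothetical lifetimes
print(delta_hat(sample, s=1.0))
```

Under H0 the statistic fluctuates around zero, while large positive values favor the RNBUL alternative.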
The following theorem summarizes the asymptotic properties of the test.
Theorem 2.1
(i) As \( n\to \infty \), \( \sqrt{n}\left[\hat{\Delta }(s)-\delta (s)\right] \) is asymptotically normal with mean zero and variance
$$ {\sigma}^2(s)=\mathrm{Var}\left\{\left(E\left({e}^{- sX}\right)-1\right)\left(\frac{X^2}{2{s}^2}+\frac{X}{s^3}\right)+\left({e}^{- sX}-1\right)\left(\frac{\mu_2}{2{s}^2}+\frac{\mu }{s^3}\right)+\frac{2 X\mu}{s^2}\right\}. $$
(2.8)
(ii) Under H0, the variance reduces to
$$ {\sigma_0}^2=\frac{2}{{\left(1+s\right)}^3\left(1+2s\right)},s\ne -1,-\frac{1}{2}. $$
(2.9)
Proof:
Using standard U-statistic theory (see [14]) and direct calculation, the mean and the variance are found as follows:
$$ {\sigma}^2=\mathrm{Var}\left\{\eta (X)\right\}, $$
where η(X) = η1(X) + η2(X),
$$ {\eta}_1(X)=E\left[\phi \left({X}_1,{X}_2\right)\left|{X}_1\right.\right] $$
$$ =\frac{X^2}{2{s}^2}E\left({e}^{- sX}\right)-\frac{X^2}{2{s}^2}+\frac{X}{s^3}E\left({e}^{- sX}\right)+\frac{X\mu}{s^2}-\frac{X}{s^3} $$
and
$$ {\eta}_2(X)=E\left[\phi \left({X}_1,{X}_2\right)\left|{X}_2\right.\right] $$
$$ =\frac{\mu_2}{2{s}^2}{e}^{- sX}-\frac{\mu_2}{2{s}^2}+\frac{\mu }{s^3}{e}^{- sX}+\frac{X\mu}{s^2}-\frac{\mu }{s^3}, $$
Therefore,
$$ \eta (X)=\left(E\left({e}^{- sX}\right)-1\right)\left(\frac{X^2}{2{s}^2}+\frac{X}{s^3}\right)+\left({e}^{- sX}-1\right)\left(\frac{\mu_2}{2{s}^2}+\frac{\mu }{s^3}\right)+\frac{2 X\mu}{s^2} $$
and Eq. (2.8) is deduced.
Under H0, the mean μ0 and the variance \( {\sigma_0}^2 \) are given by
$$ {\mu}_0=E\left(\ {\eta}_0(X)\right)=0, $$
$$ {\sigma_0}^2=E\left[\ {\left({\eta}_0(X)\right)}^2\right], $$
Then, (2.9) is obtained.
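The reduction to (2.9) can be checked numerically (a verification sketch, not part of the original proof). Under H0, substituting μ = 1, μ2 = 2, and E(e−sX) = 1/(1 + s) into η(X) gives η0(X) = aX² + bX + ce−sX + d for explicit coefficients, and E[η0(X)²] follows from the standard exponential moments:

```python
def sigma0_sq(s):
    """E[eta0(X)^2] for X ~ Exp(1), with eta0(X) = a X^2 + b X + c e^{-sX} + d
    obtained by substituting mu = 1, mu2 = 2, E[e^{-sX}] = 1/(1+s) into eta."""
    a = -1.0 / (2 * s * (1 + s))
    b = (1 + 2 * s) / (s**2 * (1 + s))
    c = (1 + s) / s**3
    d = -(1 + s) / s**3
    # Exp(1) moments: E[X^2]=2, E[X^3]=6, E[X^4]=24, E[e^{-sX}]=1/(1+s),
    # E[e^{-2sX}]=1/(1+2s), E[X e^{-sX}]=1/(1+s)^2, E[X^2 e^{-sX}]=2/(1+s)^3.
    return (24 * a * a + 2 * b * b + c * c / (1 + 2 * s) + d * d
            + 12 * a * b + 4 * a * c / (1 + s)**3 + 4 * a * d
            + 2 * b * c / (1 + s)**2 + 2 * b * d + 2 * c * d / (1 + s))

# Compare E[eta0^2] with the closed form 2 / ((1+s)^3 (1+2s)) of (2.9).
for s in (0.5, 1.0, 2.0, 3.0):
    print(s, sigma0_sq(s), 2 / ((1 + s)**3 * (1 + 2 * s)))
```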