  • Original research
  • Open access

On the joint distribution of order statistics from independent non-identical bivariate distributions

Abstract

In this note, the exact joint probability density function (jpdf) of bivariate order statistics from independent non-identical bivariate distributions is obtained. Furthermore, this result is applied to derive the joint distribution of a new sample rank based on the rth order statistic of the first component and the sth order statistic of the second component.

Introduction

Multivariate order statistics, especially bivariate order statistics, have attracted the interest of several researchers; see, for example, [1]. The distribution of bivariate order statistics can be easily obtained from the bivariate binomial distribution, which was first introduced by [2]. Considering a bivariate sample, David et al. [3] studied the distribution of the sample rank for a concomitant of an order statistic. Bairamov and Kemalbay [4] introduced new modifications of the bivariate binomial distribution, which can be applied to derive the distribution of bivariate order statistics when a certain number of observations fall within a given threshold set. Barakat [5] derived the exact explicit expression for the product moments (of any order) of bivariate order statistics from an arbitrary continuous bivariate distribution function (df). Bairamov and Kemalbay [6] used the jpdf derived in [5] to obtain the joint distribution of a new sample rank of bivariate order statistics. Moreover, Barakat [7] studied the limit behavior of the extreme order statistics arising from n two-dimensional independent and non-identically distributed random vectors. The class of limit dfs of multivariate order statistics from independent and identical random vectors with random sample size was fully characterized in [8].

Consider n two-dimensional independent random vectors \({{\underline W}_{j}}=(X_{j},Y_{j})\), j=1,2,...,n, with the respective distribution function (df) \(F_{j}(\underline w)=F_{j}(x,y)= P(X_{j}\leq x, Y_{j}\leq y), j=1,2,...,n \). Let \(X_{1:n}\leq X_{2:n}\leq...\leq X_{n:n}\) and \(Y_{1:n}\leq Y_{2:n}\leq...\leq Y_{n:n}\) be the order statistics of the X and Y samples, respectively. The main object of this work is to derive the jpdf of the random vector \(Z_{k,k^{\prime }:n}= (X_{n-k+1:n},Y_{n-k^{\prime }+1:n}),\) where \(1\leq k, k'\leq n\). Let \(G_{j}(\underline {w})=P({\underline {W}}_{j}>\underline {w})\) be the survival function of \(F_{j}(\underline {w}), j=1,2,...,n\), write \(\Phi _{k,k':n}(\underline {w})= P(Z_{k,k':n}\leq \underline {w})\), and let \(F_{1,j}(.)\), \(F_{2,j}(.)\) and \(G_{1,j}(.)=1-F_{1,j}(.)\), \(G_{2,j}(.)=1-F_{2,j}(.)\) be the marginal dfs and the marginal survival functions of \(F_{j}(\underline {w})\) and \(G_{j}(\underline {w}), j=1,2,...,n,\) respectively. Furthermore, let \({F_{j}}^{1,.}=\frac {\partial F_{j}(\underline {w})}{\partial {x}}\) and \({F_{j}}^{.,1}=\frac {\partial F_{j}(\underline {w})}{\partial {y}}.\) Also, the jpdf of \((X_{n-k+1:n},Y_{n-k^{\prime }+1:n})\) is conveniently denoted by \(f_{k,k^{\prime }:n}(\underline {w}).\) Finally, the abbreviations \(\min(a,b)=a\wedge b\) and \(\max(a,b)=a\vee b\) will be adopted.
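To fix ideas, the componentwise construction of \(Z_{k,k':n}\) can be sketched in a few lines of Python (a minimal sketch with hypothetical data; note that the two components come from separate sorts of the X and Y samples, so they need not belong to the same original pair):

```python
import random

# Hypothetical sample of n independent bivariate observations (X_j, Y_j).
random.seed(1)
n = 10
sample = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]

def bivariate_order_statistic(sample, k, kp):
    """Return Z_{k,k':n} = (X_{n-k+1:n}, Y_{n-k'+1:n})."""
    n = len(sample)
    xs = sorted(x for x, _ in sample)   # X_{1:n} <= ... <= X_{n:n}
    ys = sorted(y for _, y in sample)   # Y_{1:n} <= ... <= Y_{n:n}
    return xs[n - k], ys[n - kp]        # 0-based: X_{n-k+1:n} is xs[n-k]

# k = k' = 1 gives the componentwise maxima, k = k' = n the minima.
print(bivariate_order_statistic(sample, 1, 1))
print(bivariate_order_statistic(sample, n, n))
```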

The jpdf of non-identical bivariate order statistics

The following theorem gives the exact formula of the jpdf of non-identical bivariate order statistics.

Theorem 1

The jpdf of non-identical bivariate order statistics is given by

$${{} \begin{aligned} f_{k,k':n}(\underline{w})\,=\,\sum_{\theta,\varphi=0}^{1}\sum_{r=r_{**}}^{r^{**}}\sum_{\rho_{\theta,\varphi,r}}\, \Pi_{j=1}^{\theta}{F}^{.,1}_{i_{j}}(\underline{w})\Pi_{j=\theta+1}^{1}(f_{2,i_{j}}(y)\,-\,{F}^{.,1}_{i_{j}}(\underline{w})) \Pi_{j=2}^{\varphi+1}{F}^{1,.}_{i_{j}}(\underline{w})\\ \times\Pi_{j=\varphi+2}^{2}(f_{1,i_{j}}(x)-{F}^{1,.}_{i_{j}}(\underline{w}))\Pi_{j=3}^{k-\theta-r+1} (F_{1,i_{j}}(x)-{F}_{i_{j}}(\underline{w}))\Pi_{j=k-\theta-r+2}^{k-\theta+1}F_{i_{j}}(\underline{w})\\ \times \Pi_{j=k-\theta+2}^{k+k'-\theta-\varphi-r}(F_{2,i_{j}}(y)-{F}_{i_{j}}(\underline{w})) \Pi_{j=k+k'-\theta-\varphi-r+1}^{n}G_{i_{j}}(\underline{w})+\sum_{r=0\vee(k+k'-n-1)}^{(k-1)\wedge(k'-1)}\sum_{\rho_{r}}f_{i_{1}}(\underline{w})\\ \times\Pi_{j=2}^{k-r}(F_{1,i_{j}}(x)-{F}_{i_{j}}(\underline{w})) \Pi_{j=k-r+1}^{k}F_{i_{j}}(\underline{w})\Pi_{j=k+1}^{k+k'-r-1}(F_{2,i_{j}}(y)-{F}_{i_{j}}(\underline{w}))\Pi_{j=k+k'-r}^{n}G_{i_{j}}(\underline{w}), \end{aligned}} $$

where \(r_{**}=0\vee (k+k'-\theta -\varphi -n)\), \(r^{**}= (k-\theta -1)\wedge (k'-\varphi -1)\), and \(\sum _{\rho _{\theta,\varphi,r}}\) (respectively, \(\sum_{\rho_{r}}\)) denotes summation over all permutations \(i_{1},...,i_{n}\) of 1,2,...,n for which \(i_{j_{1}}<...< i_{j_{n}}\) within each product of the type \(\Pi_{j=j_{1}}^{j_{2}}\).

Proof

A convenient expression for \(f_{k,k':n}(\underline {w})\) may be derived by noting that the compound event \(E=\{x<X_{k:n}<x+\delta x,\ y<Y_{k':n}<y+\delta y\}\) may be realized as follows: r, φ1, s1, θ1, ω, θ2, s2, φ2, and t observations must fall, respectively, in the regions I1=(−∞,x]∩(−∞,y]; I2=(x, x+δx]∩(−∞,y]; I3=(x+δx,∞)∩(−∞,y]; I4=(−∞,x]∩(y, y+δy]; I5=(x, x+δx]∩(y, y+δy]; I6=(x+δx,∞)∩(y, y+δy]; I7=(−∞,x]∩(y+δy,∞); I8=(x, x+δx]∩(y+δy,∞); and I9=(x+δx,∞)∩(y+δy,∞), with the corresponding probabilities \(P_{ij}=P({\underline {W}}_{j}\in I_{i}), i=1,2,...,9\). Therefore, the joint density function \(f_{k,k':n}(\underline {w})\) of \((X_{k:n},Y_{k^{\prime }:n})\) is the limit of \(\frac {P(E)}{\delta x\delta y}\) as δx,δy→0, where P(E) can be derived by noting that \(\theta _{1}+\theta _{2}+\omega =\varphi _{1}+\varphi _{2}+\omega =1;\ r+\theta _{1}+s_{2}=k-1;\ r+\varphi _{1}+s_{1}=k'-1;\ r,\theta _{1},s_{2},\varphi _{1},\omega,\theta _{2},s_{1},\varphi _{2},t \geq 0;\ P_{1j}=F_{j}(\underline {w}),\ P_{2j}=F_{j}^{1,.}(\underline {w})\delta x,\ P_{3j}=F_{2,j}(y)-F_{j}(x+\delta x,y),\ P_{4j}= F_{j}^{.,1}(\underline {w})\delta y,\ P_{5j}\cong F_{j}^{1,1}(\underline {w})\delta x\delta y=f_{j}(\underline {w})\delta x\delta y,\ P_{6j}\cong (f_{2,j}(y)-F_{j}^{.,1}(\underline {w}+\delta \underline {w}))\delta y,\) where \(f_{2,j}(y)=\frac {\partial F_{2,j}(y)}{\partial y},j=1, 2,...,n,~ \delta \underline {w}=(\delta x,\delta y), \underline {w}+\delta \underline {w}=(x+\delta x,y+\delta y),\ P_{7j}=F_{1,j}(x)-F_{j}(x,y+\delta y),\ P_{8j}\cong(f_{1,j}(x)-F_{j}^{1,.}(\underline {w}+\delta \underline {w}))\delta {x},\ P_{9j}=1-F_{1,j}(x+\delta x)-F_{2,j}(y+\delta y)+F_{j}(\underline {w}).\) Thus, we get

$$ {{} \begin{aligned} f_{k,k':n}(\underline{w})\!&=\!\sum_{\theta_{1},\varphi_{1},\theta_{2},\varphi_{2}=0}^{1}\sum_{r=r_{*}}^{r^{*}} \sum_{\rho_{\theta_{1},\theta_{2},\varphi_{1},\varphi_{2},\omega,r}}\,\Pi_{j=1}^{\theta_{1}}P_{4i_{j}}\Pi_{j=\theta_{1}+1}^{\theta_{1}+\varphi_{1}}P_{2i_{j}} \Pi_{j=\theta_{1}+\varphi_{1}+1}^{\theta_{1}+\varphi_{1}+\theta_{2}}P_{6i_{j}} \Pi_{j=\theta_{1}+\varphi_{1}+\theta_{2}+1}^{\theta_{1}+\varphi_{1}+\theta_{2}+\varphi_{2}}P_{8i_{j}}\\ &\Pi_{j=\theta_{1}+\varphi_{1}+\theta_{2}+\varphi_{2}\,+\,1}^{\theta_{1}+\varphi_{1}+\theta_{2}+\varphi_{2}+\omega}P_{5i_{j}} \Pi_{j=\theta_{1}+\varphi_{1}+\theta_{2}+\varphi_{2}+\omega+1}^{\varphi_{1}+\theta_{2}+\varphi_{2}+\omega+k-r-1}P_{7i_{j}} \Pi_{j=\varphi_{1}+\theta_{2}+\varphi_{2}+\omega+k-r}^{\varphi_{1}\,+\,\theta_{2}+\varphi_{2}+\omega+k-1}P_{1i_{j}} \Pi_{j=\varphi_{1}\,+\,\theta_{2}+\varphi_{2}+\omega\!+k}^{\theta_{2}\!+\varphi_{2}+\omega+k+k'\!-r\,-\,2}P_{3i_{j}}\\ &\Pi_{j=\theta_{2}+\varphi_{2}+\omega+k+k'-r-1}^{n}P_{9i_{j}}, \end{aligned}} $$
(1)

where \(r_{*}=0\vee (k+k'-\theta _{1}-\varphi _{1}-\omega-n), r^{*}= (k-\theta _{1}-1)\wedge (k'-\varphi _{1}-1),\sum _{\rho }\) denotes summation subject to the condition ρ, and \(\sum _{\rho _{\theta _{1},\theta _{2},\varphi _{1},\varphi _{2},\omega,r}}\) denotes summation over all permutations \(i_{1},...,i_{n}\) of 1,2,...,n such that \(i_{j_{1}}<...< i_{j_{n}}\) for each product of the type \(\Pi _{j=j_{1}}^{j_{2}}\). Moreover, if j1>j2, then \(\Pi _{j=j_{1}}^{j_{2}}=1\). But (1) can be written in the following simpler form:

$${{} \begin{aligned} P(E)=\sum_{\theta,\varphi=0}^{1}\sum_{r=r_{**}}^{r^{**}}\sum_{\rho_{\theta,\varphi,r}}\,\Pi_{j=1}^{\theta}P_{4i_{j}}\Pi_{j= \theta+1}^{1}P_{6i_{j}}\Pi_{j=2}^{\varphi+1}P_{2i_{j}} \Pi_{j=\varphi+2}^{2}P_{8i_{j}}\Pi_{j=3}^{k-\theta-r+1}P_{7i_{j}}\Pi_{j=k-\theta-r+2}^{k-\theta+1}P_{1i_{j}} \\ \Pi_{j=k-\theta+2}^{k+k'-\theta-\varphi-r}P_{3i_{j}} \Pi_{j=k+k'-\theta-\varphi-r+1}^{n}P_{9i_{j}}+\sum_{r=0\vee(k+k'-n-1)}^{(k-1)\wedge(k'-1)}\sum_{\rho_{r}}P_{5i_{1}}\Pi_{j=2}^{k-r}P_{7i_{j}} \Pi_{j=k-r+1}^{k}P_{1i_{j}} \Pi_{j=k+1}^{k+k'-r-1}P_{3i_{j}}\Pi_{j=k+k'-r}^{n}P_{9i_{j}}, \end{aligned}} $$

where \(r_{**}=0\vee (k+k'-\theta-\varphi-n)\) and \(r^{**}=(k-\theta-1)\wedge (k'-\varphi-1)\). Therefore,

$$ {{} \begin{aligned} f_{k,k':n}(\underline{w})=\sum_{\theta,\varphi=0}^{1}\sum_{r=r_{**}}^{r^{**}}\sum_{\rho_{\theta,\varphi,r}}\,\Pi_{j=1}^{\theta}P_{4i_{j}} \Pi_{j=\theta+1}^{1}P_{6i_{j}}\Pi_{j=2}^{\varphi+1}P_{2i_{j}} \Pi_{j=\varphi+2}^{2}P_{8i_{j}}\Pi_{j=3}^{k-\theta-r+1}P_{7i_{j}}\\ \Pi_{j=k-\theta-r+2}^{k-\theta+1}P_{1i_{j}}\Pi_{j=k-\theta+2}^{k+k'-\theta-\varphi-r}P_{3i_{j}} \Pi_{j=k+k'-\theta-\varphi-r+1}^{n}P_{9i_{j}}+\sum_{r=0\vee(k+k'-n-1)}^{(k-1)\wedge(k'-1)}\sum_{\rho_{r}}P_{5i_{1}}\Pi_{j=2}^{k-r}P_{7i_{j}}\\ \Pi_{j=k-r+1}^{k}P_{1i_{j}}\Pi_{j=k+1}^{k+k'-r-1}P_{3i_{j}}\Pi_{j=k+k'-r}^{n}P_{9i_{j}}. \end{aligned}} $$
(2)

Thus, we get

$$ {{} \begin{aligned} f_{k,k':n}(\underline{w})=\sum_{\theta,\varphi=0}^{1}\sum_{r=r_{**}}^{r^{**}}\sum_{\rho_{\theta,\varphi,r}}\, \Pi_{j=1}^{\theta}{F}^{.,1}_{i_{j}}(\underline{w})\Pi_{j=\theta+1}^{1}(f_{2,i_{j}}(y)-{F}^{.,1}_{i_{j}}(\underline{w})) \Pi_{j=2}^{\varphi+1}{F}^{1,.}_{i_{j}}(\underline{w})\\ \Pi_{j=\varphi+2}^{2}(f_{1,i_{j}}(x)-{F}^{1,.}_{i_{j}}(\underline{w}))\Pi_{j=3}^{k-\theta-r+1} (F_{1,i_{j}}(x)-{F}_{i_{j}}(\underline{w}))\Pi_{j=k-\theta-r+2}^{k-\theta+1}F_{i_{j}}(\underline{w}) \Pi_{j=k-\theta+2}^{k+k'-\theta-\varphi-r}(F_{2,i_{j}}(y)-{F}_{i_{j}}(\underline{w}))\\ \Pi_{j=k+k'-\theta-\varphi-r+1}^{n}G_{i_{j}}(\underline{w})+\sum_{r=0\vee(k+k'-n-1)}^{(k-1)\wedge(k'-1)}\sum_{\rho_{r}}f_{i_{1}}(\underline{w}) \Pi_{j=2}^{k-r}(F_{1,i_{j}}(x)-{F}_{i_{j}}(\underline{w}))\\ \Pi_{j=k-r+1}^{k}F_{i_{j}}(\underline{w})\Pi_{j=k+1}^{k+k'-r-1}(F_{2,i_{j}}(y)-{F}_{i_{j}}(\underline{w}))\Pi_{j=k+k'-r}^{n}G_{i_{j}}(\underline{w}). \end{aligned}} $$
(3)

This completes the proof.

Relation (3) may be written in terms of permanents (cf. [9]) as follows:

$$ {{} \begin{aligned} f_{k,k':n}(\underline{w})= \sum_{\theta,\varphi=0}^{1}\sum_{r=r_{**}}^{r^{**}}\frac{1}{(k-\theta-r-1)!\,r!\,(k'-\varphi-r-1)!\,(n-k-k'+\theta+\varphi+r)!}\\ \begin{array}{cccccccc}\text{Per} [\underline{U}^{.,1}_{1,1}&(\underline{U}^{1}_{.,1}\,-\,\underline{U}^{.,1}_{1,1})&\underline{U}^{1,.}_{1,1}& \left(\underline{U}^{1}_{1,.}\,-\,\underline{U}^{1,.}_{1,1}\right)& \left(\underline{U}_{1,.}\,-\,\underline{U}_{1,1}\right)&\underline{U}_{1,1}&(\underline{U}_{.,1}\,-\,\underline{U}_{1,1})& (\underline{1}-\underline{U}_{1,.}-\underline{U}_{.,1}+\underline{U}_{1,1})]\\ \theta & 1-\theta & \varphi & 1-\varphi & k-\theta-r-1 & r & k'-\varphi-r-1 & n-k-k'+\theta+\varphi+r \end{array}\\ +\sum_{r=0\vee(k+k'-n-1)}^{(k-1)\wedge(k'-1)}\frac{1}{(k-r-1)!\,r!\,(k'-r-1)!\,(n-k-k'+r+1)!}\, \begin{array}{ccccc}\text{Per} [\underline{U}^{1,1}_{1,1}&(\underline{U}_{1,.}-\underline{U}_{1,1})&\underline{U}_{1,1}& (\underline{U}_{.,1}-\underline{U}_{1,1})& (\underline{1}-\underline{U}_{1,.}-\underline{U}_{.,1}+\underline{U}_{1,1})]\\ 1 & k-r-1 & r & k'-r-1 & n-k-k'+r+1 \end{array}, \end{aligned}} $$
(4)

where \(\underline U_{1,.}=(F_{1,1}(x)~~F_{1,2}(x)~...~ F_{1,n}(x))',\ \underline U_{.,1}=(F_{2,1}(y)~~F_{2,2}(y)~...~ F_{2,n}(y))',\ \underline U_{1,1}=(F_{1}(\underline w)~~F_{2}(\underline w)~...~ F_{n}(\underline w))'\), the vectors \(\underline U^{1,.}_{1,1}\), \(\underline U^{.,1}_{1,1}\), \(\underline U^{1,1}_{1,1}\), \(\underline U^{1}_{1,.}\), and \(\underline U^{1}_{.,1}\) similarly collect \(F^{1,.}_{j}(\underline w)\), \(F^{.,1}_{j}(\underline w)\), \(f_{j}(\underline w)\), \(f_{1,j}(x)\), and \(f_{2,j}(y)\), respectively, and \(\underline 1\) is the n×1 column vector of ones. Moreover, if \({\underline {a}}_{1}, {\underline {a}}_{2},... \) are column vectors, then

$$\begin{array}{cccc} \text{Per}[&{\underline{a}}_1~~&~~{\underline{a}}_2~~&~~...]\\ &~~{i_{1}}~~&~~{i_{2}}~~&~~... \end{array} $$

will denote the matrix obtained by taking i1 copies of \({\underline {a}}_{1},\ i_{2}\) copies of \({\underline {a}}_{2},\) and so on.
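The Per[·] notation can be made concrete with a small brute-force evaluator (a hypothetical helper for illustration only; its O(n·n!) cost limits it to tiny n):

```python
from itertools import permutations
from math import prod

def per(matrix):
    """Permanent of a square matrix: like the determinant but without signs."""
    n = len(matrix)
    return sum(prod(matrix[i][sigma[i]] for i in range(n))
               for sigma in permutations(range(n)))

def per_with_copies(columns, counts):
    """Per[a_1 a_2 ...; i_1 i_2 ...]: take counts[t] copies of the column
    vector columns[t], assemble the resulting square matrix, and return
    its permanent."""
    cols = [c for c, m in zip(columns, counts) for _ in range(m)]
    n = len(cols[0])
    return per([[cols[j][i] for j in range(n)] for i in range(n)])

# Example: a 3 x 3 matrix of ones has permanent 3! = 6.
print(per_with_copies([[1, 1, 1]], [3]))  # 6
```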

Finally, for k=k′=1 in (3), we get

$$\begin{array}{*{20}l} f_{1,1:n}(\underline{w})=\sum_{\rho_{\theta,\varphi,r}}\, (f_{2,i_{1}}(y)-{F}^{.,1}_{i_{1}}(\underline{w})) (f_{1,i_{2}}(x)-{F}^{1,.}_{i_{2}}(\underline{w}))\Pi_{j=3}^{n} G_{i_{j}}(\underline{w})+\sum_{\rho_{r}}f_{i_{1}}(\underline{w}) \Pi_{j=2}^{n}G_{i_{j}}(\underline{w}). \end{array} $$

Also, for k=k′=n, we get

$$\begin{array}{*{20}l} {} f_{n,n:n}(\underline{w})=\sum_{\rho_{\theta,\varphi,r}}\, {F}^{.,1}_{i_{1}}(\underline{w}){F}^{1,.}_{i_{2}}(\underline{w})\Pi_{j=3}^{n}F_{i_{j}}(\underline{w})+\sum_{\rho_{r}}f_{i_{1}}(\underline{w}) \Pi_{j=2}^{n}F_{i_{j}}(\underline{w}). \end{array} $$

Joint distribution of the new sample rank of \(X_{r:n}\) and \(Y_{s:n}\)

Consider n two-dimensional independent random vectors \(\underline {W}_{j}=(X_{j},Y_{j}),j= 1,...,n,\) with the respective df \(F_{j}(\underline {w})\) and jpdf \(f_{j}(\underline {w})\). Further, assume that \((X_{n+1},Y_{n+1}), (X_{n+2},Y_{n+2}),..., (X_{n+m},Y_{n+m})\), m≥1, is another random sample with absolutely continuous dfs \(G^{*}_{j}(x,y), j= 1,...,m\), and jpdfs \(g_{j}(x,y)\). We assume that the two samples \((X_{n+1},Y_{n+1}),(X_{n+2},Y_{n+2}),...,(X_{n+m}, Y_{n+m})\) and \((X_{1},Y_{1}),(X_{2},Y_{2}),...,(X_{n},Y_{n})\) are independent.

For \(1\leq r, s\leq n\) and m≥1, we define the random variables η1 and η2 as follows:

$$\begin{array}{*{20}l} \eta_{1}=\sum_{i=1}^{m}I_{(X_{r:n}-X_{n+i})} \end{array} $$

and

$$\begin{array}{*{20}l} \eta_{2}=\sum_{i=1}^{m}I_{(Y_{s:n}-Y_{n+i})}, \end{array} $$

where I(x) is the indicator function: I(x)=1 if x>0 and I(x)=0 if x≤0. The random variables η1 and η2 are referred to as exceedance statistics. Clearly, η1 is the total number of new X observations Xn+1,Xn+2,..., Xn+m that do not exceed the random threshold based on the rth order statistic Xr:n. Similarly, η2 is the number of new observations Yn+1,Yn+2,...,Yn+m that do not exceed Ys:n.

The random variable ζ1=η1+1 gives the rank of Xr:n in the new sample Xn+1, Xn+2,..., Xn+m, and ζ2=η2+1 gives the rank of Ys:n in the new sample Yn+1,Yn+2,...,Yn+m. We are interested in the joint probability mass function of ζ1 and ζ2, and we will use the representation P(ζ1=p,ζ2=q)=P(η1=p−1,η2=q−1).
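The exceedance statistics and the resulting ranks can be computed directly from their definitions; the sketch below (hypothetical data and a hypothetical helper name) counts the new observations that do not exceed the thresholds \(X_{r:n}\) and \(Y_{s:n}\):

```python
def new_sample_ranks(old, new, r, s):
    """Compute (zeta_1, zeta_2) = (eta_1 + 1, eta_2 + 1), where eta_1 counts
    new X-observations not exceeding X_{r:n} and eta_2 counts new
    Y-observations not exceeding Y_{s:n} (I(t) = 1 iff t > 0)."""
    xs = sorted(x for x, _ in old)
    ys = sorted(y for _, y in old)
    x_thr, y_thr = xs[r - 1], ys[s - 1]              # X_{r:n}, Y_{s:n}
    eta1 = sum(1 for x, _ in new if x_thr - x > 0)   # I(X_{r:n} - X_{n+i})
    eta2 = sum(1 for _, y in new if y_thr - y > 0)   # I(Y_{s:n} - Y_{n+i})
    return eta1 + 1, eta2 + 1

old = [(0.1, 1.0), (0.5, 0.2), (0.9, 0.7)]       # n = 3 "old" pairs
new = [(0.3, 0.4), (0.6, 0.1), (0.95, 0.9)]      # m = 3 "new" pairs
print(new_sample_ranks(old, new, r=2, s=2))      # → (2, 3)
```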

Definition 1

Denote \(A=\{X_{n+i}\leq X_{r:n}\}\), \(A^{c}=\{X_{n+i}> X_{r:n}\}\), \(B=\{Y_{n+i}\leq Y_{s:n}\}\), and \(B^{c}=\{Y_{n+i}> Y_{s:n}\}\). Assume that in a fourfold sampling scheme, the outcome of the random experiment is one of the events A or \(A^{c}\), and simultaneously one of B or \(B^{c}\), where \(A^{c}\) is the complement of A.

In m independent repetitions of this experiment, if A appears together with B exactly \(\ell\) times, then A appears together with \(B^{c}\) \(p-1-\ell\) times, B appears together with \(A^{c}\) \(q-1-\ell\) times, and \(A^{c}\) appears together with \(B^{c}\) \(m-p-q+\ell+2\) times.

Clearly, the random variables η1 and η2 are the numbers of occurrences of the events A and B, respectively, in m independent trials of the fourfold sampling scheme. By conditioning on Xr:n=x and Ys:n=y, the joint distribution of η1 and η2 can be obtained from the bivariate binomial distribution for the fourfold sampling scheme with events \(A=\{X_{n+i}\leq x\}\), \(B=\{Y_{n+i}\leq y\}\), and the respective probabilities

$$\begin{array}{*{20}l} P(AB)=P(X_{n+i}\leq x,Y_{n+i}\leq y),\\ P(AB^{c})=P(X_{n+i}\leq x,Y_{n+i}> y),\\ P(A^{c}B)=P(X_{n+i}> x,Y_{n+i}\leq y),\\ P(A^{c}B^{c})=P(X_{n+i}> x,Y_{n+i}> y). \end{array} $$
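These four probabilities partition the sample space, so they must sum to one. A minimal Monte Carlo check, assuming purely for illustration that each new pair has independent U(0,1) components (so, e.g., P(AB)=xy in closed form):

```python
import random

def fourfold_cells(pairs, x, y):
    """Empirical frequencies of the four events AB, AB^c, A^cB, A^cB^c
    for A = {X <= x} and B = {Y <= y}."""
    m = len(pairs)
    ab = sum(1 for u, v in pairs if u <= x and v <= y) / m
    ab_c = sum(1 for u, v in pairs if u <= x and v > y) / m
    a_cb = sum(1 for u, v in pairs if u > x and v <= y) / m
    a_cb_c = sum(1 for u, v in pairs if u > x and v > y) / m
    return ab, ab_c, a_cb, a_cb_c

random.seed(7)
pairs = [(random.random(), random.random()) for _ in range(100_000)]
x, y = 0.4, 0.6
cells = fourfold_cells(pairs, x, y)
# With independent U(0,1) components: P(AB) = x*y, P(AB^c) = x*(1-y), etc.
print(cells)
```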

Now, we can state the following theorem.

Theorem 2

The joint probability mass function of ζ1 and ζ2 is given by

$${{} \begin{aligned} &P(\zeta_{1}=p, \zeta_{2}=q) = P(\eta_{1}= p-1,\eta_{2}=q-1)= \sum_{\ell=\max (0, p+q-m-2)}^{\min (p-1,q-1)}\sum_{\rho_{\ell}}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\\&\Pi_{j=1}^{\ell}G^{*}_{i_{j}}(x,y) \Pi_{j=\ell+1}^{p-1}\left[G^{*}_{1,i_{j}}(x)-G^{*}_{i_{j}}(x,y)\right]\Pi_{j=p}^{p+q-\ell-2}\left[G^{*}_{2,i_{j}}(y)-G^{*}_{i_{j}}(x,y)\right]\\ &\Pi_{j=p+q-\ell-1}^{m}\left[1-G^{*}_{1,i_{j}}(x)-G^{*}_{2,i_{j}}(y)+G^{*}_{i_{j}}(x,y)\right] f_{r,s:n}(\underline{w})\, dxdy, \end{aligned}} $$

where \(p,q= 1,...,m+1\), \(\sum_{\rho_{\ell}}\) denotes summation over all permutations \(i_{1},...,i_{m}\) of 1,2,...,m (as in Theorem 1), \(G^{*}_{1,i_{j}}\) and \(G^{*}_{2,i_{j}}\) are the marginal dfs of \(G^{*}_{i_{j}}\), and \(f_{r,s:n}(\underline {w})\) is the jpdf of \((X_{r:n},Y_{s:n})\) given by (3).

Proof

Consider the fourfold sampling scheme described in Definition 1. By conditioning on Xr:n=x and Ys:n=y, we obtain

$$ {{} \begin{aligned} P(\zeta_{1}=p, \zeta_{2}=q) &\equiv P(\eta_{1}= p-1,\eta_{2}=q-1)= P\left\{\sum_{i=1}^{m}I_{(X_{r:n}-X_{n+i})}=p-1,\sum_{i=1}^{m}I_{(Y_{s:n}-Y_{n+i})}=q-1\right\}\\ &=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}P\left\{\sum_{i=1}^{m}I_{(X_{r:n}-X_{n+i})}=p-1,\sum_{i=1}^{m}I_{(Y_{s:n}-Y_{n+i})}=q-1\Big| X_{r:n}=x, Y_{s:n}=y\right\}\\ &\quad\times f_{r,s:n}(x,y)\, dxdy\\ &=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}P\left\{\sum_{i=1}^{m}I_{(x-X_{n+i})}=p-1,\sum_{i=1}^{m}I_{(y-Y_{n+i})}=q-1\right\}{dF}_{r,s:n}(x,y). \end{aligned}} $$
(5)

On the other hand,

$$ {{} \begin{aligned} P\left(\sum_{i=1}^{m}I_{(x-X_{n+i})}=p-1,\sum_{i=1}^{m}I_{(y-Y_{n+i})}=q-1\right)=\sum_{\ell=\max (0,p+q-m-2)}^{\min(p-1,q-1)}\sum_{\rho_{\ell}}\Pi_{j=1}^{\ell}P_{i_{j}}(AB)\Pi_{j=\ell+1}^{p-1}P_{i_{j}}(AB^{c})\\ \Pi_{j=p}^{p+q-\ell-2}P_{i_{j}}(A^{c}B)\Pi_{j=p+q-\ell-1}^{m}P_{i_{j}}(A^{c}B^{c}). \end{aligned}} $$
(6)

Substituting (6) in (5), we get

$${{} \begin{aligned} P(\zeta_{1}=p, \zeta_{2}=q) = P(\eta_{1}= p-1,\eta_{2}=q-1)= \sum_{\ell=\max (0, p+q-m-2)}^{\min (p-1,q-1)}\sum_{\rho_{\ell}}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\Pi_{j=1}^{\ell}G^{*}_{i_{j}}(x,y)\\ \Pi_{j=\ell+1}^{p-1}\left[G^{*}_{1,i_{j}}(x)-G^{*}_{i_{j}}(x,y)\right]\Pi_{j=p}^{p+q-\ell-2}\left[G^{*}_{2,i_{j}}(y)-G^{*}_{i_{j}}(x,y)\right]\\ \Pi_{j=p+q-\ell-1}^{m}\left[1-G^{*}_{1,i_{j}}(x)-G^{*}_{2,i_{j}}(y)+G^{*}_{i_{j}}(x,y)\right] f_{r,s:n}(\underline{w})\, dxdy, \end{aligned}} $$

where p, q=1,...,m+1. This completes the proof.
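As an empirical sanity check on the result above, the joint pmf of (ζ1, ζ2) can be estimated by simulation. The sketch below assumes, purely for illustration, iid pairs with independent U(0,1) components (a special case only; the theorem itself covers non-identical distributions):

```python
import random

def rank_pmf(n, m, r, s, reps=20_000, seed=3):
    """Monte Carlo estimate of P(zeta_1 = p, zeta_2 = q) for iid pairs with
    independent U(0,1) components (an illustrative special case only)."""
    random.seed(seed)
    counts = {}
    for _ in range(reps):
        old = [(random.random(), random.random()) for _ in range(n)]
        new = [(random.random(), random.random()) for _ in range(m)]
        xs = sorted(x for x, _ in old)
        ys = sorted(y for _, y in old)
        z1 = 1 + sum(1 for x, _ in new if x < xs[r - 1])   # eta_1 + 1
        z2 = 1 + sum(1 for _, y in new if y < ys[s - 1])   # eta_2 + 1
        counts[(z1, z2)] = counts.get((z1, z2), 0) + 1
    return {pq: c / reps for pq, c in counts.items()}

pmf = rank_pmf(n=5, m=4, r=3, s=3)
print(sum(pmf.values()))  # sums to 1 up to float rounding
```

The estimated masses can then be compared cell by cell with the double-integral formula for any chosen (n, m, r, s).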

References

  1. Galambos, J.: Order statistics of samples from multivariate distributions. J. Amer. Statist. Assoc. 70(351), 674–680 (1975).


  2. Aitken, A. C., Gonin, H. T.: On fourfold sampling with and without replacement. Proc. Roy. Soc. Edinburgh. 55, 114–125 (1935).


  3. David, H. A., O’Connell, M. J., Yang, S. S.: Distribution and expected value of the rank of a concomitant of an order statistic. Ann. Statist. 5, 216–223 (1977).


  4. Bairamov, I., Kemalbay, G.: Some novel discrete distributions under fourfold sampling schemes and conditional bivariate order statistics. J. Comput. Appl. Math. 248, 1–14 (2013).


  5. Barakat, H. M.: On moments of bivariate order statistics. Ann. Instit. Statist. Math. 51(2), 351–358 (1999).


  6. Bairamov, I., Kemalbay, G.: Joint distribution of new sample rank of bivariate order statistics. J. Appl. Statist. 42(10), 2280–2289 (2015).


  7. Barakat, H. M.: Limit theorems for bivariate extremes of non-identically distributed random variables. Appl. Math. 29(4), 371–386 (2002).


  8. Barakat, H. M., Nigm, E. M., Al-Awady, M. A.: Asymptotic properties of multivariate order statistics with random index. Bull. Malaysian Math. Soc. (second series). 38(1), 289–301 (2015).


  9. Bapat, R. B., Beg, M. I.: Order statistics for nonidentically distributed variables and permanents. Sankhya Indian J. Stat. Ser. A. 51(1), 79–93 (1989).



Acknowledgements

Not applicable.

Funding

Not applicable.

Availability of data and materials

Not applicable.

Author information


Contributions

The author read and approved the final manuscript.

Corresponding author

Correspondence to A. R. Omar.

Ethics declarations

Competing interests

The author declares that she has no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Omar, A.R. On the joint distribution of order statistics from independent non-identical bivariate distributions. J Egypt Math Soc 27, 29 (2019). https://doi.org/10.1186/s42787-019-0034-9


