Inference on P [ Y < X ] for Geometric Extreme Exponential Distribution

The geometric extreme exponential (GEE) distribution is a statistical model that can be useful for fitting and describing lifetime data. In this paper, we consider the problem of estimating the reliability R = P(Y < X) when X and Y are independent GEE random variables with a common scale parameter but different shape parameters. The probability R = P(Y < X) is also known as the stress-strength reliability parameter and corresponds to the case where a component of strength X is subjected to stress Y. It has applications in engineering, finance and the biomedical sciences. We present the maximum likelihood estimator of R and study its asymptotic behavior. We first derive the asymptotic distribution of the maximum likelihood estimators of the GEE parameters and prove that these estimators, and hence the reliability R, are asymptotically normal. A bootstrap confidence interval for R is also presented. Monte Carlo simulations are performed to assess the performance of the proposed estimation method and the validity of the confidence interval. We find that the performance of the maximum likelihood estimator and of the bootstrap confidence interval is satisfactory even for small sample sizes. Analysis of a dataset is given for illustrative purposes.


Introduction
The study of the stress-strength model is very common in the reliability context. In this model, we are interested in inference on the reliability R = P[Y < X], where X is the strength of a component which is subjected to the stress Y. The system fails whenever the applied stress exceeds the strength, and R is the chosen measure of system performance. The problem of statistical inference on R has been considered by many authors. The earliest work was perhaps that of [30], who considered the case when X and Y are normally distributed. The Gaussian case was further studied by [9], [11], [14] and [39]. The setting where X and Y follow independent exponential distributions was studied by [37,38], [17], [34] and [8]. The more general case where X and Y are independent gamma random variables was considered briefly by [37], and more extensively by [10], [3] and [35,36]. References [31] and [4] studied the cases where X and Y are Burr Type X and Levy random variables, respectively. More recently, the logistic, Laplace, generalized exponential, Weibull and generalized Pareto cases were treated, respectively, by [25], [26], [20], [21] and [32].
The cases where X and Y follow bivariate beta, bivariate gamma and bivariate exponential models were considered by [27,28] and Nadarajah and Kotz [29], respectively.

Recently, [24] considered the case of two independent Poisson half logistic random variables.
In this article, we consider inference on R = P(Y < X) when X and Y are independent geometric extreme exponential (GEE) random variables with a common scale parameter but different shape parameters. The GEE distribution is a nonnegative, right-skewed model that can be used for analyzing lifetime data. It was introduced by [22] and further studied extensively by [23] and [2].
We shall denote the geometric extreme exponential model with shape parameter γ and scale parameter β by GEE(γ, β). The corresponding density function is given by

    f(x; γ, β) = (γ/β) e^(−x/β) / [1 − γ̄ e^(−x/β)]²,   x > 0,   (1)

where γ > 0, β > 0 and γ̄ = 1 − γ. The GEE distribution reduces to the exponential distribution if we take γ = 1. When 0 < γ < 1, it reduces to the exponential-geometric distribution studied by [1]; that distribution has a decreasing failure rate, and we shall not consider this case in this article. We shall denote the GEE distribution function by F(x; γ, β); it is given by

    F(x; γ, β) = (1 − e^(−x/β)) / (1 − γ̄ e^(−x/β)),   x > 0.   (2)

The mean and the variance of the GEE distribution can be expressed in terms of Lerch's transcendent Φ(a, b, c) (see, e.g., [13]).

The rest of the article is organized as follows. In Section 2, we study the problem of finding the maximum likelihood estimator of R. The asymptotic distribution of the maximum likelihood estimators (MLEs) of the parameters, obtained via the Fisher information matrix, is provided in Section 3. A non-parametric bootstrap confidence interval is presented in Section 4. The validity of the proposed estimation method and of the bootstrap confidence interval is assessed by a Monte Carlo simulation study in Section 5, and Section 6 is devoted to the analysis of two simulated datasets. Finally, we conclude with a discussion in Section 7.
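As an aside, the density (1), the distribution function (2) and an inverse-transform sampler are straightforward to code. The sketch below is our own illustration (the function names are not from the paper); inverting F(x; γ, β) = q gives x = −β ln[(1 − q)/(1 − γ̄ q)].

```python
import math
import random

def gee_pdf(x, gamma, beta):
    """Density (1): (gamma/beta) e^(-x/beta) / (1 - gbar e^(-x/beta))^2."""
    gbar = 1.0 - gamma
    u = math.exp(-x / beta)
    return (gamma / beta) * u / (1.0 - gbar * u) ** 2

def gee_cdf(x, gamma, beta):
    """Distribution function (2): (1 - e^(-x/beta)) / (1 - gbar e^(-x/beta))."""
    gbar = 1.0 - gamma
    u = math.exp(-x / beta)
    return (1.0 - u) / (1.0 - gbar * u)

def gee_sample(gamma, beta, rng):
    """Inverse-transform draw: solving F(x) = q for x gives
    x = -beta * log((1 - q) / (1 - gbar * q))."""
    gbar = 1.0 - gamma
    q = rng.random()
    return -beta * math.log((1.0 - q) / (1.0 - gbar * q))
```

At γ = 1 the functions collapse to the exponential density and CDF, which gives a quick sanity check of the code.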
2 Maximum likelihood estimation of R

Suppose X and Y are independent random variables following GEE(γ1, β) and GEE(γ2, β), respectively. Note that

    R = P(Y < X) = ∫_0^∞ F(x; γ2, β) f(x; γ1, β) dx.   (3)

It can be shown that

    R = γ1 [ γ2 ln(γ2/γ1) / (γ1 − γ2)² − 1/(γ2 − γ1) ],   γ1 ≠ γ2,   (4)

with R = 1/2 when γ1 = γ2. To find the maximum likelihood estimator of R, we first study the MLEs of the parameters β, γ1 and γ2. Let X_1, X_2, . . . , X_n be a random sample of size n from GEE(γ1, β) and Y_1, Y_2, . . . , Y_m be a random sample of size m from GEE(γ2, β). The likelihood function of the whole sample is then

    L(γ1, γ2, β) = ∏_{i=1}^{n} f(x_i; γ1, β) ∏_{j=1}^{m} f(y_j; γ2, β),   (5)

and the log-likelihood function is

    ℓ(γ1, γ2, β) = n ln γ1 + m ln γ2 − (n + m) ln β − (Σ_i x_i + Σ_j y_j)/β
                   − 2 Σ_i ln(1 − γ̄1 e^(−x_i/β)) − 2 Σ_j ln(1 − γ̄2 e^(−y_j/β)).

Taking derivatives with respect to β, γ1 and γ2 and setting the results equal to zero yields the following normal equations:

    −(n + m)/β + (Σ_i x_i + Σ_j y_j)/β² + (2/β²) [ Σ_i γ̄1 x_i e^(−x_i/β)/(1 − γ̄1 e^(−x_i/β)) + Σ_j γ̄2 y_j e^(−y_j/β)/(1 − γ̄2 e^(−y_j/β)) ] = 0,   (6)

    n/γ1 − 2 Σ_i e^(−x_i/β)/(1 − γ̄1 e^(−x_i/β)) = 0,   (7)

    m/γ2 − 2 Σ_j e^(−y_j/β)/(1 − γ̄2 e^(−y_j/β)) = 0.   (8)

The MLEs of γ1, γ2 and β are the solutions of the system of nonlinear equations (6), (7) and (8). The following algorithm can be used to solve the equations simultaneously.

1. Choose a positive starting value for β, say β^(0), and solve equation (7) for γ1^(1).
2. Solve equation (8) for γ2^(1).
3. Solve equation (6) for β^(1) with the values (γ1^(1), γ2^(1)).
4. Compute the log-likelihood ℓ(γ1^(1), γ2^(1), β^(1)).
5. Repeat steps 1-4 several times until the absolute difference between two consecutive values of the log-likelihood is less than some tolerance level.

By the invariance property of maximum likelihood estimators and equation (4), the maximum likelihood estimator of R is then given by

    R̂ = γ̂1 [ γ̂2 ln(γ̂2/γ̂1) / (γ̂1 − γ̂2)² − 1/(γ̂2 − γ̂1) ].   (9)

In the following section, we derive the asymptotic distribution of the MLEs of the unknown parameters γ1, γ2 and β.
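In practice the system (6)-(8) can also be handled by maximizing the log-likelihood directly with a general-purpose optimizer rather than by the profile iteration above. The following sketch is our own illustration (names such as `gee_neg_loglik`, `gee_mle` and `reliability` are ours, not the paper's code); it uses SciPy's Nelder-Mead method, and `reliability` implements the closed-form expression for R = P(Y < X) under a common scale parameter.

```python
import math
import random
import numpy as np
from scipy.optimize import minimize

def gee_neg_loglik(theta, x, y):
    """Negative log-likelihood of two GEE samples with common scale b:
    l = n log g1 + m log g2 - (n+m) log b - (Sx + Sy)/b
        - 2 sum log(1 - (1-g1) e^(-x/b)) - 2 sum log(1 - (1-g2) e^(-y/b))."""
    g1, g2, b = theta
    if g1 <= 0 or g2 <= 0 or b <= 0:
        return np.inf  # keep the optimizer inside the parameter space
    ux = np.exp(-x / b)
    uy = np.exp(-y / b)
    ll = (len(x) * math.log(g1) + len(y) * math.log(g2)
          - (len(x) + len(y)) * math.log(b)
          - (x.sum() + y.sum()) / b
          - 2.0 * np.log(1.0 - (1.0 - g1) * ux).sum()
          - 2.0 * np.log(1.0 - (1.0 - g2) * uy).sum())
    return -ll

def gee_mle(x, y, start=(1.0, 1.0, 1.0)):
    """Maximize the log-likelihood numerically; returns (g1_hat, g2_hat, b_hat)."""
    res = minimize(gee_neg_loglik, start, args=(x, y), method="Nelder-Mead",
                   options={"maxiter": 2000, "maxfev": 2000,
                            "xatol": 1e-6, "fatol": 1e-6})
    return res.x

def reliability(g1, g2):
    """Closed-form R = P(Y < X) for a common scale parameter."""
    if abs(g1 - g2) < 1e-12:
        return 0.5
    return g1 * (g2 * math.log(g2 / g1) / (g1 - g2) ** 2 - 1.0 / (g2 - g1))
```

Plugging the MLEs into `reliability` gives the estimate corresponding to (9) by the invariance property.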
Asymptotic distribution

Theorem 1. As n → ∞ and m → ∞ with n/m → p, the MLEs (γ̂1, γ̂2, β̂) are asymptotically normal with mean (γ1, γ2, β) and covariance matrix A^(−1), where A = (a_ij) is the expected Fisher information matrix obtained from the second derivatives of the log-likelihood.

Proof: The proof follows from the standard asymptotic properties of maximum likelihood estimators. Asymptotic normality of R̂ then follows from (9) by the delta method.
In the following section, we present a non-parametric bootstrap procedure to construct a bootstrap confidence interval for R.

Bootstrap confidence interval
In this section, we propose a non-parametric bootstrap confidence interval for R, using the percentile bootstrap method of [12]. To construct the bootstrap confidence interval we follow these steps:

Step 1: Compute the maximum likelihood estimates (β̂, γ̂1, γ̂2) from the samples (x_1, . . . , x_n) and (y_1, . . . , y_m) by solving equations (6), (7) and (8) with the numerical iteration method described in Section 2.

Step 2: Draw a bootstrap sample (x*_1, . . . , x*_n) by sampling with replacement from (x_1, . . . , x_n), and a bootstrap sample (y*_1, . . . , y*_m) by sampling with replacement from (y_1, . . . , y_m).

Step 3: Compute the bootstrap estimate R̂* from the bootstrap samples, as in Step 1.

Step 4: Repeat Steps 2 and 3 a large number of times.
Step 5: Let H(x) = P(R̂* ≤ x) be the cumulative distribution function of R̂*. Then the 100(1 − ν)% bootstrap confidence interval for R is given by (H^(−1)(ν/2), H^(−1)(1 − ν/2)), i.e., the ν/2 and 1 − ν/2 empirical quantiles of the bootstrap replicates of R̂*.

In the next section, we evaluate the performance of the proposed bootstrap confidence interval in terms of confidence interval length and coverage percentage through a Monte Carlo simulation study.
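The percentile scheme above can be sketched in a few lines. This is our own illustration, not the paper's code: the estimator is passed in as a function, and for brevity we bootstrap the simple nonparametric estimate of R (the proportion of pairs with y < x) as a stand-in for the MLE-based estimate R̂ used in the paper.

```python
import numpy as np

def percentile_bootstrap_ci(x, y, estimator, b=1000, nu=0.10, seed=0):
    """Percentile bootstrap CI for R = P(Y < X): resample x and y with
    replacement, re-estimate R, and take the nu/2 and 1 - nu/2 empirical
    quantiles of the bootstrap replicates."""
    rng = np.random.default_rng(seed)
    reps = np.empty(b)
    for i in range(b):
        xb = rng.choice(x, size=len(x), replace=True)
        yb = rng.choice(y, size=len(y), replace=True)
        reps[i] = estimator(xb, yb)
    return np.quantile(reps, [nu / 2.0, 1.0 - nu / 2.0])

def r_nonparametric(x, y):
    """Proportion of (x_i, y_j) pairs with y_j < x_i (illustrative stand-in
    for the MLE-based estimate of R)."""
    return (y[None, :] < x[:, None]).mean()
```

Substituting the MLE-based estimate of Section 2 for `r_nonparametric` reproduces the procedure of this section.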

Simulation study
In this section, we study the performance of the maximum likelihood estimator of R using Monte Carlo simulations. All simulations were carried out in R using that package's pseudo-random number generator. The performance of the estimator is evaluated in terms of bias and mean squared error (MSE). We also constructed 90% bootstrap confidence intervals for R using the method described in Section 4; to assess their performance we report the confidence length and the coverage percentage. We considered the sample sizes (n, m) = (5, 5), (5, 10), (10, 5), (10, 10), (10, 15), (15, 10), (20, 20) and (30, 30). Since the value of R does not depend on β, without loss of generality we kept it fixed at β = 1.0. In all cases we took γ1 = 1.5 and γ2 = 2.0, 2.5, 3.0, 3.5 and 4.0. All simulation results are based on 10,000 replications, and the bootstrap confidence intervals are based on 1,000 resamples.

Table 1. Biases, MSEs, confidence lengths and coverage percentages. In each cell, the first row gives the average bias with the mean squared error in brackets; the second row gives the confidence length and the coverage percentage of the bootstrap interval.

Table 1 gives the average bias and mean squared error of the maximum likelihood estimator of R, together with the average confidence length and coverage proportion of the bootstrap confidence interval.
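A miniature version of such a simulation can be sketched as follows. This is our own illustration: to keep it fast we evaluate the simple nonparametric estimate of R rather than the MLE, so the numbers only illustrate the bias/MSE computation and are not comparable to Table 1.

```python
import math
import random

def gee_draw(gamma, beta, rng):
    """Inverse-transform draw from GEE(gamma, beta)."""
    q = rng.random()
    return -beta * math.log((1.0 - q) / (1.0 - (1.0 - gamma) * q))

def mc_bias_mse(g1, g2, beta, n, m, true_r, reps=1000, seed=1):
    """Monte Carlo bias and MSE of an estimate of R = P(Y < X); here the
    nonparametric proportion-of-pairs estimate stands in for the MLE."""
    rng = random.Random(seed)
    errs = []
    for _ in range(reps):
        x = [gee_draw(g1, beta, rng) for _ in range(n)]
        y = [gee_draw(g2, beta, rng) for _ in range(m)]
        r_hat = sum(yy < xx for xx in x for yy in y) / (n * m)
        errs.append(r_hat - true_r)
    bias = sum(errs) / reps
    mse = sum(e * e for e in errs) / reps
    return bias, mse
```

Replacing the inner estimate with the MLE-based R̂ of Section 2 (and adding the bootstrap interval of Section 4) reproduces the structure of the study reported in Table 1.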
As is clear from Table 1, the performance of the proposed estimator is good in terms of bias and mean squared error even for sample sizes as small as (5, 5), particularly for smaller values of γ2. Indeed, when m = n, the mean squared errors decrease as m and n increase, which indicates the consistency of the maximum likelihood estimator of R. Also, as one would expect, for fixed n the MSEs decrease as m increases, and similarly for fixed m they decrease as n increases.
The performance of the bootstrap confidence intervals is quite satisfactory in terms of coverage percentage but less so in terms of confidence length. Indeed, the confidence lengths are quite large, which leads to coverage percentages above the nominal level in most cases.

Data analysis
In this section, we present a data analysis for illustrative purposes using simulated datasets. The data were generated from GEE models with the parameter values n = m = 25, β = 1.0, γ1 = 1.5 and γ2 = 2.5; the datasets are presented in Tables 2 and 3. The true value of R from (4) is R = 0.4156. The maximum likelihood estimates of the parameters are γ̂1 = 1.6953, γ̂2 = 2.8122 and β̂ = 0.9962. Therefore, from (9) we obtain R̂ = 0.4164, which agrees with the true value to two decimal places. Following the steps given in Section 4 with 1,000 bootstrap iterations, the 95% bootstrap confidence interval is (0.2158, 0.7246).
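The quoted true value can be cross-checked numerically. In this sketch of ours, integrating F(x; γ2, β)·f(x; γ1, β) over (0, ∞), i.e. P(Y < X) for independent X and Y, with γ1 = 1.5, γ2 = 2.5 and β = 1 recovers R ≈ 0.4156.

```python
import math
from scipy.integrate import quad

def gee_pdf(x, gamma, beta):
    """GEE density with shape gamma and scale beta."""
    u = math.exp(-x / beta)
    return (gamma / beta) * u / (1.0 - (1.0 - gamma) * u) ** 2

def gee_cdf(x, gamma, beta):
    """GEE distribution function."""
    u = math.exp(-x / beta)
    return (1.0 - u) / (1.0 - (1.0 - gamma) * u)

# R = P(Y < X) = integral over (0, inf) of F_Y(x) f_X(x) dx,
# with X ~ GEE(1.5, 1.0) and Y ~ GEE(2.5, 1.0)
r_true, _ = quad(lambda x: gee_cdf(x, 2.5, 1.0) * gee_pdf(x, 1.5, 1.0),
                 0.0, math.inf)
```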

Conclusions and discussion
In this article, we studied the estimation of the reliability R = P[Y < X] when X and Y are independent geometric extreme exponential random variables with a common scale parameter but different shape parameters. We obtained the maximum likelihood estimators of the unknown parameters using a numerical method such as Newton's method. We also studied the asymptotic distribution of the MLEs, and a bootstrap confidence interval for R was given.
Another important question we have not addressed in this article is the estimation of R when the two populations have the same shape parameter but different scale parameters, and also when there is an arbitrary relationship between the parameters. Note that in both cases R does not have a closed form and can be expressed only as an integral. Hence, the problem becomes considerably more complicated, and more work is needed in this direction.
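Although no closed form is available in these cases, the integral defining R is still easy to evaluate numerically. The sketch below is our own illustration (the name `reliability_numeric` is ours): it integrates F_Y(x)·f_X(x) for arbitrary shape and scale parameters.

```python
import math
from scipy.integrate import quad

def gee_pdf(x, gamma, beta):
    """GEE density with shape gamma and scale beta."""
    u = math.exp(-x / beta)
    return (gamma / beta) * u / (1.0 - (1.0 - gamma) * u) ** 2

def gee_cdf(x, gamma, beta):
    """GEE distribution function."""
    u = math.exp(-x / beta)
    return (1.0 - u) / (1.0 - (1.0 - gamma) * u)

def reliability_numeric(g1, b1, g2, b2):
    """R = P(Y < X) with X ~ GEE(g1, b1), Y ~ GEE(g2, b2).  When b1 != b2
    no closed form is available, so integrate F_Y(x) f_X(x) numerically."""
    val, _ = quad(lambda x: gee_cdf(x, g2, b2) * gee_pdf(x, g1, b1),
                  0.0, math.inf)
    return val
```

As sanity checks, the common-scale case reproduces the closed-form value, and identical distributions give R = 1/2.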

Acknowledgment
We are grateful to anonymous reviewers for providing some useful comments and suggestions on an earlier draft which led to this improved version of the paper.