On the Accuracy of Distribution Density Estimates Constructed from Some Classes of Dependent Observations

On the probability space (Ω, F, P) we consider a given two-component stationary (in the narrow sense) sequence {(ξ_i, X_i)}_{i≥1}, where {ξ_i}_{i≥1} (ξ_i: Ω → Ξ) is the controlling sequence and the members X_i: Ω → R of the sequence {X_i}_{i≥1} are observations of some random variable X, used to construct kernel estimates of Rosenblatt-Parzen type for the unknown density f(x) of X. The cases of conditional independence and chain dependence of these observations are considered. Upper bounds are established for the mathematical expectation of the squared deviation of the obtained estimates from f(x).


Introduction
Nonparametric estimation of a distribution density is one of the intensively studied topics of mathematical statistics. Early estimates were constructed from independent samples; in recent years estimates have also been constructed from dependent observations.
Our present study deals with two types of dependence: conditional independence and chain dependence.
Let us present some definitions and auxiliary facts concerning nonparametric estimation of a distribution density from independent observations. Assume that X_i ∈ R, i = 1, 2, …, are independent observations of a random variable X with an unknown density g(x). The first attempts to construct nonparametric density estimates were made by V. Glivenko ([1]) and N. Smirnov; such estimates converge to g(x) with probability 1 in the L_2 metric. Along with estimates of Rosenblatt-Parzen type, estimates of projection type were also considered (see [7], [8]).

Lemma 1 (see [7]). Let X_i ∈ R, i = 1, 2, …, be independent observations of some random variable X with an unknown density; the equalities of this lemma (in particular, equality (4)) are used in the proofs below.

In applications, however, the observations {X_i}_{i≥1} may themselves undergo changes over time. For example, in stock-exchange transactions the price of some goods changes with the season even though it was previously fixed at auction; hence the flow of revenue from such transactions changes as well, and so on.
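To fix ideas, the classical Rosenblatt-Parzen estimate has the form f_n(x) = (1/(n h_n)) Σ_{i=1}^n K((x − X_i)/h_n). The following is a minimal numerical sketch of this estimate with a Gaussian kernel and the classical bandwidth rate h_n ~ n^(-1/5); the kernel, bandwidth, and sample here are illustrative choices, not the specific conditions of the paper (which are elided above).

```python
import math
import random

def parzen_estimate(x, sample, h):
    """Rosenblatt-Parzen estimate f_n(x) = (1/(n*h)) * sum_i K((x - X_i)/h),
    here with the Gaussian kernel K."""
    def K(u):
        return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)
    n = len(sample)
    return sum(K((x - xi) / h) for xi in sample) / (n * h)

random.seed(0)
# Illustrative i.i.d. sample from the standard normal distribution.
sample = [random.gauss(0.0, 1.0) for _ in range(2000)]
h = len(sample) ** -0.2            # h_n ~ n^(-1/5), a classical bandwidth rate
est = parzen_estimate(0.0, sample, h)
true_val = 1.0 / math.sqrt(2.0 * math.pi)   # standard normal density at x = 0
print(est, true_val)               # the estimate should be close to the true value
```

For a fixed x the estimate is consistent as n → ∞ and h_n → 0 with n h_n → ∞, which is the regime the bounds of this paper quantify.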
In such cases, to estimate the density of X, it is advisable to consider dependent observations. We should note an interesting work by Sidney Yakowitz ([11]) on density estimation from observations bound to a Markov chain, which treats a Markov chain with a general phase space of states.
On the probability space (Ω, F, P) let us consider the two-component stationary (in the narrow sense) sequence of random variables (6). Conditionally on a fixed trajectory of the controlling sequence, the observations X_1, X_2, …, X_n become independent for every natural n.

Definition 4. For the conditionally independent sequence (1), let ν_n(i) denote the frequencies of those moments of time at which, during the first n steps, the chain is in the corresponding states.

Methodology
The method used to prove the theorems is to pass from the mathematical expectation to the conditional mathematical expectation given a fixed trajectory (ξ_1, ξ_2, …, ξ_n) of the controlling sequence. The sum considered on the fixed trajectory is divided into several summands; one of them is a sum of terms that are independent on the fixed trajectory. After some transformations it takes a form to which well-known results from the theory of density estimation with independent observations can be applied. The method also makes it possible, in the future, to consider the estimation of various parameters from dependent observations.
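The conditioning step described above can be imitated numerically: first draw a trajectory of the controlling chain, then draw the conditionally independent observations along it, and average the squared deviation of the kernel estimate over trajectories (the tower property E[(f_n − f)^2] = E[E[(f_n − f)^2 | ξ_1^n]]). The transition matrix and the normal component densities below are purely illustrative assumptions, not taken from the paper.

```python
import math
import random

random.seed(1)

# Hypothetical regular two-state controlling chain (states b1, b2).
P = [[0.7, 0.3], [0.4, 0.6]]
# Its stationary distribution pi solves pi P = pi; here pi = (4/7, 3/7).
pi = (4.0 / 7.0, 3.0 / 7.0)

def draw_observation(state):
    # Given the controlling state, observations are conditionally independent;
    # the component densities N(0,1) and N(3,1) are an illustrative choice.
    return random.gauss(0.0 if state == 0 else 3.0, 1.0)

def marginal_density(x):
    phi = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)
    return pi[0] * phi(x) + pi[1] * phi(x - 3.0)

def kernel_estimate(x, sample, h):
    phi = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)
    return sum(phi((x - xi) / h) for xi in sample) / (len(sample) * h)

def squared_deviation(x, n, h):
    # One realisation: first the trajectory xi_1^n, then the observations on it.
    state, sample = 0, []
    for _ in range(n):
        state = 0 if random.random() < P[state][0] else 1
        sample.append(draw_observation(state))
    return (kernel_estimate(x, sample, h) - marginal_density(x)) ** 2

n, h, x0 = 1000, 1000 ** -0.2, 1.0
# Monte Carlo approximation of E[(f_n(x0) - f(x0))^2] over 50 trajectories.
mse = sum(squared_deviation(x0, n, h) for _ in range(50)) / 50
print(mse)
```

The averaged squared deviation is small for moderate n, in line with the kind of upper bounds established in the theorems.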

Main Results
Theorem 1. Let us consider the sequence (6).
The elements of the controlling sequence are specified in the statements of the theorems. Let us proceed to the proofs.

Proof of Theorem 1. The first relation is fulfilled by virtue of condition (8) and the estimate that follows. Taking conditions (3) into account and changing the variable under the sign of the integral, we obtain a chain of equalities; the next relation is valid by virtue of condition (9). Applying equality (4) from Lemma 1, we decompose I_2 from (13). The first two summands of this sum are estimated in the same manner, and in the same manner it can be shown that the remaining summands admit analogous bounds. By the decomposition (13) and the obtained estimates (14)-(17) the theorem is proved.
Proof of Theorem 2. The proof is carried out in the same way as that of Theorem 1, though we should note some differences.
For the fixed trajectory ξ_1^n we denote by ν_n(i), i = 1, 2, the frequencies with which the chain is in the states b_1 and b_2, respectively, during the first n steps.
We do not need the conditions (8) and (9) of Theorem 1.
As is known (see [10]), if the so-called ergodicity condition is fulfilled for regular Markov chains, then condition (8) holds. Thus we have constructed nonparametric density estimates from dependent observations (conditionally independent and chain-dependent) and established the accuracy of approximation of the density by these estimates.
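The ergodicity just mentioned can be observed numerically: for a regular two-state chain the relative frequencies ν_n(i)/n converge to the stationary probabilities. A minimal sketch, with an illustrative transition matrix that is an assumption of this example rather than part of the paper:

```python
import random

random.seed(2)

# Hypothetical regular two-state Markov chain with states b1, b2.
P = [[0.7, 0.3], [0.4, 0.6]]
pi = (4.0 / 7.0, 3.0 / 7.0)   # stationary distribution, solves pi P = pi

def frequencies(n):
    """nu_n(i): number of the first n steps spent in state b_{i+1}."""
    nu = [0, 0]
    state = 0
    for _ in range(n):
        state = 0 if random.random() < P[state][0] else 1
        nu[state] += 1
    return nu

n = 200_000
nu = frequencies(n)
print(nu[0] / n, nu[1] / n)   # close to the stationary probabilities under ergodicity
```

For a regular (hence ergodic) chain, ν_n(i)/n → π_i with probability 1 as n → ∞, which is exactly the property exploited in the proof of Theorem 2.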