Sheet 3
Course: Wahrscheinlichkeitstheorie 1
Semester: FSS 2022
Tutorial date: 07.03.2022
Due: 10:15 in the exercise on Monday, 07.03.2022

Exercise 1 (Martingales Warm-up).
Do enough of these exercises to feel comfortable with them.
- (i) Assume that $(X_n)_{n\in\mathbb{N}_0}$ is an integrable adapted stochastic process which is decreasing, i.e.
$$X_{n+1} \le X_n \quad \text{a.s. for all } n \in \mathbb{N}_0.$$
Prove that $(X_n)_{n\in\mathbb{N}_0}$ is a supermartingale.
Remark.
Similarly, an adapted integrable increasing process is a submartingale.
Solution.
We simply use the monotonicity of the conditional expectation to obtain the last property of supermartingales:
$$E[X_{n+1} \mid \mathcal{F}_n] \le E[X_n \mid \mathcal{F}_n] = X_n \quad \text{a.s.} \qquad ∎$$
- (ii) Let $(X_n)_{n\in\mathbb{N}_0}$, $(Y_n)_{n\in\mathbb{N}_0}$ be (sub-)martingales with respect to the same filtration $(\mathcal{F}_n)_{n\in\mathbb{N}_0}$. Prove that
- (a) $(X_n + Y_n)_{n\in\mathbb{N}_0}$ is a (sub-)martingale.
Solution.
We have $E[|X_n + Y_n|] \le E[|X_n|] + E[|Y_n|] < \infty$, and $(\mathcal{F}_n)$-adaptedness follows from the fact that sums of measurable functions are measurable. What is left to show is
$$E[X_{n+1} + Y_{n+1} \mid \mathcal{F}_n] = E[X_{n+1} \mid \mathcal{F}_n] + E[Y_{n+1} \mid \mathcal{F}_n] = X_n + Y_n,$$
or the same with $\ge$ in place of the last equality for submartingales respectively. ∎
- (b) $(\max(X_n, Y_n))_{n\in\mathbb{N}_0}$ is a submartingale.
Solution.
Adaptedness and integrability are easy to check (note $|\max(X_n, Y_n)| \le |X_n| + |Y_n|$), and the submartingale property follows from the monotonicity of conditional expectation:
$$E[\max(X_{n+1}, Y_{n+1}) \mid \mathcal{F}_n] \ge \max\bigl(E[X_{n+1} \mid \mathcal{F}_n],\ E[Y_{n+1} \mid \mathcal{F}_n]\bigr) \ge \max(X_n, Y_n). \qquad ∎$$
- (iii) Let $(X_n)_{n\in\mathbb{N}_0}$ be an $(\mathcal{F}_n)$-submartingale and let $\tau$ be an $(\mathcal{F}_n)$-stopping time.
- (a) Suppose that $(X_n)$ is bounded and $\tau$ is a.s. finite (the meaning is different from "$\tau$ is a.s. bounded"!). Prove that $E[X_0] \le E[X_\tau]$.
Solution.
Since $\tau < \infty$ a.s. we have $X_{n \wedge \tau} \to X_\tau$ a.s. as $n \to \infty$, where $(X_{n \wedge \tau})_{n\in\mathbb{N}_0}$ is a bounded submartingale, because
- $X_{n \wedge \tau} = \sum_{k=0}^{n-1} X_k \mathbf{1}_{\{\tau = k\}} + X_n \mathbf{1}_{\{\tau \ge n\}}$ is $\mathcal{F}_n$-measurable, as $\{\tau = k\} \in \mathcal{F}_k \subseteq \mathcal{F}_n$ and $\{\tau \ge n\} = \{\tau \le n-1\}^c \in \mathcal{F}_{n-1} \subseteq \mathcal{F}_n$; therefore $(X_{n \wedge \tau})$ is adapted,
- we have integrability due to $|X_{n \wedge \tau}| \le \sup_k \lVert X_k \rVert_\infty < \infty$,
- lastly we have the submartingale property
$$E[X_{(n+1) \wedge \tau} \mid \mathcal{F}_n] = \mathbf{1}_{\{\tau \le n\}} X_\tau + \mathbf{1}_{\{\tau > n\}} E[X_{n+1} \mid \mathcal{F}_n] \ge \mathbf{1}_{\{\tau \le n\}} X_\tau + \mathbf{1}_{\{\tau > n\}} X_n = X_{n \wedge \tau}.$$
As $(X_n)$ is bounded, $(X_{n \wedge \tau})$ is bounded and therefore dominated by an integrable constant. By dominated convergence we further have
$$E[X_0] \le E[X_{n \wedge \tau}] \xrightarrow{n \to \infty} E[X_\tau]. \qquad ∎$$
- (b) Suppose that there exists a constant $C > 0$ such that for $P$-a.e. $\omega$,
$$|X_{n+1}(\omega) - X_n(\omega)| \le C \quad \text{for all } n \in \mathbb{N}_0.$$
We also suppose that $E[\tau] < \infty$. Show that $E[X_0] \le E[X_\tau]$.
Solution.
$(X_{n \wedge \tau})$ is a submartingale for the same reason as before, and we can again use dominated convergence if we find an integrable upper bound. This upper bound is $|X_0| + C\tau$, because
$$|X_{n \wedge \tau}| \le |X_0| + \sum_{k=1}^{n \wedge \tau} |X_k - X_{k-1}| \le |X_0| + C (n \wedge \tau) \le |X_0| + C \tau,$$
which is integrable since $E[\tau] < \infty$. ∎
Hint.
Compare this with the statements about martingales in the lectures and use the bounded stopping time $\tau \wedge n$ for some constant $n$, together with the dominated convergence theorem.
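Remark.
As a sanity check (not part of the solution), here is a minimal Monte Carlo sketch in Python for part (b): the simple random walk is a martingale (hence a submartingale) with increments bounded by $C = 1$, and the first exit time of a bounded interval has finite expectation. The interval endpoints $a = -3$, $b = 5$ are arbitrary illustrative choices.

```python
import random

def sample_stopped_walk(a=-3, b=5):
    """Simple random walk started at 0, stopped at
    tau = inf{n : S_n in {a, b}}; increments are bounded by C = 1
    and E[tau] < infinity, so part (b) applies."""
    s = 0
    while s not in (a, b):
        s += random.choice((-1, 1))
    return s

random.seed(0)
n_samples = 100_000
mean = sum(sample_stopped_walk() for _ in range(n_samples)) / n_samples
print(f"E[X_tau] ~ {mean:.4f}  (equals E[X_0] = 0 since the walk is even a martingale)")
```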
Exercise 2 (Conditional Expectation).
Recall that a gamma distribution with parameters $\alpha > 0$ and $\lambda > 0$ has density:
$$\gamma_{\alpha, \lambda}(x) = \frac{\lambda^\alpha}{\Gamma(\alpha)}\, x^{\alpha - 1} e^{-\lambda x}\, \mathbf{1}_{(0, \infty)}(x).$$
- (i) Let $X, Y$ be two independent exponential random variables with parameter $\lambda > 0$. Determine the conditional distribution of $X$ given $X + Y$ (i.e. the Markov kernel $P(X \in \cdot \mid X + Y)$).
Solution.
For all non-negative and measurable functions $f$ and $g$ we have that:
$$E[f(X)\, g(X+Y)] = \int_0^\infty\!\!\int_0^\infty f(x)\, g(x+y)\, \lambda^2 e^{-\lambda (x+y)}\, dy\, dx = \int_0^\infty g(s)\, \lambda^2 s\, e^{-\lambda s} \cdot \frac{1}{s} \int_0^s f(x)\, dx\, ds = E\Bigl[g(X+Y)\, \frac{1}{X+Y} \int_0^{X+Y} f(x)\, dx\Bigr],$$
substituting $s = x + y$ and recognizing the $\Gamma(2, \lambda)$-density $\lambda^2 s\, e^{-\lambda s}$ of $X + Y$. Since the last term is $\sigma(X+Y)$-measurable we can conclude that $E[f(X) \mid X+Y] = \frac{1}{X+Y} \int_0^{X+Y} f(x)\, dx$. Using $f = \mathbf{1}_A$ means that for $A \in \mathcal{B}(\mathbb{R})$, we get
$$P(X \in A \mid X+Y = s) = \frac{1}{s} \int_0^s \mathbf{1}_A(x)\, dx = \mathcal{U}([0, s])(A).$$
Hence, the conditional distribution of $X$ given $X + Y = s$ is the uniform distribution on $[0, s]$. ∎
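Remark.
Not part of the solution, but the result is easy to test numerically. A minimal Python sketch, conditioning on $X + Y$ landing in a small window around $s$; the values $\lambda = 2$, $s = 1.5$ and the window width are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(42)
lam, s, eps, n = 2.0, 1.5, 0.01, 2_000_000

x = rng.exponential(1 / lam, size=n)
y = rng.exponential(1 / lam, size=n)

# Keep the samples whose sum X + Y falls into a small window around s.
sel = x[np.abs(x + y - s) < eps]

# Uniform([0, s]) has mean s/2 and variance s^2/12.
print(f"samples kept: {sel.size}")
print(f"mean ~ {sel.mean():.4f}  (uniform predicts {s / 2:.4f})")
print(f"var  ~ {sel.var():.4f}  (uniform predicts {s * s / 12:.4f})")
```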
- (ii) Conversely, let $S$ be a random variable with gamma distribution with parameters $(2, \lambda)$, and suppose $X$ is a random variable whose conditional distribution given $S = s$ is uniform on $[0, s]$, for $s > 0$. Prove that $X$ and $S - X$ are independent with exponential distribution $\mathcal{E}(\lambda)$.
Solution.
With kernels we can easily write, for non-negative measurable $f$ and $g$,
$$E[f(X)\, g(S - X)] = \int_0^\infty \int_0^s f(x)\, g(s - x)\, \frac{1}{s}\, dx\ \lambda^2 s\, e^{-\lambda s}\, ds, \tag{1}$$
where $\frac{1}{s} \mathbf{1}_{[0, s]}(x)\, dx$ is the regular conditional distribution defined from the (a.s. unique) kernel we get via the disintegration theorem. Essentially doing all the same calculations backwards (substituting $y = s - x$), we get
$$E[f(X)\, g(S - X)] = \int_0^\infty\!\!\int_0^\infty f(x)\, g(y)\, \lambda e^{-\lambda x}\, \lambda e^{-\lambda y}\, dy\, dx. \tag{2}$$
This shows that $X$ and $S - X$ are independent $\mathcal{E}(\lambda)$-distributed random variables. In more detail: Use $g \equiv 1$ to show $X \sim \mathcal{E}(\lambda)$. And similarly $f \equiv 1$ to show $S - X \sim \mathcal{E}(\lambda)$. Independence follows using indicator functions in place of $f$ and $g$. ∎
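Remark.
The converse direction can be sanity-checked the same way. A minimal Python sketch, assuming the illustrative value $\lambda = 2$ and comparing means, variances and the empirical correlation:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n = 2.0, 1_000_000

s = rng.gamma(shape=2.0, scale=1 / lam, size=n)  # S ~ Gamma(2, lambda)
x = rng.uniform(0.0, s)                          # X | S = s ~ U([0, s])
y = s - x

print(f"mean X ~ {x.mean():.4f}, mean S-X ~ {y.mean():.4f}  (Exp predicts {1 / lam:.4f})")
print(f"var  X ~ {x.var():.4f},  var  S-X ~ {y.var():.4f}  (Exp predicts {1 / lam**2:.4f})")
print(f"corr(X, S-X) ~ {np.corrcoef(x, y)[0, 1]:+.4f}  (independence predicts 0)")
```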
Exercise 3 (Branching Process Tools).
Assume $(X_i)_{i\in\mathbb{N}}$ are identically distributed, independent and integrable random variables, and assume $N$ is an integrable $\mathbb{N}_0$-valued random variable independent of $(X_i)_{i\in\mathbb{N}}$.
- (i) (Wald's equation) Prove that
$$E\Bigl[\sum_{i=1}^{N} X_i\Bigr] = E[N]\, E[X_1].$$
Hint.
Use indicators and Fubini.
Solution.
The claim follows from Fubini and independence:
$$E\Bigl[\sum_{i=1}^{N} X_i\Bigr] = E\Bigl[\sum_{i=1}^{\infty} X_i\, \mathbf{1}_{\{N \ge i\}}\Bigr] = \sum_{i=1}^{\infty} E[X_i]\, P(N \ge i) = E[X_1] \sum_{i=1}^{\infty} P(N \ge i) = E[X_1]\, E[N].$$
Fubini applies since $\sum_{i=1}^{\infty} E[|X_i|\, \mathbf{1}_{\{N \ge i\}}] = E[|X_1|]\, E[N] < \infty$. ∎
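Remark.
A quick Monte Carlo sketch of Wald's equation in Python; the choices $X_i \sim \mathcal{E}(1)$ and $N \sim \text{Poi}(3)$ are arbitrary illustrative distributions, not part of the exercise:

```python
import numpy as np

rng = np.random.default_rng(7)
n_samples = 100_000

N = rng.poisson(3.0, size=n_samples)                               # E[N] = 3
sums = np.array([rng.exponential(1.0, size=k).sum() for k in N])   # E[X_1] = 1

print(f"E[sum_(i<=N) X_i] ~ {sums.mean():.4f}")
print(f"E[N] * E[X_1]     = {3.0 * 1.0:.4f}")
```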
- (ii) (Blackwell-Girshick) Further assume that $X_1, N \in L^2$. Prove that
- (a) $E\bigl[\bigl(\sum_{i=1}^{N} X_i\bigr)^2\bigr] = E[N]\, \operatorname{Var}(X_1) + E[N^2]\, E[X_1]^2$,
- (b) $\operatorname{Var}\bigl(\sum_{i=1}^{N} X_i\bigr) = E[N]\, \operatorname{Var}(X_1) + \operatorname{Var}(N)\, E[X_1]^2$.
Hint.
Same trick as with Wald’s equation.
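Remark.
The same Monte Carlo setup also checks the Blackwell-Girshick formula (again with the illustrative choices $X_i \sim \mathcal{E}(1)$, $N \sim \text{Poi}(3)$):

```python
import numpy as np

rng = np.random.default_rng(8)
n_samples = 100_000
mean_x, var_x = 1.0, 1.0   # X_i ~ Exp(1)
mean_n, var_n = 3.0, 3.0   # N ~ Poisson(3), so Var(N) = E[N]

N = rng.poisson(mean_n, size=n_samples)
sums = np.array([rng.exponential(mean_x, size=k).sum() for k in N])

predicted = mean_n * var_x + var_n * mean_x**2
print(f"Var(sum) ~ {sums.var():.4f}, Blackwell-Girshick predicts {predicted:.4f}")
```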
Exercise 4 (Intro to Harmonic Functions).
Let $S_n = \sum_{k=1}^{n} X_k$ be a symmetric random walk, i.e. the jump sizes $(X_k)_{k\in\mathbb{N}}$ are a sequence of i.i.d. random variables with $P(X_k = 1) = \frac{1}{2}$ and $P(X_k = -1) = \frac{1}{2}$. Let $\mathcal{F}_n = \sigma(X_1, \dots, X_n)$ with $\mathcal{F}_0 = \{\emptyset, \Omega\}$.
- (i) Let $\theta \in \mathbb{R} \setminus \{0\}$. Find a constant $c_\theta$ such that $M_n = c_\theta^{-n}\, e^{\theta S_n}$ is an $(\mathcal{F}_n)$-martingale.
Solution.
To get the martingale property
$$E[M_{n+1} \mid \mathcal{F}_n] = c_\theta^{-(n+1)}\, e^{\theta S_n}\, E[e^{\theta X_{n+1}}] = M_n\, c_\theta^{-1}\, E[e^{\theta X_1}] \overset{!}{=} M_n,$$
we need $c_\theta = E[e^{\theta X_1}]$, i.e.
$$c_\theta = \frac{e^{\theta} + e^{-\theta}}{2} = \cosh(\theta).$$
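Remark.
A minimal Monte Carlo sketch in Python: since $M_0 = 1$, the martingale property forces $E[M_n] = 1$ for all $n$. The values $\theta = 0.3$ and $n = 20$ are arbitrary (small values keep the Monte Carlo variance of $e^{\theta S_n}$ manageable):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, n_paths = 0.3, 20, 200_000

S_n = rng.choice((-1, 1), size=(n_paths, n)).sum(axis=1)
M_n = np.exp(theta * S_n) / np.cosh(theta) ** n

print(f"E[M_n] ~ {M_n.mean():.4f}  (martingale with M_0 = 1 predicts 1.0)")
```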
- (ii) Prove that $M_n = S_n^2 - n$ is an $(\mathcal{F}_n)$-martingale.
Solution.
We simply check the three properties of a martingale:
- $M_n = S_n^2 - n$ is $\mathcal{F}_n$-measurable by definition of $\mathcal{F}_n$, so $(M_n)$ is adapted.
- Integrability holds since $|S_n| \le n$, so $E[|S_n^2 - n|] \le n^2 + n < \infty$.
- Lastly we check the martingale property: using that $S_n$ is $\mathcal{F}_n$-measurable, $X_{n+1}$ is independent of $\mathcal{F}_n$, $E[X_{n+1}] = 0$ and $E[X_{n+1}^2] = 1$,
$$E[S_{n+1}^2 - (n+1) \mid \mathcal{F}_n] = S_n^2 + 2 S_n\, E[X_{n+1}] + E[X_{n+1}^2] - (n+1) = S_n^2 - n. \qquad ∎$$
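Remark.
A minimal numerical sanity check (necessary, not sufficient, for the martingale property): $E[S_n^2 - n] = E[M_0] = 0$ for all $n$. The horizon and sample size are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_max = 200_000, 30

S = rng.choice((-1, 1), size=(n_paths, n_max)).cumsum(axis=1)
for n in (5, 15, 30):
    M = S[:, n - 1] ** 2 - n
    print(f"n = {n:2d}: E[S_n^2 - n] ~ {M.mean():+.4f}  (martingale predicts 0)")
```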
- (iii) Prove that $M_n = S_n^3 - 3n S_n$ is an $(\mathcal{F}_n)$-martingale.
Solution.
We again check the definition:
- Adaptedness follows again from the definition of $\mathcal{F}_n$.
- Integrability holds since $|S_n| \le n$, so $E[|S_n^3 - 3n S_n|] \le n^3 + 3n^2 < \infty$.
- For the martingale property, first observe that
$$E[S_{n+1}^3 \mid \mathcal{F}_n] = S_n^3 + 3 S_n^2\, E[X_{n+1}] + 3 S_n\, E[X_{n+1}^2] + E[X_{n+1}^3] = S_n^3 + 3 S_n.$$
Using this we get
$$E[S_{n+1}^3 - 3(n+1) S_{n+1} \mid \mathcal{F}_n] = S_n^3 + 3 S_n - 3(n+1) S_n = S_n^3 - 3n S_n. \qquad ∎$$
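Remark.
The same kind of Monte Carlo sanity check as for (ii), again with arbitrary horizon and sample size; the Monte Carlo error grows with $n$ because $\operatorname{Var}(M_n)$ does:

```python
import numpy as np

rng = np.random.default_rng(5)
n_paths, n_max = 200_000, 15

S = rng.choice((-1, 1), size=(n_paths, n_max)).cumsum(axis=1)
for n in (5, 10, 15):
    M = S[:, n - 1] ** 3 - 3 * n * S[:, n - 1]
    print(f"n = {n:2d}: E[S_n^3 - 3n S_n] ~ {M.mean():+.4f}  (martingale predicts 0)")
```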
- (iv) Find a polynomial $p(n, x)$ with degree $2$ in $n$ and degree $4$ in $x$, such that $M_n = p(n, S_n)$ is an $(\mathcal{F}_n)$-martingale.
Solution.
We are going to use (v) to guess the correct polynomial. Before we start, consider the simple monomial $x^k$. We have
$$\frac{1}{2}\bigl[(x+1)^k + (x-1)^k\bigr] = \sum_{\substack{0 \le j \le k \\ j \text{ even}}} \binom{k}{j}\, x^{k-j}.$$
So when striving for the condition in (v) we only need to consider even (odd) exponents of $x$, if the degree of the polynomial is even (or odd respectively) (cf. (ii), (iii)). So let us start guessing with
$$p(n, x) = x^4 + f(n)\, x^2 + g(n),$$
where $f$ and $g$ are polynomials with a degree of at most $2$. From the condition in (v) we obtain
$$\frac{1}{2}\bigl[p(n+1, x+1) + p(n+1, x-1)\bigr] = x^4 + \bigl(6 + f(n+1)\bigr)\, x^2 + 1 + f(n+1) + g(n+1) \overset{!}{=} x^4 + f(n)\, x^2 + g(n).$$
To solve this systematically we assume $f(n) = an + b$, which implies
$$an + b = f(n) = f(n+1) + 6 = an + a + b + 6,$$
this is solved by $a = -6$ and (choosing) $b = 0$. So we have $f(n) = -6n$. With similar assumptions $g(n) = cn^2 + dn + e$ on $g$ we get
$$g(n+1) - g(n) = c(2n + 1) + d \overset{!}{=} -1 - f(n+1) = 6n + 5.$$
This implies $c = 3$, $d = 2$ (and we choose $e = 0$), and therefore $g(n) = 3n^2 + 2n$. In total we have: $p(n, x) = x^4 - 6n\, x^2 + 3n^2 + 2n$ fulfills (v). Therefore, $M_n = S_n^4 - 6n\, S_n^2 + 3n^2 + 2n$ is a martingale by (v). ∎
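Remark.
Since $p$ has integer coefficients, the condition from (v) can be verified exactly (not just by simulation) with a short Python sketch over a finite grid of $n$ and $x$:

```python
def p(n, x):
    # Candidate space-time polynomial from (iv).
    return x**4 - 6 * n * x**2 + 3 * n**2 + 2 * n

# Exact integer check of 1/2 * [p(n+1, x+1) + p(n+1, x-1)] == p(n, x).
ok = all(
    2 * p(n, x) == p(n + 1, x + 1) + p(n + 1, x - 1)
    for n in range(50)
    for x in range(-50, 51)
)
print("condition from (v) holds on the grid:", ok)
```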
- (v) Prove that, in general, for a polynomial $p(n, x)$, the process $M_n = p(n, S_n)$ is an $(\mathcal{F}_n)$-martingale, if
$$\frac{1}{2}\bigl[p(n+1, x+1) + p(n+1, x-1)\bigr] = p(n, x) \quad \text{for all } n \in \mathbb{N}_0,\ x \in \mathbb{Z}.$$
Solution.
- Adaptedness follows from the measurability of polynomials,
- integrability from the triangle inequality and the fact that $|S_n| \le n$.
- Lastly, note that $S_n$ is $\mathcal{F}_n$-measurable and $X_{n+1}$ is independent of $\mathcal{F}_n$; this results in the martingale property
$$E[p(n+1, S_{n+1}) \mid \mathcal{F}_n] = \frac{1}{2}\, p(n+1, S_n + 1) + \frac{1}{2}\, p(n+1, S_n - 1) = p(n, S_n). \qquad ∎$$
We could have solved (ii) this way, as for $p(n, x) = x^2 - n$, we have
$$\frac{1}{2}\bigl[(x+1)^2 - (n+1) + (x-1)^2 - (n+1)\bigr] = x^2 + 1 - (n+1) = x^2 - n.$$
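Remark.
The condition from (v) is a discrete heat equation, and for concrete polynomials it can also be verified symbolically. A minimal sketch using sympy (an external library; the dictionary just collects the polynomials from (ii)-(iv)):

```python
import sympy as sp

n, x = sp.symbols("n x")
candidates = {
    "(ii)":  x**2 - n,
    "(iii)": x**3 - 3 * n * x,
    "(iv)":  x**4 - 6 * n * x**2 + 3 * n**2 + 2 * n,
}
for name, poly in candidates.items():
    shifted = poly.subs(n, n + 1)
    lhs = (shifted.subs(x, x + 1) + shifted.subs(x, x - 1)) / 2
    print(name, "satisfies (v):", sp.expand(lhs - poly) == 0)
```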
Exercise 5 (Poisson Process is Poisson).
In this exercise we are going to assume that the Poisson process $(N_t)_{t \ge 0}$ is defined as the number of events up to time $t$, where the waiting times between events are i.i.d. exponentially distributed, i.e.
$$N_t = \#\{n \in \mathbb{N} : T_n \le t\}, \qquad T_n = \sum_{k=1}^{n} W_k, \quad W_k \overset{\text{i.i.d.}}{\sim} \mathcal{E}(\lambda).$$
Show that $N_t$ is $\text{Poi}(\lambda t)$-distributed for every $t \ge 0$.
Hint.
The exponential distribution is a special case of the gamma distribution, i.e. $\mathcal{E}(\lambda) = \Gamma(1, \lambda)$. And since the gamma distribution is stable under summation of i.i.d. gamma distributed random variables, we get $T_n \sim \Gamma(n, \lambda)$. Using this fact you can either calculate
$$P(N_t \ge n) = P(T_n \le t)$$
using the cumulative distribution function of the gamma distribution (which you might know or need to calculate), or you could calculate $P(N_t = n)$ by conditioning on $T_n$ in this sense:
$$P(N_t = n) = P(T_n \le t < T_n + W_{n+1}) = E\bigl[\mathbf{1}_{\{T_n \le t\}}\, P(W_{n+1} > t - T_n \mid T_n)\bigr].$$
This avoids the usage of the cdf of the gamma distribution.
Solution.
- Option 1:
Using the fact that $T_n \sim \Gamma(n, \lambda)$ we get
$$P(N_t \ge n) = P(T_n \le t) = \int_0^t \frac{\lambda^n s^{n-1}}{(n-1)!}\, e^{-\lambda s}\, ds.$$
By induction one can show that
$$\int_0^t \frac{\lambda^n s^{n-1}}{(n-1)!}\, e^{-\lambda s}\, ds = 1 - \sum_{k=0}^{n-1} \frac{(\lambda t)^k}{k!}\, e^{-\lambda t},$$
where the induction step is simply partial integration:
$$\int_0^t \frac{\lambda^{n+1} s^{n}}{n!}\, e^{-\lambda s}\, ds = -\frac{(\lambda t)^n}{n!}\, e^{-\lambda t} + \int_0^t \frac{\lambda^n s^{n-1}}{(n-1)!}\, e^{-\lambda s}\, ds.$$
Therefore we have
$$P(N_t = n) = P(N_t \ge n) - P(N_t \ge n+1) = \frac{(\lambda t)^n}{n!}\, e^{-\lambda t},$$
which implies that $N_t \sim \text{Poi}(\lambda t)$.
- Option 2:
Conditioning on $T_n$ we can calculate, for $n \ge 1$,
$$P(N_t = n) = P(T_n \le t < T_n + W_{n+1}) = E\bigl[\mathbf{1}_{\{T_n \le t\}}\, e^{-\lambda (t - T_n)}\bigr].$$
Using $T_n \sim \Gamma(n, \lambda)$ we therefore get
$$P(N_t = n) = \int_0^t \frac{\lambda^n s^{n-1}}{(n-1)!}\, e^{-\lambda s}\, e^{-\lambda (t-s)}\, ds = \frac{\lambda^n e^{-\lambda t}}{(n-1)!} \int_0^t s^{n-1}\, ds = \frac{(\lambda t)^n}{n!}\, e^{-\lambda t}$$
(and $P(N_t = 0) = P(W_1 > t) = e^{-\lambda t}$ directly), which implies $N_t \sim \text{Poi}(\lambda t)$.
∎
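Remark.
The construction is easy to simulate. A minimal Python sketch comparing the empirical distribution of $N_t$ with the Poisson pmf; $\lambda = 1.5$, $t = 4$ and the cap on the number of waiting times are arbitrary illustrative choices:

```python
import math
import numpy as np

rng = np.random.default_rng(9)
lam, t, n_paths = 1.5, 4.0, 200_000

# Draw enough waiting times that T_k exceeds t with overwhelming probability,
# then count how many partial sums T_k = W_1 + ... + W_k lie below t.
k_max = 60
T = rng.exponential(1 / lam, size=(n_paths, k_max)).cumsum(axis=1)
N_t = (T <= t).sum(axis=1)

for n in range(4, 9):
    empirical = (N_t == n).mean()
    pmf = math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)
    print(f"P(N_t = {n}) ~ {empirical:.4f}   Poi(lambda*t = {lam * t}) gives {pmf:.4f}")
```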