Sheet 2
Course: Wahrscheinlichkeitstheorie 1
Semester: FSS 2022
Tutorial date: 28.02.2022
Due date: 10:15 in the exercise on Monday, 28.02.2022

Exercise 1.
Calculate the following conditional expectations and determine the conditional distributions.
(i)
, where are independent uniform random variables on .
Solution.
Using our kernel representation we get
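Since the formulas of this part did not survive extraction, here is a minimal sketch of the result under the assumption that the task is $\mathbb{E}[X \mid X+Y]$ for $X, Y$ independent and uniform on $[0,1]$ (the names and the interval are assumptions); it uses a symmetry argument rather than the kernel computation above:
\[
\mathbb{E}[X \mid X+Y] + \mathbb{E}[Y \mid X+Y] = \mathbb{E}[X+Y \mid X+Y] = X+Y,
\]
and since $(X,Y)$ and $(Y,X)$ have the same joint law, $\mathbb{E}[X \mid X+Y] = \mathbb{E}[Y \mid X+Y]$, hence
\[
\mathbb{E}[X \mid X+Y] = \tfrac{1}{2}\,(X+Y).
\]
For the conditional distribution: given $X+Y = s$, the variable $X$ is uniform on $[\max(0, s-1), \min(1, s)]$, an interval symmetric around $s/2$, which matches the conditional expectation.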
(ii)
, where has a joint distribution with density:
Solution.
We know that
with
where
Using
we therefore get
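The displayed densities above are missing; the general recipe behind the kernel representation reads as follows (with hypothetical names $X$, $Y$ for the two coordinates of the vector, which are not taken from the sheet):
\[
f_Y(y) = \int_{\mathbb{R}} f_{X,Y}(x,y)\,dx, \qquad
f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} \quad \text{whenever } f_Y(y) > 0,
\]
and then
\[
\mathbb{E}[X \mid Y] = g(Y) \qquad \text{with} \qquad g(y) = \int_{\mathbb{R}} x\, f_{X \mid Y}(x \mid y)\,dx .
\]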
(iii)
, where are independent Poisson random variables with parameters and respectively.
Solution.
Since
and are both Poisson, we are in the discrete setting. So first, we want to compute the conditional probability:
Let us calculate each part separately. Starting with the numerator, we get:
For the denominator we can use the result from Stochastics 1 that
. In case you have not seen it so far, the calculation below shows the claim from Stochastics 1:
Now we can calculate the conditional expectation
:
Thus, we can conclude that
. ∎
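Because the parameters and the displays of this part are not reproduced above, here is a hedged sketch of the standard computation, assuming $X \sim \mathrm{Poi}(\lambda)$, $Y \sim \mathrm{Poi}(\mu)$ independent and the target $\mathbb{E}[X \mid X+Y]$: for $0 \le k \le n$,
\[
\mathbb{P}(X = k \mid X+Y = n)
= \frac{e^{-\lambda}\frac{\lambda^{k}}{k!}\; e^{-\mu}\frac{\mu^{n-k}}{(n-k)!}}{e^{-(\lambda+\mu)}\frac{(\lambda+\mu)^{n}}{n!}}
= \binom{n}{k} \Big(\frac{\lambda}{\lambda+\mu}\Big)^{k} \Big(\frac{\mu}{\lambda+\mu}\Big)^{n-k},
\]
where the denominator uses the Stochastics 1 fact $X+Y \sim \mathrm{Poi}(\lambda+\mu)$. Hence $X \mid X+Y = n \sim \mathrm{Bin}\big(n, \frac{\lambda}{\lambda+\mu}\big)$ and
\[
\mathbb{E}[X \mid X+Y] = \frac{\lambda}{\lambda+\mu}\,(X+Y).
\]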
Exercise 2 ((Sub-/Super-)Martingales).
(i)
The (sub-/super-)martingale property holds over several time steps, i.e.
Solution.
We use induction over
with induction start , so assuming the statement holds for we have for
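Since the statement and the induction step are missing above, here is a minimal sketch for the martingale case, assuming the claim is $\mathbb{E}[X_n \mid \mathcal{F}_m] = X_m$ for all $m \le n$ (with $\ge$ for sub- and $\le$ for supermartingales): the case $n = m$ is trivial, and if the claim holds for $n$, the tower property gives
\[
\mathbb{E}[X_{n+1} \mid \mathcal{F}_m]
= \mathbb{E}\big[\,\mathbb{E}[X_{n+1} \mid \mathcal{F}_n]\,\big|\,\mathcal{F}_m\big]
= \mathbb{E}[X_n \mid \mathcal{F}_m]
= X_m .
\]
In the sub-/supermartingale case the last two equalities become $\ge$ / $\le$ by monotonicity of conditional expectation.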
(ii)
The expectation increases/decreases/stays constant, i.e.
Solution.
Simply taking the expectation in the result of part (i) yields the claim. ∎
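A one-line sketch of this step, assuming part (ii) claims $\mathbb{E}[X_n] \ge \mathbb{E}[X_0]$ for submartingales ($\le$ for super-, $=$ for martingales):
\[
\mathbb{E}[X_n] = \mathbb{E}\big[\,\mathbb{E}[X_n \mid \mathcal{F}_0]\,\big] \;\ge\; \mathbb{E}[X_0]
\]
by part (i) with $m = 0$; the other two cases are analogous.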
(iii)
is a supermartingale iff is a submartingale.
Solution.
Adaptedness and
follow from both statements. So let be a supermartingale; then we have
therefore
is a submartingale. The opposite direction is analogous. ∎
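The display between 'we have' and 'therefore' is missing above; a sketch of that step, with the assumed name $X = (X_n)_n$ for the process: if $X$ is a supermartingale, then for every $n$
\[
\mathbb{E}[-X_{n+1} \mid \mathcal{F}_n] = -\,\mathbb{E}[X_{n+1} \mid \mathcal{F}_n] \;\ge\; -X_n ,
\]
so $-X$ satisfies the submartingale inequality; adaptedness and integrability of $-X$ are inherited from $X$.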
Exercise 3 (Markov Inequality for Conditional Expectation).
Prove that for two random variables
Remark.
The proof works the same when one conditions on a sigma-algebra instead of a random variable .
Solution.
Simply using monotonicity of conditional expectation we get
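The statement and the chain of inequalities are missing above; a sketch, assuming the claim is $\mathbb{P}(|X| \ge a \mid Y) \le \frac{1}{a}\,\mathbb{E}\big[|X| \,\big|\, Y\big]$ a.s. for every $a > 0$: from the pointwise bound $a\,\mathbf{1}_{\{|X| \ge a\}} \le |X|$, monotonicity and linearity of conditional expectation give
\[
a\,\mathbb{P}(|X| \ge a \mid Y)
= \mathbb{E}\big[a\,\mathbf{1}_{\{|X| \ge a\}} \,\big|\, Y\big]
\;\le\; \mathbb{E}\big[|X| \,\big|\, Y\big] \quad \text{a.s.},
\]
and dividing by $a$ yields the claim.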
Exercise 4.
Let
Solution.
We know that
with
With
we can write the density of this vector explicitly
Lastly we have
This allows us to calculate the conditional density
This finally implies
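The random vector and all densities of this exercise are missing above; as a stand-in, here is a small illustration of the same procedure (joint density, marginal, conditional density, conditional expectation) with a purely hypothetical density that is not the one from the sheet:
\[
f_{X,Y}(x,y) = (x+y)\,\mathbf{1}_{[0,1]^2}(x,y), \qquad
f_Y(y) = \int_0^1 (x+y)\,dx = \tfrac{1}{2} + y,
\]
\[
f_{X \mid Y}(x \mid y) = \frac{x+y}{\tfrac{1}{2}+y}\,\mathbf{1}_{[0,1]}(x), \qquad
\mathbb{E}[X \mid Y = y] = \int_0^1 x\,\frac{x+y}{\tfrac{1}{2}+y}\,dx = \frac{2+3y}{3\,(1+2y)}, \quad y \in [0,1].
\]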
Exercise 5 (Stopping Times).
(i)
Suppose
are -stopping times. Prove that , , and are stopping times.
Solution.
is a stopping time iff . So we just need to check this definition in every case:
(a)
(b)
(c)
(d)
∎
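The objects of part (i) and the case-by-case displays are missing; here is a sketch for the usual candidates $\sigma \wedge \tau$, $\sigma \vee \tau$ and $\sigma + \tau$ in discrete time (which of these the sheet actually asks for is an assumption): for every $n$,
\[
\{\sigma \wedge \tau \le n\} = \{\sigma \le n\} \cup \{\tau \le n\} \in \mathcal{F}_n, \qquad
\{\sigma \vee \tau \le n\} = \{\sigma \le n\} \cap \{\tau \le n\} \in \mathcal{F}_n,
\]
\[
\{\sigma + \tau \le n\} = \bigcup_{k=0}^{n} \big(\{\sigma = k\} \cap \{\tau \le n-k\}\big) \in \mathcal{F}_n,
\]
using $\{\sigma = k\} \in \mathcal{F}_k \subseteq \mathcal{F}_n$ and $\{\tau \le n-k\} \in \mathcal{F}_{n-k} \subseteq \mathcal{F}_n$.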
(ii)
Suppose
are stopping times. Prove and .
Solution.
To prove
one needs to prove , so
Similarly we have
, therefore their intersection is also in . ∎
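Since the two claims of (ii) are not reproduced, here is a sketch assuming the usual formulation $\sigma \le \tau \Rightarrow \mathcal{F}_\sigma \subseteq \mathcal{F}_\tau$ and $\mathcal{F}_{\sigma \wedge \tau} = \mathcal{F}_\sigma \cap \mathcal{F}_\tau$, with $\mathcal{F}_\sigma = \{A \in \mathcal{F} : A \cap \{\sigma \le n\} \in \mathcal{F}_n \text{ for all } n\}$: if $\sigma \le \tau$ and $A \in \mathcal{F}_\sigma$, then
\[
A \cap \{\tau \le n\} = \big(A \cap \{\sigma \le n\}\big) \cap \{\tau \le n\} \in \mathcal{F}_n ,
\]
so $A \in \mathcal{F}_\tau$. Applied to $\sigma \wedge \tau \le \sigma$ and $\sigma \wedge \tau \le \tau$ this gives $\mathcal{F}_{\sigma \wedge \tau} \subseteq \mathcal{F}_\sigma \cap \mathcal{F}_\tau$, and for $A \in \mathcal{F}_\sigma \cap \mathcal{F}_\tau$,
\[
A \cap \{\sigma \wedge \tau \le n\} = \big(A \cap \{\sigma \le n\}\big) \cup \big(A \cap \{\tau \le n\}\big) \in \mathcal{F}_n ,
\]
which yields the reverse inclusion.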
(iii)
(Last Hitting Time Is Not a Stopping Time in General) Consider the stochastic process
defined by , and for all . Let be the natural filtration generated by . Show that the last hitting time of is not a stopping time.
Solution.
We have
and
therefore
is not a stopping time since
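The concrete process and the final displays are missing above; as a stand-in, here is a minimal hypothetical example exhibiting the same phenomenon (the process below is an assumption, not the one from the sheet): let $\xi$ be a fair coin with values in $\{0,1\}$ and set $X_0 = 0$, $X_1 = \xi$, $X_n = 1$ for $n \ge 2$, with natural filtration $\mathcal{F}_n = \sigma(X_0, \dots, X_n)$. The last hitting time of $0$ is
\[
L = \max\{n : X_n = 0\} =
\begin{cases}
0, & \xi = 1,\\
1, & \xi = 0,
\end{cases}
\]
so $\{L \le 0\} = \{\xi = 1\}$ depends on $X_1$, has probability $\tfrac{1}{2}$ and therefore does not lie in $\mathcal{F}_0 = \{\emptyset, \Omega\}$; hence $L$ is not a stopping time.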