Sheet 4

Prof. Leif Döring, Felix Benning
Course: Wahrscheinlichkeitstheorie 1
Semester: FSS 2022
Tutorial date: 14.03.2022
Due: 10:15 in the exercise on Monday 14.03.2022
Exercise 1 (Scheffé’s Lemma).

Suppose that (Xn, n ≥ 1) is a sequence of random variables in L¹(Ω, 𝒜, ℙ) such that Xn → X almost surely as n → ∞. Show that the following statements are equivalent:

  1. (i)

    Xn → X in L¹ as n → ∞;

  2. (ii)

    𝔼[|Xn|] → 𝔼[|X|] < ∞ as n → ∞.

Hint.

Apply Fatou's Lemma to |Xn| + |X| - |Xn - X| ≥ 0.

Solution.

(i) ⇒ (ii): Follows from the reverse triangle inequality

||x| - |y|| ≤ |x - y|, (1)

and Jensen’s inequality

|𝔼[|Xn|] - 𝔼[|X|]| = |𝔼[|Xn| - |X|]|
 ≤ 𝔼[||Xn| - |X||]   (Jensen)
 ≤ 𝔼[|Xn - X|] → 0  (n → ∞),   (by (1))

(ii) ⇒ (i): By Fatou's Lemma we have

2𝔼[|X|] = 𝔼[lim inf_{n→∞} (|Xn| + |X| - |Xn - X|)]
 ≤ lim inf_{n→∞} 𝔼[|Xn| + |X| - |Xn - X|]   (Fatou)
 = 2𝔼[|X|] - lim sup_{n→∞} 𝔼[|Xn - X|].

This implies

lim sup_{n→∞} 𝔼[|Xn - X|] ≤ 0,

and therefore Xn → X in L¹. ∎
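To see why assumption (ii) cannot be dropped, consider the classic counterexample Xn = n·𝟙_{U < 1/n} with U uniform on (0, 1): Xn → 0 almost surely, yet 𝔼[|Xn|] = 1 for all n. A small Monte Carlo sketch in Python (illustrative, not part of the sheet):

```python
import random

# Sketch of the standard counterexample X_n = n * 1_{U < 1/n}, U ~ Unif(0, 1):
# X_n -> 0 almost surely, but E[|X_n|] = 1 for every n, so condition (ii)
# fails and X_n does not converge to 0 in L^1 -- the mass escapes on the
# shrinking event {U < 1/n}.
random.seed(0)
N_PATHS = 200_000
us = [random.random() for _ in range(N_PATHS)]

means = {}          # Monte Carlo estimates of E[|X_n|]
frac_nonzero = {}   # fraction of paths with X_n != 0, i.e. approx P(U < 1/n)
for n in (10, 100, 1000):
    means[n] = sum(n for u in us if u < 1 / n) / N_PATHS
    frac_nonzero[n] = sum(1 for u in us if u < 1 / n) / N_PATHS

print(means)         # all close to 1, although the a.s. limit has E[|X|] = 0
print(frac_nonzero)  # shrinks like 1/n
```

The estimates of 𝔼[|Xn|] stay near 1 while the event carrying the mass shrinks, which is exactly the behaviour ruled out by condition (ii).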

Exercise 1 (Generalized Scheffé’s Lemma - Optional).

Assume that (Xn)_{n≥1} converges in probability to X ∈ L¹. Then the following statements are equivalent:

  1. (i)

    𝔼[|Xn|] → 𝔼[|X|] < ∞ as n → ∞;

  2. (ii)

    For all ϵ > 0 we have lim sup_{n→∞} 𝔼[|Xn| 𝟙_{|X - Xn| > ϵ}] ≤ ϵ;

  3. (iii)

    {X, X1, X2, …} is uniformly integrable;

  4. (iv)

    Xn → X in L¹ as n → ∞.

Hint.

Prove them in order. For (i) ⇒ (ii) you may want to use

𝔼[|Xn| 𝟙_{|X - Xn| > ϵ}]
 = 𝔼[|Xn|] - 𝔼[|Xn| 𝟙_{|X - Xn| ≤ ϵ}]
 = (𝔼[|Xn|] - 𝔼[|X|]) + 𝔼[(|X| - |Xn|) 𝟙_{|X - Xn| ≤ ϵ}] + 𝔼[|X| 𝟙_{|X - Xn| > ϵ}],

where the first summand is controlled by the similar total mass and the last by convergence in probability.

For (ii) ⇒ (iii) you might want to ponder how |Xn| can be larger than some M when |X| is smaller than M - ϵ.

Solution.

(i) ⇒ (ii): Fix ϵ > 0 and choose any η > 0. Then there exists n0 such that

𝔼[|Xn|] - 𝔼[|X|] ≤ η/2  for all n ≥ n0. (2)

Since {X} is uniformly integrable, there exists δ(η) > 0 such that

𝔼[|X| 𝟙_A] ≤ η/2  for all A ∈ 𝒜 with ℙ(A) < δ(η).

Since Xn → X in probability, there therefore exists n1 such that for all n ≥ n1

ℙ(|X - Xn| > ϵ) < δ(η)  ⟹  𝔼[|X| 𝟙_{|X - Xn| > ϵ}] ≤ η/2. (3)

Now we plug things together. For n ≥ max{n0, n1} we have

𝔼[|Xn| 𝟙_{|X - Xn| > ϵ}] = 𝔼[|Xn|] - 𝔼[|Xn| 𝟙_{|X - Xn| ≤ ϵ}]
 = (𝔼[|Xn|] - 𝔼[|X|]) + 𝔼[(|X| - |Xn|) 𝟙_{|X - Xn| ≤ ϵ}] + 𝔼[|X| 𝟙_{|X - Xn| > ϵ}]
 ≤ η/2 + ϵ + η/2,

where the first summand is bounded using (2), the second by 𝔼[|X - Xn| 𝟙_{|X - Xn| ≤ ϵ}] ≤ ϵ, and the third using (3).

Since η > 0 was arbitrary, this implies the claim. The first summand exploits the fact that the total mass of the random variables has to be similar, so we cannot concentrate mass on a small event such as Xn being far from X; this is exactly what examples of convergence in probability without L¹ convergence exploit.

(ii) ⇒ (iii): We want to show

lim_{M→∞} sup_n 𝔼[|Xn| 𝟙_{|Xn| > M}] = 0.

So fix some δ > 0. We select ϵ := δ/4 and then select n0 such that for all n ≥ n0 we have

𝔼[|Xn| 𝟙_{|X - Xn| > ϵ}] ≤ ϵ + δ/4 = δ/2. (4)

As {X1, …, X_{n0}} is a finite set it is uniformly integrable, and there exists M0 such that

sup_{n ≤ n0} 𝔼[|Xn| 𝟙_{|Xn| > M}] ≤ δ  for all M ≥ M0. (5)

Similarly there exists M1 such that

𝔼[|X| 𝟙_{|X| > M - ϵ}] ≤ δ/4  for all M ≥ M1. (6)

To put things together, note that we always have

𝟙_{|Xn| > M} ≤ 𝟙_{|X - Xn| > ϵ} + 𝟙_{|X| > M - ϵ} 𝟙_{|X - Xn| ≤ ϵ}. (7)

This implies for all M ≥ max{M0, M1}

sup_n 𝔼[|Xn| 𝟙_{|Xn| > M}]
 ≤ max{ sup_{n ≤ n0} 𝔼[|Xn| 𝟙_{|Xn| > M}], sup_{n ≥ n0} ( 𝔼[|Xn| 𝟙_{|X - Xn| > ϵ}] + 𝔼[|Xn| 𝟙_{|X| > M - ϵ} 𝟙_{|X - Xn| ≤ ϵ}] ) }   (by (7))
 ≤ max{ δ, δ/2 + 𝔼[|X| 𝟙_{|X| > M - ϵ}] + ϵ } ≤ max{ δ, δ/2 + δ/4 + δ/4 } = δ,

where the first supremum is bounded using (5), the first summand of the second using (4), and the remaining summand by |Xn| ≤ |X| + ϵ on {|X - Xn| ≤ ϵ} together with (6) and ϵ = δ/4.

(iii) ⇒ (iv): Fix some ϵ > 0. Due to uniform integrability there exists some δ > 0 such that ℙ(A) < δ implies for all n

𝔼[|Xn| 𝟙_A] ≤ ϵ  and  𝔼[|X| 𝟙_A] ≤ ϵ. (8)

Now we choose n0 such that

ℙ(|X - Xn| > ϵ) < δ  for all n ≥ n0.

With (8) this implies for all n ≥ n0

𝔼[|X - Xn|] ≤ 𝔼[|X - Xn| 𝟙_{|X - Xn| ≤ ϵ}] + 𝔼[|X| 𝟙_{|X - Xn| > ϵ}] + 𝔼[|Xn| 𝟙_{|X - Xn| > ϵ}] ≤ ϵ + ϵ + ϵ = 3ϵ.

(iv) ⇒ (i): This finally follows from the reverse triangle inequality

||x| - |y|| ≤ |x - y|, (9)

and Jensen’s inequality

|𝔼[|Xn|] - 𝔼[|X|]| = |𝔼[|Xn| - |X|]|
 ≤ 𝔼[||Xn| - |X||]   (Jensen)
 ≤ 𝔼[|Xn - X|] → 0  (n → ∞).   (by (9)) ∎
Exercise 2 (Lemma 6.3.5).

Let (Xn) be an (ℱn)-submartingale and S, T be bounded stopping times such that S ≤ T a.s. Prove that 𝔼[X_S] ≤ 𝔼[X_T].

Hint.

Consider the stochastic integral of H = (Hn) against X with Hn := 𝟙_{S ≤ n-1 < T}.

Solution.

Since S,T are stopping times, we have that

Hn := 𝟙_{S ≤ n-1 < T} = 𝟙_{S ≤ n-1} 𝟙_{T > n-1} = 𝟙_{S ≤ n-1} (1 - 𝟙_{T ≤ n-1})

is ℱ_{n-1}-measurable and therefore the process H = (Hn) is previsible. The stochastic integral

(H·X)_n = Σ_{k=1}^n 𝟙_{S ≤ k-1 < T} (X_k - X_{k-1}) = Σ_{k=1}^n 𝟙_{S+1 ≤ k} 𝟙_{k ≤ T} (X_k - X_{k-1}) = Σ_{k=(S∧n)+1}^{T∧n} (X_k - X_{k-1})
 = X_{T∧n} - X_{S∧n}

is a submartingale since Hn is nonnegative. Since S, T are bounded, there exists N ∈ ℕ with S ≤ T ≤ N, so that

𝔼[X_T] - 𝔼[X_S] = 𝔼[X_{T∧N} - X_{S∧N}] = 𝔼[(H·X)_N] ≥ 𝔼[(H·X)_0] = 0. ∎
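The optional-stopping inequality above can be illustrated numerically. The following Monte Carlo sketch (all concrete choices are mine, not from the sheet) uses an asymmetric ±1 walk with upward drift as the submartingale, S = min(first hit of +2, 5) and the constant time T = 10 as bounded stopping times with S ≤ T:

```python
import random

# Monte Carlo sketch (illustrative choices): X_n is a random walk with steps
# +1 w.p. 0.6 and -1 w.p. 0.4, hence a submartingale.  S = min(first time the
# walk hits +2, 5) and T = 10 are bounded stopping times with S <= T almost
# surely, so Lemma 6.3.5 predicts E[X_S] <= E[X_T].
random.seed(1)
N_PATHS = 100_000
sum_XS = sum_XT = 0.0
for _ in range(N_PATHS):
    x, XS = 0, None
    for n in range(1, 11):               # run the walk up to the bound T = 10
        x += 1 if random.random() < 0.6 else -1
        if XS is None and (x >= 2 or n == 5):
            XS = x                       # value at S = min(hit time of +2, 5)
    sum_XS += XS
    sum_XT += x                          # value at the deterministic time T = 10

mean_XS, mean_XT = sum_XS / N_PATHS, sum_XT / N_PATHS
print(mean_XS, mean_XT)                  # E[X_T] = 10 * (0.6 - 0.4) = 2 exactly
```

The estimate of 𝔼[X_S] stays visibly below 𝔼[X_T] = 2, as the lemma demands.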
Exercise 3 (Uniform Integrability).
  1. (i)

    (Example 6.3.12 (i)) Prove that every finite set of integrable random variables is uniformly integrable.

    Solution.

The first property follows from the fact that the random variables in the set are integrable. The second property follows from

    lim_{M→∞} ( sup_{α∈I} 𝔼[|Xα| 𝟙_{|Xα| ≥ M}] ) ≤ lim_{M→∞} ( Σ_{α∈I} 𝔼[|Xα| 𝟙_{|Xα| ≥ M}] )
     = Σ_{α∈I} lim_{M→∞} 𝔼[|Xα| 𝟙_{|Xα| ≥ M}] = 0   (DCT),

    where we use that limits can be interchanged with finite sums. ∎

  2. (ii)

    If Xα ∼ δ_{xα}, then the family (Xα)_{α∈I} is uniformly integrable if and only if the set S = {xα : α ∈ I} is bounded.

    Solution.

    Assume that S is bounded. Then there exists some M0 > 0 such that |x| < M0 for all x ∈ S. Therefore we have for all M ≥ M0 that 𝟙_{|Xα| ≥ M} = 0 a.s. for all α ∈ I, which implies

    sup_{α∈I} 𝔼[|Xα| 𝟙_{|Xα| ≥ M}] = 0  for all M ≥ M0.

    In particular the limit in M is zero. Assume on the other hand that S is not bounded. Then for every M there exists some β ∈ I with |x_β| ≥ M ∨ 1, implying

    sup_{α∈I} 𝔼[|Xα| 𝟙_{|Xα| ≥ M}] ≥ 𝔼[|X_β| 𝟙_{|X_β| ≥ M}] = |x_β| ≥ 1.

    Therefore the limit superior in M is at least 1, so we know that (Xα)_{α∈I} is not uniformly integrable. ∎

Exercise 4 (Uniqueness of Limits in Probability).
  1. (i)

    (Triangle Inequality of Probability) Prove that for random variables X, Y, Z with values in some metric space with metric d we have

    {d(X, Y) ≥ ϵ} ⊆ {d(X, Z) ≥ ϵ/2} ∪ {d(Z, Y) ≥ ϵ/2}.
    Solution.

    Assume ω is in the set on the left. Then due to

    ϵ ≤ d(X, Y)(ω) ≤ d(X, Z)(ω) + d(Z, Y)(ω) ≤ 2 max{d(X, Z)(ω), d(Z, Y)(ω)}

    we have either ϵ/2 ≤ d(X, Z)(ω) or ϵ/2 ≤ d(Z, Y)(ω), therefore ω has to be in the union of sets on the right. ∎
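Since the inclusion holds pointwise (for every ω), it can be checked mechanically. A small sketch on ℝ with d(x, y) = |x - y| and standard normal samples (all choices illustrative):

```python
import random

# Pointwise check of {d(X,Y) >= eps} inside {d(X,Z) >= eps/2} u {d(Z,Y) >= eps/2}
# on R with d(x, y) = |x - y|.  No sample can violate the inclusion, since it
# follows omega-wise from the triangle inequality.
random.seed(2)
eps = 0.5
violations = 0
for _ in range(100_000):
    X, Y, Z = random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)
    left = abs(X - Y) >= eps
    right = abs(X - Z) >= eps / 2 or abs(Z - Y) >= eps / 2
    if left and not right:
        violations += 1
print(violations)  # 0
```

The count is always zero because the inclusion is a deterministic fact about each ω, not merely a probabilistic bound.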

Let us now assume that Xn → X and Xn → Y in probability.

  2. (ii)

    Prove that ℙ(d(X, Y) ≥ ϵ) = 0 for all ϵ > 0.

    Hint.

    Recall the proof of uniqueness for deterministic sequences. Apply a similar trick.

    Solution.

    As the triangle inequality of probability with Z = Xn holds for all n, it also holds in the limit; therefore

    ℙ(d(X, Y) ≥ ϵ) ≤ lim sup_{n→∞} ℙ({d(X, Xn) ≥ ϵ/2} ∪ {d(Xn, Y) ≥ ϵ/2})
     ≤ lim_{n→∞} [ ℙ(d(X, Xn) ≥ ϵ/2) + ℙ(d(Xn, Y) ≥ ϵ/2) ]
     = 0. ∎
  3. (iii)

    Conclude that X=Y a.s.

    Solution.

    We can calculate the complementary event by

    ℙ(X ≠ Y) = ℙ( ⋃_{k≥1} {d(X, Y) ≥ 1/k} ) = lim_{k→∞} ℙ(d(X, Y) ≥ 1/k) = 0,

    using (ii) and continuity from below for the increasing union. ∎
Exercise 5 (Bound on Stopping Times).

Let T be an (ℱn)_{n≥0}-stopping time. Suppose that there exist N ∈ ℕ and ε ∈ (0, 1) such that for every n ∈ ℕ,

ℙ(T ≤ n + N | ℱn) > ε,  almost surely.
  1. (i)

    Prove that for each k ∈ ℕ, we have ℙ(T > kN) ≤ (1 - ε)^k.

    Hint.

    ℙ(T > kN) = ℙ(T > kN, T > (k-1)N).

    Solution.

    Using induction, we have the statement for k = 0, as (1 - ϵ)^0 = 1. The induction step is:

    ℙ(T > (k+1)N) = 𝔼[𝟙_{T > (k+1)N} 𝟙_{T > kN}]
     = 𝔼[ 𝔼[𝟙_{T > (k+1)N} | ℱ_{kN}] 𝟙_{T > kN} ]
     = 𝔼[ 𝟙_{T > kN} ℙ(T > kN + N | ℱ_{kN}) ]
     ≤ (1 - ϵ) ℙ(T > kN)   (since ℙ(T > kN + N | ℱ_{kN}) = 1 - ℙ(T ≤ kN + N | ℱ_{kN}) ≤ 1 - ϵ)
     ≤ (1 - ϵ)^{k+1}   (induction hypothesis).
  2. (ii)

    Deduce that 𝔼[T] < ∞.

    Solution.

    As T is nonnegative we have

    𝔼[T] = ∫_0^∞ ∫_0^∞ 𝟙_{s<t} ds dℙ_T(t)
     = ∫_0^∞ ∫_0^∞ 𝟙_{s<t} dℙ_T(t) ds   (Tonelli)
     = Σ_{k=0}^∞ ∫_{kN}^{(k+1)N} ℙ(T > s) ds
     ≤ Σ_{k=0}^∞ N ℙ(T > kN)   (ℙ(T > s) ≤ ℙ(T > kN) for s ≥ kN)
     ≤ N Σ_{k=0}^∞ (1 - ϵ)^k = N/ϵ < ∞.
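The bound is sharp in the geometric case. A quick simulation sketch (the p-coin and the concrete values are my illustrative choices): for T the first success of a p-coin, ℙ(T ≤ n + 1 | ℱn) ≥ p, so part (i) with N = 1 and ε arbitrarily close to p gives ℙ(T > k) ≤ (1 - p)^k (here with equality), and part (ii) bounds 𝔼[T] by N/ε, matching 𝔼[T] = 1/p:

```python
import random

# Geometric sanity check: T = first success of a p-coin.  Then
# P(T <= n + 1 | F_n) >= p, so Exercise 5 (with N = 1 and eps close to p)
# bounds P(T > k) by (1 - p)^k -- an equality here -- and E[T] by 1/p.
random.seed(3)
p, N_PATHS = 0.3, 200_000
samples = []
for _ in range(N_PATHS):
    t = 1
    while random.random() >= p:          # keep flipping until the first success
        t += 1
    samples.append(t)

mean_T = sum(samples) / N_PATHS
tail_3 = sum(1 for t in samples if t > 3) / N_PATHS
print(mean_T, tail_3)   # approx 1/p = 3.33 and (1 - p)^3 = 0.343
```

Both estimates line up with the theoretical bounds of parts (i) and (ii).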
Exercise 6 (Monkey Typewriter Theorem).

At each of times 1,2,, a monkey types a capital letter at random, the sequence of letters typed forming an i.i.d. sequence of random variables chosen uniformly amongst the 26 possible capital letters. Let T be the first time by which the monkey has produced the consecutive sequence ABRACADABRA of 11 letters. We prove that

𝔼[T] = 26^11 + 26^4 + 26.

For this, imagine that just before each time n, a new gambler indexed by n arrives on the scene. He bets 1 euro that

the n-th letter will be A.

If he loses, he loses all his money and leaves. If he wins, he receives 26 euros, all of which he bets on the event that

the n+1-th letter will be B.

If he loses, he loses all his money and leaves. If he wins, he bets his whole current fortune of 26^2 that

the (n+2)-th letter will be R.

and so on through the ABRACADABRA sequence. All gamblers come and play and leave independently of each other. Let (X_i^k, i ≥ 0) denote the fortune of the k-th gambler just after time i; in particular, X_i^k = 1 for all i ≤ k - 1.

  1. (i)

    Show that indeed 𝔼[T] < ∞.

    Hint.

    See Exercise 5.

    Solution.

    We see that T is a stopping time with respect to (ℱn)_{n≥0}, and ℙ(T ≤ n + 11 | ℱn) ≥ 26^{-11} for every n ≥ 0, since the next 11 letters spell ABRACADABRA with probability 26^{-11}. Thus 𝔼[T] < ∞ by Exercise 5, applied with N = 11 and any ε < 26^{-11}. ∎

For simplicity of notation, we can add infinitely many Z's after ABRACADABRA. Gamblers then bet on the word ABRACADABRAZZZZZ… We denote by L_k the k-th letter of this word for every k ≥ 1. We also denote by U_n the n-th letter typed for every n ≥ 1, and by (ℱn)_{n≥0} the corresponding filtration.

  2. (ii)

    For every n ≥ 1, let Mn be the total net profit of all gamblers who have participated in the game just after time n, i.e.

    Mn = Σ_{k=1}^n (X_n^k - 1),

    and M0 := 0. Show that (Mn)_{n≥0} is a martingale.

    Hint.

    Show that M_n^k := X_n^k - 1 is a martingale for every k, by defining M_n^k := 0 for all n < k.

    Solution.

    By definition, (Mn)_{n≥0} is adapted to the filtration (ℱn)_{n≥0} and it is also clearly integrable. The net profit of the first gambler M_n^1 := X_n^1 - 1 is as follows:

    M_0^1 = 0,  M_1^1 = 26 𝟙_{U_1 = L_1} - 1,  M_2^1 = 26 (M_1^1 + 1) 𝟙_{U_2 = L_2} - 1,

    and more generally, for every n ≥ 0,

    M_{n+1}^1 = 26 (M_n^1 + 1) 𝟙_{U_{n+1} = L_{n+1}} - 1,

    and then

    𝔼[M_{n+1}^1 | ℱn] = 26 (M_n^1 + 1) ℙ(U_{n+1} = L_{n+1}) - 1 = M_n^1.

    So (M_n^1)_{n≥0} is a martingale with respect to (ℱn)_{n≥0}. More generally, for every k ≥ 1, M_n^k = 0 for every n < k and, for every n ≥ k - 1,

    M_{n+1}^k = 26 (M_n^k + 1) 𝟙_{U_{n+1} = L_{n+2-k}} - 1,

    and (M_n^k)_{n≥0} is also a martingale with respect to (ℱn)_{n≥0}. For every n ≥ 1, Mn = Σ_{k=1}^n M_n^k and therefore (Mn)_{n≥0} is a martingale with respect to (ℱn)_{n≥0}. ∎

  3. (iii)

    Show that T = inf{n ≥ 0 : Mn + n ≥ 26^11 + 26^4 + 26}. Conclude that 𝔼[T] = 26^11 + 26^4 + 26.

    Solution.

    If T = n, then all gamblers who started before time n - 10 (where ABRACADABRA starts) have lost at time n, so that M_n^k = -1 for every k ≤ n - 11. The (n-10)-th gambler is the winner: M_n^{n-10} = 26^11 - 1. We also have M_n^{n-3} = 26^4 - 1 (because the first four and the last four letters of ABRACADABRA are the same, viz. ABRA) and for the same reason, M_n^n = 26 - 1. Finally, M_n^k = -1 for every k ∈ {n-9, …, n-4} ∪ {n-2, n-1}. Putting all pieces together,

    M_n 𝟙_{T=n} = (26^11 + 26^4 + 26 - n) 𝟙_{T=n}.

    Therefore, almost surely,

    T ∈ {n ≥ 0 : Mn + n ≥ 26^11 + 26^4 + 26}.

    We also see that if n < T, we have

    Mn = Σ_{k=n-9}^n M_n^k + M_n^{n-10} + Σ_{k=1}^{n-11} M_n^k
     ≤ Σ_{k=n-9}^n (26^{n+1-k} - 1) - 1 - (n - 11)
     = Σ_{k=0}^9 26^{k+1} - n
     = 26 (26^10 - 1)/(26 - 1) - n ≤ 26 (26^10 - 1) - n
     < 26^11 + 26^4 + 26 - n,

    using M_n^k ≤ 26^{n+1-k} - 1 for k ≥ n - 9, M_n^{n-10} = -1 (as n < T), and M_n^k = -1 for k ≤ n - 11; for n ≤ 10 the same bound follows directly from M_n^k ≤ 26^{n+1-k} - 1 for all k ≤ n.

    Thus

    T = inf{n ≥ 0 : Mn + n ≥ 26^11 + 26^4 + 26}.

    Moreover, since T < ∞ almost surely,

    M_T + T = Σ_{n=0}^∞ (Mn + n) 𝟙_{T=n} = Σ_{n=0}^∞ (26^11 + 26^4 + 26) 𝟙_{T=n} = 26^11 + 26^4 + 26.

    Now for every n ≥ 0, -T ≤ -(n ∧ T) ≤ M_{n∧T} ≤ 26^11 + 26^4 + 26. Since T ∈ L¹, we derive from the dominated convergence theorem

    𝔼[M_T] = lim_{n→∞} 𝔼[M_{n∧T}] = 0.

    Then, finally,

    𝔼[T] = 𝔼[M_T + T - M_T] = 𝔼[M_T + T] - 𝔼[M_T] = 26^11 + 26^4 + 26. ∎
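The identity can be sanity-checked by simulation on a smaller instance (my choice of word and alphabet, not from the sheet): over the 2-letter alphabet {A, B}, the word ABAB overlaps itself in the full word and in the prefix/suffix AB, so the same gambler argument predicts 𝔼[T] = 2^4 + 2^2 = 20:

```python
import random

# Monte Carlo sketch of the gambler identity on a small instance: alphabet
# {A, B}, target word "ABAB".  Its self-overlaps have lengths 4 ("ABAB")
# and 2 ("AB"), so the martingale argument predicts E[T] = 2**4 + 2**2 = 20.
random.seed(4)
WORD, ALPHABET = "ABAB", "AB"
N_PATHS = 200_000

total = 0
for _ in range(N_PATHS):
    window, t = "", 0
    while not window.endswith(WORD):
        # keep only the last len(WORD) letters typed so far
        window = (window + random.choice(ALPHABET))[-len(WORD):]
        t += 1
    total += t

est = total / N_PATHS
print(est)   # approx 20
```

Replacing the word and alphabet by ABRACADABRA over 26 letters gives exactly the 26^11 + 26^4 + 26 of the exercise, though that value is far too large to estimate by simulation.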