Sheet 5

Prof. Leif Döring, Felix Benning
Course: Wahrscheinlichkeitstheorie 1 · Semester: FSS 2022 · Tutorial date: 21.03.2022 · Due: 10:15 in the exercise on Monday, 21.03.2022
Exercise 1 (Uniform Integrability).
  1. (i)

    (Prop. 6.3.14 (ii)⇒(i)) Let (X_α, α∈I) be a family of random variables with sup_{α∈I} 𝔼[|X_α|] < ∞, i.e. bounded in ℒ¹(Ω, 𝒜, ℙ). Suppose that for every ϵ>0 there exists δ>0 such that for all A∈𝒜 with ℙ(A)<δ, we have

    𝔼[|X_α| 𝟙_A] < ϵ for all α∈I.

    Prove that the Xα are uniformly integrable.

    Solution.

    Let ϵ>0. By the Markov inequality we have

    ℙ(|X_β| ≥ M) ≤ 𝔼[|X_β|]/M ≤ sup_{α∈I} 𝔼[|X_α|]/M =: c/M.

    So for all M ≥ M_0 := c/δ(ϵ) and for all β∈I, we have

    ℙ(|X_β| ≥ M) ≤ δ(ϵ),

    where δ(ϵ) is chosen such that, by assumption, we get

    𝔼[|X_α| 𝟙_{|X_β| ≥ M}] ≤ ϵ for all α∈I.

    Since δ(ϵ) does not depend on β, this holds for all β∈I. So in particular we have

    sup_{α∈I} 𝔼[|X_α| 𝟙_{|X_α| ≥ M}] ≤ ϵ for all M ≥ M_0.

    As ϵ>0 was arbitrary, this is by definition uniform integrability. ∎

  2. (ii)

    Prove that a sequence of identically distributed random variables {X_i, i∈ℕ} with 𝔼[|X_1|] < ∞ is uniformly integrable.

    Solution.

    Since 𝔼[|X_1|] < ∞, we deduce by the DCT

    lim_{M→∞} 𝔼[|X_1| 𝟙_{|X_1|>M}] = 𝔼[|X_1| lim_{M→∞} 𝟙_{|X_1|>M}] = 0,

    since 𝟙_{|X_1|>M} → 0 pointwise as M→∞.

    Therefore for every ϵ>0 there exists M>0 such that

    𝔼[|X_1| 𝟙_{|X_1|>M}] < ϵ.

    The sequence is identically distributed; therefore, for every i∈ℕ, we conclude that

    sup_{i∈ℕ} 𝔼[|X_i| 𝟙_{|X_i|>M}] = 𝔼[|X_1| 𝟙_{|X_1|>M}] < ϵ. ∎
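The vanishing-tail step can be checked numerically: for any integrable |X_1|, the tail expectation 𝔼[|X_1| 𝟙_{|X_1|>M}] decays to 0 as M grows. A minimal Monte Carlo sketch, assuming a standard exponential distribution for |X_1| (the distribution, sample size, and seed are arbitrary illustration choices, not part of the exercise):

```python
import math
import random

random.seed(0)
N = 200_000
# Sample |X_1| from Exp(1): integrable but unbounded.
samples = [random.expovariate(1.0) for _ in range(N)]

def tail_expectation(xs, M):
    """Monte Carlo estimate of E[|X| * 1_{|X| > M}]."""
    return sum(x for x in xs if x > M) / len(xs)

def exact_tail(M):
    """Closed form for Exp(1): integral of x e^{-x} over (M, inf) = (M + 1) e^{-M}."""
    return (M + 1) * math.exp(-M)

for M in [1.0, 2.0, 5.0, 10.0]:
    print(f"M = {M:4.1f}: estimate {tail_expectation(samples, M):.4f}, exact {exact_tail(M):.4f}")
```

The estimates decrease towards 0, matching the DCT argument; by identical distribution the same bound then holds simultaneously for every X_i.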
  3. (iii)

    Let (X_i, i∈ℕ) be a sequence of i.i.d. random variables with 𝔼[|X_1|] < ∞. Let S_n = ∑_{i=1}^n X_i for every n∈ℕ. Then the family {S_n/n, n∈ℕ} is uniformly integrable.

    Solution.

    Using (ii), (X_i, i∈ℕ) is uniformly integrable. By Proposition 6.3.14 (Theorem 6.24 in Klenke), for every ϵ>0 there exists δ>0 (dependent on ϵ but not on i) such that for every measurable set A with ℙ(A)<δ,

    sup_{i∈ℕ} 𝔼[|X_i| 𝟙_A] < ϵ.

    Then for every n∈ℕ,

    𝔼[(|S_n|/n) 𝟙_A] ≤ (1/n) ∑_{i=1}^n 𝔼[|X_i| 𝟙_A] < ϵ.

    We also have that (S_n/n, n∈ℕ) is bounded in ℒ¹:

    sup_{n∈ℕ} 𝔼[|S_n|/n] ≤ sup_{n∈ℕ} (1/n) 𝔼[∑_{i=1}^n |X_i|] = 𝔼[|X_1|] < ∞.

    Then by Proposition 6.3.14 (Theorem 6.24 in Klenke), (S_n/n, n∈ℕ) is uniformly integrable. ∎
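The ℒ¹ bound can be sanity-checked by simulation: 𝔼[|S_n|/n] never exceeds 𝔼[|X_1|]. A sketch assuming fair ±1 steps, so that 𝔼[|X_1|] = 1 (the step distribution, n values, and sample size are arbitrary illustration choices):

```python
import random

random.seed(1)
N = 20_000   # number of simulated paths per n

def mean_abs_average(n, p=0.5):
    """Monte Carlo estimate of E[|S_n|/n] for i.i.d. +/-1 steps with P(step = +1) = p."""
    total = 0.0
    for _ in range(N):
        s = sum(1 if random.random() < p else -1 for _ in range(n))
        total += abs(s) / n
    return total / N

for n in [1, 4, 16, 64]:
    print(f"n = {n:3d}: E[|S_n|/n] ~ {mean_abs_average(n):.4f}")
```

The estimates stay below 𝔼[|X_1|] = 1 and shrink with n, consistent with the sup bound above.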

Exercise 2 (Not Uniformly Integrable).

Let (U_n, n≥1) be i.i.d. Bernoulli random variables with ℙ(U_n=1) = p ∈ (0,1) and ℙ(U_n=0) = 1-p. Set τ := inf{n≥1 : U_n=1}. Define X_n := (1-p)^{-n} 𝟙_{τ>n}.

  1. (i)

    Show that (Xn) is a martingale (choose a suitable filtration).

    Solution.

    Choose ℱ_n = σ(U_1, …, U_n). Because of

    {τ>n} = {U_1=0, …, U_n=0} ∈ ℱ_n,

    Xn is adapted, and we have

    𝔼[X_n | ℱ_{n-1}] = (1-p)^{-n} 𝔼[𝟙_{τ>n} 𝟙_{τ>n-1} | ℱ_{n-1}]
    = (1-p)^{-n} 𝔼[𝟙_{τ>n-1} 𝟙_{U_n=0} | ℱ_{n-1}]
    = (1-p)^{-n} 𝟙_{τ>n-1} ℙ(U_n=0)
    = (1-p)^{-(n-1)} 𝟙_{τ>n-1}
    = X_{n-1}.

    By induction and positivity of X_n we also have 𝔼[|X_n|] = 𝔼[X_n] = 𝔼[X_0] = 1 < ∞. So X_n is a martingale. ∎

  2. (ii)

    Prove that X_n → 0 almost surely.

    Solution.

    As Xn is a martingale, we have

    1 = 𝔼[X_n] = (1-p)^{-n} 𝔼[𝟙_{τ>n}] = (1-p)^{-n} ℙ(τ>n),

    so we get

    ∑_{n≥0} ℙ(𝟙_{τ>n} > ϵ) = ∑_{n≥0} ℙ(τ>n) = ∑_{n≥0} (1-p)^n < ∞ for every ϵ ∈ (0,1).

    By Borel-Cantelli we therefore have

    ℙ(𝟙_{τ>n} → 0) = 1 - ℙ(∃ϵ>0 ∀n_0∈ℕ ∃n≥n_0 : 𝟙_{τ>n} > ϵ)
    = 1 - ℙ(⋃_{ϵ>0} ⋂_{n_0∈ℕ} ⋃_{n≥n_0} {𝟙_{τ>n} > ϵ})
    = 1 - lim_{ϵ↓0} ℙ(lim sup_{n→∞} {𝟙_{τ>n} > ϵ}) = 1 - 0 = 1,

    since the last probability is 0 by Borel-Cantelli (BC).

    𝟙_{τ>n} → 0 almost surely implies that for almost every ω there exists N(ω) large enough such that 𝟙_{τ>n}(ω) = 0 for all n > N(ω). So we can conclude that

    X_n(ω) = (1-p)^{-n} 𝟙_{τ>n}(ω) = 0

    for all n > N(ω). Thus, X_n → 0 a.s. ∎

  3. (iii)

    Prove that (Xn,n1) is not uniformly integrable.

    Solution.

    If (X_n)_{n∈ℕ} were uniformly integrable, then X_n → X in ℒ¹ and 𝔼[X] = 1 by Theorem 6.3.16. This is, however, a contradiction to X = 0 a.s., which follows from (ii). ∎
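The failure of uniform integrability is visible in simulation: 𝔼[X_n] stays at 1 while the event {X_n ≠ 0} = {τ > n} becomes very rare, so all the mass of X_n sits on an event of vanishing probability. A minimal sketch (p, n, and the sample size are arbitrary illustration choices):

```python
import random

random.seed(2)
p = 0.3       # success probability of each Bernoulli U_k
n = 8         # time at which we evaluate X_n = (1-p)^(-n) * 1_{tau > n}
N = 200_000   # number of simulated paths

# tau > n  iff  U_1 = ... = U_n = 0, i.e. n independent failures in a row.
survivals = sum(
    1 for _ in range(N) if all(random.random() >= p for _ in range(n))
)

prob_tau_gt_n = survivals / N                 # estimates P(tau > n) = (1-p)^n
mean_X_n = (1 - p) ** (-n) * prob_tau_gt_n    # estimates E[X_n] = 1

print(f"P(tau > {n}) ~ {prob_tau_gt_n:.4f} (exact {(1 - p) ** n:.4f})")
print(f"E[X_{n}] ~ {mean_X_n:.3f} (martingale value: 1)")
```

As n grows, ℙ(τ > n) = (1-p)^n shrinks geometrically while the surviving paths carry the huge value (1-p)^{-n}, which is exactly what uniform integrability forbids.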

Exercise 3 (Tail σ-Algebra).
  1. (i)

    A σ-algebra 𝒜 is trivial, i.e. 𝒜 = {∅, Ω}, if and only if all (𝒜, ℬ(ℝ^d))-measurable functions are constant.

    Solution.

    Assume that 𝒜 is trivial and let f: Ω → ℝ^d be (𝒜, ℬ(ℝ^d))-measurable. Then there exists y ∈ f(Ω), and therefore

    f^{-1}({y}) ∈ 𝒜 = {∅, Ω}.

    Since f^{-1}({y}) ≠ ∅, it follows that f^{-1}({y}) = Ω, which implies f(ω) = y for all ω∈Ω.

    Assume on the other hand that 𝒜 is not trivial. Then there exists some set A ∈ 𝒜 with ∅ ≠ A ≠ Ω, and 𝟙_A is measurable but not constant. ∎

Let X_1, X_2, … be a sequence of random variables with τ_n = σ(X_n, X_{n+1}, …) and tail σ-algebra τ = ⋂_{n∈ℕ} τ_n.

  1. (ii)

    Prove that lim inf_{n→∞} X_n and lim sup_{n→∞} X_n are τ-measurable.

    Solution.

    It is sufficient to prove measurability on a generator. In particular it is enough if the preimages of the sets (-∞, λ] for all λ∈ℝ are measurable. Now for every N∈ℕ we have

    {lim inf_{n→∞} X_n ≤ λ} = ⋂_{m=1}^∞ ⋂_{n=1}^∞ ⋃_{k≥n} {X_k < λ + 1/m}
    = ⋂_{m=1}^∞ ⋂_{n=N}^∞ ⋃_{k≥n} {X_k < λ + 1/m} ∈ τ_N,

    since the unions ⋃_{k≥n} {X_k < λ + 1/m} decrease in n and lie in τ_n ⊆ τ_N for n ≥ N. As N∈ℕ was arbitrary, the set is in τ. The analogous identity for {lim sup_{n→∞} X_n ≥ λ} (replace < λ + 1/m by > λ - 1/m) proves the same result for the lim sup. ∎

  2. (iii)

    Prove that

    C := {ω∈Ω : lim_{n→∞} X_n(ω) exists} ∈ τ.
    Solution.

    As

    Y := lim sup_{n→∞} X_n - lim inf_{n→∞} X_n

    is τ-measurable by (ii), we have

    C = {lim inf_{n→∞} X_n = lim sup_{n→∞} X_n} = Y^{-1}({0}) ∈ τ. ∎
Exercise 4 (Gambler’s ruin).

Let (X_n)_{n≥1} be a sequence of i.i.d. random variables with ℙ(X_1=1) = 1 - ℙ(X_1=-1) = p for some p∈(0,1), p≠1/2. Let a, b ∈ ℕ with 0 < a < b. Define S_0 := a and, for every n≥1, S_n := S_{n-1} + X_n. Finally, define the following stopping time:

T := inf{n≥0 : S_n = 0 or S_n = b}.

We consider the filtration generated by (X_n)_{n≥1}.

  1. (i)

    Show that 𝔼[T] < ∞.

    Solution.

    For every n≥0 we have ℙ(T ≤ n+b | ℱ_n) ≥ min(p^b, (1-p)^b) > 0, since b consecutive steps in the same direction take the walk from any state in (0,b) to 0 or b. We deduce from Sheet 5, Exercise 5 that 𝔼[T] < ∞. ∎

  2. (ii)

    Consider for every n≥0,

    M_n := ((1-p)/p)^{S_n}  and  N_n := S_n - n(2p-1).

    Prove that (M_n)_{n≥0} and (N_n)_{n≥0} are martingales.

    Solution.

    (S_n)_{n≥0} is adapted to the filtration (ℱ_n)_{n≥0} generated by (X_n)_{n≥1}, and then so are (M_n)_{n≥0} and (N_n)_{n≥0}. They are integrable: this is clear for (N_n)_{n≥0}, and for (M_n)_{n≥0} it follows from |S_n| ≤ a+n for every n≥0. Then for every n≥0,

    𝔼[M_{n+1} | ℱ_n] = M_n 𝔼[((1-p)/p)^{X_{n+1}}] = M_n [p · (1-p)/p + (1-p) · p/(1-p)] = M_n,

    and

    𝔼[N_{n+1} - N_n | ℱ_n] = 𝔼[X_{n+1}] - (2p-1) = [p - (1-p)] - (2p-1) = 0. ∎
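Both one-step computations reduce to elementary identities in p; a quick check over an arbitrary grid of p values (excluding p = 1/2, as in the exercise) confirms them:

```python
# Verify the one-step identities behind the martingale properties of M_n and N_n.
for p in [0.1, 0.25, 0.4, 0.6, 0.75, 0.9]:
    q = (1 - p) / p
    # E[q^{X_{n+1}}] = p * q + (1 - p) * q^{-1}, which should equal 1.
    step_M = p * q + (1 - p) / q
    # E[X_{n+1}] - (2p - 1) = [p - (1 - p)] - (2p - 1), which should equal 0.
    step_N = (p - (1 - p)) - (2 * p - 1)
    print(f"p = {p}: E[q^X] = {step_M:.12f}, drift gap = {step_N:.1e}")
```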
  3. (iii)

    Deduce the values of (ST=0) and (ST=b).

    Solution.

    We apply the optional stopping theorem to obtain that (M_{n∧T})_{n≥0} is a martingale. Since T < ∞ a.s., we have lim_{n→∞} M_{n∧T} = M_T a.s. We also notice that this process is bounded:

    |M_{n∧T}| ≤ max(((1-p)/p)^b, 1) for all n∈ℕ.

    By dominated convergence,

    𝔼[M_T] = lim_{n→∞} 𝔼[M_{n∧T}] = 𝔼[M_0].

    That is,

    ((1-p)/p)^a = 𝔼[((1-p)/p)^{S_0}] = 𝔼[((1-p)/p)^{S_T}] = ℙ(S_T=0) + ((1-p)/p)^b ℙ(S_T=b). (1)

    Since ℙ(S_T=b) = 1 - ℙ(S_T=0), we get

    ℙ(S_T=b) = (((1-p)/p)^a - 1) / (((1-p)/p)^b - 1),  and  ℙ(S_T=0) = (((1-p)/p)^b - ((1-p)/p)^a) / (((1-p)/p)^b - 1).
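The hitting probabilities can be cross-checked by Monte Carlo. A sketch with arbitrary illustration parameters (p = 0.6, a = 3, b = 10, sample size, and seed are free choices, not prescribed by the exercise):

```python
import random

random.seed(3)
p, a, b = 0.6, 3, 10   # arbitrary parameters with p != 1/2 and 0 < a < b
N = 50_000

def hit_point(p, a, b):
    """Run the walk S_n from a until it hits 0 or b; return the hit state."""
    s = a
    while 0 < s < b:
        s += 1 if random.random() < p else -1
    return s

prob_hit_b = sum(hit_point(p, a, b) == b for _ in range(N)) / N

q = (1 - p) / p
exact = (q ** a - 1) / (q ** b - 1)   # P(S_T = b) from the optional stopping argument
print(f"P(S_T = b) ~ {prob_hit_b:.4f} (formula: {exact:.4f})")
```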
  4. (iv)

    Compute the value of 𝔼[T].

    Solution.

    Recall that the process (S_{n∧T})_{n≥0} is bounded by b. For every n≥0, we then get

    |S_{n∧T} - (2p-1)(n∧T)| ≤ b + |2p-1| T ∈ ℒ¹.

    Then by the dominated convergence theorem,

    a = 𝔼[N_0] = 𝔼[N_{n∧T}] = 𝔼[S_{n∧T} - (2p-1)(n∧T)] → 𝔼[S_T - (2p-1)T] as n→∞.

    Finally

    𝔼[T] = (𝔼[S_T] - a)/(2p-1) = b/(2p-1) · (((1-p)/p)^a - 1)/(((1-p)/p)^b - 1) - a/(2p-1). ∎
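The same simulation idea checks the expected hitting time against the closed form from (iii) and (iv) (again, p = 0.6, a = 3, b = 10, seed, and sample size are arbitrary illustration choices):

```python
import random

random.seed(4)
p, a, b = 0.6, 3, 10   # arbitrary parameters with p != 1/2 and 0 < a < b
N = 50_000

def hitting_time(p, a, b):
    """Number of steps until the walk started at a first hits 0 or b."""
    s, steps = a, 0
    while 0 < s < b:
        s += 1 if random.random() < p else -1
        steps += 1
    return steps

mean_T = sum(hitting_time(p, a, b) for _ in range(N)) / N

q = (1 - p) / p
prob_b = (q ** a - 1) / (q ** b - 1)        # P(S_T = b) from (iii)
exact_T = (b * prob_b - a) / (2 * p - 1)    # E[T] = (E[S_T] - a) / (2p - 1)
print(f"E[T] ~ {mean_T:.2f} (formula: {exact_T:.2f})")
```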
  5. (v)

    Let τ_0 := inf{n≥0 : S_n = 0} and τ_b := inf{n≥0 : S_n = b}. Compute the value of ℙ(τ_0 < ∞).

    Hint.

    Consider ℙ(τ_0 < τ_b) and let b → ∞.

    Solution.

    Due to

    T = inf{n≥0 : S_n = 0 or S_n = b} = τ_0 ∧ τ_b,

    we can apply (1) to get

    ((1-p)/p)^a = ℙ(S_T=0) + ((1-p)/p)^b ℙ(S_T=b)
    = ℙ(τ_0<τ_b) + ((1-p)/p)^b ℙ(τ_0>τ_b). (2)
    1. Case 1:

      If p>1/2, then (1-p)/p < 1. Letting b→∞, we have

      ((1-p)/p)^a = lim_{b→∞} [ℙ(τ_0<τ_b) + ((1-p)/p)^b ℙ(τ_0>τ_b)]
      = lim_{b→∞} ℙ(τ_0<τ_b)   (since ((1-p)/p)^b → 0)
      = ℙ(⋃_{b=1}^∞ {τ_0<τ_b})   (the events increase in b, since τ_b < τ_{b+1})
      ≥ ℙ(τ_0<∞),

      where the inequality follows from τ_b ≥ b-a, so τ_b → ∞ as b→∞, which implies

      {τ_0<∞} ⊆ ⋃_{b=1}^∞ {τ_0<τ_b}.

      As τ_0 ∧ τ_b = T < ∞ almost surely, and the countable union of the null sets {τ_0 ∧ τ_b = ∞} is still a null set, we have

      ℙ(⋂_{b=1}^∞ {τ_0 ∧ τ_b < ∞}) = 1.

      As intersections with probability-one sets do not change the probability, we also have

      ((1-p)/p)^a = ℙ(⋃_{b=1}^∞ {τ_0<τ_b} ∩ ⋂_{b=1}^∞ {τ_0 ∧ τ_b < ∞}) ≤ ℙ(τ_0<∞),

      because if ω is in the set in the middle, there exists b∈ℕ such that

      ω ∈ {τ_0<τ_b} ∩ {τ_0 ∧ τ_b < ∞} ⊆ {τ_0<∞}.

      Together, ℙ(τ_0<∞) = ((1-p)/p)^a.
    2. Case 2:

      If p<1/2, then (1-p)/p > 1. Reordering (2), we get

      0 = lim_{b→∞} ((1-p)/p)^{a-b} = lim_{b→∞} [((1-p)/p)^{-b} ℙ(τ_0<τ_b) + ℙ(τ_0>τ_b)]
      = lim_{b→∞} ℙ(τ_0>τ_b)   (since ((1-p)/p)^{-b} → 0)
      = ℙ(⋂_{b=1}^∞ {τ_0>τ_b})   (the events decrease in b, since τ_b < τ_{b+1})
      = 1 - ℙ(⋃_{b=1}^∞ {τ_0<τ_b} ∩ ⋂_{b=1}^∞ {τ_0 ∧ τ_b < ∞})
      ≥ 1 - ℙ(τ_0<∞)
      = ℙ(τ_0=∞) ≥ 0.

      In other words: ℙ(τ_0<∞) = 1. So in total

      ℙ(τ_0<∞) = 1 for p < 1/2,  and  ℙ(τ_0<∞) = ((1-p)/p)^a for p > 1/2.
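Since τ_0 can be infinite, a simulation has to truncate at a finite horizon; for p > 1/2 the positive drift makes hits after the horizon astronomically unlikely, so the truncated frequency approximates ℙ(τ_0 < ∞) well. A sketch with arbitrary illustration parameters (p = 0.7, a = 2; the horizon, sample size, and seed are free choices):

```python
import random

random.seed(5)
p, a = 0.7, 2     # arbitrary parameters with p > 1/2
N = 100_000
horizon = 500     # paths not hitting 0 within this many steps are treated
                  # as never hitting 0 (a truncation approximation)

def hits_zero(p, a, horizon):
    """Return True if the walk started at a reaches 0 within `horizon` steps."""
    s = a
    for _ in range(horizon):
        if s == 0:
            return True
        s += 1 if random.random() < p else -1
    return s == 0

ruin_freq = sum(hits_zero(p, a, horizon) for _ in range(N)) / N

exact = ((1 - p) / p) ** a   # P(tau_0 < infinity) for p > 1/2
print(f"P(tau_0 < inf) ~ {ruin_freq:.4f} (formula: {exact:.4f})")
```

For p < 1/2 the same simulation (with a larger horizon) sees essentially every path ruined, matching ℙ(τ_0 < ∞) = 1.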