Sheet 8

Prof. Leif Döring, Felix Benning
Course: Wahrscheinlichkeitstheorie 1
Semester: FSS 2022
Tutorial date: 25.04.2022
Due: 10:15 in the exercise on Monday, 25.04.2022
Exercise 1 (Complex Integration).

Let $(\Omega,\mathcal{A},\mu)$ be a measure space and $f,g\colon\Omega\to\mathbb{C}$ integrable functions. Show that

  1. (i)

\[
\int_\Omega af+g\,d\mu = a\int_\Omega f\,d\mu + \int_\Omega g\,d\mu, \quad \text{for all } a\in\mathbb{C}
\]

    Solution.

Since $a\in\mathbb{C}$, there exist $x,y\in\mathbb{R}$ such that $a=x+iy$. Hence we can compute

\begin{align*}
\int_\Omega af+g\,d\mu &= \int_\Omega (x+iy)\bigl(\operatorname{Re}(f)+i\operatorname{Im}(f)\bigr)+\bigl(\operatorname{Re}(g)+i\operatorname{Im}(g)\bigr)\,d\mu \\
&\overset{\text{sort}}{=} \int_\Omega \underbrace{x\operatorname{Re}(f)-y\operatorname{Im}(f)+\operatorname{Re}(g)}_{\text{real part}}+i\,\underbrace{\bigl(y\operatorname{Re}(f)+x\operatorname{Im}(f)+\operatorname{Im}(g)\bigr)}_{\text{imaginary part}}\,d\mu \\
&\overset{\text{def}}{=} \int_\Omega x\operatorname{Re}(f)-y\operatorname{Im}(f)+\operatorname{Re}(g)\,d\mu + i\int_\Omega y\operatorname{Re}(f)+x\operatorname{Im}(f)+\operatorname{Im}(g)\,d\mu \\
&\overset{\text{lin. real int.}}{=} x\int_\Omega \operatorname{Re}(f)\,d\mu - y\int_\Omega \operatorname{Im}(f)\,d\mu + \int_\Omega \operatorname{Re}(g)\,d\mu + i\Bigl[y\int_\Omega \operatorname{Re}(f)\,d\mu + x\int_\Omega \operatorname{Im}(f)\,d\mu + \int_\Omega \operatorname{Im}(g)\,d\mu\Bigr] \\
&\overset{\text{sort}}{=} (x+iy)\Bigl[\int_\Omega \operatorname{Re}(f)\,d\mu + i\int_\Omega \operatorname{Im}(f)\,d\mu\Bigr] + \int_\Omega \operatorname{Re}(g)+i\operatorname{Im}(g)\,d\mu \\
&\overset{\text{def}}{=} a\int_\Omega f\,d\mu + \int_\Omega g\,d\mu.
\end{align*}
  2. (ii)

\[
\operatorname{Re}\Bigl(\int_\Omega f\,d\mu\Bigr)=\int_\Omega \operatorname{Re}(f)\,d\mu, \qquad \operatorname{Im}\Bigl(\int_\Omega f\,d\mu\Bigr)=\int_\Omega \operatorname{Im}(f)\,d\mu
\]

    Solution.
\begin{align*}
\int_\Omega f\,d\mu &= \int_\Omega \operatorname{Re}(f)+i\operatorname{Im}(f)\,d\mu \\
&= \underbrace{\int_\Omega \operatorname{Re}(f)\,d\mu}_{=\operatorname{Re}(\int_\Omega f\,d\mu)} + i\,\underbrace{\int_\Omega \operatorname{Im}(f)\,d\mu}_{=\operatorname{Im}(\int_\Omega f\,d\mu)},
\end{align*}

where the identification of real and imaginary part is valid because both integrals are real.
  3. (iii)

\[
\overline{\int_\Omega f\,d\mu}=\int_\Omega \bar{f}\,d\mu
\]

    Solution.
\begin{align*}
\overline{\int_\Omega f\,d\mu} &= \overline{\int_\Omega \operatorname{Re}(f)\,d\mu + i\int_\Omega \operatorname{Im}(f)\,d\mu} \\
&= \int_\Omega \operatorname{Re}(f)\,d\mu - i\int_\Omega \operatorname{Im}(f)\,d\mu \\
&= \int_\Omega \underbrace{\operatorname{Re}(f)-i\operatorname{Im}(f)}_{=\bar{f}}\,d\mu.
\end{align*}
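All three identities can be sanity-checked numerically on a finite measure space, where the integral reduces to a weighted sum. A minimal sketch in numpy, with randomly generated weights and functions as hypothetical example data:

```python
import numpy as np

# Finite measure space Omega = {0,...,4}: the integral of h against mu
# is the weighted sum of h, so all three identities can be checked directly.
rng = np.random.default_rng(0)
mu = rng.random(5)                      # point masses (example data)
f = rng.random(5) + 1j * rng.random(5)  # integrable f: Omega -> C
g = rng.random(5) + 1j * rng.random(5)
a = 2.0 - 3.0j

def integral(h):
    """Integral over the discrete measure: sum of h weighted by mu."""
    return np.sum(h * mu)

# (i) complex linearity
assert np.isclose(integral(a * f + g), a * integral(f) + integral(g))
# (ii) real and imaginary part commute with the integral
assert np.isclose(integral(f).real, integral(f.real))
assert np.isclose(integral(f).imag, integral(f.imag))
# (iii) conjugation commutes with the integral
assert np.isclose(np.conj(integral(f)), integral(np.conj(f)))
```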
Exercise 2 (Characteristic Functions).
  1. (i)

Let $X$ be a random variable on $\mathbb{R}^d$ with characteristic function $\varphi_X$. For $a\in\mathbb{R}$ and $b\in\mathbb{R}^d$, show that the characteristic function of $aX+b$ is $\varphi_{aX+b}(t)=\varphi_X(at)\,e^{i\langle b,t\rangle}$.

    Solution.

Using (complex) linearity of the integral and bilinearity of the scalar product, we compute

\begin{align*}
\varphi_{aX+b}(t) &= \mathbb{E}\bigl[e^{i\langle t,\,aX+b\rangle}\bigr] \overset{\text{bilinear}}{=} \mathbb{E}\bigl[e^{i(\langle at,X\rangle+\langle t,b\rangle)}\bigr] \overset{\text{linear}}{=} e^{i\langle t,b\rangle}\,\mathbb{E}\bigl[e^{i\langle at,X\rangle}\bigr] \\
&= e^{i\langle t,b\rangle}\,\varphi_X(at).
\end{align*}
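The identity is easy to verify numerically in dimension $d=1$. A sketch for $X\sim\mathcal{N}(0,1)$, approximating both expectations by a Riemann sum over a truncated grid (the values of $a$, $b$, $t$ are arbitrary examples):

```python
import numpy as np

# Check phi_{aX+b}(t) = e^{itb} phi_X(at) for X ~ N(0,1), d = 1.
x = np.linspace(-10, 10, 20001)          # integration grid
dx = x[1] - x[0]
pdf = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def cf(y, t):
    """E[e^{itY}] for Y = y(X), approximated by a Riemann sum."""
    return np.sum(np.exp(1j * t * y) * pdf) * dx

a, b, t = 1.7, -0.4, 0.9                 # arbitrary example values
lhs = cf(a * x + b, t)                   # phi_{aX+b}(t)
rhs = np.exp(1j * t * b) * cf(x, a * t)  # e^{itb} phi_X(at)
assert np.isclose(lhs, rhs, atol=1e-6)
```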
  2. (ii)

Let $X$ be a random variable on $\mathbb{R}^d$ and $Y$ a random variable on $\mathbb{R}^k$. Prove that $X$ is independent of $Y$ if

\[
\mathbb{E}\bigl[\exp\bigl(i(\langle t,X\rangle+\langle s,Y\rangle)\bigr)\bigr]=\varphi_X(t)\,\varphi_Y(s), \quad t\in\mathbb{R}^d,\ s\in\mathbb{R}^k.
\]
    Hint.

The characteristic function determines the distribution uniquely! Pick a different random vector $(\tilde X,\tilde Y)$ with $\tilde X \overset{(d)}{=} X$ and $\tilde Y \overset{(d)}{=} Y$ and assume independence.

    Solution.

Construct a different random vector $(\tilde X,\tilde Y)$ with $\tilde X \overset{(d)}{=} X$ and similarly $\tilde Y \overset{(d)}{=} Y$ on a product space to guarantee independence. Then we have due to independence

\begin{align*}
\mathbb{E}\bigl[\exp\bigl(i(\langle t,\tilde X\rangle+\langle s,\tilde Y\rangle)\bigr)\bigr] &= \mathbb{E}\bigl[\exp(i\langle t,\tilde X\rangle)\bigr]\,\mathbb{E}\bigl[\exp(i\langle s,\tilde Y\rangle)\bigr] \\
&= \varphi_X(t)\,\varphi_Y(s), \quad t\in\mathbb{R}^d,\ s\in\mathbb{R}^k.
\end{align*}

By assumption this is also the characteristic function of $(X,Y)$, so $(X,Y)$ has the same distribution as $(\tilde X,\tilde Y)$ by unique determination. But then $X$ and $Y$ are independent by construction of $\tilde X,\tilde Y$. ∎

Exercise 3 (One-Point Compactification).

We consider the space $([0,\infty],d)$ with

\[
d(x,y):=\bigl|e^{-x}-e^{-y}\bigr|,
\]

where we use the convention $e^{-\infty}:=0$.

Prove that

  1. (i)

    d is a metric

    Solution.

    We check the requirements:

• (Positive Definiteness) For $x=y$ we have $d(x,y)=0$, but for $x\neq y$ we have $e^{-x}\neq e^{-y}$, since $x\mapsto e^{-x}$ is injective on $[0,\infty]$, and therefore $d(x,y)>0$.

• (Symmetry) Clear, as $|a|=|-a|$.

• (Triangle Inequality) By the triangle inequality of the absolute value we have for $x,y,z\in[0,\infty]$

\[
d(x,z) = \bigl|e^{-x}-e^{-y}+e^{-y}-e^{-z}\bigr| \le d(x,y)+d(y,z).
\]
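The three axioms can also be spot-checked numerically on random triples from $[0,\infty]$, with numpy's floating-point infinity standing in for the added point $\infty$ (a sketch; the sampled points are arbitrary example data):

```python
import numpy as np

# Spot-check the metric axioms of d(x,y) = |e^{-x} - e^{-y}| on [0, inf].
rng = np.random.default_rng(1)

def d(x, y):
    return abs(np.exp(-x) - np.exp(-y))   # np.exp(-inf) = 0.0

points = list(rng.uniform(0, 50, 20)) + [np.inf]  # include the added point
for x in points:
    for y in points:
        assert d(x, y) == d(y, x)                 # symmetry
        assert (d(x, y) == 0) == (x == y)         # positive definiteness
        for z in points:
            # triangle inequality (small epsilon for float rounding)
            assert d(x, z) <= d(x, y) + d(y, z) + 1e-15
```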
  2. (ii)

    and the space is compact.

    Hint.

    Sequences.

    Solution.

We show that any sequence $(x_n)_{n\in\mathbb{N}}\subseteq[0,\infty]$ has a convergent subsequence.

    1. Case 1:

If there is an infinite number of indices with $x_n=\infty$, then these already form a convergent (constant) subsequence. If there are only finitely many, we can discard them and therefore assume w.l.o.g. $(x_n)\subseteq[0,\infty)$.

    2. Case 2:

If the sequence is bounded in $[0,\infty)$, it has a subsequence converging in $[0,\infty)$ by Bolzano–Weierstrass, which then also converges with respect to $d$ due to the continuity of $x\mapsto e^{-x}$.

    3. Case 3:

If the sequence is unbounded in $[0,\infty)$, we can find a subsequence such that $x_{n_k}>k$ for all $k$. This subsequence converges to $\infty$ in $[0,\infty]$ because

\[
d(x_{n_k},\infty) = \bigl|e^{-x_{n_k}} - \underbrace{e^{-\infty}}_{=0}\bigr| \le e^{-k} \to 0 \quad (k\to\infty).
\]
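The unbounded case can be illustrated numerically: for $x_n=n$ the distances $d(x_n,\infty)=e^{-n}$ decrease to $0$, so $x_n\to\infty$ with respect to $d$. A quick sketch:

```python
import numpy as np

# The unbounded case of the proof: x_n = n converges to infinity in d,
# since d(n, inf) = |e^{-n} - e^{-inf}| = e^{-n} -> 0.
def d(x, y):
    return abs(np.exp(-x) - np.exp(-y))   # np.exp(-inf) = 0.0

dists = [d(n, np.inf) for n in range(1, 30)]
assert all(a > b for a, b in zip(dists, dists[1:]))  # strictly decreasing
assert dists[-1] < 1e-12                             # d(n, inf) -> 0
```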
Exercise 4 (Gaussian Vectors).

For any $k\ge 1$, a random variable $Z:=(Z_1,\dots,Z_k)$ on $\mathbb{R}^k$ is called a centred Gaussian vector if every linear combination of $Z_1,\dots,Z_k$ is a centred Gaussian random variable on $\mathbb{R}$.

  1. (i)

Prove that $Z$ is a centred Gaussian vector if and only if

\[
\mathbb{E}\bigl[\exp(i\langle t,Z\rangle)\bigr]=\exp\bigl(-\tfrac{1}{2}t^T M t\bigr), \quad t\in\mathbb{R}^k,
\]

where $M$ is the $k\times k$ matrix with $M_{ij}=\operatorname{Cov}(Z_i,Z_j)$.

    Hint.

Recall that the characteristic function of $\mathcal{N}(0,\sigma^2)$ is $\varphi(t)=\exp(-\tfrac{1}{2}\sigma^2 t^2)$.

    Solution.
“$\Rightarrow$”:

Suppose that $Z$ is a centered Gaussian random vector. Then $\langle t,Z\rangle$ is by definition a centered Gaussian random variable, as a linear combination of the $Z_i$. But we know the characteristic function of the univariate Gaussian distribution, i.e. we have

\[
\mathbb{E}\bigl[\exp(i\langle t,Z\rangle)\bigr]=\exp\bigl(-\tfrac{1}{2}\sigma_t^2\bigr),
\]

where $\sigma_t^2$ denotes the variance of $\langle t,Z\rangle$. And as it is centered, we have

\[
\sigma_t^2=\mathbb{E}\bigl[\langle t,Z\rangle^2\bigr]=\mathbb{E}\bigl[\langle t,Z\rangle\langle Z,t\rangle\bigr]=\mathbb{E}\bigl[t^T Z Z^T t\bigr]=t^T\,\mathbb{E}\bigl[Z Z^T\bigr]\,t=t^T\operatorname{Cov}(Z)\,t=t^T M t.
\]
“$\Leftarrow$”:

As the statement holds for all $t$, we can add a parameter $\lambda\in\mathbb{R}$ and get

\[
\mathbb{E}\bigl[\exp(i\lambda\langle t,Z\rangle)\bigr]=\exp\bigl(-\tfrac{1}{2}\lambda^2\,t^T M t\bigr), \quad \lambda\in\mathbb{R}.
\]

But the left-hand side is the characteristic function in $\lambda$ of $\langle t,Z\rangle$. So $\langle t,Z\rangle$ is a centered Gaussian random variable with variance $t^T M t$ (by unique determination of the distribution through the characteristic function). Since this holds for any $t$, i.e. for any linear combination, $Z$ is a centered Gaussian vector by definition. ∎
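The characterisation can be checked by Monte Carlo simulation: sample a centred Gaussian vector with a given covariance matrix and compare the empirical characteristic function with $\exp(-\tfrac{1}{2}t^T M t)$. A sketch, where the covariance matrix $M$ and the point $t$ are arbitrary example choices:

```python
import numpy as np

# Monte Carlo check of E[exp(i<t,Z>)] = exp(-t^T M t / 2) for a
# centred Gaussian vector with covariance M.
rng = np.random.default_rng(2)
M = np.array([[2.0, 0.8],
              [0.8, 1.0]])                     # positive definite example covariance
Z = rng.multivariate_normal(np.zeros(2), M, size=400_000)

t = np.array([0.5, -0.3])
empirical = np.mean(np.exp(1j * Z @ t))        # empirical E[exp(i<t,Z>)]
theoretical = np.exp(-0.5 * t @ M @ t)
assert abs(empirical - theoretical) < 0.02
```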

  2. (ii)

Let $(X,Y)$ be a centred Gaussian vector. Prove that $X$ is independent of $Y$ if and only if $\operatorname{Cov}(X,Y)=0$.

    Solution.

Since independent (square-integrable) random variables are always uncorrelated, we only need to show that uncorrelated random variables are independent here. By Exercise 2(ii) we know that $X$ and $Y$ are independent if and only if

\[
\mathbb{E}\bigl[\exp(i(tX+sY))\bigr]=\varphi_X(t)\,\varphi_Y(s), \quad t,s\in\mathbb{R}.
\]

But $\operatorname{Cov}(X,Y)=0$ implies by (i)

\begin{align*}
\mathbb{E}\bigl[\exp(i(tX+sY))\bigr] &= \exp\biggl(-\frac{1}{2}\,(t,s)\begin{pmatrix}\operatorname{Var}(X)&0\\0&\operatorname{Var}(Y)\end{pmatrix}\begin{pmatrix}t\\s\end{pmatrix}\biggr) \\
&= \exp\Bigl(-\tfrac{1}{2}\bigl[t^2\operatorname{Var}(X)+s^2\operatorname{Var}(Y)\bigr]\Bigr) \\
&= \varphi_X(t)\,\varphi_Y(s). \qquad ∎
\end{align*}
Exercise 5 (Complex Analysis).

(Optional Difficult Easter Challenge)

Read about path integrals in complex analysis (Funktionentheorie), in particular Cauchy’s Integral Theorem.

  1. (i)

To calculate an integral over the path $\gamma_1\colon[0,1]\to\mathbb{C}$, $s\mapsto -(n+it)+2sn$, you could therefore find a path $\gamma_2$ which connects $n-it$ to $-n-it$. Because concatenating both paths results in a closed loop, we get by Cauchy’s integral theorem

\[
\int_{-n}^{n} f(y-it)\,dy \overset{x=y-it}{=} \int_{\gamma_1} f(x)\,dx = -\int_{\gamma_2} f(x)\,dx.
\]

    Use this insight to calculate the characteristic function of the standard normal distribution.

    Hint.

Draw boxes in $\mathbb{C}$, considering

\begin{align*}
\varphi_X(t) &= \frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}} e^{itx}e^{-\frac{x^2}{2}}\,dx = \frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}} e^{-\frac{(x-it)^2}{2}}e^{-\frac{t^2}{2}}\,dx \\
&= e^{-\frac{t^2}{2}}\frac{1}{\sqrt{2\pi}}\lim_{n\to\infty}\int_{-n}^{n} e^{-\frac{(x-it)^2}{2}}\,dx.
\end{align*}
    Solution.

    We have

\begin{align*}
\varphi_X(t) &= \frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}} e^{itx}e^{-\frac{x^2}{2}}\,dx = \frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}} e^{-\frac{(x-it)^2}{2}}e^{-\frac{t^2}{2}}\,dx \\
&= e^{-\frac{t^2}{2}}\frac{1}{\sqrt{2\pi}}\lim_{n\to\infty}\int_{-n}^{n} e^{-\frac{(x-it)^2}{2}}\,dx,
\end{align*}

and we therefore get by Cauchy’s Integral Theorem (since $z\mapsto e^{-\frac{z^2}{2}}$ is holomorphic),

\[
\int_{-n}^{n} e^{-\frac{(x-it)^2}{2}}\,dx = -\Biggl(\underbrace{\int_t^0 e^{-\frac{(n-is)^2}{2}}\,ds}_{=:I_1} + \underbrace{\int_n^{-n} e^{-\frac{x^2}{2}}\,dx}_{\to -\sqrt{2\pi}\;(n\to\infty)} + \underbrace{\int_0^t e^{-\frac{(-n-is)^2}{2}}\,ds}_{=:I_2}\Biggr).
\]

So if we can show that $\lim_{n\to\infty} I_1=0$ and $\lim_{n\to\infty} I_2=0$, we would be finished with

\[
\varphi_X(t)=e^{-\frac{t^2}{2}}.
\]

    And indeed we have

\[
|I_1| \le \int_0^t \Bigl|e^{-\frac{n^2-2ins-s^2}{2}}\Bigr|\,ds = \int_0^t \bigl|e^{-\frac{n^2}{2}}\bigr|\,\underbrace{\bigl|e^{ins}\bigr|}_{=1}\,\bigl|e^{\frac{s^2}{2}}\bigr|\,ds = e^{-\frac{n^2}{2}}\int_0^t e^{\frac{s^2}{2}}\,ds \le t\,e^{\frac{t^2}{2}}\,e^{-\frac{n^2}{2}} \to 0 \quad (n\to\infty).
\]

    And similarly for I2. ∎
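The result of (i) can be confirmed numerically by approximating the defining integral on a truncated grid (the truncation at $\pm 12$ and the chosen values of $t$ are arbitrary):

```python
import numpy as np

# Numerical check: the characteristic function of the standard normal
# is phi_X(t) = e^{-t^2/2}, via a Riemann sum of the defining integral.
x = np.linspace(-12, 12, 40001)
dx = x[1] - x[0]
pdf = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

for t in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    phi = np.sum(np.exp(1j * t * x) * pdf) * dx   # E[e^{itX}]
    assert np.isclose(phi, np.exp(-t**2 / 2), atol=1e-8)
```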

  2. (ii)

Use a similar approach to prove that $\varphi_X(t)=\frac{1}{1-it}$ for $X\sim\operatorname{Exp}(1)$.

    Hint.

Here simple boxes will not be enough. A particularly interesting path is the linear path connecting $(1-it)a$ to $(1-it)b$. Mind the derivative when substituting!

    Solution.

Call $\gamma_{a,b}$ the path linearly connecting $(1-it)a$ to $(1-it)b$. Then we have

\begin{align*}
\varphi_X(t) &= \int_0^\infty e^{itx}e^{-x}\,dx = \int_0^\infty e^{-(1-it)x}\,dx \\
&\overset{z=(1-it)x}{=} \frac{1}{1-it}\int_{\gamma_{0,\infty}} e^{-z}\,dz.
\end{align*}

    Now by closing the loop we get

\[
\int_a^b e^{-y}\,dy = \underbrace{\int_0^{at} e^{-(a-is)}\,ds}_{=:I_a} + \int_{\gamma_{a,b}} e^{-z}\,dz + \underbrace{\int_{bt}^0 e^{-(b-is)}\,ds}_{=-I_b}.
\]

And if we can similarly get rid of the connecting integrals $I_a$ and $I_b$ for $a\to 0$ and $b\to\infty$, then we immediately get our claim. And this is in fact the case:

\begin{align*}
|I_c| &= \Bigl|\int_0^{ct} e^{-(c-is)}\,ds\Bigr| \\
&\overset{s=cx}{=} \Bigl|\int_0^{t} e^{-(c-icx)}\,c\,dx\Bigr| \\
&\le c\int_0^t \underbrace{\bigl|e^{-c}\bigr|}_{=e^{-c}}\,\underbrace{\bigl|e^{icx}\bigr|}_{=1}\,dx \\
&= c\,t\,e^{-c} \to 0 \quad \text{for } (c\to 0) \text{ or } (c\to\infty). \qquad ∎
\end{align*}
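As a numerical sanity check of (ii), the defining integral can be approximated by the midpoint rule on a truncated grid (the truncation at $60$ and the test values of $t$ are arbitrary choices):

```python
import numpy as np

# Numerical check: for X ~ Exp(1), phi_X(t) = 1/(1 - it), via the
# midpoint rule for the integral of e^{itx} e^{-x} over [0, 60].
x = np.linspace(0, 60, 60001)
dx = x[1] - x[0]
xm = (x[:-1] + x[1:]) / 2                          # cell midpoints

for t in [-1.5, 0.0, 0.7, 2.0]:
    phi = np.sum(np.exp((1j * t - 1) * xm)) * dx   # integral of e^{itx} e^{-x}
    assert np.isclose(phi, 1 / (1 - 1j * t), atol=1e-5)
```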