# Homework set 4

## Question 1

Let \(A, B\subset \mathbb{R}\) be compact. We show that the product \(A\times B\) is sequentially compact in \(\mathbb{R}^2\). Consider any sequence \(((a_i, b_i))_{i \in \mathbb{N}}\) in \(A\times B\). Since \(A\) is compact, the projected sequence \((a_i)_{i\in\mathbb{N}}\) has a convergent subsequence \((a_{i_k})_{k\in\mathbb{N}}\). This gives a subsequence \(((a_{i_k}, b_{i_k}))_{k\in\mathbb{N}}\); projecting onto the second coordinate, we consider \((b_{i_k})_{k\in\mathbb{N}}\) in \(B\). Since \(B\) is compact, \((b_{i_k})_{k\in\mathbb{N}}\) has a further convergent subsequence \((b_{i_{k_l}})_{l\in\mathbb{N}}\). The corresponding \((a_{i_{k_l}})_{l\in\mathbb{N}}\), being a subsequence of a convergent sequence, still converges, and convergence in \(\mathbb{R}^2\) is coordinatewise, so \(((a_{i_{k_l}}, b_{i_{k_l}}))_{l\in\mathbb{N}}\) is a convergent subsequence of the original sequence in \(A\times B\). \(\square\)
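The two-stage extraction in the proof can be illustrated numerically. The sketch below uses an illustrative sequence of my own choosing in \([-1,1]^2\) and selects indices in two passes, mirroring the subsequence-of-a-subsequence construction.

```python
# A sequence in the compact product [-1, 1] x [-1, 1].
seq = [((-1) ** i, (-1) ** (i // 2)) for i in range(4000)]

# Stage 1: indices where the first coordinate is constantly 1
# (a convergent subsequence of the first coordinates).
stage1 = [i for i in range(4000) if seq[i][0] == 1]

# Stage 2: refine so the second coordinate also converges.
stage2 = [i for i in stage1 if seq[i][1] == 1]

# Along stage2 both coordinates are constant, so the corresponding
# subsequence of pairs converges (to (1, 1)).
```

Here `stage2` is infinite in spirit (every multiple of 4 qualifies), so the refined index set really is a subsequence.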

## Question 2

Fix \(x\in X\); recall \(d(x,A) := \inf_{y\in A} d(x,y)\). Either there is already \(y^*\in A\) with \(d(x,A) = d(x,y^*)\), in which case we are done, or by the definition of the infimum there is a sequence \((y_n)_{n\in\mathbb{N}}\) in \(A\) with \(\lim_{n\to\infty} d(x,y_n) = d(x, A)\). By compactness of \(A\), we find a subsequence \(n_1 < n_2 < \dots\) such that \(y_{n_k} \to y^*\) for some \(y^*\in A\). We claim that \(d(x,y^*) = d(x, A)\). Since \(y^*\in A\), we already have \(d(x,y^*) \geq d(x,A)\). For each \(k\in\mathbb{N}\), the triangle inequality gives \[ d(x,y^*) \leq d(x, y_{n_k}) + d(y_{n_k}, y^*) \] and taking the limit as \(k\to\infty\) we obtain \[ d(x,y^*) \leq d(x, A) + 0, \] which shows that \(d(x,y^*) = d(x, A)\). \(\square\)
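For a finite (hence compact) subset of \(\mathbb{R}\), the infimum in Q2 is attained and easy to compute; a minimal sketch (the helper name `nearest` is my own):

```python
def nearest(x, A):
    # Returns (d(x, A), y*) where d(x, A) = inf_{y in A} |x - y| is
    # attained at y* -- guaranteed by Q2, since a finite set is compact.
    y_star = min(A, key=lambda y: abs(x - y))
    return abs(x - y_star), y_star
```

For example, `nearest(0.0, [0.5, 1.5, 3.0])` finds the minimizer `0.5` at distance `0.5`.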

## Question 3

For \(A,B \in \mathcal{H}(X)\), define the Hausdorff distance as \[ d_H(A,B) := \max{\left\{ \sup_{x\in A} d(x,B), \sup_{y\in B} d(y,A) \right\}}. \]

The definition is symmetric in \(A\) and \(B\), so \(d_H(A,B) = d_H(B,A)\).

Nonnegativity, \(d_H(A,B)\geq 0\), is inherited from the nonnegativity of \(d(-,-)\).

Now we show \(A = B\) iff \(d_H(A,B) = 0\). Suppose \(A = B\). Then for any \(x\in A\), \(d(x,A) = \inf_{y\in A} d(x,y) = d(x,x) = 0\), so \(d_H(A,B) = \max{\left\{ 0,0 \right\}} = 0\). Conversely, assume \(d_H(A,B) = 0\). By nonnegativity of \(d(-,-)\), we have \[ \sup_{x\in A} d(x,B) = \sup_{y\in B} d(y, A) = 0 \] and, again by nonnegativity, \[ \forall x\in A.~ d(x,B) = 0 \text{ and } \forall y\in B.~ d(y,A) = 0. \] Now we can prove the equality. Let \(x\in A\), so \(d(x,B) = \inf_{y\in B} d(x,y) = 0\). By Q2 there exists \(y^* \in B\) such that \(d(x,y^*) = 0\), and by definiteness of \(d(-,-)\) we have \(x = y^*\), which shows \(x\in B\). Therefore \(A\subset B\). The proof of the reverse containment is the same, so \(A = B\).

Note that this property breaks down if \(A\) or \(B\) is not compact. Let \(A\) be the open unit ball in \(\mathbb{R}^k\) and \(B\) the closure of \(A\). For any \(x\in A\), \(d(x, B) = 0\) because \(A\subset B\). Also, for any \(x\in B\), \(d(x, A) = \inf_{y\in A} d(x,y) = 0\), in both the case \(x\in A\) and the case where \(x\) is a limit point of \(A\). So in this example \(d_H(A,B) = 0\) but by construction \(A \ne B\).

Now we show the triangle inequality. Let \(A,B,C\subset X\) be nonempty and bounded so that the distances below are finite; note that compactness is not assumed. Writing \(d(A,B) := \sup_{a\in A} d(a,B)\), we want to show \[ d_H(A,B) \leq d_H(A,C) + d_H(C,B). \] Since \(d_H(A,B) = \max{\left\{ d(A,B), d(B,A) \right\}}\), we assume without loss of generality that \(d_H(A,B) = d(A,B) = \sup_{a\in A} d(a,B)\); interchanging the roles of \(A\) and \(B\) covers the other case. Let \(a\in A\) be arbitrary. For every \(b\in B\) and \(c\in C\), the triangle inequality gives \(d(a,b) \leq d(a,c) + d(c,b)\), so for every \(c\in C\), \[\begin{align*} d(a,B) &\leq \inf_{b\in B} ( d(a,c) + d(c,b) ) \\ &= d(a,c) + \inf_{b\in B} d(c,b) \\ &= d(a,c) + d(c,B) \\ &\leq d(a,c) + d(C,B). \end{align*}\] Taking the infimum over \(c\in C\), \[\begin{align*} d(a,B) &\leq \inf_{c\in C} d(a,c) + d(C,B) \\ &= d(a,C) + d(C,B) \\ &\leq d(A,C) + d(C,B). \end{align*}\] Since \(a\in A\) was arbitrary, \[ d_H(A,B) = \sup_{a\in A} d(a,B) \leq d(A,C) + d(C,B) \leq d_H(A,C) + d_H(C,B). \tag*{$\square$}\] As compactness was never invoked, the triangle inequality holds whenever \(d_H\) between the sets is defined.
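The three metric axioms just proved can be sanity-checked numerically for finite subsets of \(\mathbb{R}\) (finite sets are compact, so they lie in \(\mathcal{H}(\mathbb{R})\)); the function names below are my own:

```python
def d_point_set(x, S):
    # d(x, S) = inf_{y in S} |x - y|; a min, since S is finite.
    return min(abs(x - y) for y in S)

def d_H(A, B):
    # d_H(A, B) = max( sup_{x in A} d(x, B), sup_{y in B} d(y, A) )
    return max(max(d_point_set(x, B) for x in A),
               max(d_point_set(y, A) for y in B))
```

For example, `d_H([0.0, 1.0], [0.0, 2.0])` evaluates to `1.0`: the point `1.0` of the first set is within distance `1.0` of the second, and `2.0` is within distance `1.0` of the first.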

# Homework set 5

## Rudin Chapter 3 Q1

We treat the general case of a complex sequence. Suppose \((s_n)\) converges, say to \(s\in\mathbb{C}\). To see that \((\left\lvert s_n \right\rvert)\) converges, let \(\varepsilon>0\); by convergence there exists \(N\in\mathbb{N}\) such that \[ n\geq N\implies \left\lvert s_n - s \right\rvert < \varepsilon. \] We claim that \(\left\lvert s_n \right\rvert \to \left\lvert s \right\rvert\): whenever \(n \geq N\), we have by Chapter 1 Q13 (from Homework 2) \[ \left\lvert \left\lvert s_n \right\rvert - \left\lvert s \right\rvert \right\rvert \leq \left\lvert s_n - s \right\rvert < \varepsilon. \tag*{$\square$}\]

The converse is false: consider the sequence \((s_n)\) with \(s_n = (-1)^n\) for each \(n\in\mathbb{N}\). Then \(\left\lvert s_n \right\rvert \to 1\) but \((s_n)\) oscillates and does not converge.

## Rudin Chapter 3 Q20

Suppose \((p_n)_{n\in\mathbb{N}}\) is a Cauchy sequence in a metric space \(X\) with a subsequence \(n_1 < n_2 < \dots\) such that \((p_{n_i})_{i\in\mathbb{N}} \to p\) in \(X\). We show the original sequence converges to \(p\); fix \(\varepsilon>0\). By convergence of the subsequence, there is \(M_0\in\mathbb{N}\) such that \[ m\geq M_0 \implies d(p_{n_m}, p) < \varepsilon/5. \] Because \((p_n)\) itself is Cauchy, there is \(M_1\in \mathbb{N}\) such that \[ n,m\geq M_1 \implies d(p_n, p_m) < \varepsilon/5. \] Now let \(N := \max(n_{M_0}, M_1)\). For any \(n\geq N\), let \(k\) be the smallest index such that \(n_k \geq n\); by the choice of \(N\), \(k \geq M_0\) and both \(n_k, n \geq M_1\). Then \[\begin{align*} d(p_n, p) &\leq d(p_n, p_{n_k}) + d(p_{n_k}, p) \\ &< \frac{\varepsilon}5 + \frac{\varepsilon}5 < \varepsilon\tag*{$\square$} \end{align*}\]

## Rudin Chapter 3 Q21

For each \(n\in\mathbb{N}\) we choose a point \(p_n \in E_n\). We first claim that \((p_n)_{n\in\mathbb{N}}\) is Cauchy. Given \(\varepsilon>0\), since \(\lim_{n\to\infty}\operatorname{diam}E_n = 0\) we can find \(N\in\mathbb{N}\) with \(\operatorname{diam}E_N < \varepsilon\); since the sets form a descending chain, for any \(m,n> N\) we have \(p_m, p_n\in E_N\), so \(d(p_m, p_n) \leq \operatorname{diam}E_N < \varepsilon\). Since our metric space \(X\) is complete, let \(p\in X\) be such that \((p_n)_{n\in\mathbb{N}} \to p\). We observe that \(p \in \bigcap E_n\): for each \(n\in\mathbb{N}\), the tail \(p_{n+1}, p_{n+2}, \dots\) lies in \(E_n\) (by the nesting) and converges to \(p\), and \(E_n\) is closed, so \(p\in E_n\).

Now we show that no other point can lie in the intersection. Suppose \(p' \ne p\) also lies in \(\bigcap E_n\). Choosing \(M\) large enough that \(\operatorname{diam}E_M < d(p',p)\) gives a contradiction, since both \(p\) and \(p'\) lie in \(E_M\). \(\square\)

## Q

Use the definition of Cauchy sequences to prove that if \((x_n)_{n\in\mathbb{N}}\) and \((y_n)_{n\in\mathbb{N}}\) are two Cauchy sequences of real numbers, then \((x_ny_n)_{n\in\mathbb{N}}\) is also a Cauchy sequence.

We first note that real Cauchy sequences are convergent, thus bounded. Let \(X>0\) be a bound for \((x_n)_{n\in\mathbb{N}}\) and \(Y>0\) a bound for \((y_n)_{n\in\mathbb{N}}\), so for any \(n\in\mathbb{N}\) we have \[ \left\lvert x_n \right\rvert < X, \left\lvert y_n \right\rvert < Y. \] To show that \((x_ny_n)_{n\in\mathbb{N}}\) is a Cauchy sequence, fix any \(\varepsilon>0\); we want to find \(N\in\mathbb{N}\) such that \[m,n>N \implies \left\lvert x_ny_n - x_my_m \right\rvert < \varepsilon. \tag{5.1}\] By the Cauchy criterion applied to \((x_n)_{n\in\mathbb{N}}\) and \((y_n)_{n\in\mathbb{N}}\), there exist \(N_1, N_2\in\mathbb{N}\) such that \[\begin{align*} m,n>N_1 &\implies \left\lvert x_n - x_m \right\rvert < \frac{\varepsilon}{2X} \\ m,n>N_2 &\implies \left\lvert y_n - y_m \right\rvert < \frac{\varepsilon}{2Y}. \end{align*}\] Choose \(N = \max(N_1, N_2)\); then for any \(m,n>N\), \[\begin{align*} \left\lvert x_ny_n - x_my_m \right\rvert &= \left\lvert (x_ny_n - x_ny_m) + (x_ny_m - x_my_m) \right\rvert \\ &\leq \left\lvert x_n(y_n - y_m) \right\rvert + \left\lvert y_m(x_n - x_m) \right\rvert \\ &= \left\lvert x_n \right\rvert\left\lvert y_n - y_m \right\rvert + \left\lvert y_m \right\rvert\left\lvert x_n - x_m \right\rvert \\ &< X\frac{\varepsilon}{2X} + Y\frac{\varepsilon}{2Y} \\ &= \varepsilon\tag*{$\square$} \end{align*}\]
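A quick numerical sanity check of the result, using the illustrative Cauchy sequences \(x_n = 1 + 1/n\) and \(y_n = 2 - 1/n\) (my own choice):

```python
# x_n = 1 + 1/n and y_n = 2 - 1/n are Cauchy; by the result above the
# products p_n = x_n * y_n should also be Cauchy, i.e. the oscillation
# max - min over a late tail should be small.
xs = [1 + 1 / n for n in range(1, 5001)]
ys = [2 - 1 / n for n in range(1, 5001)]
ps = [x * y for x, y in zip(xs, ys)]

tail = ps[1000:]                      # terms p_n for n > 1000
oscillation = max(tail) - min(tail)   # how far apart tail terms can be
```

Here \(p_n = 2 + 1/n - 1/n^2\), so the tail oscillation shrinks like \(1/n\).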

# Homework set 6

## Rudin Chapter 3 Q2

Observe that \[\begin{align*} \sqrt{n^2 + n} - n &= \frac{ n^2 + n - n^2 }{ \sqrt{n^2 + n} + n } \\ &= \frac{ 1 }{ \frac{\sqrt{n^2 + n}}n + 1 } \\ &= \frac{ 1 }{ \sqrt{1 + \frac1n} + 1 } \\ \end{align*}\] so the limit as \(n\to\infty\) is \(\frac12\). \(\square\)
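A numerical check, using the cancellation-free form from the last line of the computation:

```python
import math

def a(n):
    # sqrt(n^2 + n) - n rewritten as 1 / (sqrt(1 + 1/n) + 1),
    # which avoids subtracting two nearly equal large numbers.
    return 1 / (math.sqrt(1 + 1 / n) + 1)
```

For instance, `a(10**8)` is within \(10^{-6}\) of \(1/2\).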

## Question A

Define \[ S := {\left\{ f(x,y): x\in X, y\in Y \right\}}. \] We show that \[ \inf_{x\in X} \inf_{y\in Y} f(x,y) = \inf S. \] First, \[ S_1 := \inf_{x\in X} (\inf_{y\in Y} f(x,y)) \geq \inf S, \] since for each \(x\in X\) every value \(f(x,y)\) lies in \(S\), so \(\inf_{y\in Y} f(x,y) \geq \inf S\). To show the reverse inequality, we show that for all \(x\in X, y\in Y\), \(f(x,y) \geq S_1\); this is true because \[ f(x,y) \geq \inf_{y'\in Y} f(x,y') \geq \inf_{x'\in X} \inf_{y'\in Y} f(x',y') = S_1. \] A similar argument proves \(\inf S = S_2 := \inf_{y\in Y}\inf_{x\in X} f(x,y)\). \(\square\)
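A finite sanity check of the interchange; the function and the sets below are an arbitrary illustrative choice:

```python
# On finite sets every infimum is a minimum, so the identity
# inf_x inf_y f = inf_y inf_x f = inf S can be checked directly.
f = lambda x, y: (x - 1) ** 2 + x * y
X = [0, 1, 2]
Y = [-1, 0, 1]

inf_x_then_y = min(min(f(x, y) for y in Y) for x in X)  # S_1
inf_y_then_x = min(min(f(x, y) for x in X) for y in Y)  # S_2
inf_joint    = min(f(x, y) for x in X for y in Y)       # inf S
```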

In general we cannot interchange \(\liminf_{m\to\infty}\) and \(\liminf_{n\to\infty}\). If a limit exists, then \(\liminf\) and the limit agree, so a counterexample similar to the tutorial question applies: for \(a_{m,n} = \left(1/n\right)^{1/m}\), we get \(\liminf_{m\to\infty}\liminf_{n\to\infty} a_{m,n} = 0\) while \(\liminf_{n\to\infty}\liminf_{m\to\infty} a_{m,n} = 1\).

## Question B

Let \((x_n)_{n\in\mathbb{N}}\) be a real sequence and suppose \(x = \liminf_{n\to\infty} x_n\). By definition, \(x = \inf E\) where \(E\) is the set of subsequential limits. Since \(E\) is closed, \(x\) is itself a subsequential limit of \((x_n)_{n\in\mathbb{N}}\). Now let \(\varepsilon>0\); we want to show that there exists \(N\) such that for all \(n>N\), \(x_n > x - \varepsilon\). Suppose not. Then there is \(\varepsilon_0 > 0\) such that for all \(N\), there exists \(n>N\) with \(x_n \leq x - \varepsilon_0\). This lets us build a subsequence bounded above by \(x -\varepsilon_0\); any subsequential limit of this subsequence is at most \(x - \varepsilon_0 < x = \inf E\), contradicting the definition of \(x\).

Conversely, suppose \(x\) is a subsequential limit of \((x_n)_{n\in\mathbb{N}}\) with the property \[\forall\varepsilon>0.~ \exists N\in\mathbb{N}.~ \forall n > N.~ x_n > x - \varepsilon.\] We want to show that \(x = \inf E\). Since \(x\in E\) already, we only need to show that there cannot be an \(x' < x\) with \(x' \in E\). Suppose there were: then we have a subsequence \(n_1 < n_2 < \dots\) such that \((x_{n_i})_{i\in\mathbb{N}} \to x'\). Set \(r := (x - x')/2\); since the subsequence converges to \(x'\), we may pass to its tail and assume without loss of generality that \(\left\lvert x_{n_i} - x' \right\rvert < r\) for all \(i\). But then the property of \(x\) is violated for \(\varepsilon = r\): for every \(N\), we can find a term \(x_{n_k}\) of the subsequence with \(n_k > N\) and \[ \left\lvert x_{n_k} - x' \right\rvert < r, \] so in particular \[ x_{n_k} - x' < (x - x')/2, \] giving \(x_{n_k} < (x + x')/2 = x - r\), a contradiction. \(\square\)
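Question B can be checked numerically through the standard equivalent description \(\liminf_{n} x_n = \lim_{N\to\infty}\inf_{n\geq N} x_n\); the sequence below is an illustrative choice with \(\liminf = -1\):

```python
# x_n = (-1)^n (1 + 1/n): the odd-indexed subsequence tends to -1,
# and eventually x_n > -1 - eps, so liminf x_n = -1 as in Question B.
xs = [(-1) ** n * (1 + 1 / n) for n in range(1, 2001)]

def tail_inf(k):
    # inf of the stored tail x_{k+1}, x_{k+2}, ... (0-based index k);
    # these tail infima increase toward the liminf.
    return min(xs[k:])
```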

## Question C

Let \(y_n := \frac{x_1 + \dots + x_n}{n}\) denote the Cesàro mean. Suppose \((x_n)_{n\in\mathbb{N}}\to x\), where \(x \in \overline{\mathbb{R}}\). Case \(x = \infty\): we want to show that \((y_n) \to \infty\). Fix any real \(M>0\); there is an \(N\in\mathbb{N}\) such that \[ n > N \implies x_n \geq 4M.\] Now consider the sequence \[ z_n = \frac\alpha{n} \text{ where } \alpha = x_1 + \dots + x_N, \] which clearly converges to \(0\). By convergence let \(L\in\mathbb{N}\) be such that \[n > L \implies \left\lvert z_n \right\rvert < M.\]

So for all sufficiently large \(n\) (\(n > \max(2N, L)\)), we have \[\begin{align*} y_n &= \frac{x_1 + x_2 + \dots + x_n}{n} \\ &= \frac{x_1 + \dots + x_N}{n} + \frac{x_{N+1} + \dots + x_n}{n} \\ &\geq z_n + \frac{(n-N)(4M)}{n} \\ &> z_n + \frac{(n/2)(4M)}{n} \qquad (\text{since } n > 2N \text{ gives } n - N > n/2) \\ &= z_n + 2M \\ &> -M + 2M = M. \end{align*}\] The case \(x = -\infty\) is symmetric.

Case \(x\in\mathbb{R}\): we show \(y_n \to x\). Fix any \(\varepsilon>0\); by convergence of \((x_n)\), we can find \(N\in\mathbb{N}\) such that \[ n>N \implies \left\lvert x_n - x \right\rvert < \varepsilon/2. \]

Again we consider the sequence \[z'_n = \frac{\alpha'}{n} \text{ where } \alpha' = (x_1 - x) + (x_2 - x) + \dots + (x_N - x). \] Similarly let \(L'\in\mathbb{N}\) be such that \[ n > L' \implies \left\lvert z'_n \right\rvert < \varepsilon/2. \]

Then for any sufficiently large \(n\) (\(n > \max(N,L')\)) \[\begin{align*} \left\lvert y_n - x \right\rvert &= \left\lvert \frac{x_1 + \dots + x_N + x_{N+1} + \dots + x_n}{n} - \frac{\overbrace{x + \dots + x}^{n \text{ copies}}}{n} \right\rvert \\ &= \left\lvert \frac{(x_1-x) + \dots + (x_N-x) + (x_{N+1}-x) + \dots + (x_n-x)}n \right\rvert \\ &\leq \left\lvert z'_n \right\rvert + \left\lvert \frac{(x_{N+1}-x) + \dots + (x_n-x)}n \right\rvert \\ &< \frac{\varepsilon}2 + \frac{(n-N)(\varepsilon/2)}{n} \\ &\leq \frac{\varepsilon}2 + \frac{n(\varepsilon/2)}{n} \\ &= \frac{\varepsilon}2 + \frac{\varepsilon}2 = \varepsilon \tag*{$\square$} \end{align*}\]

As for the counterexample, consider \(x_n = (-1)^n\), which does not converge, while the Cesàro averages of this sequence converge to \(0\).
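A quick numerical illustration of this counterexample:

```python
# x_n = (-1)^n does not converge, but its Cesaro means
# y_n = (x_1 + ... + x_n) / n tend to 0.
N = 10000
partial = 0
means = []
for n in range(1, N + 1):
    partial += (-1) ** n
    means.append(partial / n)
```

Past \(n = 100\) every mean is within \(0.01\) of \(0\), since the partial sums stay in \(\{-1, 0\}\) and so \(\lvert y_n \rvert \leq 1/n\).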