
Section 1.3 Convergence and Completeness

The idea behind convergence of a sequence is that the distance to the limit approaches \(0\text{.}\) This concept has a natural extension to metric spaces.

Definition 1.29.

Let \((X,d)\) be a metric space and \((x_n)_{n \in \N}\) a sequence in \(X\text{.}\) A point \(x_0 \in X\) is said to be a limit of the sequence if \(d(x_n,x_0) \to 0\) in \(\R\) as \(n \to \infty\text{.}\) If so, we say that the sequence converges to \(x_0\) and we write \(x_0 = \lim_{n \to \infty} x_n\) or \(x_n \to x_0\text{.}\)

Example 1.30.

In \(\R\text{,}\) consider the sequence \((x_n)_{n \in \N}\) with \(x_n = \frac{1}{n}\text{.}\) Then \(d(x_n,0) = \frac{1}{n}\text{,}\) so \(x_n\) converges to \(0\text{.}\)
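As a quick numerical companion (an illustration, not part of the text), one can watch the distance \(d(x_n,0) = \frac1n\) drop below any prescribed \(\varepsilon\) once \(n\) is large enough:

```python
# Illustration only: for each epsilon, find the smallest N such that
# d(1/n, 0) < epsilon for all n >= N, where d is the usual metric on R.
def d(x, y):
    return abs(x - y)

def tail_index(epsilon):
    # since 1/n is decreasing, it suffices to find the first n with 1/n < epsilon
    n = 1
    while d(1.0 / n, 0.0) >= epsilon:
        n += 1
    return n

thresholds = {eps: tail_index(eps) for eps in (0.5, 0.1, 0.01)}
```

For instance, beyond \(N = 11\) every term is within \(0.1\) of the limit \(0\).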

Example 1.31.

In \(X = (0,1]\text{,}\) let \(x_n = \frac{1}{n}\text{.}\) Then there is no point \(x_0\) in \(X\) such that \(d(x_n,x_0) \to 0\) as \(n \to \infty\text{.}\) The sequence \((x_n)_{n \in \N}\) is not convergent in this space.

Convention 1.32. Notation for sequences.

Sequences are one of the main characters in this unit, and so it is important that we agree on clear and consistent notation for them.
  • On the board and on problem sheets and exams, we allow ourselves to drop the subscript “\(n \in \N\)” from \(\seq xn\) when it is clear from context.
  • It is sometimes convenient to indicate sequences by listing their terms with an ellipsis, \(\seq xn = (x_1,x_2,\ldots)\text{.}\)
  • There is unfortunately no standard notation for the phrase “\(\seq xn\) is a sequence in \(X\)”. It is incorrect to write \(\seq xn \in X\text{,}\) since the sequence \(\seq xn\) is not a point in \(X\text{,}\) and it is also incorrect to write \(\seq xn \subset X\text{,}\) as \(\seq xn\) is not a subset of \(X\) either. It is technically correct to write \(\seq xn \in X^\N\text{,}\) thinking of the sequence as a mapping \(\N \to X\text{,}\) but this is not common. Another uncommon but technically acceptable option is to write \(x_1,x_2,\ldots \in X\text{.}\)

Example 1.33. Uniform convergence.

Given a set \(S\text{,}\) consider the normed space \((B(S),\n\blank_{\sup})\) from Example 1.7. A sequence \((f_n)_{n \in \N}\) converges to \(f_0\) in \(B(S)\) if and only if
\begin{equation*} \forall \varepsilon \gt 0 \, \exists N \in \N \st \forall n \ge N \, \forall s \in S ,\; |f_n(s) - f_0(s)| \lt \varepsilon\text{.} \end{equation*}
This is also known as uniform convergence.
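To make the quantifiers concrete, here is a small numerical sketch (my own example, not the text's, with a finite set \(S\) so that the supremum is an honest maximum): the functions \(f_n(s) = s^n\) on \(S = \{0, 0.1, \dots, 0.5\}\) converge uniformly to the zero function, with sup-norm distance \((1/2)^n\).

```python
# Illustration only: uniform convergence of f_n(s) = s**n to f_0 = 0 on a
# finite set S (finite, so the supremum below is a maximum).
S = [k / 10 for k in range(6)]  # S = {0, 0.1, ..., 0.5}

def sup_dist(f, g):
    # the metric induced by the sup norm on B(S)
    return max(abs(f(s) - g(s)) for s in S)

f0 = lambda s: 0.0
sup_dists = [sup_dist(lambda s, n=n: s ** n, f0) for n in range(1, 11)]
# sup_dists[n-1] equals (1/2)**n, which tends to 0: the single N in the
# criterion works for all s in S simultaneously
```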

Proof.

Part i.

By the triangle inequality,
\begin{equation*} d(x_0,y_0) \le d(x_0,x_n) + d(y_0,x_n) \to 0 \end{equation*}
as \(n \to \infty\text{.}\) Hence \(d(x_0,y_0) = 0\text{,}\) which means that \(x_0 = y_0\text{.}\)

Part ii.

Recall: when we say that \((x_{n_k})_{k \in \N}\) is a subsequence of \((x_n)_{n \in \N}\text{,}\) this means that \((n_k)_{k \in \N}\) is a strictly increasing sequence of natural numbers. Let \(\varepsilon \gt 0\text{.}\) Since \(x_n \to x_0\) as \(n \to \infty\text{,}\) there exists \(N \in \N\) such that \(d(x_n,x_0) \lt \varepsilon\) for \(n \ge N\text{.}\) Moreover, there exists \(K \in \N\) such that \(n_k \ge N\) for \(k \ge K\text{.}\) Hence \(d(x_{n_k},x_0) \lt \varepsilon\) for \(k \ge K\text{,}\) which proves the convergence \(x_{n_k} \to x_0\) as \(k \to \infty\text{.}\)
The following criterion is very useful for checking whether or not a given set is closed.

Proof.

Suppose that \(S\) is closed and \((x_n)_{n \in \N}\) is a sequence in \(S\) with limit \(x_0 \in X\text{.}\) Then for any \(r \gt 0\text{,}\) there exists \(n \in \N\) with \(x_n \in B_r(x_0)\text{.}\) But since \(x_n \in S\text{,}\) this means that \(B_r(x_0) \cap S \ne \varnothing\text{;}\) so \(x_0 \in \overline S = S\text{.}\)
Now suppose that \(S\) is not closed. Then \(X \setminus S\) is not open. Hence there exists \(x_0 \in X \setminus S\) such that \(B_r(x_0) \cap S \ne \varnothing\) for every \(r \gt 0\text{.}\) Applying this observation with \(r = \frac{1}{n}\) for each \(n \in \N\text{,}\) we find points \(x_n \in B_{1/n}(x_0) \cap S\text{.}\) So \((x_n)_{n \in \N}\) is a sequence in \(S\) that converges to \(x_0\text{.}\) But \(x_0 \not\in S\text{,}\) so the condition from the theorem is not satisfied.
Thinking back to first and second year analysis, you may remember the Cauchy principle, which states that a sequence in \(\R\) is convergent if and only if it is a Cauchy sequence. The concept of a Cauchy sequence has an obvious generalisation to metric spaces.

Definition 1.36.

A sequence \((x_n)_{n \in \N}\) in a metric space \((X,d)\) is called a Cauchy sequence if
\begin{equation*} \forall \varepsilon \gt 0 \, \exists N \in \N \st \forall m,n \ge N ,\; d(x_m,x_n) \lt \varepsilon\text{.} \end{equation*}
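Note that the definition constrains all pairs \(m,n \ge N\text{,}\) not just consecutive terms, and the difference matters. A numerical sketch (with an example of my own choosing): the harmonic partial sums \(H_n\) have consecutive gaps tending to \(0\text{,}\) yet \(\abs{H_{2N} - H_N} \ge \frac12\) for every \(N\text{,}\) so the sequence is not Cauchy.

```python
# Illustration only: consecutive terms of the harmonic partial sums H_n get
# arbitrarily close, but the sequence is NOT Cauchy, since the distance
# between H_N and H_{2N} never drops below 1/2.
def H(n):
    # n-th harmonic partial sum H_n = 1 + 1/2 + ... + 1/n
    return sum(1.0 / k for k in range(1, n + 1))

gap_neighbour = [abs(H(n + 1) - H(n)) for n in (10, 100, 1000)]  # tends to 0
gap_double = [abs(H(2 * n) - H(n)) for n in (10, 100, 1000)]     # stays >= 1/2
```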

Proof.

Let \((x_n)_{n \in \N}\) be a convergent sequence in the metric space \((X,d)\text{.}\) Denote its limit by \(x_0\text{.}\) Fix \(\varepsilon \gt 0\text{.}\) Then there exists an \(N \in \N\) such that \(d(x_n,x_0) \lt \frac{\varepsilon}{2}\) for \(n \ge N\text{.}\) For \(m,n \ge N\text{,}\) we then conclude that
\begin{equation*} d(x_m,x_n) \le d(x_m,x_0) + d(x_n,x_0) \lt \varepsilon\text{.} \end{equation*}
Is the Cauchy principle true in metric spaces, too? In general, the answer is no. But in many cases, it is true, and we pay these spaces particular attention.

Definition 1.38.

A metric space \((X,d)\) is called complete if every Cauchy sequence in \(X\) is convergent. A complete normed space is called a Banach space and a complete inner product space is called a Hilbert space.
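A standard non-example, sketched numerically below with choices of my own: the rational numbers with the usual metric are not complete. The Newton iterates for \(\sqrt 2\) are rational and form a Cauchy sequence, but their limit \(\sqrt 2\) is not rational.

```python
# Illustration only: Newton's iteration x -> (x + 2/x)/2 produces *rational*
# numbers forming a Cauchy sequence whose limit, sqrt(2), is irrational.
from fractions import Fraction

x = Fraction(2)
iterates = [x]
for _ in range(5):
    x = (x + 2 / x) / 2
    iterates.append(x)

# consecutive iterates approach each other rapidly ...
gaps = [abs(iterates[k + 1] - iterates[k]) for k in range(5)]
# ... and x**2 approaches 2, but no rational x satisfies x**2 == 2
err = abs(iterates[-1] ** 2 - 2)
```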

Example 1.39.

The Euclidean spaces \(\R^n\) are complete by results from second year analysis. Hence they are Hilbert spaces.

Proof.

We already know that \(B(S)\) is a normed space. To verify that the space is complete, let \(\seq fn\) be a Cauchy sequence in \(B(S)\text{.}\) We must show that there exists \(f_0 \in B(S)\) such that \(\n{f_n-f_0}_{\sup} \to 0\) as \(n \to \infty\text{.}\)

Step 1: Defining \(f_0\).

Fix \(s \in S\text{.}\) For any given \(\varepsilon \gt 0\text{,}\) there exists \(N \in \N\) such that \(d(f_m,f_n) \lt \varepsilon\) when \(m,n \ge N\text{.}\) But then
\begin{equation*} |f_m(s) - f_n(s)| \le \|f_m - f_n\|_{\sup} \lt \varepsilon\text{.} \end{equation*}
Hence \((f_n(s))_{n \in \N}\) is a Cauchy sequence in \(\R\text{.}\) Since \(\R\) is complete, there exists a limit. Call the limit \(f_0(s)\text{.}\) Thus for every \(s \in S\) we obtain a number \(f_0(s)\text{,}\) and this gives rise to a function \(f_0 \maps S \to \R\text{.}\)

Step 2: \(f_0 \in B(S)\).

Since \(\seq fn\) is a Cauchy sequence, it is bounded by Exercise 1.3.3. Since \(B(S)\) is a normed space, this is equivalent (by Exercise 1.1.11) to
\begin{gather*} M = \sup_{n \in \N} {\n{f_n}_{\sup}} \lt \infty \text{.} \end{gather*}
Therefore, for any \(s \in S\text{,}\)
\begin{equation*} |f_0(s)| = \lim_{n \to \infty} |f_n(s)| \le M\text{.} \end{equation*}
This proves that \(f_0\) is bounded, i.e., that \(f_0 \in B(S)\text{.}\)

Step 3: \(f_n \to f_0\).

Fix \(\varepsilon \gt 0\text{,}\) and choose \(N \in \N\) such that \(\|f_m - f_n\|_{\sup}\lt \frac{\varepsilon}{2}\) when \(m,n \ge N\text{.}\) Then for any \(s \in S\text{,}\)
\begin{align*} |f_n(s) - f_0(s)| \amp= \left|f_n(s) - \lim_{m \to \infty} f_m(s)\right| \\ \amp= \lim_{m \to \infty} |f_n(s) - f_m(s)| \le \frac{\varepsilon}{2} \end{align*}
whenever \(n \ge N\text{.}\) Hence
\begin{equation*} \|f_n - f_0\|_{\sup} = \sup_{s \in S} |f_n(s) - f_0(s)| \le \frac{\varepsilon}{2} \lt \varepsilon \end{equation*}
for all \(n \ge N\text{.}\)
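The three steps can be mirrored numerically (an illustration with a finite \(S\) and a sequence of my own choosing, not the text's): take \(f_n(s) = s + \frac sn\) in \(B(S)\text{;}\) its pointwise limit is \(f_0(s) = s\text{,}\) which is bounded, and the sup-norm distance \(\frac1n\) tends to \(0\text{.}\)

```python
# Illustration of Steps 1-3 with a finite S (a toy example of my own):
# f_n(s) = s + s/n is a Cauchy sequence in B(S).
S = [k / 4 for k in range(5)]  # S = {0, 0.25, 0.5, 0.75, 1}

def f(n):
    return lambda s: s + s / n

# Step 1: for each fixed s, (f_n(s)) is Cauchy in R; its limit defines f_0(s) = s
f0 = lambda s: s

# Step 2: f_0 is bounded on S.  Step 3: the sup-norm distance tends to 0.
errors = [max(abs(f(n)(s) - f0(s)) for s in S) for n in range(1, 6)]
# errors[n-1] equals 1/n
```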

Proof.

Suppose that \(Y\) is closed and let \((x_n)_{n \in \N}\) be a Cauchy sequence in \(Y\text{.}\) Then it is also a Cauchy sequence in \(X\text{,}\) and it follows that it has a limit \(x_0\) in \(X\text{.}\) By Theorem 1.35, we know that \(x_0 \in Y\text{.}\) Since
\begin{equation*} d'(x_n,x_0) = d(x_n,x_0) \to 0\text{,} \end{equation*}
this means that \(x_n \to x_0\) in \(Y\text{.}\) So \(Y\) is complete.
Conversely, suppose that \(Y\) is not closed. Then by Theorem 1.35 again, there exists a sequence \((x_n)_{n \in \N}\) in \(Y\) that is convergent in \(X\) with a limit \(x_0 \in X \setminus Y\text{.}\) According to Theorem 1.37, this is a Cauchy sequence in \(X\) and therefore a Cauchy sequence in \(Y\) as well. But it cannot be convergent in \(Y\text{:}\) if it had a limit \(y_0 \in Y\text{,}\) then it would also converge to \(y_0\) in \(X\text{,}\) and the uniqueness of limits (Theorem 1.34) would give \(y_0 = x_0 \not\in Y\text{,}\) a contradiction. So \(Y\) is not complete.

Exercises

1. (PS3) Some basic limits.

Let \((X,d)\) be a metric space, and suppose that \(\seq xn\) and \(\seq yn\) are convergent sequences in \(X\) with \(x_n \to x_0 \in X\) and \(y_n \to y_0 \in X\text{.}\)
(a)
Show that \(d(x_n,y_n) \to d(x_0,y_0)\text{.}\)
Hint.
Use the triangle inequality to show \(d(x_0,y_0) \le d(x_0,x_n) + d(x_n,y_n) + d(y_n,y_0)\text{,}\) as well as the same inequality with the roles of \(0\) and \(n\) reversed. Then use this to estimate the difference \(\abs{d(x_n,y_n) - d(x_0,y_0)}\text{.}\)
Solution.
Applying the triangle inequality twice we find, for any \(n \in \N\text{,}\)
\begin{gather*} d(x_0,y_0) \le d(x_0,x_n) + d(x_n,y_n) + d(y_n,y_0), \end{gather*}
which rearranges to
\begin{gather*} d(x_0,y_0) - d(x_n,y_n) \le d(x_0,x_n) + d(y_n,y_0). \end{gather*}
Reversing the roles of \((x_0,y_0)\) and \((x_n,y_n)\) we similarly find
\begin{gather*} d(x_n,y_n) - d(x_0,y_0) \le d(x_n,x_0) + d(y_0,y_n). \end{gather*}
By the symmetry of the metric, the right hand sides of these two inequalities are the same, and so combining them we get
\begin{gather*} \abs{d(x_n,y_n) - d(x_0,y_0)} \le d(x_n,x_0) + d(y_0,y_n) \to 0 \end{gather*}
as \(n \to \infty\text{,}\) which is the desired convergence.
Comment 1.
Several students wrote statements like “\(d(x_n,y_n) \le d(x_0,y_0)\) as \(n \to \infty\)”. Not sure what this means (so don’t use it on an exam!) but my best guess is something like \(\limsup_{n\to\infty} d(x_n,y_n) \le d(x_0,y_0)\text{?}\)
Comment 2.
We could of course also have written out an argument using \(\varepsilon\)s and \(N\)s without too much additional effort, and many students did so. You will never lose marks on an exam for writing out a perfect \(\varepsilon\)-\(N\) proof, but on the other hand you may find that the argument in the official solution, which takes advantage of some basic facts about limits of sequences of real numbers, is a bit easier.
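The two-sided estimate in the solution can also be sanity-checked numerically; in the sketch below, the Euclidean plane and the particular sequences are my own choices, not part of the exercise.

```python
# Check |d(x_n, y_n) - d(x_0, y_0)| <= d(x_n, x_0) + d(y_0, y_n) in R^2
# with the Euclidean metric (illustration only).
import math

def d(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

x0, y0 = (0.0, 0.0), (3.0, 4.0)
lhs, rhs = [], []
for n in range(1, 51):
    xn = (1.0 / n, -1.0 / n)    # x_n -> x_0
    yn = (3.0 + 2.0 / n, 4.0)   # y_n -> y_0
    lhs.append(abs(d(xn, yn) - d(x0, y0)))
    rhs.append(d(xn, x0) + d(y0, yn))
# lhs[n] <= rhs[n] for every n, and both tend to 0
```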
(b)
Suppose that \((X,\n\blank)\) is in fact a normed space, and let \(\seq \alpha n\) be a convergent sequence in \(\R\) with \(\alpha_n \to \alpha_0 \in \R\text{.}\) Show that
  1. \(\n{x_n} \to \n{x_0}\text{,}\)
  2. \(x_n + y_n \to x_0 + y_0\) and
  3. \(\alpha_n x_n \to \alpha_0 x_0\text{.}\)
Hint.
For the first part, find a way to express \(\n{x_n}\) in terms of the metric \(d\text{.}\) For the other two parts you will want to use the triangle inequality, perhaps after adding and subtracting an appropriate ‘cross term’.
Solution.
Since \(\n{x_n} = d(x_n,0)\text{,}\) the limit in i follows from the previous part. For ii, we use the triangle inequality to estimate
\begin{align*} \n{(x_n+y_n) - (x_0+y_0)} \amp = \n{(x_n-x_0) + (y_n - y_0)}\\ \amp \le \n{x_n-x_0} + \n{y_n - y_0}\\ \amp \to 0\text{.} \end{align*}
For iii, we add and subtract \(\alpha_0 x_n\) and use the triangle inequality to estimate
\begin{align*} \n{\alpha_n x_n - \alpha_0 x_0} \amp = \n{\alpha_n x_n - \alpha_0 x_n + \alpha_0 x_n - \alpha_0 x_0}\\ \amp \le \n{(\alpha_n -\alpha_0)x_n} + \n{\alpha_0 (x_n-x_0)}\\ \amp = \abs{ \alpha_n -\alpha_0} \n{x_n} + \abs{\alpha_0} \n{x_n-x_0}\\ \amp \to 0 \cdot \n {x_0} + \abs{\alpha_0} \cdot 0\\ \amp = 0\text{,} \end{align*}
where in the second-to-last step we have used i.
Comment 1.
In the official solution we really did need to use i, in which case it is probably a good idea to mention this (e.g. on an exam). Depending on which term you add and subtract, though, this may not be an issue.
Comment 2.
As with the previous part, it is possible to write out an \(\varepsilon\)-\(N\) argument which accomplishes the same thing. For this one has to think much harder about inequalities, and either avoid dividing by things like \(\n{x_0}\) or \(\abs{\alpha_0}\) which might be zero or treat these as special cases.
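The three limit laws can be checked on a concrete example; the space \(\R^2\) with the Euclidean norm and the particular sequences below are choices of my own, purely for illustration.

```python
# Numerical check of (i)-(iii) in R^2 with the Euclidean norm (illustration only).
import math

def norm(v):
    return math.hypot(v[0], v[1])

def add(u, v):
    return (u[0] + v[0], u[1] + v[1])

def scale(a, v):
    return (a * v[0], a * v[1])

x0, y0, a0 = (1.0, 2.0), (-3.0, 0.5), 2.0
errs_i, errs_ii, errs_iii = [], [], []
for n in range(1, 101):
    xn = add(x0, (1 / n, 0.0))
    yn = add(y0, (0.0, -1 / n))
    an = a0 + 1 / n
    errs_i.append(abs(norm(xn) - norm(x0)))                      # (i)
    errs_ii.append(norm(add(add(xn, yn), scale(-1, add(x0, y0)))))  # (ii)
    errs_iii.append(norm(add(scale(an, xn), scale(-a0, x0))))    # (iii)
# each error sequence tends to 0
```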
(c)
Suppose that \((X,\scp\blank\blank)\) is an inner product space. Show that \(\scp{x_n}{y_n} \to \scp {x_0}{y_0}\text{.}\)
Hint.
Again, experiment with adding and subtracting an appropriate ‘cross term’. The Cauchy–Schwarz inequality is also useful.
Solution.
Adding and subtracting \(\scp{x_n}{y_0}\text{,}\) we estimate
\begin{align*} \abs{\scp{x_n}{y_n} - \scp{x_0}{y_0}} \amp = \abs{(\scp{x_n}{y_n} - \scp{x_n}{y_0}) - (\scp{x_0}{y_0} - \scp{x_n}{y_0})}\\ \amp = \abs{\scp{x_n}{y_n-y_0} + \scp{x_n-x_0}{y_0}}\\ \amp \le \abs{\scp{x_n}{y_n-y_0}} + \abs{\scp{x_n-x_0}{y_0}}\\ \amp \le \n{x_n}\n {y_n-y_0} + \n{x_n-x_0}\n{y_0}\\ \amp \to \n {x_0} \cdot 0 + 0 \cdot \n{y_0}\\ \amp = 0\text{,} \end{align*}
where we have used the Cauchy–Schwarz inequality and i from the previous part.
Comment.
Again one can give an \(\varepsilon\)-\(N\) argument instead, but this requires a bit more work.

2. (PS3) Convergence in product spaces.

Let \((X, d_X)\) and \((Y, d_Y)\) be metric spaces, and consider the product metric space \((X \times Y, d_{X \times Y})\) defined in Definition 1.14. Let \((x_n)_{n \in \N}\) be a sequence in \(X\) and \((y_n)_{n \in \N}\) a sequence in \(Y\text{.}\) Let \(x_0 \in X\) and \(y_0 \in Y\text{.}\) Show that \((x_n, y_n) \to (x_0, y_0)\) in \(X \times Y\) as \(n \to \infty\) if and only if \(x_n \to x_0\) in \(X\) and \(y_n \to y_0\) in \(Y\) as \(n \to \infty\text{.}\)
Solution.
The convergence \((x_n,y_n) \to (x_0,y_0)\) in \(X \times Y\) is equivalent to the condition
\begin{equation*} \lim_{n \to \infty} \sqrt{(d_X(x_n,x_0))^2 + (d_Y(y_n,y_0))^2} = 0\text{.} \end{equation*}
This in turn is equivalent to
\begin{equation*} \lim_{n \to \infty} d_X(x_n,x_0) = 0 \quad \text{and} \quad \lim_{n \to \infty} d_Y(y_n,y_0) = 0\text{,} \end{equation*}
which means that \(x_n \to x_0\) in \(X\) and \(y_n \to y_0\) in \(Y\text{.}\)
Comment 1.
Here we have used the fact that \((d_X(x_n,x_0))^2 \ge 0\) and similarly for the \(d_Y\) term. This is true simply because it is the square of a real number; the additional fact that \(d_X(x_n,x_0) \ge 0\) is not needed.
Comment 2.
The official solution, which I’ve inherited from previous years, is quite short. It assumes that the reader is well-versed in things from Analysis 1, and could write out a more detailed argument (e.g. with \(\varepsilon\)s and \(N\)s) without too much difficulty. Many students wrote out such arguments, which is of course totally fine.
While no one did so this year, it’s interesting to note that some of the inequalities in the last part of Exercise 1.1.5 could also be used here.
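Relatedly, the sandwich \(\max(d_X, d_Y) \le \sqrt{d_X^2 + d_Y^2} \le d_X + d_Y\) behind this equivalence can be checked numerically; the choice \(X = Y = \R\) and the example sequence below are my own.

```python
# In the product metric, the pair distance is controlled by the component
# distances in both directions:
#   max(dX, dY) <= sqrt(dX**2 + dY**2) <= dX + dY   (illustration only).
import math

def dX(a, b):
    return abs(a - b)

dY = dX

def d_prod(p, q):
    return math.sqrt(dX(p[0], q[0]) ** 2 + dY(p[1], q[1]) ** 2)

p0 = (0.0, 1.0)
checks = []
for n in range(1, 100):
    pn = (1.0 / n, 1.0 - 1.0 / n)
    a, b = dX(pn[0], p0[0]), dY(pn[1], p0[1])
    checks.append(max(a, b) <= d_prod(pn, p0) <= a + b)
# the sandwich shows d_prod -> 0 if and only if both component distances -> 0
```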

3. (PS4) Cauchy implies bounded.

A sequence \((x_n)_{n \in \N}\) in a metric space \((X,d)\) is called bounded if the set \(\set{x_n}{n \in \N}\) is bounded. Show that any Cauchy sequence in a metric space is bounded.
Hint.
Apply the definition of \(\seq xn\) being Cauchy with \(\varepsilon=1\) to find an appropriate \(N\in\N\text{.}\) Then estimate \(d(x_n,x_m)\) in different ways depending on whether \(n\ge N\text{,}\) \(m \ge N\text{,}\) neither, or both.
Solution.
Let \((X,d)\) be a metric space and \((x_n)_{n \in \N}\) a Cauchy sequence in \(X\text{.}\) Then there exists \(N \in \N\) such that \(d(x_m, x_n) \lt 1\) for all \(m, n \ge N\text{.}\) Set
\begin{equation*} M = \max_{m, n \le N} d(x_m, x_n)\text{.} \end{equation*}
Then for any \(m, n \in \N\text{,}\) if \(m, n \le N\text{,}\) then clearly \(d(x_m, x_n) \le M\text{.}\) If \(m, n \gt N\text{,}\) then \(d(x_m, x_n) \lt 1\text{.}\) Moreover, if \(m \le N\) and \(n \gt N\text{,}\) then
\begin{equation*} d(x_m, x_n) \le d(x_m, x_N) + d(x_N, x_n) \le M + 1\text{,} \end{equation*}
and the same inequality also holds for \(n \le N\) and \(m \gt N\text{.}\) Combining all of the cases we conclude that
\begin{equation*} \diam \set{x_n}{n \in \N} \le \max\{1,M,M+1\} = M + 1 \lt \infty \end{equation*}
and hence that the sequence is bounded.
Comment 1.
Recall from Definition 1.15 that, in metric spaces, we have defined boundedness of subsets in terms of diameters.
Comment 2.
Alternatively, we could show that the two sets \(\{x_n : n \le N\}\) and \(\{x_n : n \ge N\}\) are each bounded, and then prove a lemma which says that finite unions of bounded sets are bounded.
Comment 3.
Please take a look at the last part of Exercise B.2, and if you have any questions about this get in touch with me over email or come to office hours. This sort of confusion over what a supremum means would be disastrous on an exam.
In the definitions in this unit the word ‘bounded’ almost always refers to some sort of supremum being finite. The supremum
\begin{equation*} \sup_{n \in \N} n^2 = \sup\{ n^2 : n \in \N\} \end{equation*}
is clearly infinite. At the same time, for each fixed \(n \in \N\text{,}\) \(n^2\) is certainly finite.
When showing that a set \(S\) in a metric space \((X,d)\) is bounded (according to Definition 1.15), we need to show that
\begin{equation*} \diam S = \sup_{x,y \in S} d(x,y) \lt \infty. \end{equation*}
As in the above example, it is not enough to observe that each individual distance \(d(x,y)\) is finite. Indeed, this is always the case for any set, because \(d \maps X \times X \to \R\) always outputs a (finite) real number.
Comment 4.
As is often the case in this unit, in this question we only know that \((X,d)\) is a metric space. So \(X\) is just a set, which for all we know could be the set of all spaghetti recipes. In particular we cannot assume that \(X=\R\text{,}\) or even that \(X\) is a vector space. We do not know if there is a ‘zero’ in \(X\text{,}\) or if it makes sense to add two elements of \(X\) or to take their absolute value. Making such assumptions can completely change the character of a problem, and is a very fast way to lose most or all of the marks on an exam question.
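To see the construction in the solution in action, here is a small numerical sketch; the concrete sequence \(x_n = (-1)^n/n\) and the truncation to finitely many terms are my own choices.

```python
# x_n = (-1)**n / n is Cauchy; with epsilon = 1 the proof's N can be taken
# to be 2, and M + 1 then bounds every pairwise distance (illustration only).
xs = [(-1) ** n / n for n in range(1, 201)]  # the first 200 terms

def d(a, b):
    return abs(a - b)

N = 2
# all pairwise distances among the tail x_n, n >= N, are below epsilon = 1
tail_ok = all(d(a, b) < 1 for a in xs[N - 1:] for b in xs[N - 1:])
# M = max over the finitely many pairs with m, n <= N
M = max(d(xs[i], xs[j]) for i in range(N) for j in range(N))
diam = max(d(a, b) for a in xs for b in xs)
# tail_ok is True, and diam <= M + 1 as in the proof's case analysis
```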

4. (PS4) Pairs of Cauchy sequences.

Let \(\seq xn\) and \(\seq yn\) be Cauchy sequences in a metric space \((X,d)\text{.}\) Show that the sequence \((d(x_n,y_n))_{n \in \N}\) is also Cauchy, and conclude that the limit \(\lim_{n\to \infty} d(x_n,y_n)\) exists.
Hint.
The inequality \(\abs{d(x_n,y_n) - d(x_m,y_m)} \le d(y_n,y_m) + d(x_n,x_m)\) is quite useful here, and has already been established in Exercise 1.3.1.
Solution.
Since \(\R\) (with the usual Euclidean metric) is complete, it suffices to show that \((d(x_n,y_n))_{n \in \N}\) is Cauchy. Let \(\varepsilon \gt 0\text{,}\) and pick \(N \in \N\) such that \(n,m \ge N\) implies both \(d(x_n,x_m) \lt \varepsilon/2\) and \(d(y_n,y_m) \lt \varepsilon/2\text{.}\) For \(n,m \ge N\) we then have
\begin{align*} \abs{d(x_n,y_n) - d(x_m,y_m)} \amp\le d(y_n,y_m) + d(x_n,x_m)\\ \amp\lt \frac \varepsilon 2 + \frac \varepsilon 2 = \varepsilon \end{align*}
where the first inequality was shown in our solution to the first part of Exercise 1.3.1.
Comment 1.
Several students wrote \(d(d(x_n,y_n),d(x_m,y_m))\) as a synonym for \(\abs{d(x_n,y_n) - d(x_m,y_m)}\text{,}\) so that, depending on context, \(d\) either meant the metric on \(X\) or the metric on \(\R\text{.}\) While not completely unreasonable, I would still advise against using such a convention in this unit, and to instead give each metric in a problem its own name.
Comment 2.
After establishing that \((d(x_n,y_n))_{n \in \N}\) was Cauchy, several students claimed that “\(d(x_n,y_n) \to d(x_m,y_m)\)”. I cannot see any sensible way to interpret this statement. Is the limit as \(n \to \infty\text{?}\) What is \(m\text{?}\) In general there is no reason that the limit of a convergent sequence should be one of the terms of the sequence.

5. Cauchy sequences and double limits.

Let \(\seq xn\) be a Cauchy sequence in a metric space \((X,d)\text{.}\)
(a)
For any \(n \in \N\text{,}\) show that the limit
\begin{equation*} \lim_{k \to \infty} d(x_n,x_k) \end{equation*}
exists.
Hint.
One method is to show that the sequence \((d(x_n,x_k))_{k\in \N}\) is Cauchy. This is related to Exercise 1.3.4.
(b)
Show that
\begin{equation*} \lim_{n\to \infty} \left( \lim_{k \to \infty} d(x_n,x_k) \right) = 0 \text{.} \end{equation*}

6. Limit points.

Suppose that \((x_n)_{n \in \N}\) is a sequence in a metric space \((X, d)\) and \(x_0 \in X\) is a point such that \(B_r(x_0) \cap \set{x_n}{n \in \N} \not\subseteq \{x_0\}\) for all \(r \gt 0\text{.}\) Show that there exists a subsequence \((x_{n_k})_{k \in \N}\) with \(x_0 = \lim_{k \to \infty} x_{n_k}\text{.}\)
Solution.
We construct the subsequence step by step as follows. First, set \(r_1 = 1\text{.}\) By the given condition, the set \(B_{r_1}(x_0) \cap \set{x_n}{n \in \N}\) contains a point other than \(x_0\text{.}\) So we may choose \(n_1 \in \N\) such that \(x_{n_1} \ne x_0\) and \(d(x_{n_1}, x_0) \lt r_1\text{.}\)
Next, we set \(r_2 = \min\left(\{\tfrac 12\} \cup \{d(x_j, x_0) : j \le n_1,\ x_j \ne x_0\}\right)\text{,}\) where we discard any zero distances so that \(r_2 \gt 0\text{.}\) The set \(B_{r_2}(x_0) \cap \set{x_n}{n \in \N}\) contains a point other than \(x_0\text{,}\) hence we may choose \(n_2 \in \N\) such that \(x_{n_2} \ne x_0\) and \(d(x_{n_2}, x_0) \lt r_2\text{.}\) By the choice of \(r_2\text{,}\) this implies in particular that \(n_2 \gt n_1\text{.}\)
Set \(r_3 = \min\left(\{\tfrac 13\} \cup \{d(x_j, x_0) : j \le n_2,\ x_j \ne x_0\}\right)\) and find \(n_3 \in \N\) such that \(x_{n_3} \ne x_0\) and \(d(x_{n_3}, x_0) \lt r_3\text{.}\) Then \(n_3 \gt n_2\text{.}\)
Repeat the process indefinitely, which gives rise to a strictly increasing sequence \((n_k)_{k \in \N}\text{.}\) The corresponding subsequence has the property that \(d(x_{n_k}, x_0) \lt r_k \le \frac{1}{k}\text{;}\) hence \(x_0 = \lim_{k \to \infty} x_{n_k}\text{.}\)
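The construction can also be carried out by a short program (an illustrative sketch with an example sequence of my own; at each step the radius is taken over the positive distances only, so that it remains positive even if some terms of the sequence equal \(x_0\)):

```python
# Greedy extraction of a subsequence converging to x0, following the
# step-by-step construction above (illustration only, finitely many steps).
def extract_subsequence(x, x0, d, K):
    # returns indices n_1 < n_2 < ... < n_K with d(x(n_k), x0) < r_k <= 1/k
    indices = []
    for k in range(1, K + 1):
        prev = indices[-1] if indices else 0
        # r_k = min of 1/k and the positive distances d(x_j, x0), j <= n_{k-1}
        seen = [d(x(j), x0) for j in range(1, prev + 1) if d(x(j), x0) > 0]
        r = min([1.0 / k] + seen)
        n = prev + 1
        while not (d(x(n), x0) < r and x(n) != x0):
            n += 1
        indices.append(n)
    return indices

x = lambda n: ((-1) ** n) / n
idx = extract_subsequence(x, 0.0, lambda a, b: abs(a - b), 6)
# the indices strictly increase and d(x_{n_k}, 0) < 1/k for each k
```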