
Section 3.2 Hölder Spaces

Definition 3.3.

Let \(\alpha \in (0, 1]\text{.}\) Then \(C^{0, \alpha}([a,b])\) is the subset of \(C^0([a,b])\) comprising all \(f \in C^0([a,b])\) such that the following quantity is finite:
\begin{equation} [f]_{C^{0, \alpha}([a,b])} = \sup_{x, y \in [a,b], \ x \ne y} \frac{|f(x) - f(y)|}{|x - y|^\alpha} \lt \infty.\tag{3.1} \end{equation}
For \(f \in C^{0, \alpha}([a,b])\text{,}\) we define
\begin{equation*} \n f_{C^{0, \alpha}([a,b])} = \n f_{C^0([a,b])} + [f]_{C^{0, \alpha}([a,b])}. \end{equation*}

Remark 3.4.

  • A function satisfying (3.1) is called Hölder continuous. For \(\alpha = 1\text{,}\) the condition is equivalent to Lipschitz continuity, and \(L=[f]_{C^{0,1}([a,b])}\) is the smallest Lipschitz constant for \(f\text{.}\)
  • One can check that \([\blank]_{C^{0, \alpha}([a,b])}\) is not a norm on \(C^{0,\alpha}([a,b])\text{.}\) It is called a Hölder seminorm. It can be equivalently defined as the smallest constant \(M \ge 0\) for which \(\abs{f(x)-f(y)} \le M\abs{x-y}^\alpha\) for all \(x,y \in [a,b]\text{.}\)
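As an informal supplement (not part of the notes): the supremum in (3.1) can be approximated by brute force over a finite grid, which makes the seminorm claim in the remark tangible. The helper `holder_seminorm` and all numerical choices below are our own; a nonzero constant function has seminorm \(0\text{,}\) so \([\blank]_{C^{0,\alpha}([a,b])}\) fails positive definiteness and cannot be a norm.

```python
# Illustrative sketch (not from the notes): approximate the Hölder
# seminorm (3.1) by maximising the difference quotient over a finite grid.

def holder_seminorm(f, alpha, a, b, n=200):
    """Grid approximation of sup |f(x)-f(y)| / |x-y|**alpha on [a, b]."""
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    return max(
        abs(f(x) - f(y)) / (y - x) ** alpha  # y > x here, so |x-y| = y - x
        for i, x in enumerate(xs)
        for y in xs[i + 1:]
    )

# A nonzero constant has seminorm 0: positive definiteness fails,
# so [.] is only a seminorm, not a norm.
print(holder_seminorm(lambda x: 3.0, 0.5, 0.0, 1.0))  # 0.0

# The identity map is Lipschitz (alpha = 1) with smallest constant 1.
print(holder_seminorm(lambda x: x, 1.0, 0.0, 1.0))  # 1.0
```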

Theorem 3.5.

\(C^{0, \alpha}([a,b])\text{,}\) equipped with the norm \(\n\blank_{C^{0, \alpha}([a,b])}\text{,}\) is a Banach space.

Proof.

The vector space and norm properties are routine, so we concentrate on the completeness. Let \((f_n)_{n \in \N}\) be a Cauchy sequence in \(C^{0, \alpha}([a,b])\text{.}\) Then it is also Cauchy in \(C^0([a,b])\text{.}\) Hence there exists a uniform limit \(f = \lim_{n \to \infty} f_n\text{.}\) We need to show that \(f \in C^{0, \alpha}([a,b])\) and that the convergence holds in \(C^{0, \alpha}([a,b])\text{.}\)
By Exercise 1.3.3, there exists \(M \gt 0\) such that \(\|f_n\|_{C^{0, \alpha}([a,b])} \le M\) for all \(n \in \N\text{.}\) Hence for any \(x, y \in [a,b]\) with \(x \ne y\text{,}\)
\begin{equation*} \frac{|f(x) - f(y)|}{|x - y|^\alpha} = \lim_{n \to \infty} \frac{|f_n(x) - f_n(y)|}{|x - y|^\alpha} \le M. \end{equation*}
In particular, it follows that
\begin{equation*} [f]_{C^{0, \alpha}([a,b])} \lt \infty, \end{equation*}
and therefore \(f \in C^{0, \alpha}([a,b])\text{.}\)
In order to prove the convergence, fix \(\varepsilon \gt 0\) and choose \(N \in \N\) such that \(\|f_m - f_n\|_{C^{0, \alpha}([a,b])} \lt \varepsilon\) whenever \(m, n \ge N\text{.}\) Then for any \(x, y \in [a,b]\) with \(x \ne y\text{,}\)
\begin{align*} \amp\frac{|(f(x) - f_n(x)) - (f(y) - f_n(y))|}{|x - y|^\alpha} \\ \amp= \lim_{m \to \infty} \frac{|(f_m(x) - f_n(x)) - (f_m(y) - f_n(y))|}{|x - y|^\alpha} \\ \amp\le \limsup_{m \to \infty} [f_m - f_n]_{C^{0, \alpha}([a,b])} \le \varepsilon, \end{align*}
provided that \(n \ge N\text{.}\) Hence \([f - f_n]_{C^{0, \alpha}([a,b])} \le \varepsilon\text{,}\) and since \(\varepsilon\) was chosen arbitrarily and we already have uniform convergence, this shows that \(f_n \to f\) in \(C^{0, \alpha}([a,b])\text{.}\)

Exercises

1. Hölder continuity in metric spaces.

Let \((X, d_X)\) and \((Y, d_Y)\) be metric spaces and let \(\alpha \in (0, 1]\text{.}\) If \(f \maps X \to Y\) is a map such that there exists \(L \ge 0\) satisfying the inequality
\begin{equation*} d_Y(f(x), f(y)) \le L \left(d_X(x, y)\right)^\alpha, \end{equation*}
for all \(x, y \in X\text{,}\) then we say that \(f\) is Hölder continuous (or Lipschitz continuous if \(\alpha = 1\)). Show that any Hölder (or Lipschitz) continuous map is uniformly continuous.

2. (PS7) Regularity of \(x \mapsto x^\alpha\).

Let \(\alpha \in (0,1)\) and consider the function \(f \maps [0,1] \to \R\) defined by \(f(x) = x^\alpha\text{.}\)
(a)
Show that \(f \notin C^1([0,1])\text{.}\)
Hint.
Show that there is no \(g \in C^0([0,1])\) with \(g(t)=f'(t)\) for \(t \in (0,1)\text{.}\)
Solution.
We know that \(f\) is continuously differentiable on the open interval \((0,1)\) with \(f'(x)=\alpha x^{\alpha-1}\text{.}\) Since \(\alpha-1 \lt 0\text{,}\) we have \(f'(x) \to \infty\) as \(x \to 0\) with \(x \gt 0\text{;}\) in other words, for all \(N \gt 0\) there exists \(\delta \gt 0\) such that \(f'(x) \ge N\) for \(x \in (0,\delta)\text{.}\)
Thus there cannot exist \(g \in C^0([0,1])\) with \(f'(t)=g(t)\) for all \(t \in (0,1)\text{,}\) and so by definition \(f \notin C^1([0,1])\text{.}\)
To give a bit more detail about the key step, suppose for the sake of contradiction that there were such a \(g \in C^0([0,1])\text{.}\) Then \(g\) would be sequentially continuous, and so we would have
\begin{equation*} \R \ni g(0) = \lim_{n \to \infty} g(1/n) = \lim_{n \to \infty} f'(1/n) = \infty\text{,} \end{equation*}
which is a contradiction.
Comment.
Some students wrote things along the lines of “\(g(x)\) is undefined at \(x=0\)” or “\(f'(x)\) is undefined at \(x=0\)” without further explanation. Without more context, these statements do not have a precise meaning. The derivative \(f'\text{,}\) by the usual limit definition, only makes sense on \((0,1)\text{,}\) and so is undefined at \(1\) just as much as it is undefined at \(0\text{.}\) The function \(g\text{,}\) on the other hand, is a hypothetical function \([0,1] \to \R\text{,}\) i.e. a hypothetical function whose values at \(0\) and \(1\) are defined.
For instance, one way to argue would be to first say that, since \(g\) is an extension of \(f'\text{,}\) it must be given by
\begin{equation*} g(x) = \begin{cases} A \amp \text{ if } x=0 \\ \alpha x^{\alpha - 1} \amp \text{ if } x \in (0,1) \\ B \amp \text{ if } x = 1 \end{cases} \end{equation*}
for some choices of \(A,B \in \R\text{.}\) Certainly at this point we could think of \(g(0)=A\) as a well-defined real number. The next step in the argument, as in the official solution, would be to explain why no choice of \(A\) makes \(g\) continuous at \(0\text{.}\)
Several students also seemed to implicitly assume that \(f'\) and \(g\) necessarily have the same ‘formula’ \(x \mapsto \alpha x^{\alpha-1}\text{.}\) This is in general false when speaking of continuous extensions. For instance the function \(x \mapsto \sin(x)/x\) is continuous on \((0,1)\) and has a continuous extension \(g\) to \([0,1]\) with a different, piecewise formula
\begin{equation*} g(x) = \begin{cases} 1 \amp \text{ if } x = 0, \\ \frac{\sin x}x \amp \text{ if } x \ne 0. \end{cases} \end{equation*}
(b)
Prove that \(f \notin C^{0,\beta}([0,1])\) for \(\beta \in (\alpha,1)\text{.}\)
Hint.
Estimate the supremum in (3.1) from below by setting \(y=0\text{.}\)
Solution.
We estimate
\begin{align*} [f]_{C^{0,\beta}([0,1])} \amp= \sup_{x,y \in [0,1],\, x \ne y} \frac{\abs{f(x)-f(y)}}{\abs{x-y}^\beta}\\ \amp\ge \sup_{x \in (0,1]} \frac{\abs{f(x)-f(0)}}{\abs{x-0}^\beta}\\ \amp= \sup_{x \in (0,1]} \frac{x^\alpha}{x^\beta}\\ \amp=\infty\text{.} \end{align*}
Comment.
Several students, following the hint, assumed that
\begin{align*} [f]_{C^{0,\beta}([0,1])} \amp= \sup_{x,y \in [0,1],\, x \ne y} \frac{\abs{f(x)-f(y)}}{\abs{x-y}^\beta}\\ \amp= \sup_{x \in (0,1]} \frac{\abs{f(x)-f(0)}}{\abs{x-0}^\beta}\text{.} \end{align*}
This is absolutely not true for general functions \(f \in C^{0,\beta}([0,1])\text{.}\) To see why, let’s think about what these suprema mean in more detail. Define
\begin{equation*} S = \set{(x,y)}{x,y \in [0,1],\, x \ne y} \end{equation*}
and
\begin{equation*} T = \set{(x,0)}{x \in (0,1]}\text{.} \end{equation*}
Then \(T \subset S\text{,}\) and so
\begin{align*} [f]_{C^{0,\beta}([0,1])} \amp= \sup_{(x,y) \in S}\frac{\abs{f(x)-f(y)}}{\abs{x-y}^\beta}\\ \amp\ge \sup_{(x,y) \in T}\frac{\abs{f(x)-f(y)}}{\abs{x-y}^\beta}\\ \amp= \sup_{x \in (0,1]} \frac{\abs{f(x)-f(0)}}{\abs{x-0}^\beta}\text{,} \end{align*}
where the inequality comes from the fact \(A \subseteq B\) implies \(\sup A \le \sup B\text{.}\)
(c)
Show that \(f \in C^{0,\alpha}([0,1])\text{.}\)
Hint.
First, use calculus to show that
\begin{gather} (t+1)^\alpha-t^\alpha \le 1 \text{ for } t \ge 0\text{.}\tag{✶} \end{gather}
Then, for \(0 \le y \lt x \le 1\text{,}\) pick \(t\) appropriately in (✶) to show that
\begin{equation*} \frac{x^\alpha-y^\alpha}{(x-y)^\alpha} \le 1\text{.} \end{equation*}
Solution.
We know that \(f\) is continuous on \([0,1]\text{,}\) and easily calculate
\begin{equation*} \n f_{C^0([0,1])} = \sup_{x \in [0,1]} \abs{x^\alpha} = 1 \text{.} \end{equation*}
To estimate \([f]_{C^{0,\alpha}([0,1])}\text{,}\) we follow the hint. Differentiating, we find that \((t+1)^\alpha - t^\alpha\) is a strictly decreasing function of \(t \ge 0\text{,}\) and hence that
\begin{equation*} (t+1)^\alpha - t^\alpha \le (0+1)^\alpha - 0^\alpha = 1 \quad \text{ for } t \ge 0. \end{equation*}
For \(0 \le y \lt x\text{,}\) we then use rules of exponents to find
\begin{align*} \frac{x^\alpha-y^\alpha}{(x-y)^\alpha} \amp= \Big(\frac x{x-y}\Big)^\alpha- \Big(\frac y{x-y}\Big)^\alpha \\ \amp= \Big(1+\frac y{x-y}\Big)^\alpha- \Big(\frac y{x-y}\Big)^\alpha \\ \amp= (t+1)^\alpha - t^\alpha\\ \amp\le 1\text{,} \end{align*}
where \(t = y/(x-y) \ge 0\text{.}\) This allows us to estimate
\begin{align*} [f]_{C^{0, \alpha}([0,1])} \amp= \sup_{x, y \in [0,1], \ x \ne y} \frac{|f(y) - f(x)|}{|x - y|^\alpha}\\ \amp= \sup_{0 \le y \lt x \le 1} \frac{|f(y) - f(x)|}{|x - y|^\alpha}\\ \amp= \sup_{0 \le y \lt x \le 1} \frac{x^\alpha-y^\alpha}{(x-y)^\alpha} \\ \amp\le 1\text{.} \end{align*}
(By considering the supremum over \(x \in [0,1]\) with \(y=0\text{,}\) we can in fact show that it is equal to 1.)
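As a hedged numerical cross-check of this computation (our own sketch, with an arbitrary sample value \(\alpha = 0.3\text{;}\) none of the code below is from the notes): sampling the difference quotient on a grid confirms both that every quotient is at most \(1\) and that the value \(1\) is attained at pairs \((x,0)\text{.}\)

```python
# Numerical cross-check (illustrative only) of Exercise 2(c): for
# f(x) = x**alpha on [0, 1], the Hölder seminorm [f]_{C^{0,alpha}} is 1.

alpha = 0.3                     # arbitrary sample value in (0, 1)
f = lambda x: x ** alpha

def holder_quotient(f, alpha, x, y):
    """The difference quotient |f(x)-f(y)| / |x-y|**alpha for x != y."""
    return abs(f(x) - f(y)) / abs(x - y) ** alpha

n = 300
xs = [i / n for i in range(n + 1)]
quotients = [
    holder_quotient(f, alpha, x, y)
    for i, x in enumerate(xs)
    for y in xs[i + 1:]
]

# Every quotient is <= 1, as proved via (t+1)**alpha - t**alpha <= 1,
# and the maximum 1 is attained at the pairs (x, 0).
print(max(quotients))  # 1.0
```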

3. (PS7) \(C^1\) as a subset of \(C^{0,1}\).

Let \(a, b \in \R\) with \(a \lt b\text{.}\)
(a)
Show that \(C^1([a,b])\) is a linear subspace of \(C^{0, 1}([a,b])\text{.}\)
Hint.
As you can easily check yourself, \(C^1([a,b])\) is closed under vector space operations, and so the main thing to prove here is that it is a subset of \(C^{0,1}([a,b])\text{.}\) To prove this, fix \(x,y \in [a,b]\) with \(x \ne y\text{,}\) and try to estimate the quotient in the definition of \([f]_{C^{0,1}([a,b])}\) in terms of \(f'\) using the mean value theorem. Taking a supremum, conclude that \([f]_{C^{0,1}([a,b])} \le \n {f'}_{C^0([a,b])}\text{.}\)
Solution.
First we must show that \(C^1([a,b])\) is a subset of \(C^{0,1}([a,b])\text{.}\) Let \(f \in C^1([a,b])\text{.}\) Then by definition \(f \in C^0([a,b])\text{,}\) and thus it suffices to show that \([f]_{C^{0,1}([a,b])} \lt \infty\text{.}\) So let \(x, y \in [a,b]\text{.}\) By the mean value theorem, there exists some \(c\) between \(x\) and \(y\) such that
\begin{equation*} \frac{f(x)-f(y)}{x-y} = f'(c)\text{.} \end{equation*}
In particular,
\begin{equation*} \frac{\abs{f(x)-f(y)}}{\abs{x-y}} = \abs{f'(c)} \le \n{f'}_{C^0([a,b])}\text{.} \end{equation*}
Taking a supremum we conclude that \([f]_{C^{0,1}([a,b])} \le \n{f'}_{C^0([a,b])}\text{,}\) which is finite since \(f \in C^1([a,b])\text{,}\) and hence that \(f \in C^{0,1}([a,b])\text{.}\) Thus \(C^1([a,b])\) is a subset of \(C^{0,1}([a,b])\text{.}\) As \(C^1([a,b])\) is closed under the vector space operations, it is therefore a linear subspace of \(C^{0,1}([a,b])\text{.}\)
Comment 1.
A comment from last year: It is not enough to observe that \(\abs{f(x)-f(y)}/\abs{x-y}\) is finite for any fixed \(x \ne y\) and also in the limit as \(y \to x\) — we must show that the supremum of these finite values is also bounded. On the other hand, if we could show, e.g., that \(\abs{f(x)-f(y)}/\abs{x-y}\) extends to a continuous function of \((x,y) \in [a,b]^2\text{,}\) then we could appeal to (a suitable generalisation of) the Weierstrass extreme value theorem.
Comment 2.
In principle the fact that \(C^1([a,b])\) is closed under vector space operations is contained in Theorem 3.2, but of course we skipped that part of the proof. So many students, not unreasonably, decided to give proofs here. In the context of such an argument, it is important to remember that the relevant operations here are vector addition and scalar multiplication. In other words, for \(f,g \in C^1([a,b])\) and \(\alpha \in \R\text{,}\) we must show that \(f+g\) and \(\alpha f\) also lie in \(C^1([a,b])\text{.}\) We do not need to show that the product \(fg\) (i.e. \(t \mapsto f(t)g(t)\)) is an element of \(C^1([a,b])\text{!}\) Linear subspaces which are closed under this additional operation are called algebras, and will appear later in Chapter 5.
(b)
Show that, for \(f \in C^1([a,b])\text{,}\) \(\n{f'}_{C^0([a,b])} = [f]_{C^{0,1}([a,b])}\text{.}\)
Hint.
In the previous part you have hopefully already shown that \(\n{f'}_{C^0([a,b])} \ge [f]_{C^{0,1}([a,b])}\) for any \(f \in C^1([a,b])\text{,}\) and so it suffices to show the reverse inequality. Fix \(x \in (a,b)\text{,}\) and write \(\abs{f'(x)}\) as a limit as \(y \to x\text{,}\) and then estimate inside the limit.
Solution.
We have already shown that, for \(f \in C^1([a,b])\text{,}\) \([f]_{C^{0,1}([a,b])} \le \n{f'}_{C^0([a,b])}\text{.}\) Recalling the formulas for the \(C^1([a,b])\) and \(C^{0,1}([a,b])\) norms, we will therefore be done if we can show the reverse inequality. For any \(x,y \in [a,b]\) with \(x\ne y\text{,}\) we have
\begin{equation*} \frac{\abs{f(x)-f(y)}}{\abs{x-y}} \le [f]_{C^{0,1}([a,b])} \text{.} \end{equation*}
Thus for any \(x \in (a,b)\) we have
\begin{equation*} \abs{f'(x)} = \lim_{y \to x} \frac{\abs{f(x)-f(y)}}{\abs{x-y}} \le [f]_{C^{0,1}([a,b])} \text{.} \end{equation*}
It follows that
\begin{equation*} \sup_{x \in (a,b)} |f'(x)| \le [f]_{C^{0, 1}([a,b])}. \end{equation*}
Since \(f'\) is continuous on \([a,b]\text{,}\) the supremum over \((a,b)\) equals the supremum over all of \([a,b]\text{,}\) and so \(\|f'\|_{C^0([a,b])} \le [f]_{C^{0, 1}([a,b])}\) as desired.
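The equality in Part b can also be illustrated numerically (an informal sketch with our own example \(f = \sin\) on \([0,2]\text{,}\) where \(\n{f'}_{C^0([0,2])} = \cos 0 = 1\)): grid approximations of the Lipschitz seminorm stay below \(1\text{,}\) consistent with the inequality from Part a, and approach \(1\) as the grid is refined.

```python
import math

# Illustrative check (our own sketch, not from the notes) of Exercise 3(b)
# for f = sin on [0, 2]: sup |f'| = |cos(0)| = 1 should match the
# Lipschitz seminorm [f]_{C^{0,1}([0,2])}.

def lipschitz_seminorm(f, a, b, n=400):
    """Grid approximation of sup |f(x)-f(y)| / |x - y| on [a, b]."""
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    return max(
        abs(f(x) - f(y)) / (y - x)  # y > x on this grid, so |x-y| = y - x
        for i, x in enumerate(xs)
        for y in xs[i + 1:]
    )

grid = [2 * i / 400 for i in range(401)]
sup_fprime = max(abs(math.cos(x)) for x in grid)  # 1.0, attained at x = 0
est = lipschitz_seminorm(math.sin, 0.0, 2.0)

print(sup_fprime)  # 1.0
print(est)  # just below 1, approaching it as the grid is refined
```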
(c)
Show that, for \(f \in C^1([a,b])\text{,}\) \(\n f_{C^{0,1}([a,b])}= \n f_{C^1([a,b])}\text{.}\) Conclude that \(C^1([a,b])\) (equipped with \(\n\blank_{C^1([a,b])}\)) is a normed subspace of \(C^{0,1}([a,b])\) (equipped with \(\n\blank_{C^{0,1}([a,b])}\)).
Solution.
By Part b, for any \(f \in C^1([a,b])\) we have
\begin{align} \n f_{C^{0,1}([a,b])} \amp = \n f_{C^0([a,b])} + [f]_{C^{0,1}([a,b])}\notag\\ \amp = \n f_{C^0([a,b])} + \n{f'}_{C^0([a,b])}\notag\\ \amp = \n f_{C^1([a,b])}\text{,}\tag{✶} \end{align}
as desired. Thus \(\n\blank_{C^1([a,b])}\) is the restriction of \(\n\blank_{C^{0,1}([a,b])}\) to the set \(C^1([a,b])\text{.}\) Combining with Part a, we conclude that \(\big(C^1([a,b]),\n\blank_{C^1([a,b])}\big)\) is a normed subspace of \(\big(C^{0,1}([a,b]),\n\blank_{C^{0,1}([a,b])}\big)\text{.}\)
(d)
Show that \(C^1([a,b])\) is closed in \(C^{0, 1}([a,b])\text{.}\)
Hint.
Part c is useful here, as is the completeness of \(C^1([a,b])\) when equipped with the usual \(\n\blank_{C^1([a,b])}\) norm.
Solution.
Since \(C^1([a,b])\) is complete when equipped with the \(C^1([a,b])\) norm (Theorem 3.2), by Part c it is also complete as a metric subspace of \(C^{0,1}([a,b])\text{.}\) Since \(C^{0,1}([a,b])\) is complete by Theorem 3.5, Theorem 1.41 therefore implies that \(C^1([a,b])\) must be closed as a subset of \(C^{0,1}([a,b])\text{.}\)
Comment.
Note that we cannot apply Theorem 1.41 immediately, without first showing something like (✶). This is because Theorem 1.41 requires not only that \(Y \subseteq X\text{,}\) but that \((Y,d')\) is a metric subspace of \((X,d)\text{,}\) i.e. \(d\) and \(d'\) must agree on \(Y \times Y\text{.}\) But here it is not immediately obvious that \(\n\blank_{C^1([a,b])}\) and \(\n\blank_{C^{0,1}([a,b])}\) agree on \(C^1([a,b])\text{,}\) and so we have to work to prove something like (✶).

4. \(C^{0,\alpha}\) is non-separable.

Let \(\alpha \in (0,1)\) and \(a,b \in \R\) with \(a \lt b\text{,}\) and consider the normed space \(C^{0,\alpha}([a,b])\) of Hölder continuous functions (Definition 3.3). For every \(t \in [a,b]\text{,}\) define a function \(f_t \in C^{0,\alpha}([a,b])\) by \(f_t(x) = \abs{x-t}^\alpha\text{.}\)
(a)
Show that, if \(s,t \in [a,b]\) and \(s \ne t\text{,}\) then
\begin{equation*} [f_t-f_s]_{C^{0,\alpha}([a,b])} \ge \lim_{x \to t} \frac{\abs{\abs{x-t}^\alpha-\abs{x-s}^\alpha+\abs{t-s}^\alpha}}{\abs{x-t}^\alpha} = 1\text{.} \end{equation*}
Conclude that \(\n{f_t-f_s}_{C^{0,\alpha}([a,b])} \ge 1\text{.}\)
Hint.
This is probably the hardest part of this problem. If you are getting stuck, I would recommend moving on to the other parts and doing them first. One way to compute the limit is to compare some of the terms to the limit definition of the derivative \(f_s'(t)\text{.}\)
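One way to get a feel for the limit in Part a is an informal numerical probe (our own sketch, with sample values \(\alpha = 1/2\text{,}\) \(s = 1/4\text{,}\) \(t = 1/2\) chosen purely for illustration): evaluating the quotient at \(x = t + h\) for shrinking \(h\) shows it increasing towards \(1\text{,}\) consistent with \([f_t-f_s]_{C^{0,\alpha}([a,b])} \ge 1\text{.}\)

```python
# Illustrative numerical probe (not part of the notes) of Exercise 4(a):
# the Hölder quotient of f_t - f_s between x = t + h and t tends to 1
# as h -> 0, so the seminorm is at least 1.

alpha, s, t = 0.5, 0.25, 0.5  # our own sample values, s != t in [0, 1]

def f(c, x):
    """The family f_c(x) = |x - c|**alpha from the exercise."""
    return abs(x - c) ** alpha

def quotient(h):
    """|(f_t - f_s)(t + h) - (f_t - f_s)(t)| / h**alpha for h > 0."""
    x = t + h
    num = abs((f(t, x) - f(s, x)) - (f(t, t) - f(s, t)))
    return num / h ** alpha

for h in (1e-2, 1e-4, 1e-6, 1e-8):
    print(quotient(h))  # increases towards 1 as h shrinks
```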
(b)
Suppose that \(G \subseteq C^{0,\alpha}([a,b])\) is dense. Show that for each \(t \in [a,b]\) there must exist \(g_t \in G\) with \(\n{g_t-f_t}_{C^{0,\alpha}([a,b])} \lt 1/2\text{.}\)
(c)
Use the triangle inequality and Part a to show that \(g_t \ne g_s\) for \(t \ne s\text{.}\)
(d)
Conclude that \(G\) cannot be countable, and hence that \(C^{0,\alpha}([a,b])\) is not separable (Definition 2.5).

5. (PS8) \(C^{1,\alpha}([a,b])\).

For \(\alpha \in (0,1]\text{,}\) the set \(C^{1,\alpha}([a,b])\) comprises all \(f \in C^1([a,b])\) such that (the continuous extension of) \(f' \in C^{0,\alpha}([a,b])\text{.}\) Show that the norm
\begin{equation} \n f_{C^{1,\alpha}([a,b])} = \n f_{C^1([a,b])} + [f']_{C^{0,\alpha}([a,b])}\tag{3.2} \end{equation}
makes \(C^{1,\alpha}([a,b])\) a Banach space. You do not have to check that it is a normed space.
Hint 1.
Theorem 3.2 and Theorem 3.5 are very useful here. You might also want to look at their proofs for some general inspiration.
Hint 2.
Suppose that \(\seq fn\) is Cauchy in \(C^{1,\alpha}([a,b])\text{.}\) Argue that \(\seq fn\) is also Cauchy in \(C^1([a,b])\) and that \(\seq{f'}n\) is Cauchy in \(C^{0,\alpha}([a,b])\text{.}\) Now appeal to the completeness of \(C^1([a,b])\) and \(C^{0,\alpha}([a,b])\text{.}\)
Hint 3.
Suppose that \((X,d_X)\) and \((Y,d_Y)\) are metric spaces with \(Y \subset X\text{.}\) Then a sequence \(\seq yn\) in \(Y\) is also a sequence in \(X\text{.}\) Without knowing how the metrics \(d_X,d_Y\) compare, however, we can conclude nothing about how \(\seq yn\) being Cauchy/convergent as a sequence in \((Y,d_Y)\) is related to it being Cauchy/convergent as a sequence in \((X,d_X)\text{.}\)
Solution.
Expanding out the definitions, we note that the norm in (3.2) could equivalently be written
\begin{gather} \n f_{C^{1,\alpha}([a,b])} = \n f_{C^0([a,b])} + \n{f'}_{C^{0,\alpha}([a,b])}.\tag{#} \end{gather}
As in the lecture notes, we will neglect to check that \(C^{1,\alpha}([a,b])\) is a normed space, and focus on completeness.
Let \(\seq fn\) be a Cauchy sequence in \(C^{1,\alpha}([a,b])\text{.}\) We will be done if we can show that this sequence is convergent in this space. By (3.2), \(\seq fn\) is also a Cauchy sequence in \(C^1([a,b])\text{.}\) Since \(C^1([a,b])\) is complete by Theorem 3.2, this sequence therefore converges in \(C^1([a,b])\) to some limit \(f \in C^1([a,b])\text{:}\)
\begin{gather} \n{f_n - f}_{C^1([a,b])} \to 0 \text{ as } n \to \infty \text{.}\tag{✶} \end{gather}
In particular, this means that
\begin{gather} \n{f'_n - f'}_{C^0([a,b])} \to 0 \text{ as } n \to \infty\text{.}\tag{†} \end{gather}
Looking at (#) we also notice that the sequence \(\seq {f'}n\) is Cauchy in \(C^{0,\alpha}([a,b])\text{.}\) Thus by Theorem 3.5 it is convergent in this space, with some limit \(g \in C^{0,\alpha}([a,b])\text{:}\)
\begin{gather} \n{f'_n - g}_{C^{0,\alpha}([a,b])} \to 0 \text{ as } n \to \infty \text{.}\tag{✶✶} \end{gather}
In particular, this means that
\begin{gather} \n{f'_n - g}_{C^0([a,b])} \to 0 \text{ as } n \to \infty\text{.}\tag{††} \end{gather}
Comparing (†) and (††), the uniqueness of limits (Theorem 1.34) in \(C^0([a,b])\) means that \(g=f'\text{.}\)
Therefore we have \(f' = g \in C^{0,\alpha}([a,b])\text{,}\) and hence that \(f \in C^{1,\alpha}([a,b])\text{.}\) Finally, combining (✶) and (✶✶) and using (#), we have
\begin{align*} \n{f_n - f}_{C^{1,\alpha}([a,b])} \amp = \n{f_n - f}_{C^0([a,b])} + \n{f_n' - f'}_{C^{0,\alpha}([a,b])} \\ \amp = \n{f_n - f}_{C^0([a,b])} + \n{f_n' - g}_{C^{0,\alpha}([a,b])} \\ \amp \to 0 \end{align*}
as \(n \to \infty\text{.}\)
Comment 1.
Suppose that \((X,d_X)\) and \((Y,d_Y)\) are metric spaces, and that, as sets, \(Y \subseteq X\text{.}\) If this is all we know, then we cannot say anything at all about the relationship between Cauchy sequences in \((X,d_X)\) and in \((Y,d_Y)\text{.}\) Indeed, without information about how \(d_X\) and \(d_Y\) compare, we can say very little at all about how these two spaces are related.
If, as in the above problem, though, we happen to know that \(d_X(y_1,y_2) \le d_Y(y_1,y_2)\) for all \(y_1,y_2 \in Y\text{,}\) then we can show that Cauchy sequences in \((Y,d_Y)\) are also Cauchy in \((X,d_X)\text{.}\)
Comment 2.
The above proof is slightly slick, which is not always a good thing in terms of understanding. Another perfectly valid argument involves repeating most of the key steps in the proof of Theorem 3.5.
Comment 3.
In the official solution, we first argue that \(\seq fn\) converges in \(C^1([a,b])\) while \(\seq{f'}n\) converges in \(C^{0,\alpha}([a,b])\text{.}\) Call the first limit \(f\) and the second limit \(g\text{.}\) While it turns out that \(f'=g\text{,}\) this is not immediately obvious, and requires justification. For this reason, it is a particularly bad idea to call the second limit \(f'\) from the get-go. The official solution uses the relationships between the various norms and the uniqueness of limits to establish \(f'=g\text{,}\) but this is not the only way to argue. For instance, it is possible that in Analysis 2A you were taught a theorem which is relevant here. One can also argue very directly, as in the proof of Theorem 3.2.

6. (PS7) Inclusions between Hölder spaces.

Let \(0 \lt \alpha \lt \beta \le 1\) and \(f \in C^{0,\beta}([a,b])\text{.}\) Show that
\begin{equation*} [f]_{C^{0,\alpha}([a,b])} \le \abs{b-a}^{\beta-\alpha} [f]_{C^{0,\beta}([a,b])} \text{,} \end{equation*}
and conclude that \(C^{0,\beta}([a,b]) \subseteq C^{0,\alpha}([a,b])\text{.}\)
Hint.
Write \(\abs{x-y}^\beta=\abs{x-y}^{\beta-\alpha} \abs{x-y}^\alpha\text{.}\)
Solution.
Let \(x, y \in [a,b]\) with \(x \ne y\text{.}\) Then
\begin{align*} \abs{f(x)-f(y)} \amp \le [f]_{C^{0,\beta}([a,b])} \abs{x-y}^\beta \\ \amp = [f]_{C^{0,\beta}([a,b])} \abs{x-y}^{\beta-\alpha} \abs{x-y}^\alpha\\ \amp \le [f]_{C^{0,\beta}([a,b])} \abs{b-a}^{\beta-\alpha} \abs{x-y}^\alpha\text{.} \end{align*}
Dividing by \(\abs{x-y}^\alpha\) and taking a supremum yields the desired estimate for \([f]_{C^{0,\alpha}([a,b])}\text{.}\) In particular, \([f]_{C^{0,\alpha}([a,b])} \lt \infty\text{,}\) and so \(f \in C^{0,\alpha}([a,b])\text{.}\) Since \(f \in C^{0,\beta}([a,b])\) was arbitrary, we conclude that \(C^{0,\beta}([a,b]) \subseteq C^{0,\alpha}([a,b])\text{.}\)
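As a numerical illustration (our own sketch, not part of the exercise): for \(f(x) = \sqrt x\) on \([0,4]\) with \(\beta = 1/2\) and \(\alpha = 1/4\text{,}\) grid approximations give \([f]_{C^{0,1/2}([0,4])} = 1\) and \([f]_{C^{0,1/4}([0,4])} = \sqrt 2 = \abs{4-0}^{1/4}\,[f]_{C^{0,1/2}([0,4])}\text{,}\) so the estimate is attained with equality for this particular \(f\text{.}\)

```python
# Numerical illustration (our own sketch) of Exercise 6 with
# f(x) = sqrt(x) on [a, b] = [0, 4], beta = 1/2 and alpha = 1/4.

def holder_seminorm(f, alpha, a, b, n=200):
    """Grid approximation of sup |f(x)-f(y)| / |x-y|**alpha on [a, b]."""
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    return max(
        abs(f(x) - f(y)) / (y - x) ** alpha  # y > x here, so |x-y| = y - x
        for i, x in enumerate(xs)
        for y in xs[i + 1:]
    )

f = lambda x: x ** 0.5
a, b, alpha, beta = 0.0, 4.0, 0.25, 0.5

sem_beta = holder_seminorm(f, beta, a, b)    # 1.0, attained at y = 0
sem_alpha = holder_seminorm(f, alpha, a, b)  # sqrt(2), attained at (4, 0)

# The inequality sem_alpha <= |b-a|**(beta-alpha) * sem_beta holds,
# with equality for this particular f.
print(sem_alpha, (b - a) ** (beta - alpha) * sem_beta)
```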
Comment 1.
If, on an exam, you write
\begin{gather*} \sup_{x,y \in [a,b],\ x \ne y} \frac{ \abs{f(x)-f(y)}}{\abs{x-y}^\beta} = \sup \left(\frac{ \abs{f(x)-f(y)}}{\abs{x-y}^\alpha} \abs{x-y}^{\alpha-\beta} \right) \le \sup \cdots, \end{gather*}
then I will assume that the second two suprema are over the same values of \(x\) and \(y\text{.}\) But if you just write
\begin{gather*} \sup \frac{ \abs{f(x)-f(y)}}{\abs{x-y}^\beta} = \sup \left(\frac{ \abs{f(x)-f(y)}}{\abs{x-y}^\alpha} \abs{x-y}^{\alpha-\beta} \right) \le \sup \cdots, \end{gather*}
and never indicate what set your supremum is being taken over in any way, then you will likely lose marks.
Comment 2.
This question ends with the phrase ‘and conclude that…’. This means that you should write something about how this conclusion can in fact be reached. Failing to do so on an exam would likely cost marks.
Comment 3.
While this question generally went quite well this year, there were a handful of confusing solutions which involved statements like
\begin{gather*} [f]_{C^{0,\alpha}} = \frac{\abs{f(x)-f(y)}}{\abs{x-y}^\alpha} = \frac{\abs{f(x)-f(y)}}{\abs{x-y}^{\beta}} \abs{x-y}^{\beta-\alpha} = [f]_{C^{0,\beta}} \abs{x-y}^{\beta-\alpha} \end{gather*}
where the supremum in the definition of the Hölder seminorm has disappeared, making everything an equality, and the logical status of \(x,y\) becomes deeply unclear. Such solutions would have received few (if any) marks on an exam.