This is a note written for my Fall 2016 Math 100 class at Brown University. We are currently learning about various tests for determining whether series converge or diverge. In this note, we collect these tests together in a single document. We give a brief description of each test, some indicators of when each test would be good to use, and a prototypical example for each. Note that we do not justify any of these tests here; we've discussed that extensively in class. [But if something is unclear, send me an email or head to my office hours]. This note is here to remind us of the variety of convergence tests available.

A copy of just the statements of the tests, put together, can be found here. A pdf copy of this whole post can be found here.

In order, we discuss the following tests:

- The $n$th term test, also called the basic divergence test
- Recognizing an alternating series
- Recognizing a geometric series
- Recognizing a telescoping series
- The Integral Test
- P-series
- Direct (or basic) comparison
- Limit comparison
- The ratio test
- The root test

## The $n$th term test

### Statement

Suppose we are looking at $\sum_{n = 1}^\infty a_n$ and

\begin{equation}

\lim_{n \to \infty} a_n \neq 0. \notag

\end{equation}

Then $\sum_{n = 1}^\infty a_n$ does not converge.

### When to use it

When applicable, the $n$th term test for divergence is usually the easiest and quickest way to confirm that a series diverges. When first considering a series, it’s a good idea to think about whether the terms go to zero or not. But remember that if the limit of the individual terms is zero, then it is necessary to think harder about whether the series converges or diverges.

### Example

Each of the series

\begin{equation}

\sum_{n = 1}^\infty \frac{n+1}{2n + 4}, \quad \sum_{n = 1}^\infty \cos n, \quad \sum_{n = 1}^\infty \sqrt{n} \notag

\end{equation}

diverges, since in each case the terms do not tend to $0$ (in the first the terms tend to $\frac{1}{2}$, in the second they have no limit at all, and in the third they tend to $\infty$).
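As a quick numerical sanity check (a Python sketch, purely illustrative and not part of the course material), we can tabulate the terms of the first series and watch them approach $\frac{1}{2}$ rather than $0$:

```python
# Illustrative check: the terms (n + 1)/(2n + 4) tend to 1/2, not 0,
# so the series diverges by the nth term test.
def term(n):
    return (n + 1) / (2 * n + 4)

for n in (1, 10, 100, 10000):
    print(n, term(n))
```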

## Recognizing alternating series

### Statement

Suppose $\sum_{n = 1}^\infty (-1)^n a_n$ is a series where

- $a_n \geq 0$,
- $a_n$ is decreasing, and
- $\lim_{n \to \infty} a_n = 0$.

Then $\sum_{n = 1}^\infty (-1)^n a_n$ converges.

Stated differently, if the terms are alternating sign, decreasing in absolute size, and converging to zero, then the series converges.

### When to use it

The key is in the name — if the series is alternating, then this is the go-to method of analysis. Note that if the terms of a series alternate and decrease, but *do not* go to zero, then the series diverges by the $n$th term test.

### Example

Suppose we are looking at the series

\begin{equation}

\sum_{n = 1}^\infty \frac{(-1)^n}{\log(n+1)} = \frac{-1}{\log 2} + \frac{1}{\log 3} + \frac{-1}{\log 4} + \cdots \notag

\end{equation}

The terms are alternating.

The sizes of the terms are $\frac{1}{\log (n+1)}$, and these are decreasing.

Finally,

\begin{equation}

\lim_{n \to \infty} \frac{1}{\log(n+1)} = 0. \notag

\end{equation}

Thus the alternating series test applies and shows that this series converges.
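For a numerical illustration (a Python sketch, not something the test requires), we can compute partial sums and watch the characteristic alternating-series behavior: consecutive partial sums bracket the limit, with even-indexed sums above it and odd-indexed sums below it.

```python
import math

# Partial sums of sum_{n=1}^{N} (-1)^n / log(n + 1).
def partial_sum(N):
    return sum((-1) ** n / math.log(n + 1) for n in range(1, N + 1))

# Even-indexed partial sums sit above the limit, odd-indexed ones
# below, and the gap shrinks like the size of the next term.
for N in (10, 100, 1000):
    print(N, partial_sum(N))
```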

## Recognizing geometric series

A geometric series is a series of the form

\begin{equation}

a + ar + ar^2 + \cdots + ar^n + \cdots \notag

\end{equation}

where we sometimes call $a$ the *first term* and we call $r$ the *common ratio*.

### Statement

Given a geometric series

\begin{equation}

\sum_{n = 0}^\infty a r^n, \notag

\end{equation}

the series converges exactly when $\lvert r \rvert < 1$. If $\lvert r \rvert \geq 1$ (and $a \neq 0$), then the series diverges.

Further, if $\lvert r \rvert < 1$ (so that the series converges), then the series converges to

\begin{equation}

\sum_{n = 0}^\infty a r^n = \frac{a}{1-r}. \notag

\end{equation}

### When to use it

(At the risk of pointing out the obvious): given a geometric series, one should always determine its convergence by considering the ratio.

### Example

Suppose we are considering the series

\begin{equation}

\sum_{n = 0}^\infty \frac{3}{4} \frac{2^{n+1}}{3^n} = \frac{6}{4} + \frac{12}{12} + \frac{24}{36} + \cdots \notag

\end{equation}

This is a geometric series with first term $a = \frac{6}{4} = \frac{3}{2}$ and common ratio $r = \frac{2}{3}$. Since $\lvert r \rvert < 1$, this geometric series converges, and it converges to $\frac{3}{2} \cdot \frac{1}{1 - \frac{2}{3}} = \frac{9}{2}$.
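We can corroborate this numerically. The following Python sketch (illustrative only) sums the first several terms and watches the partial sums approach $\frac{a}{1-r} = \frac{9}{2}$:

```python
# Partial sums of the geometric series with a = 3/2, r = 2/3; they
# should approach a / (1 - r) = (3/2) / (1/3) = 9/2 = 4.5.
a, r = 3 / 2, 2 / 3

def partial_sum(N):
    return sum(a * r ** n for n in range(N + 1))

for N in (5, 20, 50):
    print(N, partial_sum(N))
```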

## Recognizing a telescoping series

A series is said to telescope if, after some point, all the terms in the series cancel with later terms in the series. These series are often easier to recognize after writing out several terms in the series (and perhaps after performing a partial fraction decomposition).

Note that telescoping series are some of the few series that you can actually evaluate exactly (when they converge). And note that not every telescoping series converges!

### Example

Suppose we are considering the series

\begin{equation}

\sum_{n = 1}^\infty \frac{1}{n(n+1)}. \notag

\end{equation}

By expanding the terms through partial fraction decomposition, we rewrite this series as

\begin{equation}

\sum_{n = 1}^\infty \left( \frac{1}{n} - \frac{1}{n+1} \right) = \left( \frac{1}{1} - \frac{1}{2} \right) + \left( \frac{1}{2} - \frac{1}{3} \right) + \left( \frac{1}{3} - \frac{1}{4} \right) + \cdots. \notag

\end{equation}

Notice that every term after the first $\frac{1}{1} = 1$ cancels with a later term, so the $N$th partial sum is $1 - \frac{1}{N+1}$, which tends to $1$. Thus this series telescopes, and it converges to $1$.
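The telescoping can be checked exactly with rational arithmetic. The Python sketch below (illustrative only, using the standard `fractions` module) confirms that the $N$th partial sum is $1 - \frac{1}{N+1}$:

```python
from fractions import Fraction

# Exact partial sums of sum 1/(n(n+1)); they telescope to 1 - 1/(N+1).
def partial_sum(N):
    return sum(Fraction(1, n * (n + 1)) for n in range(1, N + 1))

for N in (3, 10, 100):
    print(N, partial_sum(N))
```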

## The Integral Test

Integrals and infinite series both arise as limits of sums. The idea of the integral test is that for a nice function $f$, the series $\displaystyle \sum_{n = 1}^\infty f(n)$ is approximately $\displaystyle \int_1^\infty f(x) \, dx$, and so the two either both converge or both diverge.

### Statement

Suppose that $f(x)$ is a continuous, positive, decreasing function on $[1, \infty)$. Then the series $\displaystyle \sum_{n = 1}^\infty f(n)$ and the integral $\displaystyle \int_1^\infty f(x) \, dx$ either both converge, or both diverge.

### When to use it

If you recognize a function that you can integrate, then the integral test is very useful. In particular, if you see a function and its derivative (for use in $u$-substitution), then the integral test is a good idea.

### Example

Suppose we are examining the series

\begin{equation}

\sum_{n = 2}^\infty \frac{1}{n (\log n)^{1/2}}. \notag

\end{equation}

The function $f(x) = \frac{1}{x (\log x)^{1/2}}$ is positive and decreasing, so we apply the integral test. Our series will converge or diverge depending on whether the following integral converges or diverges:

\begin{equation}

\int_2^\infty \frac{1}{x (\log x)^{1/2}}dx. \notag

\end{equation}

Setting $u = \log x$, this (improper) integral becomes

\begin{equation}

\lim_{b \to \infty} \int_{\log 2}^b \frac{1}{u^{1/2}} \, du = \lim_{b \to \infty} 2u^{1/2} \bigg|_{\log 2}^b = \lim_{b \to \infty} \left( 2b^{1/2} - 2(\log 2)^{1/2} \right) = \infty. \notag

\end{equation}

So the integral diverges, and thus the series diverges.
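A numerical sketch (in Python, purely illustrative) shows the partial sums tracking the antiderivative $2 (\log x)^{1/2}$ and growing without bound, just as the integral test predicts:

```python
import math

# Partial sums of 1/(n sqrt(log n)), compared against the antiderivative
# 2 sqrt(log x); both grow without bound, at comparable rates.
def partial_sum(N):
    return sum(1 / (n * math.sqrt(math.log(n))) for n in range(2, N + 1))

for N in (10**2, 10**4, 10**6):
    print(N, partial_sum(N), 2 * math.sqrt(math.log(N)))
```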

## P-series

P-series concern the behavior of the series

\begin{equation}

\sum_{n = 1}^\infty \frac{1}{n^p}. \notag

\end{equation}

These follow immediately from the integral test, but they happen to be very useful for comparison tests (and for building intuition).

### Statement

The series

\begin{equation}

\sum_{n = 1}^\infty \frac{1}{n^p} \notag

\end{equation}

converges if $p > 1$ and diverges if $p \leq 1$.

### When to use it

On its own, it’s only occasionally useful. But its power comes when you use this as a basis for comparison in the Direct Comparison or Limit Comparison tests.

### Example

The series

\begin{equation}

\sum_{n = 1}^\infty \frac{1}{n} \notag

\end{equation}

diverges, as it is a p-series with $p = 1$.

[Note that in this series, the individual terms go to zero and yet the series still diverges!]
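A quick Python sketch (illustrative only) makes this vivid: the partial sums of the harmonic series do grow without bound, but only logarithmically.

```python
import math

# Partial sums of the harmonic series grow like log N (plus a constant),
# so the series diverges even though the terms 1/n tend to zero.
def harmonic(N):
    return sum(1 / n for n in range(1, N + 1))

for N in (10, 10**3, 10**5):
    print(N, harmonic(N), math.log(N))
```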

## Direct comparison

For many *interesting* series, it is easier to compare against other, simpler-to-understand series. In particular, it is usually easier to identify either

- a larger convergent series, or
- a smaller divergent series.

### Statement

Suppose we are considering the two series

\begin{equation}

\sum_{n = 0}^\infty a_n \quad \text{and} \quad \sum_{n = 0}^\infty b_n, \notag

\end{equation}

where $a_n \geq 0$ and $b_n \geq 0$. Suppose further that

\begin{equation}

a_n \leq b_n \notag

\end{equation}

for all $n$ (or for all $n$ after some particular $N$). Then

\begin{equation}

0 \leq \sum_{n = 0}^\infty a_n \leq \sum_{n = 0}^\infty b_n. \notag

\end{equation}

Further, if $\displaystyle \sum_{n = 0}^\infty a_n$ diverges, then so does $\displaystyle \sum_{n = 0}^\infty b_n$. And if $\displaystyle \sum_{n = 0}^\infty b_n$ converges, then so does $\displaystyle \sum_{n = 0}^\infty a_n$.

This can be restated in the following informal way: if the bigger one converges, then so does the smaller. And in the other direction, if the smaller one diverges, then so does the larger.

### When to use it

The two comparison tests don’t have clear times to use them. But the idea is this: once you have a suspicion that a series converges or diverges, it is often a good idea to try to simplify the expressions for the terms by bounding them (in the correct direction!).

Some terms that are often good to bound are trigonometric terms (like bounding $-1 \leq \sin x \leq 1$).

### Example

Suppose we are considering the series

\begin{equation}

\sum_{n = 1}^\infty \frac{2 + \sin n}{n(n+1)}. \notag

\end{equation}

Roughly speaking, for large $n$ the terms look like $\frac{1}{n^2}$, so it is natural to think that this series converges. So we want to compare to a larger, convergent series. To do this we will bound the numerator above and bound the denominator below. Notice that the numerator is bounded above by $2 + \sin n \leq 3$, and the denominator is bounded below by $n(n+1) \geq n^2$.

Together, these mean that $\frac{2+\sin n}{n(n+1)} \leq \frac{3}{n^2}$, and so

\begin{equation}

\sum_{n = 1}^\infty \frac{2 + \sin n}{n(n+1)} \leq \sum_{n = 1}^\infty \frac{3}{n^2}. \notag

\end{equation}

Since the series on the right converges by P-series ($2 > 1$), by direct comparison the original series converges.
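As a sanity check (a Python sketch, not part of the argument), we can verify both the term-by-term bound and that the partial sums stay below $3 \cdot \frac{\pi^2}{6}$, using the known value $\sum 1/n^2 = \pi^2/6$:

```python
import math

# Each partial sum of (2 + sin n)/(n(n+1)) is bounded above by the
# corresponding partial sum of 3/n^2, hence by 3 * (pi^2 / 6).
def a(n):
    return (2 + math.sin(n)) / (n * (n + 1))

s = sum(a(n) for n in range(1, 10**5 + 1))
print(s, 3 * math.pi**2 / 6)
```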

## Limit comparison

For most series, the first step is to consider what the series *looks like* for large $n$. Essentially, this means that we try to identify the parts contributing the largest growth in $a_n$. The idea of limit comparison is to relate the series to what it *looks like* for large $n$.

### Statement

Suppose we are considering the series

\begin{equation}

\sum_{n = 0}^\infty a_n \quad \text{and} \quad \sum_{n = 0}^\infty b_n, \notag

\end{equation}

where $a_n \geq 0$ and $b_n \geq 0$. If

\begin{equation}

\lim_{n \to \infty} \frac{a_n}{b_n} = L \notag

\end{equation}

and $L \neq 0, \infty$, then the two series either both converge or both diverge.

[Recall that we discussed a stronger version of this statement in class, concerning what can be said when $L = 0$ or $L = \infty$. We don’t reinclude that here.]

### When to use it

Limit comparison is a very powerful tool that can be used to remove a lot of the unimportant and smaller parts of terms of a series.

### Example

The classic example is to handle gross ratios of polynomials. Suppose we are considering the series

\begin{equation}

\sum_{n = 10}^\infty \frac{2n^3 + 4n - 4}{4n^{7/2} - n^2 + 5n - 1}. \notag

\end{equation}

For large $n$, the $2n^3$ in the numerator is much larger than the rest of the numerator, and the $4n^{7/2}$ in the denominator is much larger than the rest of the denominator. So the terms *look like* $\frac{2n^3}{4n^{7/2}} = \frac{1}{2n^{1/2}}$ for large $n$.

We perform a limit comparison test, comparing our series against the series

\begin{equation}

\sum_{n=10}^\infty \frac{1}{2n^{1/2}}. \notag

\end{equation}

To do this, we compute the limit

\begin{equation}

\lim_{n \to \infty} \frac{2n^3 + 4n - 4}{4n^{7/2} - n^2 + 5n - 1} \cdot 2n^{1/2}. \notag

\end{equation}

Notice that, after multiplying, the degrees of the numerator and denominator are each $7/2$, so the limit is the ratio of the leading coefficients, $\frac{2 \cdot 2}{4} = 1$.

Since the limit exists and is not equal to $0$ or $\infty$, the two series either both converge or both diverge. Since $\displaystyle \sum_{n = 10}^\infty \frac{1}{2n^{1/2}}$ diverges by P-series ($1/2 < 1$), our original series diverges, by limit comparison.
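A numerical sketch (Python, illustrative only) of the ratio $a_n / b_n$ shows it settling down to $1$, as computed above:

```python
# a(n): the terms of the series; b(n): what they look like for large n.
def a(n):
    return (2 * n**3 + 4 * n - 4) / (4 * n**3.5 - n**2 + 5 * n - 1)

def b(n):
    return 1 / (2 * n**0.5)

# The ratios a_n / b_n should settle down to the limit L = 1.
for n in (10, 10**3, 10**6):
    print(n, a(n) / b(n))
```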

## The ratio test

The ratio test is built on the idea that if the $(n+1)$st term of a series is approximately a fixed ratio $r$ times the $n$th term of a series (for sufficiently large $n$), then reasoning similar to the reasoning behind geometric series applies.

### Statement

Suppose we are considering

\begin{equation}

\sum_{n = 0}^\infty a_n. \notag

\end{equation}

Suppose that the following limit exists:

\begin{equation}

\lim_{n \to \infty} \frac{\lvert a_{n+1} \rvert}{\lvert a_n \rvert} = r. \notag

\end{equation}

Then if $r < 1$, the series converges absolutely. If $r > 1$, the series diverges.

If $r = 1$, then this test is inconclusive and one must try other techniques.

### When to use it

If you see factorials, the ratio test is probably a good thing to use. (It is also very useful when finding the radius of convergence of a power series.)

### Example

Suppose that we are considering

\begin{equation}

\sum_{n = 0}^\infty \frac{n^2}{n!}. \notag

\end{equation}

To apply the ratio test, we consider the limit of the ratio

\begin{equation}

\lim_{n \to \infty} \frac{(n+1)^2}{(n+1)!} \frac{n!}{n^2} = \lim_{n \to \infty} \frac{(n+1)^2 n!}{n^2 (n+1) n!} = \lim_{n \to \infty} \frac{n+1}{n^2} = 0. \notag

\end{equation}

In the last step, we recognized that the degree of the polynomial in the denominator was larger than the degree in the numerator.

Since the limit is $0$, and $0 < 1$, the series converges absolutely.
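Numerically (a Python sketch, illustrative only), the successive ratios shrink toward $0$ quickly:

```python
import math

# Successive ratios a_{n+1} / a_n for a_n = n^2 / n!; these tend to 0,
# so the ratio test gives absolute convergence.
def a(n):
    return n**2 / math.factorial(n)

for n in (5, 10, 20):
    print(n, a(n + 1) / a(n))
```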

## The root test

The root test is based on intuition similar to that behind the ratio test: if the $n$th term is approximately of the form $r^n$ for some ratio $r$, then reasoning like that behind geometric series applies.

### Statement

Suppose that we are considering

\begin{equation}

\sum_{n = 0}^\infty a_n. \notag

\end{equation}

If the limit

\begin{equation}

\lim_{n \to \infty} \sqrt[n]{\lvert a_n \rvert} = r \notag

\end{equation}

exists and $r < 1$, then the series converges absolutely. If the limit exists and $r > 1$, then the series diverges.

If the limit does not exist, or if the limit exists and $r = 1$, then the test is inconclusive and one must try something else.

### When to use it

If everything (or almost everything) is raised to the $n$th power, then the root test is a good test to try. It is also often useful when finding the interval of convergence of power series.

### Example

Suppose that we are considering

\begin{equation}

\sum_{n = 0}^\infty \left(\frac{2n}{3n+1}\right)^{2n}. \notag

\end{equation}

Then to apply the root test, we consider the limit

\begin{equation}

\lim_{n \to \infty} \sqrt[n] {\left( \frac{2n}{3n+1}\right)^{2n}} = \lim_{n \to \infty} \left( \frac{2n}{3n+1}\right)^{2n/n} = \lim_{n \to \infty} \left(\frac{2n}{3n+1}\right)^2, \notag

\end{equation}

which we can rewrite as

\begin{equation}

\lim_{n \to \infty} \frac{4n^2}{(3n+1)^2} = \frac{4}{9}. \notag

\end{equation}

In the last step, we evaluated the limit by noting that the degree of the numerator matches the degree of the denominator, so the limit is the ratio of the leading coefficients, giving $4/9$.

Since $4/9 < 1$, the series converges absolutely by the root test.
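As a numerical check (a Python sketch, illustrative only), the $n$th roots approach $4/9$. We work with logarithms, since $a_n$ itself underflows to $0$ in floating point for large $n$:

```python
import math

# nth roots of a_n = (2n / (3n + 1))^(2n), computed via logarithms
# because a_n itself underflows to 0.0 in floating point for large n.
def nth_root(n):
    log_a_n = 2 * n * math.log(2 * n / (3 * n + 1))
    return math.exp(log_a_n / n)

for n in (10, 100, 10**4):
    print(n, nth_root(n), 4 / 9)
```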

## Concluding Remarks

This is an overview of each test that we have learned thus far. Note that sometimes, when confronted with a new series, the first technique that you try won’t work out. And that’s ok! It may be necessary to try a few techniques, or perhaps even combine various tests together in order to understand the convergence or divergence of a series.

If there are any questions, let me know. Good luck, and I’ll see you in class.