Remember, from now on we call $\sigma_0=\tau$ and $\sigma_1=\sigma$. There are so many interesting things to be said about these functions!
Here is one. $$\left(\sum_{d\mid n}\tau(d)\right)^2=\sum_{d\mid n}\tau(d)^3$$
The easiest way to prove this is to use multiplicativity: both sides are multiplicative functions of $n$, so it suffices to check the identity on prime powers $p^e$, where it reduces to the classical identity $(1+2+\cdots+(e+1))^2=1^3+2^3+\cdots+(e+1)^3$.
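Before proving it, one can at least check the identity numerically. Here is a quick sketch in Python (the brute-force helpers `divisors` and `tau` are my own, not anything standard):

```python
def divisors(n):
    """All positive divisors of n, by brute force."""
    return [d for d in range(1, n + 1) if n % d == 0]

def tau(n):
    """Number of divisors of n."""
    return len(divisors(n))

# Check (sum of tau(d))^2 == sum of tau(d)^3, d ranging over divisors of n
for n in range(1, 201):
    lhs = sum(tau(d) for d in divisors(n)) ** 2
    rhs = sum(tau(d) ** 3 for d in divisors(n))
    assert lhs == rhs, n

print("identity verified for n = 1, ..., 200")
```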
Here's another largely open question which seems like it should be easy... what rational numbers can be gotten as $\frac{\sigma(n)}{n}$? (Sometimes this ratio is called the abundancy index of $n$.)
There are some interesting theorems about this out there. For one thing, the abundancy index is also the same thing as $\sigma_{-1}(n)$, the sum of the reciprocals of the divisors of $n$; this is not hard to prove, but instructive.
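As a sanity check, here is a short Python sketch verifying that $\sigma(n)/n=\sigma_{-1}(n)$ for small $n$ (the `sigma` helper is my own, using `fractions.Fraction` for exact arithmetic):

```python
from fractions import Fraction

def sigma(n, k=1):
    """Sum of the k-th powers of the divisors of n (exact, via Fractions)."""
    return sum(Fraction(d) ** k for d in range(1, n + 1) if n % d == 0)

# The abundancy index sigma(n)/n equals sigma_{-1}(n), the sum of the
# reciprocals of the divisors: pair each divisor d with n/d.
for n in range(1, 101):
    assert sigma(n, 1) / n == sigma(n, -1)

print(sigma(6, 1) / 6)  # a perfect number such as 6 has abundancy exactly 2
```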
A more nuanced view will ask what rational numbers are possible for the abundancy. Clearly all such numbers are in the interval $\left[1,\infty\right)$! Here are some more facts.
What is particularly interesting about this is the connection to something we silently omitted last time: the question of whether there are odd perfect numbers!
Fact: $$\text{If }\frac{5}{3}\text{ is the abundancy index of }N\text{, then }5N\text{ is an odd perfect number.}$$
However, we don't know whether this is ever true.
Open Question: $$\text{Does there exist an odd perfect number?}$$
Yikes! This question is still open after two and a half millennia. We do know some things about the question, though. First, recall that $$\frac{\sigma(n)}{n}=\prod_{i=1}^k\frac{p_i-1/p_i^{e_i}}{p_i-1}< \prod_{i=1}^k\frac{p_i}{p_i-1}$$ when $n$ is a product of the prime powers $p_i^{e_i}$.
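Here is a quick Python check of this product formula and the strict upper bound, for small $n$ (the `factorization` and `abundancy` helpers are mine, done by naive trial division):

```python
from fractions import Fraction

def factorization(n):
    """Prime factorization [(p, e), ...] by trial division."""
    factors, p = [], 2
    while p * p <= n:
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        if e:
            factors.append((p, e))
        p += 1
    if n > 1:
        factors.append((n, 1))
    return factors

def abundancy(n):
    return Fraction(sum(d for d in range(1, n + 1) if n % d == 0), n)

for n in range(2, 200):
    product = Fraction(1)
    bound = Fraction(1)
    for p, e in factorization(n):
        product *= (Fraction(p) - Fraction(1, p ** e)) / (p - 1)
        bound *= Fraction(p, p - 1)
    assert abundancy(n) == product < bound, n

print("formula and bound verified for n = 2, ..., 199")
```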
Of course, this is pretty elementary. There are many more criteria. They keep on getting more complicated, so I can't list them all, but here is a selection, including information from two big searches going on right now.
Finally, here is the link to an article about Euler and his own criterion.
Appropriate, since he finished the characterization of even perfect numbers.
Whew! Fun, but at times overwhelming.
Well, we will now move on to a different way of thinking about these same functions. For the next week or so, we will examine limits in number theory, and how integrals and calculus are inextricably bound up with this sort of question.
Our motivational example was the one we discussed a little while ago. Let $r(n)$ denote the number of ways to represent $n$ as a sum of two squares, so that $r(3)=0$ but $r(9)=4$ and $r(5)=8$. Then we saw, more or less rigorously, that $$\lim_{n\to\infty}\frac{1}{n}\sum_{k=1}^n r(k)=\pi\, .$$
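A brute-force Python sketch (the helper `r` here is my own) confirms the small values quoted above and suggests the limit:

```python
import math

def r(n):
    """Number of ways to write n = x^2 + y^2, counting signs and order."""
    count = 0
    for x in range(-math.isqrt(n), math.isqrt(n) + 1):
        y2 = n - x * x
        y = math.isqrt(y2)
        if y * y == y2:
            count += 1 if y == 0 else 2  # y and -y both work
    return count

assert (r(3), r(9), r(5)) == (0, 4, 8)  # the values quoted above

N = 10000
print(sum(r(k) for k in range(1, N + 1)) / N)  # already quite close to pi
```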
But we can say more. One intermediate step in our proof was the following: $$\pi \left(1-\sqrt{\frac{2}{n}}+\frac{1}{2n}\right)\leq \frac{1}{n}\sum_{k=0}^{n}r(k)\leq \pi \left(1+\sqrt{\frac{2}{n}}+\frac{1}{2n}\right)\, ,$$ which we may think of in terms of an error (so using absolute values): $$\left|\frac{1}{n}\sum_{k=0}^{n}r(k)-\pi\right|\leq \pi \left(\frac{\sqrt{2}}{\sqrt{n}}+\frac{1}{2n}\right)\leq Cn^{-1/2}\, ,\text{ for large enough }n,$$ where of course the value of $C$ is not just $\pi\sqrt{2}$, but something a little bigger because of the $\frac{1}{2n}$ term.

(As often happens with such things, the constant we proved is a lot bigger than it needs to be.)
It turns out there is a nice notation for this.
Definition:
We say that $f(x)$ is $O(g(x))$ ("eff of eks is Big Oh of gee of eks") if there is some constant $C$ for which $$|f(x)|\leq Cg(x)\text{ for all large enough }x\, .$$ So the average number of representations of an integer as a sum of two squares is $\pi$, and if you take the average up to $N$, then the error will be no worse than some constant times $1/\sqrt{N}$; that is, the error is Big Oh of $1/\sqrt{N}$.
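One can watch this bound empirically: if the error really is $O(1/\sqrt{n})$, then the error multiplied by $\sqrt{n}$ should stay bounded as $n$ grows. A rough Python sketch (with a brute-force `r` of my own):

```python
import math

def r(n):
    """Brute-force count of representations n = x^2 + y^2 (signs and order count)."""
    count = 0
    for x in range(-math.isqrt(n), math.isqrt(n) + 1):
        y2 = n - x * x
        y = math.isqrt(y2)
        if y * y == y2:
            count += 1 if y == 0 else 2
    return count

# If |average - pi| is O(1/sqrt(n)), then error * sqrt(n) should stay bounded.
total = 0
for k in range(1, 10001):
    total += r(k)
    if k in (100, 1000, 10000):
        error = abs(total / k - math.pi)
        print(k, error * math.sqrt(k))
```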
(This is known as Landau notation. It is unknown in this case, by the way, just how small that error term really is. In 1906 it was shown that you could make it $O(x^{-2/3})$, but it is also known that you cannot make it $O(x^{-3/4})$.)
Now we want to apply these same ideas to the $\tau$ and $\sigma$ functions.
Questions: What is the "average" number of divisors of a positive integer? What is the "average" sum of divisors of a positive integer?
It turns out that clever combinations of many ideas from the semester, together with ideas from calculus, will help us answer these questions! And they will motivate us to ask the (much harder) analogous questions about prime numbers.
This graphic shows how the average value of $\tau$ up to $n$ changes as we let $n$ get bigger. This just isn't enough data to tell whether there is a limiting value for the average value of $\tau(n)$, even if you look out to the first 1000 integers. Remember, every prime number contributes just 2 to the total (and so reduces the average value)! But thinking about this might lead us to look a little deeper.
At the very least I can tell that the average value is Big Oh of a certain function. We go right back to the sieve of Eratosthenes for this.
But that is very important! This means we can get a sense of what the average value of $\tau$ might be. Every divisor $d$ of $k$ pairs with $k/d$, and at least one member of each pair is at most $\sqrt{k}$, so $\tau(k)\leq 2\sqrt{k}$. We certainly have that $$\frac{1}{n}\sum_{k=1}^n \tau(k)\leq \frac{1}{n}\sum_{k=1}^n 2\sqrt{k}=\sum_{k=1}^n \frac{1}{n}2\sqrt{n(k/n)}\, .$$ You might wonder why I wrote the sum in this way. The answer is that this looks an awful lot like a Riemann sum with $\Delta x=\frac{1}{n}$.
You likely recall writing a Riemann sum for $\int_0^1 x^2\; dx$ in the form $$\frac{1}{n}\left(\frac{1}{n}\right)^2+\frac{1}{n}\left(\frac{2}{n}\right)^2+\cdots+\frac{1}{n}\left(\frac{n}{n}\right)^2\; ;$$ this is the same thing for the function $2\sqrt{nx}$. That would give:
$$\sum_{k=1}^n \frac{1}{n}2\sqrt{n(k/n)}\approx \int_0^1 2\sqrt{nx}\; dx=2\sqrt{n}\int_0^1 \sqrt{x}\; dx=\frac{4}{3}\sqrt{n}\, .$$ (See Appendix for the rigorous version.) Then one can write $$\frac{1}{n}\sum_{k=1}^n \tau(k)\text{ is }\frac{1}{n}\sum_{k=1}^n O(\sqrt{k})=O(\sqrt{n})\, ,$$ so that the average value is also bounded by a constant times $\sqrt{n}$, implying perhaps that the average number of divisors goes steadily up!
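Here is a small Python check of both the pointwise bound $\tau(k)\leq 2\sqrt{k}$ and the resulting bound on the average (the divisor-counting helper is mine):

```python
import math

def tau(n):
    """Count divisors of n by walking d up to sqrt(n); each d pairs with n // d."""
    count, d = 0, 1
    while d * d <= n:
        if n % d == 0:
            count += 1 if d * d == n else 2
        d += 1
    return count

n = 10000
total = 0
for k in range(1, n + 1):
    assert tau(k) <= 2 * math.sqrt(k)  # divisors come in pairs (d, k/d)
    total += tau(k)

print(total / n, (4 / 3) * math.sqrt(n))  # the average sits far below the bound
```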
But how does it go up?

Here, out to one million, is once again our graph of $\tau$ versus $n$. Certainly this looks like some kind of fractional-exponent function, though a very slowly growing one, probably slower than $\sqrt{x}$, our first estimate. We'll see what happens next time.
Homework:

Appendix:
To make the calculation of $O(\sqrt{n})$ rigorous, we need a slight change of point of view, to ensure the sum is viewed as a left-hand sum of an increasing function (and hence the Riemann sum is less than the actual value of the integral): $$\frac{1}{n}\sum_{k=1}^n 2\sqrt{k}=\sum_{k=0}^{n-1}\left(\frac{1}{n}\right)2\sqrt{k+1}=\sum_{k=0}^{n-1}\left(\frac{1}{n}\right)2\sqrt{n(k/n)+1}\leq \int_0^1 2\sqrt{nx+1}\; dx\, .$$ And then, since this integral $$=\frac{4}{3}\sqrt{n}\left[\left(1+\frac{1}{n}\right)^{3/2}-\left(\frac{1}{n}\right)^{3/2}\right]$$ and since the big fudge factor on the right can be shown to be decreasing using derivatives, and is always less than $2$ for positive integers $n$, the sum is always less than $\frac{8}{3}\sqrt{n}$, and so our calculation that the average value of $\tau$ is $O(\sqrt{n})$ is correct.
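For what it's worth, all three claims here can be checked numerically for small $n$; the following Python sketch (the `fudge` helper name is mine) verifies that the fudge factor decreases, stays below $2$, and that the left-hand-sum inequality holds:

```python
import math

def fudge(n):
    """The bracketed factor (1 + 1/n)^(3/2) - (1/n)^(3/2)."""
    return (1 + 1 / n) ** 1.5 - (1 / n) ** 1.5

previous = float("inf")
for n in range(1, 1001):
    f = fudge(n)
    assert f < 2           # stays below 2 for positive integers n
    assert f <= previous   # and decreases as n grows
    previous = f
    # the left-hand-sum inequality: (1/n) * sum 2*sqrt(k) <= (4/3)*sqrt(n)*fudge(n)
    lhs = sum(2 * math.sqrt(k) for k in range(1, n + 1)) / n
    assert lhs <= (4 / 3) * math.sqrt(n) * f

print("all three claims verified for n = 1, ..., 1000")
```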
