Tuesday, January 19, 2016

Cosine and 0.739

Playing around with the cosine or sine function, one may find that upon iteration, both functions begin to behave unexpectedly 'nicely'.

What I mean by this is as follows:

0. Make sure you are computing everything in RADIANS.

1. Take any number ($1$, $2$, $0.7$, whatever)
2. Compute $\cos$(your number)
3. Compute $\cos$($\cos$(your number))
4. Compute $\cos$($\cos$($\cos$(your number)))
5. Compute $\cos$($\cos$($\cos$($\cos$(your number))))
6. You get the idea

If you've made sure your calculator is in radians, this sequence should start getting closer and closer to about $0.7$, and you can perform the same procedure with other numbers (it should still work!).
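The steps above can be sketched in a few lines of Python (the starting value $2.0$ is an arbitrary choice; any real number works):

```python
import math

# Repeatedly apply cosine (math.cos works in radians) to a starting value.
x = 2.0  # any real starting number works
for _ in range(30):
    x = math.cos(x)

print(round(x, 3))  # prints 0.739
```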

Here's a nice graph depicting this process starting at $-1$ (courtesy of Wikipedia):

[Figure: plot of the successive iterates of cosine starting at $-1$, converging to the fixed point.]
Perplexed by this, I did some research, and came across an interesting theorem on Wikipedia.

Theorem (à la Wikipedia) - If a function $f(x)$ has a fixed point at $x = x_0$ and is continuously differentiable in an open neighborhood of $x_0$ with $|f'(x_0)| < 1$, attraction at $x_0$ is guaranteed.

Now the 'attraction is guaranteed' part seemed a bit vague. So I tried to reformulate the theorem so that it was a bit more precise, and then furnished a proof. Both of which are presented below:

Theorem 1 - Suppose $x_0 \in [a,b]$ is a point such that $f(x_0) = x_0$. Given that $f$ is continuously differentiable at $x_0$ with $|f'(x_0)|<1$, there exists a $\delta > 0$ such that $\forall x \in (x_0 - \delta , x_0 + \delta)$, the sequence $f^n(x)$ converges to $x_0$ as $n \to \infty$. Here $f^n(x)$ denotes $f$ iterated $n$ times, i.e. composed with itself $n$ times.

Proof - $f'$ is continuous at $x_0$. Choose $\epsilon > 0$ and $p > 1$ such that $\epsilon + |f'(x_0)| = \frac{1}{p}$; this is possible precisely because $|f'(x_0)| < 1$. By continuity, there must exist $\delta > 0$ such that $\forall x \in (x_0 - \delta, x_0 + \delta)$, $|f'(x) - f'(x_0)| < \epsilon$.

So by the reverse triangle inequality, we can deduce $|f'(x)| < \frac{1}{p}$ $\forall x\in (x_0 - \delta, x_0 + \delta)$.

Now choose some $z \in (x_0 - \delta, x_0 + \delta) $. Consider $f(z) - f(x_0)$.

By the mean-value theorem, $\exists c_1$ between $z$ and $x_0$ such that $\frac{f(z) - f(x_0)}{z - x_0} = f'(c_1)$.

Since $c_1$ lies between $z$ and $x_0$, we have $c_1 \in (x_0 - \delta, x_0 + \delta)$, and therefore $|f(z) - f(x_0)| = |z-x_0||f'(c_1)| < |z-x_0|\frac{1}{p}$.

Now suppose, for the inductive step, that $|f^k(z) - f^k(x_0)| < |z-x_0|\frac{1}{p^k}$ for some integer $k \geq 1$ (the case $k = 1$ is exactly the estimate above).

Now consider $\frac{f(f^k(z)) - f(f^k(x_0))}{f^k(z) - f^k(x_0)} = f'(c_{k+1})$ by the mean value theorem.

Here $c_{k+1} \in (x_0 - \delta, x_0 + \delta)$: it lies between $f^k(z)$ and $f^k(x_0)$, and since $f^k(x_0) = x_0$ (by definition of a fixed point) and $|f^k(z) - x_0| < |z-x_0|\frac{1}{p^k} < \delta$ (by the inductive hypothesis), both of those endpoints lie in the interval.

Then $|f^{k+1}(z) -  f^{k+1}(x_0)| < |f^k(z) - f^k(x_0)|\frac{1}{p} < |z-x_0|\frac{1}{p^{k+1}}$.


The theorem now follows by induction: $|f^n(z) - x_0| < |z-x_0|\frac{1}{p^n} \to 0$ as $n \to \infty$, so $f^n(z) \to x_0$ by the definition of convergence.

$\Box$
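As a quick numerical sanity check of the proof (not part of it), the geometric bound $|f^n(z) - x_0| < |z - x_0|\frac{1}{p^n}$ can be tested for $f = \cos$. Near the fixed point $x_0 \approx 0.739$, say on $(x_0 - 0.1, x_0 + 0.1)$, we have $|\cos'(x)| = |\sin(x)| < 0.75$, so $\frac{1}{p} = 0.75$ works; the radius $0.1$, the constant $0.75$, and the starting offset below are my own choices:

```python
import math

# Find the fixed point x0 of cos by iterating until it stabilizes.
x0 = 2.0
for _ in range(200):
    x0 = math.cos(x0)

# On (x0 - 0.1, x0 + 0.1) we have |cos'(x)| = |sin(x)| < 0.75,
# so the proof predicts |cos^n(z) - x0| < |z - x0| * 0.75**n there.
z = x0 + 0.09
iterate = z
for n in range(1, 20):
    iterate = math.cos(iterate)
    assert abs(iterate - x0) < abs(z - x0) * 0.75 ** n

print("geometric bound holds")
```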

Now how does this explain why $\cos(x)$ behaves this nicely upon iteration?

Well, notice that in the proof above, all that was really required for the convergence of $f^n(x)$ was that $|f'(x)|$ be bounded by some constant less than $1$ for all $x$ in some neighborhood of a fixed point of $f$.

It isn't difficult to see that $\cos(x)$ has a fixed point, i.e. that $\cos(x)$ intersects the line $y=x$ at some point: $\cos(x) - x$ is positive at $x = 0$ and negative at $x = 1$, so it vanishes somewhere in between. (Namely at $0.739$, to three decimal places.)
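Since $\cos(x) - x$ changes sign between $0$ and $1$, plain bisection recovers the quoted value (a small sketch; the bracket $[0,1]$ and the iteration count are arbitrary choices):

```python
import math

# Bisect on g(x) = cos(x) - x, which is positive at 0 and negative at 1.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if math.cos(mid) - mid > 0:
        lo = mid  # root lies to the right of mid
    else:
        hi = mid  # root lies to the left of mid

print(round(lo, 3))  # prints 0.739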

Furthermore, on the entire interval $[0,\frac{\pi}{2})$ we have the desired fact $|\cos'(x)| = |\sin(x)| < 1$. In fact, $\cos$ maps $[0,\frac{\pi}{2}]$ into $[0,1]$, and on $[0,1]$ we have $|\sin(x)| \leq \sin(1) < 1$, so after a single step the derivative is bounded by a constant strictly below $1$.

Thus, by the theorem we proved above (with this uniform bound playing the role of $\frac{1}{p}$), iterating cosine for any $x$ in $[0,\frac{\pi}{2})$ will yield convergence to approximately $0.739$.

Now it isn't too hard to extend this to every real starting value (cosine's actual domain is all of $\mathbb{R}$, and the very first iteration already lands in $[-1,1]$), using some properties of cosine, but I'll omit this part. If you want to try proving it for yourself, the fact that cosine is an even function may prove to be useful.
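One can at least check this numerically: starting points scattered across $[0, 2\pi]$ and well beyond (the particular values below are arbitrary) all end up at the same fixed point:

```python
import math

# Iterate cosine from several starting values; the first application of cos
# already lands in [-1, 1], after which the contraction behavior takes over.
starts = [0.0, 1.0, 2.5, math.pi, 5.0, 2 * math.pi, -3.0, -100.0]
limits = []
for s in starts:
    x = s
    for _ in range(100):
        x = math.cos(x)
    limits.append(x)

# Every limit satisfies cos(x) = x to high precision.
print(all(abs(math.cos(x) - x) < 1e-9 for x in limits))  # prints True
```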

After a bit of reflecting, I realized an even 'cheaper' form of Theorem 1 could be proved, which doesn't require $f$ to be continuously differentiable at $x_0$; rather, only differentiability at $x_0$ is needed.

This 'cheaper' version of Theorem 1 is presented and proved below:

Theorem 1 (Cheap version) - Suppose $x_0 \in [a,b]$ is a point such that $f(x_0) = x_0$. Given that $f$ is differentiable at $x_0$ with $|f'(x_0)|<1$, there exists a $\delta > 0$ such that $\forall x \in (x_0 - \delta , x_0 + \delta)$, the sequence $f^n(x)$ converges to $x_0$ as $n \to \infty$. Here $f^n(x)$ denotes $f$ iterated $n$ times, i.e. composed with itself $n$ times.

Proof - Since $f'(x_0)$ exists, for every $k > 0$ there exists $\delta>0$ such that $\forall x\in (x_0-\delta, x_0 + \delta)$ with $x \neq x_0$, $\left|\frac{f(x) - f(x_0)}{x-x_0} - f'(x_0)\right| < k$. Choose $k>0$ and $p>1$ such that $|f'(x_0)| + k = \frac{1}{p}$, which is possible because $|f'(x_0)| < 1$.

Then we have, $\forall x\in (x_0-\delta, x_0+\delta)$ with $x \neq x_0$, $\left|\frac{f(x) - f(x_0)}{x-x_0}\right| < \frac{1}{p}$, so that $|f(x) - f(x_0)| < |x-x_0|\frac{1}{p}$.

Now we similarly employ an inductive hypothesis, namely, $|f^k(x) - f^k(x_0)| < |x-x_0|\frac{1}{p^k} $ for integral $k \geq 1$.

By the inductive hypothesis, $|f^k(x) - x_0| < |x-x_0|\frac{1}{p^k} < \delta$, so we see that $f^k(x) \in (x_0-\delta, x_0 + \delta)$.

Therefore, by the estimate above applied at the point $f^k(x)$, $|f(f^k(x)) - f(x_0)| < |f^k(x) - x_0|\frac{1}{p}$.

Now employing the inductive hypothesis (and recalling $f(x_0) = x_0$), $|f^{k+1}(x) - x_0| < |x-x_0|\frac{1}{p^{k+1}}$, and the theorem again follows by induction.

$\Box$
