The number $p$ is a fixed point for a given function $g$ if $g(p) = p$.

Given a root-finding problem $f(p) = 0$, we can define a function $g$ with a fixed point at $p$ as

$g(x) = x - f(x)$,

for example. Conversely, if a function $g$ has a fixed point at $p$, then the function defined by

$f(x) = x - g(x)$

has a zero at $p$.
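A minimal sketch of this conversion, using $f(x) = x^2 - 2$ (my own illustrative choice) whose root is $\sqrt{2}$:

```python
import math

# Root-finding problem: f(x) = x**2 - 2, with root p = sqrt(2).
f = lambda x: x**2 - 2

# One of many possible conversions: g(x) = x - f(x) has a fixed point
# wherever f has a zero, since g(p) = p - f(p) = p - 0 = p.
g = lambda x: x - f(x)

p = math.sqrt(2)
print(abs(g(p) - p))  # ~0: the root of f is a fixed point of g
```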
Th:
If $g \in C[a,b]$ and $g(x) \in [a,b]$ for all $x \in [a,b]$, then $g$ has at least one fixed point in $[a,b]$.

If, in addition, $g$ is Lipschitz continuous on $[a,b]$ with a Lipschitz constant $L < 1$, then there is exactly one fixed point in $[a,b]$.
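The hypotheses can be checked numerically for a concrete map. Here is a sketch for $g(x) = \cos x$ on $[0, 1]$ (an assumed example, not from the notes): $\cos$ maps $[0,1]$ into $[\cos 1, 1] \subset [0,1]$, and $|g'(x)| = |\sin x| \le \sin 1 < 1$ there.

```python
import math

# Check the theorem's hypotheses for g(x) = cos(x) on [0, 1].
g = lambda x: math.cos(x)
xs = [i / 1000 for i in range(1001)]  # sample points in [0, 1]

# g maps [0, 1] into itself: cos is decreasing there, with range [cos 1, 1].
assert all(0 <= g(x) <= 1 for x in xs)

# |g'(x)| = |sin x| <= sin(1) < 1, so L = sin(1) serves as a Lipschitz constant.
L = math.sin(1)
print(L)  # ≈ 0.841 < 1, so exactly one fixed point exists in [0, 1]
```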
Fixed-Point (Functional) Iteration
```python
def fixed_point_iteration(g, x0, tolerance: float = 1e-6, max_iterations: int = 1_000):
    x_current = x0
    for _ in range(max_iterations):
        x_next = g(x_current)
        if abs(x_next - x_current) < tolerance:
            return x_next
        x_current = x_next
    print('Did not converge within the maximum iterations')
    return x_current

# g(x) = (x + 2/x) / 2 has a fixed point at sqrt(2)
# (this is Newton's iteration for x**2 - 2 written as a fixed-point problem).
g = lambda x: 0.5 * (x + 2 / x)
print(fixed_point_iteration(g, 1))  # ≈ 1.4142135...
```
Fixed Point Theorem
Let $g \in C[a,b]$ be such that $g(x) \in [a,b]$ for all $x \in [a,b]$. Suppose that $g$ is Lipschitz continuous with Lipschitz constant $L < 1$. Then for any $p_0 \in [a,b]$, the sequence defined by

$p_n = g(p_{n-1}), \quad n \ge 1,$

converges to the unique fixed point $p$ in $[a,b]$.
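As a quick illustration of the theorem (the map $g(x) = \cos x$ on $[0,1]$ is my own choice), iterating from any starting point in the interval converges to the same fixed point:

```python
import math

g = math.cos
p = 0.0  # any starting point in [0, 1] works, per the theorem
for n in range(100):
    p = g(p)
print(p)  # ≈ 0.7390851, the unique fixed point of cos on [0, 1]
```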
Cor: Let $g \in C[a,b]$ be such that $g(x) \in [a,b]$ for all $x \in [a,b]$. Suppose that $g$ is Lipschitz continuous with Lipschitz constant $L < 1$. Then the bounds for the error involved in using $p_n$ to approximate the fixed point $p$ are given by

$|p_n - p| \le L^n \max\{\, p_0 - a,\; b - p_0 \,\}$

and

$|p_n - p| \le \dfrac{L^n}{1 - L}\,|p_1 - p_0|, \quad n \ge 1.$
Meaning that the error decreases at least geometrically, i.e. $|p_n - p| = O(L^n)$. With this, given a tolerance $\varepsilon$, we can find the least $n$ necessary to guarantee a precision of $\varepsilon$: requiring $\frac{L^n}{1-L}|p_1 - p_0| \le \varepsilon$ and solving for $n$ gives

$n \ge \dfrac{\ln\!\bigl(\varepsilon(1 - L)/|p_1 - p_0|\bigr)}{\ln L}$

(the inequality flips because $\ln L < 0$).
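A sketch of this a-priori estimate in code, again using the assumed example $g(x) = \cos x$ with $L = \sin 1$ and a starting point $p_0 = 0.5$ of my choosing; the bound is typically pessimistic compared to the iterations actually needed:

```python
import math

g = math.cos
L = math.sin(1)   # Lipschitz constant of cos on [0, 1]
p0 = 0.5
p1 = g(p0)
tol = 1e-6

# Least n with L**n / (1 - L) * |p1 - p0| <= tol, from the corollary's bound:
n_bound = math.ceil(math.log(tol * (1 - L) / abs(p1 - p0)) / math.log(L))
print(n_bound)

# Count the iterations actually needed to reach the tolerance:
p_prev, p, n = p0, p1, 1
while abs(p - p_prev) > tol:
    p_prev, p = p, g(p)
    n += 1
print(n)  # much smaller than n_bound in practice
```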
We have that if $g$ is continuously differentiable near a fixed point $p$ with $|g'(p)| > 1$, then there is a $\delta > 0$ such that if $0 < |p_0 - p| < \delta$, then $|g(p_0) - p| > |p_0 - p|$. Thus, no matter how close the initial approximation $p_0$ is to $p$, the next iterate is farther away, so the fixed-point iteration does not converge if $|g'(p)| > 1$. This is where the definition of unstable fixed points comes from.
Similarly, let $g$ be continuously differentiable on some interval that contains the fixed point $p$ of $g$. If $|g'(p)| < 1$, then there is a $\delta > 0$ such that if $|p_0 - p| \le \delta$, then the fixed-point iteration converges to $p$.
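A small sketch of this stable/unstable dichotomy, using the illustrative map $g(x) = x^2$ (an assumption of mine, not from the notes), which has fixed points at $0$ with $g'(0) = 0 < 1$ and at $1$ with $g'(1) = 2 > 1$:

```python
# g(x) = x**2 has fixed points at 0 and 1; g'(x) = 2x.
g = lambda x: x * x

# Near 0: |g'(0)| = 0 < 1, so iterates are pulled in (stable fixed point).
x_stable = 0.9
for _ in range(15):
    x_stable = g(x_stable)
print(x_stable)   # effectively 0

# Near 1: |g'(1)| = 2 > 1, so iterates are pushed away (unstable fixed point).
x_unstable = 1.001
for _ in range(15):
    x_unstable = g(x_unstable)
print(x_unstable)  # far above 1: the iteration diverges
```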