Let $f$ be infinitely differentiable on an interval $I$ containing $0$. Suppose there are constants $C > 0$ and $R > 0$ such that for every $x$ in a neighborhood of $0$ and every $n \ge 0$, $|f^{(n)}(x)| \le \frac{C\,n!}{R^n}$. Show that the Taylor series of $f$ about $0$ converges to $f(x)$ for every such $x$ with $|x| < R$.
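For concreteness, one function satisfying this hypothesis is $f(x) = \frac{1}{1-x}$:
$$f^{(n)}(x) = \frac{n!}{(1-x)^{n+1}}, \qquad |f^{(n)}(x)| \le n!\,2^{n+1} = \frac{2\,n!}{(1/2)^n} \quad \text{for } |x| \le \tfrac12,$$
so the hypothesis holds with $C = 2$ and $R = \tfrac12$.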
Solution
The error term of the Taylor series is given by the Lagrange form of the remainder: for some $\xi$ between $0$ and $x$,
$$|R_k(x)| = \frac{|f^{(k+1)}(\xi)|\,|x|^{k+1}}{(k+1)!} \le \frac{C\,(k+1)!}{R^{k+1}} \cdot \frac{|x|^{k+1}}{(k+1)!} = C\left(\frac{|x|}{R}\right)^{k+1},$$
which converges to $0$ as $k \to \infty$ whenever $|x| < R$. Hence the Taylor series converges to $f(x)$ on that set.
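In the example above ($C = 2$, $R = \tfrac12$) this gives $|R_k(x)| \le 2\,(2|x|)^{k+1} \to 0$ for $|x| < \tfrac12$.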
Convergence also follows from the ratio test: each Taylor term satisfies $|t_n| = \left|\frac{f^{(n)}(0)}{n!}x^n\right| \le C\left(\frac{|x|}{R}\right)^n$, and the bounding terms have ratio $\frac{|x|}{R} < 1$ for $|x| < R$, so the series converges absolutely and its radius of convergence is at least $R$.
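A quick numerical sanity check of the remainder bound is sketched below; it uses the illustrative choice $f(x) = 1/(1-x)$ with $C = 2$, $R = 1/2$ from above (the function, constants, and helper name are assumptions for illustration, not part of the original solution).

```python
# Sanity check of the remainder bound |R_k(x)| <= C*(|x|/R)^(k+1)
# for the illustrative choice f(x) = 1/(1-x), where C = 2 and R = 1/2.

def taylor_partial_sum(x: float, k: int) -> float:
    """Degree-k Taylor polynomial of 1/(1-x) about 0: sum of x^n, n = 0..k."""
    return sum(x**n for n in range(k + 1))

x = 0.3                      # any point with |x| < R = 1/2
f_x = 1.0 / (1.0 - x)        # exact value of f at x
for k in range(1, 11):
    remainder = abs(f_x - taylor_partial_sum(x, k))
    bound = 2 * (2 * abs(x)) ** (k + 1)      # C * (|x|/R)^(k+1)
    assert remainder <= bound
    print(f"k={k:2d}  |R_k(x)| = {remainder:.3e}  bound = {bound:.3e}")
```

The printed remainders shrink geometrically and stay below the bound, matching the estimate $|R_k(x)| \le C(|x|/R)^{k+1}$.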
