I wrote code that uses the trapezoid method to compute integrals numerically.
I wanted to see how accurate the formula that gives an estimate of the error is.
This formula is:
$$I - T_{2n} \approx \tfrac{1}{3}\,(T_{2n} - T_n),$$ where $I$ is the actual value of the integral and $T_k$ is the value the trapezoid method gives for $k$ partitions.
I plotted the difference between the actual error and what the formula predicts (i.e. 'the error of the error formula') against the number of partitions, for $\int\limits_2^5 x^2\, dx = 39$.
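Since the original code isn't shown, here is a minimal sketch of the experiment as I understand it (the function names `trapezoid` and the choice of $n$ values are my own, not from the original code); it computes the actual error $I - T_{2n}$ and the estimate $\tfrac{1}{3}(T_{2n} - T_n)$ side by side:

```python
def trapezoid(f, a, b, n):
    """Composite trapezoid rule on [a, b] with n subintervals."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return h * s

f = lambda x: x * x
I = 39.0  # exact value of the integral of x^2 from 2 to 5

for n in (4, 8, 16, 32):
    Tn = trapezoid(f, 2.0, 5.0, n)
    T2n = trapezoid(f, 2.0, 5.0, 2 * n)
    actual = I - T2n              # actual error of T_{2n}
    estimate = (T2n - Tn) / 3.0   # error-estimate formula
    # the plotted quantity: "the error of the error formula"
    print(n, actual, estimate, actual - estimate)
```
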
Three weird things happened: 1) The error seems really, maybe too, small. It could be true, but it looks strange; some values even appear to be (seemingly?) exactly zero.
2) The freaky part is the lines that start to form. They appear at every higher number of partitions, and the bigger $n$ gets, the clearer they become.
3) When I ran the code on a more complicated integral, it behaved as you'd expect: the $y$-values get smaller as $n$ gets bigger.
What on Earth could be the cause of this, and why!?