Hi!
I'm having a bit of an issue with large symbolic derivatives involving interpolated functions. Basically, I have an interpolated function f(t) defined by the following code.
noise = Interpolation[
   Normal[RandomFunction[WhiteNoiseProcess[0.000025], {0, tmax}]][[1]]];
NDSolve[{lineTension'[t] + 1/150*lineTension[t] == noise[t],
lineTension[0] == 0}, lineTension, {t, 0, tmax}, AccuracyGoal -> 2,
PrecisionGoal -> 3]
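(In the full code I also extract the solution from the list of replacement rules that NDSolve returns, so lineTension can be used as an ordinary function afterwards. A minimal version of that step, with lineTensionFn as an illustrative name:)

sol = NDSolve[{lineTension'[t] + 1/150*lineTension[t] == noise[t],
    lineTension[0] == 0}, lineTension, {t, 0, tmax},
   AccuracyGoal -> 2, PrecisionGoal -> 3];
lineTensionFn = lineTension /. First[sol];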
I'm aware that making white noise continuous defeats the point of white noise (any given time segment should be uncorrelated with the others), but it's a neat way to get a random walk of the form I want.
Next I use this to define a function and take a partial derivative with respect to a different variable (not t).
The following is a simplified version of that code (I call the function energy rather than E, since E is Mathematica's protected built-in constant):
energy[diagram_] :=
 Sum[lineTension[t]*
   EuclideanDistance[Subscript[v, i], Subscript[v, i + 1]], {i,
   Length[edges]}]
Then
force[diagram_] :=
 Sum[-D[energy[diagram], Subscript[x, i]], {i, Length[vertices]}]
If the code doesn't quite make sense, it's because I cut out a number of lists and definitions so I wouldn't have to attach 40 data files.
Anyway, when I time the force calculation, the time massively increases with the standard deviation of the white noise used to create the lineTension function. This makes no sense to me: lineTension is a function of t alone, whereas I'm differentiating with respect to a bunch of x_i's, so it should be treated as a constant.
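To illustrate what I expect, here is a toy check (g is just a stand-in for lineTension, built from made-up data):

g = Interpolation[Table[{t, t^2}, {t, 0, 10}]];
D[g[t]*x, x]
(* gives g[t], i.e. the interpolating function is treated as a constant with respect to x *)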
Thanks! Kaden.