For quite a long time I have wanted to compute a running trend in Ferret
(I hope it can be done). Let's say I have monthly data and I would like
to compute a 10-year running trend with a sliding window of 12 months.
Please help me out here, because I have no idea how to do this in
Ferret. I would add that this kind of statistic is very useful for many
of us. I am attaching a NetCDF file as test data. I hope there is an
elegant solution, as in the running "correlation" case.
I'm not 100% sure, but I have a feeling it can be done.
I'm too lazy to work out the details, so you, or somebody else, please figure it out. I'll just outline the idea.
Our model is y = a x + b.
Problem: Find a and b that minimize:
J = sum_i (Y_i - y_i)^2
where y_i = a x_i + b and Y_i are the original data points. (Hint: take partial derivatives of J with respect to a and to b and equate them to zero.) Then, you'll find that a and b are each a function of sum_i (x_i)^2, sum_i x_i, sum_i Y_i, etc.
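(If I remember my least squares correctly, it should come out to something like

   a = [ n * sum_i (x_i Y_i) - sum_i x_i * sum_i Y_i ] / [ n * sum_i (x_i)^2 - (sum_i x_i)^2 ]
   b = [ sum_i Y_i - a * sum_i x_i ] / n

where n is the number of points in the window -- but please check the algebra.)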
As we saw in the running correlation case, these sums can be calculated in a "sliding" way. Therefore a and b can also be calculated in a sliding way.
I may be totally wrong. Just try.
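Something like the following (completely untested) might be a starting point in Ferret. I am assuming the file is called "test.nc" and the monthly variable in it is called "sst" -- replace both with the real names -- and I am using a 120-point (10-year) running mean, @SBX, to get the sliding sums; the factor of 120 cancels in the slope formula.

use test.nc
let y  = sst                ! your monthly variable
let x  = l[gt=y]            ! time index 1,2,3,... (months) as the predictor

let xy = x * y
let xx = x * x

! 120-point (10-year) running means along the time axis
let xbar  = x[l=@sbx:120]
let ybar  = y[l=@sbx:120]
let xybar = xy[l=@sbx:120]
let xxbar = xx[l=@sbx:120]

! least-squares slope and intercept within each sliding 120-month window
let a = (xybar - xbar*ybar) / (xxbar - xbar*xbar)
let b = ybar - a*xbar

plot a*12                   ! trend per year (the slope itself is per month)

If the variable also has longitude/latitude axes, pick a point or average in space before plotting, and be careful near the ends of the record where a full 120-month window is not available. If you only need a value every 12 months, subsample the time axis of "a" with an L stride afterwards.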
Cheers,
Ryo