http://www.oliford.co.uk/phys/far4-lid4-78601.svg
http://www.oliford.co.uk/phys/far4-lid4-78601.txt  <-- t, l, f

I have a signal of 3 variables: t, l and f. f(t) is the interesting thing, but f has a spurious oscillation, something like f ~ cos(l). The oscillation seems to change in amplitude and/or phase as it moves around in t, or maybe in f.

Mission: remove the oscillation.

The wavelength of the oscillation in f(l) should in theory be 1.143e19 / 2, but might not be.

(Anyone who solves it will get mentioned in the appropriate paper.)
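A minimal sketch of one way to attack this (my suggestion, not something from the post): for each t-slice, least-squares fit c0 + A*cos(k*l) + B*sin(k*l) at the quoted wavenumber and subtract the oscillatory part. Fitting A and B separately per slice absorbs the amplitude/phase drift in t. Demonstrated on synthetic data, since the column layout of the linked .txt file is only hinted at; the wavelength value is the one quoted above and may need adjusting.

```python
import numpy as np

WAVELENGTH = 1.143e19 / 2        # wavelength of the oscillation in l, as quoted
K = 2 * np.pi / WAVELENGTH       # corresponding wavenumber

def remove_oscillation(l, f):
    """Subtract the best-fit oscillation at wavenumber K from one slice f(l)."""
    M = np.column_stack([np.ones_like(l), np.cos(K * l), np.sin(K * l)])
    (c0, A, B), *_ = np.linalg.lstsq(M, f, rcond=None)
    return f - A * np.cos(K * l) - B * np.sin(K * l)

# Synthetic demo: slow trend plus an oscillation whose amplitude and phase
# drift with t, as described in the post.
rng = np.random.default_rng(0)
l = np.linspace(0.0, 4 * WAVELENGTH, 400)
trend = 0.1 * l / WAVELENGTH
for t in np.linspace(0.0, 1.0, 5):
    amp, phase = 1.0 + t, 2 * np.pi * t      # hypothetical drift with t
    f = trend + amp * np.cos(K * l + phase) + 0.01 * rng.standard_normal(l.size)
    cleaned = remove_oscillation(l, f)       # residual is trend + noise
```

If the true wavelength differs from 1.143e19 / 2, the subtraction will leave a beat pattern; in that case one could scan k over a small range and keep the value that minimises the post-subtraction residual, or fit k nonlinearly (e.g. with scipy.optimize.curve_fit).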