Improving Spectral #53
base: master
Conversation
…is PR are no longer necessary
derivative/differentiation.py (outdated diff)
```diff
@@ -228,6 +228,8 @@ def x(self, X, t, axis=1):
 
 
+def _align_axes(X, t, axis) -> tuple[NDArray, tuple[int, ...]]:
+    """Reshapes the data so the derivative always happens along axis 1. Do we need this?
```
Do we need this function? All this reshaping is a bit confusing. Can we just take the data as-is and differentiate along whichever axis the user specifies? There are ways of indexing in numpy that usually make this fairly easy; for instance I have this line.
The code could probably be both more general and simpler if we do this right, but it's a separate PR for sure.
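To illustrate the idea (my own sketch, not code from this PR): a tuple-of-slices index lets a finite-difference stencil act along whichever axis the caller names, with no reshaping.

```python
import numpy as np

def central_diff(X, t, axis=1):
    """Second-order central differences along `axis`, one-sided at the ends.
    Assumes uniformly spaced t. Illustrative only."""
    def sl(s):
        # Index that is `s` along `axis` and `:` everywhere else.
        idx = [slice(None)] * X.ndim
        idx[axis] = s
        return tuple(idx)

    dt = t[1] - t[0]
    dXdt = np.empty_like(X, dtype=float)
    dXdt[sl(slice(1, -1))] = (X[sl(slice(2, None))] - X[sl(slice(None, -2))]) / (2 * dt)
    dXdt[sl(0)] = (X[sl(1)] - X[sl(0)]) / dt
    dXdt[sl(-1)] = (X[sl(-1)] - X[sl(-2)]) / dt
    return dXdt
```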
I'm happy to do this, but the challenge is that all the methods have internally been allowed to expect one particular axis, so it probably means changing a fair amount.
… because evidently unnecessary
I think I'm done messing with it and ready for proper review from the maintainers.
```diff
-# Test some basic functions
-# =========================
+# Test that basic functions are differentiated correctly
+# ======================================================
 funcs_and_derivs = (
```
This is a good library of examples, well thought out.
```python
# Add noise to avoid all-zeros non-convergence warning for sklearn lasso
f_mod = lambda t: func(t) + 1e-9 * np.random.randn(*t.shape)  # rename to avoid infinite loop
t = np.linspace(0, 2*np.pi, 100, endpoint=False)  # For periodic functions, it's important the endpoint not be included
if m == "spectral":
```
I decided to treat spectral as a special case here, because it already kind of was, and this was the easiest extension to get coverage.
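The special case is roughly this shape (a sketch of the idea, not the exact test code): Fourier spectral differentiation assumes periodicity, so periodic test functions need the endpoint excluded, while the other methods keep the plain grid.

```python
if m == "spectral":
    # Periodic grid: omit the endpoint so the signal tiles cleanly.
    t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
else:
    t = np.linspace(0, 2 * np.pi, 100)
```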
```python
def test_fn(m, func_spec):
    func, fname, deriv, f_id = func_spec
    t = np.linspace(0, 2*np.pi, 100)
    if m == 'trend_filtered':
```
The `f_mod` stuff needn't be here to silence the warning. We can just filter warnings with pytest.
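For example (a sketch, assuming the warning in question is sklearn's `ConvergenceWarning`):

```python
import pytest

@pytest.mark.filterwarnings("ignore::sklearn.exceptions.ConvergenceWarning")
def test_fn(m, func_spec):
    ...
```

The same filter could go in the project-wide pytest configuration instead of on each test.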
Ah! Failing the interface tests. I neglected to run them because I was getting this …
I can see in the …
But if I open up …
Never used poetry, and not sure whether this `hyperparam_opt` is doing anything. Looks like I did cause a failure related to aligning axes. I'll try to address that one.
Rerun that workflow? 🥺
```python
@_memoize_arrays(1)  # the memoization is 1 deep, as in only remembers the most recent args
def _global(self, t, x):
    if self.basis == 'chebyshev':
        return cheb_deriv(x, t, self.order, self.axis, self.filter)
```
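For anyone reading along, one plausible shape for such a memoizer (my own sketch; the package's actual `_memoize_arrays` may differ) is to keep only the most recent calls and compare arrays by value, since ndarrays aren't hashable:

```python
import functools
import numpy as np

def _memoize_arrays(depth=1):
    """Cache the last `depth` calls of a function whose args may be ndarrays."""
    def decorator(fn):
        cache = []  # (args, result) pairs, newest last

        def same(a, b):
            if isinstance(a, np.ndarray) or isinstance(b, np.ndarray):
                return np.array_equal(a, b)
            return a == b

        @functools.wraps(fn)
        def wrapper(*args):
            for cached_args, result in cache:
                if len(cached_args) == len(args) and all(
                        same(c, a) for c, a in zip(cached_args, args)):
                    return result
            result = fn(*args)
            cache.append((args, result))
            if len(cache) > depth:
                cache.pop(0)  # evict the oldest call
            return result

        return wrapper
    return decorator
```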
It struck me just now that filtering by frequency on cosine-spaced points isn’t totally sound: high-frequency variations look lower frequency around the edges where sample density is higher.
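A quick way to see the non-uniformity (illustrative):

```python
import numpy as np

N = 16
x = np.cos(np.pi * np.arange(N + 1) / N)  # Chebyshev-Lobatto points on [-1, 1]
spacing = np.abs(np.diff(x))
# Edge spacing is far smaller than center spacing, so an oscillation of fixed
# wavelength in x covers many more samples near the edges.
print(spacing.min(), spacing.max())
```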
I've been pondering this more in pavelkomarov/spectral-derivatives#14. Basically what I'm currently doing is keeping `filter(k)` amount of the kth frequency component.
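In the Fourier basis that idea looks roughly like this (an illustrative sketch, not the package's internals; `filt` is a hypothetical vectorized callable mapping |k| to a gain in [0, 1]):

```python
import numpy as np

def fourier_deriv_filtered(y, t, filt):
    """Differentiate y(t) spectrally, keeping filt(|k|) amount of the kth mode."""
    n = len(t)
    dt = t[1] - t[0]                         # assumes uniform spacing
    k = 2 * np.pi * np.fft.fftfreq(n, d=dt)  # angular wavenumbers
    Y = np.fft.fft(y)
    return np.fft.ifft(1j * k * filt(np.abs(k)) * Y).real
```

e.g. `filt = lambda k: k < k_cut` gives a hard low-pass.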
I've made a notebook to empirically compare spectral filtering in the Fourier and Chebyshev bases, and pulled in a little of Floris' PyNumDiff to compare with Savitzky-Golay filtering + Gaussian smoothing. Fourier spectral is the gold standard, but even it can be fragile. Chebyshev can work okay (not great, but really not worse than `savgoldiff`) for 1st and 2nd derivatives in the presence of a little noise, but starts to blow up at the edges of the domain beyond that.
Hey, sorry for the delay Pavel... I should be able to get to this next week when I'm back in town.
…is is a little more subtle and bumped spectral-derivatives version number to get latest features
I'll add comments throughout this PR. It's far from acceptable, but I want to have something to point to and talk about.