Improving Spectral #53

Open
pavelkomarov wants to merge 13 commits into master

Conversation

pavelkomarov

I'll add comments throughout this PR. It's far from acceptable, but I want to have something to point to and talk about.

@pavelkomarov changed the title from spitballing to Improving Spectral on Jan 18, 2025
@@ -228,6 +228,8 @@ def x(self, X, t, axis=1):


def _align_axes(X, t, axis) -> tuple[NDArray, tuple[int, ...]]:
    """Reshapes the data so the derivative always happens along axis 1. Do we need this?
Author

Do we need this function? All this reshaping is a bit confusing. Can we just take the data as-is and differentiate along whichever axis the user specifies? There are ways of indexing in numpy that usually make this fairly easy. For instance, I have this line.

The code could probably be both more general and simplified if we do this right, but it's a separate PR for sure.
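
For illustration, a minimal sketch of the index-tuple trick alluded to (hypothetical helper names, not the linked line):

```python
import numpy as np

def _axis_slice(ndim, axis, sl):
    """Build an index tuple that applies slice `sl` along `axis` only."""
    idx = [slice(None)] * ndim
    idx[axis] = sl
    return tuple(idx)

# e.g. a forward difference along an arbitrary axis, with no reshaping:
def forward_diff(X, t, axis=1):
    hi = _axis_slice(X.ndim, axis, slice(1, None))   # like X[..., 1:, ...]
    lo = _axis_slice(X.ndim, axis, slice(None, -1))  # like X[..., :-1, ...]
    dt_shape = [1] * X.ndim
    dt_shape[axis] = len(t) - 1  # broadcast dt along the chosen axis
    return (X[hi] - X[lo]) / np.diff(t).reshape(dt_shape)
```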

Collaborator

Happy with this, but the challenge is that all methods have internally been allowed to expect one particular axis, so it probably means changing a fair amount.

@pavelkomarov (Author) left a comment

I think I'm done messing with it and ready for proper review from the maintainers.

-# Test some basic functions
-# =========================
+# Test that basic functions are differentiated correctly
+# ======================================================
 funcs_and_derivs = (
Author

This is a good library of examples, well thought out.

# Add noise to avoid all zeros non-convergence warning for sklearn lasso
f_mod = lambda t: func(t) + 1e-9 * np.random.randn(*t.shape) # rename to avoid infinite loop
t = np.linspace(0, 2*np.pi, 100, endpoint=False) # For periodic functions, it's important the endpoint not be included
if m == "spectral":
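
As a quick aside, a minimal sketch (not from the PR) of why the endpoint must be excluded on a periodic domain, assuming NumPy's FFT:

```python
import numpy as np

# On [0, 2π) the sample at 2π would duplicate the one at 0, breaking the
# FFT's implicit periodicity; endpoint=False avoids that duplication.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
x = np.sin(t)
k = 2 * np.pi * np.fft.fftfreq(t.size, d=t[1] - t[0])  # angular wavenumbers
dx = np.fft.ifft(1j * k * np.fft.fft(x)).real          # spectral derivative
assert np.allclose(dx, np.cos(t))  # accurate to near machine precision
```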
Author

I decided to treat spectral as a special case here, because it already kind of was, and this was the easiest extension to get coverage.

def test_fn(m, func_spec):
    func, fname, deriv, f_id = func_spec
    t = np.linspace(0, 2*np.pi, 100)
    if m == 'trend_filtered':
Author

The f_mod workaround needn't be here just to silence the warning; we can filter warnings with pytest instead, as sketched below.
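
A sketch of that pytest approach (the decorator placement is an assumption; sklearn's lasso emits ConvergenceWarning):

```python
import pytest

# Ignore sklearn's non-convergence warning instead of perturbing the input.
@pytest.mark.filterwarnings("ignore::sklearn.exceptions.ConvergenceWarning")
def test_fn(m, func_spec):
    ...
```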

@pavelkomarov (Author) commented Feb 5, 2025

Ah! Failing the interface tests. I neglected to run them because I was getting this:

E           ValueError: No reduction method named kalman.default is installed. Reduction methods need to be installed as an entry point to the 'derivative.hyperparam_opt' group

I can see in the .toml

[tool.poetry.plugins.'derivative.hyperparam_opt']
"kalman.default" = "derivative.utils:_default_kalman"

But if I open up utils.py I see

def _default_kalman(t, z):
    return 1

I've never used poetry, and I'm not sure whether this hyperparam_opt is doing anything.
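
For reference, a sketch of how that entry-point group gets looked up at runtime (importlib.metadata, Python 3.10+), which is also why a stale install would raise the error above:

```python
from importlib.metadata import entry_points

# Entry points are registered at install time, so they only resolve after
# the package is (re)installed into the environment.
eps = entry_points(group="derivative.hyperparam_opt")
methods = {ep.name: ep.load() for ep in eps}
kalman_default = methods["kalman.default"]  # -> derivative.utils._default_kalman
```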

Looks like I did cause a failure related to aligning axes. I'll try to address that one.

@pavelkomarov (Author)

Rerun that workflow? 🥺

@_memoize_arrays(1)  # the memoization is 1 deep, as in it only remembers the most recent args
def _global(self, t, x):
    if self.basis == 'chebyshev':
        return cheb_deriv(x, t, self.order, self.axis, self.filter)
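
For context, a sketch of what a depth-limited array memoizer might look like (illustrative; the library's actual `_memoize_arrays` may differ):

```python
from collections import OrderedDict
from functools import wraps

def memoize_arrays(maxsize=1):
    """Cache on the arrays' bytes; keep only the `maxsize` most recent calls."""
    def decorator(fn):
        cache = OrderedDict()
        @wraps(fn)
        def wrapper(self, *arrays):
            key = tuple(a.tobytes() for a in arrays)
            if key not in cache:
                if len(cache) >= maxsize:
                    cache.popitem(last=False)  # evict the oldest entry
                cache[key] = fn(self, *arrays)
            return cache[key]
        return wrapper
    return decorator
```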
Author

It struck me just now that filtering by frequency on cosine-spaced points isn’t totally sound: high-frequency variations look lower frequency around the edges where sample density is higher.

@pavelkomarov (Author) commented Feb 20, 2025

I've been pondering this more in pavelkomarov/spectral-derivatives#14. Basically, what I'm currently doing is keeping filter(k) amount of the $k^{th}$ Chebyshev polynomial, versus in the Fourier case keeping filter(k) amount of the $k^{th}$ wavenumber. This may be the best thing to do, most spiritually aligned with "spectral"; I so far haven't thought of a better alternative. But it's counterintuitive: because the Chebyshev polynomials have steeper slope at the edges of the domain, we can fit high frequencies better there. So if the assumption is that noise is high frequency and needs to be separated from the data this way, we can't filter it as consistently using this basis.
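
One way to see the counter-intuition concretely: under the substitution $x = \cos\theta$ the Chebyshev polynomials are plain cosines,

$$T_k(\cos\theta) = \cos(k\theta), \qquad dx = -\sin\theta \, d\theta,$$

so filtering the $k^{th}$ coefficient is a frequency-$k$ filter in $\theta$, not in $x$. Near the edges $\sin\theta \to 0$, so a fixed $\theta$-frequency packs into a narrower stretch of $x$; that is the steeper-slope, higher-frequency behavior described above.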

@pavelkomarov (Author) commented Mar 10, 2025

I've made a notebook to empirically compare spectral filtering in the Fourier and Chebyshev bases, and pulled in a little of Floris' PyNumDiff to compare with Savitzky-Golay filtering + Gaussian smoothing. Fourier spectral is the gold standard, but even it can be fragile. Chebyshev can work okay (not great but really not worse than savgoldiff) for 1st and 2nd derivatives in the presence of a little noise but starts to blow up at the edges of the domain beyond that.

@Jacob-Stevens-Haas (Collaborator)

Hey sorry for the delay Pavel... I should be able to get to this next week when I'm back in town.
