
Use ProbeTable to generate NP probe information #349


Open · chrishalcrow wants to merge 11 commits into main

Conversation

@chrishalcrow (Member) commented May 15, 2025

PR to use ProbeTable (https://github.com/billkarsh/ProbeTable/) to generate Neuropixels probe information, instead of hardcoding this ourselves.

Summary

Important differences in src/probeinterface/neuropixels_tools.py

This PR does a few things:

  • Replaces the npx_descriptions hardcoded dict with a make_npx_descriptions function, which generates this data from ProbeTable's probe_features.json file (a rough sketch of the idea follows this list).
  • We now extract a mux_table_array from the probe_features.json file, which tells us the multiplexing groupings. This can then be used in spikeinterface for the inter_sample_shift (PR incoming). This is saved as a probe annotation.
  • Removes probe_type from metadata creation and from the read_openephys and read_spikeglx functions. It is still needed for the read/write imro functions, where probe_type is stored in the format: e.g. tests/data/imro/test_single_shak_2.0.imro starts “(21,384)(…” and the “21” is the probe_type of the probe.
  • Adds a get_probe_contour_vertices function which computes the probe contour vertices from metadata.
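As a rough sketch of the first point above (the JSON layout and the field names used here are assumptions for illustration only, not the actual ProbeTable schema or the final function signature):

import json
from pathlib import Path

def make_npx_descriptions(probe_features_path: Path) -> dict:
    # Illustrative sketch: build the probe description dict from ProbeTable's probe_features.json.
    # The keys used below ("neuropixels_probes", "x_pitch", ...) are placeholders; the real
    # implementation follows whatever schema probe_features.json actually uses.
    with open(probe_features_path) as f:
        probe_features = json.load(f)

    descriptions = {}
    for part_number, features in probe_features.get("neuropixels_probes", {}).items():
        descriptions[part_number] = {
            "x_pitch": features.get("x_pitch"),
            "y_pitch": features.get("y_pitch"),
            "shank_pitch": features.get("shank_pitch"),
            "num_shanks": features.get("num_shanks"),
        }
    return descriptions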

We now additionally support the following probes: 'NP2005', 'NP3011', 'NP1033', 'NP1122', 'NP3010', 'NP1120', 'NP1210', 'NP1014', 'NP3021', 'NP2020', 'NP3020', 'NP1011', 'NP3000', 'NP1013', 'NP1123', 'NP2021', 'NP1200', 'NP2006', 'NP1012'
We LOSE support for NP1110

Implementation questions and annoying edge cases:

  • We had a model_name for each probe, which ProbeTable doesn’t contain. @alejoe91 suggested using the description, which changes all our probe names. Ok??
  • I’ve saved the json file at src/probeinterface/resources/probe_features.json. I think it has to be in source to work for a normal pip installation? Any better place to store it?
  • We want to check backwards compatibility. This PR mostly changes the npx_description, so it's not too radical, and we only really need to check that the old hardcoded dict matches the new function. To do this, I’ve moved the old hardcoded dict to a test file (a sketch of such a test follows this list). Any other ideas for how to check this?
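As a sketch of that last point (the dict excerpt, the exception set and the zero-argument call below are hypothetical; the real known mismatches are listed in the table in the next section):

from probeinterface.neuropixels_tools import make_npx_descriptions

# Hypothetical excerpt of the old hardcoded dict, kept verbatim in the test file
old_npx_descriptions = {
    "1030": {"x_pitch": 56, "stagger": 12},
}

# (probe_type, key) pairs we already know differ between ProbeTable and the old dict
known_mismatches = {("1030", "x_pitch"), ("1030", "stagger")}

def test_make_npx_descriptions_matches_old_dict():
    new_descriptions = make_npx_descriptions()  # zero-argument call is an assumption
    for probe_type, old_description in old_npx_descriptions.items():
        for key, old_value in old_description.items():
            if (probe_type, key) in known_mismatches:
                continue
            assert new_descriptions[probe_type][key] == old_value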

Backwards compatibility

There are some mismatches between ProbeTable and the old ProbeInterface implementation, found by comparing the npx_descriptions dict with the output of the make_npx_descriptions function. These are:

Probe type | key                  | ProbeTable                  | ProbeInterface
1020       | x_pitch              | 87                          | 91
1020       | stagger              | 16                          | 12
1021       | x_pitch              | 87                          | 91
1021       | stagger              | 16                          | 12
1030       | x_pitch              | 87                          | 56
1030       | stagger              | 16                          | 12
1031       | x_pitch              | 87                          | 91
1031       | stagger              | 16                          | 12
1121       | x_pitch              | 0                           | 6
1121       | contact_width        | 2.5                         | 2
1110       | fields_in_imro_table | (“group”, “bankA”, “bankB”) | (“channel_ids”, “banks”, “references”, “ap_gains”, “lf_gains”, “ap_hp_filters”)

We also tested the neuropixels directory from the probeinterface_library against the new library generated by running generate_neuropixels_library.py. We checked their differences using the following code. First, we check which probes were generated by each library:

import json
from pathlib import Path

new_library_path = Path('/Users/christopherhalcrow/Work/fromgit/probeinterface/resources/neuropixels_library_generated/')
new_generated_paths = list(new_library_path.glob('*'))
probes_in_new_paths = set([path.name for path in new_generated_paths])

old_library_path = Path("/Users/christopherhalcrow/Work/fromgit/probeinterface_library/neuropixels")
old_generated_paths = list(old_library_path.glob('*'))
probes_in_old_paths = set([path.name for path in old_generated_paths])

new_not_old = probes_in_new_paths.difference(probes_in_old_paths)
old_not_new = probes_in_old_paths.difference(probes_in_new_paths)

print(new_not_old)
print(old_not_new)

>>> {'NP2005', 'NP3011', 'NP1033', 'NP1122', 'NP3010', 'NP1120', 'NP1210', 'NP1014', 'NP3021', 'NP2020', 'NP3020', 'NP1011', 'NP3000', 'NP1013', 'NP1123', 'NP2021', 'NP1200', 'NP2006', 'NP1012'}
>>> {'README.txt', 'NP1110'}

The new_not_old set contains all the new probes which are in ProbeTable but not in our previous table. Nice! The old_not_new set contains NP1110, which we currently don't support (@alejoe91 @samuelgarcia I think we're not supporting it, right??)

We can then check that all the generated probes are equal by comparing their json files. Here's some code that does that. We use assertions, with continues added to skip properties that we already know are different:

# check every probe directory in the old library
for probe_name in old_generated_paths:
    if probe_name.name not in ['NP1110', 'README.txt']:

        old_json_filename = probe_name / (probe_name.name + '.json')
        new_json_filename = new_library_path / probe_name.name / (probe_name.name + '.json')
        
        with open(old_json_filename) as f:
            old_probe_info = json.load(f)
        with open(new_json_filename) as g:
            new_probe_info = json.load(g)

        for a, old_probe in enumerate(old_probe_info['probes']):
            new_probe = new_probe_info['probes'][a]
            for key, value in old_probe.items():
                if key == "annotations":
                    for annotations_key, annotations_value in value.items():
                        new_annotations_value = new_probe['annotations'][annotations_key]
                        if annotations_key in ["model_name", "shank_tips"]:
                            continue
                        assert annotations_value == new_probe["annotations"][annotations_key], f"{annotations_key} doesn't match.\nOLD: {annotations_value}.\nNEW: {new_annotations_value}"
                elif key in ["probe_planar_contour"]:
                    continue
                elif probe_name.name == "NP1121" and key == "contact_shape_params":
                    continue
                elif probe_name.name in ["NP1031", "NP1030", "NP1020", "NP1021"] and key == "contact_positions":
                    continue
                else:
                    assert value == new_probe[key], f"{probe_name}. {key} doesn't match.\nOLD: {value}.\nNEW: {new_probe[key]}"

The skipped properties exactly match those in the table above.

codecov bot commented May 15, 2025

Codecov Report

Attention: Patch coverage is 95.29412% with 4 lines in your changes missing coverage. Please review.

Project coverage is 90.09%. Comparing base (43ccfcb) to head (851a696).
Report is 4 commits behind head on main.

Files with missing lines                 | Patch % | Lines
src/probeinterface/neuropixels_tools.py | 95.29%  | 4 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #349      +/-   ##
==========================================
+ Coverage   89.75%   90.09%   +0.33%     
==========================================
  Files          12       12              
  Lines        1972     2019      +47     
==========================================
+ Hits         1770     1819      +49     
+ Misses        202      200       -2     

@h-mayorquin (Collaborator)

Thanks for taking this on, @chrishalcrow

I’ve saved the json file at src/probeinterface/resources/probe_features.json. I think it has to be in source to work for a normal pip installation? Any better place to store it?

Yes, you will need to modify the pyproject.toml to ensure that this is packaged into the wheel/sdist. An alternative is to store it as a Python dict, which will make packaging simpler (that way you don't need to modify the gitignore).

All that said, json should be fine as well; any reason you prefer it? Whatever we choose, we should add a cron test that runs weekly or monthly to ensure that what we have matches:

https://github.com/billkarsh/ProbeTable/blob/main/Tables/probe_features.json

I can help with that if you want.
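For example, the periodic check could be roughly the following (a sketch: the packaged resource path and the raw-file URL are assumptions based on the file locations discussed above):

import json
import urllib.request
from importlib.resources import files

# Raw counterpart of the ProbeTable file linked above (URL form is an assumption)
UPSTREAM_URL = "https://raw.githubusercontent.com/billkarsh/ProbeTable/main/Tables/probe_features.json"

def test_probe_features_matches_upstream():
    # The copy shipped with probeinterface (assumes it is packaged as probeinterface/resources/probe_features.json)
    local_text = (files("probeinterface") / "resources" / "probe_features.json").read_text()
    local = json.loads(local_text)

    # The current upstream ProbeTable file
    with urllib.request.urlopen(UPSTREAM_URL) as response:
        upstream = json.load(response)

    assert local == upstream, "Bundled probe_features.json is out of date with ProbeTable"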


is_phase3a = False
# These are all prototype NP1.0 probes, not contained in ProbeTable
if probe_part_number in ["PRB_1_4_0480_1", "PRB_1_4_0480_1_C", "PRB_1_2_0480_2", None]:
Collaborator:

PRB_1_4_0480_1 and PRB_1_4_0480_1_C are on the table, why are we fetching NP1010 instead for those?

@chrishalcrow (Member Author):

Oops - tests were failing for these and we used to group them together, so I thought that was why. I now realise there's a typo in PRB_1_2_0480_2 in the ProbeTable. Will raise an issue over there.

@h-mayorquin (Collaborator)

We had a model_name for each probe, which ProbeTable doesn’t contain. @alejoe91 suggested using the description, which changes all our probe names. Ok??

I think this is a good idea. I am +1 on it.


if imDatPrb_pn is None:
@chrishalcrow (Member Author):

Note: we push this logic to the read_imro function.

@h-mayorquin (Collaborator)

The logic seems fine and the tests are passing. We can improve later if we need to, so this is fine by me. Two points are left:

@chrishalcrow (Member Author)

Great! I've...

  • added more backwards compatibility information in the PR description.
  • added a github action (99% written by Claude) which checks the probe_features.json every week. It works fine on chrishalcrow/probeinterface but I'm unsure how to test it here.

Not sure how to check the packaging stuff - we can discuss offline.

@chrishalcrow (Member Author)

@h-mayorquin and I discussed the GitHub action and checked whether the probe_features.json file would be included in the PyPI build - it is! I'm now happy with this PR.
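For reference, one way to verify this locally (a sketch: the wheel filename is a placeholder for whatever python -m build actually produces):

import zipfile

# Build the package first (e.g. with python -m build), then inspect the wheel contents.
# The filename below is a placeholder for whatever version gets built.
with zipfile.ZipFile("dist/probeinterface-0.0.0-py3-none-any.whl") as wheel:
    matches = [name for name in wheel.namelist() if name.endswith("probe_features.json")]

# Expect something like ['probeinterface/resources/probe_features.json']
print(matches)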
