move all `njit` calls into a decorator #3335
Conversation
Codecov Report
Attention: Patch coverage is …
Additional details and impacted files

@@            Coverage Diff             @@
##             main    #3335      +/-   ##
==========================================
- Coverage   77.26%   76.59%   -0.68%
==========================================
  Files         111      111
  Lines       12630    12827     +197
==========================================
+ Hits         9759     9825      +66
- Misses       2871     3002     +131
Benchmark changes
Comparison: https://github.com/scverse/scanpy/compare/9d3c340152543a6364d9c55bc11e610027ea319f..e8c5d144270c11de4d9a9afead8dea016dde2d1a
More details: https://github.com/scverse/scanpy/pull/3335/checks?check_run_id=32725494502
Almost every `njit`ted function seems to have a clear answer to “this was/wasn’t parallel before, should it be now?”
src/scanpy/_compat.py (outdated)
@cache
def _is_threading_layer_threadsafe() -> bool:
    import importlib

    import numba

    if (available := LAYERS.get(numba.config.THREADING_LAYER)) is None:
        # given by direct name
        return numba.config.THREADING_LAYER in LAYERS["threadsafe"]

    # given by layer type (safe, …)
    for layer in cast(list[Layer], numba.config.THREADING_LAYER_PRIORITY):
        if layer not in available:
            continue
        try:  # `importlib.util.find_spec` doesn’t work here
            importlib.import_module(f"numba.np.ufunc.{layer}pool")
        except ImportError:
            continue
        # the layer has been found
        return layer in LAYERS["threadsafe"]
    msg = f"No loadable threading layer: {numba.config.THREADING_LAYER=}"
    raise ValueError(msg)
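For reference, here is a minimal pure-Python sketch of the lookup logic above. The `LAYERS` mapping here is a hypothetical stand-in (the real one lives in `_compat.py` and may differ), and the loadability check is stubbed out via a parameter instead of the `importlib` probe:

```python
# Hypothetical stand-in for the LAYERS mapping: layer *types* ("default",
# "safe", …) map to sets of concrete layer names, and "threadsafe" collects
# the names that are safe to call from multiple threads.
LAYERS: dict[str, set[str]] = {
    "default": {"tbb", "omp", "workqueue"},
    "safe": {"tbb"},
    "threadsafe": {"tbb", "omp"},
}


def resolve_threadsafe(configured: str, priority: list[str], loadable: set[str]) -> bool:
    """Mirror the control flow of `_is_threading_layer_threadsafe` (sketch)."""
    if (available := LAYERS.get(configured)) is None:
        # configured by a concrete layer name, e.g. "workqueue"
        return configured in LAYERS["threadsafe"]
    # configured by layer type: walk the priority list for a loadable candidate
    for layer in priority:
        if layer not in available or layer not in loadable:
            continue
        # the layer has been found
        return layer in LAYERS["threadsafe"]
    raise ValueError(f"No loadable threading layer: {configured=}")


# "workqueue" is a concrete name and is not threadsafe
print(resolve_threadsafe("workqueue", [], set()))  # → False
# "default" is a type; the first loadable candidate "tbb" is threadsafe
print(resolve_threadsafe("default", ["tbb", "omp", "workqueue"], {"tbb"}))  # → True
```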
This here business is complicated because numba doesn’t support getting the configured threading layer without trying to run something. And that’s exactly what we want to avoid!
/edit: ah the logic isn’t there yet, I’ll fix
  n = len(indptr) - 1
  result = np.ones(n, dtype=np.bool_)
- for i in range(n):
+ for i in numba.prange(n):
I parallelized this function by replacing the `numba.njit` import with our wrapper. Seems to work fine!
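The wrapper pattern this PR is about could be sketched roughly as below. Everything here is illustrative, not the actual `_compat` implementation: the name `myjit`, the stubbed threadsafety check, and the import-error fallback are all assumptions:

```python
def _is_threadsafe() -> bool:
    # Stand-in for the real threading-layer check; always True in this sketch.
    return True


def myjit(*, parallel: bool = True, **numba_kwargs):
    """Hypothetical central decorator replacing direct `numba.njit` calls.

    Falls back to the plain Python function when numba is unavailable,
    and drops `parallel` when the threading layer is not threadsafe.
    """
    def decorator(func):
        try:
            import numba
        except ImportError:
            return func  # run uncompiled
        use_parallel = parallel and _is_threadsafe()
        return numba.njit(parallel=use_parallel, **numba_kwargs)(func)
    return decorator


@myjit(parallel=False)
def add(a, b):
    return a + b


print(add(2, 3))  # → 5
```

Callers then never touch `numba.njit` directly, so decisions like “is the threading layer safe?” live in one place.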
@@ -182,7 +181,7 @@ def _gearys_c_vec_W(
  # https://github.com/numba/numba/issues/6774#issuecomment-788789663

- @numba.njit(cache=True)
+ @numba.njit(cache=True, parallel=False)  # noqa: TID251
This is an inner function. It wasn’t parallelized before and isn’t now. See also @ivirshup’s comment above
@@ -203,7 +202,7 @@ def _gearys_c_inner_sparse_x_densevec(
  return numer / denom

- @numba.njit(cache=True)
+ @numba.njit(cache=True, parallel=False)  # noqa: TID251
ditto
@@ -137,7 +137,7 @@ def _morans_i_vec(
  return _morans_i_vec_W(g_data, g_indices, g_indptr, x, W)

- @numba.njit(cache=True)
+ @numba.njit(cache=True, parallel=False)  # noqa: TID251
ditto
@@ -159,7 +159,7 @@ def _morans_i_vec_W(
  return len(x) / W * inum / z2ss

- @numba.njit(cache=True)
+ @numba.njit(cache=True, parallel=False)  # noqa: TID251
ditto
# parallel=False needed for accuracy
@numba.njit(cache=True, parallel=False)  # noqa: TID251
Added a comment here explaining why this one is sequential.
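For context on why `parallel=False` can matter for accuracy: a parallel reduction regroups the terms of a sum, and floating-point addition is not associative, so results can drift slightly between runs. A minimal illustration (plain Python, nothing numba-specific):

```python
# Float addition is not associative: regrouping terms, as a parallel
# reduction does, can change the low-order bits of the result.
left = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)
print(left == right)  # → False
print(left, right)    # 0.6000000000000001 vs. 0.6
```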
# TODO: can/should this be parallelized?
@numba.njit(cache=True)  # noqa: TID251
Maybe you have an idea how to answer this question.
Huh weird, it gets detected, but it doesn’t seem to help to call the non-parallel version lol. Seems like calling numba from a dask worker thread still aborts:

$ hatch test tests/test_utils.py::test_is_constant_dask[csr_matrix-0] --capture=no
Numba function called from a non-threadsafe context. Try installing `tbb`.
Numba function called from a non-threadsafe context. Try installing `tbb`.
Numba workqueue threading layer is terminating: Concurrent access has been detected.
- The workqueue threading layer is not threadsafe and may not be accessed concurrently by multiple threads. Concurrent access typically occurs through a nested parallel region launch or by calling Numba parallel=True functions from multiple Python threads.
- Try using the TBB threading layer as an alternative, as it is, itself, threadsafe. Docs: https://numba.readthedocs.io/en/stable/user/threading-layer.html
Fatal Python error: Aborted
Thread 0x000000016fd2f000 (most recent call first):
File "~/Dev/scanpy/src/scanpy/_compat.py", line 133 in wrapper
File "~/Dev/scanpy/src/scanpy/_utils/compute/is_constant.py", line 109 in _
File "<venv>/lib/python3.12/functools.py", line 909 in wrapper
File "~/Dev/scanpy/src/scanpy/_utils/compute/is_constant.py", line 30 in func
File "<venv>/lib/python3.12/site-packages/dask/core.py", line 127 in _execute_task
File "<venv>/lib/python3.12/site-packages/dask/core.py", line 157 in get
File "<venv>/lib/python3.12/site-packages/dask/optimization.py", line 1001 in __call__
File "<venv>/lib/python3.12/site-packages/dask/core.py", line 127 in _execute_task
File "<venv>/lib/python3.12/site-packages/dask/local.py", line 225 in execute_task
File "<venv>/lib/python3.12/site-packages/dask/local.py", line 239 in batch_execute_tasks
File "<venv>/lib/python3.12/concurrent/futures/thread.py", line 64 in run
File "<venv>/lib/python3.12/concurrent/futures/thread.py", line 92 in _worker
File "<venv>/lib/python3.12/threading.py", line 1010 in run
File "<venv>/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "<venv>/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x000000016ed23000 (most recent call first):
File "<venv>/lib/python3.12/concurrent/futures/thread.py", line 89 in _worker
File "<venv>/lib/python3.12/threading.py", line 1010 in run
File "<venv>/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "<venv>/lib/python3.12/threading.py", line 1030 in _bootstrap
Current thread 0x000000016dd17000 (most recent call first):
File "~/Dev/scanpy/src/scanpy/_compat.py", line 133 in wrapper
File "~/Dev/scanpy/src/scanpy/_utils/compute/is_constant.py", line 109 in _
File "<venv>/lib/python3.12/functools.py", line 909 in wrapper
File "~/Dev/scanpy/src/scanpy/_utils/compute/is_constant.py", line 30 in func
File "<venv>/lib/python3.12/site-packages/dask/core.py", line 127 in _execute_task
File "<venv>/lib/python3.12/site-packages/dask/core.py", line 157 in get
File "<venv>/lib/python3.12/site-packages/dask/optimization.py", line 1001 in __call__
File "<venv>/lib/python3.12/site-packages/dask/core.py", line 127 in _execute_task
File "<venv>/lib/python3.12/site-packages/dask/local.py", line 225 in execute_task
File "<venv>/lib/python3.12/site-packages/dask/local.py", line 239 in batch_execute_tasks
File "<venv>/lib/python3.12/concurrent/futures/thread.py", line 58 in run
File "<venv>/lib/python3.12/concurrent/futures/thread.py", line 92 in _worker
File "<venv>/lib/python3.12/threading.py", line 1010 in run
File "<venv>/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "<venv>/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x000000016cd0b000 (most recent call first):
File "<venv>/lib/python3.12/socket.py", line 295 in accept
File "<venv>/lib/python3.12/site-packages/pytest_rerunfailures.py", line 433 in run_server
File "<venv>/lib/python3.12/threading.py", line 1010 in run
File "<venv>/lib/python3.12/threading.py", line 1073 in _bootstrap_inner
File "<venv>/lib/python3.12/threading.py", line 1030 in _bootstrap
Thread 0x00000001f9bdf240 (most recent call first):
File "<venv>/lib/python3.12/threading.py", line 355 in wait
File "<venv>/lib/python3.12/queue.py", line 171 in get
File "<venv>/lib/python3.12/site-packages/dask/local.py", line 138 in queue_get
File "<venv>/lib/python3.12/site-packages/dask/local.py", line 501 in get_async
File "<venv>/lib/python3.12/site-packages/dask/threaded.py", line 90 in get
File "<venv>/lib/python3.12/site-packages/dask/base.py", line 662 in compute
File "<venv>/lib/python3.12/site-packages/dask/base.py", line 376 in compute
File "~/Dev/scanpy/tests/test_utils.py", line 243 in test_is_constant_dask
File "<venv>/lib/python3.12/site-packages/_pytest/python.py", line 159 in pytest_pyfunc_call
File "<venv>/lib/python3.12/site-packages/pluggy/_callers.py", line 103 in _multicall
File "<venv>/lib/python3.12/site-packages/pluggy/_manager.py", line 120 in _hookexec
File "<venv>/lib/python3.12/site-packages/pluggy/_hooks.py", line 513 in __call__
File "<venv>/lib/python3.12/site-packages/_pytest/python.py", line 1627 in runtest
File "<venv>/lib/python3.12/site-packages/_pytest/runner.py", line 174 in pytest_runtest_call
File "<venv>/lib/python3.12/site-packages/pluggy/_callers.py", line 103 in _multicall
File "<venv>/lib/python3.12/site-packages/pluggy/_manager.py", line 120 in _hookexec
File "<venv>/lib/python3.12/site-packages/pluggy/_hooks.py", line 513 in __call__
File "<venv>/lib/python3.12/site-packages/_pytest/runner.py", line 242 in <lambda>
File "<venv>/lib/python3.12/site-packages/_pytest/runner.py", line 341 in from_call
File "<venv>/lib/python3.12/site-packages/_pytest/runner.py", line 241 in call_and_report
File "<venv>/lib/python3.12/site-packages/_pytest/runner.py", line 132 in runtestprotocol
File "<venv>/lib/python3.12/site-packages/_pytest/runner.py", line 113 in pytest_runtest_protocol
File "<venv>/lib/python3.12/site-packages/pluggy/_callers.py", line 103 in _multicall
File "<venv>/lib/python3.12/site-packages/pluggy/_manager.py", line 120 in _hookexec
File "<venv>/lib/python3.12/site-packages/pluggy/_hooks.py", line 513 in __call__
File "<venv>/lib/python3.12/site-packages/_pytest/main.py", line 362 in pytest_runtestloop
File "<venv>/lib/python3.12/site-packages/pluggy/_callers.py", line 103 in _multicall
File "<venv>/lib/python3.12/site-packages/pluggy/_manager.py", line 120 in _hookexec
File "<venv>/lib/python3.12/site-packages/pluggy/_hooks.py", line 513 in __call__
File "<venv>/lib/python3.12/site-packages/_pytest/main.py", line 337 in _main
File "<venv>/lib/python3.12/site-packages/_pytest/main.py", line 283 in wrap_session
File "<venv>/lib/python3.12/site-packages/_pytest/main.py", line 330 in pytest_cmdline_main
File "<venv>/lib/python3.12/site-packages/pluggy/_callers.py", line 103 in _multicall
File "<venv>/lib/python3.12/site-packages/pluggy/_manager.py", line 120 in _hookexec
File "<venv>/lib/python3.12/site-packages/pluggy/_hooks.py", line 513 in __call__
File "<venv>/lib/python3.12/site-packages/_pytest/config/__init__.py", line 175 in main
File "<venv>/lib/python3.12/site-packages/_pytest/config/__init__.py", line 201 in console_main
File "<venv>/bin/pytest", line 10 in <module>
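One workaround the error message itself suggests: install `tbb` and select it before Numba initializes its threading layer. A sketch of that setup, using the documented `NUMBA_THREADING_LAYER` environment variable (requires `pip install tbb`; whether this actually resolves the dask interaction above is untested):

```python
import os

# Select the threadsafe TBB layer; this must happen before the first
# parallel numba call anywhere in the process initializes the layer.
os.environ["NUMBA_THREADING_LAYER"] = "tbb"
print(os.environ["NUMBA_THREADING_LAYER"])  # → tbb
```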
Apologies for the jitting comment if it’s wrong; otherwise looks good.
@ilan-gold said