Memory Leak on inference #1422

Answered by TomekPro
TomekPro asked this question in Q&A

@felixdittrich92 Finally, three things are needed to fix this memory leak (see the combined sketch below the list):

  1. export DOCTR_MULTIPROCESSING_DISABLE=TRUE
  2. export ONEDNN_PRIMITIVE_CACHE_CAPACITY=1
  3. Upgrade torch to 2.1 (in my case the CPU-only build): pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/cpu
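
For anyone wanting to apply all three steps in one place, here is a minimal sketch, not an official recipe: it assumes docTR's standard predictor API (`DocumentFile`, `ocr_predictor`), and `sample.pdf` is just a placeholder input. Setting both environment variables before torch/doctr are imported is the safe order, since they may otherwise be read too late to take effect.

```python
import os

# Set both variables before importing torch / doctr, so they are
# visible when the libraries initialize.
os.environ["DOCTR_MULTIPROCESSING_DISABLE"] = "TRUE"  # turn off docTR's multiprocessing
os.environ["ONEDNN_PRIMITIVE_CACHE_CAPACITY"] = "1"   # cap oneDNN's primitive cache

from doctr.io import DocumentFile
from doctr.models import ocr_predictor

doc = DocumentFile.from_pdf("sample.pdf")  # placeholder input document
model = ocr_predictor(pretrained=True)
result = model(doc)
print(result.render())
```

The same effect can be had by running the two export lines from steps 1 and 2 in the shell before launching Python.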

Thanks for your help :)

Answer selected by felixdittrich92
Category: Q&A
Labels: type: bug (Something isn't working)
3 participants
This discussion was converted from issue #1418 on January 04, 2024 14:41.