Describe the bug
Notify ties up a lot of resources on the Postgres server querying for Cold Chain data.
Querying each sensor individually has a lot of overhead (many separate queries for Postgres to plan and execute) compared with querying all relevant sensors at once and then doing the per-sensor processing inside Notify.
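As a rough illustration of the batched approach, here is a minimal sketch assuming a Rust worker talking to Postgres via sqlx; the crate choice, the temperature_log table, and the column names are assumptions for illustration only, not Notify's actual schema:

```rust
use chrono::{DateTime, Utc};
use sqlx::PgPool;

#[derive(sqlx::FromRow)]
struct LatestReading {
    sensor_id: String,
    temperature: f64,
    logged_at: DateTime<Utc>,
}

/// Fetch the latest reading for every sensor of interest in a single query,
/// instead of issuing one query per sensor.
async fn latest_readings(
    pool: &PgPool,
    sensor_ids: &[String],
) -> Result<Vec<LatestReading>, sqlx::Error> {
    sqlx::query_as::<_, LatestReading>(
        r#"
        SELECT DISTINCT ON (sensor_id) sensor_id, temperature, logged_at
        FROM temperature_log
        WHERE sensor_id = ANY($1)
        ORDER BY sensor_id, logged_at DESC
        "#,
    )
    .bind(sensor_ids)
    .fetch_all(pool)
    .await
}
```

Postgres then resolves everything in one round trip, and Notify can evaluate the alert rules for each sensor in memory.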
Notify should probably also throttle itself so that it waits a bit longer between query attempts.
The interval could be user configurable: if you want near-realtime alerts you could poll every 10 seconds or so, but in a big system you might only want to check for alerts every 5 or 10 minutes (see the sketch below).
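A hedged sketch of what a configurable interval could look like, again in Rust; the setting name and default value are hypothetical:

```rust
use std::time::Duration;

/// Hypothetical setting: seconds to wait between cold chain alert checks.
/// A small deployment could set 10 for near-realtime alerts; a large system
/// might prefer 300-600 to reduce load on Postgres.
fn poll_interval() -> Duration {
    let secs = std::env::var("NOTIFY_COLDCHAIN_POLL_SECONDS")
        .ok()
        .and_then(|s| s.parse::<u64>().ok())
        .unwrap_or(60); // assumed default: check once a minute
    Duration::from_secs(secs)
}

fn main() {
    loop {
        // run_cold_chain_checks();  // e.g. the batched query above, then alert evaluation
        std::thread::sleep(poll_interval());
    }
}
```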
To Reproduce
Steps to reproduce the behaviour:
1. Set up cold chain alerts for a significant number of sensors
2. Check CPU usage for Postgres
Expected behaviour
Additional context
Hey team - just a couple of videos showing the CPU spiking, happening roughly every 10 seconds. The CPU spiking is causing some slowness for mSupply users.
The ideal situation would be to migrate Postgres to a separate server so it does not compete with mSupply for CPU. I understand the CPU spiking is fairly normal, but if possible it would be great if the query could be optimised.