
Feature Request: Monitor Multiple Celery Backends, Merging Metrics #323

Open
coneybeare opened this issue Sep 30, 2024 · 8 comments

@coneybeare

I'd love to pass a comma-separated list of CE_BROKER_URLs and have this spin up some multiprocessing under the hood, merging the results in the Prometheus metrics endpoint. This is possible with Multiprocess Mode in the standard prometheus_client library. Are there any plans to add this functionality?
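A rough sketch of what this might look like, assuming prometheus_client's multiprocess mode does the merging; the `run_exporter` body, the metrics directory, and the port are placeholders rather than actual celery-exporter internals:

```python
# Sketch only: one child process per broker URL, metrics merged via
# prometheus_client multiprocess mode. The prometheus_client calls are the
# real API; run_exporter, the directory, and the port are placeholders.
import os
from multiprocessing import Process

from prometheus_client import CollectorRegistry, start_http_server
from prometheus_client import multiprocess


def run_exporter(broker_url: str) -> None:
    # Placeholder: each child would run the exporter loop against a single
    # broker, writing metric files into PROMETHEUS_MULTIPROC_DIR instead of
    # serving its own /metrics endpoint.
    ...


if __name__ == "__main__":
    # Must be set (and the directory must exist) before children create metrics.
    os.environ.setdefault("PROMETHEUS_MULTIPROC_DIR", "/tmp/celery_exporter_metrics")
    os.makedirs(os.environ["PROMETHEUS_MULTIPROC_DIR"], exist_ok=True)

    broker_urls = os.environ["CE_BROKER_URL"].split(",")
    workers = [Process(target=run_exporter, args=(url,)) for url in broker_urls]
    for p in workers:
        p.start()

    # The parent exposes a single /metrics endpoint that aggregates the
    # per-process metric files written by the children.
    registry = CollectorRegistry()
    multiprocess.MultiProcessCollector(registry)
    start_http_server(9808, registry=registry)

    for p in workers:
        p.join()
```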

@danihodovic
Owner

What's the deployment setup that you are running? Django on K8s? What prevents you from deploying one celery-exporter per application / K8s namespace?

@coneybeare
Author

We have lots of different apps per env/namespace, with separate brokers. Without merging, we need a unique exporter pod per app per namespace, when we could just have one per namespace.

@coneybeare
Author

It’s just a lot of pods to manage when we could have just three, with less overhead, especially when the merging is almost free. Thought it might be a useful enhancement for others too, so I thought I would ask!

@danihodovic
Owner

How would you differentiate metrics from two different Django apps?

@coneybeare
Author

Not sure how Django differs in its default setup, but we set up our FastAPI/Celery tasks to use different queue names by app and purpose, and we can filter dashboards in Grafana based on the hostname, task name, and queue_name labels using PromQL.
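For illustration, the per-app queue naming described above might look roughly like this (the app names, broker URLs, and queue names are invented for the example):

```python
# Illustrative only: two apps, each with its own broker and app-scoped queue
# names, so exporter metrics can later be filtered on the queue_name label
# (e.g. queue_name=~"billing-.*" in PromQL).
from celery import Celery

billing = Celery("billing", broker="redis://billing-redis:6379/0")
billing.conf.task_default_queue = "billing-default"
billing.conf.task_routes = {
    "billing.reports.*": {"queue": "billing-reports"},
}

signup = Celery("signup", broker="redis://signup-redis:6379/1")
signup.conf.task_default_queue = "signup-default"
```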

@danihodovic
Owner

but we set up our FastAPI/Celery tasks to use different queue names by app and purpose

Are all of the applications running against the same broker (Redis / RabbitMQ) instance? Or do you have one for each? If it's one for each, how would you differentiate application A from application B, if they're using the same task and queue names?

@coneybeare
Author

coneybeare commented Oct 18, 2024 via email

@danihodovic
Copy link
Owner

While the work done by @DanArmor in #339 doesn't allow you to use one exporter to read from multiple Celery deployments, it does allow you to deploy multiple exporters and collect the data in one Prometheus backend. It's been released as 0.11.0.
