
V0.13 is very lag #814

Open
mengru-lotusflare opened this issue Mar 5, 2024 · 7 comments

mengru-lotusflare commented Mar 5, 2024

After upgrading hubble-ui from v0.12.1 to v0.13.0, hubble-ui is very slow; a lot of api/service-map-stream and api/control-stream requests are queued.
[screenshot: queued api/service-map-stream and api/control-stream requests]

Only part of the flows can be rendered.

There is also a reconnecting issue.

[screenshot: stream reconnecting notification]

Compared to the old getEvents requests, the UI could render very fast.

yandzee (Collaborator) commented Mar 7, 2024

Hey @mengru-lotusflare,
In that release we made a major change to how the UI client interacts with its backend: it no longer uses grpc-web but a simple short-polling scheme. I believe that's why you might be experiencing all of this.
Could you please try this version? Let's see if that improves your situation.
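
To illustrate the scheme, here is a minimal sketch of what a short-polling flow endpoint could look like. This is not the actual hubble-ui backend code; the Flow type and the handler body are hypothetical, and only the /api/service-map-stream path and the 8090 port come from this thread.

```go
package main

import (
	"encoding/json"
	"net/http"
	"sync"
)

// Flow is a hypothetical, simplified event shape used only for this sketch.
type Flow struct {
	Source      string `json:"source"`
	Destination string `json:"destination"`
	Verdict     string `json:"verdict"`
}

// flowBuffer accumulates flows between polls from the UI.
type flowBuffer struct {
	mu    sync.Mutex
	flows []Flow
}

// Drain returns everything buffered since the previous poll and clears the buffer.
func (b *flowBuffer) Drain() []Flow {
	b.mu.Lock()
	defer b.mu.Unlock()
	out := b.flows
	b.flows = nil
	return out
}

var buf flowBuffer

// serviceMapStream is called repeatedly by the UI (short polling); each
// response carries only the flows buffered since the previous request.
func serviceMapStream(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "application/json")
	_ = json.NewEncoder(w).Encode(buf.Drain())
}

func main() {
	http.HandleFunc("/api/service-map-stream", serviceMapStream)
	_ = http.ListenAndServe(":8090", nil)
}
```

Instead of the server pushing events over a long-lived grpc-web stream, the UI requests this endpoint over and over, and each response only carries what accumulated since the previous poll, which is why many small api/service-map-stream and api/control-stream requests now show up in the network tab.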

mengru-lotusflare (Author) commented

Let me have a try.

mengru-lotusflare (Author) commented Mar 8, 2024

@yandzee I have tried the hubble-ui frontend image quay.io/cilium/hubble-ui-ci:f41966374314aa145dfb8fcf78e4f80a45461fb3@sha256:4db5ed2dc6a1eee84235dd66c0f106b25717ee4be5ecab2b94b776eba7722a51 and the hubble-ui backend image quay.io/cilium/hubble-ui-backend-ci:f41966374314aa145dfb8fcf78e4f80a45461fb3@sha256:35459ec5a39e09854a9ab2fb46d18580b151f65c0b2fa5350a5710bce2e6a861 with cilium helm v0.13.12, but the hubble-ui pod cannot start:

  backend:
    Port:          8090/TCP
    Host Port:     0/TCP
    State:         Waiting
      Reason:      CrashLoopBackOff
    Last State:    Terminated
      Reason:      Error
      Message:     exec /usr/bin/backend: exec format error

geakstr (Collaborator) commented Mar 8, 2024

@mengru-lotusflare The exec format error means that the backend binary isn't compatible with your CPU architecture. Please try not specifying the sha256:* parts of the image references.

mengru-lotusflare (Author) commented Mar 11, 2024

Hi @geakstr, the exec format error problem has been fixed, and the UI sometimes renders faster than before, but sometimes it cannot load data for a long time, and sometimes it only loads part of the data.
[screenshot: partially rendered service map]
If the flow buffer size and duration could be made configurable, it would be much better.

yandzee (Collaborator) commented Mar 11, 2024

I pushed another commit to that PR; now you can try to tune it manually by setting the FLOWS_THROTTLE_DELAY and FLOWS_THROTTLE_SIZE environment variables.
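
For reference, here is a minimal sketch of how a backend could honor those two variables when batching flows. It assumes FLOWS_THROTTLE_DELAY parses as a Go duration and FLOWS_THROTTLE_SIZE as an integer; the actual hubble-ui parsing, defaults, and batching logic may differ.

```go
package main

import (
	"fmt"
	"os"
	"strconv"
	"time"
)

// envDuration reads a duration-style env var, falling back to def.
// The accepted format (e.g. "500ms") is an assumption for this sketch.
func envDuration(name string, def time.Duration) time.Duration {
	if v := os.Getenv(name); v != "" {
		if d, err := time.ParseDuration(v); err == nil {
			return d
		}
	}
	return def
}

// envInt reads an integer env var, falling back to def.
func envInt(name string, def int) int {
	if v := os.Getenv(name); v != "" {
		if n, err := strconv.Atoi(v); err == nil {
			return n
		}
	}
	return def
}

func main() {
	delay := envDuration("FLOWS_THROTTLE_DELAY", 500*time.Millisecond)
	size := envInt("FLOWS_THROTTLE_SIZE", 100)

	// Stand-in for the real flow stream: generate demo flows continuously.
	flows := make(chan string)
	go func() {
		for i := 0; ; i++ {
			flows <- fmt.Sprintf("flow-%d", i)
			time.Sleep(10 * time.Millisecond)
		}
	}()

	var batch []string
	flush := func(reason string) {
		if len(batch) == 0 {
			return
		}
		fmt.Printf("flushing %d flows (%s)\n", len(batch), reason)
		batch = batch[:0]
	}

	ticker := time.NewTicker(delay)
	defer ticker.Stop()

	for {
		select {
		case f := <-flows:
			batch = append(batch, f)
			if len(batch) >= size {
				flush("size threshold reached") // flush early once the size limit is hit
			}
		case <-ticker.C:
			flush("delay elapsed") // otherwise flush on every delay tick
		}
	}
}
```

In this sketch a batch is flushed as soon as it reaches FLOWS_THROTTLE_SIZE entries, and otherwise at every FLOWS_THROTTLE_DELAY tick, so tuning the two values trades larger, less frequent UI updates against smaller, more frequent ones.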

yandzee (Collaborator) commented Mar 11, 2024

By the way, could you please share the logs from the UI backend container? I don't really see how you could be getting that red notification about stream reconnecting.
