
[Bug]: PaddleOCR with OpenVINO (GPU) sometimes gives messed-up OCR results #28897

Jackochiu opened this issue Feb 10, 2025
Labels: bug, support_request


OpenVINO Version

2024.6.0

Operating System

Windows System

Device used for inference

GPU

Framework

PaddlePaddle

Model used

https://storage.openvinotoolkit.org/repositories/openvino_notebooks/models/paddle-ocr/ch_PP-OCRv3_rec_infer.tar

Issue description

We followed the OpenVINO notebook "PaddleOCR Webcam" on GitHub to run PaddleOCR.

During testing, we encountered the following issue:
Issue 1: Starting from the second inference, PaddleOCR with OpenVINO (GPU) sometimes produces garbled OCR results for images.

However, if we recompile the model before each image, the GPU results are correct and match the CPU results. This approach wastes time on model compilation, though. Is there any other way besides recompiling the model?
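For reference, the per-image recompilation workaround can be sketched as below. This is a minimal sketch, not the notebook's actual code: the helper name `run_ocr_per_image` and the `preprocess` callable are ours, and `core` stands in for an `openvino.Core`-like object.

```python
def run_ocr_per_image(core, rec_model, images, preprocess, device="GPU"):
    """Workaround sketch: recompile the recognition model for every image.

    This gives correct GPU results in our tests, but pays the model
    compilation cost once per image instead of once per session.
    """
    results = []
    for image in images:
        # Recompiling here is the expensive step we would like to avoid.
        compiled = core.compile_model(rec_model, device)
        request = compiled.create_infer_request()
        results.append(request.infer({0: preprocess(image)}))
    return results
```

With a single upfront `compile_model` call (the normal usage), the garbled results appear from the second inference onward; only recompiling per image avoids them.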

Image

  • We tried calling reset_state() and release_memory() before each inference, but the issue still persists.

For example, when we run OCR on the following image:
CPU output: 茉莉茶园 茉莉 清茶 花茶 工法
GPU output: 螳常 ' 刚 工法

Image

Issue 2: The OCR results from PaddleOCR with OpenVINO (GPU) change depending on the rec_batch_num setting, while the CPU results remain unaffected. (rec_batch_num is the batch size for the PaddleOCR recognition (rec) model.) We'd like to confirm whether this is a GPU-related bug and how we can fix it.
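For clarity, rec_batch_num only controls how many detected text-line crops are fed to the recognition model per inference call. A sketch of that batching (the function name is ours, not PaddleOCR's internal code):

```python
def batch_crops(crops, rec_batch_num):
    """Split detected text-line crops into recognition batches of
    at most rec_batch_num crops each, preserving order."""
    return [crops[i:i + rec_batch_num]
            for i in range(0, len(crops), rec_batch_num)]
```

Since batching only changes how crops are grouped, the recognized text should not depend on rec_batch_num; if it does on GPU but not on CPU, that suggests the problem is in GPU inference rather than in PaddleOCR's pre/post-processing.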

We’d like to understand the possible causes of this issue and any recommended debugging steps. Additionally, we want to know why rec_batch_num affects the results during GPU inference.

Step-by-step reproduction

No response

Relevant log output

Issue submission checklist

  • I'm reporting an issue. It's not a question.
  • I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • There is reproducer code and related data files such as images, videos, models, etc.