
Support Apple's "Vision" framework for on-device OCR text recognition #10480

jasongitmail opened this issue Jan 23, 2024 · 4 comments
Labels: feature-pending-triage (a new feature request pending triage to confirm validity)

Comments


jasongitmail commented Jan 23, 2024

Is your feature request related to a problem? Please describe.

I'd like to be able to use Apple's excellent on-device OCR text recognition framework, "Vision", available since iOS 13. (For clarity, this is not the visionOS headset.)

It offers high accuracy and speed for OCR: it runs entirely on-device, uses machine learning for recognition, and has been steadily optimized by Apple in recent years.

Describe the solution you'd like

A TypeScript API that accepts an image and returns both the recognized text and the positions of that text within the image, backed by Apple's Vision OCR APIs.
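A condensed API along these lines might look like the sketch below. All names here are hypothetical, not an existing NativeScript API. One concrete detail worth encoding in a wrapper: Vision reports bounding boxes in normalized coordinates (0..1) with the origin at the image's lower-left corner, so callers would likely want them converted to top-left-origin pixel rectangles:

```typescript
// Hypothetical result types for an OCR wrapper around Apple's Vision
// framework. None of these names exist in NativeScript today.
interface PixelRect {
  x: number;      // pixels from the left edge
  y: number;      // pixels from the top edge
  width: number;
  height: number;
}

interface RecognizedText {
  text: string;        // the recognized string
  confidence: number;  // 0..1, as reported by Vision
  boundingBox: PixelRect;
}

// Vision's VNRecognizedTextObservation.boundingBox is normalized
// (0..1) with a lower-left origin. This pure helper converts it to
// a top-left-origin pixel rect for a given image size.
function visionRectToPixels(
  normalized: { x: number; y: number; width: number; height: number },
  imageWidth: number,
  imageHeight: number
): PixelRect {
  return {
    x: normalized.x * imageWidth,
    // Flip the vertical axis: Vision's y measures up from the bottom.
    y: (1 - normalized.y - normalized.height) * imageHeight,
    width: normalized.width * imageWidth,
    height: normalized.height * imageHeight,
  };
}
```

For example, a full-width line of text at the very top of a 100×200 image comes back from Vision as roughly {x: 0, y: 0.9, width: 1, height: 0.1}, which this helper maps to a pixel rect at the top of the image.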

Describe alternatives you've considered

Anything else?


jasongitmail added the feature-pending-triage label on Jan 23, 2024
@NathanWalker
Contributor

Completely agree. We've used the Vision framework in apps before and it's quite nice. React Native Vision Camera from Margelo could likely be used with NativeScript as well; as Marc mentioned, its libraries were recently decoupled from React. We want to try that sometime early this year.

@farfromrefug
Collaborator

Isn't this already possible in NativeScript using the native iOS APIs through JS? I'm not sure I understand what more needs to be added to NativeScript.
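For reference, driving Vision directly through NativeScript's iOS marshalling looks roughly like the following. This is an untested sketch, not a verified implementation: the Vision classes exist only on-device, so ambient declarations stand in for the runtime globals, and the exact marshalled method names are assumptions based on NativeScript's Objective-C naming conventions.

```typescript
// Sketch of calling Vision's text recognizer from NativeScript.
// On iOS these are globals exposed by the NativeScript runtime;
// the declarations below only let this compile off-device.
declare const VNRecognizeTextRequest: any;
declare const VNImageRequestHandler: any;
declare const VNRequestTextRecognitionLevel: any;

function recognizeText(cgImage: any, done: (lines: string[]) => void): void {
  const request = VNRecognizeTextRequest.alloc().initWithCompletionHandler(
    (req: any, error: any) => {
      if (error) {
        done([]);
        return;
      }
      const lines: string[] = [];
      for (let i = 0; i < req.results.count; i++) {
        // Each result is a VNRecognizedTextObservation; take the
        // single best candidate string for each observed line.
        const top = req.results.objectAtIndex(i).topCandidates(1);
        if (top.count > 0) {
          lines.push(top.objectAtIndex(0).string);
        }
      }
      done(lines);
    }
  );
  request.recognitionLevel = VNRequestTextRecognitionLevel.Accurate;
  // performRequests:error: is marshalled as a throwing call.
  const handler = VNImageRequestHandler.alloc().initWithCGImageOptions(
    cgImage,
    {} as any
  );
  handler.performRequestsError([request] as any);
}
```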

@NathanWalker
Contributor

I believe @jasongitmail is just asking for a condensed API to make it a bit more palatable. We have some examples of Vision framework usage in NativeScript apps that I'll have to share sometime.

@farfromrefug
Collaborator

@NathanWalker oh, OK. I'll switch my app over to it too (it uses Tesseract right now), so there will be another real-world example at that point.
Should we move this ticket to Discussions?
