how to parse a large file #267
I want to parse a large file of about 2 GB. I wonder if there is a multithreaded way to accelerate the parsing work.

Comments
There isn't anything built in to combine that would let you parallelize the work. You will need to chunk up the file on your own to get that. Of course, you should profile first to check whether parallelizing would actually help. Depending on the format, monoidal parsing could be used to parallelize your parser (though it is pretty limited).
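In case it helps anyone landing here, a minimal sketch of the chunk-and-parallelize approach using only the standard library, assuming the input is newline-delimited records. The file name `input.txt`, the thread count, and the `parse_record` function are placeholders, not part of combine's API; in practice `parse_record` would be whatever combine parser you run on each record.

```rust
use std::fs;
use std::thread;

// Hypothetical stand-in for a real combine parser applied to one record.
fn parse_record(line: &str) -> Option<u64> {
    line.trim().parse().ok()
}

fn main() -> std::io::Result<()> {
    // Read the whole file up front; for a ~2 GB file you may prefer
    // memory-mapping, but the chunking logic stays the same.
    let data = fs::read_to_string("input.txt")?; // hypothetical path
    let n_threads = 4; // tune to your machine
    let bytes = data.as_bytes();

    // Compute chunk boundaries, extending each one forward to the next
    // newline so that no record is split across two chunks.
    let mut boundaries = vec![0usize];
    for i in 1..n_threads {
        let mut pos = (i * bytes.len() / n_threads).max(*boundaries.last().unwrap());
        while pos < bytes.len() && bytes[pos] != b'\n' {
            pos += 1;
        }
        boundaries.push((pos + 1).min(bytes.len()));
    }
    boundaries.push(bytes.len());

    // Parse each chunk on its own scoped thread and combine the results.
    let totals: Vec<u64> = thread::scope(|s| {
        let handles: Vec<_> = boundaries
            .windows(2)
            .map(|w| {
                let chunk = &data[w[0]..w[1]];
                s.spawn(move || chunk.lines().filter_map(parse_record).sum::<u64>())
            })
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).collect()
    });

    println!("total: {}", totals.iter().sum::<u64>());
    Ok(())
}
```

Splitting at newlines is what makes each chunk independently parseable; if your records can contain embedded newlines you would need a format-aware splitter instead.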
Can the async method be used to build dependency relations? Also, is there any Rust example of monoidal parsing? I could only find some Haskell versions. Thank you for your help.
What do you mean?
Afraid not, I only know of Haskell examples. If you know enough Haskell to understand them, they should be easy to port to Rust; otherwise I wouldn't go with that approach.
Ok, thank you. I will try to optimize it first.