Reading a large file into memory fails with java.lang.ArrayIndexOutOfBoundsException: arraycopy: length -84 is negative
#656
The problem is that a lot of the parser code assumes it is working off a single Int-indexed view of the input (see upickle/ujson/templates/ElemParser.scala, line 32 at fdb5a0d), so once the input grows past what an Int offset can address the index arithmetic wraps negative (see the sketch after this comment).
This can probably be fixed, but might take some work:
All three approaches would work if someone wants to take a crack at it
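To make the size constraint concrete, here is a standalone illustration (not upickle code): a 2.7 GiB input is larger than Int.MaxValue bytes, so any Int-typed offset or length computed past that point wraps around to a negative number, which matches the negative arraycopy length in the title.

```scala
// Illustration only: why Int-based buffer offsets break past ~2 GiB.
object OffsetOverflowDemo {
  def main(args: Array[String]): Unit = {
    val fileSize: Long = (2.7 * 1024 * 1024 * 1024).toLong // ~2.9 billion bytes

    println(Int.MaxValue)   // 2147483647, the largest possible Int offset / array length
    println(fileSize)       // larger than Int.MaxValue
    println(fileSize.toInt) // wraps around to a negative value

    // Any length derived from such a wrapped offset (e.g. end - start) can also
    // come out negative, which is what System.arraycopy then rejects.
  }
}
```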
Fair enough, but I still think this is an issue in upickle/ujson ;)
I was looking around the library for some sort of "lazy" Visitor that would allow me to scan over the data without buffering it excessively, but from what I saw even the base parser assumes that a JSON array has a length. I wonder if unpickling performance would suffer greatly if no such assumption was made and the elements of objects/arrays were simply emitted to an underlying Visitor 🤔
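For what it's worth, here is a rough sketch of that kind of lazy traversal, assuming the upickle.core Visitor machinery (SimpleVisitor, ArrVisitor, NoOpVisitor) and ujson.transform as found in recent upickle versions. It counts the elements of a top-level array while discarding their contents, so no ujson.Value tree is built for them; note that it does not by itself lift the Int-offset limit on the input buffer discussed above.

```scala
import upickle.core.{ArrVisitor, NoOpVisitor, SimpleVisitor, Visitor}

// Streaming visitor sketch: counts the elements of a top-level JSON array,
// handing each element to NoOpVisitor instead of materializing it.
object CountElements extends SimpleVisitor[Unit, Long] {
  def expectedMsg = "expected a top-level JSON array"

  override def visitArray(length: Int, index: Int): ArrVisitor[Unit, Long] =
    new ArrVisitor[Unit, Long] {
      private var count = 0L
      def subVisitor: Visitor[_, _] = NoOpVisitor           // parse-and-discard each element
      def visitValue(v: Unit, index: Int): Unit = count += 1
      def visitEnd(index: Int): Long = count
    }
}

object Demo {
  def main(args: Array[String]): Unit = {
    // ujson.transform feeds parse events straight into the visitor,
    // so no intermediate ujson.Value tree is built for the elements.
    val n = ujson.transform("""[{"a": 1}, {"b": 2}, {"c": 3}]""", CountElements)
    println(n) // 3
  }
}
```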
Trying to read a JSON file of the format `Seq[Map[String, T]]` using `ujson.read(os.read.stream(path))` fails due to an internal read buffer overflow. The file is 2.7 GiB large.
Stack trace:
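For reference, a minimal self-contained version of the failing call described above might look like the sketch below; the file path is hypothetical, and any comparably large (multi-GiB) top-level JSON array of objects should trigger the same failure.

```scala
object Repro {
  def main(args: Array[String]): Unit = {
    // Hypothetical path to a ~2.7 GiB file shaped like Seq[Map[String, T]],
    // i.e. a top-level JSON array of objects.
    val path = os.pwd / "big.json"

    // Streams the file into ujson's parser; with a file this large the parser's
    // internal Int-based buffer bookkeeping overflows and the call throws
    // java.lang.ArrayIndexOutOfBoundsException: arraycopy: length ... is negative
    val json = ujson.read(os.read.stream(path))
    println(json.arr.length)
  }
}
```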