Parsing big files: worker runs out of memory with no error message #350

adrisanchu opened this issue Mar 21, 2022 · 0 comments

In which part of the interface would this feature apply?
[x] 1. Load your data
[ ] 2. Choose a chart
[ ] 3. Mapping
[ ] 4. Customize
[ ] 5. Export

Is your feature request related to a problem? Please describe.
I have tried to parse a 300 MB .csv file (8800 rows x 2000 cols).
In the UI, the "Data parsing options" panel is shown, but the data is never displayed because the worker runs out of memory, so the page looks as if it were broken.
I got the following message in the console:
DOMException: Failed to execute 'postMessage' on 'Worker': Data cannot be cloned, out of memory.
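
For context, this seems to be the pattern that fails (an assumption on my part; I haven't traced your actual worker code, and the worker file name below is made up). postMessage structured-clones its argument, and cloning this much data can throw synchronously:

```ts
// Minimal sketch of the failing call, assuming the raw file text is
// posted to a parsing worker (the worker file name is hypothetical).
const worker = new Worker('parser.worker.js');

function sendToWorker(fileText: string): void {
  // postMessage structured-clones its argument; cloning ~300 MB of
  // text can throw synchronously with the DOMException above.
  worker.postMessage(fileText);
}
```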

Describe the solution you'd like
It would be great to change the UI when postMessage fails with an 'out of memory' error, and tell the user something like "sorry, your file is too big to be processed". Maybe with a modal window?
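
Since the exception is thrown synchronously by postMessage, it should be catchable with a plain try/catch. A rough sketch (the showFileTooBigModal callback is hypothetical):

```ts
function sendToWorker(fileText: string, showFileTooBigModal: () => void): void {
  try {
    worker.postMessage(fileText);
  } catch (err) {
    // "Data cannot be cloned, out of memory" surfaces as a DOMException.
    if (err instanceof DOMException) {
      showFileTooBigModal(); // e.g. "sorry, your file is too big to be processed"
      return;
    }
    throw err; // anything else is a real bug, re-raise it
  }
}
```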

Describe alternatives you've considered
I don't know which parser you are currently using (I checked the code, but it refers to a core package of your own), but papaparse is a good option for parsing big files in the frontend.
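
For example, PapaParse can stream a File in chunks, with parsing done in its own worker, so no single 300 MB payload ever has to be structured-cloned in one piece. A sketch of what I mean (untested against your data model):

```ts
import Papa from 'papaparse';

// Parse a large CSV chunk by chunk; each chunk is cloned back from
// PapaParse's worker separately, avoiding one giant postMessage.
function parseBigCsv(file: File, onDone: (rows: unknown[][]) => void): void {
  const rows: unknown[][] = [];
  Papa.parse(file, {
    worker: true, // parse off the main thread
    chunk: (results) => {
      rows.push(...(results.data as unknown[][])); // rows of this chunk only
    },
    complete: () => onDone(rows),
    error: (err) => console.error('Parse failed:', err.message),
  });
}
```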

Additional context
Here is a screenshot of the page after dropping the file:
[screenshot]
