Out of memory on model create with large datasets #2386

Open
cdbethune opened this issue Mar 19, 2021 · 0 comments
#2385 details parquet-specific issues with reading in large datasets. During testing, we also observed that the dataset gets duplicated in memory during the splitting process, and appears to get duplicated again after splitting; when datasets are large, the server eventually runs out of RAM and halts.
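A minimal sketch of the failure mode, assuming the split is done with pandas-style boolean indexing (the file path and split ratio below are placeholders, not the actual dataset or code from this issue):

```python
import numpy as np
import pandas as pd

# Placeholder path -- purely illustrative, not the dataset from this issue.
df = pd.read_parquet("learningData.parquet")

# Boolean-mask splits return copies, so the original frame plus both
# halves are resident at the same time -- roughly 2x the loaded size.
rng = np.random.default_rng(0)
mask = rng.random(len(df)) < 0.8
train = df[mask]
test = df[~mask]

# If model creation then converts each split into new arrays (another
# copy), peak memory can approach 3-4x the dataset size, consistent
# with the "duplicated again after splitting" behaviour described above.

# One mitigation sketch: drop the original reference as soon as the
# split copies exist so only the two halves stay resident.
del df
```

This is only a sketch of where the duplication could come from; the actual split and model-create code paths would need to be profiled to confirm which copies are being held simultaneously.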
