
[Bug]: JAVA out of memory #40

Open
nhill917 opened this issue May 28, 2024 · 3 comments

@nhill917

Describe the bug

When running SDMs with large numbers of presence points, I am getting a Java out-of-memory error from MaxEnt. Is there a way to overcome this? Also, when using the optimizeModel() function, what are reasonable starting values across the different methods (MaxEnt, random forest, boosted regression tree)? I am running a number of SDMs across the various methods and would like to make sure the optimisation step is appropriate. For example, in your vignette you use the values below for MaxEnt; what would you use for random forest and boosted regression tree?
h <- list(reg = seq(0.2, 5, 0.2), fc = c("l", "lq", "lh", "lp", "lqp", "lqph"))
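For concreteness, illustrative grids of the kind I am asking about for the other two methods (the hyperparameter names below are the ones getTunableArgs() reports for RF and BRT models; the values are only placeholders on my part, not recommendations from the package):

# purely illustrative grid - random forest (SDMtune method "RF", wrapping randomForest)
h_rf <- list(mtry = 2:8, ntree = seq(500, 2000, 500), nodesize = 1:5)
# purely illustrative grid - boosted regression tree (SDMtune method "BRT", wrapping gbm)
h_brt <- list(n.trees = seq(500, 2000, 500), interaction.depth = 1:3, shrinkage = c(0.01, 0.05, 0.1))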

Thank you for your help.
Kind regards,
nick.

Steps to reproduce the bug

library(SDMtune)
NA

Session information

NA

Additional information

NA

Reproducible example

  • I have done my best to provide the steps to reproduce the bug
nhill917 added the bug label May 28, 2024
@rogerio-bio

Hi @nhill917,

Most Java memory problems occur because your PC does not have enough RAM for what the model requires. I can easily run a model with 6k presences and 32 variables on 32 GB of RAM, but it takes a while. How big is your dataset?
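
One thing that is sometimes overlooked: if MaxEnt is being called through rJava (as in dismo), the JVM's default heap limit is much smaller than the machine's physical RAM, so it can be worth raising it explicitly. A minimal sketch, assuming you want to allow the JVM up to 8 GB:

options(java.parameters = "-Xmx8g")  # must run before rJava (or any package that attaches it) is loaded
library(SDMtune)

This only helps if the machine actually has that much free RAM; it will not fix a dataset that genuinely exceeds physical memory.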

@nhill917

nhill917 commented Jun 2, 2024

Hi rogerio, yes, I think this is the problem. I am running it with 13 variables and, for some species, more than 100,000 presence points. It reports using 5-6 GB of RAM. Is there any way to bypass this, or does maxnet bypass Java? From memory, I tried a run with Maxnet and had similar issues. Thank you for getting back to me.
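
(Side note on my own question: as far as I understand, maxnet is a pure-R implementation built on glmnet and does not call Java at all, so in SDMtune it should just be a matter of switching the method, e.g., assuming a SWD object named data has already been built:

model <- train(method = "Maxnet", data = data, fc = "lqph", reg = 1)

Any memory pressure there would then come from R itself rather than from the JVM.)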

@rogerio-bio

Hi @nhill917 ,

You can increase the RAM of your PC to meet the requirements of your model, or decrease the number of presence points. Alternatively, consider using a cloud PC for the analysis. This allows you to "rent" the amount of RAM you need to run the model in the cloud.
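
On cutting down the presence points: one simple option is to randomly subsample the occurrences before building the SWD object. A hypothetical sketch, assuming the coordinates sit in a data frame called presence_coords and you want to cap them at 50,000:

set.seed(42)  # make the subsample reproducible
keep <- sample(nrow(presence_coords), 50000)
presence_sub <- presence_coords[keep, ]

Spatial thinning (for example, keeping one point per raster cell) is another common way to shrink very dense occurrence sets while also reducing sampling bias.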
