Create resource for core Econ-ARK contributors to access more powerful setups #22
@llorracc @mnwhite, when you get a chance could you clarify what you need here? Specifically, what limitations are you currently running up against in terms of specs, and what tools will you need to use on such a machine? I can research the options and set something up for you, but I need more info. For instance, in Chris's email he said the work he's doing needs a 64GB machine because of memory issues -- is that a safe minimum to look for, and are there any other requirements? |
The 64GB is *not* typical for us. It's for one project, and only if you
want to do *absolutely everything* in that project in one run, *and* have
every single piece of simulated data generated available for use
afterward. Like, six or eight floats for 40,000 people on a quarterly
basis for literally the span of human civilization (or all of history, if
you're a young earther). If the code were restructured to *not* keep all
of that data, but instead get rid of it after it has been used for its
purposes in the paper, the memory requirements would be well under 1GB.
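The estimate above can be sanity-checked with a quick back-of-envelope calculation. The figures here are assumptions filled in from the comment (8 float64 variables, 40,000 simulated people, quarterly data over roughly 5,000 years of "human civilization"), not numbers from the actual project:

```python
# Back-of-envelope memory estimate (assumed figures, see note above).
BYTES_PER_FLOAT = 8          # float64
n_vars = 8                   # "six or eight floats" per person per quarter
n_agents = 40_000            # simulated people
quarters = 4 * 5_000         # quarterly data over ~5,000 years

# Keeping every quarter of simulated data in memory at once:
total_bytes = BYTES_PER_FLOAT * n_vars * n_agents * quarters
print(f"full history: {total_bytes / 1e9:.1f} GB")   # ~51 GB, in the ballpark of a 64 GB machine

# Keeping only the current quarter and discarding data after use:
per_quarter = BYTES_PER_FLOAT * n_vars * n_agents
print(f"one quarter:  {per_quarter / 1e6:.2f} MB")   # well under 1 GB, as described
```

This is consistent with both claims in the comment: retaining the full simulated history lands in 64GB territory, while discarding data after it has served its purpose drops the footprint to megabytes.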
I hadn't noticed this issue before now. I thought our bigger related
priority was resources for remote Jupyter notebooks for teaching/seminars?
|
I agree that the bigger priority is enabling us to quickly load remote Jupyter notebooks (issue #5) but I'm stuck on that one waiting for meetings with various people so I thought I'd push ahead on this one. :) I didn't realize it was such a rare use case, though. |
I think 64GB and, say, 4 cores should suffice. But my sense is that we can probably set up a number of configuration options and then choose among them at the time we create the machines?
|
Just read Matt's comment -- yes, 64GB is not typical, but the point of getting this in place is (at least partly) to handle the not-typical cases. The typical cases are ones we can run on our normal machines without needing AWS (or whatever).
|
Figure out how to give core Econ-ARK contributors access to much more powerful setups than the default mybinder-type machine.