-
We're using Dagster Serverless. I developed an asset pipeline locally and got everything working. I have a few assets that return small, simple Python classes that are then used as inputs to other assets; these classes are just pointers to Snowflake objects. Locally, I could materialize the assets independently without issue. However, in serverless, it looks like the materialized assets are not accessible, because they're pickled to the local filesystem. What is the right/easy way to persist these simple objects in serverless? Do I need to set up a cloud storage integration, or is there something easier for small materialized assets?
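For concreteness, a minimal sketch of the pattern being described (the class, asset, and object names are illustrative, not from the actual pipeline):

```python
from dataclasses import dataclass

from dagster import asset


@dataclass
class SnowflakeTableRef:
    # Lightweight pointer to a Snowflake object; the data itself stays in Snowflake.
    database: str
    schema: str
    table: str


@asset
def staged_orders() -> SnowflakeTableRef:
    # ... run some Snowflake DDL/DML here ...
    return SnowflakeTableRef("ANALYTICS", "STAGING", "ORDERS")


@asset
def enriched_orders(staged_orders: SnowflakeTableRef) -> SnowflakeTableRef:
    # The downstream asset receives the pickled pointer via the IO manager,
    # which is where local-vs-serverless storage behavior differs.
    return SnowflakeTableRef("ANALYTICS", "MARTS", "ORDERS_ENRICHED")
```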
-
Are you binding an IO manager explicitly? In serverless, the IO manager should default to a cloud-backed, serverless-specific IO manager rather than the local filesystem-based one. If you are explicitly binding an IO manager to your jobs, you may want to bind it to the following:
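As a hedged sketch of what explicitly binding a cloud-backed pickle IO manager looks like — the bucket name and the choice of `dagster_aws`'s `s3_pickle_io_manager` here are assumptions, not necessarily the exact serverless IO manager being recommended:

```python
from dagster import Definitions, asset
from dagster_aws.s3 import s3_pickle_io_manager, s3_resource


@asset
def example_pointer() -> dict:
    # Small picklable value; persisted between assets by the bound IO manager.
    return {"table": "ANALYTICS.STAGING.ORDERS"}


defs = Definitions(
    assets=[example_pointer],
    resources={
        # Pickles asset outputs to S3 rather than the local filesystem.
        # The bucket name is illustrative.
        "io_manager": s3_pickle_io_manager.configured(
            {"s3_bucket": "my-dagster-bucket"}
        ),
        # s3_pickle_io_manager requires an "s3" resource (a boto3-backed client).
        "s3": s3_resource,
    },
)
```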
Oh hey! I was changing too many things at once. I think I've finally pinpointed the issue. The dbt Cloud thing was just a red herring.

It was just the fact that I didn't have the `boto3` library installed. If I reset back to where I started and just add that package to my requirements file, it correctly uses the local filesystem IO manager when run locally and the serverless one when run in serverless.

I guess the only thing left here is that maybe `boto3` should be an explicit dependency of the dagster-cloud package. Thanks for your help @prha!
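For reference, the fix amounts to one extra line in the deployment's requirements file (the other entries shown are illustrative):

```
# requirements.txt
dagster
dagster-cloud
boto3  # needed by the serverless (S3-backed) IO manager
```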