Allow on-demand Scraper usage #214
The end goal: a magic button for any Scraper that says something like "Run Scraper Locally". When this button is clicked, the user should need to do as little as possible for the scraper to run and give them an Extraction. This lets a user both donate compute time to PDAP and run scrapers for their own benefit.

If we have a Scraper written for a Data Source, and we've created an Archive of that Data Source, we should allow people to run the Scraper locally on demand, using their own compute power.
Can we write a package or plugin that lets anyone run our scrapers in-browser?
This would be achieved by adding to the existing PDAP-app repo and probably deploying it to app.pdap.io or a local version.
This may be some kind of Docker file.
The package should include all necessary dependencies.
It could include a local version of the Data Sources search.
Users should be able to "Run Scraper Locally" on any Dataset they find that has a Scraper.
The Extractions should be saved locally.
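If we go the Docker route suggested above, the package could look something like this minimal sketch: an image that bundles one scraper with its dependencies and writes Extractions to a mounted volume. The image name, `requirements.txt` layout, and `run_scraper.py` entrypoint are all hypothetical, not existing PDAP artifacts.

```dockerfile
# Hypothetical packaging sketch: bundle a scraper and its dependencies
# so a user can run it locally with a single `docker run`.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies inside the image so the user doesn't have to
# manage a Python environment themselves.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# The scraper code plus a small entrypoint that saves the Extraction.
COPY . .

# Write Extractions to a mounted volume so results stay on the user's machine.
ENTRYPOINT ["python", "run_scraper.py", "--output", "/extractions"]
```

A user would then run something like `docker run -v "$PWD/extractions:/extractions" pdap/scraper-runner` (image name hypothetical) and find the Extraction saved locally.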