This is an example of what's required to set up a new scraper.
Read more about why this data can be useful and how it can be made available here:
https://codeforkentuckiana.org/2019-12-18-power-utility-data/
If this repo is cloned, `lgeku_scraper.py` can be replaced with another location-specific scraper, like the one in this PR.

`instance_id` and `view_id` come from the outage map's HTML: https://kubra.io/stormcenter/views/66f63a73-3b4a-4b2a-a833-f01668ef4986, which is iframed in https://www.pepco.com/Outages/CheckOutageStatus/Pages/ViewOutageMap.aspx
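For reference, here is a minimal sketch of how a scraper might use those two values to fetch the current outage snapshot. The endpoint path, the `fetch_outages` name, and the placeholder IDs are all assumptions for illustration; kubra.io's actual API path may differ from what the scraper in this PR uses.

```python
import requests

# Both IDs are read out of the outage map's HTML (see the stormcenter URL above).
INSTANCE_ID = "<instance_id from the map HTML>"
VIEW_ID = "<view_id from the map HTML>"

# Hypothetical endpoint pattern; the exact path the scraper hits may differ.
STATE_URL = (
    "https://kubra.io/stormcenter/api/v1/stormcenters/"
    f"{INSTANCE_ID}/views/{VIEW_ID}/currentState"
)

def fetch_outages() -> dict:
    """Fetch the current outage snapshot as JSON."""
    response = requests.get(STATE_URL, timeout=30)
    response.raise_for_status()
    return response.json()
```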
`owner` and `repo` need to be set to the repo where the outage JSON should be written.

The only other thing to do is to add a GitHub token to the cloned repo's Actions secrets so that it can write to the destination repo.
The Action's workflow file comes along with the clone, and GitHub should pick it up and run it every 15 minutes, scraping and saving the current outages.