- Scrapes stock information from the Yahoo API and StockAnalysis.com.
- Supports SQLite and PostgreSQL databases and allows exporting to CSVs.
- The scraping job runs every 6 hours on GitHub Actions and saves to the remote PostgreSQL and SQLite databases.
- The PostgreSQL database is hosted on a custom Azure VPS at v.hemantasharma.com.np.
- You can obtain the build artifacts data.csv and database.sqlite in the Releases section.
- Clone this repository, then set up and run:
cd Stock-Scraper
python -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements.txt
playwright install
python main.py
The SQLite database output will be created in the data directory of the current folder.
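To verify a run, you can inspect the generated SQLite file. A minimal sketch — the data/database.sqlite path follows the output location above, and no table names are assumed:

```python
import sqlite3
from pathlib import Path

# Assumed output location per the step above; adjust if your path differs.
db_path = Path("data") / "database.sqlite"
db_path.parent.mkdir(exist_ok=True)  # ensure the folder exists on a fresh checkout

con = sqlite3.connect(db_path)
# List the tables the scraper created, without assuming their names.
tables = [row[0] for row in con.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
print(tables)
con.close()
```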
- Install the PostgreSQL client and server:
sudo apt-get install postgresql postgresql-contrib postgresql-client
- Start the PostgreSQL service:
sudo service postgresql start
- Log in to the psql shell:
sudo -i -u postgres psql
# On macOS, use: psql postgres
- Now create a user with a password:
CREATE USER sammy WITH PASSWORD 'password';
NOTE: Don't forget the trailing semicolon. You should see the output CREATE ROLE.
- Create a database owned by the new user:
CREATE DATABASE sammydb OWNER sammy;
- Quit the psql shell
\q
- You can access the created database as that user with:
psql -U name_of_user -d name_of_database
- Populate the .env file, filling in the database name, username, and password from the steps above:
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=yourDbName
POSTGRES_USER=yourUserName
POSTGRES_PASSWORD=yourPassword
- Run the script again.
python main.py
- Export Postgres to CSV:
python export.py
The CSV file will be created inside the data folder.
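export.py handles the actual database dump; conceptually, writing query results to CSV can be sketched like this. Only the CSV-writing helper below is runnable as shown — the stocks table name and the database cursor in the comment are hypothetical:

```python
import csv
from pathlib import Path

def rows_to_csv(path, header, rows):
    """Write a header row plus data rows to a CSV file (sketch of the export step)."""
    Path(path).parent.mkdir(parents=True, exist_ok=True)
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(header)
        writer.writerows(rows)

# With a live connection you would fetch rows first, e.g. (hypothetical table name):
#   cur.execute("SELECT * FROM stocks")
#   rows_to_csv("data/data.csv", [c[0] for c in cur.description], cur.fetchall())
```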