Commit 4e6ea30

Start file reorganization

From now on there is a "covid19br" package from which the scripts import their libraries. This makes it easier to reuse code across the scripts, and the library can also be used by the Brasil.IO covid19 app (all of the demographics, epiweek, etc. code can live in this library alone).
1 parent 51e2676 commit 4e6ea30

30 files changed (+90 -102 lines)
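For orientation, here is a minimal sketch of the import pattern this commit establishes. The module names come from the diffs below; the layout comments are inferred from the renames, and nothing else is part of the commit:

```python
# Layout after this commit (per the renames in this diff):
#   covid19br/__init__.py      <- new, empty package marker
#   covid19br/converters.py    <- was ./converters.py
#   covid19br/demographics.py  <- was ./demographics.py

# Top-level scripts such as consolida.py now import the shared code
# absolutely from the package:
from covid19br import converters
from covid19br import demographics

# while modules inside the package import siblings relatively
# (see the covid19br/converters.py diff):
# from . import demographics
```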

Makefile

Lines changed: 2 additions & 2 deletions
```diff
@@ -6,8 +6,8 @@ lint:
 docker-build:
 	docker image build -t covid19-br .
 
-docker-collect:
-	docker container run --rm --name covid19-br --volume $(PWD)/data/output:/app/data/output covid19-br ./collect.sh
+docker-run-spiders:
+	docker container run --rm --name covid19-br --volume $(PWD)/data/output:/app/data/output covid19-br ./run-spiders.sh
 
 docker-run: docker-build
 	docker container run --rm --name covid19-br --volume $(PWD)/data/output:/app/data/output covid19-br ./run.sh
```

README.en.md

Lines changed: 9 additions & 11 deletions
````diff
@@ -98,22 +98,20 @@ Requires Python 3 (tested in 3.8.2). To set up your environment:
 
 - Install Python 3.8.2
 - Create a virtualenv
-- Install the dependencies:
-  - Consolidation script and bot: `pip install -r requirements.txt`
-  - States data extractors: `pip install -r requirements-collect.txt`
-- Run the collect script: `./collect.sh`
+- Install the dependencies: `pip install -r requirements.txt`
+- Run the collect script: `./run-spiders.sh`
 - Run the consolidation script: `./run.sh`
 
 Check the output in `data/output`.
 
 ### Docker
 
-If you'd rather use Docker to execute, you just need to follow these steps:
-
+If you'd rather use Docker to execute, you just need to follow these steps:
+
 ```shell
-make docker-build # to build the image
-make docker-collect # to collect the data
-make docker-run # to consolidate the data
+make docker-build # to build the image
+make docker-run-spiders # to collect the data
+make docker-run # to consolidate the data
 ```
 
 ## SEE ALSO
@@ -129,7 +127,7 @@ Wanna see which projects and news are using our data? [See the clipping](clippin
 
 ## Data Update on Brasil.IO
 
-Create a `.env` file with the correct values to the following environment variables:
+Create a `.env` file with the correct values to the following environment variables:
 
 ```shell
 BRASILIO_SSH_USER
@@ -146,4 +144,4 @@ It will collect the data from the sheets (that are linked in
 `data/boletim_url.csv` and `data/caso_url.csv`), add the data to the repository, compact them, send them to the server, and execute the dataset update command.
 
 > Note: the script that automatically downloads and converts data must
-> be executed separately, with the command `./collect.sh`.
+> be executed separately, with the command `./run-spiders.sh`.
````

README.md

Lines changed: 6 additions & 8 deletions
````diff
@@ -123,10 +123,8 @@ Necessita de Python 3 (testado em 3.8.2). Para montar seu ambiente:
 
 - Instale o Python 3.8.2
 - Crie um virtualenv
-- Instale as dependências:
-  - Script de consolidação e robô: `pip install -r requirements.txt`
-  - Extratores de dados estaduais: `pip install -r requirements-collect.txt`
-- Rode o script de coleta: `./collect.sh`
+- Instale as dependências: `pip install -r requirements.txt`
+- Rode o script de coleta: `./run-spiders.sh`
 - Rode o script de consolidação: `./run.sh`
 
 Verifique o resultado em `data/output`.
@@ -136,9 +134,9 @@ Verifique o resultado em `data/output`.
 Se você preferir utilizar o Docker para executar, basta usar os comandos a seguir :
 
 ```shell
-make docker-build # para construir a imagem
-make docker-collect # para coletar os dados
-make docker-run # para consolidar os dados
+make docker-build # para construir a imagem
+make docker-run-spiders # para coletar os dados
+make docker-run # para consolidar os dados
 ```
 
 ## VEJA TAMBÉM
@@ -176,4 +174,4 @@ repositório, compactá-los, enviá-los ao servidor e executar o comando de
 atualização de dataset.
 
 > Nota: o script que baixa e converte os dados automaticamente deve ser
-> executado separadamente, com o comando `./collect.sh`.
+> executado separadamente, com o comando `./run-spiders.sh`.
````

consolida.py

Lines changed: 2 additions & 2 deletions
```diff
@@ -9,8 +9,8 @@
 import scrapy
 from scrapy.exceptions import CloseSpider
 
-import converters
-import demographics
+from covid19br import converters
+from covid19br import demographics
 
 DATA_PATH = Path(__file__).absolute().parent / "data"
 ERROR_PATH = DATA_PATH / "error"
```

covid19br/__init__.py

Whitespace-only changes.

converters.py renamed to covid19br/converters.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -3,7 +3,7 @@
 
 import rows
 
-import demographics
+from . import demographics
 
 
 def extract_boletim(state, data):
```
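The change from `import demographics` to `from . import demographics` is what makes the rename safe: a bare `import demographics` would only keep working if the old top-level module stayed on `sys.path`. A small check of the equivalence, illustrative rather than part of the commit:

```python
import covid19br.converters
from covid19br import demographics

# The relative import inside covid19br/converters.py resolves to the
# fully-qualified module covid19br.demographics, so both names point at
# the same module object:
assert covid19br.converters.demographics is demographics
```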

demographics.py renamed to covid19br/demographics.py

Lines changed: 1 addition & 0 deletions
```diff
@@ -5,6 +5,7 @@
 import rows
 from rows.utils import load_schema
 
+# TODO: use pkg_resources to discover path
 DATA_PATH = Path(__file__).parent / "data"
 SCHEMA_PATH = Path(__file__).parent / "schema"
 POPULATION_DATA_PATH = {
```
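The new TODO flags a limitation of the `Path(__file__)` approach: it assumes the package's `data` and `schema` directories exist as plain files on disk. One way the TODO might be resolved with `pkg_resources`, as a sketch rather than anything this commit implements:

```python
from pathlib import Path

import pkg_resources  # ships with setuptools

# resource_filename() returns a real filesystem path for a packaged
# resource (extracting it first if the package is zipped), so the
# Path-based lookups keep working after installation:
DATA_PATH = Path(pkg_resources.resource_filename("covid19br", "data"))
SCHEMA_PATH = Path(pkg_resources.resource_filename("covid19br", "schema"))
```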
