We were tasked with creating a "simple" application to make predictions for the next/any given DFB match. The assignment was broken up into 3 parts for us:
- data acquisition
  - gather the data required for this task
- data processing
  - actually make the predictions using 2 different algorithms
- data visualisation
  - present the data/results in a UI (no cheesy Excel export, sadly...)
As a first step we have to assess our options for each part of the project; for this purpose I have collected multiple options for each major part. All the library choices assume our language of choice for this project: Python 3.
- meta-project stuff:
  - env management
  - IDE/editor
  - test automation
  - linters (not necessarily mutually exclusive)
- data acquisition:
  - published dataset
  - scrape the net
  - open API
- data processing:
  - neural networks
    - not too sure what types would be applicable for this problem
  - stochastics

My current proposal for each part (rough sketches for each follow after this list):

- acquisition: open API via openligadb.de (docs)
- storage: database via SQLAlchemy
- processing:
  - algorithm 1: Poisson regression model
  - algorithm 2: ...
- visualisation: native UI via tkinter (+ ttk)
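To make the acquisition part a bit more concrete, here is a minimal sketch of pulling one season from the open API. The endpoint path and the helper name are my assumptions and need to be checked against the linked docs before we rely on them:

```python
# acquisition sketch: pull one season of match data from OpenLigaDB.
# NOTE: the endpoint path ("/getmatchdata/<league>/<season>") and the field
# names in the response are assumptions and should be verified against the
# openligadb.de docs.
import requests

API_BASE = "https://api.openligadb.de"  # assumed base URL, check the docs

def fetch_season(league: str = "bl1", season: int = 2023) -> list[dict]:
    """Return the raw JSON match list for one league season."""
    response = requests.get(f"{API_BASE}/getmatchdata/{league}/{season}", timeout=10)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    matches = fetch_season()
    print(f"fetched {len(matches)} matches")
```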
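For the storage part, something along these lines should be enough to get started with SQLAlchemy; the `Match` table and its columns are placeholders, not a final schema:

```python
# storage sketch: a minimal SQLAlchemy model for finished matches.
# Table and column names (Match, home_team, ...) are placeholders.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Match(Base):
    __tablename__ = "matches"

    id = Column(Integer, primary_key=True)      # e.g. the OpenLigaDB match id
    season = Column(Integer, nullable=False)
    home_team = Column(String, nullable=False)
    away_team = Column(String, nullable=False)
    home_goals = Column(Integer)                # None while the match is unplayed
    away_goals = Column(Integer)

engine = create_engine("sqlite:///matches.db")  # SQLite keeps it simple for now
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)
```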
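For algorithm 1, a very small version of the Poisson idea could look like the sketch below. It assumes independent Poisson-distributed goal counts and takes made-up expected-goal values as input; the actual regression step (fitting those expectations from our stored data) is still open:

```python
# processing sketch (algorithm 1): a tiny independent-Poisson outcome model.
# Expected goals are passed in directly here; a real Poisson regression would
# replace them with fitted attack/defence strengths per team.
import numpy as np
from scipy.stats import poisson

def outcome_probabilities(home_xg: float, away_xg: float, max_goals: int = 10):
    """Return (home win, draw, away win) probabilities for one match."""
    goals = np.arange(max_goals + 1)
    # P(score = i, j) = P(home scores i) * P(away scores j), assumed independent
    score_matrix = np.outer(poisson.pmf(goals, home_xg), poisson.pmf(goals, away_xg))
    home_win = np.tril(score_matrix, -1).sum()  # below diagonal: home scores more
    draw = np.trace(score_matrix)
    away_win = np.triu(score_matrix, 1).sum()
    return home_win, draw, away_win

if __name__ == "__main__":
    # purely made-up expected-goal values for illustration
    print(outcome_probabilities(1.8, 1.1))
```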
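For the visualisation, a bare tkinter/ttk window would be the starting point; the team names and percentages shown here are pure placeholders:

```python
# visualisation sketch: a bare tkinter/ttk window showing a single prediction.
# Widget layout and label texts are placeholders for whatever the UI ends up needing.
import tkinter as tk
from tkinter import ttk

def main() -> None:
    root = tk.Tk()
    root.title("DFB match predictor")

    frame = ttk.Frame(root, padding=10)
    frame.grid()

    ttk.Label(frame, text="FC Example vs. SV Placeholder").grid(column=0, row=0)
    # in the real app these numbers would come from the processing step
    ttk.Label(frame, text="home 52% / draw 26% / away 22%").grid(column=0, row=1)
    ttk.Button(frame, text="Quit", command=root.destroy).grid(column=0, row=2)

    root.mainloop()

if __name__ == "__main__":
    main()
```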
As a start, in my opinion, it would be very nice to split this whole project up into 4 epics:
- acquisition
- storage
- processing
- visualisation
Additionally it might be a good idea to start out simple and just create tickets as needed. Later on, once we have a grip on the project, we should have a meeting to write out all of the remaining tickets.
For each step we would extend a CLI, so that testing and command-line use are possible before the UI is ready; a rough sketch of that CLI follows below.
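A rough idea of what that CLI could look like, with one subcommand per epic (the subcommand names and arguments are just placeholders for illustration):

```python
# CLI sketch: one argparse entry point that grows a subcommand per epic.
# Subcommand and argument names (fetch, predict, ...) are placeholders.
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="dfb-predict")
    sub = parser.add_subparsers(dest="command", required=True)

    fetch = sub.add_parser("fetch", help="download match data into the database")
    fetch.add_argument("--season", type=int, default=2023)

    predict = sub.add_parser("predict", help="predict the outcome of one fixture")
    predict.add_argument("home_team")
    predict.add_argument("away_team")

    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args)  # dispatch to the acquisition/processing code would go here
```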