Releases: Garmelon/PFERD
v3.3.0
Version 3.3.0 - 2022-01-09

Added
- A KIT IPD crawler
- Support for ILIAS cards
- (Rudimentary) support for content pages
- Support for multi-stream videos
- Support for ILIAS 7

Removed
- [Interpolation](https://docs.python.org/3/library/configparser.html#interpolation-of-values) in config file

Fixed
- Crawling of recursive courses
- Crawling of files placed directly on the personal desktop
- Timestamps at the Unix epoch are now ignored, as they crash on Windows
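Removing interpolation means a literal `%` in a config value (common in URL-encoded paths) no longer needs escaping. A minimal sketch of the difference using Python's own `configparser` (the section and option names here are made up, not PFERD's actual config schema):

```python
from configparser import BasicInterpolation, ConfigParser, InterpolationSyntaxError

raw = "[crawl:example]\ntarget = path%20with%20spaces\n"

# Old behavior: with interpolation enabled, "%" is treated specially,
# so a bare "%" in a value raises an error on access unless doubled ("%%").
old = ConfigParser(interpolation=BasicInterpolation())
old.read_string(raw)
try:
    old["crawl:example"]["target"]
except InterpolationSyntaxError:
    print("interpolation chokes on a bare %")

# New behavior (3.3.0): interpolation disabled, "%" is just a character.
new = ConfigParser(interpolation=None)
new.read_string(raw)
print(new["crawl:example"]["target"])  # path%20with%20spaces
```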
v3.2.0
Version 3.2.0 - 2021-08-04

Added
- `--skip` command line option
- Support for ILIAS booking objects

Changed
- Using multiple path segments on the left side of `-name->` now results in an error. This was already forbidden by the documentation but silently accepted by PFERD.
- More consistent path printing in some `--explain` messages

Fixed
- Nondeterministic name deduplication due to ILIAS reordering elements
- More exceptions are handled properly
v3.1.0
Version 3.1.0 - 2021-06-13

If your config file doesn't do weird things with transforms, it should continue to work. If your `-re->` arrows behave weirdly, try replacing them with `-exact-re->` arrows. If you're on Windows, you might need to switch from `\` path separators to `/` in your regex rules.

Added
- `skip` option for crawlers
- Rules with `>>` instead of `>` as arrow head
- `-exact-re->` arrow (behaves like `-re->` did previously)

Changed
- The `-re->` arrow can now rename directories (like `-->`)
- Use `/` instead of `\` as path separator for (regex) rules on Windows
- Use the label to the left of an exercise instead of the button name to determine the folder name

Fixed
- Video pagination handling in ILIAS crawler
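The `-exact-re->` arrow's behavior (the old `-re->` semantics) amounts to full-match regex renaming. A sketch of that idea with Python's `re` module, using Python's `\1` backreference syntax on the right side; PFERD's actual rule syntax is documented in its CONFIG.md, and the function name and rule below are purely illustrative:

```python
import re
from typing import Optional

def apply_exact_re_rule(pattern: str, template: str, path: str) -> Optional[str]:
    """Apply an -exact-re-> style rule: the pattern must match the
    ENTIRE path (re.fullmatch), not just a prefix or substring.
    Returns the transformed path, or None if the rule doesn't apply.
    """
    match = re.fullmatch(pattern, path)
    if match is None:
        return None
    # Fill backreferences like \1 in the template from the capture groups.
    return match.expand(template)

# Paths use "/" as the separator, even on Windows (a 3.1.0 change).
print(apply_exact_re_rule(r"Lectures/Lecture (\d+)\.pdf", r"lectures/\1.pdf",
                          "Lectures/Lecture 03.pdf"))  # lectures/03.pdf
```

Because `re.fullmatch` is used, a rule like `Lectures` matches only the path `Lectures` itself, never `Lectures/foo.pdf`, which is the "exact" in `-exact-re->`.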
v3.0.1
Version 3.0.1 - 2021-06-01

Added
- `credential-file` authenticator
- `--credential-file` option for `kit-ilias-web` command
- Warning if using concurrent tasks with `kit-ilias-web`

Changed
- Cookies are now stored in a text-based format

Fixed
- Date parsing now also works correctly in non-group exercises
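A text-based cookie store survives version upgrades and manual inspection far better than a pickled binary jar. Python's standard library provides one such format (Netscape `cookies.txt`) via `http.cookiejar.MozillaCookieJar`; whether PFERD uses this exact format is an assumption, and the cookie below is a made-up example:

```python
from http.cookiejar import Cookie, MozillaCookieJar

# Build a jar containing one hypothetical ILIAS session cookie.
jar = MozillaCookieJar("cookies.txt")
cookie = Cookie(
    version=0, name="PHPSESSID", value="abc123", port=None, port_specified=False,
    domain="ilias.example.org", domain_specified=True, domain_initial_dot=False,
    path="/", path_specified=True, secure=True, expires=None, discard=True,
    comment=None, comment_url=None, rest={},
)
jar.set_cookie(cookie)

# Save to a plain-text file and reload it into a fresh jar.
jar.save(ignore_discard=True)
restored = MozillaCookieJar("cookies.txt")
restored.load(ignore_discard=True)
print([c.name for c in restored])  # ['PHPSESSID']
```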
v3.0.0
Version 3.0.0 - 2021-05-31

Added
- Proper config files
- Concurrent crawling
- Crawl external ILIAS links
- Crawl uploaded exercise solutions
- Explain what PFERD is doing and why (`--explain`)
- More control over output (`--status`, `--report`)
- Debug transform rules with `--debug-transforms`
- Print report after exiting via Ctrl+C
- Store crawler reports in `.report` JSON file
- Extensive config file documentation (`CONFIG.md`)
- Documentation for developers (`DEV.md`)
- This changelog

Changed
- Rewrote almost everything
- Better error messages
- Redesigned CLI
- Redesigned transform rules
- ILIAS crawling logic (paths may be different)
- Better support for weird paths on Windows
- Set user agent (`PFERD/<version>`)

Removed
- Backwards compatibility with 2.x
- Python files as config files
- Some types of crawlers
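Sending a versioned user agent like `PFERD/<version>` lets server operators identify the tool in their logs. A minimal sketch with Python's stdlib `urllib.request`; the version string and URL are placeholders, and PFERD itself may use a different HTTP client:

```python
import urllib.request

VERSION = "3.0.0"  # placeholder; the real value would come from package metadata

def build_request(url: str) -> urllib.request.Request:
    """Attach a PFERD-style versioned User-Agent header to a request."""
    return urllib.request.Request(url, headers={"User-Agent": f"PFERD/{VERSION}"})

req = build_request("https://ilias.example.org/")
# urllib normalizes stored header names to capitalized form ("User-agent").
print(req.get_header("User-agent"))  # PFERD/3.0.0
```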
v2.6.2
Download the correct `sync_url` binary for your platform and run it in a terminal (or CMD on Windows). On Linux/Mac you might need to make it executable first with `chmod +x <file>`. Also, please enclose the URL you pass to the program in double quotes, or your shell might silently mangle it!
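The two steps above look roughly like this on Linux/Mac (the binary name and URL are made-up examples; use your actual download and course URL):

```shell
# Stand-in for the downloaded binary; the real file comes from the release assets.
touch sync_url-linux

# Make it executable, then verify.
chmod +x sync_url-linux
test -x sync_url-linux && echo "executable"

# Quote the URL: unquoted "?" and "&" are interpreted by the shell,
# silently truncating or mangling the argument.
url="https://ilias.example.org/goto.php?target=crs_12345&client_id=produktiv"
echo "$url"
```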
v2.6.1
v2.6.0
v2.5.4
v2.5.3