Releases: roboflow/inference
v0.9.4
Summary
This release includes new logic to validate models on load. This mitigates an issue seen when the model artifacts are corrupted during download.
v0.9.3
v0.9.2
Summary
- Bug fix for parsing base64 image strings when the source is a browser (an unnecessary prefix was being added)
- Validate that no more than `MAX_BATCH_SIZE` images are passed to object detection inference (see the sketch after this list)
- Default `MAX_BATCH_SIZE` to infinity
- Add batch regression tests
- Add CLI to readme
- Add generic stream object
- Add preprocess/predict/postprocess to Clip to match base interface
- Readme updates
- Landing page
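
As a hedged illustration of the batch-size validation above (the chunking helper and the `MAX_BATCH_SIZE` value below are illustrative assumptions, not part of the library), a client can split its images into batches no larger than the server's configured limit before requesting object detection inference:

```python
from typing import Iterable, List

# Illustrative only: this constant mirrors the server-side MAX_BATCH_SIZE env var;
# the actual limit is whatever the server is configured with (default: infinity).
MAX_BATCH_SIZE = 8

def chunk_images(images: List[str], batch_size: int = MAX_BATCH_SIZE) -> Iterable[List[str]]:
    """Yield successive batches of at most `batch_size` images."""
    for start in range(0, len(images), batch_size):
        yield images[start:start + batch_size]

# Usage sketch: send each chunk to the detection endpoint separately so the
# server's batch-size validation is never violated.
all_images = [f"frame_{i}.jpg" for i in range(20)]
for batch in chunk_images(all_images):
    print(f"would run object detection on {len(batch)} images")
```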
v0.9.1
Summary
This release includes a new stream interface, making it easy to run latency-optimized inference with custom callbacks (documentation coming soon).
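
A minimal sketch of how such a stream interface is typically wired up is below; the parameter names (`source`, `model`, `on_prediction`) and the model ID are assumptions for illustration, so defer to the documentation once it is published:

```python
import inference  # assumes the inference package is installed

def on_prediction(predictions, image):
    # Custom callback: invoked for each processed frame with the model's
    # predictions and the corresponding image.
    print(predictions)

# Assumed usage: connect a video source (webcam, RTSP URL, ...) to a model
# and run latency-optimized inference, handling results in the callback.
inference.Stream(
    source="webcam",        # or an RTSP URL / camera index
    model="my-project/1",   # hypothetical Roboflow model ID
    on_prediction=on_prediction,
)
```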
v0.9.0
Summary
This release includes:
- The new `inference-cli` to make starting the inference server easy and automated
- A new `inference-client` to serve as a helpful utility when interacting with the `inference` HTTP API (example below)
- Updates and added features to the Device Manager (enterprise feature)
- Unified model APIs so that all Roboflow models adhere to a consistent processing pipeline
- Bug fixes and maintenance
Breaking Changes:
- Some model APIs have been updated (see instance segmentation and classification)
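
As a rough sketch of calling the server's HTTP API directly (the route shape, port 9001, and the model ID/API key below are assumptions based on the hosted Roboflow API convention, not confirmed by these notes), a request might look like:

```python
import base64
import requests

MODEL_ID = "my-project/1"          # hypothetical model ID
API_KEY = "YOUR_ROBOFLOW_API_KEY"  # placeholder API key

# Read and base64-encode the image to send in the request body.
with open("image.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# Assumed: the local server listens on port 9001 and exposes a detection route
# shaped like the hosted Roboflow API.
response = requests.post(
    f"http://localhost:9001/{MODEL_ID}",
    params={"api_key": API_KEY},
    data=image_b64,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(response.json())
```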
v0.8.9
Summary
This release includes a new env var `DISABLE_INFERENCE_CACHE`. When set to `true`, internal inference caching will be disabled. Also, logging has been updated to be less verbose by default; to increase verbosity, set `LOG_LEVEL=DEBUG`.
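
A minimal sketch of setting these variables, assuming inference is used as an in-process Python package (when running the containerized server, they would be passed as container environment variables instead):

```python
import os

# Sketch only: set the variables before importing the package so they take effect.
os.environ["DISABLE_INFERENCE_CACHE"] = "true"  # turn off internal inference caching
os.environ["LOG_LEVEL"] = "DEBUG"               # make logging more verbose

import inference  # import after the environment is configured
```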
v0.8.8
Summary
Contains a fix in imread/imdecode logic. Also moves logic out of `version.py` to fix GitHub Actions.
v0.8.7
Summary
- Abandons Pillow in favor of OpenCV for faster end-to-end processing
- Fixes a bug with the new device management logic
- Upgrades version-checking logic
- Adds an env var to fix Jetson 5.1.1 images
v0.8.6
Summary
This release includes logic to detect and log if there is a newer release available. It also contains a new enterprise device manager.
v0.8.5
Summary
Contains bug fixes for configurations that use the `LICENSE_SERVER` setting.