This repo contains the code to implement the following models for box detection tasks:
- SAM
- fastSAM
- fastSAM-s
Currently, fastSAM.ipynb is partially ready for public consumption.
- Google Colab
- Local Environment
  - Make sure you have Python version >= 3.10:
    ```
    python3 --version
    ```
  - Create a Python virtual environment (preferably in the working directory):
    ```
    python3 -m venv .venv
    ```
  - Activate the virtual environment according to your shell
  - Install pip dependencies:
    ```
    pip install -r requirements.txt
    ```
  - Run the notebook in VS Code (with the Python and Jupyter extensions) or in a Jupyter environment
  - Set the `INITIALIZED` variable accordingly
WIP - Detailed Documentation at SAM Documentation
- When running on Google Colab, set the `COLAB` and `INITIALIZED` variables accordingly; then you can use "Run all" 🥂
- There seems to be an error with `tkinter` import resolution in a venv on Python 3.11; use Python 3.8–3.10 to avoid this issue. Use Python 3.10 or 3.11 for maximum compatibility.
- The first run of `pip install -r requirements.txt` can take a long time locally, depending on the Python version you choose. Sit back and have a coffee.
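The `COLAB` and `INITIALIZED` variables mentioned above gate first-run setup work in the notebook. A minimal sketch of what such a configuration cell might look like (the flag names come from this README; the exact logic inside fastSAM.ipynb may differ):

```python
# Hypothetical configuration cell; flag names come from this README, but how
# fastSAM.ipynb actually uses them may differ.
COLAB = False        # set True when running on Google Colab
INITIALIZED = False  # set True once dependencies and weights are already in place

def needs_setup(colab: bool, initialized: bool) -> bool:
    """A fresh environment (local or Colab) still needs deps and weights."""
    return not initialized

if needs_setup(COLAB, INITIALIZED):
    # e.g. install extra packages on Colab, download model weights, etc.
    pass
```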
Source: https://pypi.org/project/segment-anything-fast/
Two versions of the model are available, with different sizes. Click the links below to download the checkpoint for the corresponding model type.
- `FastSAM` (default): YOLOv8x-based Segment Anything Model | Baidu Cloud (pwd: 0000)
- `FastSAM-s`: YOLOv8s-based Segment Anything Model
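For convenience, the variant-to-checkpoint mapping above can be captured in a small helper. This is only a sketch: the filenames `FastSAM-x.pt` / `FastSAM-s.pt` follow the upstream FastSAM release, and the `weights/` directory is an assumption, not something this repo specifies.

```python
from pathlib import Path

# Assumed checkpoint filenames (upstream FastSAM ships FastSAM-x.pt and
# FastSAM-s.pt; adjust if this repo names its weights differently).
CHECKPOINTS = {
    "FastSAM": "FastSAM-x.pt",    # YOLOv8x-based, the default
    "FastSAM-s": "FastSAM-s.pt",  # YOLOv8s-based, smaller and faster
}

def checkpoint_path(variant: str, weights_dir: str = "weights") -> Path:
    """Return the expected local path of a downloaded checkpoint."""
    if variant not in CHECKPOINTS:
        raise ValueError(f"Unknown model variant: {variant!r}")
    return Path(weights_dir) / CHECKPOINTS[variant]
```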
- Compare and contrast SAM and fastSAM models
- Model Prompting vs Predicting on SAM and fastSAM
- Exception handling and assertions: null-mask handling in Python scripts
- Review the literature and documentation for the models to better understand their feature set and interoperation
- Include logic to look at the 8 corner points when nothing is found at the center
- Integrating Scores, Confidence and Thresholds for the models
- Improve the mask-drawing section of the `plt_images` function
- Time Profiling for comparisons and benchmarks
- Integrate SAM into the fastSAM notebook for comparison
- Simplify dependency graph
- Simplify code logic to recreate the results on all platforms
- [Sasika] Add license information for the models
- [Sasika] Add references of models and repos
- [Sasika] Move all dataset and weights hosting to github and huggingface
- [Sasika] Implement plot with score
- Implement Python scripts for demonstration
- Switch to np.random for all random generation
- [Sasika] Implement model tuning controls to adjust detection
- [Sasika] Integration with webcam
- Move Nix configuration to separate branch
- Roadmap to detect and mask other categories (Humans, Vehicles)
- Bin Picking literary review
- Integrating Model Coordinates with Kuka Arm and ROS
- Integrate with common script to switch and test model performance at will
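For the null-mask handling item in the list above, one possible guard turns a silent empty result into an explicit error. This is a hedged sketch: the function name and the mask representation (nested sequences or arrays of truthy/falsy pixel values) are assumptions, not the repo's actual API.

```python
# Hypothetical guard for the "null mask handling" TODO: reject None or all-empty
# masks up front instead of failing later while plotting.
def ensure_mask(mask):
    if mask is None:
        raise ValueError("model returned no mask for the given prompt")
    if not any(any(row) for row in mask):
        raise ValueError("model returned an empty (all-zero) mask")
    return mask
```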
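The corner-fallback TODO can be sketched as a point generator: if the center prompt yields nothing, try the 8 surrounding positions (corners and edge midpoints of a 3x3 grid). The function name and the `margin` inset are hypothetical choices.

```python
# Hypothetical sketch for the corner-fallback TODO: candidate prompt points to
# try when nothing is detected at the image center. Returns the 8 positions on
# a 3x3 grid around the center, inset from the borders by `margin`.
def fallback_points(width: int, height: int, margin: float = 0.1):
    xs = [int(width * margin), width // 2, int(width * (1 - margin))]
    ys = [int(height * margin), height // 2, int(height * (1 - margin))]
    center = (width // 2, height // 2)
    return [(x, y) for y in ys for x in xs if (x, y) != center]
```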
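For the time-profiling item, a minimal stdlib helper (names are hypothetical) could wrap model calls in both notebooks and collect wall-clock timings for comparison:

```python
import time
from contextlib import contextmanager

# Minimal timing helper for the benchmarking TODO: records elapsed wall-clock
# seconds per labelled section into a shared dict.
@contextmanager
def timed(label: str, results: dict):
    start = time.perf_counter()
    try:
        yield
    finally:
        results[label] = time.perf_counter() - start
```

Usage, e.g. `with timed("fastSAM", results): model(img)` for each model, then compare the entries in `results`.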