This project is a fork of Segment Anything WebUI, which uses the Segment Anything Model (SAM) by Meta. The UI is based on Gradio.
- No bells and whistles: this version strips out video segmentation and text-prompt segmentation, focusing only on click-based interactive segmentation.
- Performance improvements: faster use of the SAM predictor via two tricks (a minimal sketch follows this list):
  - The model is loaded only ONCE per session, avoiding unnecessary I/O (this reduces annotation latency by ~2x).
  - SAM embeddings are computed only ONCE per image (this reduces annotation latency by ~10x).
- Cleaner codebase: the code is simplified, more modular, and easier to understand.
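The two performance tricks above amount to caching. Below is a minimal Python sketch of the idea, not the actual code in `app.py`; the checkpoint path `checkpoints/sam_vit_b_01ec64.pth` and the helper names are assumptions for illustration:

```python
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

_predictor = None          # Trick 1: module-level cache, model loaded once per session
_current_image_id = None   # Trick 2: remember which image the cached embedding belongs to


def get_predictor(checkpoint="checkpoints/sam_vit_b_01ec64.pth", device="cpu"):
    """Load the SAM model and wrap it in a predictor only on the first call."""
    global _predictor
    if _predictor is None:
        sam = sam_model_registry["vit_b"](checkpoint=checkpoint)
        sam.to(device=device)
        _predictor = SamPredictor(sam)
    return _predictor


def predict_from_click(image: np.ndarray, image_id: str, x: int, y: int) -> np.ndarray:
    """Return the best mask for a single foreground click at pixel (x, y)."""
    global _current_image_id
    predictor = get_predictor()
    if image_id != _current_image_id:
        predictor.set_image(image)   # expensive: runs the image encoder once per image
        _current_image_id = image_id
    masks, scores, _ = predictor.predict(   # cheap: only prompt encoder + mask decoder
        point_coords=np.array([[x, y]]),
        point_labels=np.array([1]),  # 1 marks a foreground click
        multimask_output=True,
    )
    return masks[int(np.argmax(scores))]
```

The split mirrors the upstream API: `set_image` runs the heavy image encoder, while `predict` only runs the lightweight prompt encoder and mask decoder, so repeated clicks on the same image stay fast.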
To run the demo on your own computer, follow these steps:
- Install Segment Anything (see the Segment Anything repository for more installation details):
  `$ pip install git+https://github.com/facebookresearch/segment-anything.git`
- Clone this repository:
  `$ git clone https://github.com/imadtoubal/sam-home.git`
  `$ cd sam-home`
- Make a new folder named `checkpoints` under this project and put the downloaded weight files in it. You can download the weights for the following models from the Segment Anything repository (a download sketch follows these steps):
  - `vit_h`: ViT-H SAM model
  - `vit_l`: ViT-L SAM model
  - `vit_b`: ViT-B SAM model
- Run:
  `$ python app.py`
  Note: the default model is `vit_b` and the default device is `cpu`, so the demo can run on a CPU.
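If you prefer to script the checkpoint download, here is a minimal Python sketch; the URL below is the ViT-B checkpoint published by the Segment Anything repository (swap in the ViT-L or ViT-H file if you use those models), and the helper name is an assumption for illustration:

```python
import urllib.request
from pathlib import Path

# ViT-B checkpoint URL as published in the Segment Anything repository
VIT_B_URL = "https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth"


def download_checkpoint(url: str = VIT_B_URL, dest_dir: str = "checkpoints") -> Path:
    """Download a SAM checkpoint into dest_dir, skipping files that already exist."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    target = dest / url.rsplit("/", 1)[-1]
    if not target.exists():
        urllib.request.urlretrieve(url, str(target))
    return target


if __name__ == "__main__":
    print(f"Checkpoint saved to {download_checkpoint()}")
```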
- Original repository: Segment Anything WebUI
- Thanks to the wonderful work on Segment Anything and OWL-ViT