
Commit 2daeae1

chore: add BentoML citation (bentoml#4019)
1 parent f5bf568 commit 2daeae1

File tree: 2 files changed, +147 −46 lines

CITATION.cff

Lines changed: 47 additions & 0 deletions

````diff
@@ -0,0 +1,47 @@
+cff-version: 1.2.0
+title: 'BentoML: The framework for building reliable, scalable and cost-efficient AI application'
+message: >-
+  If you use this software, please cite it using these
+  metadata.
+type: software
+authors:
+  - given-names: Chaoyu
+    family-names: Yang
+
+  - given-names: Sean
+    family-names: Sheng
+
+  - given-names: Aaron
+    family-names: Pham
+
+    orcid: 'https://orcid.org/0009-0008-3180-5115'
+  - given-names: Shenyang
+    family-names: ' Zhao'
+
+  - given-names: Sauyon
+    family-names: Lee
+
+  - given-names: Bo
+    family-names: Jiang
+
+  - given-names: Fog
+    family-names: Dong
+
+  - given-names: Xipeng
+    family-names: Guan
+
+  - given-names: Frost
+    family-names: Ming
+
+repository-code: 'https://github.com/bentoml/bentoml'
+url: 'https://bentoml.com/'
+keywords:
+  - MLOps
+  - LLMOps
+  - LLM
+  - Infrastructure
+  - BentoML
+  - LLM Serving
+  - Model Serving
+  - Serverless Deployment
+license: Apache-2.0
````
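Since the added CITATION.cff is plain YAML, its author list can be consumed programmatically, for instance to build the BibTeX-style `author` field shown in the README's new Citation section. A minimal sketch follows; it is an illustration only (regex-based, handling just the simple `given-names`/`family-names` pairs above), not a real YAML or CFF parser — for real use, load the file with a YAML library and read the `authors` list.

```python
# Extract author names from CITATION.cff-style text and format them as a
# BibTeX-style "Family, Given" list joined with " and ".
import re

cff_snippet = """\
authors:
  - given-names: Chaoyu
    family-names: Yang
  - given-names: Sean
    family-names: Sheng
"""

# Match the value after each key, tolerating optional quotes and spaces.
given = re.findall(r"given-names:\s*'?\s*([^'\n]+?)'?\s*$", cff_snippet, re.M)
family = re.findall(r"family-names:\s*'?\s*([^'\n]+?)'?\s*$", cff_snippet, re.M)

authors = [f"{f}, {g}" for g, f in zip(given, family)]
print(" and ".join(authors))  # Yang, Chaoyu and Sheng, Sean
```

This pairs each `given-names` with the following `family-names` by position, which only works because every author entry in the file carries both fields in that order.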

README.md

Lines changed: 100 additions & 46 deletions
````diff
@@ -11,56 +11,84 @@ packaging, and production deployment.</p>
 <i><a href="https://l.bentoml.com/join-slack">👉 Join our Slack community!</a></i>
 </div>
 
-
 # Highlights
 
-
 ### 🍱 Bento is the container for AI apps
 
-* Open standard and SDK for AI apps, pack your code, inference pipelines, model files, dependencies, and runtime configurations in a [Bento](https://docs.bentoml.com/en/latest/concepts/bento.html).
-* Auto-generate API servers, supporting REST API, gRPC, and long-running inference jobs.
-* Auto-generate Docker container images.
+- Open standard and SDK for AI apps, pack your code, inference pipelines, model
+  files, dependencies, and runtime configurations in a
+  [Bento](https://docs.bentoml.com/en/latest/concepts/bento.html).
+- Auto-generate API servers, supporting REST API, gRPC, and long-running
+  inference jobs.
+- Auto-generate Docker container images.
 
 ### 🏄 Freedom to build with any AI models
 
-* Import from any model hub or bring your own models built with frameworks like PyTorch, TensorFlow, Keras, Scikit-Learn, XGBoost and [many more](https://docs.bentoml.com/en/latest/frameworks/index.html).
-* Native support for [LLM inference](https://github.com/bentoml/openllm/#bentoml), [generative AI](https://github.com/bentoml/stable-diffusion-bentoml), [embedding creation](https://github.com/bentoml/CLIP-API-service), and [multi-modal AI apps](https://github.com/bentoml/Distributed-Visual-ChatGPT).
-* Run and debug your BentoML apps locally on Mac, Windows, or Linux.
+- Import from any model hub or bring your own models built with frameworks like
+  PyTorch, TensorFlow, Keras, Scikit-Learn, XGBoost and
+  [many more](https://docs.bentoml.com/en/latest/frameworks/index.html).
+- Native support for
+  [LLM inference](https://github.com/bentoml/openllm/#bentoml),
+  [generative AI](https://github.com/bentoml/stable-diffusion-bentoml),
+  [embedding creation](https://github.com/bentoml/CLIP-API-service), and
+  [multi-modal AI apps](https://github.com/bentoml/Distributed-Visual-ChatGPT).
+- Run and debug your BentoML apps locally on Mac, Windows, or Linux.
 
 ### 🍭 Simplify modern AI application architecture
 
-* Python-first! Effortlessly scale complex AI workloads.
-* Enable GPU inference [without the headache](https://docs.bentoml.com/en/latest/guides/gpu.html).
-* [Compose multiple models](https://docs.bentoml.com/en/latest/guides/graph.html) to run concurrently or sequentially, over [multiple GPUs](https://docs.bentoml.com/en/latest/guides/scheduling.html) or [on a Kubernetes Cluster](https://github.com/bentoml/yatai).
-* Natively integrates with [MLFlow](https://docs.bentoml.com/en/latest/integrations/mlflow.html), [LangChain](https://github.com/ssheng/BentoChain), [Kubeflow](https://www.kubeflow.org/docs/external-add-ons/serving/bentoml/), [Triton](https://docs.bentoml.com/en/latest/integrations/triton.html), [Spark](https://docs.bentoml.com/en/latest/integrations/spark.html), [Ray](https://docs.bentoml.com/en/latest/integrations/ray.html), and many more to complete your production AI stack.
-
+- Python-first! Effortlessly scale complex AI workloads.
+- Enable GPU inference
+  [without the headache](https://docs.bentoml.com/en/latest/guides/gpu.html).
+- [Compose multiple models](https://docs.bentoml.com/en/latest/guides/graph.html)
+  to run concurrently or sequentially, over
+  [multiple GPUs](https://docs.bentoml.com/en/latest/guides/scheduling.html) or
+  [on a Kubernetes Cluster](https://github.com/bentoml/yatai).
+- Natively integrates with
+  [MLFlow](https://docs.bentoml.com/en/latest/integrations/mlflow.html),
+  [LangChain](https://github.com/ssheng/BentoChain),
+  [Kubeflow](https://www.kubeflow.org/docs/external-add-ons/serving/bentoml/),
+  [Triton](https://docs.bentoml.com/en/latest/integrations/triton.html),
+  [Spark](https://docs.bentoml.com/en/latest/integrations/spark.html),
+  [Ray](https://docs.bentoml.com/en/latest/integrations/ray.html), and many more
+  to complete your production AI stack.
 
 ### 🚀 Deploy Anywhere
 
-* One-click deployment to [☁️ BentoCloud](https://bentoml.com/cloud), the Serverless platform made for hosting and operating AI apps.
-* Scalable BentoML deployment with [🦄️ Yatai](https://github.com/bentoml/yatai) on Kubernetes.
-* Deploy auto-generated container images anywhere docker runs.
-
+- One-click deployment to [☁️ BentoCloud](https://bentoml.com/cloud), the
+  Serverless platform made for hosting and operating AI apps.
+- Scalable BentoML deployment with [🦄️ Yatai](https://github.com/bentoml/yatai)
+  on Kubernetes.
+- Deploy auto-generated container images anywhere docker runs.
 
 # Documentation
 
-* Installation: `pip install bentoml`
-* Full Documentation: [docs.bentoml.com](https://docs.bentoml.com/en/latest/)
-* Tutorial: [Intro to BentoML](https://docs.bentoml.com/en/latest/tutorial.html)
+- Installation: `pip install bentoml`
+- Full Documentation: [docs.bentoml.com](https://docs.bentoml.com/en/latest/)
+- Tutorial: [Intro to BentoML](https://docs.bentoml.com/en/latest/tutorial.html)
 
 ### 🛠️ What you can build with BentoML
 
-* [OpenLLM](https://github.com/bentoml/OpenLLM) - An open platform for operating large language models (LLMs) in production.
-* [StableDiffusion](https://github.com/bentoml/stable-diffusion-bentoml) - Create your own text-to-image service with any diffusion models.
-* [CLIP-API-service](https://github.com/bentoml/CLIP-API-service) - Embed images and sentences, object recognition, visual reasoning, image classification, and reverse image search.
-* [Transformer NLP Service](https://github.com/bentoml/transformers-nlp-service) - Online inference API for Transformer NLP models.
-* [Distributed TaskMatrix(Visual ChatGPT)](https://github.com/bentoml/Distributed-Visual-ChatGPT) - Scalable deployment for TaskMatrix from Microsoft.
-* [Fraud Detection](https://github.com/bentoml/Fraud-Detection-Model-Serving) - Online model serving with custom XGBoost model.
-* [OCR as a Service](https://github.com/bentoml/OCR-as-a-Service) - Turn any OCR models into online inference API endpoints.
-* [Replace Anything](https://github.com/yuqwu/Replace-Anything) - Combining the power of Segment Anything and Stable Diffusion.
-* [DeepFloyd IF Multi-GPU serving](https://github.com/bentoml/IF-multi-GPUs-demo) - Serve IF models easily across multiple GPUs
-* Check out more examples [here](https://github.com/bentoml/BentoML/tree/main/examples).
-
+- [OpenLLM](https://github.com/bentoml/OpenLLM) - An open platform for operating
+  large language models (LLMs) in production.
+- [StableDiffusion](https://github.com/bentoml/stable-diffusion-bentoml) -
+  Create your own text-to-image service with any diffusion models.
+- [CLIP-API-service](https://github.com/bentoml/CLIP-API-service) - Embed images
+  and sentences, object recognition, visual reasoning, image classification, and
+  reverse image search.
+- [Transformer NLP Service](https://github.com/bentoml/transformers-nlp-service) -
+  Online inference API for Transformer NLP models.
+- [Distributed TaskMatrix(Visual ChatGPT)](https://github.com/bentoml/Distributed-Visual-ChatGPT) -
+  Scalable deployment for TaskMatrix from Microsoft.
+- [Fraud Detection](https://github.com/bentoml/Fraud-Detection-Model-Serving) -
+  Online model serving with custom XGBoost model.
+- [OCR as a Service](https://github.com/bentoml/OCR-as-a-Service) - Turn any OCR
+  models into online inference API endpoints.
+- [Replace Anything](https://github.com/yuqwu/Replace-Anything) - Combining the
+  power of Segment Anything and Stable Diffusion.
+- [DeepFloyd IF Multi-GPU serving](https://github.com/bentoml/IF-multi-GPUs-demo) -
+  Serve IF models easily across multiple GPUs
+- Check out more examples
+  [here](https://github.com/bentoml/BentoML/tree/main/examples).
 
 # Getting Started
 
````

````diff
@@ -119,7 +147,8 @@ $ curl -X POST -H "Content-Type: text/plain" --data "BentoML is awesome" http://
 {"label":"POSITIVE","score":0.9129443168640137}%
 ```
 
-Define how a [Bento](https://docs.bentoml.com/en/latest/concepts/bento.html) can be built for deployment, with `bentofile.yaml`:
+Define how a [Bento](https://docs.bentoml.com/en/latest/concepts/bento.html) can
+be built for deployment, with `bentofile.yaml`:
 
 ```yaml
 service: 'service.py:svc'
````
````diff
@@ -151,27 +180,37 @@ Successfully built Bento container for "text-classification-svc" with tag(s) "te
 $ docker run -p 3000:3000 text-classification-svc:mc322vaubkuapuqj
 ```
 
-For a more detailed user guide, check out the [BentoML Tutorial](https://docs.bentoml.com/en/latest/tutorial.html).
+For a more detailed user guide, check out the
+[BentoML Tutorial](https://docs.bentoml.com/en/latest/tutorial.html).
 
 ---
 
 ## Community
 
-BentoML supports billions of model runs per day and is used by thousands of organizations around the globe.
+BentoML supports billions of model runs per day and is used by thousands of
+organizations around the globe.
 
-Join our [Community Slack 💬](https://l.bentoml.com/join-slack), where thousands of AI application developers contribute to the project and help each other.
+Join our [Community Slack 💬](https://l.bentoml.com/join-slack), where thousands
+of AI application developers contribute to the project and help each other.
 
-To report a bug or suggest a feature request, use [GitHub Issues](https://github.com/bentoml/BentoML/issues/new/choose).
+To report a bug or suggest a feature request, use
+[GitHub Issues](https://github.com/bentoml/BentoML/issues/new/choose).
 
 ## Contributing
 
 There are many ways to contribute to the project:
 
-* Report bugs and "Thumbs up" on issues that are relevant to you.
-* Investigate issues and review other developers' pull requests.
-* Contribute code or documentation to the project by submitting a GitHub pull request.
-* Check out the [Contributing Guide](https://github.com/bentoml/BentoML/blob/main/CONTRIBUTING.md) and [Development Guide](https://github.com/bentoml/BentoML/blob/main/DEVELOPMENT.md) to learn more
-* Share your feedback and discuss roadmap plans in the `#bentoml-contributors` channel [here](https://l.bentoml.com/join-slack).
+- Report bugs and "Thumbs up" on issues that are relevant to you.
+- Investigate issues and review other developers' pull requests.
+- Contribute code or documentation to the project by submitting a GitHub pull
+  request.
+- Check out the
+  [Contributing Guide](https://github.com/bentoml/BentoML/blob/main/CONTRIBUTING.md)
+  and
+  [Development Guide](https://github.com/bentoml/BentoML/blob/main/DEVELOPMENT.md)
+  to learn more
+- Share your feedback and discuss roadmap plans in the `#bentoml-contributors`
+  channel [here](https://l.bentoml.com/join-slack).
 
 Thanks to all of our amazing contributors!
 
````

````diff
@@ -183,11 +222,12 @@ Thanks to all of our amazing contributors!
 
 ### Usage Reporting
 
-BentoML collects usage data that helps our team to improve the product.
-Only BentoML's internal API calls are being reported. We strip out as much potentially
-sensitive information as possible, and we will never collect user code, model data, model names, or stack traces.
-Here's the [code](./src/bentoml/_internal/utils/analytics/usage_stats.py) for usage tracking.
-You can opt-out of usage tracking by the `--do-not-track` CLI option:
+BentoML collects usage data that helps our team to improve the product. Only
+BentoML's internal API calls are being reported. We strip out as much
+potentially sensitive information as possible, and we will never collect user
+code, model data, model names, or stack traces. Here's the
+[code](./src/bentoml/_internal/utils/analytics/usage_stats.py) for usage
+tracking. You can opt-out of usage tracking by the `--do-not-track` CLI option:
 
 ```bash
 bentoml [command] --do-not-track
````
````diff
@@ -206,3 +246,17 @@ export BENTOML_DO_NOT_TRACK=True
 [Apache License 2.0](https://github.com/bentoml/BentoML/blob/main/LICENSE)
 
 [![FOSSA Status](https://app.fossa.com/api/projects/git%2Bgithub.com%2Fbentoml%2FBentoML.svg?type=small)](https://app.fossa.com/projects/git%2Bgithub.com%2Fbentoml%2FBentoML?ref=badge_small)
+
+### Citation
+
+If you use BentoML in your research, please cite using the following
+[citation](./CITATION.cff):
+
+```bibtex
+@software{Yang_BentoML_The_framework,
+author = {Yang, Chaoyu and Sheng, Sean and Pham, Aaron and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
+license = {Apache-2.0},
+title = {{BentoML: The framework for building reliable, scalable and cost-efficient AI application}},
+url = {https://github.com/bentoml/bentoml}
+}
+```
````
