Could you give an example showing how to use the multi-GPU Flask service_streamer with gunicorn?
I wrote the following gunicorn config, but it doesn't work:
```python
# coding=utf-8
from gevent import monkey; monkey.patch_all()

from flask_multigpu_example import app


def post_fork(server, worker):
    from service_streamer import RedisStreamer, Streamer
    import flask_multigpu_example
    from bert_model import ManagedBertModel, TextInfillingModel as Model

    flask_multigpu_example.streamer = Streamer(
        ManagedBertModel, batch_size=64, max_latency=0.1,
        worker_num=4, cuda_devices=(0, 1, 2, 3))
    model = Model()


bind = '0.0.0.0:5005'
workers = 4
worker_class = 'gunicorn.workers.ggevent.GeventWorker'
proc_name = "redis_streamer"
```
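For context, the core idea `Streamer` implements (collect requests from many callers, run the model once per batch, fan the results back out) can be sketched in plain Python. This is a toy single-process illustration only; `MiniStreamer` and all names below are hypothetical and not part of the service_streamer API:

```python
import threading
import queue


class MiniStreamer:
    """Toy sketch of dynamic batching: gather items from concurrent
    callers into one batch, call the model once, return each result
    to its caller."""

    def __init__(self, batch_predict, batch_size=8, max_latency=0.05):
        self._predict = batch_predict      # function: list -> list
        self._batch_size = batch_size      # flush when batch is full...
        self._max_latency = max_latency    # ...or after this many seconds
        self._queue = queue.Queue()
        threading.Thread(target=self._worker, daemon=True).start()

    def predict(self, item):
        # Enqueue one item and block until the batch worker fills in the result.
        done = threading.Event()
        box = {}
        self._queue.put((item, box, done))
        done.wait()
        return box["result"]

    def _worker(self):
        while True:
            # Block for the first item, then gather more until the batch
            # is full or max_latency elapses with an empty queue.
            tasks = [self._queue.get()]
            try:
                while len(tasks) < self._batch_size:
                    tasks.append(self._queue.get(timeout=self._max_latency))
            except queue.Empty:
                pass
            results = self._predict([item for item, _, _ in tasks])
            for (_, box, done), result in zip(tasks, results):
                box["result"] = result
                done.set()


# Stand-in "model" that doubles each input in the batch.
streamer = MiniStreamer(lambda batch: [x * 2 for x in batch])
print(streamer.predict(21))  # prints 42
```

The real `Streamer` does the same thing across GPU worker processes, which is why it must be created in gunicorn's `post_fork` hook: each forked web worker needs its own client handle to the shared batching workers.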
@miangangzhen Did you solve this in the end? It would be great if the redis stream + flask_multigpu_example example in the README were more detailed; this example is a bit complicated and many people don't fully understand it. @Meteorix Thanks.