Issues: kserve/kserve
#3661  AttributeError: 'Deployment' object has no attribute 'deploy' (opened May 1, 2024 by SagyHarpazGong)
#3646  Add Modelcars as initContainer with restartPolicy == Always (optional) [kind/feature] (opened Apr 29, 2024 by rhuss)
#3639  Merge responses from InferenceGraph Sequence node steps [kind/feature] (opened Apr 27, 2024 by asd981256)
#3638  Autoscaling with multiple metrics does not work [kind/feature] (opened Apr 26, 2024 by shazinahmed)
#3637  logger not surfacing the error when failed to send cloud event [kind/bug] (opened Apr 26, 2024 by yuzisun)
#3623  Any specific optimization did in kserve to support LLM inference? [kind/question] (opened Apr 22, 2024 by Jeffwan)
#3611  Upgrade from v0.10 to v0.11 breaks the InferenceService at routing level [kind/bug] (opened Apr 16, 2024 by bgalvao)
#3606  Support overriding model mount path in model server container [kind/feature] (opened Apr 15, 2024 by cmaddalozzo)
#3591  Old Revisions of Inference Service not Scaled Down [kind/bug] (opened Apr 10, 2024 by ksgnextuple)
#3574  Add parameter in ModelMetadataResponse in v2 (aka open inference) protocol [kind/feature] (opened Apr 3, 2024 by harshita-meena)
#3572  Support text embedding task in huggingface server [kind/feature] (opened Apr 3, 2024 by kevinmingtarja)